

Swarm and Evolutionary Computation 1 (2011) 61–70

Contents lists available at ScienceDirect

Swarm and Evolutionary Computation


journal homepage: www.elsevier.com/locate/swevo

Survey paper

Surrogate-assisted evolutionary computation: Recent advances and future challenges
Yaochu Jin
Department of Computing, University of Surrey, Guildford, Surrey, GU2 7XH, UK

Abstract

Surrogate-assisted, or meta-model based evolutionary computation uses efficient computational models, often known as surrogates or meta-models, for approximating the fitness function in evolutionary algorithms. Research on surrogate-assisted evolutionary computation began over a decade ago and has received considerably increasing interest in recent years. Very interestingly, surrogate-assisted evolutionary computation has found successful applications not only in solving computationally expensive single- or multi-objective optimization problems, but also in addressing dynamic optimization problems, constrained optimization problems and multi-modal optimization problems. This paper provides a concise overview of the history and recent developments in surrogate-assisted evolutionary computation and suggests a few future trends in this research area.

Article history: Received 6 February 2011; Received in revised form 20 April 2011; Accepted 14 May 2011; Available online 6 June 2011.

Keywords: Evolutionary computation; Surrogates; Meta-models; Machine learning; Expensive optimization problems; Model management

© 2011 Elsevier B.V. All rights reserved.

E-mail address: yaochu.jin@surrey.ac.uk.
doi:10.1016/j.swevo.2011.05.001

1. Introduction

In most evolutionary algorithms, it is implicitly assumed that there exists a means for evaluating the fitness value of all individuals in a population. In general, the fitness value of an individual can be computed using an explicit fitness function, a computational simulation, or an experiment. In practice, however, fitness evaluations may become non-trivial. Such situations typically occur when evolutionary algorithms are employed to solve expensive optimization problems, where either the computational simulation for each fitness evaluation is highly time-consuming, or the experiments for fitness estimation are prohibitively costly, or an analytical function for fitness evaluations simply does not exist.

Surrogate-assisted evolutionary computation was mainly motivated by the need to reduce computation time in the evolutionary optimization of expensive problems, such as aerodynamic design optimization [1] or drug design [2], where complex computational simulations are involved.

In principle, surrogates should be used together with the real fitness function, as long as such a fitness function exists, to prevent the evolutionary algorithm from being misled by a false minimum introduced by the surrogates [3]. A strategy for properly using the surrogates is often known as model management or evolution control. In surrogate-assisted evolutionary optimization of expensive problems, in particular when the problems are of high dimension, the development of a model management strategy remains a challenging research topic.

The remainder of the paper is organized as follows. Section 2 takes a brief look back at the history of surrogate-assisted evolutionary computation, starting from the late 1990s. Representative model management strategies are discussed in Section 3, distinguishing between managing a single surrogate, homogeneous multiple surrogates, and heterogeneous multiple surrogates. The application of surrogates to problems other than expensive optimization in evolutionary computation is presented in Section 4. Application examples of meta-model based evolutionary optimization are briefly recounted in Section 5. A few promising yet challenging research topics are suggested in Section 6. The paper concludes with a brief summary in Section 7.

2. A brief look back

Research on evolutionary optimization using approximate fitness evaluations was first reported in the mid-1980s [4], and sporadic yet increasing research results on evolutionary optimization using computational models for fitness estimation appeared after the mid-1990s [5–9]. The first event devoted to research on using surrogates in evolutionary optimization was a workshop held in 2002 within the Genetic and Evolutionary Computation Conference (GECCO) [10]. Since then, a series of special sessions and workshops have been organized at major conferences, including GECCO and the IEEE Congress on Evolutionary Computation, and journal special issues have also been edited.


An overview of the research on surrogate-assisted evolutionary optimization reported in various fields was first presented in a conference paper [11], and then in a journal paper in a special issue [12]. A first tutorial on fitness approximation in evolutionary optimization was given at GECCO in 2005. Most recently, an edited book on the use of surrogates in evolutionary computation was also published [13].
In the review paper [12], the importance of managing surrogates was emphasized for the first time to prevent the evolutionary algorithm from being misled to a false optimum that can be introduced by a surrogate. In that review, methods for managing surrogates in evolutionary computation were divided into three categories, namely, individual-based, generation-based and population-based strategies. A variety of computational models, including polynomials (also known as response surface methodologies in the field of traditional design optimization), Gaussian processes (also known as Kriging in traditional design optimization) and neural networks, together with data sampling techniques such as design of experiments, active learning and boosting, were also presented. Practically, fitness inheritance from parents or fitness imitation from siblings [14–16] can be seen as a sort of simplified yet effective interpolation technique. General issues such as global and local approximation, approximation of nonlinear constraints and the use of multiple surrogates having various fidelities were discussed. Theoretical analysis of the convergence properties was also raised.

Since the review paper [12], very encouraging research progress has been made in many of these areas, whereas some issues remain unsolved, in particular with respect to a rigorous theoretical support for the benefit of using surrogates in evolutionary computation. Note that this paper focuses on surrogates in evolutionary computation. Readers interested in recent developments of surrogate-assisted design and analysis methods are referred to [17,18].

The next section provides a brief overview of recent advances in the research on surrogate-assisted evolutionary optimization, emphasizing the progress made after the review paper [12]. Research on using surrogates beyond solving expensive problems is discussed in Section 4. A few challenging topics for future research are suggested in Section 6. A summary of the paper is given in Section 7.

3. Strategies for managing surrogates

In most real-world optimization problems, no analytical fitness function exists for accurately evaluating the fitness of a candidate solution. Instead, there are only more accurate and less accurate fitness estimation methods, which often trade off accuracy against computational cost, as illustrated in Fig. 1. For example, in evolutionary optimization of aerodynamic structures [1], wind tunnel experiments may provide the most accurate estimation of the quality of candidate designs, but the cost of such experiments is often prohibitively high. In addition, three-dimensional (3-D) computational fluid dynamics (CFD) simulations using the Navier–Stokes equations may provide very accurate fitness evaluations. Unfortunately, such CFD simulations are highly time-consuming and can take hours or even days for one single fitness evaluation. Computationally more efficient simulations can be achieved by full 2-D simulations or even incomplete simulations. By incomplete simulation, we mean that a simulation process is stopped before it converges. The computationally most efficient way of estimating fitness is the use of machine learning models, i.e., surrogates. Note, however, that Fig. 1 only shows a simplified version of the actual levels of accuracy.

Fig. 1. An illustration of the trade-off between fidelity (approximation accuracy) and computational cost. Usually, high-fidelity fitness evaluations are more time-consuming; by contrast, low-fidelity fitness evaluations are often less time-consuming.

In the research on surrogate-assisted evolutionary optimization, most algorithms have been developed on benchmark problems, where it is assumed that fully accurate fitness evaluations can be provided. Such fitness functions are often termed the ''real fitness function'' or ''original fitness function''. In the following, we use surrogates to denote computational models constructed from data, whereas other approximate fitness techniques, such as full or incomplete 2-D CFD simulations, are called problem approximations, as termed in [12]. In addition, we do not distinguish between surrogate-assisted single-objective optimization and surrogate-assisted multi-objective optimization if the method for model management does not differ.

In the early work on surrogate-assisted evolutionary optimization, the evolutionary search was based solely on a surrogate, assuming that the surrogate can provide sufficiently accurate fitness evaluations. However, such assumptions can give rise to serious problems if the surrogate introduces optima that do not exist in the original optimization problem. This issue was first explicitly raised in [3] to stress the importance of model management in surrogate-assisted evolutionary optimization, mainly by using the surrogate together with the real fitness function.

Surrogates can be applied to almost all operations of evolutionary algorithms, such as population initialization, crossover, mutation, local search and fitness evaluations, as illustrated in Fig. 2. For instance, a surrogate can be used for filtering out poor solutions in population initialization, crossover [19] or mutation [20]. The use of surrogates in initialization, mutation or crossover [21] can reduce the randomness in the genetic operators, which are thus termed informed operators. Most recently, a similar approach was adopted for multi-objective optimization [22], where a single, aggregated meta-model is built to pre-screen candidate solutions before fitness evaluation. The requirement on the quality of surrogates is minimal, as an estimated fitness that is better than a random guess is adequate.

Fig. 2. A diagram of an evolutionary algorithm for the optimization of a turbine blade. A star denotes an evolutionary operation where a surrogate can be helpful.

Techniques for managing surrogates for fitness evaluations can generally be divided into individual-based, generation-based and population-based [12]. By generation-based, we mean that surrogates are used for fitness evaluations in some of the generations, while in the rest of the generations the real fitness function is used [8,23,24,7]. By contrast, in individual-based model management techniques, the real fitness function is used for fitness evaluations for some of the individuals in a generation [25,3,23]. In population-based approaches, more than one sub-population co-evolves, each using its own surrogate for fitness evaluations. Migration of individuals from one sub-population to another is allowed.
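To make the generation-based variant concrete, it can be sketched as follows. This is an illustrative toy example with settings of our own choosing, not the implementation from any cited paper: a sphere function stands in for the expensive evaluation and a simple 1-nearest-neighbour model for the surrogate.

```python
import random

def sphere(x):                             # stand-in for the expensive fitness
    return sum(v * v for v in x)

class NearestNeighbourSurrogate:           # minimal data-driven meta-model
    def __init__(self):
        self.samples = []                  # archive of (x, f) pairs
    def fit(self, samples):
        self.samples = list(samples)
    def predict(self, x):
        # fitness of the closest archived point approximates f(x)
        _, f = min(self.samples,
                   key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], x)))
        return f

def mutate(x, sigma=0.1):
    return tuple(v + random.gauss(0.0, sigma) for v in x)

def evolve(pop, real_fitness, surrogate, generations=20, control_freq=4):
    """Generation-based model management: every control_freq-th generation is
    evaluated with the real fitness function (and the archive and surrogate are
    updated); the remaining generations rely on the surrogate alone."""
    archive = []
    for gen in range(generations):
        candidates = pop + [mutate(x) for x in pop]
        if gen % control_freq == 0:              # controlled generation
            fits = [real_fitness(x) for x in candidates]
            archive.extend(zip(candidates, fits))
            surrogate.fit(archive)
        else:                                    # surrogate-only generation
            fits = [surrogate.predict(x) for x in candidates]
        ranked = sorted(zip(fits, candidates))   # minimization
        pop = [x for _, x in ranked[: len(pop)]]
    return pop

random.seed(1)
init = [tuple(random.uniform(-2.0, 2.0) for _ in range(2)) for _ in range(10)]
final = evolve(init, sphere, NearestNeighbourSurrogate())
```

An individual-based variant would instead decide, within each generation, which candidates are passed to `real_fitness` and which to `surrogate.predict`.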
A strategy closely related to the above methods is the pre-selection strategy [26]. Pre-selection does not exactly fall into the category of individual-based strategies. Assume the offspring population size is λ. In pre-selection, an initial offspring population containing λ′ > λ individuals is produced in each generation. Then, all λ′ offspring individuals are evaluated using the surrogate. Based on the fitness values obtained using the surrogate, only λ offspring are kept and re-evaluated using the original fitness function. A main difference between individual-based strategies and pre-selection is that in pre-selection, selection is always based on the real fitness values, whereas in individual-based methods, selection may be conducted partly based on fitness values from the surrogate.
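A single pre-selection generation can be sketched in a few lines. This is a toy illustration under our own assumptions, not code from [26]: a 2-D sphere function plays the role of the expensive fitness, and a crude absolute-value model plays the role of the surrogate.

```python
import random

def sphere(x):                       # toy stand-in for the expensive fitness
    return sum(v * v for v in x)

def surrogate(x):                    # crude stand-in meta-model: correlated
    return sum(abs(v) for v in x)    # with, but not identical to, sphere()

real_calls = {"n": 0}                # counts expensive evaluations

def real_fitness(x):
    real_calls["n"] += 1
    return sphere(x)

def pre_selection_generation(parents, lam, lam_prime, sigma=0.3):
    """One generation of pre-selection: lam_prime > lam offspring are screened
    by the surrogate, only the best lam are re-evaluated with the real fitness
    function, and parent selection uses real fitness values exclusively."""
    assert lam_prime > lam
    offspring = [tuple(p + random.gauss(0.0, sigma)
                       for p in random.choice(parents))
                 for _ in range(lam_prime)]
    screened = sorted(offspring, key=surrogate)[:lam]   # surrogate pre-screen
    ranked = sorted(screened, key=real_fitness)         # lam real evaluations
    return ranked[: len(parents)]                       # (mu, lam) selection

random.seed(0)
new_parents = pre_selection_generation([(1.0, -1.0), (0.5, 0.8)],
                                       lam=6, lam_prime=20)
# only lam = 6 expensive evaluations are spent on the 20 generated offspring
```

Note how the surrogate only filters candidates; every individual that can become a parent has been evaluated with the real fitness function.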
The main steps of one specific individual-based model management method, termed the best strategy, and the pre-selection method for a (µ, λ) evolution strategy, (µ, λ)-ES, are illustrated in Fig. 3(a) and (b), respectively. In a (µ, λ)-ES using the best strategy, all λ offspring are first evaluated using the surrogate. Then, the λ⋆ ≤ λ best individuals according to the surrogate are re-evaluated using the expensive real fitness function. As a result, it can happen that the fitness values of some of the selected µ parents are based on the surrogate. Contrary to that, in a (µ, λ)-ES using pre-selection, λ⋆ ≥ λ offspring are generated and then evaluated using the surrogate. Then, the λ best individuals are re-evaluated using the expensive real fitness function. Consequently, all the selected µ parents for the next generation are evaluated by the real fitness function.

Fig. 3. Two individual-based model management strategies. (a) Best strategy, and (b) pre-selection strategy.

A large category of surrogate-assisted evolutionary algorithms use surrogates in local search only, for both single- [27,28] and multi-objective optimization [29]. In this case, sophisticated model management methods developed in traditional design optimization [30], such as the trust-region method [31], can be directly employed.

Recently, surrogates have also been used in stochastic search methods other than evolutionary algorithms, such as surrogate-assisted simulated annealing [32] or surrogate-assisted artificial immune systems [33].

In the following, we discuss a few interesting ideas for model management, which are divided into two major categories: the use of a single surrogate and the use of multiple surrogates.

3.1. Managing a single surrogate

The essential question to answer in surrogate-assisted evolutionary computation is which individuals should be chosen to be evaluated or re-evaluated using the real fitness function. As we assume that fitness evaluations using the real fitness function are time-consuming, the next question is how to adapt the number of individuals to be evaluated with the real fitness function so that the time for fitness evaluations can be reduced as much as possible, while the evolutionary algorithm can still find the global optimum. In the following, we discuss a few issues related to the answers to these questions.

3.1.1. Criteria for choosing individuals for re-evaluation

The most straightforward idea is to evaluate those individuals that potentially have a good fitness value; the higher the approximation accuracy, the more often the surrogate can be used [3,23]. A slightly different idea is that representative individuals can be chosen for re-evaluation by clustering the population into a number of crisp or fuzzy clusters. The individual closest to each cluster center [34,15,35] or the best individual in each cluster [36,37] can then be chosen for re-evaluation.

It has also been suggested that individuals having a large degree of uncertainty in approximation can be good candidates for re-evaluation [25,26]. This idea can be justified by two arguments. First, a large degree of uncertainty in the approximation of these individuals indicates that the fitness landscape around these solutions has not been well explored and may therefore provide a good chance of finding a better solution. Second, re-evaluation of these solutions may be the most effective in improving the approximation accuracy of the surrogate, similar to the idea of active learning [38].

The estimation of the approximation error can be achieved with different methods. In [25], the degree of uncertainty is roughly set to be inversely proportional to the average distance to the closest data samples used for constructing the surrogate. Alternatively, an ensemble of surrogates can be used for estimating the variance of the individual estimates given by the ensemble members. The most often used surrogate model for estimating model uncertainties is the Gaussian process [39], also known as the Kriging model [40]. Unlike deterministic models, Gaussian processes provide an estimate of the fitness (mean) together with an estimate of the uncertainty (variance), which is a statistically sound bound on the uncertainty in fitness estimation. Due to this property, Gaussian processes have increasingly been employed as surrogates in evolutionary single- and multi-objective optimization [41–44]. Note, however, that the computational cost of constructing Gaussian processes itself can be very high when the number of samples used is large, and online learning of Gaussian processes is non-trivial when new samples become available.

3.1.2. Metrics for evaluating surrogates and adaptation

Not much attention has been paid to adapting the frequency of using the surrogates. In [45], the model quality is estimated by calculating the average approximation error after re-evaluation, which is used to adapt the frequency of using the surrogate in a generation-based model management method. Based on the empirical observation that large approximation errors do not necessarily mislead the evolutionary search, see e.g. Fig. 4, a few metrics other than the approximation error have been proposed in [46,45] for evaluating the quality of surrogates. In the following, we present a few performance measures for surrogates in greater detail.

Fig. 4. Examples of surrogates that have a large approximation error but are adequately good for evolutionary search. Solid curves denote the original function and dashed curves its approximation.

The most common measure for model quality or model fidelity is the mean squared error between an individual's real fitness value and the fitness predicted by the meta-model. However, from the evolutionary perspective, selecting the right individuals for the next generation is the main concern. For instance, in Fig. 4, the quality of the surrogate is poor in terms of approximation accuracy; however, an evolutionary algorithm searching on the surrogate only will nevertheless find the right optimum. Consider (µ, λ)-selection with λ ≥ 2µ, which is of particular relevance in evolutionary optimization of complex real-world problems. The number of individuals that have been selected correctly using the surrogate can be quantified by

ρ(sel.) = (ξ − ⟨ξ⟩) / (µ − ⟨ξ⟩),   (1)

where ξ (0 ≤ ξ ≤ µ) is the number of correctly selected individuals, i.e., the number of individuals that would also have been selected if the real fitness function had been used for fitness evaluations. The expectation

⟨ξ⟩ = Σ_{m=0}^{µ} m · C(µ, m) C(λ − µ, µ − m) / C(λ, µ) = µ²/λ,   (2)

where C(n, k) denotes the binomial coefficient, is the expected value of ξ in case random selection is used, and serves as a normalization in (1). It can be seen that if all µ parent individuals are selected correctly, the measure reaches its maximum of ρ(sel.) = 1, and that negative values indicate that the selection based on the surrogate is worse than a random selection.

The measure ρ(sel.) only evaluates the absolute number of correctly selected individuals. If ρ(sel.) < 1, the measure does not indicate whether the (µ + 1)-th or the worst offspring individual has been selected, which may have a significant influence on the evolutionary process. Therefore, the measure ρ(sel.) can be extended to include the ranks of the selected individuals, calculated based on the real fitness function. A surrogate is assumed to be good if the rank of the selected individuals based on the model is above average according to the rank based on the real fitness function.

The definition of the extended measure ρ(∼sel.) is as follows: the surrogate achieves a grade of λ − m if the m-th best individual based on the real fitness function is selected. Thus, the quality of the surrogate can be indicated by summing up the grades of the selected individuals, which is denoted by π. Obviously, π reaches its maximum if all µ individuals are selected correctly:

π(max.) = Σ_{m=1}^{µ} (λ − m) = µ(λ − (µ + 1)/2).   (3)

Similar to (1), the measure ρ(∼sel.) is defined by transforming π linearly, using the maximum π(max.) as well as the expectation ⟨π⟩ = µλ/2 for the case of a purely random selection:

ρ(∼sel.) = (π − ⟨π⟩) / (π(max.) − ⟨π⟩).   (4)

Besides these two problem-dependent measures for evaluating the quality of the surrogate, two established measures – the rank correlation and the (continuous) correlation – partially fit the requirements formulated above. The rank correlation,

ρ(rank) = 1 − 6 Σ_{l=1}^{λ} d_l² / (λ(λ² − 1)),   (5)

is a measure for the monotonic relation between the ranks of two variables. In our case, d_l is the difference between the ranks of the l-th offspring individual based on the original fitness function and on the approximate model. The range of ρ(rank) is the interval [−1, 1]. The higher the value of ρ(rank), the stronger the monotonic relation with a positive slope between the ranks of the two variables. In contrast to ρ(∼sel.), the rank correlation takes into account not only the ranking of the selected individuals, but also the ranks of all individuals.

A slightly different quality measure can be defined by calculating the (continuous) correlation between the surrogate and the original fitness function.

Using the selection-based criterion [45] for evaluating surrogates, an adaptation scheme has been suggested for adapting the
number of individuals to be evaluated using the surrogate (λ′) [47]. It has been shown that λ′ increases as the evolution proceeds, indicating that the quality of the surrogates improves. Interestingly, when noise is introduced into the fitness data samples, λ′ first decreases and then increases again. The various selection-based criteria suggested in [45] have been benchmarked for adapting the number of individuals to be re-evaluated by the real fitness function [37]. The results, however, failed to show a clear advantage of any particular criterion.

3.1.3. Improving approximation accuracy

Although approximation quality is not the only criterion for surrogates for fitness prediction in evolutionary optimization, improving the approximation quality is desirable, and much work has been reported along this line. For example, in [3,23], regularization of the neural network model has been suggested to alleviate overfitting. Structure and parameter optimization of the surrogate can also co-evolve with the original optimization problem [46,48].

One of the main difficulties in improving the approximation accuracy can be attributed to the high dimensionality of the design space. To overcome this difficulty, the surrogate can be built in a new space of lower dimension using dimension reduction techniques [49,50].

It is noticed that in many expensive optimization problems, the fitness evaluation often consists of an iterative computation process, such as the numerical solution of differential equations in computational fluid dynamics simulations. In such cases, many intermediate data will be produced before the simulation converges. Such intermediate data can be used for training a surrogate in the first iterations, and the surrogate can then be used for predicting the converged fitness [51]. An example of such a process is illustrated in Fig. 5.

Fig. 5. An illustration of learning an iterative fitness evaluation process using recurrent neural networks for predicting the converged fitness value.

3.2. Managing multiple surrogates

Methods for multiple surrogates in evolutionary optimization distinguish themselves in the type and fidelity of the surrogates. For example, a neural network ensemble has been used in [34], where all ensemble members are of the same type of feed-forward neural network. Alternatively, multiple surrogates of different types, such as polynomials, support vector machines and neural networks, can be used [52].

A step further is to use surrogates of different fidelities. Surrogates of different fidelities can be obtained by using models of different complexities or different data sets. For instance, different types of training samples used for constructing the surrogates can be generated from different problem approximations, such as wind-tunnel experiments, or 3-D or 2-D CFD simulations. Another way of generating surrogates of heterogeneous fidelity is to control the complexity of the surrogates explicitly, e.g., by using a different number of training samples or by controlling the model complexity with regularized learning [3] or a Pareto-based multi-objective learning method [53].

In the following, we discuss the use of multiple surrogates in evolutionary optimization by dividing the methods into homogeneous and heterogeneous multiple surrogates. For homogeneous multiple surrogates, the fidelity of the surrogates is not explicitly controlled, even if different types of surrogates are used. On the contrary, heterogeneous surrogates vary in their fidelity due to an explicit control of model complexity or training data.

3.2.1. Homogeneous multiple surrogates

The use of ensembles for fitness approximation was suggested in [34], where it was shown that neural network ensembles can improve the performance of surrogate-assisted evolutionary optimization in two respects. First, ensembles can improve the quality of fitness prediction. Second, the variance of the predicted fitness of the ensemble members can help identify large prediction errors so that false optima can be avoided.

The benefit of using multiple surrogates has also been shown empirically in many papers [54,55,52]. In this category of research, no explicit control of the fidelity of the multiple surrogates is employed. For example, in [56,57] multiple surrogates, such as Kriging, polynomials, radial-basis-function networks (RBFNs), and a weighted average ensemble, are used to demonstrate the improved robustness of optimization. Polynomial and RBFN surrogates are employed for multi-objective optimization, and it was shown that each of the models performs better in different regions of the Pareto front.

Multiple surrogates have also been used in evolutionary multi-objective optimization [58]. In that work, a co-evolutionary genetic algorithm for multi-objective optimization based on surrogates was introduced. After some fixed search intervals, the surrogates that approximate different objectives are exchanged and shared among multiple sub-populations of genetic algorithms. Spatially distributed multiple surrogates have been used for fitness approximation in multi-objective optimization [59].

3.2.2. Heterogeneous multiple surrogates

As illustrated in Fig. 1, in many real-world optimization problems, various problem approximation techniques can be employed. For example, in aerodynamic optimization, 3-D or 2-D numerical simulations can be used for estimating the quality of the designs in addition to wind tunnel experiments. In more extreme situations, incomplete simulations can also be used, where a numerical simulation is stopped early to reduce computation time. Data from all these different processes can be applied for building up surrogates.

The motivation for explicitly controlling the fidelity of the surrogates can also be justified by taking the computational cost of constructing surrogates into account. To reduce the cost of building up surrogates, it makes good sense to use surrogates of a lower fidelity, which can be obtained with less cost, in the early stage of evolutionary optimization. Another, more tricky motivation is to take advantage of the approximation errors introduced by surrogates, hoping to smooth a rugged fitness landscape, to increase the diversity of the population, or simply to use data from incomplete simulations.

Early work that uses heterogeneous multiple surrogates was reported in [60,61], where a population-based model management strategy is used. In both papers, three sub-populations are used,
each using a surrogate for fitness evaluations. In [60], individuals from a sub-population that uses a surrogate of lower fidelity are allowed to migrate to the sub-population that uses a surrogate of higher fidelity. The method presented in [61] is a minor variant of [60], where migration is allowed between all sub-populations.

One approach to reducing the computational cost of constructing surrogates is to use coarse surrogates (of lower fidelity) in the early stage of the optimization and to increase the quality of the surrogate gradually as the search proceeds [6]. This idea of using coarse-to-fine surrogates has been introduced into surrogate-assisted evolutionary search in [42,62], where surrogates are used for evolutionary multi-objective optimization.

A more subtle way to control the fidelity of surrogates is to use surrogates of a sufficiently good fidelity based on a correlation-based measure [63]. This fidelity control strategy was applied to a memetic algorithm in which the local search is based on surrogates of a changing fidelity. The proposed method was evaluated empirically on an aerodynamic airfoil design problem and demonstrated that the use of a dynamic fidelity is able to improve the search speed.

The idea of taking advantage of the approximation errors introduced by surrogates was further exploited in [64]. In that work, two types of surrogates are used in the local search of an evolutionary multi-objective optimization algorithm: one for obtaining a reliable local prediction and the other for a higher degree of diversity. Empirical results show that an evolutionary search based on heterogeneous multiple models can considerably improve the search performance compared to surrogate-assisted evolutionary algorithms that use a single surrogate or homogeneous multiple surrogates. Interestingly enough, the proposed algorithm also outperforms its counterpart that uses an artificial perfect surrogate. Detailed analysis of the search processes confirmed the hypothesis that controlled approximation errors introduced by surrogates can speed up the search process in both single- and multi-objective optimization.

3.3. Which model management strategy?

As we discussed above, surrogates can be used in population initialization, crossover, mutation and pre-selection to pre-screen candidate solutions. The advantage of these relatively conservative approaches to using surrogates is that they are less likely to mislead the search process; one concern might be that they may cause premature convergence. It is also less risky if a surrogate is used in the local search of memetic algorithms. The common feature of these approaches is that all individuals have been re-evaluated using the original fitness function before selection.

In addition, among the model management strategies, the

4.1. Surrogates in interactive evolutionary computation

In interactive evolutionary computation, the fitness value of each individual is evaluated subjectively by a human user [68]. Human fitness evaluations are necessary where no fitness function is available, for instance, when evolutionary algorithms are used for aesthetic product design or art design. One main challenge of interactive evolutionary computation is the issue of human fatigue. To address this problem to a certain degree, surrogates can be used to replace human evaluations in part. The main idea is to use a machine learning model to predict the fitness value the human may assign to a design based on historical data [69–71].

4.2. Surrogate-assisted evolution for solving dynamic optimization

Evolutionary optimization of dynamic optimization problems has become a popular research topic recently [12]. The primary goal is to develop an evolutionary search strategy that can follow a moving optimum or a moving Pareto front. To this end, a certain degree of diversity in the population should be maintained, or a memory mechanism must be embedded in the evolutionary algorithm. Memory mechanisms include sub-populations, archives of optimal solutions found so far, or multiploidy in the genetic representation.

In addition to memory- and diversity-based strategies, anticipation and prediction of the change in the fitness function can be helpful in solving dynamic problems more efficiently. In such strategies, a surrogate can be helpful in learning the changing fitness function [72–74].

4.3. Surrogates for robust optimization

In evolutionary optimization of real-world problems, one is concerned not only with the performance of the obtained optimal solution, but also with the sensitivity of that performance to small changes in the design variables or in the environment. If an optimal solution is insensitive to such changes, it is known as a robust optimal solution.

To obtain robust optimal solutions using evolutionary algorithms, either implicit averaging or explicit averaging can be used [12], wherein an assumption on the probability distribution of the noise is often made. By contrast, one can predefine the allowed performance decrease and then search for an optimum that has the maximum tolerance of changes in the design variables, which is termed inverse robust optimization [75]. In both explicit-averaging-based and inverse robust optimization, additional fitness evaluations are needed. To enhance the efficiency, some of these
individual-based model management may be more suited for additional fitness evaluations can be done based on a surrogate
steady state evolution, or generational evolution implemented on [76–78].
a single machine. By contrast, population-based and generation-
based model management is better for parallel implementation on 4.4. Surrogates for constrained optimization
heterogeneous machines having different speeds. An optimization
strategy may be desirable when multi-level surrogates having Many optimization problems are subject to constraints. To
different computational complexities are used on machines having judge if a candidate solution is feasible, the constraint functions
different computational powers. need to be frequently evaluated. Therefore, if the evaluations
of constraint functions are time-consuming, it is desirable to
4. Beyond evolutionary optimization of expensive problems replace the constraint functions with computationally efficient
approximate models [79].
In addition to reducing the computation time in evolutionary In some real-world applications, an explicit constraint is not
optimization of expensive problems, surrogates can be useful in available. For example in aerodynamic optimization, some of the
addressing other problems in evolutionary computation, such as candidate designs may result in unstable computational fluid
the use of surrogates for reducing fitness evaluations in search of dynamic (CFD) simulations. In order to reduce the number of
robust optimal solutions [65]. In addition, surrogates have been unnecessary, time-consuming CFD simulations, it is very helpful
found helpful in improving the efficiency of evolutionary algo- to judge whether a solution is feasible (e.g., converges in a CFD
rithms for solving optimization with noisy fitness evaluations [66] simulation) before it is evaluated in a CFD simulation. Surrogates
or for solving multi-modal optimization with a very rugged fit- can be used for this purpose [80,81].
ness landscape [6,67], where the purpose of using a surrogate is An interesting idea of using surrogates in constrained opti-
to smoothen the fitness landscape. mization has been recently reported in [82], where surrogates are
applied to manipulate the shape and size of the feasible region to ease the solution of highly constrained optimization problems. The basic idea is to deliberately enlarge the feasible region by building up a very simple surrogate for each constraint function. As the evolutionary optimization proceeds, the complexity of the surrogates increases gradually so that the approximated feasible region can converge to the real feasible region. An illustration of this basic idea is given in Fig. 6. Simulation results on a set of benchmark problems and a few structural design problems demonstrated that the idea works well. Genetic programming based generation of increasingly complex surrogates has also been reported [83].

Fig. 6. An illustrative example of manipulating the constraints to facilitate evolutionary search [82]. (a) The true feasible region (shaded), which consists of three separate sub-regions. (b) A linear approximation of the original constraints by using two data points, resulting in an enlarged single feasible region. (c) A more accurate approximation of the constraint functions; the resulting feasible region is close to the real one.

5. Real-world applications

Surrogate-assisted evolutionary optimization is more application driven. Thus, the effectiveness of surrogate-assisted evolutionary algorithms needs to be demonstrated in real-world applications. One intensively researched area is surrogate-assisted design optimization, such as turbine blades [9,23,84,85], airfoils [27,86], forging [87], vehicle crash tests [88], multi-processor systems-on-chip design [89] and injection systems [90]. Other applications include drug design [2], protein design [5], hydroinformatics [91] and evolutionary robotics [92]. We must note that not many substantial successful applications of meta-model based evolutionary optimization have been reported, which, however, does not necessarily mean that no such work has been done. Some work carried out in industry has not been published. We also want to note that meta-model based evolutionary optimization has been included in a few commercial design software tools.

6. Future challenges

Surrogate-assisted evolutionary computation has achieved considerable advances over the past decade, not only in algorithm design, but also in real-world applications. Nevertheless, many challenges remain to be addressed. In the following, we discuss a few of these challenges and hope that these discussions will trigger more research efforts devoted to approaching these challenges.

6.1. Theoretic work

A wide range of trust-region methods have been shown to converge to the global optimum [93] when a gradient-based method is used to search on the surrogate. Unfortunately, a convergence proof for surrogate-assisted evolutionary algorithms to the global optimum or to a local optimum is not straightforward, as proving convergence of any stochastic search algorithm to a global optimum is non-trivial. Meanwhile, approximation errors introduced by surrogates can usually be described neither by a Gaussian nor by a uniform distribution, which makes a quantitative analysis of the search dynamics on a surrogate very difficult, if not impossible.

If we go one step back, we may raise the question of whether we can guarantee that a surrogate-assisted evolutionary algorithm converges faster than its counterpart without using a surrogate, given the same number of expensive fitness evaluations. Again, no theoretical work has been reported to show this.

6.2. Multi-level, multi-fidelity heterogeneous surrogates

Use of multi-level, multi-fidelity surrogates has already been suggested in [12]. The heterogeneity can include the model type of the surrogates and the degree of fidelity (modeling accuracy). On the one hand, various types of surrogates can be used, ranging from deterministic linear models (e.g., linear interpolation) to nonlinear models (e.g., feedforward neural networks, support vector machines), stochastic models such as Gaussian processes (Kriging), and dynamic models such as recurrent neural networks. On the other hand, multi-fidelity models can be built either from data of different problem approximations (e.g., 2D Navier–Stokes simulations and 3D Navier–Stokes simulations) or experiments, or from different degrees of incomplete simulations, or by deliberately controlling the complexity of the models.

When heterogeneous surrogates are used, the computational times for fitness evaluations using different models can be very different. To further improve the computational efficiency of the whole evolutionary process, non-generational evolutionary algorithms with a grid-based or asynchronous computing structure may be preferred [86,56].

6.3. Surrogate-assisted combinatorial optimization

Surrogate-assisted evolutionary algorithms have been studied extensively for continuous optimization. In real-world applications, however, there are also many computationally intensive combinatorial optimization problems, such as job shop scheduling
and wireless network or mobile sensor network optimization. In such cases, discrete modeling techniques must be employed, e.g., binary neural networks [94]. In [95], an RBF neural network is applied to assist a mixed integer evolution strategy for intravascular ultrasound image analysis. Recently, an integrated Kriging model has been used for mobile network optimization [96].

6.4. Surrogate-assisted dynamic optimization

If an expensive optimization problem is time-varying, evolutionary algorithms for solving dynamic optimization problems must be adopted to track the moving optima or the moving Pareto front [97]. Practically, an optimal solution that is robust over time may be preferred [74]. In either case, the surrogate must be updated online. Therefore, it may be of interest to introduce incremental learning techniques [98] for efficient online learning when the objective functions change over time.

6.5. Rigorous benchmarking and test problems

Although many surrogate-assisted evolutionary algorithms have been proposed and demonstrated to be more efficient than their counterparts without using a surrogate, no rigorous comparative studies on surrogate-assisted evolutionary algorithms have been reported. This may be attributed to two reasons. First, no widely accepted performance index for benchmarking surrogate-assisted evolutionary algorithms has been suggested. Second, no benchmark problems dedicated to surrogate-assisted evolutionary algorithms have been proposed. Most work on surrogate-assisted evolutionary algorithms uses either standard test functions such as the Ackley function [99] or specific real-world applications for empirical evaluations. However, the design of test problems relevant to real-world applications is non-trivial. Ideally, such test problems should reflect the major difficulties in real-world applications yet remain tractable for intensive empirical comparisons. As indicated in [1], expensive optimization problems such as aerodynamic design optimization not only involve highly time-consuming fitness evaluations; the fitness landscape is often multi-modal as well. In addition, the CFD simulations may be unstable, resulting in many isolated infeasible solutions. Finally, the design space is of very high dimension, and the geometry representation may be critical for the efficiency of the whole evolutionary design optimization.

7. Summary

Surrogate-assisted evolutionary algorithms are motivated by real-world applications. As evolutionary algorithms are increasingly applied to solving complex problems, research interest in surrogate-assisted evolutionary algorithms has considerably increased in recent years. This paper provides a brief overview of recent advances in this research area and suggests a few challenging issues that remain to be resolved in the future. We expect that successful resolution of these challenges depends heavily on progress in both optimization and learning, and on new computing techniques such as grid computing [100] and cloud computing [101], with which more computing resources will be made available to common users via computer networks.

References

[1] Y. Jin, B. Sendhoff, A systems approach to evolutionary multi-objective structural optimization and beyond, IEEE Computational Intelligence Magazine 4 (3) (2009) 62–76.
[2] D. Douguet, e-LEA3D: A computational-aided drug design web server, Nucleic Acids Research 38 (2010) w615–w621.
[3] Y. Jin, M. Olhofer, B. Sendhoff, On evolutionary optimization with approximate fitness functions, in: Genetic and Evolutionary Computation Conference, 2000, pp. 786–793.
[4] J.J. Grefenstette, J.M. Fitzpatrick, Genetic search with approximate fitness evaluations, in: Proceedings of the International Conference on Genetic Algorithms and Their Applications, 1985, pp. 112–120.
[5] G. Schneider, J. Schuchhardt, P. Wrede, Artificial neural networks and simulated molecular evolution are potential tools for sequence-oriented protein design, CABIOS 10 (6) (1994) 635–645.
[6] D. Yang, S.J. Flockton, Evolutionary algorithms with a coarse-to-fine function smoothing, in: IEEE International Conference on Evolutionary Computation, 1995, pp. 657–662.
[7] A. Ratle, Accelerating the convergence of evolutionary algorithms by fitness landscape approximation, in: Parallel Problem Solving from Nature, 1998, pp. 87–96.
[8] L. Bull, On model-based evolutionary computation, Soft Computing 3 (1999) 76–82.
[9] S. Pierret, Turbomachinery blade design using a Navier–Stokes solver and artificial neural network, ASME Journal of Turbomachinery 121 (3) (1999) 326–332.
[10] Y. Jin, S.J. Louis, K.M. Rasheed, Approximation and learning in evolutionary computation, GECCO Workshop, July 2002.
[11] Y. Jin, B. Sendhoff, Fitness approximation in evolutionary computation — a survey, in: Genetic and Evolutionary Computation Conference, 2002, pp. 1105–1112.
[12] Y. Jin, A comprehensive survey of fitness approximation in evolutionary computation, Soft Computing 9 (1) (2005) 3–12.
[13] Y. Tenne, C.-K. Goh (Eds.), Computational Intelligence in Expensive Optimization Problems, Springer, 2009.
[14] M.M. Davarynejad, C.W. Ahn, J. Vrancken, J. van den Berg, C.A. Coello Coello, Evolutionary hidden information detection by granulation-based fitness approximation, Applied Soft Computing 10 (3) (2010) 719–729.
[15] H.S. Kim, S.B. Cho, An efficient genetic algorithm with less fitness evaluation by clustering, in: Congress on Evolutionary Computation, 2001, pp. 887–894.
[16] M. Salami, T. Hendtlass, The fast evaluation strategy for evolvable hardware, Genetic Programming and Evolvable Machines 6 (2) (2005) 139–162.
[17] A. Forrester, A. Sobester, A. Keane, Engineering Design via Surrogate Modelling: A Practical Guide, John Wiley & Sons, 2008.
[18] J.P.C. Kleijnen, Design and Analysis of Simulation Experiments, in: International Series in Operations Research & Management Science, Springer, 2009.
[19] K. Anderson, Y. Hsu, Genetic crossover strategy using an approximation concept, in: IEEE Congress on Evolutionary Computation, 1999, pp. 527–533.
[20] K. Abboud, M. Schoenauer, Surrogate deterministic mutation: Preliminary results, in: Artificial Evolution, in: LNCS, 2002, pp. 919–954.
[21] K. Rasheed, H. Hirsh, Informed operators: speeding up genetic-algorithm-based design optimization using reduced models, in: Genetic and Evolutionary Computation Conference, Morgan Kaufmann, 2000, pp. 628–635.
[22] I. Loshchilov, M. Schoenauer, M. Sebag, A mono surrogate for multiobjective optimization, in: Genetic and Evolutionary Computation Conference, 2010, pp. 471–478.
[23] Y. Jin, M. Olhofer, B. Sendhoff, A framework for evolutionary optimization with approximate fitness functions, IEEE Transactions on Evolutionary Computation 6 (5) (2002) 481–494.
[24] D. Lim, Y.-S. Ong, Y. Jin, B. Sendhoff, Trusted evolutionary algorithms, in: IEEE Congress on Evolutionary Computation, 2006, pp. 456–463.
[25] J. Branke, C. Schmidt, Fast convergence by means of fitness estimation, Soft Computing 9 (1) (2005) 13–20.
[26] M. Emmerich, A. Giotis, M. Oezdemir, T. Baeck, K. Giannakoglou, Metamodel-assisted evolution strategies, in: Parallel Problem Solving from Nature, in: LNCS, Springer, 2002, pp. 371–380.
[27] Y.S. Ong, P.B. Nair, A.J. Keane, Evolutionary optimization of computationally expensive problems via surrogate modeling, AIAA Journal 41 (4) (2003) 687–696.
[28] Z. Zhou, Y.-S. Ong, M.-H. Lim, B.-S. Lee, Memetic algorithm using multi-surrogates for computationally expensive optimization problems, Soft Computing 11 (10) (2007) 957–971.
[29] S.Z. Martinez, C.A. Coello Coello, A memetic algorithm with non gradient-based local search assisted by a meta-model, in: Parallel Problem Solving from Nature, in: LNCS, Springer, 2010, pp. 576–585.
[30] J.-F.M. Barthelemy, Approximation concepts for optimum structural design — A review, Structural Optimization 5 (1993) 129–144.
[31] M. Celis, J.E. Dennis, R.A. Tapia, A trust region strategy for nonlinear equality constrained optimization, in: P. Boggs, R. Byrd, R. Schnabel (Eds.), Numerical Optimization 1984, SIAM, Philadelphia, 1985, pp. 71–82.
[32] H.K. Singh, T. Ray, W. Smith, Surrogate assisted simulated annealing (SASA) for constrained multi-objective optimization, in: IEEE Congress on Evolutionary Computation, 2010, pp. 1–8.
[33] H.S. Bernardino, H.J.C. Barbosa, L.G. Fonseca, A faster clonal selection algorithm for expensive optimization problems, in: Artificial Immune Systems, Springer, 2010, pp. 130–143.
[34] Y. Jin, B. Sendhoff, Reducing fitness evaluations using clustering techniques and neural network ensembles, in: Genetic and Evolutionary Computation Conference, 2004, pp. 688–699.
[35] F. Mota, F. Gomide, Fuzzy clustering in fitness estimation models for genetic algorithms and applications, in: IEEE International Conference on Fuzzy Systems, 2006, pp. 1388–1395.
[36] L. Graening, Y. Jin, B. Sendhoff, Efficient evolutionary optimization using individual-based evolution control and neural networks: A comparative study, in: European Symposium on Artificial Neural Networks, 2005, pp. 273–278.
[37] L. Graening, Y. Jin, B. Sendhoff, Individual-based management of meta-models for evolutionary optimization with applications to three-dimensional blade optimization, Springer, 2007, pp. 225–250 (Chapter 6).
[38] S. Tong, Active learning: theory and applications, Ph.D. Thesis, Department of Computer Science, Stanford University, 2001.
[39] D.J.C. MacKay, Introduction to Gaussian processes, in: C.M. Bishop (Ed.), Neural Networks and Machine Learning, Springer, 1998, pp. 133–165.
[40] J. Sacks, W.J. Welch, T.J. Mitchell, H.P. Wynn, Design and analysis of computer experiments, Statistical Science 4 (1989) 409–435.
[41] D. Buche, N.N. Schraudolph, P. Koumoutsakos, Accelerating evolutionary algorithms with Gaussian process fitness function models, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews 35 (2) (2005) 183–194.
[42] M. Emmerich, K.C. Giannakoglou, B. Naujoks, Single- and multiobjective evolutionary optimization assisted by Gaussian random field metamodels, IEEE Transactions on Evolutionary Computation 10 (4) (2006) 421–439.
[43] J. Knowles, ParEGO: a hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems, IEEE Transactions on Evolutionary Computation 10 (1) (2006) 50–66.
[44] Q. Zhang, W. Liu, E. Tsang, B. Virginas, Expensive multiobjective optimization by MOEA/D with Gaussian process model, IEEE Transactions on Evolutionary Computation 14 (3) (2010) 456–474.
[45] Y. Jin, M. Huesken, B. Sendhoff, Quality measures for approximate models in evolutionary computation, in: Proceedings of GECCO Workshops: Workshop on Adaptation, Learning and Approximation in Evolutionary Computation, 2003, pp. 170–174.
[46] M. Huesken, Y. Jin, B. Sendhoff, Structure optimization of neural networks for aerodynamic optimization, Soft Computing 9 (1) (2005) 21–28.
[47] H. Ulmer, F. Streichert, A. Zell, Evolution strategies with controlled model assistance, in: Congress on Evolutionary Computation, 2004, pp. 1569–1576.
[48] M. Schmidt, H. Lipson, Coevolution of fitness predictors, IEEE Transactions on Evolutionary Computation 12 (6) (2008) 736–749.
[49] Y. Tenne, K. Izui, S. Nishiwaki, Dimensionality-reduction frameworks for computationally expensive problems, in: IEEE Congress on Evolutionary Computation, 2010, pp. 1–8.
[50] Z. Zhou, Y.S. Ong, P.B. Nair, A.J. Keane, K.Y. Lum, Combining global and local surrogate models to accelerate evolutionary optimization, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews 37 (1) (2007) 66–76.
[51] Y. Cao, Y. Jin, M. Kowalczykiewicz, B. Sendhoff, Prediction of convergence dynamics of design performance using differential recurrent neural networks, in: International Joint Conference on Neural Networks, 2008, pp. 529–534.
[52] Y. Tenne, S.W. Armfield, A framework for memetic optimization using variable global and local surrogate models, Soft Computing 13 (2009) 781–793.
[53] Y. Jin, B. Sendhoff, Pareto-based multi-objective machine learning: an overview and case studies, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews 38 (3) (2008) 397–415.
[54] T. Goel, R.T. Haftka, W. Shyy, N.V. Queipo, Ensemble of surrogates, Structural and Multidisciplinary Optimization 33 (3) (2007) 199–216.
[55] E. Sanchez, S. Pintos, N.V. Queipo, Toward an optimal ensemble of kernel-based approximations with engineering applications, Structural and Multidisciplinary Optimization 36 (3) (2008) 247–261.
[56] D. Lim, Y.S. Ong, Y. Jin, B. Sendhoff, A study on metamodeling techniques, ensembles, and multi-surrogates in evolutionary computation, in: Genetic and Evolutionary Computation Conference, 2007, pp. 1288–1295.
[57] A. Samad, K.-Y. Kim, Multiple surrogate modeling for axial compressor blade shape optimization, Journal of Propulsion and Power 24 (2) (2008) 302–310.
[58] D. Chafekar, L. Shi, K. Rasheed, J. Xuan, Constrained multi-objective GA optimization using reduced models, IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews 35 (2) (2005) 261–265.
[59] A. Isaacs, T. Ray, W. Smith, An evolutionary algorithm with spatially distributed surrogates for multiobjective optimization, in: The 3rd Australian Conference on Progress in Artificial Life, 2007, pp. 257–268.
[60] D. Eby, R. Averill, W. Punch, E. Goodman, Evaluation of injection island model GA performance on flywheel design optimization, in: Third Conference on Adaptive Computing in Design and Manufacturing, 1998, pp. 121–136.
[61] M. Sefrioui, J. Periaux, A hierarchical genetic algorithm using multiple models for optimization, in: Parallel Problem Solving from Nature, 2000, pp. 879–888.
[62] P.K.S. Nain, K. Deb, Computationally effective search and optimization procedure using coarse to fine approximation, in: IEEE Congress on Evolutionary Computation, 2003, pp. 2081–2088.
[63] D. Lim, Y.S. Ong, Y. Jin, B. Sendhoff, Evolutionary optimization with dynamic fidelity computational models, in: International Conference on Intelligent Computing, in: LNCS, Springer, 2008, pp. 235–242.
[64] D. Lim, Y. Jin, Y.S. Ong, B. Sendhoff, Generalizing surrogate-assisted evolutionary computation, IEEE Transactions on Evolutionary Computation 14 (3) (2010) 329–355.
[65] J. Branke, Creating robust solutions by means of evolutionary algorithms, in: Parallel Problem Solving from Nature, in: LNCS, Springer, 1998, pp. 119–128.
[66] M. Bhattacharya, Reduced computation for evolutionary optimization in noisy environment, in: Genetic and Evolutionary Computation Conference, 2008, pp. 2117–2122.
[67] K.-H. Liang, X. Yao, C. Newton, Evolutionary search of approximated n-dimensional landscapes, International Journal of Knowledge-Based Intelligent Engineering Systems 4 (3) (2000) 172–183.
[68] H. Takagi, Interactive evolutionary computation: fusion of the capabilities of EC optimization and human evaluation, Proceedings of the IEEE 89 (2002) 1275–1296.
[69] J.A. Biles, P.G. Anderson, L.W. Loggi, Neural network fitness functions for a musical IGA, in: The International ICSC Symposium on Intelligent Industrial Automation and Soft Computing, 1996.
[70] R.R. Kamalian, A.M. Agogino, H. Takagi, Use of interactive evolutionary computation with simplified modeling for computationally expensive layout design optimization, in: IEEE Congress on Evolutionary Computation, 2007, pp. 4124–4129.
[71] X. Sun, D. Gong, S. Li, Classification and regression-based surrogate model-assisted interactive genetic algorithm with individual fuzzy fitness, in: Genetic and Evolutionary Computation Conference, 2009, pp. 907–914.
[72] A. Zhou, Y. Jin, Q. Zhang, B. Sendhoff, E. Tsang, Prediction-based population re-initialization for evolutionary dynamic multi-objective optimization, in: The Fourth International Conference on Evolutionary Multi-Criterion Optimization, in: LNCS, Springer, 2007, pp. 832–846.
[73] I. Hatzakis, D. Wallace, Dynamic multi-objective optimization with evolutionary algorithms: a forward-looking approach, in: Genetic and Evolutionary Computation Conference, 2006, pp. 1201–1208.
[74] X. Yu, Y. Jin, K. Tang, X. Yao, Robust optimization over time — a new perspective on dynamic optimization problems, in: Congress on Evolutionary Computation, 2010, pp. 3998–4003.
[75] D. Lim, Y.-S. Ong, Y. Jin, B. Sendhoff, B.S. Lee, Inverse multi-objective robust evolutionary optimization, Genetic Programming and Evolvable Machines 7 (4) (2006) 383–404.
[76] D. Lim, Y.-S. Ong, M.-H. Lim, Y. Jin, Single/multi-objective inverse robust evolutionary design methodology in the presence of uncertainty, in: S. Yang, Y.S. Ong, Y. Jin (Eds.), Evolutionary Computation in Dynamic and Uncertain Environments, Springer, 2007, pp. 437–456.
[77] Y.-S. Ong, P.B. Nair, K.Y. Lum, Max–min surrogate-assisted evolutionary algorithm for robust design, IEEE Transactions on Evolutionary Computation 10 (4) (2006) 392–404.
[78] I. Paenke, J. Branke, Y. Jin, Efficient search for robust solutions by means of evolutionary algorithms and fitness approximation, IEEE Transactions on Evolutionary Computation 10 (4) (2006) 405–420.
[79] T.P. Runarsson, Constrained evolutionary optimization by approximate ranking and surrogate models, in: Parallel Problem Solving from Nature, in: LNCS, Springer, 2004, pp. 401–410.
[80] S.D. Handoko, C.K. Kwoh, Y.S. Ong, Feasibility structure modeling: an effective chaperon for constrained memetic algorithms, IEEE Transactions on Evolutionary Computation 14 (5) (2010) 740–758.
[81] Y. Tenne, K. Izui, S. Nishiwaki, Handling undefined vectors in expensive optimization problems, in: Applications of Evolutionary Computation, in: LNCS, Springer, 2010, pp. 582–591.
[82] Y. Jin, S. Oh, M. Jeon, Incremental approximation of nonlinear constraint functions for evolutionary constrained optimization, in: IEEE Congress on Evolutionary Computation, 2010, pp. 2966–2973.
[83] S. Oh, Y. Jin, M. Jeon, Approximate models for constraint functions in evolutionary constrained optimization, International Journal of Innovative Computing, Information and Control 7 (10) (2011).
[84] M.K. Karakasis, K.C. Giannakoglou, Metamodel-assisted multi-objective evolutionary optimization, in: Evolutionary and Deterministic Methods for Design, Optimization and Control with Applications to Industrial and Societal Problems, EUROGEN, 2005.
[85] A. Shahrokhi, A. Jahangirian, A surrogate assisted evolutionary optimization method with application to the transonic airfoil design, Engineering Optimization 42 (6) (2010) 497–515.
[86] V.G. Asouti, I.C. Kampolis, K.C. Giannakoglou, A grid-enabled asynchronous metamodel-assisted evolutionary algorithm for aerodynamic optimization, Genetic Programming and Evolvable Machines 10 (4) (2009) 373–389.
[87] M.H.A. Bonte, L. Fourment, T.-T. Do, A.H. van den Boogaard, J. Huetink, Optimization of forging processes using finite element simulations, Structural and Multidisciplinary Optimization 42 (2010) 797–810.
[88] K. Hamza, K. Saitou, Crashworthiness design using meta-models for approximating the response of structural members, in: Cairo University Conference on Mechanical Design and Production, 2004.
[89] G. Mariani, G. Palermo, C. Silvano, V. Zaccaria, Meta-model assisted optimization for design space exploration of multi-processor systems-on-chip, in: 12th Euromicro Conference on Digital System Design, Architectures, Methods and Tools, 2009, pp. 383–389.
[90] G. Dellino, P. Lino, C. Meloni, A. Rizzo, Kriging metamodel management in the design optimization of a CNG injection system, Mathematics and Computers in Simulation 79 (8) (2009) 2345–2360.
[91] S.-T. Khu, D. Savic, Z. Kapelan, Evolutionary-based meta-modelling: the relevance of using approximate models in hydroinformatics, in: Hydroinformatics in Practice: Computational Intelligence and Technological Developments in Water Applications, Springer, 2008.
[92] I. Dahm, J. Ziegler, Using artificial neural networks to construct a meta-model for the evolution of gait patterns, in: Proceedings of the 5th International Conference on Climbing and Walking Robots, 2002.
[93] M.J. Powell, On the convergence of a wide range of trust region methods for unconstrained optimization, IMA Journal of Numerical Analysis 30 (1) (2010) 289–301.
[94] V.J. Hodge, K.J. Lees, J.L. Austin, A high performance k-NN approach using binary neural networks, Neural Networks 17 (3) (2004) 441–458.
[95] R. Li, M.T.M. Emmerich, J. Eggermont, E.G.P. Bovenkamp, T. Bäck, J. Dijkstra, J.H.C. Reiber, Metamodel-assisted mixed integer evolution strategies and their application to intravascular ultrasound image analysis, in: IEEE Congress on Evolutionary Computation, 2008, pp. 2764–2771.
[96] C. Ling, J.-H. Liang, B. Wu, Z.-W. Hu, A metamodel-based optimization method for mobile ad hoc networks, in: 2010 International Conference on Computer Application and System Modeling, ICCASM, 2010, pp. 412–416.
[97] Y. Jin, J. Branke, Evolutionary optimization in uncertain environments — a survey, IEEE Transactions on Evolutionary Computation 9 (3) (2005) 303–317.
[98] R. Polikar, L. Upda, S.S. Upda, V. Honavar, Learn++: An incremental learning algorithm for supervised neural networks, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews 31 (4) (2001) 479–508.
[99] D.H. Ackley, A Connectionist Machine for Genetic Hillclimbing, Kluwer Academic Publishers, Norwell, MA, 1987.
[100] W. Hendrickx, D. Gorissen, T. Dhaene, Grid enabled sequential design and adaptive metamodeling, in: Proceedings of the 2006 Winter Simulation Conference, 2006.
[101] M. Miller, Cloud Computing: Web-Based Applications That Change the Way You Work and Collaborate Online, Que Publishing, 2008.