Fig. 4. Examples of surrogates that have a large approximation error but are adequately good for evolutionary search. Solid curves denote the original function and dashed curves their approximation.

approximation accuracy of the surrogate, similar to the idea of active learning [38].

The estimation of the approximation error can be achieved with different methods. In [25], the degree of uncertainty is roughly set to be inversely proportional to the average distance to the closest data samples used for constructing the surrogate. Alternatively, an ensemble can be used for estimating the variance of the individual estimates given by an ensemble of surrogates. The most often used surrogate model for estimating model uncertainties is the Gaussian process [39], also known as the Kriging model [40]. Unlike deterministic models, Gaussian processes provide an estimate of the fitness (mean) together with an estimate of the uncertainty (variance), which is a statistically sound bound on the uncertainty in fitness estimation. Due to this property, Gaussian processes have increasingly been employed as surrogates in evolutionary single- and multi-objective optimization [41–44]. Note, however, that the computational cost of constructing Gaussian processes can be very high when the number of samples used is large, and that online learning of Gaussian processes is non-trivial when new samples become available.

3.1.2. Metrics for evaluating surrogates and adaptation

Not much attention has been paid to adapting the frequency of using the surrogates. In [45], the model quality is estimated by calculating the average approximation error after re-evaluation, which is then used to adapt the frequency of using the surrogate in a generation-based model management method. Based on the empirical observation that large approximation errors do not necessarily mislead the evolutionary search, see e.g. Fig. 4, a few metrics other than the approximation error have been proposed in [46,45] for evaluating the quality of surrogates. In the following, we present a few of these performance measures for surrogates in greater detail.

The most common measure for model quality or model fidelity is the mean squared error between the individuals' real fitness values and the fitness predicted by the meta-model. From the evolutionary perspective, however, selecting the right individuals for the next generation is the main concern. For instance, in Fig. 4 the quality of the surrogate is poor in terms of approximation accuracy, yet an evolutionary algorithm searching on the surrogate alone will nevertheless find the right optimum. Consider (µ, λ)-selection with λ ≥ 2µ, which is of particular relevance in evolutionary optimization of complex real-world problems. The number of individuals that have been selected correctly using the surrogate can be quantified by

    ρ(sel.) = (ξ − ⟨ξ⟩) / (µ − ⟨ξ⟩),    (1)

where ξ (0 ≤ ξ ≤ µ) is the number of correctly selected individuals, i.e., the number of individuals that would also have been selected if the real fitness function had been used for fitness evaluation, and the expectation

    ⟨ξ⟩ = µ²/λ    (2)

of ξ in the case of random selection is used as a normalization in (1). It can be seen that if all µ parent individuals are selected correctly, the measure reaches its maximum of ρ(sel.) = 1, and that negative values indicate that the selection based on the surrogate is worse than a random selection.

The measure ρ(sel.) only evaluates the absolute number of correctly selected individuals. If ρ(sel.) < 1, the measure does not indicate whether the (µ + 1)-th or the worst offspring individual has been selected, which may have a significant influence on the evolutionary process. Therefore, the measure ρ(sel.) can be extended to include the rank of the selected individuals, calculated based on the real fitness function. A surrogate is assumed to be good if the rank of the individuals selected based on the model is above average according to the rank based on the real fitness function.

The definition of the extended measure ρ(∼sel.) is as follows: the surrogate achieves a grade of λ − m if the m-th best individual based on the real fitness function is selected. Thus, the quality of the surrogate can be indicated by summing up the grades of the selected individuals, which is denoted by π. Obviously, π reaches its maximum if all µ individuals are selected correctly:

    π(max.) = Σ_{m=1}^{µ} (λ − m) = µ(λ − (µ + 1)/2).    (3)

Similar to (1), the measure ρ(∼sel.) is defined by transforming π linearly, using the maximum π(max.) as well as the expectation ⟨π⟩ = µλ/2 for the case of a purely random selection:

    ρ(∼sel.) = (π − ⟨π⟩) / (π(max.) − ⟨π⟩).    (4)

Besides these two problem-dependent measures for evaluating the quality of the surrogate, two established measures – the rank correlation and the (continuous) correlation – partially fit the requirements formulated above. The rank correlation,

    ρ(rank) = 1 − 6 Σ_{l=1}^{λ} d_l² / (λ(λ² − 1)),    (5)

is a measure of the monotonic relation between the ranks of two variables. In our case, d_l is the difference between the ranks of the l-th offspring individual based on the original fitness function and on the approximate model. The range of ρ(rank) is the interval [−1, 1]. The higher the value of ρ(rank), the stronger the monotonic relation with a positive slope between the ranks of the two variables. In contrast to ρ(∼sel.), the rank correlation takes into account not only the ranking of the selected individuals but the ranks of all individuals.

A slightly different quality measure can be defined by calculating the (continuous) correlation between the surrogate and the original fitness function.

Using the selection-based criterion [45] for evaluating surrogates, an adaptation scheme has been suggested for adapting the
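The selection-based measures ρ(sel.) and ρ(∼sel.) and the rank correlation ρ(rank) are straightforward to compute from the real and predicted fitness of the λ offspring. The following is a minimal sketch, assuming minimization and no ties in the fitness values; the function name and signature are illustrative, not taken from the cited works:

```python
import numpy as np

def selection_measures(f_true, f_model, mu):
    """Surrogate quality under (mu, lambda)-selection, minimization assumed.

    f_true, f_model: fitness of the lambda offspring under the real
    function and under the surrogate.  Returns (rho_sel, rho_tilde_sel,
    rho_rank) following Eqs. (1)-(5).
    """
    lam = len(f_true)
    true_best = set(np.argsort(f_true)[:mu])   # individuals that should be selected
    model_best = np.argsort(f_model)[:mu]      # individuals the surrogate selects

    # (1)-(2): number of correctly selected individuals, normalized by the
    # expectation mu^2 / lambda under purely random selection
    xi = len(true_best.intersection(model_best))
    exp_xi = mu**2 / lam
    rho_sel = (xi - exp_xi) / (mu - exp_xi)

    # (3)-(4): the m-th best (1-based) individual under the real fitness
    # contributes a grade of lambda - m when it is selected
    true_rank = np.empty(lam, dtype=int)
    true_rank[np.argsort(f_true)] = np.arange(1, lam + 1)
    pi = sum(lam - true_rank[i] for i in model_best)
    pi_max = mu * (lam - (mu + 1) / 2)
    exp_pi = mu * lam / 2
    rho_tilde_sel = (pi - exp_pi) / (pi_max - exp_pi)

    # (5): Spearman rank correlation over all lambda offspring
    model_rank = np.empty(lam, dtype=int)
    model_rank[np.argsort(f_model)] = np.arange(1, lam + 1)
    d = true_rank - model_rank
    rho_rank = 1 - 6 * np.sum(d**2) / (lam * (lam**2 - 1))

    return rho_sel, rho_tilde_sel, rho_rank
```

For a perfect surrogate all three measures equal 1; for a surrogate that exactly inverts the ranking, ρ(rank) = −1 and ρ(sel.) is negative.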
Y. Jin / Swarm and Evolutionary Computation 1 (2011) 61–70 65
each using a surrogate for fitness evaluations. In [60], individuals from a sub-population that uses a surrogate of lower fidelity are allowed to migrate to the sub-population that uses a surrogate of higher fidelity. The method presented in [61] is a minor variant of [60], where migration is allowed between all sub-populations.

One approach to reducing the computational cost of constructing surrogates is to use coarse surrogates (of lower fidelity) in the early stage of the optimization and to increase the quality of the surrogate gradually as the search proceeds [6]. This idea of using coarse-to-fine surrogates has been introduced into surrogate-assisted evolutionary search in [42,62], where surrogates are used for evolutionary multi-objective optimization.

A more subtle way to control the fidelity of surrogates is to use surrogates of a sufficiently good fidelity based on a correlation-based measure [63]. The fidelity control strategy was applied to a memetic algorithm in which the local search is based on surrogates of a changing fidelity. The proposed method was evaluated empirically on an aerodynamic airfoil design problem and demonstrated that the use of a dynamic fidelity is able to improve the search speed.

The idea of taking advantage of approximation errors introduced by surrogates was further exploited in [64]. In that work, two types of surrogates are used in the local search of an evolutionary multi-objective optimization: one for obtaining a reliable local prediction and the other for a higher degree of diversity. Empirical results show that an evolutionary search based on heterogeneous multiple models can considerably improve the search performance compared to surrogate-assisted evolutionary algorithms that use a single surrogate or homogeneous multiple surrogates. Interestingly enough, the proposed algorithm also outperforms its counterpart that uses an artificial perfect surrogate. Detailed analysis of the search processes confirmed the hypothesis that controlled approximation errors introduced by surrogates can speed up the search process in both single- and multi-objective optimization.

3.3. Which model management strategy?

As we discussed above, surrogates can be used in population initialization, crossover, mutation and preselection to pre-screen candidate solutions. The advantage of these relatively conservative approaches to using surrogates is that they are less likely to mislead the search process. One concern might be that they may cause premature convergence. It is also less risky if a surrogate is used in a local search of memetic algorithms. The common feature of these approaches is that all individuals have been re-evaluated using the original fitness function before selection.

In addition, among the model management strategies, individual-based model management may be better suited for steady-state evolution, or for generational evolution implemented on a single machine. By contrast, population-based and generation-based model management are better for parallel implementation on heterogeneous machines having different speeds. Such an optimization strategy may be desirable when multi-level surrogates having different computational complexities are used on machines having different computational powers.

4. Beyond evolutionary optimization of expensive problems

In addition to reducing the computation time in evolutionary optimization of expensive problems, surrogates can be useful in addressing other problems in evolutionary computation, such as reducing fitness evaluations in the search for robust optimal solutions [65]. In addition, surrogates have been found helpful in improving the efficiency of evolutionary algorithms for solving optimization problems with noisy fitness evaluations [66], or for solving multi-modal optimization problems with a very rugged fitness landscape [6,67], where the purpose of using a surrogate is to smooth the fitness landscape.

4.1. Surrogates in interactive evolutionary computation

In interactive evolutionary computation, the fitness value of each individual is evaluated subjectively by a human user [68]. Human fitness evaluations are necessary where no fitness function is available, for instance, when evolutionary algorithms are used for aesthetic product design or art design. One main challenge of interactive evolutionary computation is the issue of human fatigue. To address this problem to a certain degree, surrogates can be used to replace human evaluations in part. The main idea is to use a machine learning model to predict the fitness value the human may assign to a design, based on historical data [69–71].

4.2. Surrogate-assisted evolution for solving dynamic optimization problems

Evolutionary optimization of dynamic optimization problems has become a popular research topic recently [12]. The primary goal is to develop an evolutionary search strategy that can follow a moving optimum or a moving Pareto front. To this end, a certain degree of diversity in the population should be maintained, or a memory mechanism must be embedded in the evolutionary algorithm. Memory mechanisms include sub-populations, archives of optimal solutions found so far, or multiploidy in the genetic representation.

In addition to memory- and diversity-based strategies, anticipation and prediction of the change in the fitness function can be helpful in solving dynamic problems more efficiently. In such strategies, a surrogate can be helpful in learning the changing fitness function [72–74].

4.3. Surrogates for robust optimization

In evolutionary optimization of real-world problems, one is concerned not only with the performance of the obtained optimal solution, but also with the sensitivity of that performance to small changes in the design variables or in the environment. If an optimal solution is insensitive to such changes, it is known as a robust optimal solution.

To obtain robust optimal solutions using evolutionary algorithms, either implicit averaging or explicit averaging can be used [12], wherein an assumption on the probability distribution of the noise is often made. By contrast, one can predefine the allowed performance decrease and then search for an optimum that has the maximum tolerance of changes in the design variables, which is termed inverse robust optimization [75]. In both explicit averaging based and inverse robust optimization, additional fitness evaluations are needed. To enhance efficiency, some of these additional fitness evaluations can be done based on a surrogate [76–78].

4.4. Surrogates for constrained optimization

Many optimization problems are subject to constraints. To judge whether a candidate solution is feasible, the constraint functions need to be evaluated frequently. Therefore, if the evaluations of the constraint functions are time-consuming, it is desirable to replace the constraint functions with computationally efficient approximate models [79].

In some real-world applications, an explicit constraint is not available. For example, in aerodynamic optimization some candidate designs may result in unstable computational fluid dynamics (CFD) simulations. In order to reduce the number of unnecessary, time-consuming CFD simulations, it is very helpful to judge whether a solution is feasible (e.g., converges in a CFD simulation) before it is evaluated in a CFD simulation. Surrogates can be used for this purpose [80,81].

An interesting idea of using surrogates in constrained optimization has been recently reported in [82], where surrogates are
6. Future challenges

and wireless network or mobile sensor network optimization. In such cases, discrete modeling techniques must be employed, e.g., binary neural networks [94]. In [95], an RBF neural network is applied to assist a mixed-integer evolution strategy for intravascular ultrasound image analysis. Recently, an integrated Kriging model has been used for mobile network optimization [96].

6.4. Surrogate-assisted dynamic optimization

If an expensive optimization problem is time-varying, evolutionary algorithms for solving dynamic optimization problems must be adopted to track the moving optima or the moving Pareto front [97]. In practice, an optimal solution that is robust over time may be preferred [74]. In either case, the surrogate must be updated online. Therefore, it may be of interest to introduce incremental learning techniques [98] for efficient online learning when the objective functions change over time.

6.5. Rigorous benchmarking and test problems

Although many surrogate-assisted evolutionary algorithms have been proposed and demonstrated to be more efficient than their counterparts without a surrogate, no rigorous comparative studies on surrogate-assisted evolutionary algorithms have been reported. This may be attributed to two reasons. First, no widely accepted performance index for benchmarking surrogate-assisted evolutionary algorithms has been suggested. Second, no benchmark problems dedicated to surrogate-assisted evolutionary algorithms have been proposed. Most work on surrogate-assisted evolutionary algorithms uses either standard test functions such as the Ackley function [99] or specific real-world applications for empirical evaluation. However, the design of test problems relevant to real-world applications is non-trivial. Ideally, such test problems should reflect the major difficulties in real-world applications yet remain tractable for intensive empirical comparisons. As indicated in [1], expensive optimization problems such as aerodynamic design optimization not only involve highly time-consuming fitness evaluations; the fitness landscape is often multi-modal as well. In addition, the CFD simulations may be unstable, resulting in many isolated infeasible solutions. Finally, the dimension of the design space is very high, and the geometry representation may be critical for the efficiency of the whole evolutionary design optimization.

7. Summary

Surrogate-assisted evolutionary algorithms are motivated by real-world applications. As evolutionary algorithms are increasingly applied to solving complex problems, research interest in surrogate-assisted evolutionary algorithms has grown considerably in recent years. This paper provides a brief overview of recent advances in this research area and suggests a few challenging issues that remain to be resolved in the future. We expect that successful resolution of these challenges depends heavily on progress in both optimization and learning, and on new computing techniques such as grid computing [100] and cloud computing [101], with which more computing resources will be made available to common users via computer networks.

References

[1] Y. Jin, B. Sendhoff, A systems approach to evolutionary multi-objective structural optimization and beyond, IEEE Computational Intelligence Magazine 4 (3) (2009) 62–76.
[2] D. Douguet, e-LEA3D: A computational-aided drug design web server, Nucleic Acids Research 38 (2010) w615–w621.
[3] Y. Jin, M. Olhofer, B. Sendhoff, On evolutionary optimization with approximate fitness functions, in: Genetic and Evolutionary Computation Congress, 2000, pp. 786–793.
[4] J.J. Grefenstette, J.M. Fitzpatrick, Genetic search with approximate fitness evaluations, in: Proceedings of the International Conference on Genetic Algorithms and Their Applications, 1985, pp. 112–120.
[5] G. Schneider, J. Schuchhardt, P. Wrede, Artificial neural networks and simulated molecular evolution are potential tools for sequence-oriented protein design, CABIOS 10 (6) (1994) 635–645.
[6] D. Yang, S.J. Flockton, Evolutionary algorithms with a coarse-to-fine function smoothing, in: IEEE International Conference on Evolutionary Computation, 1995, pp. 657–662.
[7] A. Ratle, Accelerating the convergence of evolutionary algorithms by fitness landscape approximation, in: Parallel Problem Solving from Nature, 1998, pp. 87–96.
[8] L. Bull, On model-based evolutionary computation, Soft Computing 3 (1999) 76–82.
[9] S. Pierret, Turbomachinery blade design using a Navier–Stokes solver and artificial neural network, ASME Journal of Turbomachinery 121 (3) (1999) 326–332.
[10] Y. Jin, S.J. Louis, K.M. Rasheed, Approximation and learning in evolutionary computation, GECCO Workshop, July 2002.
[11] Y. Jin, B. Sendhoff, Fitness approximation in evolutionary computation — a survey, in: Genetic and Evolutionary Computation Conference, 2002, pp. 1105–1112.
[12] Y. Jin, A comprehensive survey of fitness approximation in evolutionary computation, Soft Computing 9 (1) (2005) 3–12.
[13] Y. Tenne, C.-K. Goh (Eds.), Computational Intelligence in Expensive Optimization Problems, Springer, 2009.
[14] M.M. Davarynejad, C.W. Ahn, J. Vrancken, J. van den Berg, C.A. Coello Coello, Evolutionary hidden information detection by granulation-based fitness approximation, Applied Soft Computing 10 (3) (2010) 719–729.
[15] H.S. Kim, S.B. Cho, An efficient genetic algorithms with less fitness evaluation by clustering, in: Congress on Evolutionary Computation, 2001, pp. 887–894.
[16] M. Salami, T. Hendtlass, The fast evaluation strategy for evolvable hardware, Genetic Programming and Evolvable Machines 6 (2) (2005) 139–162.
[17] A. Forrester, A. Sobester, A. Keane, Engineering Design via Surrogate Modelling: A Practical Guide, John Wiley & Sons, 2008.
[18] J.P.C. Kleijnen, Design and analysis of simulation experiments, in: International Series in Operations Research & Management Science, Springer, 2009.
[19] K. Anderson, Y. Hsu, Genetic crossover strategy using an approximation concept, in: IEEE Congress on Evolutionary Computation, 1999, pp. 527–533.
[20] K. Abboud, M. Schoenauer, Surrogate deterministic mutation: Preliminary results, in: Artificial Evolution, in: LNCS, 2002, pp. 919–954.
[21] K. Rasheed, H. Hirsh, Informed operators: speeding up genetic-algorithm-based design optimization using reduced models, in: Genetic and Evolutionary Computation Conference, Morgan Kaufmann, 2000, pp. 628–635.
[22] I. Loshchilov, M. Schoenauer, S. Sebag, A mono surrogate for multiobjective optimization, in: Genetic and Evolutionary Computation Conference, 2010, pp. 471–478.
[23] Y. Jin, M. Olhofer, B. Sendhoff, A framework for evolutionary optimization with approximate fitness functions, IEEE Transactions on Evolutionary Computation 6 (5) (2002) 481–494.
[24] D. Lim, Y.-S. Ong, Y. Jin, B. Sendhoff, Trusted evolutionary algorithms, in: IEEE Congress on Evolutionary Computation, 2006, pp. 456–463.
[25] J. Branke, C. Schmidt, Fast convergence by means of fitness estimation, Soft Computing 9 (1) (2005) 13–20.
[26] M. Emmerich, A. Giotis, M. Uezdenir, T. Baeck, K. Giannakoglou, Metamodel-assisted evolution strategies, in: Parallel Problem Solving from Nature, in: LNCS, Springer, 2002, pp. 371–380.
[27] Y.S. Ong, P.B. Nair, A.J. Keane, Evolutionary optimization of computationally expensive problems via surrogate modeling, AIAA Journal 41 (4) (2003) 687–696.
[28] Z. Zhou, Y.-S. Ong, M.-H. Lim, B.-S. Lee, Memetic algorithm using multi-surrogates for computationally expensive optimization problems, Soft Computing 11 (10) (2007) 957–971.
[29] S.Z. Martinez, C.A. Coello Coello, A memetic algorithm with non gradient-based local search assisted by a meta-model, in: Parallel Problem Solving from Nature, in: LNCS, Springer, 2010, pp. 576–585.
[30] J.-F.M. Barthelemy, Approximation concepts for optimum structural design — A review, Structural Optimization 5 (1993) 129–144.
[31] M. Celis, J.E. Dennis, R.A. Tapia, A trust region strategy for nonlinear equality constrained optimization, in: P. Boggs, R. Byrd, R. Schnabel (Eds.), Numerical Optimization 1984, SIAM, Philadelphia, 1985, pp. 71–82.
[32] H.K. Singh, T. Ray, W. Smith, Surrogate assisted simulated annealing (SASA) for constrained multi-objective optimization, in: IEEE Congress on Evolutionary Computation, 2010, pp. 1–8.
[33] H.S. Bernardino, H.J.C. Barbosa, L.G. Fonseca, A faster clonal selection algorithm for expensive optimization problems, in: Artificial Immune Systems, Springer, 2010, pp. 130–143.
[34] Y. Jin, B. Sendhoff, Reducing fitness evaluations using clustering techniques and neural network ensembles, in: Genetic and Evolutionary Computation Conference, 2004, pp. 688–699.
[35] F. Mota, F. Gomide, Fuzzy clustering in fitness estimation models for genetic algorithms and applications, in: IEEE International Conference on Fuzzy Systems, 2006, pp. 1388–1395.
[36] L. Graening, Y. Jin, B. Sendhoff, Efficient evolutionary optimization using individual-based evolution control and neural networks: A comparative study, in: European Symposium on Artificial Neural Networks, 2005, pp. 273–278.
[37] L. Graening, Y. Jin, B. Sendhoff, Individual-based Management of Meta-models for Evolutionary Optimization with Applications to Three-dimensional Blade Optimization, Springer, 2007, pp. 225–250 (chapter 6).
[38] S. Tong, Active learning: theory and applications, Ph.D. Thesis, Department of Computer Science, Stanford University, 2001.
[39] D.J.C. MacKay, Introduction to Gaussian processes, in: C.M. Bishop (Ed.), Neural Networks and Machine Learning, Springer, 1998, pp. 133–165.
[40] J. Sacks, W.J. Welch, T.J. Mitchell, H.P. Wynn, Design and analysis of computer experiments, Statistical Science 4 (1989) 409–435.
[41] D. Buche, N.N. Schraudolph, P. Koumoutsakos, Accelerating evolutionary algorithms with Gaussian process fitness function models, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews 35 (2) (2005) 183–194.
[42] M. Emmerich, K.C. Giannakoglou, B. Naujoks, Single- and multiobjective evolutionary optimization assisted by Gaussian random field metamodels, IEEE Transactions on Evolutionary Computation 10 (4) (2006) 421–439.
[43] J. Knowles, ParEGO: a hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems, IEEE Transactions on Evolutionary Computation 10 (1) (2006) 50–66.
[44] Q. Zhang, W. Liu, E. Tsang, B. Virginas, Expensive multiobjective optimization by MOEA/D with Gaussian process model, IEEE Transactions on Evolutionary Computation 14 (3) (2010) 456–474.
[45] Y. Jin, M. Huesken, B. Sendhoff, Quality measures for approximate models in evolutionary computation, in: Proceedings of GECCO Workshops: Workshop on Adaptation, Learning and Approximation in Evolutionary Computation, 2003, pp. 170–174.
[46] M. Huesken, Y. Jin, B. Sendhoff, Structure optimization of neural networks for aerodynamic optimization, Soft Computing 9 (1) (2005) 21–28.
[47] H. Ulmer, F. Streichert, A. Zell, Evolution strategies with controlled model assistance, in: Congress on Evolutionary Computation, 2004, pp. 1569–1576.
[48] M. Schmidt, H. Lipson, Coevolution of fitness predictors, IEEE Transactions on Evolutionary Computation 12 (6) (2008) 736–749.
[49] Y. Tenne, K. Izui, S. Nishiwaki, Dimensionality-reduction frameworks for computationally expensive problems, in: IEEE Congress on Evolutionary Computation, 2010, pp. 1–8.
[50] Z. Zhou, Y.S. Ong, P.B. Nair, A.J. Keane, K.Y. Lum, Combining global and local surrogate models to accelerate evolutionary optimization, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews 37 (1) (2007) 66–76.
[51] Y. Cao, Y. Jin, M. Kowalczykiewicz, B. Sendhoff, Prediction of convergence dynamics of design performance using differential recurrent neural networks, in: International Joint Conference on Neural Networks, 2008, pp. 529–534.
[52] Y. Tenne, S.W. Armfield, A framework for memetic optimization using variable global and local surrogate models, Soft Computing 13 (2009) 781–793.
[53] Y. Jin, B. Sendhoff, Pareto-based multi-objective machine learning: an overview and case studies, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews 38 (3) (2008) 397–415.
[54] T. Goel, R.T. Haftka, W. Shyy, N.V. Queipo, Ensemble of surrogates, Structural and Multidisciplinary Optimization 33 (3) (2007) 199–216.
[55] E. Sanchez, S. Pintos, N.V. Queipo, Toward an optimal ensemble of kernel-based approximations with engineering applications, Structural and Multidisciplinary Optimization 36 (3) (2008) 247–261.
[56] D. Lim, Y.S. Ong, Y. Jin, B. Sendhoff, A study on metamodeling techniques, ensembles, and multi-surrogates in evolutionary computation, in: Genetic and Evolutionary Computation Conference, 2007, pp. 1288–1295.
[57] A. Samad, K.-Y. Kim, Multiple surrogate modeling for axial compressor blade shape optimization, Journal of Propulsion and Power 24 (2) (2008) 302–310.
[58] D. Chafekar, L. Shi, K. Rasheed, J. Xuan, Constrained multi-objective GA optimization using reduced models, IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews 35 (2) (2005) 261–265.
[59] A. Isaacs, T. Ray, W. Smith, An evolutionary algorithm with spatially distributed surrogates for multiobjective optimization, in: The 3rd Australian Conference on Progress in Artificial Life, 2007, pp. 257–268.
[60] D. Eby, R. Averill, W. Punch, E. Goodman, Evaluation of injection island model GA performance on flywheel design optimization, in: Third Conference on Adaptive Computing in Design and Manufacturing, 1998, pp. 121–136.
[61] M. Sefrioui, J. Periaux, A hierarchical genetic algorithm using multiple models for optimization, in: Parallel Problem Solving from Nature, 2000, pp. 879–888.
[62] P.K.S. Nain, K. Deb, Computationally effective search and optimization procedure using coarse to fine approximation, in: IEEE Congress on Evolutionary Computation, 2003, pp. 2081–2088.
[63] D. Lim, Y.S. Ong, Y. Jin, B. Sendhoff, Evolutionary optimization with dynamic fidelity computational models, in: International Conference on Intelligent Computing, in: LNCS, Springer, 2008, pp. 235–242.
[64] D. Lim, Y. Jin, Y.S. Ong, B. Sendhoff, Generalizing surrogate-assisted evolutionary computation, IEEE Transactions on Evolutionary Computation 14 (3) (2010) 329–355.
[65] J. Branke, Creating robust solutions by means of evolutionary algorithms, in: Parallel Problem Solving from Nature, in: LNCS, Springer, 1998, pp. 119–128.
[66] M. Bhattacharya, Reduced computation for evolutionary optimization in noisy environment, in: Genetic and Evolutionary Computation Conference, 2008, pp. 2117–2122.
[67] K.-H. Liang, X. Yao, C. Newton, Evolutionary search of approximated n-dimensional landscapes, International Journal of Knowledge-Based Intelligent Engineering Systems 4 (3) (2000) 172–183.
[68] H. Takagi, Interactive evolutionary computation: fusion of the capabilities of EC optimization and human evaluation, Proceedings of the IEEE 89 (2002) 1275–1296.
[69] J.A. Biles, P.G. Anderson, L.W. Loggi, Neural network fitness functions for a musical IGA, in: The International ICSC Symposium on Intelligent Industrial Automation and Soft Computing, 1996.
[70] R.R. Kamalian, A.M. Agogino, H. Takagi, Use of interactive evolutionary computation with simplified modeling for computationally expensive layout design optimization, in: IEEE Congress on Evolutionary Computation, 2007, pp. 4124–4129.
[71] X. Sun, D. Gong, S. Li, Classification and regression-based surrogate model-assisted interactive genetic algorithm with individual fuzzy fitness, in: Genetic and Evolutionary Computation Conference, 2009, pp. 907–914.
[72] A. Zhou, Y. Jin, Q. Zhang, B. Sendhoff, E. Tsang, Prediction-based population re-initialization for evolutionary dynamic multi-objective optimization, in: The Fourth International Conference on Evolutionary Multi-Criterion Optimization, in: LNCS, Springer, 2007, pp. 832–846.
[73] I. Hatzakis, D. Wallace, Dynamic multi-objective optimization with evolutionary algorithms: a forward-looking approach, in: Genetic and Evolutionary Computation Conference, 2006, pp. 1201–1208.
[74] X. Yu, Y. Jin, K. Tang, X. Yao, Robust optimization over time — a new perspective on dynamic optimization problems, in: Congress on Evolutionary Computation, 2010, pp. 3998–4003.
[75] D. Lim, Y.-S. Ong, Y. Jin, B. Sendhoff, B.S. Lee, Inverse multi-objective robust evolutionary optimization, Genetic Programming and Evolvable Machines 7 (4) (2006) 383–404.
[76] D. Lim, Y.-S. Ong, M.-H. Lim, Y. Jin, Single/multi-objective inverse robust evolutionary design methodology in the presence of uncertainty, in: S. Yang, Y.S. Ong, Y. Jin (Eds.), Evolutionary Computation in Dynamic and Uncertain Environments, Springer, 2007, pp. 437–456.
[77] Y.-S. Ong, P.B. Nair, K.Y. Lum, Max–min surrogate-assisted evolutionary algorithm for robust design, IEEE Transactions on Evolutionary Computation 10 (4) (2006) 392–404.
[78] I. Paenke, J. Branke, Y. Jin, Efficient search for robust solutions by means of evolutionary algorithms and fitness approximation, IEEE Transactions on Evolutionary Computation 10 (4) (2006) 405–420.
[79] T.P. Runarsson, Constrained evolutionary optimization by approximate ranking and surrogate models, in: Parallel Problem Solving from Nature, in: LNCS, Springer, 2004, pp. 401–410.
[80] S.D. Handoko, C.K. Kwoh, Y.S. Ong, Feasibility structure modeling: an effective chaperon for constrained memetic algorithms, IEEE Transactions on Evolutionary Computation 14 (5) (2010) 740–758.
[81] Y. Tenne, K. Izui, S. Nishiwaki, Handling undefined vectors in expensive optimization problems, in: Applications of Evolutionary Computation, in: LNCS, Springer, 2010, pp. 582–591.
[82] Y. Jin, S. Oh, M. Jeon, Incremental approximation of nonlinear constraint functions for evolutionary constrained optimization, in: IEEE Congress on Evolutionary Computation, 2010, pp. 2966–2973.
[83] S. Oh, Y. Jin, M. Jeon, Approximate models for constraint functions in evolutionary constrained optimization, International Journal of Innovative Computing, Information and Control 7 (10) (2011).
[84] M.K. Karakasis, K.C. Giannakoglou, Metamodel-assisted multi-objective evolutionary optimization, in: Evolutionary and Deterministic Methods for Design, Optimization and Control with Applications to Industrial and Societal Problems, EUROGEN, 2005.
[85] A. Shahrokhi, A. Jahangirian, A surrogate assisted evolutionary optimization method with application to the transonic airfoil design, Engineering Optimization 42 (6) (2010) 497–515.
[86] V.G. Asouti, I.C. Kampolis, K.C. Giannakoglou, A grid-enabled asynchronous metamodel-assisted evolutionary algorithm for aerodynamic optimization, Genetic Programming and Evolvable Machines 10 (4) (2009) 373–389.
[87] M.H.A. Bonte, L. Fourment, T.-T. Do, A.H. van den Boogaard, J. Huetink, Optimization of forging processes using finite element simulations, Structural and Multidisciplinary Optimization 42 (2010) 797–810.
[88] K. Hamza, K. Saitou, Crashworthiness design using meta-models for approximating the response of structural members, in: Cairo University Conference on Mechanical Design and Production, 2004.
[89] G. Mariani, G. Palermo, C. Silvano, V. Zaccaria, Meta-model assisted optimization for design space exploration of multi-processor systems-on-chip, in: 12th Euromicro Conference on Digital System Design, Architectures, Methods and Tools, 2009, pp. 383–389.
[90] G. Dellino, P. Lino, C. Meloni, A. Rizzo, Kriging metamodel management in the design optimization of a CNG injection system, Mathematics and Computers in Simulation 79 (8) (2009) 2345–2360.
[91] S.-T. Khu, D. Savic, Z. Kapelan, Evolutionary-based meta-modelling: the relevance of using approximate models in hydroinformatics, in: Hydroinformatics in Practice: Computational Intelligence and Technological Developments in Water Applications, Springer, 2008.
[92] I. Dahm, J. Ziegler, Using artificial neural networks to construct a meta-model for the evolution of gait patterns, in: Proceedings of the 5th International Conference on Climbing and Walking Robots, 2002.
[93] M.J. Powell, On the convergence of a wide range of trust region methods for unconstrained optimization, IMA Journal of Numerical Analysis 30 (1) (2010) 289–301.
[94] V.J. Hodge, K.J. Lees, J.L. Austin, A high performance k-nn approach using binary neural networks, Neural Networks 17 (3) (2004) 441–458.
[95] R. Li, M.T.M. Emmerich, J. Eggermont, E.G.P. Bovenkamp, T. Bäck, J. Dijkstra, J.H.C. Reiber, Metamodel-assisted mixed integer evolution strategies and their application to intravascular ultrasound image analysis, in: IEEE Congress on Evolutionary Computation, 2008, pp. 2764–2771.
[96] C. Ling, J.-H. Liang, B. Wu, Z.-W. Hu, A metamodel-based optimization method for mobile ad hoc networks, in: 2010 International Conference on Computer Application and System Modeling, ICCASM, 2010, pp. 412–416.
[97] Y. Jin, J. Branke, Evolutionary optimization in uncertain environments — a survey, IEEE Transactions on Evolutionary Computation 9 (3) (2005) 303–317.
[98] R. Polikar, L. Upda, S.S. Upda, V. Honavar, Learn++: an incremental learning algorithm for supervised neural networks, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews 31 (4) (2001) 479–508.
[99] D.H. Ackley, A Connectionist Machine for Genetic Hillclimbing, Kluwer Academic Publishers, Norwell, MA, 1987.
[100] W. Hendrickx, D. Gorisson, T. Dhaene, Grid enabled sequential design and adaptive metamodeling, in: Proceedings of the 2006 Winter Simulation Conference, 2006.
[101] M. Miller, Cloud Computing: Web-Based Applications That Change the Way You Work and Collaborate Online, Que Publishing, 2008.