Inverse distance weighting and radial basis function based surrogate model for high-dimensional expensive multi-objective optimization

Published: 01 February 2024

Abstract

Radial basis function (RBF) models have attracted considerable attention for assisting evolutionary algorithms in solving computationally expensive optimization problems. However, most RBF models cannot directly provide uncertainty information about their predictions, making it difficult to adopt principled infill sampling criteria for model management. To overcome this limitation, an inverse distance weighting (IDW) and RBF based surrogate-assisted evolutionary algorithm, named IR-SAEA, is proposed to address high-dimensional expensive multi-objective optimization problems. First, an RBF-IDW model is developed that provides both the predicted objective values and the uncertainty of those predictions. Moreover, a modified lower confidence bound infill criterion based on RBF-IDW is proposed to balance exploration and exploitation. Extensive experiments have been conducted on widely used benchmark problems with up to 100 dimensions. The empirical results validate that the proposed algorithm achieves competitive performance compared with state-of-the-art SAEAs.

Highlights

Proposed a surrogate-assisted evolutionary algorithm, IR-SAEA.
Built surrogates based on radial basis functions and inverse distance weighting.
Proposed a modified lower confidence bound to balance exploration and exploitation.
Evaluated the algorithm on three test suites with up to 100 dimensions.
Empirical results show effectiveness in solving high-dimensional expensive MOPs.
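To make the idea above concrete, the following is a minimal sketch (not the paper's implementation) of how an RBF surrogate can be paired with an IDW-style uncertainty term inside a lower confidence bound. The Gaussian kernel, the arctan-based IDW term (in the spirit of Bemporad's IDW/RBF global optimization work), and all function names here are illustrative assumptions.

```python
import numpy as np

def rbf_predict(X_train, y_train, x, eps=1.0):
    """Plain Gaussian-kernel RBF interpolant (illustrative choice of kernel)."""
    # Pairwise distances between training points -> kernel matrix
    d = np.linalg.norm(X_train[:, None, :] - X_train[None, :, :], axis=-1)
    Phi = np.exp(-(eps * d) ** 2)
    coef = np.linalg.solve(Phi, y_train)          # interpolation weights
    phi_x = np.exp(-(eps * np.linalg.norm(X_train - x, axis=1)) ** 2)
    return phi_x @ coef

def idw_uncertainty(X_train, x):
    """Distance-based uncertainty: zero at sampled points, growing far away."""
    d2 = np.sum((X_train - x) ** 2, axis=1)
    if np.any(d2 == 0):                            # x is an evaluated sample
        return 0.0
    # arctan of the inverse sum of IDW weights (Bemporad-style term)
    return (2.0 / np.pi) * np.arctan(1.0 / np.sum(1.0 / d2))

def lcb(X_train, y_train, x, w=2.0):
    """Lower confidence bound: prediction minus weighted uncertainty."""
    return rbf_predict(X_train, y_train, x) - w * idw_uncertainty(X_train, x)
```

Minimizing `lcb` instead of the raw RBF prediction favors points that either look good (exploitation) or lie far from evaluated samples (exploration), which is the balance the modified infill criterion targets; the weight `w` controls that trade-off.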



Published In

Applied Soft Computing  Volume 152, Issue C
Feb 2024
1017 pages

Publisher

Elsevier Science Publishers B. V.

Netherlands


Author Tags

  1. High-dimensional expensive multi-objective optimization
  2. RBF
  3. Uncertainty estimation
  4. Lower confidence bound
  5. Surrogate-assisted evolutionary algorithm

Qualifiers

  • Research-article
