Abstract
Multi-objective optimization problems are widespread in the real world. However, many typical evolutionary multi-objective optimization (EMO) algorithms struggle with large-scale multi-objective optimization problems (LSMOPs) due to the curse of dimensionality. In practice, the dimension of the manifold representing the Pareto solution set is much lower than that of the decision space. This work proposes a decision space reduction technique based on manifold learning using locality-preserving projections. The key insight is to improve search efficiency through decision space reduction. The high-dimensional decision space is first mapped to a low-dimensional subspace for a more effective search. Subsequently, a transformation matrix, the pseudo-inverse of the projection matrix, maps the resulting offspring solutions back to the original decision space. The proposed decision space reduction technique can be integrated with most multi-objective evolutionary algorithms. This paper integrates it with NSGA-II, yielding LPP-NSGA-II. We compare the proposed LPP-NSGA-II with four state-of-the-art EMO algorithms on thirteen test problems. The experimental results demonstrate the effectiveness of the proposed algorithm.
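The reduce-search-restore mechanism described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the projection matrix here is a random stand-in, whereas in the paper it would be obtained by fitting locality-preserving projections (LPP) to the current population, and the variation operator below is a simple arithmetic crossover chosen only for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

d, k, n = 30, 5, 20       # decision-space dim, subspace dim, population size
X = rng.random((n, d))    # population in the original decision space

# Stand-in projection matrix (hypothetical); the paper fits this with LPP.
W = rng.standard_normal((d, k))

# Map the population to the low-dimensional subspace for a cheaper search.
Y = X @ W

# Variation in the subspace: arithmetic crossover between shuffled parents.
p1, p2 = rng.permutation(n), rng.permutation(n)
alpha = rng.random((n, 1))
Y_off = alpha * Y[p1] + (1 - alpha) * Y[p2]

# Map offspring back via the pseudo-inverse of the projection matrix:
# since y_i = x_i W, the least-squares reconstruction is x_i = y_i W^+.
X_off = Y_off @ np.linalg.pinv(W)

print(X_off.shape)        # offspring restored to the original decision space
```

The pseudo-inverse gives the minimum-norm least-squares pre-image of each subspace offspring, which is why it can serve as the back-transformation even though the projection is not invertible.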
This research was funded by the Natural Science Foundation of Guangdong Province (2021A1515011839).
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Jiang, J., Gu, F., Shang, C. (2024). An Evolutionary Multiobjective Optimization Algorithm Based on Manifold Learning. In: Liu, Q., et al. Pattern Recognition and Computer Vision. PRCV 2023. Lecture Notes in Computer Science, vol 14431. Springer, Singapore. https://doi.org/10.1007/978-981-99-8540-1_35
Print ISBN: 978-981-99-8539-5
Online ISBN: 978-981-99-8540-1