
Diagonal Acceleration for Covariance Matrix Adaptation Evolution Strategies

Published: 01 September 2020

Abstract

We introduce an acceleration for covariance matrix adaptation evolution strategies (CMA-ES) by means of adaptive diagonal decoding (dd-CMA). This diagonal acceleration endows the default CMA-ES with the advantages of separable CMA-ES without inheriting its drawbacks. Technically, we introduce a diagonal matrix D that expresses coordinate-wise variances of the sampling distribution in DCD form. The diagonal matrix can learn a rescaling of the problem in the coordinates within a linear number of function evaluations. Diagonal decoding can also exploit separability of the problem, but, crucially, does not compromise the performance on nonseparable problems. The latter is accomplished by modulating the learning rate for the diagonal matrix based on the condition number of the underlying correlation matrix. dd-CMA-ES not only combines the advantages of default and separable CMA-ES, but may achieve overadditive speedup: it improves the performance, and even the scaling, of the better of default and separable CMA-ES on classes of nonseparable test functions that reflect, arguably, a landscape feature commonly observed in practice.
The article makes two further secondary contributions: we introduce two different approaches to guarantee positive definiteness of the covariance matrix with active CMA, which is valuable in particular with large population size; we revise the default parameter setting in CMA-ES, proposing accelerated settings in particular for large dimension.
All our contributions can be viewed as independent improvements of CMA-ES, yet they are also complementary and can be seamlessly combined. In numerical experiments with dd-CMA-ES up to dimension 5120, we observe remarkable improvements over the original covariance matrix adaptation on functions with coordinate-wise ill-conditioning. The improvement is observed also for large population sizes up to about dimension squared.



Published In

Evolutionary Computation  Volume 28, Issue 3
Fall 2020
190 pages
ISSN:1063-6560
EISSN:1530-9304

Publisher

MIT Press

Cambridge, MA, United States


Author Tags

  1. Evolution strategies
  2. covariance matrix adaptation
  3. adaptive diagonal decoding
  4. active covariance matrix update
  5. default strategy parameters

Qualifiers

  • Research-article


Cited By

  • (2024) Analysis of Surrogate-Assisted Information-Geometric Optimization Algorithms. Algorithmica 86(1):33-63. DOI: 10.1007/s00453-022-01087-8. Online publication date: 1 Jan 2024.
  • (2024) LB+IC-CMA-ES: Two Simple Modifications of CMA-ES to Handle Mixed-Integer Problems. Parallel Problem Solving from Nature – PPSN XVIII, pp. 284-299. DOI: 10.1007/978-3-031-70068-2_18. Online publication date: 14 Sep 2024.
  • (2023) Covariance Matrix Adaptation Evolutionary Strategy with Worst-Case Ranking Approximation for Min–Max Optimization and Its Application to Berthing Control Tasks. ACM Transactions on Evolutionary Learning and Optimization 3(2):1-32. DOI: 10.1145/3603716. Online publication date: 28 Jun 2023.
  • (2023) Benchmarking CMA-ES with Basic Integer Handling on a Mixed-Integer Test Problem Suite. Proceedings of the Companion Conference on Genetic and Evolutionary Computation, pp. 1628-1635. DOI: 10.1145/3583133.3596411. Online publication date: 15 Jul 2023.
  • (2023) Configuring a Hierarchical Evolutionary Strategy Using Exploratory Landscape Analysis. Proceedings of the Companion Conference on Genetic and Evolutionary Computation, pp. 1785-1792. DOI: 10.1145/3583133.3596403. Online publication date: 15 Jul 2023.
  • (2023) CMA-ES with Learning Rate Adaptation: Can CMA-ES with Default Population Size Solve Multimodal and Noisy Problems? Proceedings of the Genetic and Evolutionary Computation Conference, pp. 839-847. DOI: 10.1145/3583131.3590358. Online publication date: 15 Jul 2023.
  • (2022) Benchmarking of two implementations of CMA-ES with diagonal decoding on the bbob test suite. Proceedings of the Genetic and Evolutionary Computation Conference Companion, pp. 1700-1707. DOI: 10.1145/3520304.3534011. Online publication date: 9 Jul 2022.
  • (2022) High-performance evolutionary algorithms for online neuron control. Proceedings of the Genetic and Evolutionary Computation Conference, pp. 1308-1316. DOI: 10.1145/3512290.3528725. Online publication date: 8 Jul 2022.
  • (2022) Black-box min-max continuous optimization using CMA-ES with worst-case ranking approximation. Proceedings of the Genetic and Evolutionary Computation Conference, pp. 823-831. DOI: 10.1145/3512290.3528702. Online publication date: 8 Jul 2022.
  • (2022) Monotone improvement of information-geometric optimization algorithms with a surrogate function. Proceedings of the Genetic and Evolutionary Computation Conference, pp. 1354-1362. DOI: 10.1145/3512290.3528690. Online publication date: 8 Jul 2022.
