DOI: 10.1145/3583131.3590358 · Research article

CMA-ES with Learning Rate Adaptation: Can CMA-ES with Default Population Size Solve Multimodal and Noisy Problems?

Published: 12 July 2023

Abstract

The covariance matrix adaptation evolution strategy (CMA-ES) is one of the most successful methods for solving black-box continuous optimization problems. One practically useful aspect of the CMA-ES is that it can be used without hyperparameter tuning. However, the hyperparameter settings still have a considerable impact, especially for difficult tasks such as solving multimodal or noisy problems. In this study, we investigate whether the CMA-ES with default population size can solve multimodal and noisy problems. To perform this investigation, we develop a novel learning rate adaptation mechanism for the CMA-ES, such that the learning rate is adapted so as to maintain a constant signal-to-noise ratio. We investigate the behavior of the CMA-ES with the proposed learning rate adaptation mechanism through numerical experiments, and compare the results with those obtained for the CMA-ES with a fixed learning rate. The results demonstrate that, when the proposed learning rate adaptation is used, the CMA-ES with default population size works well on multimodal and/or noisy problems, without the need for extremely expensive learning rate tuning.
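The mechanism described above, adapting the learning rate so that the signal-to-noise ratio of the update stays at a constant target, can be sketched as follows. This is an illustrative sketch only, not the authors' exact formulation: the function names, the batch-based SNR estimator, and the damping and clipping constants are all assumptions.

```python
import numpy as np

def estimate_snr(deltas):
    """Crude SNR estimate from a batch of candidate update vectors:
    squared norm of the mean update divided by the mean squared
    deviation of the updates around that mean."""
    deltas = np.asarray(deltas, dtype=float)
    mean = deltas.mean(axis=0)
    signal = float(mean @ mean)
    noise = float(((deltas - mean) ** 2).sum(axis=1).mean())
    return signal / noise if noise > 0 else np.inf

def adapt_learning_rate(eta, snr, snr_target=1.0, damping=0.1,
                        eta_min=1e-4, eta_max=1.0):
    """Multiplicatively nudge eta so the observed SNR moves toward the
    target: raise eta when the signal dominates the noise, lower it
    when the noise dominates."""
    eta *= np.exp(damping * (snr / snr_target - 1.0))
    return float(np.clip(eta, eta_min, eta_max))
```

With this kind of rule, a noisy or multimodal landscape (low SNR) automatically drives the learning rate down, which effectively averages over more generations, while an easy, low-noise landscape lets the learning rate recover toward its default value.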

Supplementary Material

PDF File (p839-nomura-suppl.pdf)
Supplemental material.


Cited By

  • (2024) CMA-ES with Adaptive Reevaluation for Multiplicative Noise. In Proceedings of the Genetic and Evolutionary Computation Conference, 731–739. DOI: 10.1145/3638529.3654182. Online publication date: 14 July 2024.


    Published In

    GECCO '23: Proceedings of the Genetic and Evolutionary Computation Conference
    July 2023
    1667 pages
    ISBN:9798400701191
    DOI:10.1145/3583131
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. covariance matrix adaptation evolution strategy
    2. black-box optimization

    Qualifiers

    • Research-article

    Conference

    GECCO '23

    Acceptance Rates

    Overall Acceptance Rate 1,669 of 4,410 submissions, 38%

    Article Metrics

    • Downloads (Last 12 months)74
    • Downloads (Last 6 weeks)8
    Reflects downloads up to 12 Jan 2025
