DOI: 10.1145/2739480.2754704 · GECCO Conference Proceedings · Research article

Sample Reuse in the Covariance Matrix Adaptation Evolution Strategy Based on Importance Sampling

Published: 11 July 2015

Abstract

Recent studies reveal that the covariance matrix adaptation evolution strategy (CMA-ES) updates its parameters along the natural gradient. The rank-based weight can be viewed as the result of a quantile-based transformation of the objective value, and the parameters are adjusted in the direction of the natural gradient, estimated by Monte Carlo with samples drawn from the current distribution. In this paper, we propose a sample reuse mechanism for the CMA-ES. On the basis of importance sampling, past samples are reused to reduce the estimation variance of the quantile and of the natural gradient. We derive the formulas for the rank-μ update of the covariance matrix and for the mean vector update using the past samples, and incorporate them into the CMA-ES without step-size adaptation. Numerical experiments show that the proposed approach reduces the number of function evaluations on many benchmark functions, especially when the number of samples per iteration is relatively small.
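The core mechanism described above can be sketched in code. The following is a minimal illustration, not the paper's exact algorithm: past samples drawn from an earlier search distribution are reweighted by self-normalized likelihood ratios (importance sampling) and then folded into a rank-μ-style covariance update. Function names (`normalized_importance_weights`, `rank_mu_update`) and the learning rate `c_mu` are illustrative choices; the paper additionally combines these likelihood ratios with rank-based selection weights, which is omitted here for brevity.

```python
import numpy as np

def gaussian_logpdf(x, mean, cov):
    """Log-density of a multivariate Gaussian N(mean, cov) at x."""
    d = len(mean)
    diff = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d * np.log(2.0 * np.pi) + logdet
                   + diff @ np.linalg.solve(cov, diff))

def normalized_importance_weights(samples, cur_mean, cur_cov, old_mean, old_cov):
    """Self-normalized likelihood ratios p_current(x) / p_old(x)."""
    logw = np.array([
        gaussian_logpdf(x, cur_mean, cur_cov) - gaussian_logpdf(x, old_mean, old_cov)
        for x in samples
    ])
    w = np.exp(logw - logw.max())  # subtract the max for numerical stability
    return w / w.sum()

def rank_mu_update(cov, mean, samples, weights, c_mu=0.3):
    """Rank-mu-style covariance update from (re)weighted samples."""
    diffs = samples - mean
    # Weighted sum of outer products: sum_i w_i (x_i - m)(x_i - m)^T
    emp = np.einsum("i,ij,ik->jk", weights, diffs, diffs)
    return (1.0 - c_mu) * cov + c_mu * emp

rng = np.random.default_rng(0)
old_mean, old_cov = np.zeros(2), np.eye(2)
cur_mean, cur_cov = np.array([0.5, 0.0]), np.eye(2)

# Past samples were drawn from the OLD distribution; reweight them
# toward the current one instead of discarding them.
past = rng.multivariate_normal(old_mean, old_cov, size=50)
w = normalized_importance_weights(past, cur_mean, cur_cov, old_mean, old_cov)
new_cov = rank_mu_update(cur_cov, cur_mean, past, w)
```

Because the weights are self-normalized and the update is a convex blend of positive semi-definite matrices, the resulting covariance stays symmetric positive definite; the variance-reduction benefit comes from the enlarged effective sample size when the old and current distributions overlap.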


Cited By

  • (2024) Natural Gradient Interpretation of Rank-One Update in CMA-ES. Parallel Problem Solving from Nature – PPSN XVIII, pp. 252–267. DOI: 10.1007/978-3-031-70068-2_16
  • (2022) Efficient Search of Multiple Neural Architectures with Different Complexities via Importance Sampling. Artificial Neural Networks and Machine Learning – ICANN 2022, pp. 607–619. DOI: 10.1007/978-3-031-15937-4_51
  • (2018) Efficient Sample Reuse in Policy Search by Multiple Importance Sampling. Proceedings of the Genetic and Evolutionary Computation Conference, pp. 545–552. DOI: 10.1145/3205455.3205564

Published In

GECCO '15: Proceedings of the 2015 Annual Conference on Genetic and Evolutionary Computation
July 2015, 1496 pages
ISBN: 9781450334723
DOI: 10.1145/2739480

Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. covariance matrix adaptation evolution strategy
      2. importance sampling
      3. information geometric optimization
      4. natural gradient


      Acceptance Rates

GECCO '15 Paper Acceptance Rate: 182 of 505 submissions, 36%
Overall Acceptance Rate: 1,669 of 4,410 submissions, 38%

