DOI: 10.1145/3449726.3459575

Poster

CMA-ES with coordinate selection for high-dimensional and ill-conditioned functions

Published: 08 July 2021

    Abstract

    Algorithms for black-box optimization must account for numerous properties of the objective function in advance. The covariance matrix adaptation evolution strategy (CMA-ES) is one of the state-of-the-art algorithms for black-box optimization. Despite its success, the CMA-ES fails to minimize objective functions that are both high-dimensional and ill-conditioned, such as the 100,000-dimensional Ellipsoid function. This is a serious obstacle to applying the CMA-ES to recent high-dimensional machine learning models. We confirm that the single step-size shared across all coordinates is one of the hindrances to the adaptation of the covariance matrix. To address this, we propose a CMA-ES with coordinate selection. Coordinate selection allows us to vectorize the step-size and adapt each component of the vector to the scale of the selected coordinates. Furthermore, coordinate selection based on estimated curvature reduces the condition number while updating the variables in the selected coordinate space. Our method is simple enough to apply to most variations of the CMA-ES: it merely executes the conventional algorithm in the selected coordinate space. Experimental results show that our method, applied to the CMA-ES, the sep-CMA-ES, and the VD-CMA, outperforms the conventional variations of the CMA-ES in terms of function evaluations and objective value when optimizing high-dimensional and ill-conditioned functions.
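    The coordinate-selection idea sketched in the abstract can be illustrated with a toy example: keep a per-coordinate (vectorized) step-size, estimate curvature by finite differences, and run a plain (1+1)-ES update only in the subspace of the currently selected coordinates. This is a minimal illustration under stated assumptions, not the paper's algorithm: the selection score (curvature times squared step-size), the 1/5th-success step-size rule, and the `es_with_coordinate_selection` helper are all inventions of this sketch.

```python
import numpy as np

def ellipsoid(x, cond=1e6):
    """Ill-conditioned Ellipsoid function: f(x) = sum_i cond^(i/(n-1)) * x_i^2."""
    n = len(x)
    weights = cond ** (np.arange(n) / (n - 1))
    return float(np.dot(weights, x * x))

def es_with_coordinate_selection(f, x0, k, iters=2000, seed=0):
    """Toy (1+1)-ES with a vectorized step-size and curvature-based
    coordinate selection (a sketch of the idea, not the paper's method).

    Each iteration: estimate per-coordinate curvature by central finite
    differences, select the k coordinates with the largest score
    (curvature * step-size^2), mutate only those coordinates, and adapt
    their step-size components with a simple 1/5th-success rule.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    sigma = np.full(n, 0.3)        # per-coordinate ("vectorized") step-size
    fx = f(x)
    h = 1e-4                       # finite-difference step
    for _ in range(iters):
        # Central finite-difference curvature estimate for each coordinate.
        curv = np.empty(n)
        for i in range(n):
            e = np.zeros(n)
            e[i] = h
            curv[i] = (f(x + e) - 2.0 * fx + f(x - e)) / h**2
        # Score weighs curvature by the current step-size scale, so
        # already-converged coordinates (tiny sigma) drop out of the set.
        score = np.abs(curv) * sigma**2
        sel = np.argsort(-score)[:k]
        # Mutate only the selected coordinates.
        y = x.copy()
        y[sel] = x[sel] + sigma[sel] * rng.standard_normal(sel.size)
        fy = f(y)
        if fy < fx:                # success: accept and expand step-size
            x, fx = y, fy
            sigma[sel] *= 1.2
        else:                      # failure: shrink step-size
            sigma[sel] *= 0.95
    return x, fx
```

    On a separable quadratic the finite-difference curvature is exact, so the highest-weighted coordinates are selected first; as they converge, their step-size components shrink and lower-curvature coordinates enter the selected set, mimicking how selection concentrates effort where the conditioning is worst.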

    Supplementary Material

    ZIP File (p209-shimizu_suppl.zip)


    Cited By

    • (2022) A survey on multi-objective hyperparameter optimization algorithms for machine learning. Artificial Intelligence Review 56:8, 8043–8093. DOI: 10.1007/s10462-022-10359-2. Online publication date: 24-Dec-2022.


      Published In
      GECCO '21: Proceedings of the Genetic and Evolutionary Computation Conference Companion
      July 2021
      2047 pages
      ISBN: 9781450383516
      DOI: 10.1145/3449726
      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. CMA-ES
      2. high dimensional function
      3. ill-conditioned function

      Qualifiers

      • Poster

      Funding Sources

      • JST (Japan Science and Technology Agency)

      Conference

      GECCO '21

      Acceptance Rates

      Overall Acceptance Rate 1,669 of 4,410 submissions, 38%
