DOI: 10.1145/3449726.3459532

Meta-learning for symbolic hyperparameter defaults

Published: 08 July 2021
Abstract

    Hyperparameter optimization in machine learning (ML) deals with the problem of empirically learning an optimal algorithm configuration from data, usually formulated as a black-box optimization problem. In this work, we propose a zero-shot method to meta-learn symbolic default hyperparameter configurations that are expressed in terms of the properties of the dataset. This enables a much faster, but still data-dependent, configuration of the ML algorithm, compared to standard hyperparameter optimization approaches. In the past, symbolic and static default values have usually been obtained as hand-crafted heuristics. We propose an approach of learning such symbolic configurations as formulas of dataset properties from a large set of prior evaluations on multiple datasets by optimizing over a grammar of expressions using an evolutionary algorithm. We evaluate our method on surrogate empirical performance models as well as on real data across 6 ML algorithms on more than 100 datasets and demonstrate that our method indeed finds viable symbolic defaults.
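
    To make the search concrete, below is a minimal, self-contained Python sketch of the idea: candidate symbolic defaults are expression trees over dataset properties (here just the number of rows n and number of features p), and each candidate is scored by its average performance across a collection of datasets. Everything here is an illustrative assumption rather than the paper's implementation: the toy grammar, the mutation-only single-objective loop, and the stand-in surrogate, which simply rewards values close to sqrt(p) in the spirit of the classic mtry = sqrt(p) random-forest heuristic.

        import math
        import random

        # Toy meta-dataset: each entry holds the properties ("meta-features")
        # of one dataset; a real system would draw these from e.g. OpenML.
        DATASETS = [{"n": random.randint(100, 10_000), "p": random.randint(2, 500)}
                    for _ in range(50)]

        # Grammar: an expression is a property name, a numeric constant, or
        # a tuple (operator, left subexpression, right subexpression).
        OPS = {
            "+": lambda a, b: a + b,
            "*": lambda a, b: a * b,
            "/": lambda a, b: a / b if abs(b) > 1e-9 else 1.0,  # protected division
        }
        TERMINALS = ["n", "p", 0.5, 1.0, 2.0]

        def random_expr(depth=0):
            """Sample a depth-limited random expression from the grammar."""
            if depth >= 2 or random.random() < 0.3:
                return random.choice(TERMINALS)
            op = random.choice(list(OPS))
            return (op, random_expr(depth + 1), random_expr(depth + 1))

        def evaluate(expr, props):
            """Evaluate a symbolic expression on one dataset's properties."""
            if isinstance(expr, str):
                return float(props[expr])
            if isinstance(expr, tuple):
                op, left, right = expr
                return OPS[op](evaluate(left, props), evaluate(right, props))
            return expr  # numeric constant

        def surrogate_performance(hp_value, props):
            """Stand-in surrogate model: performance peaks at sqrt(p)."""
            return -abs(hp_value - math.sqrt(props["p"]))

        def fitness(expr):
            """Average surrogate performance of a symbolic default over all datasets."""
            return sum(surrogate_performance(evaluate(expr, d), d)
                       for d in DATASETS) / len(DATASETS)

        def mutate(expr, depth=0):
            """Replace a randomly chosen subtree with a fresh random expression."""
            if random.random() < 0.2:
                return random_expr(depth)
            if isinstance(expr, tuple):
                op, left, right = expr
                if random.random() < 0.5:
                    return (op, mutate(left, depth + 1), right)
                return (op, left, mutate(right, depth + 1))
            return expr

        # Simple (mu + lambda) evolutionary loop over the grammar.
        population = [random_expr() for _ in range(30)]
        for generation in range(40):
            offspring = [mutate(random.choice(population)) for _ in range(30)]
            population = sorted(population + offspring, key=fitness, reverse=True)[:30]

        best = population[0]
        print("best symbolic default:", best)
        print("mean surrogate performance:", round(fitness(best), 3))

    In the paper's setting the fitness would instead come from normalized scores of many prior algorithm evaluations (or surrogate models fit to them), and a multi-objective evolutionary algorithm could additionally penalize formula complexity; the sketch above only illustrates the shape of the search.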

    Supplementary Material

PDF File: p151-gijsbers_suppl.pdf


Cited By

    • (2024) Automated machine learning: past, present and future. Artificial Intelligence Review 57:5. https://doi.org/10.1007/s10462-024-10726-1 (online 18-Apr-2024)
    • (2023) Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges. WIREs Data Mining and Knowledge Discovery 13:2. https://doi.org/10.1002/widm.1484 (online 16-Jan-2023)

    Published In

    GECCO '21: Proceedings of the Genetic and Evolutionary Computation Conference Companion
    July 2021
    2047 pages
    ISBN:9781450383516
    DOI:10.1145/3449726
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. hyperparameter optimization
    2. metalearning

    Qualifiers

    • Poster

    Funding Sources

    • DARPA and Air Force Research Laboratory
• German Federal Ministry of Education and Research (BMBF)

    Conference

    GECCO '21

    Acceptance Rates

Overall Acceptance Rate: 1,669 of 4,410 submissions, 38%
