
OpenBox: A Generalized Black-box Optimization Service

Published: 14 August 2021
DOI: 10.1145/3447548.3467061

Abstract

Black-box optimization (BBO) has a broad range of applications, including automatic machine learning, engineering, physics, and experimental design. However, it remains a challenge for users to apply BBO methods to their problems at hand with existing software packages, in terms of applicability, performance, and efficiency. In this paper, we build OpenBox, an open-source and general-purpose BBO service with improved usability. The modular design behind OpenBox also facilitates flexible abstraction and optimization of basic BBO components that are common in other existing systems. OpenBox is distributed, fault-tolerant, and scalable. To improve efficiency, OpenBox further utilizes "algorithm agnostic" parallelization and transfer learning. Our experimental results demonstrate the effectiveness and efficiency of OpenBox compared to existing systems.
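
To illustrate the workflow the abstract describes (the user declares a search space and an objective, and the service drives the trials), here is a minimal single-objective sketch using the open-source OpenBox Python package. The API names used (Optimizer, space as sp, sp.Real, the 'objectives' return key) are assumptions based on recent releases of the package and may differ across versions.

```python
# Minimal single-objective sketch against the open-source OpenBox Python
# package (pip install openbox). API names follow recent releases and are
# assumptions that may vary by version.
import numpy as np
from openbox import Optimizer, space as sp

# 1. Declare the search space.
space = sp.Space()
space.add_variables([
    sp.Real('x1', -5.0, 10.0, default_value=0.0),
    sp.Real('x2', 0.0, 15.0, default_value=0.0),
])

# 2. Wrap the black-box objective: a configuration in, objective value(s) out.
def branin(config):
    x1, x2 = config['x1'], config['x2']
    y = ((x2 - 5.1 / (4 * np.pi ** 2) * x1 ** 2 + 5 / np.pi * x1 - 6) ** 2
         + 10 * (1 - 1 / (8 * np.pi)) * np.cos(x1) + 10)
    return {'objectives': [y]}

# 3. Hand both to the service and run a fixed budget of trials.
opt = Optimizer(branin, space, max_runs=50, task_id='branin_demo')
history = opt.run()
print(history)
```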

Supplementary Material

MP4 File (openbox_a_generalized_blackbox_optimization-yang_li-yu_shen-38958030-g3KR.mp4)
OpenBox is an efficient and generalized black-box optimization (BBO) system that supports:

1. BBO with multiple objectives and constraints;
2. BBO with transfer learning;
3. BBO with distributed parallelization;
4. BBO with multi-fidelity acceleration;
5. BBO with early stopping.

The design of OpenBox follows these principles (a sketch of the first item follows the list):

1. Ease of use: minimal user effort, and user-friendly visualization for tracking and managing BBO tasks.
2. Consistent performance: host state-of-the-art optimization algorithms and choose the proper algorithm automatically.
3. Resource-aware management: give cost-model-based advice to users, e.g., a minimal number of workers or a time budget.
4. Scalability: scale in the number of input variables, objectives, tasks, trials, and parallel evaluations.
5. High efficiency: effective use of parallel resources, and system optimization via transfer learning and multi-fidelity methods.
6. Fault tolerance, extensibility, and data privacy protection.
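
To make the first characteristic concrete, the sketch below sets up a constrained multi-objective task in the same assumed OpenBox Python API; num_objectives, num_constraints, and the 'objectives'/'constraints' return keys (with non-positive constraint values treated as feasible) are assumptions from recent releases and may differ in other versions.

```python
# Hypothetical constrained multi-objective sketch with the OpenBox Python
# package; parameter and key names are assumed from recent releases.
from openbox import Optimizer, space as sp

space = sp.Space()
space.add_variables([sp.Real('x1', 0.1, 10.0), sp.Real('x2', 0.0, 5.0)])

def objective(config):
    x1, x2 = config['x1'], config['x2']
    return {
        # Two objectives, both minimized.
        'objectives': [x1 ** 2 + x2 ** 2,
                       (x1 - 2.0) ** 2 + (x2 - 1.0) ** 2],
        # One constraint; values <= 0 are treated as feasible,
        # so this encodes x1 + x2 >= 2.
        'constraints': [2.0 - (x1 + x2)],
    }

opt = Optimizer(
    objective, space,
    num_objectives=2,
    num_constraints=1,
    max_runs=60,
    task_id='mo_constrained_demo',
)
history = opt.run()  # optimization history over the evaluated trials
```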




      Published In

      KDD '21: Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining
      August 2021
      4259 pages
      ISBN: 9781450383325
      DOI: 10.1145/3447548


      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. Bayesian optimization
      2. black-box optimization
      3. hyper-parameter optimization

      Qualifiers

      • Research-article

      Funding Sources

      • National Key Research and Development Program of China award number(s)
      • NSFC award number(s)

      Conference

      KDD '21

      Acceptance Rates

      Overall Acceptance Rate 1,133 of 8,635 submissions, 13%



      Cited By

      • (2025) Research on Reservoir Hydrocarbon-Bearing Property Identification Method Based on Logging Data and Machine Learning. Geofluids 2025:1. DOI: 10.1155/gfl/8516810. Online publication date: 18-Feb-2025.
      • (2025) Centrum: Model-based Database Auto-tuning with Minimal Distributional Assumptions. Proceedings of the ACM on Management of Data 3:1, 1-26. DOI: 10.1145/3709671. Online publication date: 11-Feb-2025.
      • (2025) Point-by-point transfer learning for Bayesian optimization: An accelerated search strategy. Computers & Chemical Engineering 194, 108952. DOI: 10.1016/j.compchemeng.2024.108952. Online publication date: Mar-2025.
      • (2024) Grey-box Bayesian optimization for sensor placement in assisted living environments. Proceedings of the Thirty-Eighth AAAI Conference on Artificial Intelligence and Thirty-Sixth Conference on Innovative Applications of Artificial Intelligence and Fourteenth Symposium on Educational Advances in Artificial Intelligence, 22049-22057. DOI: 10.1609/aaai.v38i20.30208. Online publication date: 20-Feb-2024.
      • (2024) CASH via Optimal Diversity for Ensemble Learning. Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 2411-2419. DOI: 10.1145/3637528.3671894. Online publication date: 25-Aug-2024.
      • (2024) Auto-Tuning Dynamics Parameters of Intelligent Electric Vehicles via Bayesian Optimization. IEEE Transactions on Transportation Electrification 10:3, 6915-6927. DOI: 10.1109/TTE.2023.3346874. Online publication date: Sep-2024.
      • (2024) FastTuning: Enabling Fast and Efficient Hyper-Parameter Tuning With Partitioning and Parallelism of Search Space. IEEE Transactions on Parallel and Distributed Systems 35:7, 1174-1188. DOI: 10.1109/TPDS.2024.3386939. Online publication date: 10-Apr-2024.
      • (2024) Predictive Vehicle Stability Assessment Using Lyapunov Exponent Under Extreme Conditions. IEEE Transactions on Intelligent Transportation Systems 25:12, 21559-21571. DOI: 10.1109/TITS.2024.3463693. Online publication date: Dec-2024.
      • (2024) Multimodal Multi-Objective Test Data Generation Method based on Particle Swarm Optimization. 2024 IEEE 24th International Conference on Software Quality, Reliability and Security (QRS), 61-71. DOI: 10.1109/QRS62785.2024.00016. Online publication date: 1-Jul-2024.
      • (2024) Time-Optimal TCP and Robot Base Placement for Pick-and-Place Tasks in Highly Constrained Environments. 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2251-2257. DOI: 10.1109/IROS58592.2024.10801373. Online publication date: 14-Oct-2024.
