Abstract
We consider the challenge of numerically comparing optimization algorithms that employ random restarts under the assumption that only limited test data is available. We develop a bootstrapping technique to estimate the incumbent solution of the optimization problem over time as a stochastic process. The asymptotic properties of the estimator are examined and the approach is validated by an out-of-sample test. Finally, three methods for comparing the performance of different algorithms based on the estimator are proposed and demonstrated with data from a real-world optimization problem.
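The core idea of the estimator can be illustrated with a minimal sketch. The code below is an assumption-laden simplification, not the paper's method: it assumes each recorded run reports one objective value per time step, that a k-restart algorithm's incumbent at time t is the best value seen so far across its k runs, and it uses synthetic data in place of real test data. The function name `bootstrap_incumbent` and all parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test data: objective values (lower is better) recorded at each
# of n_steps time steps, for n_runs independent random restarts.
n_runs, n_steps = 8, 50
runs = rng.normal(loc=5.0, scale=1.0, size=(n_runs, n_steps))


def bootstrap_incumbent(runs, k, n_boot, rng):
    """Bootstrap estimate of the expected incumbent-value process of an
    algorithm built from k random restarts, resampling observed runs
    with replacement (a sketch, under the assumptions stated above)."""
    n_runs, n_steps = runs.shape
    trajectories = np.empty((n_boot, n_steps))
    for b in range(n_boot):
        # Resample k of the observed runs with replacement.
        sample = runs[rng.integers(0, n_runs, size=k)]
        # Best value across the k runs at each time step...
        best_per_step = sample.min(axis=0)
        # ...and the incumbent is the running minimum over time.
        trajectories[b] = np.minimum.accumulate(best_per_step)
    # Average the bootstrapped incumbent trajectories.
    return trajectories.mean(axis=0)


est = bootstrap_incumbent(runs, k=3, n_boot=200, rng=rng)
```

By construction each bootstrapped trajectory is non-increasing, so the averaged estimate `est` is a non-increasing function of the time budget, which is the qualitative behavior one expects of an incumbent-solution process.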
Notes
It is worth remarking that the term initial condition (as opposed to starting point) is used to emphasize that the initial condition could control a starting point, a selection of parameters, or even just the random seed employed by a stochastic optimization algorithm.
Full results are available upon request.
Problem and algorithm details are irrelevant to the results of this paper, so we omit them to save space.
Acknowledgements
This work was supported by the Natural Sciences and Engineering Research Council of Canada (NSERC) under Collaborative Research and Development (CRD) Grant #CRDPJ 411318-15, sponsored by Softree Technical Systems Inc., and by NSERC Discovery Grants #355571-2013 (Hare) and #2015-03895 (Loeppky). Part of the research was performed in the Computer-Aided Convex Analysis (CA2) laboratory, funded by a Leaders Opportunity Fund (LOF) from the Canada Foundation for Innovation (CFI) and by the British Columbia Knowledge Development Fund (BCKDF).
Cite this article
Hare, W., Loeppky, J. & Xie, S. Methods to compare expensive stochastic optimization algorithms with random restarts. J Glob Optim 72, 781–801 (2018). https://doi.org/10.1007/s10898-018-0673-7