Abstract
Neural architecture search (NAS) is a subdomain of AutoML that automates the design of neural networks. NAS has attracted considerable attention in recent years, and many methods have been developed in this area. Local search (LS), on the other hand, is a well-known heuristic that has been used for decades and is widely applied to optimization problems because of its simplicity and efficiency. LS offers several advantages for NAS: it naturally exploits techniques that accelerate the overall search, such as weight inheritance and network morphism; it is easy to implement; and it requires neither a complex encoding nor parameter tuning. In this work, we aim to make LS faster by guiding the exploration of the neighborhood, with the goal of limiting the number of solution evaluations, which are particularly time-consuming in NAS. We propose LS-PON (Local Search with a Predicted Order of Neighbors), a method that uses linear regression models to order the exploration of neighbors during the search. Unlike other prediction-based NAS methods, LS-PON requires neither pre-sampling nor tuning. Our experiments on popular NAS benchmarks show that LS-PON retains the simplicity and advantages of LS while matching the solution quality of state-of-the-art methods, and it can be more than twice as fast as classical LS.
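The core idea described in the abstract — a first-improvement local search whose neighbors are visited in an order predicted by a linear model fitted on all evaluations so far — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the binary encoding, the one-bit-flip neighborhood, the toy `evaluate` objective, and the evaluation budget are all assumptions made for the sake of a self-contained example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a NAS benchmark: architectures are binary vectors and the
# (expensive) "evaluation" is a hidden linear-ish score plus noise.
DIM = 12
hidden_w = rng.normal(size=DIM)

def evaluate(arch):
    """Expensive black-box objective (placeholder for training a network)."""
    return float(arch @ hidden_w + 0.1 * rng.normal())

def neighbors(arch):
    """One-bit-flip neighborhood of a binary architecture encoding."""
    for i in range(len(arch)):
        n = arch.copy()
        n[i] ^= 1
        yield n

def ls_pon(budget=150):
    """First-improvement local search; neighbors are explored in the order
    predicted by a linear regression model fitted on the search history."""
    X, y = [], []                      # history of (encoding, score) pairs
    current = rng.integers(0, 2, DIM)
    cur_score = evaluate(current)
    X.append(current.copy()); y.append(cur_score)
    evals = 1
    while evals < budget:
        nbrs = list(neighbors(current))
        if len(X) >= DIM + 1:          # enough data: order by predicted score
            w, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)
            nbrs.sort(key=lambda a: -(a @ w))
        else:                          # cold start: random order
            rng.shuffle(nbrs)
        improved = False
        for n in nbrs:
            s = evaluate(n)
            X.append(n.copy()); y.append(s)
            evals += 1
            if s > cur_score:          # first improvement: move immediately
                current, cur_score = n, s
                improved = True
                break
            if evals >= budget:
                break
        if not improved:               # no improving neighbor: local optimum
            break
    return current, cur_score, evals

arch, score, used = ls_pon()
print(f"best score {score:.3f} after {used} evaluations")
```

The point of the predicted ordering is that improving neighbors tend to be evaluated early, so the first-improvement move is found after fewer expensive evaluations than with a random visiting order.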
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Zouambi, M., Jacques, J., Dhaenens, C. (2023). LS-PON: A Prediction-Based Local Search for Neural Architecture Search. In: Nicosia, G., et al. Machine Learning, Optimization, and Data Science. LOD 2022. Lecture Notes in Computer Science, vol 13810. Springer, Cham. https://doi.org/10.1007/978-3-031-25599-1_9
Print ISBN: 978-3-031-25598-4
Online ISBN: 978-3-031-25599-1