LS-PON: A Prediction-Based Local Search for Neural Architecture Search

  • Conference paper
Machine Learning, Optimization, and Data Science (LOD 2022)

Abstract

Neural architecture search (NAS) is a subdomain of AutoML that automates the design of neural networks. NAS has become a hot topic in recent years, and many methods are being developed in this area. Local search (LS), on the other hand, is a well-known heuristic that has been around for many years. It is extensively used for optimization problems due to its simplicity and efficiency. LS offers several advantages for NAS: it can naturally exploit techniques that accelerate the global search, such as weight inheritance and network morphism; it is also easy to implement and requires neither a complex encoding nor any parameter tuning. In the present work, we aim to make LS faster by guiding the exploration of the neighborhood. Our objective is to limit the number of solution evaluations, which are particularly time-consuming in NAS. We propose LS-PON (Local Search with a Predicted Order of Neighbors), a method that uses linear regression models to order the exploration of neighbors during the search. Unlike other prediction-based NAS methods, LS-PON requires neither pre-sampling nor tuning. Our experiments on popular NAS benchmarks show that LS-PON keeps the simplicity and advantages of LS while matching state-of-the-art methods in quality, and can be more than twice as fast as classical LS.
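The idea described in the abstract, a first-improvement local search whose neighborhood is explored in the order predicted by a linear regression model trained on all architectures evaluated so far, can be sketched as follows. Note that everything below is an illustrative assumption, not the paper's actual implementation: the toy search space, the synthetic `accuracy` function (a stand-in for actually training a network), the one-hot encoding, and the SGD-trained linear predictor are all hypothetical.

```python
import random

random.seed(0)

N_OPS, N_SLOTS = 4, 6  # toy search space: 6 slots, each one of 4 operations

# Hypothetical ground-truth quality of an architecture; a real NAS run would
# train and validate the network here, which is the expensive step LS-PON
# tries to spend wisely.
WEIGHTS = [[random.uniform(0, 1) for _ in range(N_OPS)] for _ in range(N_SLOTS)]

def accuracy(arch):
    return sum(WEIGHTS[i][op] for i, op in enumerate(arch))

def encode(arch):
    """One-hot encoding of the architecture, used as regression features."""
    x = [0.0] * (N_SLOTS * N_OPS)
    for i, op in enumerate(arch):
        x[i * N_OPS + op] = 1.0
    return x

def fit_linear(X, y, epochs=50, lr=0.1):
    """Plain SGD linear regression (a stand-in for the paper's predictor)."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for x, t in zip(X, y):
            err = b + sum(wi * xi for wi, xi in zip(w, x)) - t
            b -= lr * err
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w, b

def predict(model, x):
    w, b = model
    return b + sum(wi * xi for wi, xi in zip(w, x))

def neighbors(arch):
    """All architectures that differ from `arch` in exactly one slot."""
    for i in range(N_SLOTS):
        for op in range(N_OPS):
            if op != arch[i]:
                yield arch[:i] + (op,) + arch[i + 1:]

def ls_pon(max_evals=150):
    arch = tuple(random.randrange(N_OPS) for _ in range(N_SLOTS))
    best = accuracy(arch)
    X, y, evals = [encode(arch)], [best], 1
    improved = True
    while improved and evals < max_evals:
        improved = False
        # Refit the predictor on every architecture evaluated so far,
        # then visit neighbors best-predicted first.
        model = fit_linear(X, y)
        ranked = sorted(neighbors(arch), key=lambda a: -predict(model, encode(a)))
        for cand in ranked:  # first-improvement local search
            acc = accuracy(cand)       # the costly evaluation
            X.append(encode(cand)); y.append(acc); evals += 1
            if acc > best:
                arch, best, improved = cand, acc, True
                break
            if evals >= max_evals:
                break
    return arch, best, evals
```

If the predictor ranks well, each improving move costs close to one evaluation instead of a scan over the whole neighborhood, which is where the claimed speed-up over classical LS would come from in this sketch.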



Author information

Correspondence to Meyssa Zouambi.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Zouambi, M., Jacques, J., Dhaenens, C. (2023). LS-PON: A Prediction-Based Local Search for Neural Architecture Search. In: Nicosia, G., et al. Machine Learning, Optimization, and Data Science. LOD 2022. Lecture Notes in Computer Science, vol 13810. Springer, Cham. https://doi.org/10.1007/978-3-031-25599-1_9

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-25599-1_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-25598-4

  • Online ISBN: 978-3-031-25599-1

  • eBook Packages: Computer Science, Computer Science (R0)
