DOI: 10.1145/3456126.3456133
Research Article · Open Access

Enhanced Neural Architecture Search Using Super Learner and Ensemble Approaches

Published: 29 June 2021

Abstract

Neural networks, and in particular Convolutional Neural Networks (CNNs), are often optimized using default parameters. Neural Architecture Search (NAS) enables multiple architectures to be evaluated prior to selection of the optimal architecture. A system integrating open-source tools for Neural Architecture Search (OpenNAS) of image classification problems has been developed and made available to the open-source community. OpenNAS takes any dataset of grayscale or RGB images and generates an optimal CNN architecture. This research explores the training and optimization of neural networks using super learner and ensemble approaches. Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO) and pretrained models serve as base learners for network ensembles. Meta learner algorithms are then applied to these base learners, and the ensemble performance on image classification problems is evaluated. Our results show that a stacked generalization ensemble of heterogeneous models is the most effective approach to image classification within OpenNAS.
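The stacked generalization approach described in the abstract — heterogeneous base learners whose predictions are combined by a meta learner — can be sketched with scikit-learn. Note this is a minimal illustration only: the classifiers below stand in for the PSO-, ACO- and pretrained-model-derived CNNs used in the paper, and the logistic-regression meta learner is an illustrative assumption, not the paper's OpenNAS configuration.

```python
# Sketch of stacked generalization: heterogeneous base learners combined
# by a meta learner trained on their cross-validated predictions.
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Small grayscale image dataset (8x8 digits), flattened to 64 features.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Heterogeneous base learners (stand-ins for the CNN base learners
# produced by PSO, ACO and pretrained models in the paper).
base_learners = [
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("svm", SVC(probability=True, random_state=0)),
    ("knn", KNeighborsClassifier(n_neighbors=5)),
]

# The meta learner is fit on out-of-fold predictions of the base
# learners (cv=5), which is what distinguishes stacking from simple
# averaging of ensemble members.
stack = StackingClassifier(
    estimators=base_learners,
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,
)
stack.fit(X_train, y_train)
score = stack.score(X_test, y_test)
print(f"stacked ensemble accuracy: {score:.3f}")
```

Cross-validated stacking avoids leaking training labels into the meta learner: each base learner's predictions for a fold come from a model that never saw that fold during fitting.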


Cited By

  • (2024) Measuring the efficiency of banks using high-performance ensemble technique. Neural Computing and Applications 36(27), 16797–16815. DOI: 10.1007/s00521-024-09929-y. Online publication date: 1 Sep 2024.
  • (2023) High performance machine learning approach for reference evapotranspiration estimation. Stochastic Environmental Research and Risk Assessment 38(2), 689–713. DOI: 10.1007/s00477-023-02594-y. Online publication date: 4 Nov 2023.

      Published In

      ASSE '21: 2021 2nd Asia Service Sciences and Software Engineering Conference
      February 2021
      143 pages
      ISBN:9781450389082
      DOI:10.1145/3456126
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. ACO
      2. AutoML
      3. CNN
      4. Ensemble
      5. PSO
      6. Stacking
      7. Super Learner

      Qualifiers

      • Research-article
      • Research
      • Refereed limited



      Article Metrics

      • Downloads (Last 12 months)171
      • Downloads (Last 6 weeks)18
      Reflects downloads up to 12 Jan 2025

