How effective is the Grey Wolf optimizer in training multi-layer perceptrons

Published: 01 July 2015

Abstract

This paper employs the recently proposed Grey Wolf Optimizer (GWO) for training Multi-Layer Perceptrons (MLPs) for the first time. Eight standard datasets, comprising five classification and three function-approximation problems, are used to benchmark the performance of the proposed method. For verification, the results are compared with those of some of the most well-known evolutionary trainers: Particle Swarm Optimization (PSO), Genetic Algorithm (GA), Ant Colony Optimization (ACO), Evolution Strategy (ES), and Population-based Incremental Learning (PBIL). The statistical results show that the GWO algorithm provides very competitive performance in terms of local optima avoidance, and that the proposed trainer achieves a high level of accuracy in both classification and function approximation.
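To make the training scheme concrete, the sketch below shows in Python how a population-based optimizer such as GWO can train an MLP: each wolf encodes all connection weights and biases as one flat real-valued vector, the fitness of a wolf is the mean squared error of the decoded network on the training data, and the alpha, beta, and delta wolves guide the position updates. This is a minimal illustration only, not the paper's code; the XOR-style toy dataset, the 2-4-1 sigmoid architecture, and the parameter choices (30 wolves, 500 iterations, search bounds of [-10, 10]) are assumptions made for the example, not the paper's experimental setup.

```python
# Minimal sketch (not the authors' code): GWO searching the weight space of a small MLP.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset (assumption: XOR-like classification problem, not one of the paper's benchmarks).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hidden, n_out = 2, 4, 1
dim = n_in * n_hidden + n_hidden + n_hidden * n_out + n_out  # total weights + biases

def mlp_forward(w, X):
    """Decode a flat weight vector and evaluate a 2-4-1 sigmoid MLP."""
    i = 0
    W1 = w[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = w[i:i + n_hidden]; i += n_hidden
    W2 = w[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = w[i:i + n_out]
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def fitness(w):
    """Mean squared error of the MLP encoded by w (lower is better)."""
    return np.mean((mlp_forward(w, X) - y) ** 2)

def gwo_train(n_wolves=30, max_iter=500, lb=-10.0, ub=10.0):
    wolves = rng.uniform(lb, ub, size=(n_wolves, dim))
    for t in range(max_iter):
        # Rank the pack; alpha, beta, delta are the three best solutions so far this iteration.
        scores = np.array([fitness(w) for w in wolves])
        alpha, beta, delta = wolves[np.argsort(scores)[:3]]
        a = 2.0 - 2.0 * t / max_iter  # control parameter a decreases linearly from 2 to 0
        for i in range(n_wolves):
            new_pos = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2.0 * a * r1 - a, 2.0 * r2
                D = np.abs(C * leader - wolves[i])   # distance to the leader
                new_pos += leader - A * D            # candidate position driven by this leader
            wolves[i] = np.clip(new_pos / 3.0, lb, ub)  # average of the three guided moves
    scores = np.array([fitness(w) for w in wolves])
    return wolves[np.argmin(scores)], scores.min()

best_w, best_mse = gwo_train()
print("best MSE:", best_mse)
print("predictions:", mlp_forward(best_w, X).ravel())
```

The key design choice is the flat weight-vector encoding: once the MLP is expressed as a fixed-length real vector and its error as a scalar fitness, any of the compared trainers (PSO, GA, ACO, ES, PBIL) could be substituted for the GWO loop without touching the network code.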




Published In

Applied Intelligence, Volume 43, Issue 1
July 2015
232 pages

Publisher

Kluwer Academic Publishers

United States

Publication History

Published: 01 July 2015

Author Tags

  1. Evolutionary algorithm
  2. Grey Wolf optimizer
  3. Learning neural network
  4. MLP
  5. Multi-layer perceptron

Qualifiers

  • Article


Cited By

  • (2024) Competing leaders grey wolf optimizer and its application for training multi-layer perceptron classifier. Expert Systems with Applications: An International Journal 239:C. DOI: 10.1016/j.eswa.2023.122349. Online publication date: 1-Apr-2024
  • (2024) Metaheuristic learning algorithms for accurate prediction of hydraulic performance of porous embankment weirs. Applied Soft Computing 151:C. DOI: 10.1016/j.asoc.2023.111150. Online publication date: 17-Apr-2024
  • (2024) Metaheuristic-based hyperparameter optimization for multi-disease detection and diagnosis in machine learning. Service Oriented Computing and Applications 18:2(163-182). DOI: 10.1007/s11761-023-00382-8. Online publication date: 1-Jun-2024
  • (2024) Cleaner fish optimization algorithm: a new bio-inspired meta-heuristic optimization algorithm. The Journal of Supercomputing 80:12(17338-17376). DOI: 10.1007/s11227-024-06105-w. Online publication date: 1-Aug-2024
  • (2024) A Novel Two-Level Clustering-Based Differential Evolution Algorithm for Training Neural Networks. Applications of Evolutionary Computation (259-272). DOI: 10.1007/978-3-031-56852-7_17. Online publication date: 3-Mar-2024
  • (2023) Global path planning for airport energy station inspection robots based on improved grey wolf optimization algorithm. Journal of Intelligent & Fuzzy Systems: Applications in Engineering and Technology 45:3(4483-4500). DOI: 10.3233/JIFS-230894. Online publication date: 1-Jan-2023
  • (2023) Memory, evolutionary operator, and local search based improved Grey Wolf Optimizer with linear population size reduction technique. Knowledge-Based Systems 264:C. DOI: 10.1016/j.knosys.2023.110297. Online publication date: 15-Mar-2023
  • (2023) An improved arithmetic optimization algorithm for training feedforward neural networks under dynamic environments. Knowledge-Based Systems 263:C. DOI: 10.1016/j.knosys.2023.110274. Online publication date: 5-Mar-2023
  • (2023) Enhancing multilayer perceptron neural network using archive-based harris hawks optimizer to predict gold prices. Journal of King Saud University - Computer and Information Sciences 35:5. DOI: 10.1016/j.jksuci.2023.101557. Online publication date: 13-Jul-2023
  • (2023) Potential of vibrational spectroscopy coupled with machine learning as a non-invasive diagnostic method for COVID-19. Computer Methods and Programs in Biomedicine 229:C. DOI: 10.1016/j.cmpb.2022.107295. Online publication date: 1-Feb-2023
