JCP 2018 Vol.13(7): 805-822 ISSN: 1796-203X
doi: 10.17706/jcp.13.7.805-822
Impact of Different Random Initializations on Generalization Performance of Extreme Learning Machine
Xiaofang Zhang1, Xiaoli Lin1, Rana Aamir Raza Ashfaq2
1Information Engineering Department, City College, Wuhan University of Science and Technology, Wuhan 430083, China.
2Department of Computer Science, Bahauddin Zakariya University, Multan, Pakistan.
Abstract—The generalization performance of the extreme learning machine (ELM) is influenced by the random initialization of its input-layer weights and hidden-layer biases. In this paper, we demonstrate this conclusion by testing the classification accuracies of ELMs under different random initializations. Thirty UCI data sets and 24 continuous probability distributions are employed in this experimental study. The results yield the following observations and conclusions: (1) probability distributions with symmetrical, bell-shaped probability density functions (e.g., Hyperbolic Secant, Student's-t, Laplace, and Normal) consistently produce higher training accuracies and easily cause over-fitting of the ELM; (2) ELMs whose random input-layer weights and hidden-layer biases are drawn from heavy-tailed distributions (e.g., Gamma, Rayleigh, and Fréchet) achieve better generalization performance; and (3) light-tailed distributions (e.g., Central Chi-Squared, Erlang, F, Gumbel, and Logistic) are usually unsuited to initializing the input-layer weights and hidden-layer biases of an ELM. These findings provide useful guidance for practical applications of ELMs in different fields.
Index Terms—Extreme learning machine, ELM, generalization performance, random initialization.
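The following is a minimal sketch of the kind of experiment the abstract describes, assuming the standard ELM training procedure (randomly generated hidden layer, output weights solved by the Moore-Penrose pseudoinverse). The function names, the sigmoid activation, and the particular distribution parameters (e.g., the Gamma shape of 2.0) are illustrative assumptions, not settings taken from the paper.

import numpy as np

def train_elm(X, T, n_hidden, init="normal", seed=None):
    """Train an ELM on inputs X (n_samples x n_features) and one-hot
    targets T (n_samples x n_classes). Input-layer weights W and
    hidden-layer biases b are drawn from the distribution named by
    `init`; only the output weights `beta` are learned."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    if init == "normal":      # symmetrical, bell-shaped distribution
        W = rng.normal(size=(n_features, n_hidden))
        b = rng.normal(size=n_hidden)
    elif init == "gamma":     # heavy-tailed distribution (shape=2.0 is an assumption)
        W = rng.gamma(shape=2.0, size=(n_features, n_hidden))
        b = rng.gamma(shape=2.0, size=n_hidden)
    else:
        raise ValueError(f"unknown init: {init}")
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))  # hidden-layer output matrix (sigmoid)
    beta = np.linalg.pinv(H) @ T            # output weights via pseudoinverse
    return W, b, beta

def predict_elm(X, W, b, beta):
    """Return class predictions as the argmax over the output layer."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)

Comparing held-out accuracy across `init` settings (over repeated random seeds, as one would over the 24 distributions and 30 UCI data sets in the study) reproduces the shape of the comparison the paper reports.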
Cite: Xiaofang Zhang, Xiaoli Lin, Rana Aamir Raza Ashfaq, "Impact of Different Random Initializations on Generalization Performance of Extreme Learning Machine," Journal of Computers vol. 13, no. 7, pp. 805-822, 2018.
General Information
ISSN: 1796-203X
Abbreviated Title: J.Comput.
Frequency: Bimonthly
Editor-in-Chief: Prof. Liansheng Tan
Executive Editor: Ms. Nina Lee
Abstracting/Indexing: DBLP, EBSCO, ProQuest, INSPEC, Ulrich's Periodicals Directory, WorldCat, etc.
E-mail: jcp@iap.org