Abstract
The traditional extreme learning machine (ELM) assigns random weights between the input layer and the hidden layer. This random feature mapping yields a non-discriminative feature space and unstable classification accuracy, which greatly limits the performance of ELM networks. To obtain well-suited input weights, two biologically inspired, unsupervised learning methods are therefore introduced to optimize the traditional ELM network: the generalized Hebbian algorithm (GHA) and intrinsic plasticity learning (IPL). The GHA extracts the principal components of input data of arbitrary dimension, while IPL tunes the probability density of each neuron's output toward a desired distribution, such as the exponential or Weibull distribution, thereby maximizing the network's information transmission. By incorporating GHA and IPL, the optimized ELM network generates a discriminative feature space and preserves much more of the characteristics of the input data, and accordingly achieves better task performance. Building on these two unsupervised methods, a simple yet effective hierarchical feature mapping extreme learning machine (HFMELM) is further proposed. With almost no information loss in the layer-wise feature mapping process, the HFMELM is able to learn a high-level representation of the input data. To evaluate the effectiveness of the proposed methods, extensive experiments on several datasets are presented; the results show that the proposed methods significantly outperform traditional ELM networks.
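The principal-component extraction that the abstract attributes to the GHA can be illustrated with a minimal sketch of Sanger's rule. This is a generic, self-contained NumPy illustration under assumed hyperparameters (learning rate, matrix shapes), not the authors' exact implementation:

```python
import numpy as np

def gha_update(W, x, lr=0.01):
    """One step of Sanger's generalized Hebbian algorithm (illustrative sketch).

    W  : (k, d) weight matrix; its rows converge toward the top-k
         principal components of a zero-mean input stream.
    x  : (d,) input sample.
    lr : learning rate (assumed value, not from the paper).
    """
    y = W @ x  # hidden-layer outputs, shape (k,)
    # Sanger's rule: dW = lr * (y x^T - LT[y y^T] W),
    # where LT[.] keeps the lower triangle (incl. diagonal) so that
    # each row is decorrelated only from the rows above it.
    dW = lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W + dW
```

Fed a stream of samples, the first row of `W` converges to (plus or minus) the leading principal direction of the data; using such weights as the ELM input weights is what gives the optimized network its discriminative feature space.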
Acknowledgements
This research is supported by the National Science and Technology Major Projects (No. 2013ZX03005013), and the Opening Foundation of the State Key Laboratory for Diagnosis and Treatment of Infectious Diseases (No. 2014KF06).
Cite this article
Chen, C., Jin, X., Jiang, B. et al. Optimizing Extreme Learning Machine via Generalized Hebbian Learning and Intrinsic Plasticity Learning. Neural Process Lett 49, 1593–1609 (2019). https://doi.org/10.1007/s11063-018-9869-6