
Hybrid Extreme Learning Machine and Backpropagation with Adaptive Activation Functions for Classification Problems

  • Conference paper
  • In: Intelligent Systems Design and Applications (ISDA 2020)

Abstract

This paper proposes a hybrid approach combining the Extreme Learning Machine and Backpropagation with adaptive activation functions for classification problems. In general, machine learning research seeks algorithms that learn parameters from data to build increasingly accurate and general predictive models. In some scenarios, these models become very complex, requiring substantial computational power for both the training and prediction stages. Adaptive activation functions were introduced to increase a model's predictive capacity, yielding better models without increasing their complexity. We evaluate the performance of the proposal on a benchmark of ten classification problems. The results show that the hybrid approach with adaptive activation functions generally surpasses the standard functions with the same architecture.
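To make the idea in the abstract concrete, the sketch below shows one plausible reading of the hybrid scheme, assuming a single-hidden-layer network: an ELM step sets random hidden weights and solves the output weights by least squares, and a backpropagation step then refines all weights together with a trainable activation-slope parameter (a simple adaptive activation). This is not the authors' code; the class and method names (HybridELMBP, fit_elm, train_bp) and the shared slope parameter a are illustrative assumptions.

```python
# Minimal sketch of an ELM + backpropagation hybrid with an adaptive
# (trainable-slope) sigmoid activation. Assumptions, not the paper's code.
import numpy as np

def adaptive_sigmoid(z, a):
    """Sigmoid whose slope a is a trainable parameter."""
    return 1.0 / (1.0 + np.exp(-a * z))

class HybridELMBP:
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(n_in, n_hidden))   # random ELM hidden weights
        self.b = rng.normal(size=n_hidden)
        self.a = 1.0                                  # adaptive activation slope
        self.beta = np.zeros((n_hidden, n_out))       # output weights

    def hidden(self, X):
        return adaptive_sigmoid(X @ self.W + self.b, self.a)

    def fit_elm(self, X, Y):
        """ELM step: output weights from the Moore-Penrose pseudoinverse."""
        H = self.hidden(X)
        self.beta = np.linalg.pinv(H) @ Y

    def train_bp(self, X, Y, lr=0.01, epochs=200):
        """Backprop step: refine W, b, beta and the slope a (0.5*MSE loss)."""
        n = len(X)
        for _ in range(epochs):
            Z = X @ self.W + self.b
            H = adaptive_sigmoid(Z, self.a)
            E = H @ self.beta - Y                 # error on one-hot targets
            dH = E @ self.beta.T                  # gradient w.r.t. hidden outputs
            dZ = dH * H * (1 - H) * self.a        # sigmoid' w.r.t. pre-activation
            self.beta -= lr * H.T @ E / n
            self.W    -= lr * X.T @ dZ / n
            self.b    -= lr * dZ.mean(axis=0)
            self.a    -= lr * np.mean(dH * H * (1 - H) * Z)  # slope update

    def predict(self, X):
        return self.hidden(X) @ self.beta
```

Under these assumptions, fit_elm would be called first to obtain a good starting point cheaply, train_bp would then fine-tune the whole network, and class labels would be read off the prediction with an argmax over one-hot targets.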



Acknowledgment

The authors acknowledge the financial support of CNPq (429639/2016-3), FAPEMIG (APQ-00334/18), and CAPES - Finance Code 001. The authors would also like to thank Itaú Unibanco for the working hours granted to its employee for the development of this work.

Author information


Corresponding author

Correspondence to T. L. Fonseca.


Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Fonseca, T.L., Goliatt, L. (2021). Hybrid Extreme Learning Machine and Backpropagation with Adaptive Activation Functions for Classification Problems. In: Abraham, A., Piuri, V., Gandhi, N., Siarry, P., Kaklauskas, A., Madureira, A. (eds) Intelligent Systems Design and Applications. ISDA 2020. Advances in Intelligent Systems and Computing, vol 1351. Springer, Cham. https://doi.org/10.1007/978-3-030-71187-0_2

