DOI: 10.1145/3546790.3546798
Research article · Open access

Reducing the Spike Rate in Deep Spiking Neural Networks

Published: 07 September 2022

Abstract

    One objective of Spiking Neural Networks is highly energy-efficient computation. To achieve this target, a low spike rate is very beneficial, given the event-driven nature of such computation. However, as the network becomes deeper, the spike rate tends to increase without any improvement in the final results. On the other hand, introducing a penalty on excess spikes can often drive the network into a configuration where many neurons are silent, resulting in a drop in computational efficacy.
    In this paper, we propose a learning strategy that keeps the spike rate under control by (i) changing the loss function to penalize the spikes that each neuron generates after its first one, and (ii) introducing a two-phase training procedure that avoids silent neurons during training.
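    The paper's exact loss function is not reproduced on this page. As an illustration only, a spike-rate penalty of the kind the abstract describes — each neuron's first spike is free, and only additional spikes are penalized — might be sketched as follows (the function name `excess_spike_penalty` and the `weight` parameter are hypothetical, not from the paper):

    ```python
    def excess_spike_penalty(spike_counts, weight=0.01):
        """Illustrative regularization term for a spiking network's loss.

        spike_counts: iterable of per-neuron spike counts over one
        simulation window. The first spike of every neuron is not
        penalized, so neurons are not pushed toward silence; every
        spike beyond the first adds to the penalty, discouraging
        unnecessarily high spike rates in deep layers.
        """
        return weight * sum(max(c - 1, 0) for c in spike_counts)
    ```

    In practice such a term would be added to the task loss (e.g. cross-entropy), with `weight` trading off accuracy against spike rate; the abstract's two-phase training would then handle the case where the penalty silences neurons.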


    Cited By

    • (2024) Simultaneous Velocity and Texture Classification from a Neuromorphic Tactile Sensor Using Spiking Neural Networks. Electronics 13(11), 2159. https://doi.org/10.3390/electronics13112159. Online publication date: 1-Jun-2024.


      Published In

      ICONS '22: Proceedings of the International Conference on Neuromorphic Systems 2022
      July 2022, 213 pages
      ISBN: 9781450397896
      DOI: 10.1145/3546790

      Publisher

      Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. Loss Function
      2. Neuromorphic Computing
      3. Spiking Neural Networks

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Conference

      ICONS

      Acceptance Rates

      Overall Acceptance Rate 13 of 22 submissions, 59%
