DOI: 10.1145/3407197.3407203
Research article

Quantizing Spiking Neural Networks with Integers

Published: 28 July 2020

Abstract

Spiking neural networks (SNNs) are a promising approach to developing autonomous agents that continuously adapt to their environment. Developing low-power SNNs that can be implemented on digital platforms is a critical step toward realizing such agents. One of the most important methods for implementing low-power SNNs is operating at reduced precision. While traditional computer vision has produced extensive research on the trade-offs between precision and model performance, such trade-offs remain underexamined for contemporary SNNs. This paper studies the trade-offs between learning performance and the quantization of neural dynamics, weights, and learning components in SNNs. Our results show that SNNs trained using only integer fixed-point representations can retain their accuracy while occupying dramatically smaller memory footprints and using only energy-efficient fixed-point arithmetic. We show that the memory usage of SNNs trained with reduced-precision weights, errors, gradients, and neural dynamics can be reduced by 73.78% at the cost of a 1.04% increase in test error on the DVS Gesture data set.
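The integer fixed-point scheme the abstract describes can be illustrated with a minimal sketch. This is not code from the paper: the function names, bit widths, and the simple leaky integrate-and-fire (LIF) dynamics are illustrative assumptions. The idea is that weights and membrane state are stored as integers with an implicit scale of 2^frac_bits, so the leak becomes an integer multiply followed by a right shift, and no floating-point arithmetic is needed.

```python
import numpy as np

def to_fixed(x, frac_bits):
    """Quantize a float array to signed fixed-point integers with `frac_bits` fractional bits."""
    return np.round(np.asarray(x) * (1 << frac_bits)).astype(np.int32)

def lif_step_fixed(v, w, spikes_in, decay_q, v_th_q, frac_bits):
    """One leaky integrate-and-fire update using only integer arithmetic.

    v, w, decay_q, and v_th_q are fixed-point integers sharing the same
    scale (2**frac_bits); spikes_in is a 0/1 vector.
    """
    # Synaptic input: integer matrix-vector product (spikes are 0/1,
    # so in hardware this reduces to additions, not multiplies).
    i_syn = w @ spikes_in
    # Leak: fixed-point multiply, then a right shift to restore the scale.
    v = (v * decay_q) >> frac_bits
    v = v + i_syn
    # Threshold comparison and reset, all on integers.
    spikes_out = (v >= v_th_q).astype(np.int32)
    v = np.where(spikes_out == 1, 0, v)
    return v, spikes_out

# Example: 8 fractional bits, decay 0.9, threshold 1.0.
frac_bits = 8
v = to_fixed([0.5, 0.0], frac_bits)                     # [128, 0]
w = to_fixed([[0.6, 0.2], [0.1, 0.4]], frac_bits)       # weights as integers
decay_q = to_fixed(0.9, frac_bits)                      # 230
v_th_q = to_fixed(1.0, frac_bits)                       # 256
v, spikes = lif_step_fixed(v, w, np.array([1, 0]), decay_q, v_th_q, frac_bits)
```

This covers only the inference-side neural dynamics; the paper additionally quantizes errors and gradients during training, which this sketch does not show.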




    Published In

    ICONS 2020: International Conference on Neuromorphic Systems 2020
    July 2020
    186 pages
    ISBN:9781450388511
    DOI:10.1145/3407197
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. Local Learning
    2. Memory Efficiency
    3. Quantization
    4. Quantized Learning
    5. Spiking Neural Networks
    6. Surrogate Gradient

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    ICONS 2020

    Acceptance Rates

    Overall Acceptance Rate 13 of 22 submissions, 59%


    Article Metrics

    • Downloads (Last 12 months): 98
    • Downloads (Last 6 weeks): 3
    Reflects downloads up to 13 Jan 2025

    Cited By

    • (2024) Neuro-Spark: A Submicrosecond Spiking Neural Networks Architecture for In-Sensor Filtering. 2024 International Conference on Neuromorphic Systems (ICONS), 63-70. DOI: 10.1109/ICONS62911.2024.00017. Online publication date: 30-Jul-2024.
    • (2024) MINT: Multiplier-less INTeger Quantization for Energy Efficient Spiking Neural Networks. 2024 29th Asia and South Pacific Design Automation Conference (ASP-DAC), 830-835. DOI: 10.1109/ASP-DAC58780.2024.10473825. Online publication date: 22-Jan-2024.
    • (2023) Sharing leaky-integrate-and-fire neurons for memory-efficient spiking neural networks. Frontiers in Neuroscience 17. DOI: 10.3389/fnins.2023.1230002. Online publication date: 31-Jul-2023.
    • (2023) The Hardware Impact of Quantization and Pruning for Weights in Spiking Neural Networks. IEEE Transactions on Circuits and Systems II: Express Briefs 70(5), 1789-1793. DOI: 10.1109/TCSII.2023.3260701. Online publication date: May-2023.
    • (2023) Low Precision Quantization-aware Training in Spiking Neural Networks with Differentiable Quantization Function. 2023 International Joint Conference on Neural Networks (IJCNN), 1-8. DOI: 10.1109/IJCNN54540.2023.10191387. Online publication date: 18-Jun-2023.
    • (2023) QMTS: Fixed-point Quantization for Multiple-timescale Spiking Neural Networks. Artificial Neural Networks and Machine Learning - ICANN 2023, 407-419. DOI: 10.1007/978-3-031-44207-0_34. Online publication date: 22-Sep-2023.
    • (2022) Quantization Framework for Fast Spiking Neural Networks. Frontiers in Neuroscience 16. DOI: 10.3389/fnins.2022.918793. Online publication date: 19-Jul-2022.
    • (2022) Training-aware Low Precision Quantization in Spiking Neural Networks. 2022 56th Asilomar Conference on Signals, Systems, and Computers, 1147-1151. DOI: 10.1109/IEEECONF56349.2022.10051957. Online publication date: 31-Oct-2022.
    • (2022) Navigating Local Minima in Quantized Spiking Neural Networks. 2022 IEEE 4th International Conference on Artificial Intelligence Circuits and Systems (AICAS), 352-355. DOI: 10.1109/AICAS54282.2022.9869966. Online publication date: 13-Jun-2022.
    • (2021) Hessian Aware Quantization of Spiking Neural Networks. International Conference on Neuromorphic Systems 2021, 1-5. DOI: 10.1145/3477145.3477158. Online publication date: 27-Jul-2021.
