DOI: 10.1145/3664647.3681186

Q-SNNs: Quantized Spiking Neural Networks

Published: 28 October 2024

Abstract

Brain-inspired Spiking Neural Networks (SNNs) represent information with sparse spikes and process it in an asynchronous, event-driven manner, offering an energy-efficient paradigm for the next generation of machine intelligence. However, the SNN community currently prioritizes accuracy through ever-larger models, which limits deployment on resource-constrained, low-power edge devices. To address this challenge, we introduce a lightweight, hardware-friendly Quantized SNN (Q-SNN) that applies quantization to both synaptic weights and membrane potentials. By substantially compressing these two key elements, the proposed Q-SNNs sharply reduce both memory usage and computational complexity. Moreover, to prevent the performance degradation such compression can cause, we present a new Weight-Spike Dual Regulation (WS-DR) method inspired by information-entropy theory. Experimental evaluations on both static and neuromorphic datasets demonstrate that our Q-SNNs outperform existing methods in terms of both model size and accuracy. These state-of-the-art results in efficiency and efficacy suggest that the proposed method can significantly advance intelligent computing at the edge.
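To make the core idea concrete, the following is a minimal illustrative sketch of quantizing both synaptic weights and the membrane potential in a single discrete-time LIF neuron update. It is not the paper's exact Q-SNN scheme: the bit-widths, clipping ranges, decay factor, and soft-reset behavior here are all assumptions chosen for illustration.

```python
# Illustrative sketch only -- NOT the paper's exact Q-SNN method.
# Bit-widths (bits_w, bits_u), clipping ranges, decay, and soft reset
# are assumed values for demonstration.

def quantize(x, bits, x_max):
    """Uniformly quantize x to a signed `bits`-bit grid over [-x_max, x_max]."""
    levels = 2 ** (bits - 1) - 1          # e.g. 2 bits -> levels = 1
    step = x_max / levels                  # quantization step size
    q = round(x / step)                    # snap to nearest level
    q = max(-levels, min(levels, q))       # clip to representable range
    return q * step

def lif_step(v, inputs, weights, bits_w=2, bits_u=4,
             w_max=1.0, u_max=2.0, threshold=1.0, decay=0.5):
    """One discrete-time LIF update with quantized weights and potential."""
    # Quantize weights before the synaptic sum (weight quantization).
    current = sum(quantize(w, bits_w, w_max) * s
                  for w, s in zip(weights, inputs))
    v = decay * v + current
    # Quantize the membrane potential itself (potential quantization).
    v = quantize(v, bits_u, u_max)
    spike = 1 if v >= threshold else 0
    v = v - threshold if spike else v      # soft reset on firing
    return v, spike

v, spike = lif_step(0.0, [1, 0, 1], [0.6, -0.3, 0.9])
```

Because both the weights and the membrane potential live on small fixed grids, storage per parameter and per neuron state drops from 32 bits to a few bits, which is the memory and compute saving the abstract refers to.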


    Published In

    MM '24: Proceedings of the 32nd ACM International Conference on Multimedia
    October 2024
    11719 pages
    ISBN:9798400706868
    DOI:10.1145/3664647

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. neuromorphic datasets
    2. quantization
    3. spiking neural networks

    Qualifiers

    • Research-article

    Conference

    MM '24
    Sponsor:
    MM '24: The 32nd ACM International Conference on Multimedia
    October 28 - November 1, 2024
    Melbourne VIC, Australia

    Acceptance Rates

    MM '24 Paper Acceptance Rate 1,150 of 4,385 submissions, 26%;
    Overall Acceptance Rate 2,145 of 8,556 submissions, 25%
