
Razor SNN: Efficient Spiking Neural Network with Temporal Embeddings

  • Conference paper
  • In: Artificial Neural Networks and Machine Learning – ICANN 2023 (ICANN 2023)
  • Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14258)

Abstract

The event streams generated by dynamic vision sensors (DVS) are sparse and non-uniform in the spatial domain, yet dense and redundant in the temporal domain. Although spiking neural networks (SNNs), as event-driven neuromorphic models, have the potential to extract spatio-temporal features from event streams, they do so neither effectively nor efficiently. Motivated by this, we propose an event-sparsification spiking framework, dubbed Razor SNN, which progressively prunes pointless event frames. Concretely, at the training stage we extend the dynamic mechanism with global temporal embeddings, reconstruct the features, and adaptively emphasize the effect of events. At the inference stage, fruitless frames are eliminated hierarchically according to a binary mask generated by the trained temporal embeddings. Comprehensive experiments demonstrate that our Razor SNN consistently achieves competitive performance on four event-based benchmarks: DVS128 Gesture, N-Caltech 101, CIFAR10-DVS, and SHD.
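The frame-pruning mechanism described in the abstract — a trained temporal embedding that yields a binary mask over event frames at inference time — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `razor_mask`, the top-k selection rule, and the `keep_ratio` parameter are assumptions introduced here for clarity.

```python
import numpy as np

def razor_mask(frames, temporal_embedding, keep_ratio=0.5):
    """Hypothetical sketch of inference-time frame pruning.

    frames: (T, C, H, W) array of binned event frames.
    temporal_embedding: (T,) learned per-frame importance scores
        (standing in for the trained global temporal embeddings).
    Returns the retained frames and the binary mask.
    """
    T = frames.shape[0]
    # Keep at least one frame; select the top-scoring k frames.
    k = max(1, int(T * keep_ratio))
    top = np.argsort(temporal_embedding)[-k:]
    mask = np.zeros(T, dtype=bool)
    mask[top] = True          # binary mask over the time axis
    return frames[mask], mask

# Toy example: 8 frames; the embedding scores favor later frames.
frames = np.random.rand(8, 2, 4, 4)
emb = np.arange(8, dtype=float)
kept, mask = razor_mask(frames, emb, keep_ratio=0.25)
# kept has 2 frames (the two highest-scoring), so the downstream
# SNN only processes a quarter of the original time steps.
```

In the paper's setting the embedding is trained jointly with the network and the pruning is applied hierarchically across layers; the sketch above shows only the single-level masking idea.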

This work was supported by the National Key Research and Development Program of China (Grant No. 2018YFE0203801).



Author information


Corresponding author

Correspondence to Jian Cao.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Zhang, Y., Cao, J., Chen, J., Sun, W., Wang, Y. (2023). Razor SNN: Efficient Spiking Neural Network with Temporal Embeddings. In: Iliadis, L., Papaleonidas, A., Angelov, P., Jayne, C. (eds) Artificial Neural Networks and Machine Learning – ICANN 2023. ICANN 2023. Lecture Notes in Computer Science, vol 14258. Springer, Cham. https://doi.org/10.1007/978-3-031-44192-9_33


  • DOI: https://doi.org/10.1007/978-3-031-44192-9_33

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-44191-2

  • Online ISBN: 978-3-031-44192-9

  • eBook Packages: Computer Science; Computer Science (R0)
