DOI: 10.1145/3584954.3584993
Research article
Open access

hxtorch.snn: Machine-learning-inspired Spiking Neural Network Modeling on BrainScaleS-2

Published: 12 April 2023

Abstract

Neuromorphic systems require user-friendly software to support the design and optimization of experiments. In this work, we address this need by presenting our development of a machine-learning-based modeling framework for the BrainScaleS-2 neuromorphic system. This work represents an improvement over previous efforts, which either focused on the matrix-multiplication mode of BrainScaleS-2 or lacked full automation. Our framework, called hxtorch.snn, enables hardware-in-the-loop training of spiking neural networks within PyTorch, including support for automatic differentiation in a fully automated hardware experiment workflow. In addition, hxtorch.snn facilitates seamless transitions between emulating on hardware and simulating in software. We demonstrate the capabilities of hxtorch.snn on a classification task using the Yin-Yang dataset, employing a gradient-based approach with surrogate gradients and densely sampled membrane observations from the BrainScaleS-2 hardware system.
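
The gradient-based approach with surrogate gradients mentioned above replaces the non-differentiable spike threshold with a smooth derivative during the backward pass, so that standard automatic differentiation in PyTorch can optimize network parameters against hardware observations. The following is a minimal, self-contained sketch of this general technique in plain PyTorch; it is illustrative only and does not reproduce the hxtorch.snn API. The names SuperSpikeFunction and lif_step, as well as all parameter values, are hypothetical choices made for this example.

```python
# Illustrative surrogate-gradient sketch in plain PyTorch; not the hxtorch.snn API.
import torch

ALPHA = 10.0  # steepness of the surrogate derivative (illustrative value)


class SuperSpikeFunction(torch.autograd.Function):
    """Heaviside spike non-linearity with a smooth surrogate derivative.

    Forward: emit a spike whenever the membrane exceeds threshold.
    Backward: use a SuperSpike-style surrogate, 1 / (ALPHA * |v| + 1)^2,
    so gradients can flow through the otherwise non-differentiable step.
    """

    @staticmethod
    def forward(ctx, v_minus_threshold):
        ctx.save_for_backward(v_minus_threshold)
        return (v_minus_threshold > 0.0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        surrogate = 1.0 / (ALPHA * v.abs() + 1.0) ** 2
        return grad_output * surrogate


def lif_step(v, x, weight, tau_mem=20e-3, dt=1e-3, v_th=1.0):
    """One explicit-Euler step of a leaky integrate-and-fire layer."""
    v = v + dt / tau_mem * (-v + x @ weight)       # leaky integration of input current
    spikes = SuperSpikeFunction.apply(v - v_th)    # threshold with surrogate gradient
    v = v * (1.0 - spikes.detach())                # reset membrane after a spike
    return spikes, v


# Toy usage: 10 inputs projecting onto 5 LIF neurons over 100 time steps.
inputs = (torch.rand(100, 10) < 0.05).float()      # sparse random input spike trains
weight = torch.nn.Parameter(0.5 * torch.randn(10, 5))

v = torch.zeros(5)
spike_count = torch.zeros(5)
for t in range(inputs.shape[0]):
    spikes, v = lif_step(v, inputs[t], weight)
    spike_count = spike_count + spikes

loss = spike_count.sum()   # placeholder loss; a real task would compare against labels
loss.backward()            # gradients reach `weight` through the surrogate derivative
```

In hxtorch.snn itself, the forward pass is an emulation run on the BrainScaleS-2 hardware and the recorded membrane and spike observations take the place of the simulated quantities above; the surrogate-gradient backward pass in PyTorch remains conceptually the same.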

Cited By

  • (2024) Neuromorphic intermediate representation: A unified instruction set for interoperable brain-inspired computing. Nature Communications 15(1). https://doi.org/10.1038/s41467-024-52259-9. Online publication date: 16-Sep-2024.
  • (2023) Interfacing Neuromorphic Hardware with Machine Learning Frameworks - A Review. Proceedings of the 2023 International Conference on Neuromorphic Systems, 1-8. https://doi.org/10.1145/3589737.3605967. Online publication date: 1-Aug-2023.


Information

Published In

NICE '23: Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference
April 2023
124 pages
ISBN:9781450399470
DOI:10.1145/3584954
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 12 April 2023

Author Tags

  1. accelerator
  2. analog computing
  3. hardware abstraction
  4. modeling
  5. neuromorphic

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

NICE 2023

Acceptance Rates

Overall Acceptance Rate 25 of 40 submissions, 63%

Article Metrics

  • Downloads (Last 12 months)300
  • Downloads (Last 6 weeks)52
Reflects downloads up to 31 Dec 2024
