
StreamBrain: An HPC Framework for Brain-like Neural Networks on CPUs, GPUs and FPGAs

Published: 21 June 2021

Abstract

The modern deep learning method based on backpropagation has surged in popularity and has been used in multiple domains and application areas. At the same time, there are other, less-known machine learning algorithms with a mature and solid theoretical foundation whose performance remains unexplored. One such example is the brain-like Bayesian Confidence Propagation Neural Network (BCPNN). In this paper, we introduce StreamBrain, a framework that allows neural networks based on BCPNN to be practically deployed in High-Performance Computing systems. StreamBrain is a domain-specific language (DSL), similar in concept to existing machine learning (ML) frameworks, and supports backends for CPUs, GPUs, and even FPGAs. We empirically demonstrate that StreamBrain can train on the well-known ML benchmark dataset MNIST within seconds, and we are the first to demonstrate BCPNN on STL-10-sized networks. We also show how StreamBrain can be used to train with custom floating-point formats and illustrate the impact of using different bfloat variations on BCPNN using FPGAs.
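The custom floating-point experiments mentioned above rely on bfloat-style formats that keep the float32 exponent but shrink the mantissa. As a rough illustration (not StreamBrain's actual implementation, whose API the abstract does not describe), such reduced-precision variants can be emulated in software by truncating low-order mantissa bits of float32 values:

```python
import numpy as np

def truncate_to_bfloat(x, mantissa_bits=7):
    """Emulate a bfloat-style format by zeroing the low-order mantissa
    bits of float32 values (round-toward-zero). bfloat16 keeps 7 explicit
    mantissa bits; smaller values emulate even narrower variants."""
    bits = np.asarray(x, dtype=np.float32).view(np.uint32)
    drop = 23 - mantissa_bits  # float32 has 23 explicit mantissa bits
    mask = np.uint32((0xFFFFFFFF << drop) & 0xFFFFFFFF)
    return (bits & mask).view(np.float32)

# Values exactly representable in the narrow format pass through unchanged;
# others lose precision, e.g. 0.1 becomes roughly 0.0996 with 7 mantissa bits.
w = np.array([0.1, 1.5, -3.14159], dtype=np.float32)
print(truncate_to_bfloat(w, mantissa_bits=7))
```

Because the exponent field is untouched, dynamic range is preserved while storage and arithmetic width shrink, which is what makes such formats attractive on FPGAs.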


Cited By

  • SEMKIS-DSL: A Domain-Specific Language to Support Requirements Engineering of Datasets and Neural Network Recognition. Information 14(4), 213 (April 2023). DOI: 10.3390/info14040213
  • A domain-specific language for describing machine learning datasets. Journal of Computer Languages 76, 101209 (August 2023). DOI: 10.1016/j.cola.2023.101209


      Published In

      HEART '21: Proceedings of the 11th International Symposium on Highly Efficient Accelerators and Reconfigurable Technologies
      June 2021
      76 pages
      ISBN:9781450385497
      DOI:10.1145/3468044
      © 2021 Association for Computing Machinery. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of the United States government. As such, the United States Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.

      In-Cooperation

      • German Research Foundation

      Publisher

      Association for Computing Machinery

      New York, NY, United States



      Author Tags

      1. AI
      2. BCPNN
      3. Emerging Machine Learning
      4. FPGA
      5. GPU
      6. HPC
      7. Neural networks
      8. Representation learning
      9. Unsupervised learning

      Qualifiers

      • Research-article
      • Research
      • Refereed limited


      Conference

      HEART '21

      Acceptance Rates

      Overall acceptance rate: 22 of 50 submissions (44%)


      Article Metrics

      • Downloads (last 12 months): 24
      • Downloads (last 6 weeks): 2
      Reflects downloads up to 15 Oct 2024

