Extended Abstract
DOI: 10.1145/3381755.3381771

Natural gradient learning for spiking neurons

Published: 18 June 2020

Abstract

Due to their simplicity and success in machine learning, gradient-based learning rules represent a popular choice for synaptic plasticity models. While they have been linked to biological observations, it is often ignored that their predictions generally depend on a specific representation of the synaptic strength. In a neuron, the impact of a synapse can be described using the state of many different observables, such as neurotransmitter release rates or membrane potential changes. Which of these is chosen when deriving a learning rule can drastically change the predictions of the model. This is doubly unsatisfactory, both with respect to optimality and from a conceptual point of view. By following the gradient on the manifold of the neuron's firing distributions rather than one relative to some arbitrary synaptic weight parametrization, natural gradient descent provides a solution to both problems. While the computational advantages of natural gradient descent are well studied in artificial neural networks (ANNs), its predictive power as a model for in-vivo synaptic plasticity has not yet been assessed. By formulating natural gradient learning in the context of spiking interactions, we demonstrate how it can improve the convergence speed of spiking networks. Furthermore, our approach provides a unified, normative framework for both homo- and heterosynaptic plasticity in structured neurons and predicts a number of related biological phenomena.
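For orientation, the core idea can be sketched with the standard natural gradient update in the sense of Amari; the following is the generic textbook form, not the paper's specific spiking formulation. Given a loss L(θ) on a statistical model p(x; θ), the natural gradient preconditions the ordinary (Euclidean) gradient with the inverse Fisher information matrix:

\[ \Delta\theta = -\eta\, F(\theta)^{-1} \nabla_{\theta} L(\theta), \qquad F_{ij}(\theta) = \mathbb{E}_{x \sim p(x;\theta)}\!\left[ \frac{\partial \log p(x;\theta)}{\partial \theta_i}\, \frac{\partial \log p(x;\theta)}{\partial \theta_j} \right]. \]

Since the Fisher matrix transforms as a metric tensor under a smooth change of parameters, the resulting step on the manifold of firing distributions is, to first order, the same whether the synaptic strength is expressed as, for example, a neurotransmitter release rate or a postsynaptic potential amplitude. This parametrization invariance is what distinguishes the approach from ordinary gradient descent, whose update direction depends on the chosen representation.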

References

[1]
Laurence Aitchison and Peter E Latham. 2014. Bayesian synaptic plasticity makes predictions about plasticity experiments in vivo. arXiv preprint arXiv:1410.1029 (2014).
[2]
Shun-Ichi Amari. 1998. Natural gradient works efficiently in learning. Neural computation 10, 2 (1998), 251--276.
[3]
Marina Chistiakova and Maxim Volgushev. 2009. Heterosynaptic plasticity in the neocortex. Experimental brain research 199, 3-4 (2009), 377.
[4]
Steven K Esser, Paul A Merolla, John V Arthur, Andrew S Cassidy, Rathinakumar Appuswamy, Alexander Andreopoulos, David J Berg, Jeffrey L McKinstry, Timothy Melano, Davis R Barch, et al. 2016. Convolutional networks for fast energy-efficient neuromorphic computing. Proc. Nat. Acad. Sci. USA 113, 41 (2016), 11441--11446.
[5]
Michael Häusser. 2001. Synaptic function: dendritic democracy. Current Biology 11, 1 (2001), R10-R12.
[6]
Diederik P Kingma and Jimmy Ba. 2014. Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980 (2014).
[7]
Gary S Lynch, Thomas Dunwiddie, and Valentin Gribkoff. 1977. Heterosynaptic depression: a postsynaptic correlate of long-term potentiation. Nature 266, 5604 (1977), 737.
[8]
Mihai A Petrovici, Sebastian Schmitt, Johann Klahn, David Stöckel, Anna Schroeder, Guillaume Bellec, Johannes Bill, Oliver Breitwieser, Ilja Bytschok, Andreas Grübl, et al. 2017. Pattern representation and recognition with accelerated analog neuromorphic systems. In 2017 IEEE International Symposium on Circuits and Systems (ISCAS). IEEE, 1--4.
[9]
Jean-Pascal Pfister, Taro Toyoizumi, David Barber, and Wulfram Gerstner. 2006. Optimal spike-timing-dependent plasticity for precise action potential firing in supervised learning. Neural computation 18, 6 (2006), 1318--1348.
[10]
Sébastien Royer and Denis Paré. 2003. Conservation of total synaptic weight through balanced synaptic depression and potentiation. Nature 422, 6931 (2003), 518.
[11]
Sebastian Schmitt, Johann Klähn, Guillaume Bellec, Andreas Grübl, Maurice Guettler, Andreas Hartel, Stephan Hartmann, Dan Husmann, Kai Husmann, Sebastian Jeltsch, et al. 2017. Neuromorphic hardware in the loop: Training a deep spiking network on the brainscales wafer-scale system. In 2017 International Joint Conference on Neural Networks (IJCNN). IEEE, 2227--2234.
[12]
Chetan Singh Thakur Thakur, Jamal Molin, Gert Cauwenberghs, Giacomo Indiveri, Kundan Kumar, Ning Qiao, Johannes Schemmel, Runchun Mark Wang, Elisabetta Chicca, Jennifer Olson Hasler, et al. 2018. Large-scale neuromorphic spiking array processors: A quest to mimic the brain. Frontiers in neuroscience 12 (2018), 891.
[13]
Robert Urbanczik and Walter Senn. 2014. Learning by the dendritic prediction of somatic spiking. Neuron 81, 3(2014), 521--528.
[14]
Robert Wöhrl, Dorothea Von Haebler, and Uwe Heinemann. 2007. Low-frequency stimulation of the direct cortical input to area CA1 induces homosynaptic LTD and heterosynaptic LTP in the rat hippocampal-entorhinal cortex slice preparation. European Journal of Neuroscience 25, 1 (2007), 251--258.
[15]
Friedemann Zenke, Everton J Agnes, and Wulfram Gerstner. 2015. Diverse synaptic plasticity mechanisms orchestrated to form and retrieve memories in spiking neural networks. Nature communications 6 (2015), 6922.

Cited By

  • Conductance-based dendrites perform Bayes-optimal cue integration. PLOS Computational Biology 20, 6 (12 June 2024), e1012047. DOI: 10.1371/journal.pcbi.1012047

Published In

NICE '20: Proceedings of the 2020 Annual Neuro-Inspired Computational Elements Workshop
March 2020
131 pages
ISBN:9781450377188
DOI:10.1145/3381755
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

In-Cooperation

  • Intel Corporation
  • IBM

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 18 June 2020

Author Tags

  1. dendritic learning
  2. efficient learning
  3. heterosynaptic plasticity
  4. homeostasis
  5. natural gradient descent
  6. parametrization invariance

Qualifiers

  • Extended-abstract
  • Research
  • Refereed limited

Conference

NICE '20
NICE '20: Neuro-inspired Computational Elements Workshop
March 17 - 20, 2020
Heidelberg, Germany

Acceptance Rates

Overall Acceptance Rate 25 of 40 submissions, 63%

