Supervised learning in multilayer spiking neural networks with inner products of spike trains

Published: 10 May 2017

Abstract

Recent advances in neuroscience have revealed that neural information in the brain is encoded through precisely timed spike trains, not only through neural firing rates. This paper presents a new supervised multi-spike learning algorithm for multilayer spiking neural networks that can carry out complex spatio-temporal pattern learning of spike trains. The proposed algorithm first defines inner product operators to mathematically describe and manipulate spike trains, and then solves the problems of error-function construction and backpropagation among multiple output spikes during learning. The algorithm is successfully applied to different temporal tasks, such as learning sequences of spikes and nonlinear pattern classification problems. The experimental results show that the proposed algorithm achieves higher learning accuracy and efficiency than the Multi-ReSuMe learning algorithm and is effective for solving complex spatio-temporal pattern learning problems.
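The abstract's central idea is treating spike trains as objects with a well-defined inner product, so that an error function and its gradients can be expressed over spike timings. As a minimal sketch (not the paper's exact operator), a common choice in the spike-train kernel literature is to sum a smoothing kernel, e.g. a Laplacian exp(-|t - t'|/tau), over all pairs of spike times; the induced squared distance between a target and an actual output train can then serve as an error function. The function names and the tau parameter below are illustrative assumptions:

```python
import numpy as np

def spike_train_inner_product(s1, s2, tau=10.0):
    """Kernel-based inner product of two spike trains (lists of spike times).

    Sums the Laplacian kernel exp(-|t - t'| / tau) over all pairs of
    spike times; tau (ms) controls the temporal smoothing scale.
    This is an illustrative choice of kernel, not the paper's exact operator.
    """
    s1 = np.asarray(s1, dtype=float)
    s2 = np.asarray(s2, dtype=float)
    if s1.size == 0 or s2.size == 0:
        return 0.0
    # Pairwise |t_i - t_j| via broadcasting, then kernel sum.
    diffs = np.abs(s1[:, None] - s2[None, :])
    return float(np.exp(-diffs / tau).sum())

def spike_train_error(target, actual, tau=10.0):
    """Squared distance induced by the inner product:
    ||target - actual||^2 = <t,t> - 2<t,a> + <a,a>.
    Zero exactly when the two trains coincide; usable as a learning error."""
    return (spike_train_inner_product(target, target, tau)
            - 2.0 * spike_train_inner_product(target, actual, tau)
            + spike_train_inner_product(actual, actual, tau))
```

Because the kernel is differentiable in the spike times, an error of this form can be backpropagated through the timing of multiple output spikes, which is the role the inner product operators play in the algorithm described above.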



Published In

Neurocomputing, Volume 237, Issue C, May 2017, 401 pages
Publisher: Elsevier Science Publishers B.V., Netherlands

Author Tags

  1. Backpropagation algorithm
  2. Inner products of spike trains
  3. Multilayer feedforward network
  4. Spiking neural networks
  5. Supervised learning
