Hebbian learning of recurrent connections: A geometrical perspective

Published: 01 September 2012

Abstract

We show how a Hopfield network with modifiable recurrent connections undergoing slow Hebbian learning can extract the underlying geometry of an input space. First, we use a slow/fast analysis to derive an averaged system whose dynamics derives from an energy function and therefore always converges to equilibrium points. The equilibria reflect the correlation structure of the inputs, a global object extracted through local recurrent interactions only. Second, we use numerical methods to illustrate how learning extracts the hidden geometrical structure of the inputs. Indeed, multidimensional scaling methods make it possible to project the final connectivity matrix onto a Euclidean distance matrix in a high-dimensional space, with the neurons labeled by spatial position within this space. The resulting network structure turns out to be roughly convolutional. The residual of the projection defines the nonconvolutional part of the connectivity, which is minimized in the process. Third, we show how restricting the dimension of the space in which the neurons live gives rise to patterns similar to cortical maps, which we motivate with an energy-efficiency argument based on wire-length minimization. Finally, we show how this approach leads to the emergence of ocular dominance or orientation columns in primary visual cortex via the self-organization of recurrent rather than feedforward connections. In addition, we establish that the nonconvolutional (or long-range) connectivity is patchy and co-aligned in the case of orientation learning.
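
The abstract describes a pipeline with two concrete steps: slow Hebbian learning of the recurrent weights in a rate-based Hopfield-type network driven by structured inputs, followed by a multidimensional scaling (MDS) embedding of the learned connectivity matrix. The sketch below is a minimal illustration of that pipeline rather than the authors' exact equations; the network size, the ring-structured input ensemble, the Oja-style decay term, and the weight-to-distance conversion are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 60          # number of neurons (assumption for this example)
eps = 5e-3      # learning rate; learning is slow relative to the neural dynamics
tau = 1.0       # neural time constant
dt = 0.05
steps = 20_000

# Inputs with a hidden ring geometry: each neuron is assigned a preferred
# angle, and every stimulus excites neurons near a randomly drawn angle.
# (This input ensemble is an assumption made for the illustration.)
angles = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)

def sample_input():
    theta = rng.uniform(0.0, 2.0 * np.pi)
    return np.exp(np.cos(angles - theta) - 1.0)

W = np.zeros((n, n))   # recurrent connectivity, shaped by learning
V = np.zeros(n)        # membrane state
S = np.tanh            # firing-rate nonlinearity

for _ in range(steps):
    I = sample_input()
    # Fast neural dynamics: leaky integration of recurrent and external input.
    V += dt / tau * (-V + (W @ S(V)) / n + I)
    r = S(V)
    # Slow Hebbian learning; the -W decay keeps the weights bounded
    # (an Oja-style term used here for stability, not the paper's exact rule).
    W += eps * dt * (np.outer(r, r) - W)

# Classical MDS on the learned connectivity: treat strong weights as short
# distances, double-center, and read off a low-dimensional embedding.
D = np.max(W) - W                    # crude weight-to-distance map (assumption)
D = 0.5 * (D + D.T)                  # symmetrize
J = np.eye(n) - np.ones((n, n)) / n  # centering matrix
B = -0.5 * J @ (D ** 2) @ J          # Gram matrix of the embedding
evals, evecs = np.linalg.eigh(B)
top = np.argsort(evals)[::-1][:2]
coords = evecs[:, top] * np.sqrt(np.maximum(evals[top], 0.0))
print("2-D MDS coordinates of the neurons:", coords.shape)
```

If learning has captured the hidden ring, the two-dimensional coordinates should trace out an approximate circle, which is the sense in which the embedded connectivity is roughly convolutional (a function of the distance between neuron positions).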


Cited By

  • Hopfield network-based approach to detect seam-carved images and identify tampered regions. Neural Computing and Applications, 31(10), 6479-6492 (2019). https://doi.org/10.1007/s00521-018-3463-8
  • Continuous neural network with windowed Hebbian learning. Biological Cybernetics, 109(3), 321-332 (2015). https://doi.org/10.1007/s00422-015-0645-7

Published In

Neural Computation, Volume 24, Issue 9 (September 2012), 292 pages.

Publisher: MIT Press, Cambridge, MA, United States.
