Abstract
In this paper, the behavior of several Hebbian artificial neural networks with lateral weights is analyzed. Hebbian neural networks are employed in communications and signal processing applications to implement on-line Principal Component Analysis (PCA). Several improvements over the original Oja model have been developed over the last two decades; among them, models with lateral weights have been designed to directly provide the eigenvectors of the correlation matrix [1,5,6,9]. The behavior of Hebbian models has traditionally been studied via an associated continuous-time formulation, under some questionable assumptions that are not guaranteed in real implementations. In this paper we employ the alternative deterministic discrete-time (DDT) formulation, which characterizes the average evolution of these networks and captures the influence of the time evolution of the learning gains [12]. The dynamic behavior of several of these Hebbian models is analytically characterized in this framework, and simulations complement the comparative study.
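As a minimal illustration of the DDT setting the abstract refers to, the sketch below iterates the DDT form of the basic Oja rule, w(k+1) = w(k) + η[Cw(k) − (w(k)ᵀCw(k))w(k)], which drives the weight vector toward the dominant eigenvector of the correlation matrix C. The matrix, gain, and iteration count are toy values chosen for this sketch, not parameters from the paper.

```python
import numpy as np

# DDT (average) iteration of the basic Oja rule:
#   w(k+1) = w(k) + eta * (C w(k) - (w(k)^T C w(k)) w(k))
# C, eta, and the iteration count are illustrative toy values.

C = np.diag([3.0, 2.0, 1.0])   # correlation matrix; dominant eigenvector is e1
eta = 0.05                     # constant learning gain, small w.r.t. 1/lambda_max

rng = np.random.default_rng(0)
w = rng.standard_normal(3)
w /= np.linalg.norm(w)         # random unit-norm initial weight vector

for _ in range(2000):
    w = w + eta * (C @ w - (w @ C @ w) * w)

print(abs(w[0]), np.linalg.norm(w))  # both approach 1.0: aligned with e1, unit norm
```

Note that with a constant gain the stationary points of this map are the unit-norm eigenvectors of C; the lateral-weight models discussed in the paper extend this scheme so that the minor eigenvectors can be extracted as well.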
References
Berzal, J.A., Zufiria, P.J.: Local analysis of a new Rubner-type neural network via a DDT formulation. In: Proc. of the IEEE International Symposium on Intelligent Signal Processing (WISP’2005), Faro, Portugal, p. 6 (2005)
Berzal, J.A., Zufiria, P.J.: Dynamic behavior of DCT and DDT formulations for the Sanger neural network. Neurocomputing (accepted)
Berzal, J.A., Zufiria, P.J.: Analysis of the Sanger Hebbian Neural Network. In: Cabestany, J., Prieto, A.G., Sandoval, F. (eds.) IWANN 2005. LNCS, vol. 3512, pp. 9–16. Springer, Heidelberg (2005)
Berzal, J.A., Zufiria, P.J.: Convergence analysis of a linearized Rubner network with modified lateral weight behavior. In: Criado, R. (ed.) Proc. of the 2006 Conference on Computational and Mathematical Methods on Science and Engineering (CMMSE’06), pp. 125–132 (2006)
Diamantaras, K.I., Kung, S.Y.: Principal Component Neural Networks: Theory and Applications. John Wiley and Sons, New York (1994)
Kung, S.Y., Diamantaras, K.I.: A Neural Network Learning Algorithm for Adaptive Principal Component EXtraction (APEX). In: Proc. IEEE Int. Conf. Acoustics, Speech and Signal Processing, pp. 861–864 (1990)
Kushner, H.J., Yin, G.G.: Stochastic Approximation Algorithms and Applications. Springer, New York (1997)
Lancaster, P., Tismenetsky, M.: The theory of matrices, 2nd edn. Academic Press, San Diego (1985)
Rubner, J., Tavan, P.: A Self-Organizing Network for Principal Component Analysis. Europhys. Lett. 10(7), 693–698 (1989)
Weingessel, A., Hornik, K.: Local PCA Algorithms. IEEE Transactions on Neural Networks, 1242–1250 (November 2000)
Yan, W.-Y., Helmke, U., Moore, J.B.: Global Analysis of Oja’s Flow for Neural Networks. IEEE Transactions on Neural Networks, 674–683 (September 1994)
Zufiria, P.J.: On the Discrete-Time Dynamics of the Basic Hebbian Neural-Network Node. IEEE Transactions on Neural Networks, 1342–1352 (November 2002)
Copyright information
© 2007 Springer-Verlag Berlin Heidelberg
Cite this paper
Zufiria, P.J., Berzal, J.A. (2007). Analysis of Hebbian Models with Lateral Weight Connections. In: Sandoval, F., Prieto, A., Cabestany, J., Graña, M. (eds) Computational and Ambient Intelligence. IWANN 2007. Lecture Notes in Computer Science, vol 4507. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-73007-1_6
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-73006-4
Online ISBN: 978-3-540-73007-1