Abstract
Most Artificial Neural Networks (ANNs) have a fixed topology during learning, and often suffer from a number of shortcomings as a result. Variations of ANNs that use dynamic topologies have shown the ability to overcome many of these problems. This paper introduces Location-Independent Transformations (LITs) as a general strategy for implementing distributed feedforward networks that use dynamic topologies (dynamic ANNs) efficiently in parallel hardware. A LIT creates a set of location-independent nodes, where each node computes its part of the network output independently of other nodes, using local information. This type of transformation allows efficient support for adding and deleting nodes dynamically during learning. In particular, this paper presents a LIT for standard Backpropagation with two layers of weights, and shows how dynamic extensions to Backpropagation can be supported.
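The core idea can be illustrated with a minimal sketch. The `LITNode` class and its decomposition below are hypothetical, not the paper's actual transformation: it assumes each location-independent node owns one hidden unit (its incoming weight vector plus that unit's row of output-layer weights), so each node computes its partial output from purely local information, and the network output is the sum of the nodes' contributions. Under this assumption, output biases and any output-layer nonlinearity are omitted so the sum decomposes cleanly.

```python
import numpy as np

class LITNode:
    """Hypothetical location-independent node: owns one hidden unit's
    incoming weights and that unit's outgoing weights. Computes its
    share of the network output using only local information."""

    def __init__(self, n_inputs, n_outputs, rng):
        self.w_in = rng.standard_normal(n_inputs)    # input -> hidden weights
        self.w_out = rng.standard_normal(n_outputs)  # hidden -> output weights

    def contribution(self, x):
        h = np.tanh(self.w_in @ x)   # this node's hidden activation
        return h * self.w_out        # partial output vector, local info only

def network_output(nodes, x):
    # Linear output layer assumed, so the output is just the sum of
    # per-node contributions; no node needs to know about any other.
    return sum(node.contribution(x) for node in nodes)

rng = np.random.default_rng(0)
nodes = [LITNode(3, 2, rng) for _ in range(4)]
x = np.array([0.5, -1.0, 0.25])
y = network_output(nodes, x)

# Dynamic topology: adding a node only appends to the set; existing
# nodes are untouched, and the output changes by the new contribution.
nodes.append(LITNode(3, 2, rng))
y2 = network_output(nodes, x)
```

Because each node's contribution is independent, deleting a node is equally local: remove it from the set and the remaining nodes' computations are unchanged, which is what makes dynamic growth and pruning cheap on parallel hardware.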
© 1995 Springer-Verlag/Wien
Cite this paper
Rudolph, G.L., Martinez, T.R. (1995). A Transformation for Implementing Efficient Dynamic Backpropagation Neural Networks. In: Artificial Neural Nets and Genetic Algorithms. Springer, Vienna. https://doi.org/10.1007/978-3-7091-7535-4_13
Publisher Name: Springer, Vienna
Print ISBN: 978-3-211-82692-8
Online ISBN: 978-3-7091-7535-4