Abstract
Recently, recurrent neural network models have been used, together with their associated learning schemes, to infer regular grammars from sets of sample strings. The representation of the inferred automaton is hidden in the weights and connections of the net, a common feature of emergent subsymbolic representations. In order to relate the symbolic and connectionist approaches to grammatical inference and recognition, we address and solve a basic problem: how to build a neural network recognizer for a given regular language specified by a deterministic finite-state automaton. A second-order recurrent network model is employed, which makes it possible to formulate the problem as the solution of a linear system of equations. These equations represent the automaton transitions directly, in terms of static linear approximations of the network running equations, and can be viewed as constraints to be satisfied by the network weights. Both the weight computation step and the string recognition procedure are described.
The author is supported through a grant from the Government of Catalonia.
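As a rough illustration of the construction outlined in the abstract, the following sketch encodes a DFA into the weights of a second-order recurrent network and runs the resulting recognizer. It assumes the standard second-order running equation S_j(t+1) = g(sum_{i,k} W_{jik} S_i(t) I_k(t)) with one-hot state and input vectors, in which case the linear transition constraints admit a direct closed-form assignment (+H on the target next-state unit, -H elsewhere). The function names, the strength parameter H, and the acceptance test are illustrative assumptions, not the paper's exact procedure.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def encode_dfa(n_states, n_symbols, delta, H=8.0):
    # Weight tensor W[j, i, k]: contribution of state unit i and input
    # symbol k to the pre-activation of next-state unit j. With one-hot
    # states and inputs the linear transition constraints are satisfied
    # in closed form: net_j = +H if delta(q_i, a_k) = q_j, else -H.
    # (Illustrative special case, not the paper's general linear solver.)
    W = -H * np.ones((n_states, n_states, n_symbols))
    for (i, k), j in delta.items():
        W[j, i, k] = H
    return W

def recognize(W, string, accepting, start=0):
    # Run the network S_j <- g(sum_{i,k} W[j,i,k] S_i I_k) over the
    # string, then test the final state against the accepting set
    # (the 0.5 threshold is an illustrative assumption).
    n_states, _, n_symbols = W.shape
    S = np.zeros(n_states)
    S[start] = 1.0                      # one-hot initial state
    for sym in string:
        I = np.zeros(n_symbols)
        I[sym] = 1.0                    # one-hot input symbol
        S = sigmoid(np.einsum('jik,i,k->j', W, S, I))
    return any(S[q] > 0.5 for q in accepting)

# Example: two-state DFA accepting strings with an even number of 1s.
delta = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
W = encode_dfa(n_states=2, n_symbols=2, delta=delta)
print(recognize(W, [1, 0, 1], accepting={0}))   # True  (two 1s)
print(recognize(W, [1, 1, 1], accepting={0}))   # False (three 1s)

Because the sigmoid saturates near the +H and -H targets, the state vector stays close to one-hot at every step, which is what keeps a static linear approximation of the running equation valid over arbitrarily long strings.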
Copyright information
© 1993 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Alquézar, R., Sanfeliu, A. (1993). Representation and recognition of regular grammars by means of second-order recurrent neural networks. In: Mira, J., Cabestany, J., Prieto, A. (eds) New Trends in Neural Computation. IWANN 1993. Lecture Notes in Computer Science, vol 686. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-56798-4_138
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-56798-1
Online ISBN: 978-3-540-47741-9