
Representation and recognition of regular grammars by means of second-order recurrent neural networks

  • Conference paper in New Trends in Neural Computation (IWANN 1993)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 686)

Abstract

Recently, some neural network models, namely recurrent neural networks, have been used in conjunction with their associated learning schemes to infer regular grammars from a set of sample strings. The representation of the inferred automaton is hidden in the weights and connections of the net, a common feature of emergent subsymbolic representations. In order to relate the symbolic and connectionist approaches to the tasks of grammatical inference and recognition, we address and solve a basic problem: how to build a neural network recognizer for a given regular language specified by a deterministic finite-state automaton. A second-order recurrent network model is employed, which allows the problem to be formulated as the solution of a linear system of equations. These equations directly represent the automaton transitions in terms of static linear approximations of the network running equations, and can be viewed as constraints to be satisfied by the network weights. A description is given of both the weight-computation step and the string-recognition procedure.
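The general idea the abstract describes (encoding the transitions of a deterministic finite-state automaton directly into the weights of a second-order recurrent network, then running the network as a string recognizer) can be sketched as follows. This is a minimal illustration of the encoding principle, not the authors' linear-system construction; the even-parity DFA, the weight magnitude H, and the 0.5 acceptance threshold are assumptions chosen for the example.

```python
import numpy as np

# Hypothetical DFA over {0,1}: accepts strings with an even number of 1s.
# States: q0 (start, accepting) and q1. delta[(state, symbol)] = next state.
n_states, n_symbols = 2, 2
delta = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
accepting = {0}

H = 8.0  # assumed weight magnitude, large enough to saturate the sigmoid

# Program the transitions into the second-order weight tensor W:
# W[i, j, k] > 0 iff delta(q_j, symbol_k) = q_i, otherwise < 0.
W = -H * np.ones((n_states, n_states, n_symbols))
for (j, k), i in delta.items():
    W[i, j, k] = H

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def recognize(string):
    s = np.zeros(n_states)
    s[0] = 1.0  # one-hot encoding of the start state
    for ch in string:
        x = np.zeros(n_symbols)
        x[int(ch)] = 1.0  # one-hot encoding of the input symbol
        # second-order update: s_i(t+1) = g( sum_{j,k} W[i,j,k] s_j(t) x_k(t) )
        s = sigmoid(np.einsum('ijk,j,k->i', W, s, x))
    return any(s[i] > 0.5 for i in accepting)

print(recognize("0110"))  # two 1s: accepted
print(recognize("010"))   # one 1: rejected
```

With a saturating sigmoid and one-hot state vectors, the network state stays close to a corner of the unit hypercube, so the second-order product effectively implements the transition table.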

The author is supported through a grant from the Government of Catalonia.


References

  1. J.L. Elman, "Finding structure in time", CRL Technical Report 8801, University of California, San Diego, Center for Research in Language, 1988.

  2. C.L. Giles et al., "Learning and extracting finite state automata with second-order recurrent neural networks", Neural Computation, vol. 4, pp. 393–405, 1992.

  3. J.E. Hopcroft and J.D. Ullman, "Introduction to Automata Theory, Languages and Computation", p. 68, Addison-Wesley, Reading, MA, 1979.

  4. L. Miclet, "Grammatical Inference", chapter 9 in "Syntactic and Structural Pattern Recognition: Theory and Applications", H. Bunke and A. Sanfeliu (eds.), World Scientific, 1990.

  5. A. Sanfeliu and R. Alquézar, "Understanding neural networks for grammatical inference and recognition", IAPR Int. Workshop on Structural and Syntactic Pattern Recognition, Bern, August 26–28, 1992.

  6. D. Servan-Schreiber, A. Cleeremans and J.L. McClelland, "Graded state machines: the representation of temporal contingencies in simple recurrent networks", Machine Learning, vol. 7, pp. 161–193, 1991.

  7. A.W. Smith and D. Zipser, "Learning sequential structure with the real-time recurrent learning algorithm", Int. Journal of Neural Systems, vol. 1, no. 2, pp. 125–131, 1989.

  8. R.L. Watrous and G.M. Kuhn, "Induction of finite state languages using second-order recurrent networks", Neural Computation, vol. 4, pp. 406–414, 1992.

  9. R.J. Williams and D. Zipser, "A learning algorithm for continually running fully recurrent neural networks", Neural Computation, vol. 1, pp. 270–280, 1989.


Editor information

José Mira, Joan Cabestany, Alberto Prieto


Copyright information

© 1993 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Alquézar, R., Sanfeliu, A. (1993). Representation and recognition of regular grammars by means of second-order recurrent neural networks. In: Mira, J., Cabestany, J., Prieto, A. (eds) New Trends in Neural Computation. IWANN 1993. Lecture Notes in Computer Science, vol 686. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-56798-4_138


  • DOI: https://doi.org/10.1007/3-540-56798-4_138

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-56798-1

  • Online ISBN: 978-3-540-47741-9

  • eBook Packages: Springer Book Archive
