Abstract
Evolving recurrent neural networks represent a natural model of computation beyond the Turing limit. Here, we consider evolving recurrent neural networks working on infinite input streams. The expressive power of these networks is related to their attractor dynamics and is measured by the topological complexity of their underlying neural \(\omega\)-languages. In this context, deterministic and nondeterministic evolving neural networks recognize the (boldface) topological classes of \(BC(\boldsymbol{\Pi}^0_2)\) and \(\boldsymbol{\Sigma}^1_1\) \(\omega\)-languages, respectively. These results can be significantly refined: the deterministic and nondeterministic evolving networks which employ \(\alpha \in 2^\omega\) as their sole binary evolving weight recognize the (lightface) relativized topological classes of \(BC(\Pi^0_2)(\alpha)\) and \(\Sigma^1_1(\alpha)\) \(\omega\)-languages, respectively. As a consequence, a proper hierarchy of classes of evolving neural networks, based on the complexity of their underlying evolving weights, is obtained. This hierarchy contains chains of length \(\omega_1\) as well as uncountable antichains.
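As a toy illustration of the evolving-weight mechanism (a minimal sketch under assumed names and update rules, not the paper's construction), the following Python fragment runs a two-cell Boolean recurrent network on an infinite input stream while a binary sequence \(\alpha \in 2^\omega\) supplies one evolving weight per time step. In the paper's setting, acceptance of an infinite word is then decided via the attractor formed by the output states visited infinitely often; the sketch only shows the evolving-weight mechanism itself.

```python
# A toy illustration (not the paper's construction): a Boolean recurrent
# network reading an infinite input stream, where the binary sequence
# alpha in 2^omega supplies one evolving weight per time step. The names
# and the update rule below are hypothetical.
from itertools import count, islice

def evolving_net(inputs, alpha, steps):
    """Run a 2-cell Boolean network for `steps` steps; return the sequence
    of output states (the successive values of cell x1)."""
    x0 = x1 = 0
    out = []
    for u, a in islice(zip(inputs, alpha), steps):
        # Hard-threshold updates; the evolving weight a = alpha[t]
        # modulates the recurrent connection from x1 to x0.
        x0, x1 = int(u + a * x1 >= 1), int(x0 + u >= 2)
        out.append(x1)
    return out

ones = (1 for _ in count())          # constant input stream 111...
alpha = (t % 2 for t in count())     # evolving binary weight 0101...
print(evolving_net(ones, alpha, 10))
```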
Notes
1. The results of the paper remain valid for any other kind of sigmoidal activation function satisfying the properties mentioned in [13, Sect. 4]; a standard example is sketched after these notes.
2. In words, an attractor of \(\mathcal{N}\) is a set of output states into which the Boolean computation of the network could become forever trapped, though not necessarily in a periodic manner; a toy illustration follows these notes.
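Regarding Note 1: a standard example of such an activation, used since the analog networks of Siegelmann and Sontag [19], is the saturated-linear sigmoid; a minimal Python sketch (function name ours):

```python
# The saturated-linear activation commonly used in this line of work
# (e.g. the analog networks of Siegelmann and Sontag [19]); Note 1 states
# that any sigmoid with the properties of [13, Sect. 4] works as well.
def sat_linear(x: float) -> float:
    """sigma(x) = 0 if x < 0, x if 0 <= x <= 1, and 1 if x > 1."""
    return min(max(x, 0.0), 1.0)
```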
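To make the attractor notion of Note 2 concrete, the sketch below (a hypothetical helper, not from the paper) computes the set of states visited infinitely often by a deterministic Boolean system driven by an eventually periodic input of the form \(u \cdot v^\omega\). For such inputs, the run of (loop position, state) configurations must eventually repeat, and the infinitely visited states can be read off the detected cycle. As Note 2 stresses, general attractor dynamics need not be periodic, so this covers only the simplest case.

```python
from typing import Callable, Dict, Hashable, List, Set, Tuple

State = Hashable

def attractor(step: Callable[[State, int], State], start: State,
              prefix: List[int], loop: List[int]) -> Set[State]:
    """States visited infinitely often on the input prefix . loop^omega."""
    s = start
    for u in prefix:                       # consume the finite prefix
        s = step(s, u)
    seen: Dict[Tuple[int, State], int] = {}
    configs: List[Tuple[int, State]] = []  # run of (loop position, state)
    pos = 0
    while (pos, s) not in seen:            # iterate until a configuration repeats
        seen[(pos, s)] = len(configs)
        configs.append((pos, s))
        s = step(s, loop[pos])
        pos = (pos + 1) % len(loop)
    return {q for _, q in configs[seen[(pos, s)]:]}  # states on the cycle

# Toy 1-bit network whose state flips on input 1:
flip = lambda s, u: s ^ u
print(attractor(flip, 0, prefix=[1], loop=[1, 1]))   # -> {0, 1}
```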
References
Apt, K.R.: \(\omega\)-models in analytical hierarchy. Bulletin de l'Académie Polonaise des Sciences XX(11), 901–904 (1972)
Balcázar, J.L., Gavaldà, R., Siegelmann, H.T.: Computational power of neural networks: a characterization in terms of Kolmogorov complexity. IEEE Trans. Inf. Theory 43(4), 1175–1183 (1997)
Cabessa, J., Duparc, J.: Expressive power of nondeterministic recurrent neural networks in terms of their attractor dynamics. Int. J. Unconv. Comput. 12(1), 25–50 (2016)
Cabessa, J., Siegelmann, H.T.: Evolving recurrent neural networks are super-Turing. In: Proceedings of IJCNN 2011, pp. 3200–3206. IEEE (2011)
Cabessa, J., Siegelmann, H.T.: The computational power of interactive recurrent neural networks. Neural Comput. 24(4), 996–1019 (2012)
Cabessa, J., Siegelmann, H.T.: The super-Turing computational power of plastic recurrent neural networks. Int. J. Neural Syst. 24(8), 1450029 (2014)
Cabessa, J., Villa, A.E.P.: The expressive power of analog recurrent neural networks on infinite input streams. Theor. Comput. Sci. 436, 23–34 (2012)
Cabessa, J., Villa, A.E.P.: An attractor-based complexity measurement for Boolean recurrent neural networks. PLoS ONE 9(4), e94204+ (2014)
Cabessa, J., Villa, A.E.P.: Expressive power of first-order recurrent neural networks determined by their attractor dynamics. J. Comput. Syst. Sci. 82(8), 1232–1250 (2016)
Cabessa, J., Villa, A.E.P.: Recurrent neural networks and super-Turing interactive computation. In: Koprinkova-Hristova, P., Mladenov, V., Kasabov, N.K. (eds.) Artificial Neural Networks. SSB, vol. 4, pp. 1–29. Springer, Cham (2015). doi:10.1007/978-3-319-09903-3_1
Finkel, O.: Ambiguity of \(\omega\)-languages of Turing machines. Log. Methods Comput. Sci. 10(3), 1–18 (2014)
Kechris, A.S.: Classical Descriptive Set Theory. Graduate Texts in Mathematics, vol. 156. Springer, New York (1995)
Kilian, J., Siegelmann, H.T.: The dynamic universality of sigmoidal neural networks. Inf. Comput. 128(1), 48–56 (1996)
Kleene, S.C.: Representation of events in nerve nets and finite automata. In: Shannon, C., McCarthy, J. (eds.) Automata Studies, pp. 3–41. Princeton University Press, Princeton (1956)
McCulloch, W.S., Pitts, W.: A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophys. 5, 115–133 (1943)
Minsky, M.L.: Computation: Finite and Infinite Machines. Prentice-Hall Inc., Englewood Cliffs (1967)
Moschovakis, Y.N.: Descriptive Set Theory. Mathematical Surveys and Monographs, 2nd edn. American Mathematical Society, Providence (2009)
Siegelmann, H.T.: Recurrent neural networks and finite automata. Comput. Intell. 12, 567–574 (1996)
Siegelmann, H.T., Sontag, E.D.: Analog computation via neural networks. Theor. Comput. Sci. 131(2), 331–360 (1994)
Siegelmann, H.T., Sontag, E.D.: On the computational power of neural nets. J. Comput. Syst. Sci. 50(1), 132–150 (1995)
Šíma, J., Orponen, P.: General-purpose computation with neural networks: a survey of complexity theoretic results. Neural Comput. 15(12), 2727–2778 (2003)
Staiger, L.: \(\omega \)-languages. In: Rozenberg, G., Salomaa, A. (eds.) Handbook of Formal Languages: Beyond Words, vol. 3, pp. 339–387. Springer, New York (1997)
Turing, A.M.: Intelligent machinery. Technical report, National Physical Laboratory, Teddington, UK (1948)