
Understanding State Space Organization in Recurrent Neural Networks with Iterative Function Systems Dynamics

Conference paper
Hybrid Neural Systems (Hybrid Neural Systems 1998)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 1778)


Abstract

We study a novel recurrent network architecture whose dynamics are those of the iterative function systems used in chaos game representations of DNA sequences [16, 11]. We show that such networks code the temporal and statistical structure of input sequences in a strict mathematical sense: the generalized dimensions of the network states are in direct correspondence with statistical properties of the input sequences expressed via generalized Rényi entropy spectra. We also argue, and experimentally illustrate, that the commonly used heuristic of finite state machine extraction by network state space quantization corresponds in this case to the construction of a variable memory length Markov model.
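The chaos game representation underlying the network dynamics can be sketched as follows. This is an illustrative reimplementation, not the authors' code: it drives the standard four-corner CGR iterated function system for a DNA alphabet and, for comparison, estimates the order-q Rényi entropy rate from empirical block frequencies; the contraction ratio 0.5 and the symbol-to-corner assignment are the conventional CGR choices.

```python
# Sketch of a chaos game representation (CGR) via an iterated function
# system, plus an empirical Rényi entropy spectrum of the input sequence.
import math
from collections import Counter

# Conventional corner assignment for the 4-letter DNA alphabet.
CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr_points(seq, k=0.5):
    """Iterate x_t = (1-k)*x_{t-1} + k*corner(s_t); return all visited states.

    With k = 0.5 each point lands halfway between the previous state and the
    corner of the current symbol, so points cluster in sub-squares indexed by
    the recent suffix of the sequence.
    """
    x, y = 0.5, 0.5  # start in the centre of the unit square
    pts = []
    for s in seq:
        cx, cy = CORNERS[s]
        x, y = (1 - k) * x + k * cx, (1 - k) * y + k * cy
        pts.append((x, y))
    return pts

def renyi_entropy(seq, n, q):
    """Order-q Rényi entropy rate estimate from empirical n-block frequencies."""
    blocks = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    total = sum(blocks.values())
    probs = [c / total for c in blocks.values()]
    if q == 1:  # Shannon entropy as the q -> 1 limit
        return -sum(p * math.log(p) for p in probs) / n
    return math.log(sum(p ** q for p in probs)) / ((1 - q) * n)
```

A uniform i.i.d. sequence over the four symbols fills the unit square and attains the maximal entropy rate log 4, whereas structured sequences leave characteristic gaps in the CGR plot; the paper's result links the multifractal (generalized) dimensions of the point set to this entropy spectrum.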


References

  1. Barnsley, M.F.: Fractals Everywhere. Academic Press, New York (1988)
  2. Beck, C., Schlögl, F.: Thermodynamics of Chaotic Systems. Cambridge University Press, Cambridge (1995)
  3. Bruske, J., Sommer, G.: Dynamic cell structure learns perfectly topology preserving map. Neural Computation 7(4), 845–865 (1995)
  4. Casey, M.P.: The dynamics of discrete-time computation, with application to recurrent neural networks and finite state machine extraction. Neural Computation 8(6), 1135–1178 (1996)
  5. Crutchfield, J.P., Young, K.: Inferring statistical complexity. Physical Review Letters 63, 105–108 (1989)
  6. Crutchfield, J.P., Young, K.: Computation at the onset of chaos. In: Zurek, W.H. (ed.) Complexity, Entropy, and the Physics of Information. SFI Studies in the Sciences of Complexity, vol. 8, pp. 223–269. Addison-Wesley, Reading (1990)
  7. Frasconi, P., Gori, M., Maggini, M., Soda, G.: Insertion of finite state automata in recurrent radial basis function networks. Machine Learning 23, 5–32 (1996)
  8. Freund, J., Ebeling, W., Rateitschak, K.: Self-similar sequences and universal scaling of dynamical entropies. Physical Review E 54(5), 5561–5566 (1996)
  9. Grassberger, P.: Information and complexity measures in dynamical systems. In: Atmanspacher, H., Scheingraber, H. (eds.) Information Dynamics, pp. 15–33. Plenum Press, New York (1991)
  10. Hertz, J., Krogh, A., Palmer, R.G.: Introduction to the Theory of Neural Computation. Addison-Wesley, Redwood City (1991)
  11. Jeffrey, J.: Chaos game representation of gene structure. Nucleic Acids Research 18(8), 2163–2170 (1990)
  12. Kenyon, R., Peres, Y.: Measures of full dimension on affine invariant sets. Ergodic Theory and Dynamical Systems 16, 307–323 (1996)
  13. Kolen, J.F.: Recurrent networks: state machines or iterated function systems? In: Mozer, M.C., Smolensky, P., Touretzky, D.S., Elman, J.L., Weigend, A.S. (eds.) Proceedings of the 1993 Connectionist Models Summer School, pp. 203–210. Erlbaum Associates, Hillsdale (1994)
  14. Manolios, P., Fanelli, R.: First order recurrent neural networks and deterministic finite state automata. Neural Computation 6(6), 1155–1173 (1994)
  15. McCauley, J.L.: Chaos, Dynamics and Fractals: An Algorithmic Approach to Deterministic Chaos. Cambridge University Press, Cambridge (1994)
  16. Tiňo, P.: Spatial representation of symbolic sequences through iterative function system. IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans 29(4), 386–393 (1999)
  17. Tiňo, P., Dorffner, G.: Recurrent neural networks with iterated function systems dynamics. In: International ICSC/IFAC Symposium on Neural Computation, pp. 526–532 (1998)
  18. Tiňo, P., Horne, B.G., Giles, C.L., Collingwood, P.C.: Finite state machines and recurrent neural networks – automata and dynamical systems approaches. In: Dayhoff, J.E., Omidvar, O. (eds.) Neural Networks and Pattern Recognition, pp. 171–220. Academic Press, London (1998)
  19. Tiňo, P., Koteles, M.: Extracting finite state representations from recurrent neural networks trained on chaotic symbolic sequences. IEEE Transactions on Neural Networks 10(2), 284–302 (1999)
  20. Tiňo, P., Sajda, J.: Learning and extracting initial Mealy machines with a modular neural network model. Neural Computation 7(4), 822–844 (1995)
  21. Tiňo, P., Vojtek, V.: Modeling complex sequences with recurrent neural networks. In: Smith, G.D., Steele, N.C., Albrecht, R.F. (eds.) Artificial Neural Networks and Genetic Algorithms, pp. 459–463. Springer, New York (1998)
  22. Oliver, J.L., Bernaola-Galván, P., Guerrero-García, J., Román-Roldán, R.: Entropic profiles of DNA sequences through chaos-game-derived images. Journal of Theoretical Biology 160, 457–470 (1993)
  23. Omlin, C.W., Giles, C.L.: Extraction of rules from discrete-time recurrent neural networks. Neural Networks 9(1), 41–51 (1996)
  24. Rényi, A.: On the dimension and entropy of probability distributions. Acta Mathematica Hungarica 10, 193 (1959)
  25. Román-Roldán, R., Bernaola-Galván, P., Oliver, J.L.: Entropic feature for sequence pattern through iteration function systems. Pattern Recognition Letters 15, 567–573 (1994)
  26. Ron, D., Singer, Y., Tishby, N.: The power of amnesia. In: Advances in Neural Information Processing Systems, pp. 176–183. Morgan Kaufmann, San Francisco (1994)
  27. Ron, D., Singer, Y., Tishby, N.: The power of amnesia. Machine Learning 25, 117–150 (1996)
  28. Tabor, W.: Dynamical automata. Technical Report TR98-1694, Cornell University, Computer Science Department (1998)

Copyright information

© 2000 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Tiňo, P., Dorffner, G., Schittenkopf, C. (2000). Understanding State Space Organization in Recurrent Neural Networks with Iterative Function Systems Dynamics. In: Wermter, S., Sun, R. (eds) Hybrid Neural Systems. Hybrid Neural Systems 1998. Lecture Notes in Computer Science(), vol 1778. Springer, Berlin, Heidelberg. https://doi.org/10.1007/10719871_18

  • DOI: https://doi.org/10.1007/10719871_18

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-67305-7

  • Online ISBN: 978-3-540-46417-4

  • eBook Packages: Springer Book Archive
