Abstract
An alternative definition of the Kolmogorov–Sinai (KS) entropy h_μ, based on mutual information, is proposed. The new definition is designed to handle experimental noise more gracefully than the standard definition, and an example illustrates the difference.
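The abstract does not give the construction itself, but the connection it invokes can be illustrated numerically. The sketch below, which is not the chapter's method, uses the standard identity h = H(X) − I(X; past) to express the entropy rate of a symbolic series through mutual information, and checks it against the usual conditional block-entropy estimate on the logistic map at r = 4 (whose KS entropy under the binary generating partition is 1 bit/step). The history length k and sample size are illustrative choices.

```python
from collections import Counter
from math import log

def entropy(counts):
    """Shannon entropy in bits of an empirical distribution given as a Counter."""
    n = sum(counts.values())
    return -sum(c / n * log(c / n, 2) for c in counts.values())

# Binary symbolic series from the logistic map x -> 4x(1-x); the partition
# at x = 0.5 is generating, so the entropy rate is 1 bit/step.
x, s = 0.3, []
for _ in range(200_000):
    x = 4.0 * x * (1.0 - x)
    s.append(0 if x < 0.5 else 1)

k = 8  # history length (illustrative choice)
past  = Counter(tuple(s[i:i + k])     for i in range(len(s) - k))  # k-blocks
joint = Counter(tuple(s[i:i + k + 1]) for i in range(len(s) - k))  # (k+1)-blocks
sym   = Counter(s[i + k]              for i in range(len(s) - k))  # next symbol

# Standard estimate: h ≈ H(X_{k+1} | X_1..X_k) = H(joint) - H(past)
h_block = entropy(joint) - entropy(past)

# Mutual-information form: h = H(X) - I(X; past),
# with I(X; past) = H(X) + H(past) - H(joint)
I = entropy(sym) + entropy(past) - entropy(joint)
h_mi = entropy(sym) - I

print(h_block, h_mi)  # both near 1 bit/step
```

The two estimators are algebraically identical here; the point of a mutual-information formulation, as the abstract suggests, is that conditioning can be replaced by information shared between past and future, a quantity that degrades more predictably under observational noise.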
© 1989 Plenum Press, New York
Cite this chapter
Fraser, A.M. (1989). Measuring Complexity in Terms of Mutual Information. In: Abraham, N.B., Albano, A.M., Passamante, A., Rapp, P.E. (eds) Measures of Complexity and Chaos. NATO ASI Series, vol 208. Springer, Boston, MA. https://doi.org/10.1007/978-1-4757-0623-9_11
Print ISBN: 978-1-4757-0625-3
Online ISBN: 978-1-4757-0623-9