Abstract
Chapter 4 is devoted to information theory as the science of literal communication. It begins by describing Shannon's paradigm, which identifies the actors in any communication: (1) the source, which generates some message; (2) the channel, which propagates the message; and (3) the destination, which receives it. Matching these entities to one another requires devices that transform the message by coding, of which there are two main types. Source coding shortens the message that the source delivers; channel coding protects it against the symbol errors that occur in the channel, which demands lengthening the message. Both are assumed to be exactly reversible. Quantitative measures of information are defined, based on the improbability of symbols and messages. The source entropy measures the average quantity of information borne by each symbol of the message the source delivers. The channel capacity measures the largest quantity of information that the channel can transfer. Two fundamental theorems state that source coding can reduce the message length down to a limit set by the source entropy, and that errorless communication is possible in the presence of symbol errors, but only provided the source entropy is less than the channel capacity. A normalized version of Shannon's paradigm assumes that the message is transformed by source coding followed by channel coding, each achieving its theoretical limit. A simple proof of the fundamental source coding theorem is presented and the Huffman source coding algorithm is described. Comments on source coding help clarify the very concept of information and its relationship with semantics.
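To make the quantities named in the abstract concrete, here is a minimal Python sketch. It is not taken from the chapter: the symbol probabilities are invented for illustration, and `entropy` and `huffman_code` are hypothetical helper names. It computes the entropy of a small source, builds a binary Huffman code for it, and checks that the average codeword length is never below the entropy, as the fundamental source coding theorem requires.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_code(source):
    """Build a binary Huffman code for a dict {symbol: probability}."""
    # Heap entries: (probability, tiebreaker, partial code table).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(source.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, t0 = heapq.heappop(heap)  # two least probable subtrees
        p1, _, t1 = heapq.heappop(heap)
        # Merging them prefixes one more bit to every codeword inside.
        merged = {s: "0" + w for s, w in t0.items()}
        merged.update({s: "1" + w for s, w in t1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

# A toy source with dyadic probabilities (chosen for the example).
source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(source)
avg_len = sum(p * len(code[s]) for s, p in source.items())

print(code)  # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(f"entropy        : {entropy(source.values()):.3f} bits/symbol")
print(f"average length : {avg_len:.3f} bits/symbol")  # >= entropy
```

For the dyadic probabilities chosen here, the Huffman code attains the entropy bound exactly (1.75 bits per symbol); for a general source, its average length exceeds the entropy by less than one bit per symbol.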
Notes
- 1.
Such a function is generally referred to as ‘concave’ in the mathematical literature. We prefer the single word ‘convex’, indicating the shape of its representative curve by ∩ or ∪; see the short check after these notes.
- 2.
Analog means of copying multidimensional objects exist, but they are only approximate, so they are not reliable when used repeatedly.
- 3.
It may even be thought of as a 4-dimensional object, since folding the assembled polypeptide chain into a 3-dimensional molecule involves time.
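As a concrete instance of the ∩/∪ notation of note 1 (the binary entropy function is taken here purely as an illustration; the chapter's own function is not reproduced on this page), the shape can be checked from the sign of the second derivative:

$$H_2(p) = -p\log_2 p - (1-p)\log_2(1-p), \qquad H_2''(p) = -\frac{1}{p(1-p)\ln 2} < 0 \quad (0 < p < 1),$$

so the representative curve of $H_2$ is ∩-convex in the chapter's terminology (concave in the usual one).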
References
Battail, G. (1990). Codage de source adaptatif par l’algorithme de Guazzo [Adaptive source coding by the Guazzo algorithm]. Annales des Télécommunications, 45(11–12), 677–693.
Battail, G. (1997). Théorie de l’information [Information theory]. Paris: Masson.
Battail, G. (2009). Living versus inanimate: the information border. Biosemiotics, 2(3), 321–341. doi:10.1007/s12304-009-9059-z.
Brillouin, L. (1956). Science and information theory. New York: Academic Press.
Cover, T. M., & Thomas, J. A. (1991). Elements of information theory. New York: Wiley.
Gallager, R. G. (1968). Information theory and reliable communication. New York: Wiley.
Gallager, R. G. (1978). Variations on a theme by Huffman. IEEE Transactions on Information Theory, IT-24(6), 668–674.
Guazzo, M. (1980). A general minimum-redundancy source-coding algorithm. IEEE Transactions on Information Theory, IT-26(1), 15–25.
Huffman, D. A. (1952). A method for the construction of minimum-redundancy codes. Proceedings of the IRE, 40, 1098–1101.
Jaynes, E. T. (1957). Information theory and statistical mechanics I & II. Physical Review, 106, 620–630; 108, 171–190.
Johnson, R. W., & Shore, J. E. (1983). Comments on and correction to ‘Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy’. IEEE Transactions on Information Theory, IT-29(6), 942–943.
Khinchin, A. I. (1957). Mathematical foundations of information theory. New York: Dover.
Kullback, S. (1959). Information theory and statistics. New York: Wiley.
McMillan, B. (1953). The basic theorems of information theory. Annals of Mathematical Statistics, 24, 196–219. (Reprinted in Slepian 1974, pp. 57–80).
Moher, M. (1993). Decoding via cross-entropy minimization. In Proceedings of GLOBECOM ’93 (pp. 809–813), Houston, USA.
Neumann, J. von. (1966). Theory of self-reproducing automata, edited and completed by A.W. Burks. Urbana and London: University of Illinois Press.
Pattee, H. (2005). The physics and metaphysics of biosemiotics. Journal of Biosemiotics, 1(1), 281–301. (Reprinted in Favareau 2010, pp. 524–540).
Rissanen, J. J. (1976). Generalized Kraft inequality and arithmetic coding. IBM Journal of Research & Development, 20(3), 198–203.
Rissanen, J. J., & Langdon, G. G., Jr. (1979). Arithmetic coding. IBM Journal of Research & Development, 23(2), 149–162.
Roubine, E. (1970). Introduction à la théorie de la communication, tome III: Théorie de l’information [Introduction to communication theory, vol. III: Information theory]. Paris: Masson.
Schrödinger, E. (1943). What is life? and Mind and matter. Cambridge: Cambridge University Press (1967).
Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27, 379–457, 623–656. (Reprinted in Shannon and Weaver 1949, Sloane and Wyner 1993, pp. 5–83 and in Slepian 1974, pp. 5–29).
Shannon, C. E., & Weaver, W. (1949). The mathematical theory of communication. Urbana: University of Illinois Press.
Shore, J. E., & Johnson, R. W. (1980). Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Transactions on Information Theory, IT-26(1), 26–37.
Shore, J. E., & Johnson, R. W. (1981). Properties of cross-entropy minimization. IEEE Transactions on Information Theory, IT-27(4), 472–482.
Slepian, D. (Ed.). (1974). Key papers in the development of information theory. Piscataway: IEEE Press.
Sloane, N. J. A., & Wyner, A. D. (Eds.). (1993). Claude Elwood Shannon, collected papers. Piscataway: IEEE Press.
Yockey, H. P. (1992). Information theory and molecular biology. Cambridge: Cambridge University Press.
Ziv, J., & Lempel, A. (1978). Compression of individual sequences via variable-rate coding. IEEE Transactions on Information Theory, IT-24(5), 530–536.
Copyright information
© 2014 Springer Science+Business Media Dordrecht
About this chapter
Cite this chapter
Battail, G. (2014). Information Theory as the Science of Literal Communication. In: Information and Life. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-7040-9_4
Publisher Name: Springer, Dordrecht
Print ISBN: 978-94-007-7039-3
Online ISBN: 978-94-007-7040-9
eBook Packages: Biomedical and Life Sciences