Elementary Concepts of Information Theory*
Published online by Cambridge University Press: 03 November 2016
Extract
The present age is witnessing the rapid growth of a wide variety of communication and control systems using electrical signals. Examples are radio and commercial communications, television and facsimile transmission, radar and aircraft control systems, computers and industrial control mechanisms. All these have in common the fact that information has to be transmitted and processed through a channel, and they can all be represented by a block schematic such as in fig. 1.

The efficiency of these systems must be assessed by comparing their information handling capacity with their cost in time, transmission channel bandwidth, equipment complexity and money. It is therefore essential to define a quantitative measure of information, to derive relations between this quantity and the parameters of a given communication system, and also to determine criteria for a maximum information flow under given conditions. Information theory concerns itself with these problems, based on simplified and somewhat idealized mathematical models of actual systems.

The origins of information theory go back to the 1920’s, but consistent logical foundations for the theory were laid by Shannon in the late 1940’s and the modern theory dates from then. It might appear that information is too loosely defined as a concept to be submitted to quantitative analysis; and indeed, before we can do so, we must clearly differentiate between the amount of information in a message, its meaning (semantic aspects), and its value (philosophical aspects).
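The quantitative measure the extract alludes to is Shannon's entropy: a source emitting symbols with probabilities p_i carries, on average, H = −Σ p_i log₂ p_i bits of information per symbol. A minimal sketch of this calculation (the probability values below are illustrative, not taken from the article):

```python
from math import log2

def entropy(probabilities):
    """Average information per symbol, in bits (Shannon entropy)."""
    # Terms with p = 0 contribute nothing, since p*log2(p) -> 0 as p -> 0.
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair binary source (e.g. a fair coin) yields exactly 1 bit per symbol:
print(entropy([0.5, 0.5]))   # 1.0

# A biased source is more predictable, so it carries less
# information per symbol:
print(entropy([0.9, 0.1]))
```

The second value is strictly less than 1, illustrating that the measure counts surprise rather than meaning or value, exactly the distinction the extract draws.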
Type: Research Article
Copyright © Mathematical Association 1962
Footnotes
Lecture given at the Southampton Mathematical Conference on April 15, 1961