Introduction
Published online by Cambridge University Press: 05 June 2012
Summary
In 1948, in the introduction to his classic paper, “A mathematical theory of communication,” Claude Shannon wrote:
“The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.”
To solve that problem he created, in the pages that followed, a completely new branch of applied mathematics, which is today called information theory and/or coding theory. This book's object is the presentation of the main results of this theory as they stand 30 years later.
In this introductory chapter we illustrate the central ideas of information theory by means of a specific pair of mathematical models, the binary symmetric source and the binary symmetric channel.
The binary symmetric source (the source, for short) is an object which emits one of two possible symbols, which we take to be “0” and “1,” at a rate of R symbols per unit of time. We shall call these symbols bits, an abbreviation of binary digits. The bits emitted by the source are random, and a “0” is as likely to be emitted as a “1.” We imagine that the source rate R is continuously variable, that is, R can assume any nonnegative value.
The binary symmetric channel (the BSC, for short) is an object through which it is possible to transmit one bit per unit of time.
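These two models are easy to simulate. The sketch below assumes the standard BSC model, in which each transmitted bit is flipped independently with a crossover probability p; the parameter p and the function names are illustrative assumptions, not part of this excerpt.

```python
import random

def binary_symmetric_source(n, rng):
    """Emit n independent bits, with "0" and "1" equally likely,
    as the binary symmetric source does."""
    return [rng.randint(0, 1) for _ in range(n)]

def binary_symmetric_channel(bits, p, rng):
    """Pass bits through a BSC: each bit is flipped independently
    with crossover probability p (an assumed parameter)."""
    return [b ^ int(rng.random() < p) for b in bits]

rng = random.Random(42)
source_bits = binary_symmetric_source(8, rng)
received = binary_symmetric_channel(source_bits, p=0.1, rng=rng)
```

With p = 0 the channel reproduces the message exactly; with p > 0 it reproduces the message only approximately, which is precisely the problem Shannon's theory addresses.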
- The Theory of Information and Coding, Student Edition, pp. 1-14. Publisher: Cambridge University Press. Print publication year: 2004.