Introduction
Published online by Cambridge University Press: 10 November 2009
Summary
In 1948, in the introduction to his classic paper, “A mathematical theory of communication,” Claude Shannon wrote:
“The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.”
To solve that problem he created, in the pages that followed, a completely new branch of applied mathematics, which is today called information theory and/or coding theory. This book's object is the presentation of the main results of this theory as they stand 30 years later.
In this introductory chapter we illustrate the central ideas of information theory by means of a specific pair of mathematical models, the binary symmetric source and the binary symmetric channel.
The binary symmetric source (the source, for short) is an object which emits one of two possible symbols, which we take to be “0” and “1,” at a rate of R symbols per unit of time. We shall call these symbols bits, an abbreviation of binary digits. The bits emitted by the source are random, and a “0” is as likely to be emitted as a “1.” We imagine that the source rate R is continuously variable, that is, R can assume any nonnegative value.
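The source model above can be sketched in a few lines of code. This is a minimal illustration, not part of the text: the function name and the `seed` parameter (used only to make runs reproducible) are our own.

```python
import random

def binary_symmetric_source(n, seed=None):
    """Emit n bits, each independently "0" or "1" with probability 1/2.

    A sketch of the binary symmetric source; `seed` is only for
    reproducibility and is not part of the model.
    """
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(n)]

# Ten symbols from the source:
print(binary_symmetric_source(10, seed=42))
```

Over a long run, roughly half the emitted bits are 1s, reflecting the equiprobable assumption; the rate R of the text is not modeled here, since it only scales how many bits are drawn per unit of time.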
The binary symmetric channel (the BSC, for short) is an object through which it is possible to transmit one bit per unit of time. However, the channel is not completely reliable: there is a fixed probability p (called the raw bit error probability), 0 ≤ p ≤ ½, that the output bit will not be the same as the input bit.
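The channel model can likewise be sketched directly: each transmitted bit is flipped independently with probability p. Again this is a minimal illustration under our own naming; the `seed` parameter is only for reproducibility.

```python
import random

def bsc(bits, p, seed=None):
    """Pass a bit sequence through a binary symmetric channel.

    Each output bit differs from the corresponding input bit with
    probability p, independently of all other bits.
    """
    rng = random.Random(seed)
    # XOR each input bit with a Bernoulli(p) error bit.
    return [b ^ (rng.random() < p) for b in bits]

# With p = 0 the channel is perfect; with p = 1/2 the output is pure noise.
print(bsc([0, 1, 1, 0], p=0.1, seed=7))
```

The restriction p ≤ ½ in the text costs no generality: a channel with p > ½ can be converted into one with p < ½ simply by inverting every output bit.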
The Theory of Information and Coding, pp. 1–14. Publisher: Cambridge University Press. Print publication year: 2002.