An Introduction to Information Theory
Claude Shannon’s 1948 paper “A Mathematical Theory of Communication” made the digital world we live in possible. Scientific American called it “The Magna Carta of the Information Age.”
Shannon defined modern digital communication: he determined how much information can be transmitted over a telephone line, quantified the effects of noise on the signal, and showed what measures you have to take to get a perfect signal on the other end. His work made the Internet possible.
Trouble is, it’s tough reading – college-level material for engineers and math geeks. However, Shannon’s concepts themselves are simple and easy to explain. In just a few minutes you’ll understand them, and you’ll see that even a 7th grader can grasp them.
There are two basic problems in information theory, and both are easy to state. Two people, Alice and Bob, want to communicate over a digital channel for some long period of time, and they know ahead of time the probability that certain messages will be sent. For example, English sentences are more likely than gibberish, and “Hi” is much more likely than “asphyxiation.” The problems are:
- Say communication is very expensive. Then the problem is to come up with an encoding scheme that minimizes the expected length of an encoded message while guaranteeing that every message can be unambiguously decoded. This is called the noiseless coding problem.
- Say communication is cheap, but error-prone. In particular, each bit of your message is erroneously flipped with some known probability, and all the errors are independent. The question then is: how can one encode messages so as to guarantee (with high probability) the ability to decode any sent message? This is called the noisy coding problem.
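To make the noiseless coding problem concrete, here is a minimal sketch of one classic solution, Huffman coding, which builds a prefix-free binary code by repeatedly merging the two least-probable entries. The message set and probabilities below are made up for illustration; they are not from Shannon's paper.

```python
import heapq

def huffman_code(probs):
    """Build a prefix-free binary code for symbols with given probabilities.

    Returns a dict mapping each symbol to its codeword (a bit string).
    """
    # Heap entries: (probability, tie-breaker, tree), where a tree is either
    # a bare symbol (leaf) or a (left, right) pair (internal node).
    heap = [(p, i, sym) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least-probable subtrees into one.
        p1, _, t1 = heapq.heappop(heap)
        p2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, counter, (t1, t2)))
        counter += 1
    code = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            code[tree] = prefix or "0"  # single-symbol edge case
    walk(heap[0][2], "")
    return code

# Hypothetical message probabilities: frequent messages get short
# codewords, rare ones get long codewords.
probs = {"Hi": 0.5, "Bye": 0.25, "OK": 0.15, "asphyxiation": 0.10}
code = huffman_code(probs)
```

With these probabilities, “Hi” gets a 1-bit codeword while “asphyxiation” gets a 3-bit one, so the expected encoded length (1.75 bits per message here) is far below the 2 bits a fixed-length code for four messages would need.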
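For the noisy coding problem, the simplest (and very wasteful) scheme is a repetition code: send each bit several times and take a majority vote on the receiving end. This is only a toy illustration of the idea of adding redundancy; Shannon's theorem shows that far more efficient codes exist. The parameters below (repeat factor 5, flip probability 0.1) are arbitrary choices for the sketch.

```python
import random

def encode(bits, r=3):
    # Repeat each bit r times (r odd, so majority votes never tie).
    return [b for b in bits for _ in range(r)]

def noisy_channel(bits, flip_prob, rng):
    # Flip each bit independently with probability flip_prob.
    return [b ^ (rng.random() < flip_prob) for b in bits]

def decode(bits, r=3):
    # Majority vote over each group of r received bits.
    return [int(sum(bits[i:i + r]) > r // 2) for i in range(0, len(bits), r)]

rng = random.Random(0)          # fixed seed for reproducibility
msg = [1, 0, 1, 1, 0, 0, 1, 0]
received = noisy_channel(encode(msg, r=5), flip_prob=0.1, rng=rng)
decoded = decode(received, r=5)
```

With flip probability p = 0.1 and r = 3, a decoded bit is wrong only when at least 2 of its 3 copies flip, which happens with probability 3p²(1−p) + p³ ≈ 0.028 instead of 0.1 – at the cost of tripling the transmission length.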