Introduction
How can we protect a bit stream against subsequent errors?
We proceed by packaging the bit stream into blocks, where a block consists of information bits together with additional bits called parity check bits. The information bits contain the message content of the bit stream, while the parity check bits are selected according to the information bits. As we shall see, it is the careful selection of these parity check bits that makes error control in a block possible. Not only can an error be detected in a block, but it can even be corrected.
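The idea above can be sketched with the simplest possible block code: one parity check bit appended to the information bits. The names `encode` and `has_error` are illustrative, not from the text; this sketch assumes even parity, i.e. the parity bit is chosen so the block contains an even number of 1s, so any single flipped bit makes the count odd and is detected (though not yet corrected).

```python
# A minimal sketch of a block with one parity check bit (even parity).
# The parity bit is selected from the information bits so that the whole
# block has an even number of 1s; a single bit error makes the count odd.

def encode(info_bits):
    """Append one parity check bit so the block has even parity."""
    parity = sum(info_bits) % 2
    return info_bits + [parity]

def has_error(block):
    """Return True if the block's parity is odd (a single bit error)."""
    return sum(block) % 2 == 1

block = encode([1, 0, 1, 1])   # -> [1, 0, 1, 1, 1]
assert not has_error(block)    # clean block passes the check
block[2] ^= 1                  # a single bit error in the channel
assert has_error(block)        # the error is detected
```

Note that a single parity bit only detects an error; it cannot say which bit flipped, which is why correction requires more parity check bits.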
In the following, we construct methods for detecting and correcting single bit errors in a block. If a block has more than one bit in error, then we are subject to a missed detection or a wrong correction. Fortunately, when bit errors occur infrequently and independently of each other, the probability of multiple bit errors in a block is small.
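As a concrete sketch of single-error correction, the classic Hamming(7,4) code carries 4 information bits and 3 parity check bits; each parity bit covers the codeword positions whose binary index has a particular bit set, so recomputing the three checks at the receiver spells out the position of a flipped bit. The function names and bit ordering here are one common convention, not the specific construction this text develops.

```python
# A sketch of single-bit error correction with the Hamming(7,4) code.
# Positions (1-based): 1=p1, 2=p2, 3=d1, 4=p4, 5=d2, 6=d3, 7=d4.

def hamming74_encode(d):
    """Encode 4 information bits [d1, d2, d3, d4] into a 7-bit block."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p4 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def hamming74_decode(c):
    """Correct at most one bit error, then return the information bits."""
    c = list(c)                        # work on a copy of the block
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]     # recompute each parity check
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s4    # 0 means no error detected
    if syndrome:
        c[syndrome - 1] ^= 1           # flip the bit the syndrome names
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
codeword = hamming74_encode(data)
codeword[5] ^= 1                       # a single bit error in the channel
assert hamming74_decode(codeword) == data
```

If two bits flip, the syndrome still points at some position, so the decoder "corrects" the wrong bit; this is exactly the wrong-correction risk noted above for multiple bit errors.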
In 1948, Claude Shannon showed that controlled redundancy in digital communications makes it possible to communicate at arbitrarily low bit error rates (BER). Error control coding (ECC) uses this controlled redundancy to detect and correct errors; when ECC is used to correct errors, this is called forward error correction (FEC). The method of error control coding used depends on the requirements of the system (e.g. data, voice, or video) and the nature of the channel (e.g. wireless, mobile, or high interference). The challenge in error control coding research is to add redundancy to the transmitted stream in such a way that the receiver can fully exploit that redundancy to detect and correct errors and to improve the coding gain -- the effective lowering of the power required, or equivalently an improvement in throughput. Codes have varying degrees of efficiency, and only a few limited cases actually make use of all of the redundancy in the channel. Different codes also work better in certain conditions, such as high bit error rates, high throughput requirements, or bursty channels.

Part of Shannon's work was discovering the theoretical limit on the lowest signal-to-noise ratio required to achieve a given throughput. Shannon also proved that channels have a maximum capacity, C, that cannot be surpassed regardless of how good the code is. However, he proved that codes exist such that, by keeping the code rate, R (explained below), less than C, one can control the error rate through the code structure alone, without having to lower the effective information throughput. Four decades of research have led to codes that come very close to matching this theoretical limit.