The Shannon-Hartley theorem gives the maximum rate at which error-free digital data can be transmitted over a communications channel (e.g., a copper wire or an optical fiber) of a specified bandwidth in the presence of noise.

Bandwidth is the range of frequencies that a communications channel can carry. The greater the bandwidth of a channel, the greater its throughput (i.e., its data transmission capacity).

The term noise refers to signals in a communications channel that are unrelated to the information being transmitted and that can reduce the throughput of the channel. In copper wire, noise can take the form of electrical signals arising from radio frequency interference (RFI) generated by electrical and electronic products. In optical fiber, it can take the form of distortions of the light waves traversing the fiber caused by minor imperfections in the fiber.

Unlimited amounts of error-free data could theoretically be transmitted over an analog communications channel that was noise-free and of infinite bandwidth. Real-world channels, however, are limited by both bandwidth and noise. A bandwidth limitation alone does not impose a theoretical limit on throughput, because it is still possible for the signal to take on an infinite number of different voltage levels on each cycle, with each slightly different level assigned a different meaning or bit sequence. When both a bandwidth limitation and noise are present (as is always the case in the real world), however, there is a limit on the amount of information transfer regardless of the sophistication of the encoding and decoding techniques. This is because the noise obscures the fine differences that distinguish the various signal levels, thereby limiting the number of detection levels that can be employed.

Taking into consideration all possible multi-level and multi-phase encoding techniques, the Shannon-Hartley theorem gives the theoretical maximum rate at which data can be sent through an analog communications channel with a given average signal power level, either error-free or with an arbitrarily low bit error rate (BER). For any given BER, however small, a coding technique can be found that achieves that rate, although the smaller the BER, the more complicated the technique.

The theorem is named after its developers, Claude Shannon and Ralph Hartley, both of whom were researchers at Bell Labs, and it was proved by Shannon in 1948. It is a foundation of information theory and has extensive applications in both communications and data storage.
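In modern notation, the theorem is usually stated as

    C = B \log_2\!\left(1 + \frac{S}{N}\right)

where C is the channel capacity in bits per second, B is the bandwidth of the channel in hertz, and S/N is the ratio of the average signal power to the average noise power (expressed as a linear ratio, not in decibels).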
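As an illustration, the following minimal Python sketch computes this capacity; the function name and the telephone-line figures in the example are illustrative assumptions, not part of the original article.

import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Maximum error-free data rate in bits per second for a channel
    of the given bandwidth and signal-to-noise ratio."""
    snr_linear = 10 ** (snr_db / 10)  # convert decibels to a linear power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: an analog telephone line with roughly 3,100 Hz of usable
# bandwidth and a signal-to-noise ratio of about 30 dB (a factor of 1,000)
print(shannon_capacity(3100, 30))  # approximately 30,898 bits per second

No encoding scheme, however sophisticated, can reliably exceed this figure on such a channel, although well-designed coding techniques can approach it.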