Bandwidth is the range of frequencies that a communications channel can carry. The greater the bandwidth of a channel, the greater its information-carrying capacity.
Unlimited amounts of error-free data could theoretically be transmitted over an infinite-bandwidth, noise-free analog communications channel. Real-world signals, however, are limited by both bandwidth constraints and noise. A bandwidth limitation alone imposes no theoretical limit on throughput, because the signal can still take on an infinite number of distinct voltage levels on each cycle, with each slightly different level assigned a different meaning or bit sequence. When there is both a bandwidth limitation and noise (as is always the case in the real world), however, there is a limit on the amount of information transfer regardless of the sophistication of the encoding and decoding techniques. This is because the noise obscures the fine differences that distinguish the various signal levels, thereby limiting the number of detection levels that can be employed.

Taking into consideration all possible multi-level and multi-phase encoding techniques, the Shannon-Hartley theorem gives the theoretical maximum rate at which error-free data, or data with an arbitrarily low bit error rate (BER), can be sent through an analog communications channel with a given average signal power level. For any given BER, however small, a coding technique can be found that achieves that rate, although the smaller the given BER, the more complicated the technique.

The theorem is named after its developers, Claude Shannon and Ralph Hartley, both of whom were researchers at Bell Labs; it was proved by Claude Shannon in 1948. It is a foundation of information theory and has extensive applications in both communications and data storage.

Created November 13, 2005.
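The capacity limit the theorem describes is C = B log2(1 + S/N), where B is the bandwidth in hertz and S/N is the linear signal-to-noise power ratio. As a minimal sketch (the 3000 Hz / 30 dB figures below are illustrative values roughly matching a classic analog telephone line, not from this article):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + S/N) in bits per second.

    snr_db is the signal-to-noise ratio in decibels; it is converted
    to a linear power ratio before applying the formula.
    """
    snr_linear = 10 ** (snr_db / 10)  # dB -> linear power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative example: a 3000 Hz channel with a 30 dB SNR
# (S/N = 1000), similar to an analog telephone line.
print(round(shannon_capacity(3000, 30)))  # -> 29902 bits per second
```

Note how the formula reflects the discussion above: as noise vanishes (S/N grows without bound) the capacity grows without bound, while for any finite S/N the capacity is finite no matter how sophisticated the encoding.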