Shannon–Hartley theorem

In information theory, the Shannon–Hartley theorem is an application of the noisy-channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to Gaussian noise.

The theorem establishes Shannon's channel capacity for such a communication link, a bound on the maximum amount of error-free information per time unit that can be transmitted with a specified bandwidth in the presence of the noise interference, assuming that the signal power is bounded, and that the Gaussian noise process is characterized by a known power or power spectral density.

The theorem states that the channel capacity C, the theoretical tightest upper bound on the information rate of data that can be communicated at an arbitrarily low error rate using an average received signal power S through an analog communication channel subject to additive white Gaussian noise (AWGN) of power N, is

C = B \log_2 \left( 1 + \frac{S}{N} \right)

where C is the channel capacity in bits per second, B is the bandwidth of the channel in hertz, S is the average received signal power over the bandwidth in watts, N is the average noise (or interference) power over the bandwidth in watts, and S/N is the signal-to-noise ratio (SNR) of the signal to the noise.
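As a sketch of how the formula is applied, with illustrative bandwidth and SNR values that are assumptions rather than figures from this article, the capacity of an AWGN channel can be computed directly:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Hypothetical example: a 3 kHz channel with a 30 dB signal-to-noise ratio.
snr_linear = 10 ** (30.0 / 10)                 # 30 dB corresponds to S/N = 1000
print(shannon_capacity(3000.0, snr_linear))    # about 29,900 bit/s
```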

During the late 1920s, Harry Nyquist and Ralph Hartley developed a handful of fundamental ideas related to the transmission of information, particularly in the context of the telegraph as a communications system.

At the time, these concepts were powerful breakthroughs individually, but they were not part of a comprehensive theory.

In the 1940s, Claude Shannon developed the concept of channel capacity, based in part on the ideas of Nyquist and Hartley, and then formulated a complete theory of information and its transmission.

Nyquist published his results in 1928 as part of his paper "Certain Topics in Telegraph Transmission Theory".

During 1928, Hartley formulated a way to quantify information and its line rate (also known as the data signalling rate R, in bits per second).[2] This method, later known as Hartley's law, became an important precursor for Shannon's more sophisticated notion of channel capacity.

Hartley argued that the maximum number of distinguishable pulse levels that can be transmitted and received reliably over a communications channel is limited by the dynamic range of the signal amplitude and the precision with which the receiver can distinguish amplitude levels.

Specifically, if the amplitude of the transmitted signal is restricted to the range of [−A ... +A] volts, and the precision of the receiver is ±ΔV volts, then the maximum number of distinct pulses M is given by

M = 1 + \frac{A}{\Delta V}

By taking the information per pulse in bit/pulse to be the base-2 logarithm of the number of distinct messages M that could be sent, Hartley[3] constructed a measure of the line rate R as

R = f_p \log_2 (M)

where f_p is the pulse rate, also known as the symbol rate, in pulses (symbols) per second.
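For illustration, with hypothetical figures not taken from Hartley's paper: an amplitude range of A = 1 volt with receiver precision ΔV = 0.25 volt gives M = 1 + 1/0.25 = 5 distinguishable levels, so each pulse carries \log_2 5 ≈ 2.32 bits, and at a pulse rate of f_p = 1000 pulses per second the line rate is R ≈ 2320 bit/s.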

Hartley then combined the above quantification with Nyquist's observation that the number of independent pulses that could be put through a channel of bandwidth B hertz was 2B pulses per second, to arrive at his quantitative measure for achievable line rate:

R \leq 2B \log_2 (M)

bits per second.[5]

Hartley did not work out exactly how the number M should depend on the noise statistics of the channel, or how the communication could be made reliable even when individual symbol pulses could not be reliably distinguished to M levels; with Gaussian noise statistics, system designers had to choose a very conservative value of M to achieve a low error rate.
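A minimal computational sketch of this bound, reusing the same hypothetical amplitude range, receiver precision, and bandwidth as the examples above (these figures are assumptions for illustration, not values from the original papers):

```python
import math

def hartley_rate(bandwidth_hz: float, amplitude_v: float, precision_v: float) -> float:
    """Hartley's achievable line rate R <= 2B * log2(M), with M = 1 + A/deltaV levels."""
    m_levels = 1 + amplitude_v / precision_v   # distinguishable pulse amplitudes
    pulse_rate = 2 * bandwidth_hz              # Nyquist: at most 2B independent pulses/s
    return pulse_rate * math.log2(m_levels)

# Hypothetical example: B = 3 kHz, amplitude range of +/-1 V, receiver precision 0.25 V.
print(hartley_rate(3000.0, 1.0, 0.25))         # M = 5 levels, about 13,900 bit/s
```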

Hartley's rate result can be viewed as the capacity of an errorless M-ary channel of 2B symbols per second.

Building on Hartley's foundation, Shannon's noisy channel coding theorem (1948) describes the maximum possible efficiency of error-correcting methods versus levels of noise interference and data corruption.

For a noisy channel with channel capacity C and information transmitted at a line rate R, the theorem establishes that if R < C, there exists a coding technique which allows the probability of error at the receiver to be made arbitrarily small.

The Shannon–Hartley theorem connects Hartley's result with Shannon's channel capacity theorem in a form that is equivalent to specifying the M in Hartley's line rate formula in terms of a signal-to-noise ratio, but achieving reliability through error-correction coding rather than through reliably distinguishable pulse levels.

If there were such a thing as a noise-free analog channel, one could transmit unlimited amounts of error-free data over it per unit of time (Note that an infinite-bandwidth analog channel could not transmit unlimited amounts of error-free data absent infinite signal power).

Real channels, however, are subject to limitations imposed by both finite bandwidth and nonzero noise.

Bandwidth and noise affect the rate at which information can be transmitted over an analog channel.

Bandwidth limitations alone do not impose a cap on the maximum information rate because it is still possible for the signal to take on an indefinitely large number of different voltage levels on each symbol pulse, with each slightly different level being assigned a different meaning or bit sequence.

In the case of the Shannon–Hartley theorem, the noise is assumed to be generated by a Gaussian process with a known variance.

Such noise can arise both from random sources of energy and from coding and measurement error at the sender and receiver, respectively.

Comparing the channel capacity to the information rate from Hartley's law, we can find the effective number of distinguishable levels M:[8]

2B \log_2 (M) = B \log_2 \left( 1 + \frac{S}{N} \right)

M = \sqrt{1 + \frac{S}{N}}

The square root effectively converts the power ratio back to a voltage ratio, so the number of levels is approximately proportional to the ratio of signal RMS amplitude to noise standard deviation.
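As a numerical illustration with an assumed SNR of 30 dB (S/N = 1000): M = \sqrt{1001} ≈ 31.6 effective levels, or \log_2 31.6 ≈ 4.98 bits per pulse, so that 2B pulses per second reproduce the Shannon capacity B \log_2 (1001) ≈ 9.97 B bits per second.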

This similarity in form between Shannon's capacity and Hartley's law should not be interpreted to mean that M pulse levels can be literally sent without any confusion; more levels are needed to allow for redundant coding and error correction, but the net data rate that can be approached with coding is equivalent to using that M in Hartley's law.

When the SNR is small (S/N ≪ 1), the logarithm can be approximated as \log_2 (1 + S/N) \approx (S/N) / \ln 2, so the capacity is approximately C \approx 1.44 \, B \, S/N. In this low-SNR approximation, capacity is independent of bandwidth if the noise is white, of spectral density N_0 watts per hertz, in which case the total noise power is N = B N_0 and

C \approx 1.44 \, \frac{S}{N_0}
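As a numerical illustration with assumed values not from the article: a received power of S = 1 mW over white noise of density N_0 = 10^{-9} W/Hz gives C ≈ 1.44 × 10^{-3} / 10^{-9} ≈ 1.44 Mbit/s, independent of B, provided the bandwidth is large enough that S/N = S/(B N_0) ≪ 1 (for example, B = 100 MHz gives S/N = 0.01).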

Figure: AWGN channel capacity with the power-limited regime and bandwidth-limited regime indicated; B and C can be scaled proportionally for other values.