In quantum mechanics, information theory, and Fourier analysis, the entropic uncertainty or Hirschman uncertainty is defined as the sum of the temporal and spectral Shannon entropies.
It turns out that Heisenberg's uncertainty principle can be expressed as a lower bound on the sum of these entropies.
This is stronger than the usual statement of the uncertainty principle in terms of the product of standard deviations.
In 1957,[1] Hirschman considered a function f and its Fourier transform g such that

g(y) \approx \int_{-\infty}^{\infty} \exp(-2\pi i x y)\, f(x)\, dx, \qquad f(x) \approx \int_{-\infty}^{\infty} \exp(2\pi i x y)\, g(y)\, dy,

where the "≈" indicates convergence in L2, and normalized so that (by Plancherel's theorem)

\int_{-\infty}^{\infty} |f(x)|^2\, dx = \int_{-\infty}^{\infty} |g(y)|^2\, dy = 1.

He showed that for any such functions the sum of the Shannon entropies is non-negative,

H(|f|^2) + H(|g|^2) \equiv -\int_{-\infty}^{\infty} |f(x)|^2 \log |f(x)|^2\, dx - \int_{-\infty}^{\infty} |g(y)|^2 \log |g(y)|^2\, dy \ge 0.

A tighter bound,

H(|f|^2) + H(|g|^2) \ge \log\frac{e}{2},
was conjectured by Hirschman[1] and Everett,[2] proven in 1975 by W. Beckner[3] and in the same year interpreted as a generalized quantum mechanical uncertainty principle by Białynicki-Birula and Mycielski.[5] Note, however, that the above entropic uncertainty function is distinctly different from the quantum von Neumann entropy represented in phase space.
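As an illustrative check (not drawn from the sources above), the bound is saturated by Gaussians. Take f(x) = (2a)^{1/4} e^{-\pi a x^2} with a > 0; under the transform convention above, g(y) = (2/a)^{1/4} e^{-\pi y^2 / a}, so |f|^2 and |g|^2 are normal densities with variances 1/(4\pi a) and a/(4\pi). Using the Gaussian entropy H = \frac{1}{2}\log(2\pi e \sigma^2),

H(|f|^2) + H(|g|^2) = \frac{1}{2}\log\frac{e}{2a} + \frac{1}{2}\log\frac{ea}{2} = \log\frac{e}{2},

independently of the width a.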
The proof of this tight inequality depends on the so-called (q, p)-norm of the Fourier transformation.
The (q, p)-norm of the Fourier transform is defined to be[6]

\|\mathcal F\|_{q,p} = \sup_{f \in L^p(\mathbb R)} \frac{\|\mathcal F f\|_q}{\|f\|_p}, \qquad \text{where } 1 < p \le 2 \text{ and } \frac{1}{p} + \frac{1}{q} = 1.

In 1961, Babenko[7] found this norm for even integer values of q.
Finally, in 1975, using Hermite functions as eigenfunctions of the Fourier transform, Beckner[3] proved that the value of this norm (in one dimension) for all q ≥ 2 is

\|\mathcal F\|_{q,p} = \sqrt{p^{1/p} / q^{1/q}}.

Thus we have the Babenko–Beckner inequality that

\|\mathcal F f\|_q \le \left(p^{1/p} / q^{1/q}\right)^{1/2} \|f\|_p.

From this inequality, an expression of the uncertainty principle in terms of the Rényi entropy can be derived.
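For illustration (a standard observation, not spelled out here in the cited references), the Gaussian f(x) = e^{-\pi x^2}, which is its own Fourier transform under the convention above, attains this norm: since \|e^{-\pi x^2}\|_r = r^{-1/(2r)} for every r > 0,

\frac{\|\mathcal F f\|_q}{\|f\|_p} = \frac{q^{-1/(2q)}}{p^{-1/(2p)}} = \left(\frac{p^{1/p}}{q^{1/q}}\right)^{1/2}.

For example, with p = 4/3 and q = 4 the constant is \left((4/3)^{3/4}/4^{1/4}\right)^{1/2} \approx 0.937 < 1, so the Fourier transform is a strict contraction from L^{4/3}(\mathbb R) to L^{4}(\mathbb R).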
Letting g = \mathcal F f, 2\alpha = p, and 2\beta = q, so that \frac{1}{\alpha} + \frac{1}{\beta} = 2 and \frac{1}{2} < \alpha < 1 < \beta, we have

\left(\int_{\mathbb R} |g(y)|^{2\beta}\, dy\right)^{1/(2\beta)} \le \frac{(2\alpha)^{1/(4\alpha)}}{(2\beta)^{1/(4\beta)}} \left(\int_{\mathbb R} |f(x)|^{2\alpha}\, dx\right)^{1/(2\alpha)}.

Squaring both sides and taking the logarithm, we get

\frac{1}{\beta} \log\left(\int_{\mathbb R} |g(y)|^{2\beta}\, dy\right) \le \frac{1}{2} \log\frac{(2\alpha)^{1/\alpha}}{(2\beta)^{1/\beta}} + \frac{1}{\alpha} \log\left(\int_{\mathbb R} |f(x)|^{2\alpha}\, dx\right).

Multiplying both sides by \frac{\beta}{1-\beta} = -\frac{\alpha}{1-\alpha}, which is negative, reverses the sense of the inequality; rearranging terms then gives a bound on the sum of the Rényi entropies,

H_\alpha(|f|^2) + H_\beta(|g|^2) \ge \frac{1}{2}\left(\frac{\log\alpha}{\alpha - 1} + \frac{\log\beta}{\beta - 1}\right) - \log 2,

where H_\alpha(P) = \frac{1}{1-\alpha} \log\left(\int_{\mathbb R} P(x)^\alpha\, dx\right). Since this bound is symmetric in \alpha and \beta (exchange the roles of f and g), we can rewrite the condition on \alpha and \beta simply as: both are positive, not both equal to one, and \frac{1}{\alpha} + \frac{1}{\beta} = 2. Taking the limit \alpha, \beta \to 1 yields the less general Shannon entropy inequality,

H(|f|^2) + H(|g|^2) \ge \log\frac{e}{2},

valid for any base of logarithm, as long as we choose an appropriate unit of information, bit, nat, etc.
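The limit uses only \lim_{t \to 1} \frac{\log t}{t - 1} = 1 (a one-line check added here for completeness):

\lim_{\alpha, \beta \to 1}\left[\frac{1}{2}\left(\frac{\log\alpha}{\alpha - 1} + \frac{\log\beta}{\beta - 1}\right) - \log 2\right] = \frac{1}{2}(1 + 1) - \log 2 = \log\frac{e}{2}.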
The constant will be different, though, for a different normalization of the Fourier transform (such as is usually used in physics, with normalizations chosen so that ħ = 1), i.e.,

H(|f|^2) + H(|g|^2) \ge \log(\pi e) \qquad \text{for} \qquad g(y) \approx \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} \exp(-ixy)\, f(x)\, dx.

In this case, the dilation of the Fourier transform absolute squared by a factor of 2π simply adds log(2π) to its entropy.
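To see where the new constant comes from (a short calculation supplied here, writing \tilde g for the physics-normalized transform), note that \tilde g(y) = \frac{1}{\sqrt{2\pi}}\, g\!\left(\frac{y}{2\pi}\right), so |\tilde g(y)|^2 = \frac{1}{2\pi}\, |g(y/(2\pi))|^2 is the density |g|^2 dilated by the factor 2π. Since dilating a density by c adds \log c to its differential entropy,

H(|\tilde g|^2) = H(|g|^2) + \log(2\pi), \qquad\text{hence}\qquad H(|f|^2) + H(|\tilde g|^2) \ge \log\frac{e}{2} + \log(2\pi) = \log(\pi e).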
For any probability density function \phi on the real line, Shannon's entropy inequality specifies

H(\phi) \le \frac{1}{2} \log\left(2\pi e\, V(\phi)\right),

where H is the Shannon entropy and V is the variance, an inequality that is saturated only in the case of a normal distribution.
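As an example of the slack in this inequality (our own illustration), take \phi uniform on an interval of length L: then H(\phi) = \log L and V(\phi) = L^2/12, so

\frac{1}{2}\log\left(2\pi e\, V(\phi)\right) = \log L + \frac{1}{2}\log\frac{\pi e}{6} \approx \log L + 0.18 > H(\phi).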
That is (for ħ = 1), exponentiating the Hirschman inequality and using Shannon's expression above,

\frac{1}{2} \le \frac{1}{2e\pi}\, e^{H(|f|^2) + H(|g|^2)} \le \sqrt{V(|f|^2)\, V(|g|^2)}.

Hirschman[1] explained that entropy—his version of entropy was the negative of Shannon's—is a "measure of the concentration of [a probability distribution] in a set of small measure."
Thus a low or large negative Shannon entropy means that a considerable mass of the probability distribution is confined to a set of small measure.
Note that this set of small measure need not be contiguous; a probability distribution can have several concentrations of mass in intervals of small measure, and the entropy may still be low no matter how widely scattered those intervals are.
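The chain of inequalities above lends itself to a direct numerical check. The sketch below is an illustration of ours; the bimodal test function, grid sizes, and cutoffs are arbitrary choices, not taken from the sources. It discretizes the ħ = 1 transform and evaluates \frac{1}{2} \le \frac{1}{2e\pi} e^{H(|f|^2)+H(|g|^2)} \le \sqrt{V(|f|^2)\,V(|g|^2)} for a superposition of two displaced Gaussians.

```python
# Numerical check of  1/2 <= exp(H_f + H_g)/(2*e*pi) <= sqrt(V_f * V_g)
# for a non-Gaussian wave function, using the physics convention (hbar = 1):
#     g(y) = (1/sqrt(2*pi)) * Integral exp(-i*x*y) f(x) dx
# The bimodal test function and all grid parameters are illustrative choices.
import numpy as np

x = np.linspace(-20.0, 20.0, 1200); dx = x[1] - x[0]   # position grid
y = np.linspace(-15.0, 15.0, 1200); dy = y[1] - y[0]   # momentum grid

# Test wave function: two Gaussian bumps centred at +/- 2, normalized to unit L2 norm
f = np.exp(-0.5 * (x - 2.0)**2) + np.exp(-0.5 * (x + 2.0)**2)
f = f / np.sqrt(np.sum(np.abs(f)**2) * dx)

# Riemann-sum discretization of the continuous Fourier transform (hbar = 1 convention)
kernel = np.exp(-1j * np.outer(y, x)) / np.sqrt(2.0 * np.pi)
g = kernel @ f * dx

pf = np.abs(f)**2
pg = np.abs(g)**2
pg = pg / (np.sum(pg) * dy)        # renormalize away the (tiny) discretization error

def entropy(p, dz):
    """Differential Shannon entropy -Integral p log p, by Riemann sum (natural log)."""
    p = p[p > 1e-300]
    return -np.sum(p * np.log(p)) * dz

def variance(z, p, dz):
    mean = np.sum(z * p) * dz
    return np.sum((z - mean)**2 * p) * dz

Hf, Hg = entropy(pf, dx), entropy(pg, dy)
Vf, Vg = variance(x, pf, dx), variance(y, pg, dy)

middle = np.exp(Hf + Hg) / (2.0 * np.e * np.pi)
upper = np.sqrt(Vf * Vg)
print(f"0.5 <= {middle:.4f} <= {upper:.4f}")   # middle should land strictly between the bounds
```

For this bimodal f the middle quantity falls strictly between the two bounds, illustrating that for distributions far from Gaussian the entropic statement is strictly stronger than the variance statement.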