Entropy power inequality

The entropy power inequality (EPI) is a result in information theory showing that the entropy power of suitably well-behaved random variables is a superadditive function.

The entropy power inequality was proved in 1948 by Claude Shannon in his seminal paper "A Mathematical Theory of Communication".

For a random vector X : Ω → R^n with probability density function f : R^n → R, the differential entropy of X, denoted h(X), is defined to be

h(X) = -\int_{\mathbb{R}^n} f(x) \log f(x) \, dx

and the entropy power of X, denoted N(X), is defined to be

N(X) = \frac{1}{2\pi e} e^{\frac{2}{n} h(X)}.

In particular, N(X) = |K|^{1/n} when X is normally distributed with covariance matrix K. Let X and Y be independent random variables with probability density functions in the L^p space L^p(\mathbb{R}^n) for some p > 1.
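As a quick consistency check of these definitions (a worked example added here, not part of the original statement), take n = 1 and X ~ N(0, σ²). Then

h(X) = \tfrac{1}{2}\log(2\pi e \sigma^2), \qquad N(X) = \frac{1}{2\pi e} e^{2 h(X)} = \sigma^2,

which agrees with N(X) = |K|^{1/n} for K = σ² and n = 1.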

Then

N(X + Y) \geq N(X) + N(Y).

Moreover, equality holds if and only if X and Y are multivariate normal random variables with proportional covariance matrices.
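As an illustrative numerical sketch (the choice of distributions is an assumption made here, not part of the original text), the inequality can be checked for independent X ~ Uniform(0, 1) and Y ~ Exponential(1), whose differential entropies are known in closed form (h(X) = 0 and h(Y) = 1 in nats):

```python
import numpy as np
from scipy.signal import fftconvolve

# A numerical sanity check of the entropy power inequality (a sketch, not a proof).
# X ~ Uniform(0, 1) and Y ~ Exponential(1) are independent, with closed-form
# differential entropies h(X) = 0 and h(Y) = 1 (in nats).

dz = 1e-3
z = np.arange(0.0, 30.0, dz)                   # grid carrying essentially all the mass

f_x = ((z >= 0.0) & (z <= 1.0)).astype(float)  # density of Uniform(0, 1)
f_y = np.exp(-z)                               # density of Exponential(1) on z >= 0

# Density of X + Y, approximated by numerical convolution on the grid.
f_sum = np.clip(fftconvolve(f_x, f_y)[: len(z)] * dz, 0.0, None)

def diff_entropy(f, dx):
    """Differential entropy -∫ f log f dx, skipping points where f = 0."""
    mask = f > 0
    return -np.sum(f[mask] * np.log(f[mask])) * dx

def entropy_power(h, n=1):
    """Entropy power N = exp(2 h / n) / (2 pi e)."""
    return np.exp(2.0 * h / n) / (2.0 * np.pi * np.e)

N_x = entropy_power(0.0)                       # h(Uniform(0, 1)) = 0
N_y = entropy_power(1.0)                       # h(Exponential(1)) = 1
N_sum = entropy_power(diff_entropy(f_sum, dz))

print(f"N(X) + N(Y) = {N_x + N_y:.4f}")        # about 0.49
print(f"N(X + Y)    = {N_sum:.4f}")            # larger, as the EPI requires
assert N_sum >= N_x + N_y
```

For the equality case, if X and Y are normal with proportional covariance matrices, say K_Y = c K_X, then N(X + Y) = |K_X + K_Y|^{1/n} = (1 + c)|K_X|^{1/n} = N(X) + N(Y), consistent with the condition stated above.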

The entropy power inequality can be rewritten in an equivalent form that does not explicitly depend on the definition of entropy power (see Costa and Cover reference below).
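One common way of writing such an equivalent form (stated here as a sketch of the standard reformulation; the exact phrasing in Costa and Cover may differ) is

h(X + Y) \geq h(\tilde{X} + \tilde{Y}),

where \tilde{X} and \tilde{Y} are independent normal random vectors chosen so that h(\tilde{X}) = h(X) and h(\tilde{Y}) = h(Y).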