Brascamp–Lieb inequality

In mathematics, the Brascamp–Lieb inequality is either of two inequalities. The first is a result in geometry concerning integrable functions on $n$-dimensional Euclidean space $\mathbb{R}^n$.

The second is a result of probability theory which gives a concentration inequality for log-concave probability distributions.

Both are named after Herm Jan Brascamp and Elliott H. Lieb.

Fix natural numbers $m$ and $n$. For $1 \le i \le m$, let $n_i \in \mathbb{N}$ and let $c_i > 0$ so that

$$\sum_{i=1}^m c_i n_i = n.$$

Choose non-negative, integrable functions

$$f_i \in L^1\left(\mathbb{R}^{n_i}; [0, +\infty]\right)$$

and surjective linear maps

$$B_i : \mathbb{R}^n \to \mathbb{R}^{n_i}.$$

Then the following inequality holds:

$$\int_{\mathbb{R}^n} \prod_{i=1}^m f_i(B_i x)^{c_i} \, \mathrm{d}x \;\le\; D^{-1/2} \prod_{i=1}^m \left( \int_{\mathbb{R}^{n_i}} f_i(y) \, \mathrm{d}y \right)^{c_i},$$

where $D$ is given by

$$D = \inf \left\{ \left. \frac{\det\left( \sum_{i=1}^m c_i B_i^{*} A_i B_i \right)}{\prod_{i=1}^m (\det A_i)^{c_i}} \;\right|\; A_i \text{ a positive-definite } n_i \times n_i \text{ matrix} \right\}.$$

Another way to state this is that the constant $D$ is what one would obtain by restricting attention to the case in which each $f_i$ is a centered Gaussian function, namely $f_i(y) = \exp\left(-\langle y, A_i y\rangle\right)$.[1]
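As a rough numerical illustration of this statement, the following Python sketch treats the one-dimensional datum $n = 1$, $m = 2$, $B_1 = B_2 = \mathrm{id}$, $c_1 = c_2 = 1/2$ (an illustrative choice, for which the inequality reduces to the Cauchy–Schwarz inequality and $D = 1$): it estimates $D$ by minimizing the Gaussian ratio and then checks the inequality for two arbitrary test functions.

    # Minimal sketch: Brascamp-Lieb for the datum n = 1, m = 2, B_1 = B_2 = id,
    # c_1 = c_2 = 1/2, which reduces to the Cauchy-Schwarz inequality (so D = 1).
    import numpy as np
    from scipy.integrate import quad
    from scipy.optimize import minimize

    c = np.array([0.5, 0.5])

    # D = inf over positive a_1, a_2 of (c_1 a_1 + c_2 a_2) / (a_1^{c_1} a_2^{c_2}),
    # the Gaussian (here 1x1 "matrix") form of the constant.
    def ratio(log_a):
        a = np.exp(log_a)
        return (c @ a) / np.prod(a ** c)

    D = minimize(ratio, x0=np.zeros(2)).fun    # should be ~1.0 (attained at a_1 = a_2)

    # Two arbitrary non-negative integrable test functions.
    f1 = lambda y: np.exp(-y**2) * (1 + np.sin(y)**2)
    f2 = lambda y: 1.0 / (1 + y**4)

    lhs = quad(lambda x: f1(x)**0.5 * f2(x)**0.5, -np.inf, np.inf)[0]
    rhs = D**(-0.5) * quad(f1, -np.inf, np.inf)[0]**0.5 * quad(f2, -np.inf, np.inf)[0]**0.5
    print(D, lhs, rhs, lhs <= rhs + 1e-9)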

Consider a probability density function $p(x) = \exp(-\varphi(x))$. This probability density function is log-concave when $\varphi(x)$ is a convex function. Such probability density functions have tails which decay exponentially fast, so most of the probability mass resides in a small region around the mode of $p(x)$. The Brascamp–Lieb inequality gives another characterization of the compactness of $p(x)$ by bounding the variance of any statistic $S(x)$.

The Brascamp–Lieb inequality reads:

$$\operatorname{Var}_p\left(S(x)\right) \;\le\; E_p\left( \nabla^T S(x) \, [H\varphi(x)]^{-1} \, \nabla S(x) \right),$$

where $H$ is the Hessian and $\nabla$ is the nabla symbol.[2]
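As an illustration, the following Python sketch checks this variance bound by one-dimensional numerical integration for the illustrative choices $\varphi(x) = x^4/4 + x^2/2$ (which is convex, with $\varphi''(x) = 3x^2 + 1$) and $S(x) = \sin(2x)$; in one dimension the bound reads $\operatorname{Var}_p(S) \le E_p[S'(x)^2 / \varphi''(x)]$.

    # Sketch of the variance form in one dimension, for the log-concave density
    # p(x) proportional to exp(-phi(x)) with phi(x) = x^4/4 + x^2/2 and S(x) = sin(2x):
    #     Var_p(S) <= E_p[ S'(x)^2 / phi''(x) ].
    import numpy as np
    from scipy.integrate import quad

    phi = lambda x: x**4 / 4 + x**2 / 2
    w = lambda x: np.exp(-phi(x))                     # unnormalized density
    Z = quad(w, -np.inf, np.inf)[0]
    E = lambda g: quad(lambda x: g(x) * w(x), -np.inf, np.inf)[0] / Z   # expectation under p

    S = lambda x: np.sin(2 * x)
    dS = lambda x: 2 * np.cos(2 * x)
    ddphi = lambda x: 3 * x**2 + 1                    # phi'' > 0, so phi is convex

    var_S = E(lambda x: S(x)**2) - E(S)**2
    bound = E(lambda x: dS(x)**2 / ddphi(x))
    print(var_S, bound, var_S <= bound)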

The inequality was generalized in 2008[3] to account for both continuous and discrete cases, and for all linear maps, with precise estimates on the constant.

Call a pair $(B, c)$, with $B = (B_1, \dots, B_m)$ a tuple of linear maps $B_i : \mathbb{R}^n \to \mathbb{R}^{n_i}$ and $c = (c_1, \dots, c_m)$ a tuple of positive exponents, a Brascamp–Lieb (BL) datum. Now define the Brascamp–Lieb constant for the BL datum:

$$\mathrm{BL}(B, c) = \sup_{f_1, \dots, f_m} \frac{\int_{\mathbb{R}^n} \prod_{i=1}^m f_i(B_i x)^{c_i} \, \mathrm{d}x}{\prod_{i=1}^m \left( \int_{\mathbb{R}^{n_i}} f_i \right)^{c_i}},$$

the supremum being taken over non-negative integrable $f_i$ with $0 < \int f_i < \infty$. The constant $\mathrm{BL}(B, c)$ is finite exactly when the scaling condition $n = \sum_{i=1}^m c_i n_i$ holds and, for every subspace $V \subseteq \mathbb{R}^n$,

$$\dim V \;\le\; \sum_{i=1}^m c_i \dim(B_i V).$$

For fixed maps $B$, the set of exponent vectors $c$ that satisfies the above two conditions is a closed convex polytope defined by linear inequalities, called the BL polytope of $B$.

Note that while there are infinitely many possible choices of subspace $V \subseteq \mathbb{R}^n$, there are only finitely many possible values of $\dim V$ and of $\dim(B_i V)$, and hence only finitely many distinct linear inequalities on $c$, so the subset is a closed convex polytope.

Similarly we can define the BL polytope for the discrete case.
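As a concrete illustration of the two finiteness conditions above, the following Python sketch takes the Loomis–Whitney datum in $\mathbb{R}^3$ (the three projections onto coordinate hyperplanes with $c_i = 1/2$, an illustrative choice) and verifies the scaling condition exactly while spot-checking the subspace condition on random subspaces; this is a numerical sanity check, not a proof of membership in the BL polytope.

    # Sketch of the two finiteness conditions for a BL datum (B, c):
    # the scaling condition n = sum_i c_i n_i, and the subspace condition
    # dim V <= sum_i c_i dim(B_i V), spot-checked on random subspaces V.
    import numpy as np
    from numpy.linalg import matrix_rank

    # Loomis-Whitney datum in R^3: projections onto the three coordinate hyperplanes.
    B = [np.array([[0, 1, 0], [0, 0, 1]]),
         np.array([[1, 0, 0], [0, 0, 1]]),
         np.array([[1, 0, 0], [0, 1, 0]])]
    c = np.array([0.5, 0.5, 0.5])
    n = 3

    scaling_ok = np.isclose(sum(ci * Bi.shape[0] for ci, Bi in zip(c, B)), n)

    rng = np.random.default_rng(1)
    subspace_ok = True
    for _ in range(1000):
        k = rng.integers(1, n + 1)                 # dimension of the random subspace V
        V = rng.standard_normal((n, k))            # columns span V (almost surely of rank k)
        lhs = matrix_rank(V)
        rhs = sum(ci * matrix_rank(Bi @ V) for ci, Bi in zip(c, B))
        subspace_ok &= lhs <= rhs + 1e-9
    print(scaling_ok, subspace_ok)                 # both True: c lies in the BL polytope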

The case of the Brascamp–Lieb inequality in which all the $n_i$ are equal to 1 was proved earlier than the general case.[6] In 1989, Keith Ball introduced a "geometric form" of this inequality. Suppose that $u_1, \dots, u_m$ are unit vectors in $\mathbb{R}^n$ and $c_1, \dots, c_m$ are positive numbers satisfying

$$\sum_{i=1}^m c_i \langle x, u_i \rangle \, u_i = x$$

for all $x \in \mathbb{R}^n$, and that $f_1, \dots, f_m$ are positive measurable functions on $\mathbb{R}$. Then

$$\int_{\mathbb{R}^n} \prod_{i=1}^m f_i\left(\langle x, u_i \rangle\right)^{c_i} \, \mathrm{d}x \;\le\; \prod_{i=1}^m \left( \int_{\mathbb{R}} f_i \right)^{c_i}.$$

Thus, when the vectors $u_i$ resolve the inner product in this way, the inequality has a particularly simple form: the constant is equal to 1 and the extremal Gaussian densities are identical.
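As a numerical illustration of the geometric form, the following Python sketch uses three unit vectors at 120 degrees in $\mathbb{R}^2$ with $c_i = 2/3$ (an illustrative choice that resolves the identity) and compares the two sides of the inequality on a finite grid for three arbitrary test functions; the grid size and the functions are ad hoc choices, so the check is approximate.

    # Sketch of Ball's geometric form in R^2: three unit vectors at 120 degrees
    # with c_i = 2/3 resolve the identity, and the inequality holds with constant 1.
    import numpy as np

    angles = np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])
    U = np.stack([np.cos(angles), np.sin(angles)], axis=1)    # rows are the unit vectors u_i
    c = np.full(3, 2.0 / 3.0)

    # Check sum_i c_i u_i u_i^T = I_2 (the "resolving the inner product" condition).
    resolution = sum(ci * np.outer(ui, ui) for ci, ui in zip(c, U))
    print(np.allclose(resolution, np.eye(2)))                 # True

    # Arbitrary non-negative integrable test functions on R.
    fs = [lambda t: np.exp(-np.abs(t)),
          lambda t: np.exp(-t**2),
          lambda t: 1.0 / (1.0 + t**4)]

    L, N = 12.0, 1200
    grid = np.linspace(-L, L, N)
    dx = grid[1] - grid[0]
    X, Y = np.meshgrid(grid, grid, indexing='ij')

    integrand = np.ones_like(X)
    for ci, ui, fi in zip(c, U, fs):
        integrand *= fi(ui[0] * X + ui[1] * Y) ** ci
    lhs = integrand.sum() * dx * dx
    rhs = np.prod([(fi(grid).sum() * dx) ** ci for ci, fi in zip(c, fs)])
    print(lhs, rhs, lhs <= rhs)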

Ball used this inequality to estimate volume ratios and isoperimetric quotients for convex sets.[7][8]

There is also a geometric version of the more general inequality in which the maps $B_i$ are orthogonal projections and

$$\sum_{i=1}^m c_i B_i^{*} B_i = I_n,$$

where $I_n$ is the identity operator on $\mathbb{R}^n$.
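For instance (an illustrative datum, not one singled out above), the three coordinate-hyperplane projections in $\mathbb{R}^3$ with $c_i = 1/2$, the datum underlying the Loomis–Whitney inequality, satisfy this condition, as the short Python check below confirms.

    # Short check: the coordinate-hyperplane projections in R^3 with c_i = 1/2
    # satisfy sum_i c_i B_i^* B_i = I_3 (the condition of the geometric version).
    import numpy as np

    B = [np.array([[0, 1, 0], [0, 0, 1]]),    # projection onto the (x_2, x_3)-plane
         np.array([[1, 0, 0], [0, 0, 1]]),    # projection onto the (x_1, x_3)-plane
         np.array([[1, 0, 0], [0, 1, 0]])]    # projection onto the (x_1, x_2)-plane
    c = [0.5, 0.5, 0.5]

    total = sum(ci * Bi.T @ Bi for ci, Bi in zip(c, B))
    print(np.allclose(total, np.eye(3)))      # True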

Take $n_i = n$, $B_i = \mathrm{id}$, the identity map on $\mathbb{R}^n$, replacing $f_i$ by $f_i^{1/c_i}$, and let $c_i = 1/p_i$ for $1 \le i \le m$. Then

$$\sum_{i=1}^m \frac{1}{p_i} = 1$$

and the log-concavity of the determinant of a positive-definite matrix implies that $D = 1$. This yields Hölder's inequality in $\mathbb{R}^n$:

$$\int_{\mathbb{R}^n} \prod_{i=1}^m f_i(x) \, \mathrm{d}x \;\le\; \prod_{i=1}^m \left\| f_i \right\|_{p_i}.$$
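As a quick sanity check of this specialization, the following Python sketch verifies Hölder's inequality on $\mathbb{R}$ for the illustrative exponents $p_1 = 3$, $p_2 = 3/2$ and two arbitrary non-negative test functions.

    # Sketch of the Hölder specialization on R, with 1/p_1 + 1/p_2 = 1.
    import numpy as np
    from scipy.integrate import quad

    p1, p2 = 3.0, 1.5
    f1 = lambda x: np.exp(-x**2)
    f2 = lambda x: 1.0 / (1.0 + x**2)

    lhs = quad(lambda x: f1(x) * f2(x), -np.inf, np.inf)[0]
    rhs = (quad(lambda x: f1(x)**p1, -np.inf, np.inf)[0]**(1 / p1)
           * quad(lambda x: f2(x)**p2, -np.inf, np.inf)[0]**(1 / p2))
    print(lhs, rhs, lhs <= rhs)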

The Brascamp–Lieb inequality is an extension of the Poincaré inequality, which only concerns Gaussian probability distributions.[9]

The Brascamp–Lieb inequality is also related to the Cramér–Rao bound.[9] While Brascamp–Lieb gives an upper bound, the Cramér–Rao bound lower-bounds the variance $\operatorname{Var}_p(S(x))$.

The Cramér–Rao bound states

$$\operatorname{Var}_p\left(S(x)\right) \;\ge\; E_p\left( \nabla^T S(x) \right) \, \left[ E_p\left( H\varphi(x) \right) \right]^{-1} \, E_p\left( \nabla S(x) \right),$$

which is very similar to the Brascamp–Lieb inequality in the alternative form shown above.
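To see the two bounds side by side, the following Monte Carlo sketch in Python uses a Gaussian density $p = N(0, \Sigma)$ (so $H\varphi = \Sigma^{-1}$ is constant) and the statistic $S(x) = x_1 + \sin(x_2)$; the covariance matrix and the statistic are illustrative choices, and the printed values should satisfy the Cramér–Rao lower bound, then $\operatorname{Var}_p(S)$, then the Brascamp–Lieb upper bound, in increasing order.

    # Monte Carlo comparison of the two bounds for a Gaussian density p = N(0, Sigma)
    # and the statistic S(x) = x_1 + sin(x_2):
    #     Cramér-Rao lower bound <= Var_p(S) <= Brascamp-Lieb upper bound.
    import numpy as np

    rng = np.random.default_rng(0)
    Sigma = np.array([[1.0, 0.5],
                      [0.5, 2.0]])
    x = rng.multivariate_normal([0.0, 0.0], Sigma, size=400_000)

    S = x[:, 0] + np.sin(x[:, 1])
    grad_S = np.column_stack([np.ones(len(x)), np.cos(x[:, 1])])

    var_S = S.var()
    # Brascamp-Lieb: E[ grad^T (H phi)^{-1} grad ] = E[ grad^T Sigma grad ] for a Gaussian.
    bl_upper = np.einsum('ni,ij,nj->n', grad_S, Sigma, grad_S).mean()
    # Cramér-Rao: E[grad]^T [E(H phi)]^{-1} E[grad] = E[grad]^T Sigma E[grad].
    mean_grad = grad_S.mean(axis=0)
    cr_lower = mean_grad @ Sigma @ mean_grad
    print(cr_lower, var_S, bl_upper, cr_lower <= var_S <= bl_upper)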