Multidimensional Chebyshev's inequality

In probability theory, the multidimensional Chebyshev's inequality[1] is a generalization of Chebyshev's inequality, which puts a bound on the probability of the event that a random variable differs from its expected value by more than a specified amount.

Since y := (X − μ)ᵀ V⁻¹ (X − μ) is positive, Markov's inequality holds:

Pr(√((X − μ)ᵀ V⁻¹ (X − μ)) > t) = Pr(√y > t) = Pr(y > t²) ≤ E[y] / t².

Finally, E[y] = E[(X − μ)ᵀ V⁻¹ (X − μ)] = E[trace(V⁻¹ (X − μ)(X − μ)ᵀ)] = trace(V⁻¹ V) = N, which completes the proof.

Infinite dimensions

There is a straightforward extension of the vector version of Chebyshev's inequality to infinite-dimensional settings.
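As an illustrative sanity check (not part of the original article), the finite-dimensional bound Pr(√((X − μ)ᵀ V⁻¹ (X − μ)) > t) ≤ N/t² can be verified by Monte Carlo simulation. The distribution, dimension, and sample size below are arbitrary choices for the sketch:

```python
import numpy as np

# Monte Carlo check of the multidimensional Chebyshev bound:
# Pr(sqrt((X - mu)^T V^{-1} (X - mu)) > t) <= N / t^2.
rng = np.random.default_rng(0)
N = 3                                    # dimension (arbitrary)
mu = np.array([1.0, -2.0, 0.5])          # arbitrary mean vector
A = rng.standard_normal((N, N))
V = A @ A.T + N * np.eye(N)              # a positive-definite covariance matrix
samples = rng.multivariate_normal(mu, V, size=200_000)

Vinv = np.linalg.inv(V)
d = samples - mu
# Mahalanobis distance sqrt((X - mu)^T V^{-1} (X - mu)) for every sample.
mahalanobis = np.sqrt(np.einsum('ij,jk,ik->i', d, Vinv, d))

for t in (2.0, 3.0, 5.0):
    empirical = np.mean(mahalanobis > t)
    bound = N / t**2
    print(f"t={t}: empirical={empirical:.4f}  bound={bound:.4f}")
```

For Gaussian samples the empirical tail probabilities come out well below the Chebyshev bound, which is expected: the inequality is distribution-free and therefore loose for light-tailed laws.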

Let X be a random variable which takes values in a Fréchet space 𝒳 (equipped with a family of seminorms || ⋅ ||α).[3]

This includes most common settings of vector-valued random variables, e.g., when 𝒳 is a Banach space (equipped with a single norm), a Hilbert space, or the finite-dimensional setting described above.

Suppose that X is of "strong order two", meaning that E(||X||α²) < ∞ for every seminorm || ⋅ ||α.

This is a generalization of the requirement that X have finite variance, and is necessary for this strong form of Chebyshev's inequality in infinite dimensions.

The terminology "strong order two" is due to Vakhania.

Let μ ∈ 𝒳 be the Pettis integral of X (i.e., the vector generalization of the mean), and let σα := √(E ||X − μ||α²) be the standard deviation with respect to the seminorm || ⋅ ||α. In this setting, Chebyshev's inequality takes the following form: for all k > 0,

Pr(||X − μ||α ≥ k σα) ≤ 1/k².
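As a small numerical illustration (not part of the original article), note that on ℝᵈ the coordinate maps ||x||α = |xα| form a family of seminorms, so the statement above specializes to a per-coordinate Chebyshev bound. The skewed test distribution below is an arbitrary choice:

```python
import numpy as np

# Check Pr(||X - mu||_a >= k * sigma_a) <= 1 / k^2 for the family of
# coordinate seminorms ||x||_a = |x[a]| on R^d.
rng = np.random.default_rng(1)
d, n = 4, 100_000
# Skewed (exponential) data, one scale per coordinate.
X = rng.exponential(scale=np.arange(1.0, d + 1.0), size=(n, d))

mu = X.mean(axis=0)                            # empirical stand-in for the mean
sigma = np.sqrt(((X - mu) ** 2).mean(axis=0))  # sigma_a for each seminorm

k = 2.5
for a in range(d):
    empirical = np.mean(np.abs(X[:, a] - mu[a]) >= k * sigma[a])
    print(f"seminorm {a}: empirical={empirical:.4f}  bound={1 / k**2:.4f}")
```

Because the moments are computed from the empirical distribution itself, the bound here holds exactly by Chebyshev's inequality applied to the empirical measure, not merely up to sampling error.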

The proof is straightforward, and essentially the same as the finitary version. We treat the two cases σα = 0 and σα > 0 separately.

If σα = 0, then X is constant (and equal to μ) almost surely, so the inequality is trivial.

If σα > 0, we may divide by it. The crucial trick in Chebyshev's inequality is to recognize that the events ||X − μ||α ≥ k σα and ||X − μ||α² / (k σα)² ≥ 1 are the same. Applying Markov's inequality to the latter gives

Pr(||X − μ||α ≥ k σα) = Pr(||X − μ||α² / (k σα)² ≥ 1) ≤ E(||X − μ||α²) / (k σα)² = 1/k²,

which completes the proof.