Uncorrelatedness (probability theory)

In probability theory and statistics, two real-valued random variables, X and Y, are said to be uncorrelated if their covariance,

{\displaystyle \operatorname {cov} [X,Y]=\operatorname {E} [XY]-\operatorname {E} [X]\operatorname {E} [Y],}

is zero.

If two variables are uncorrelated, there is no linear relationship between them.

Uncorrelated random variables have a Pearson correlation coefficient of zero whenever that coefficient exists; in the trivial case in which either variable has zero variance (is a constant), the correlation coefficient is undefined.
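As a numerical illustration of these definitions (a sketch only; NumPy, the chosen distributions, and the sample size are assumptions made here for demonstration, not part of the cited material), the covariance E[XY] - E[X]E[Y] and the Pearson coefficient can be estimated from simulated samples:

import numpy as np

rng = np.random.default_rng(seed=0)
n = 1_000_000

# Independent draws: covariance and Pearson correlation should be near 0.
x = rng.normal(size=n)
y = rng.normal(size=n)

# Sample analogue of cov[X, Y] = E[XY] - E[X]E[Y].
cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)
# Pearson correlation coefficient = cov / (std_X * std_Y).
r_xy = cov_xy / (np.std(x) * np.std(y))
print(cov_xy, r_xy)   # both close to 0

# A linearly related pair: clearly correlated.
z = 2.0 * x + 0.1 * rng.normal(size=n)
cov_xz = np.mean(x * z) - np.mean(x) * np.mean(z)
r_xz = cov_xz / (np.std(x) * np.std(z))
print(cov_xz, r_xz)   # cov close to 2, r close to 1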

In general, uncorrelatedness is not the same as orthogonality, except in the special case where at least one of the two random variables has an expected value of 0.

In this case, the covariance is the expectation of the product, and X and Y are uncorrelated if and only if E[XY] = 0.

If X and Y are independent, with finite second moments, then they are uncorrelated: independence gives E[XY] = E[X]E[Y], so the covariance is zero. However, not all uncorrelated variables are independent.[1]

Two random variables X and Y are called uncorrelated if their covariance cov[X, Y] = E[XY] - E[X]E[Y] is zero; formally:

{\displaystyle X,Y{\text{ uncorrelated}}\quad \iff \quad \operatorname {E} [XY]=\operatorname {E} [X]\cdot \operatorname {E} [Y]}

Two complex random variables Z and W are called uncorrelated if both their covariance and their pseudo-covariance are zero, i.e.

{\displaystyle Z,W{\text{ uncorrelated}}\quad \iff \quad \operatorname {E} [Z{\overline {W}}]=\operatorname {E} [Z]\cdot \operatorname {E} [{\overline {W}}]{\text{ and }}\operatorname {E} [ZW]=\operatorname {E} [Z]\cdot \operatorname {E} [W]}
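Both conditions are needed. The following sketch (illustrative only; it assumes NumPy and a particular construction, Z with independent standard normal real and imaginary parts and W equal to the conjugate of Z) shows a pair for which the conjugate condition E[ZW̄] = E[Z]E[W̄] holds while the pseudo-covariance condition E[ZW] = E[Z]E[W] fails, so Z and W are not uncorrelated in the sense used here:

import numpy as np

rng = np.random.default_rng(seed=1)
n = 1_000_000

# Complex random variable Z = A + iB with independent standard normal parts
# (an illustrative choice, not from the article).
z = rng.normal(size=n) + 1j * rng.normal(size=n)
w = np.conj(z)  # W is the complex conjugate of Z

def cov(a, b):
    # Covariance E[A conj(B)] - E[A] conj(E[B]) estimated from complex samples.
    return np.mean(a * np.conj(b)) - np.mean(a) * np.conj(np.mean(b))

def pseudo_cov(a, b):
    # Pseudo-covariance E[AB] - E[A]E[B] (no conjugation).
    return np.mean(a * b) - np.mean(a) * np.mean(b)

print(cov(z, w))         # close to 0: the conjugate condition holds
print(pseudo_cov(z, w))  # close to 2: the pseudo-covariance condition fails,
                         # so Z and W are not uncorrelated as complex variables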

A set of two or more random variables X_1, …, X_n is called uncorrelated if each pair of them is uncorrelated. This is equivalent to the requirement that the non-diagonal elements of the autocovariance matrix K_XX of the random vector X = (X_1, …, X_n)^T are all zero. The autocovariance matrix is defined as:

{\displaystyle \operatorname {K} _{\mathbf {X} \mathbf {X} }=\operatorname {cov} [\mathbf {X} ,\mathbf {X} ]=\operatorname {E} [(\mathbf {X} -\operatorname {E} [\mathbf {X} ])(\mathbf {X} -\operatorname {E} [\mathbf {X} ])^{\rm {T}}]=\operatorname {E} [\mathbf {X} \mathbf {X} ^{\rm {T}}]-\operatorname {E} [\mathbf {X} ]\operatorname {E} [\mathbf {X} ]^{\rm {T}}}
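As a quick numerical check of this equivalence (a sketch assuming NumPy and an arbitrary choice of three mutually independent components; none of this comes from the cited sources), the estimated autocovariance matrix of a random vector with pairwise uncorrelated components has near-zero off-diagonal entries:

import numpy as np

rng = np.random.default_rng(seed=2)
n = 500_000

# Three mutually independent components (hence pairwise uncorrelated);
# the variances 1, 4, 9 are illustrative assumptions.
X = np.stack([rng.normal(scale=s, size=n) for s in (1.0, 2.0, 3.0)])

# Autocovariance matrix K_XX = E[(X - E[X])(X - E[X])^T], estimated from samples.
K = np.cov(X, bias=True)   # rows of X are treated as the variables

print(np.round(K, 3))
# Off-diagonal entries are near 0, reflecting pairwise uncorrelatedness;
# the diagonal holds the variances (about 1, 4, 9).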

An example of uncorrelated but dependent variables: let X be a continuous random variable uniformly distributed on [-1, 1] and let Y = X^2. The claim is that X and Y have zero covariance (and thus are uncorrelated), but are not independent.

Proof: Since X is uniformly distributed on [-1, 1], E[X] = 0 and E[Y] = E[X^2] = 1/3. The variables are not independent: the joint distribution of (X, Y) assigns probability 0 to the triangle defined by 0 < x < y < 1, because Y = X^2 < X whenever 0 < X < 1, whereas under independence this triangle would carry positive probability, since both marginal densities are positive on (0, 1). On the other hand, using E[X] = 0 to drop the X/3 term,

{\displaystyle \operatorname {Cov} [X,Y]=\operatorname {E} \left[(X-\operatorname {E} [X])(Y-\operatorname {E} [Y])\right]=\operatorname {E} \left[X^{3}-{X \over 3}\right]=\operatorname {E} \left[X^{3}\right]={{1^{4}-(-1)^{4}} \over {4\times 2}}=0,}

so X and Y are uncorrelated, which proves the claim.
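The same example can be checked by simulation (a sketch assuming NumPy; the conditioning thresholds below are arbitrary illustrative choices): the sample covariance of X and Y is near zero, while conditioning on X visibly changes the distribution of Y, confirming dependence:

import numpy as np

rng = np.random.default_rng(seed=3)
n = 1_000_000

x = rng.uniform(-1.0, 1.0, size=n)   # X uniform on [-1, 1]
y = x**2                             # Y = X^2, fully determined by X

# Uncorrelated: the sample covariance is near 0.
print(np.mean(x * y) - np.mean(x) * np.mean(y))   # close to 0

# Not independent: conditioning on X changes the distribution of Y.
print(np.mean(y[np.abs(x) < 0.5]))   # about 1/12
print(np.mean(y[np.abs(x) > 0.5]))   # about 7/12 (would equal the line above under independence)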

There are cases in which uncorrelatedness does imply independence.

One of these cases is when both random variables are two-valued (so each can be linearly transformed to have a Bernoulli distribution).[3] Further, two jointly normally distributed random variables are independent if they are uncorrelated,[4] although this does not hold for variables whose marginal distributions are normal and uncorrelated but whose joint distribution is not jointly normal (see Normally distributed and uncorrelated does not imply independent).
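The two-valued case can be made concrete with a short check (a sketch assuming NumPy; the random joint distributions are generated here purely for illustration): for 0/1-valued variables, every deviation P(X=a, Y=b) - P(X=a)P(Y=b) has the same magnitude as the covariance, so zero covariance forces the joint distribution to factor, i.e. independence:

import numpy as np

rng = np.random.default_rng(seed=4)

# For 0/1-valued variables, independence is equivalent to zero covariance.
# Numerical check: for random 2x2 joint pmfs, every deviation
# |P(X=a, Y=b) - P(X=a)P(Y=b)| equals |cov[X, Y]|, so one vanishes iff the other does.
for _ in range(5):
    p = rng.dirichlet(np.ones(4)).reshape(2, 2)   # random joint pmf on {0, 1}^2
    px, py = p.sum(axis=1), p.sum(axis=0)         # marginal pmfs of X and Y
    cov = p[1, 1] - px[1] * py[1]                 # covariance of the 0/1 variables
    dev = np.max(np.abs(p - np.outer(px, py)))    # worst deviation from independence
    print(f"cov = {cov:+.4f}   max |P(a,b) - P(a)P(b)| = {dev:.4f}")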

Two random vectors X = (X_1, …, X_m)^T and Y = (Y_1, …, Y_n)^T are called uncorrelated if

{\displaystyle \operatorname {E} [\mathbf {X} \mathbf {Y} ^{\rm {T}}]=\operatorname {E} [\mathbf {X} ]\operatorname {E} [\mathbf {Y} ]^{\rm {T}}.}

They are uncorrelated if and only if their cross-covariance matrix K_XY is zero.[5]: p.337

Two complex random vectors Z and W are called uncorrelated if their cross-covariance matrix and their pseudo-cross-covariance matrix are zero, i.e. if

{\displaystyle \operatorname {K} _{\mathbf {Z} \mathbf {W} }=\operatorname {J} _{\mathbf {Z} \mathbf {W} }=0}

where

{\displaystyle \operatorname {K} _{\mathbf {Z} \mathbf {W} }=\operatorname {E} [(\mathbf {Z} -\operatorname {E} [\mathbf {Z} ])(\mathbf {W} -\operatorname {E} [\mathbf {W} ])^{\rm {H}}]}

and

{\displaystyle \operatorname {J} _{\mathbf {Z} \mathbf {W} }=\operatorname {E} [(\mathbf {Z} -\operatorname {E} [\mathbf {Z} ])(\mathbf {W} -\operatorname {E} [\mathbf {W} ])^{\rm {T}}].}

Two stochastic processes {X_t} and {Y_t} are called uncorrelated if their cross-covariance

{\displaystyle \operatorname {K} _{\mathbf {X} \mathbf {Y} }(t_{1},t_{2})=\operatorname {E} \left[\left(X(t_{1})-\mu _{X}(t_{1})\right)\left(Y(t_{2})-\mu _{Y}(t_{2})\right)\right]}

is zero for all pairs of times t_1, t_2.
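As an illustration of this last definition (a sketch assuming NumPy; the particular processes, a Gaussian random walk and Gaussian white noise, and the time indices are arbitrary choices), the cross-covariance of two independently generated processes can be estimated over many realizations and is near zero at every pair of times:

import numpy as np

rng = np.random.default_rng(seed=5)
n_paths, n_steps = 200_000, 20

# Two independently generated discrete-time processes, sampled as many realizations.
X = np.cumsum(rng.normal(size=(n_paths, n_steps)), axis=1)  # Gaussian random walk
Y = rng.normal(size=(n_paths, n_steps))                     # Gaussian white noise

def cross_cov(t1, t2):
    # Monte Carlo estimate of K_XY(t1, t2) = E[(X(t1) - mu_X(t1)) (Y(t2) - mu_Y(t2))].
    xc = X[:, t1] - X[:, t1].mean()
    yc = Y[:, t2] - Y[:, t2].mean()
    return np.mean(xc * yc)

for t1, t2 in [(0, 0), (5, 15), (19, 3)]:
    print(t1, t2, round(cross_cov(t1, t2), 3))   # near 0 for every pair of times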