In probability theory and statistics, a cross-covariance matrix is a matrix whose element in the i, j position is the covariance between the i-th element of a random vector and the j-th element of another random vector.
When the two random vectors are the same, the cross-covariance matrix is referred to as the covariance matrix.
A random vector is a random variable with multiple dimensions.
Each element of the vector is a scalar random variable.
Each element has either a finite number of observed empirical values or a finite or infinite number of potential values.
The potential values are specified by a theoretical joint probability distribution.
Intuitively, the cross-covariance matrix generalizes the notion of covariance to multiple dimensions.
The cross-covariance matrix of two random vectors $\mathbf{X}$ and $\mathbf{Y}$ is typically denoted by $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ or $\operatorname{cov}(\mathbf{X},\mathbf{Y})$.

For random vectors $\mathbf{X}$ and $\mathbf{Y}$, each containing random elements whose expected value and variance exist, the cross-covariance matrix of $\mathbf{X}$ and $\mathbf{Y}$ is defined by[1]: 336

$\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \operatorname{cov}(\mathbf{X},\mathbf{Y}) = \operatorname{E}\left[(\mathbf{X} - \boldsymbol{\mu}_{\mathbf{X}})(\mathbf{Y} - \boldsymbol{\mu}_{\mathbf{Y}})^{\mathrm{T}}\right]$

where $\boldsymbol{\mu}_{\mathbf{X}} = \operatorname{E}[\mathbf{X}]$ and $\boldsymbol{\mu}_{\mathbf{Y}} = \operatorname{E}[\mathbf{Y}]$ are vectors containing the expected values of $\mathbf{X}$ and $\mathbf{Y}$. The vectors $\mathbf{X}$ and $\mathbf{Y}$ need not have the same dimension, and either might be a scalar value.
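As a minimal numerical sketch (assuming NumPy; the function name cross_covariance is chosen here purely for illustration), the defining expectation can be estimated from paired observations by replacing expectations with sample means:

    import numpy as np

    def cross_covariance(x, y):
        """Sample cross-covariance matrix of paired observations.

        x: array of shape (N, m), N joint observations of an m-dimensional X.
        y: array of shape (N, n), the corresponding observations of an n-dimensional Y.
        Returns an (m, n) estimate of E[(X - mu_X)(Y - mu_Y)^T].
        """
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        xc = x - x.mean(axis=0)        # centre each component of X
        yc = y - y.mean(axis=0)        # centre each component of Y
        return xc.T @ yc / len(x)      # average of outer products (divide by N-1 for the unbiased version)

    # Example: only the first component of Y depends on X, so only one entry of K is nonzero.
    rng = np.random.default_rng(0)
    N = 100_000
    x = rng.normal(size=(N, 2))                                # X = (X1, X2)
    y = np.column_stack([x[:, 0] + rng.normal(size=N),         # Y1 = X1 + noise
                         rng.normal(size=N),                   # Y2 independent of X
                         rng.normal(size=N)])                  # Y3 independent of X
    print(cross_covariance(x, y))      # approximately [[1, 0, 0], [0, 0, 0]]

Note that the result is a 2 x 3 matrix here, reflecting that the two vectors need not have the same dimension.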
The cross-covariance matrix is the matrix whose $(i,j)$ entry is the covariance between the $i$-th element of $\mathbf{X}$ and the $j$-th element of $\mathbf{Y}$. This gives the following component-wise definition of the cross-covariance matrix: if $\mathbf{X} = (X_1, X_2, \ldots, X_m)^{\mathrm{T}}$ and $\mathbf{Y} = (Y_1, Y_2, \ldots, Y_n)^{\mathrm{T}}$ are random vectors, then $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ is the $m \times n$ matrix whose $(i,j)$-th entry is

$\operatorname{cov}(X_i, Y_j) = \operatorname{E}\left[(X_i - \operatorname{E}[X_i])(Y_j - \operatorname{E}[Y_j])\right].$
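As a small worked example (the variables here are chosen purely for illustration), let $U$ and $V$ be uncorrelated scalar random variables with variances $\sigma_U^2$ and $\sigma_V^2$, and set $\mathbf{X} = (U, V)^{\mathrm{T}}$ and $\mathbf{Y} = (U + V,\, U - V)^{\mathrm{T}}$. Evaluating each entry $\operatorname{cov}(X_i, Y_j)$ gives the $2 \times 2$ cross-covariance matrix

$\operatorname{K}_{\mathbf{X}\mathbf{Y}} = \begin{pmatrix} \operatorname{cov}(U, U+V) & \operatorname{cov}(U, U-V) \\ \operatorname{cov}(V, U+V) & \operatorname{cov}(V, U-V) \end{pmatrix} = \begin{pmatrix} \sigma_U^2 & \sigma_U^2 \\ \sigma_V^2 & -\sigma_V^2 \end{pmatrix}.$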
For the cross-covariance matrix, the following basic properties apply:[2]

$\operatorname{cov}(\mathbf{X},\mathbf{Y}) = \operatorname{cov}(\mathbf{Y},\mathbf{X})^{\mathrm{T}}$

$\operatorname{cov}(\mathbf{X}_1 + \mathbf{X}_2, \mathbf{Y}) = \operatorname{cov}(\mathbf{X}_1, \mathbf{Y}) + \operatorname{cov}(\mathbf{X}_2, \mathbf{Y})$

$\operatorname{cov}(A\mathbf{X} + \mathbf{a}, B^{\mathrm{T}}\mathbf{Y} + \mathbf{b}) = A\,\operatorname{cov}(\mathbf{X},\mathbf{Y})\,B$

If $\mathbf{X}$ and $\mathbf{Y}$ are independent (or, less restrictively, if every element of $\mathbf{X}$ is uncorrelated with every element of $\mathbf{Y}$), then $\operatorname{cov}(\mathbf{X},\mathbf{Y}) = 0_{m \times n}$

where $\mathbf{X}$, $\mathbf{X}_1$ and $\mathbf{X}_2$ are random $m \times 1$ vectors, $\mathbf{Y}$ is a random $n \times 1$ vector, $\mathbf{a}$ and $\mathbf{b}$ are constant vectors, $A$ and $B$ are conformable matrices of constants, and $0_{m \times n}$ is an $m \times n$ matrix of zeroes.
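As a quick numerical check of the third property (a sketch assuming NumPy; the helper xcov below re-implements the sample estimator from the earlier example), one can simulate correlated data, apply the affine maps $A\mathbf{X} + \mathbf{a}$ and $B^{\mathrm{T}}\mathbf{Y} + \mathbf{b}$, and compare the resulting sample cross-covariance with $A\,\operatorname{cov}(\mathbf{X},\mathbf{Y})\,B$:

    import numpy as np

    rng = np.random.default_rng(1)

    def xcov(x, y):
        # sample estimate of E[(X - mu_X)(Y - mu_Y)^T]; rows of x and y are paired observations
        return (x - x.mean(axis=0)).T @ (y - y.mean(axis=0)) / len(x)

    N, m, n = 200_000, 3, 2
    x = rng.normal(size=(N, m))                                   # X in R^3
    y = x @ rng.normal(size=(m, n)) + rng.normal(size=(N, n))     # Y in R^2, correlated with X

    A = rng.normal(size=(4, m)); a = rng.normal(size=4)           # A X + a maps R^3 -> R^4
    B = rng.normal(size=(n, 5)); b = rng.normal(size=5)           # B^T Y + b maps R^2 -> R^5

    lhs = xcov(x @ A.T + a, y @ B + b)     # cov(A X + a, B^T Y + b)
    rhs = A @ xcov(x, y) @ B               # A cov(X, Y) B
    print(np.max(np.abs(lhs - rhs)))       # ~1e-15: the identity also holds exactly for the sample estimator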
If $\mathbf{Z}$ and $\mathbf{W}$ are complex random vectors, the definition of the cross-covariance matrix is slightly changed. Transposition is replaced by Hermitian transposition:

$\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{cov}(\mathbf{Z},\mathbf{W}) = \operatorname{E}\left[(\mathbf{Z} - \boldsymbol{\mu}_{\mathbf{Z}})(\mathbf{W} - \boldsymbol{\mu}_{\mathbf{W}})^{\mathrm{H}}\right]$

For complex random vectors, another matrix called the pseudo-cross-covariance matrix is defined as follows:

$\operatorname{J}_{\mathbf{Z}\mathbf{W}} = \operatorname{cov}(\mathbf{Z},\overline{\mathbf{W}}) = \operatorname{E}\left[(\mathbf{Z} - \boldsymbol{\mu}_{\mathbf{Z}})(\mathbf{W} - \boldsymbol{\mu}_{\mathbf{W}})^{\mathrm{T}}\right]$
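As a brief sketch (again assuming NumPy; the particular construction of $\mathbf{Z}$ and $\mathbf{W}$ below is chosen only for illustration and is revisited after the definition of uncorrelatedness), both matrices can be estimated from complex samples, the first with a conjugate transpose and the second with an ordinary transpose:

    import numpy as np

    rng = np.random.default_rng(2)
    N = 100_000

    # Two 2-dimensional complex random vectors built from shared real Gaussian parts:
    # Z = U + iV and W = U - iV, with U and V independent standard normal vectors.
    u = rng.normal(size=(N, 2))
    v = rng.normal(size=(N, 2))
    z = u + 1j * v
    w = u - 1j * v

    zc = z - z.mean(axis=0)          # centre the samples of Z
    wc = w - w.mean(axis=0)          # centre the samples of W

    K = zc.T @ wc.conj() / N         # estimate of E[(Z - mu_Z)(W - mu_W)^H]
    J = zc.T @ wc / N                # estimate of E[(Z - mu_Z)(W - mu_W)^T]

    print(np.round(K, 2))            # approximately the 2 x 2 zero matrix
    print(np.round(J, 2))            # approximately 2 * identity, clearly nonzero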
Two random vectors $\mathbf{X}$ and $\mathbf{Y}$ are called uncorrelated if their cross-covariance matrix $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ is a zero matrix.[1]: 337

Complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ are called uncorrelated if their cross-covariance matrix and pseudo-cross-covariance matrix are both zero, i.e. if $\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{J}_{\mathbf{Z}\mathbf{W}} = 0$.
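To see why both conditions are needed, consider the scalar version of the pair used in the complex sketch above: $Z = U + iV$ and $W = U - iV$, with $U$ and $V$ uncorrelated, zero-mean, and of common variance $\sigma^2$ (an illustrative choice). Then

$\operatorname{K}_{ZW} = \operatorname{E}[Z\overline{W}] = \operatorname{E}[(U + iV)(U + iV)] = \operatorname{E}[U^2] - \operatorname{E}[V^2] + 2i\,\operatorname{E}[UV] = 0,$

$\operatorname{J}_{ZW} = \operatorname{E}[ZW] = \operatorname{E}[(U + iV)(U - iV)] = \operatorname{E}[U^2] + \operatorname{E}[V^2] = 2\sigma^2 \neq 0,$

so the cross-covariance vanishes even though $Z$ and $W$ are not uncorrelated in the complex sense, because the pseudo-cross-covariance does not.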