If we have two vectors $X = (X_1, \dots, X_n)$ and $Y = (Y_1, \dots, Y_m)$ of random variables, and there are correlations among the variables, then canonical-correlation analysis will find linear combinations of $X$ and $Y$ that have a maximum correlation with each other.[1] T. R. Knapp notes that "virtually all of the commonly encountered parametric tests of significance can be treated as special cases of canonical-correlation analysis, which is the general procedure for investigating the relationships between two sets of variables."[2] The method was first introduced by Harold Hotelling in 1936,[3] although in the context of angles between flats the mathematical concept was published by Camille Jordan in 1875.[5] Unfortunately, perhaps because of its popularity, the literature can be inconsistent with notation; we attempt to highlight such inconsistencies in this article to help the reader make the best use of the existing literature and techniques available.
Like its sister method PCA, CCA can be viewed in population form (corresponding to random vectors and their covariance matrices) or in sample form (corresponding to datasets and their sample covariance matrices).
These two forms are almost exact analogues of each other, which is why their distinction is often overlooked, but they can behave very differently in high-dimensional settings.[6] We next give explicit mathematical definitions for the population problem and highlight the different objects in the so-called canonical decomposition; understanding the differences between these objects is crucial for interpretation of the technique.
Given two column vectors $X = (X_1, \dots, X_n)^T$ and $Y = (Y_1, \dots, Y_m)^T$ of random variables with finite second moments, one may define the cross-covariance $\Sigma_{XY} = \operatorname{cov}(X, Y)$ to be the $n \times m$ matrix whose $(i, j)$ entry is the covariance $\operatorname{cov}(X_i, Y_j)$. In practice, we would estimate the covariance matrix based on sampled data from $X$ and $Y$ (i.e. from a pair of data matrices).
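As a minimal sketch of this estimation step (all names and data here are illustrative), the sample cross-covariance can be read off the joint sample covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs, n, m = 500, 3, 2
X = rng.standard_normal((n_obs, n))                   # one row per observation
Y = X[:, :m] + 0.5 * rng.standard_normal((n_obs, m))  # correlated with X

# Joint sample covariance of (X, Y); the off-diagonal block estimates Sigma_XY.
joint = np.cov(np.hstack([X, Y]), rowvar=False)
sigma_xx = joint[:n, :n]    # n x n
sigma_yy = joint[n:, n:]    # m x m
sigma_xy = joint[:n, n:]    # n x m cross-covariance block
```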
Canonical-correlation analysis seeks a sequence of vectors $a_k \in \mathbb{R}^n$ and $b_k \in \mathbb{R}^m$ ($k = 1, \dots, \min\{m, n\}$) such that the random variables $a_k^T X$ and $b_k^T Y$ maximize the correlation $\rho = \operatorname{corr}(a_k^T X, b_k^T Y)$. The (scalar) random variables $U = a_1^T X$ and $V = b_1^T Y$ are the first pair of canonical variables. Then one seeks vectors maximizing the same correlation subject to the constraint that they are to be uncorrelated with the first pair of canonical variables; this gives the second pair of canonical variables. This procedure may be continued up to $\min\{m, n\}$ times.
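In sample form, the canonical pairs can be estimated directly from data matrices; a short sketch using scikit-learn's CCA estimator (an iterative implementation; the data here are synthetic):

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 4))
# Construct Y to share linear structure with X, plus noise.
Y = X[:, :3] @ rng.standard_normal((3, 2)) + 0.3 * rng.standard_normal((200, 2))

cca = CCA(n_components=2)
U, V = cca.fit_transform(X, Y)   # columns are the sample canonical variables

# Empirical canonical correlations of the two pairs.
rhos = [np.corrcoef(U[:, k], V[:, k])[0, 1] for k in range(2)]
```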
Let $\Sigma_{XY}$ be the cross-covariance matrix for any pair of (vector-shaped) random variables $X$ and $Y$. The target function to maximize is
$$\rho = \frac{a^T \Sigma_{XY} b}{\sqrt{a^T \Sigma_{XX} a}\,\sqrt{b^T \Sigma_{YY} b}}.$$
The first step is to define a change of basis
$$c = \Sigma_{XX}^{1/2} a, \qquad d = \Sigma_{YY}^{1/2} b,$$
where $\Sigma_{XX}^{1/2}$ and $\Sigma_{YY}^{1/2}$ can be obtained from the eigen-decomposition (or by diagonalization) of $\Sigma_{XX}$ and $\Sigma_{YY}$. Thus
$$\rho = \frac{c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2} d}{\lVert c \rVert \, \lVert d \rVert}.$$
By the Cauchy–Schwarz inequality,
$$\left(c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2}\right) d \le \left(c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2} \, \Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1/2} c\right)^{1/2} \left(d^T d\right)^{1/2},$$
so that
$$\rho \le \frac{\left(c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1/2} c\right)^{1/2}}{\left(c^T c\right)^{1/2}}.$$
Equality holds if the vectors $d$ and $\Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1/2} c$ are collinear, and the maximum correlation is attained if $c$ is the eigenvector with the maximum eigenvalue of the matrix $\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1/2}$ (see Rayleigh quotient).
The subsequent pairs are found by using eigenvalues of decreasing magnitudes.
Orthogonality is guaranteed by the symmetry of the correlation matrices.
The solution is therefore:
- $c$ is an eigenvector of $\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1/2}$,
- $d$ is proportional to $\Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1/2} c$.

Reciprocally, there is also:
- $d$ is an eigenvector of $\Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1} \Sigma_{XY} \Sigma_{YY}^{-1/2}$,
- $c$ is proportional to $\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2} d$.

Reversing the change of coordinates, we have that:
- $a$ is an eigenvector of $\Sigma_{XX}^{-1} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX}$,
- $b$ is proportional to $\Sigma_{YY}^{-1} \Sigma_{YX} a$,
- $b$ is an eigenvector of $\Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1} \Sigma_{XY}$,
- $a$ is proportional to $\Sigma_{XX}^{-1} \Sigma_{XY} b$.

The canonical variables are defined by:
$$U = c^T \Sigma_{XX}^{-1/2} X = a^T X, \qquad V = d^T \Sigma_{YY}^{-1/2} Y = b^T Y.$$

CCA can be computed using singular value decomposition on a correlation matrix.[8]
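The closed-form solution above can be sketched in a few lines of NumPy (function names are illustrative; an SVD of the whitened cross-covariance replaces the explicit eigen-decomposition, since its singular values are the canonical correlations):

```python
import numpy as np

def inv_sqrt(S):
    """Inverse symmetric square root of S via its eigen-decomposition."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

def cca_closed_form(sigma_xx, sigma_yy, sigma_xy):
    """Canonical correlations and weight matrices from covariance blocks."""
    kx, ky = inv_sqrt(sigma_xx), inv_sqrt(sigma_yy)
    # Singular values of the whitened cross-covariance are the canonical
    # correlations; the singular vectors are the vectors c and d.
    c, rhos, dt = np.linalg.svd(kx @ sigma_xy @ ky)
    a = kx @ c      # columns are the weight vectors a_i (= Sigma_XX^{-1/2} c_i)
    b = ky @ dt.T   # columns are the weight vectors b_i (= Sigma_YY^{-1/2} d_i)
    return rhos, a, b
```

The canonical variables are then obtained as $U_i = a_i^T X$ and $V_i = b_i^T Y$ using the returned weight columns.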
It is available as a function in several statistical software packages.[9] CCA computation using singular value decomposition on a correlation matrix is related to the cosine of the angles between flats.
The cosine function is ill-conditioned for small angles, leading to very inaccurate computation of highly correlated principal vectors in finite precision computer arithmetic.
To fix this trouble, alternative algorithms[11] are available in some software libraries, as sketched below.
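For instance, SciPy's subspace_angles computes principal angles with a sine-based method that remains accurate for nearly collinear subspaces; assuming centered data matrices with observations in rows, the cosines of these angles are the sample canonical correlations:

```python
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(2)
X = rng.standard_normal((300, 3))
# Y lies almost inside the span of X, so the principal angles are tiny.
Y = X @ rng.standard_normal((3, 2)) + 1e-6 * rng.standard_normal((300, 2))

Xc = X - X.mean(axis=0)   # center so column spans correspond to covariances
Yc = Y - Y.mean(axis=0)

angles = subspace_angles(Xc, Yc)
rhos = np.sort(np.cos(angles))[::-1]   # canonical correlations, largest first
```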
Each row can be tested for significance with the following method. Since the correlations are ordered, saying that row $i$ is zero implies all further correlations are also zero. If we have $p$ independent observations in a sample and $\widehat{\rho}_i$ is the estimated correlation for $i = 1, \dots, \min\{m, n\}$, then for the $i$-th row the test statistic is
$$\chi^2 = -\left(p - 1 - \tfrac{1}{2}(m + n + 1)\right) \ln \prod_{j=i}^{\min\{m,n\}} \left(1 - \widehat{\rho}_j^{\,2}\right),$$
which is asymptotically distributed as a chi-squared with $(m - i + 1)(n - i + 1)$ degrees of freedom for large $p$.[12] Since all the correlations beyond $\min\{m, n\}$ are logically zero (and estimated that way also), the product for the terms after this point is irrelevant.
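A sketch of this test (the helper name is illustrative; rhos holds the estimated canonical correlations in decreasing order, and p is the number of independent observations):

```python
import numpy as np
from scipy.stats import chi2

def cca_row_pvalues(rhos, p, n, m):
    """P-values for the hypotheses that rows i..min(m, n) are all zero."""
    rhos = np.asarray(rhos)
    pvals = []
    for i in range(len(rhos)):   # i is the zero-based row index
        stat = -(p - 1 - (m + n + 1) / 2.0) * np.log(np.prod(1.0 - rhos[i:] ** 2))
        df = (m - i) * (n - i)   # equals (m - i + 1)(n - i + 1) with one-based i
        pvals.append(chi2.sf(stat, df))
    return pvals
```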
In psychological testing, for example, where two multidimensional personality tests are administered to the same individuals, one might find that an extraversion or neuroticism dimension accounted for a substantial amount of shared variance between the two tests.
Constraint restrictions can be imposed on such a model to ensure it reflects theoretical requirements or intuitively obvious conditions.[15] Visualization of the results of canonical correlation is usually through bar plots of the coefficients of the two sets of variables for the pairs of canonical variates showing significant correlation.
Some authors suggest that they are best visualized by plotting them as heliographs, a circular format with ray-like bars, with each half representing the two sets of variables.
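A minimal matplotlib sketch of the bar-plot convention (the function and labels are illustrative; a and b are the weight vectors of one significant canonical pair, rho its correlation):

```python
import matplotlib.pyplot as plt

def plot_canonical_pair(a, b, rho, k=1):
    """Side-by-side bar plots of the X-side and Y-side coefficients."""
    fig, (ax_x, ax_y) = plt.subplots(1, 2, figsize=(8, 3))
    ax_x.bar(range(len(a)), a)
    ax_x.set_title(f"X weights, pair {k} (rho = {rho:.2f})")
    ax_y.bar(range(len(b)), b)
    ax_y.set_title(f"Y weights, pair {k}")
    fig.tight_layout()
    plt.show()
```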
For example, take $X = X_1$ and $Y = Y_1$ with $Y_1 = -X_1$, i.e., perfectly anticorrelated variables; the first (and here only) pair of canonical variables is $U = X_1$ and $V = -Y_1 = X_1$, with canonical correlation $1$, which illustrates that the canonical-correlation analysis treats correlated and anticorrelated variables similarly.
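A quick numerical check of this point (synthetic one-dimensional data):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal(1000)
y = -x                            # perfectly anticorrelated with x

print(np.corrcoef(x, y)[0, 1])    # Pearson correlation: -1.0
# The canonical correlation maximizes corr(a*x, b*y) over a and b;
# taking b = -1 flips the sign, so the maximum is +1.
print(np.corrcoef(x, -y)[0, 1])   # +1.0
```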
Assuming that $X$ and $Y$ have zero expected values, their covariance matrices can be viewed as Gram matrices in an inner product for their entries. In this interpretation, the random variables, entries $X_i$ of $X$ and $Y_j$ of $Y$, are treated as elements of a vector space with an inner product given by the covariance $\operatorname{cov}(X_i, Y_j)$. The definition of the canonical variables $U$ and $V$ is then equivalent to the definition of principal vectors for the pair of subspaces spanned by the entries of $X$ and $Y$ with respect to this inner product, and the canonical correlations are equal to the cosines of the principal angles.
CCA can also be viewed as a special whitening transformation where the random vectors $X$ and $Y$ are simultaneously transformed in such a way that the cross-correlation between the whitened vectors $X^{\mathrm{CCA}}$ and $Y^{\mathrm{CCA}}$ is diagonal.[17] The canonical correlations are then interpreted as regression coefficients linking $X^{\mathrm{CCA}}$ and $Y^{\mathrm{CCA}}$, and they may also be negative.
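To illustrate this view with the closed-form construction from above (all names illustrative), the transformed samples have identity within-set covariance and a diagonal cross-covariance holding the canonical correlations:

```python
import numpy as np

def inv_sqrt(S):
    """Inverse symmetric square root via eigen-decomposition."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

rng = np.random.default_rng(4)
Xc = rng.standard_normal((500, 3))
Yc = Xc @ rng.standard_normal((3, 2)) + rng.standard_normal((500, 2))
Xc -= Xc.mean(axis=0)
Yc -= Yc.mean(axis=0)

n_obs = len(Xc)
sxx = Xc.T @ Xc / (n_obs - 1)
syy = Yc.T @ Yc / (n_obs - 1)
sxy = Xc.T @ Yc / (n_obs - 1)

c, rhos, dt = np.linalg.svd(inv_sqrt(sxx) @ sxy @ inv_sqrt(syy))
A = inv_sqrt(sxx) @ c[:, : len(rhos)]       # X-side weight matrix
B = inv_sqrt(syy) @ dt.T                    # Y-side weight matrix

U, V = Xc @ A, Yc @ B                       # simultaneously whitened vectors
print(np.round(U.T @ U / (n_obs - 1), 3))   # identity (whitened X side)
print(np.round(V.T @ V / (n_obs - 1), 3))   # identity (whitened Y side)
print(np.round(U.T @ V / (n_obs - 1), 3))   # diag(rhos): canonical correlations
```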