Cross-correlation

In signal processing, cross-correlation is a measure of similarity of two series as a function of the displacement of one relative to the other.

It is commonly used for searching a long signal for a shorter, known feature.

It has applications in pattern recognition, single particle analysis, electron tomography, averaging, cryptanalysis, and neurophysiology.

In probability and statistics, the term cross-correlations refers to the correlations between the entries of two random vectors $\mathbf{X}$ and $\mathbf{Y}$, while the correlations of a random vector $\mathbf{X}$ are the correlations between the entries of $\mathbf{X}$ itself, those forming the correlation matrix of $\mathbf{X}$.

If each of $\mathbf{X}$ and $\mathbf{Y}$ is a scalar random variable which is realized repeatedly in a time series, then the correlations of the various temporal instances of $\mathbf{X}$ are known as autocorrelations of $\mathbf{X}$, and the cross-correlations of $\mathbf{X}$ with $\mathbf{Y}$ across time are temporal cross-correlations.

If $X$ and $Y$ are two independent random variables with probability density functions $f$ and $g$, respectively, then the probability density of the difference $Y - X$ is formally given by the cross-correlation (in the signal-processing sense) $f \star g$; however, this terminology is not used in probability and statistics. In contrast, the convolution $f * g$ (equivalent to the cross-correlation of $f(t)$ and $g(-t)$) gives the probability density function of the sum $X + Y$.

For continuous functions $f$ and $g$, the cross-correlation is defined as $(f \star g)(\tau) \triangleq \int_{-\infty}^{\infty} \overline{f(t)}\, g(t+\tau)\, dt$, where $\overline{f(t)}$ denotes the complex conjugate of $f(t)$ and $\tau$ is the displacement, also known as lag. As an example, consider two real-valued functions $f$ and $g$ that differ only by an unknown shift along the x-axis. One can use the cross-correlation to find how much $g$ must be shifted along the x-axis to make it identical to $f$: the formula essentially slides the $g$ function along the x-axis, calculating the integral of their product at each position. When the functions match, the value of $(f \star g)$ is maximized.

This is because when peaks (positive areas) are aligned, they make a large contribution to the integral.
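
As an illustration of this sliding-product idea, here is a minimal sketch that evaluates the discrete cross-correlation directly from the definition $(f \star g)[\tau] = \sum_t \overline{f[t]}\, g[t+\tau]$; the function name and the toy signals are illustrative, not taken from the original text.

```python
import numpy as np

def cross_correlation(f, g):
    """Discrete cross-correlation (f * g)[tau] = sum_t conj(f[t]) * g[t + tau],
    evaluated for lags tau = -(len(g) - 1) ... len(f) - 1."""
    f = np.asarray(f, dtype=float)
    g = np.asarray(g, dtype=float)
    lags = np.arange(-(len(g) - 1), len(f))
    values = []
    for tau in lags:
        total = 0.0
        for t in range(len(f)):
            if 0 <= t + tau < len(g):
                total += f[t] * g[t + tau]   # conj(f[t]) == f[t] for real signals
        values.append(total)
    return lags, np.array(values)

# g is f shifted right by 3 samples; the cross-correlation peaks at the lag
# that re-aligns the two peaks, because that is where their product is largest.
f = np.array([0, 0, 1, 2, 1, 0, 0, 0, 0, 0], dtype=float)
g = np.roll(f, 3)
lags, values = cross_correlation(f, g)
print(lags[np.argmax(values)])   # 3
```

NumPy's built-in np.correlate implements a closely related convention and should be preferred in practice; the explicit loop above only mirrors the textbook definition.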

For random vectors $\mathbf{X} = (X_1, \ldots, X_m)$ and $\mathbf{Y} = (Y_1, \ldots, Y_n)$, each containing random elements whose expected value and variance exist, the cross-correlation matrix of $\mathbf{X}$ and $\mathbf{Y}$ is defined by $\operatorname{R}_{\mathbf{X}\mathbf{Y}} \triangleq \operatorname{E}\left[\mathbf{X}\mathbf{Y}^{\mathsf T}\right]$ and has dimensions $m \times n$.
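
A brief sketch (variable names and data are illustrative) of estimating this cross-correlation matrix from samples by averaging the outer products $\mathbf{X}\mathbf{Y}^{\mathsf T}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# N independent draws of an m-dimensional X and an n-dimensional Y (m = 3, n = 2).
N, m, n = 10_000, 3, 2
X = rng.normal(size=(N, m))
Y = 0.5 * X[:, :n] + rng.normal(scale=0.1, size=(N, n))   # Y partly driven by X

# Sample estimate of R_XY = E[X Y^T], an m-by-n matrix.
R_XY = X.T @ Y / N
print(R_XY.shape)        # (3, 2)
print(np.round(R_XY, 2))
```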

In time series analysis and statistics, the cross-correlation of a pair of random processes is the correlation between values of the processes at different times, as a function of the two times. Let $(X_t, Y_t)$ be a pair of random processes, and let $t$ be any point in time ($t$ may be an integer for a discrete-time process or a real number for a continuous-time process). Then $X_t$ is the value (or realization) produced by a given run of the process at time $t$. Suppose that the processes have means $\mu_X(t)$ and $\mu_Y(t)$ and variances $\sigma_X^2(t)$ and $\sigma_Y^2(t)$ at time $t$, for each $t$. Then the cross-correlation between times $t_1$ and $t_2$ is defined as $\operatorname{R}_{XY}(t_1, t_2) \triangleq \operatorname{E}\left[X_{t_1} \overline{Y_{t_2}}\right]$, where the overline denotes complex conjugation.

Subtracting the mean before multiplication yields the cross-covariance between times $t_1$ and $t_2$: $\operatorname{K}_{XY}(t_1, t_2) \triangleq \operatorname{E}\left[\left(X_{t_1} - \mu_X(t_1)\right)\overline{\left(Y_{t_2} - \mu_Y(t_2)\right)}\right]$.

Note that this expression is not well-defined for all time series or processes, because the mean or variance may not exist.

Let $(X_t, Y_t)$ represent a pair of stochastic processes that are jointly wide-sense stationary. Then the cross-covariance and cross-correlation functions are given by $\operatorname{K}_{XY}(\tau) = \operatorname{E}\left[\left(X_t - \mu_X\right)\overline{\left(Y_{t+\tau} - \mu_Y\right)}\right]$ and $\operatorname{R}_{XY}(\tau) = \operatorname{E}\left[X_t \overline{Y_{t+\tau}}\right]$, respectively, where $\mu_X$ and $\mu_Y$ are the (constant) means of $X_t$ and $Y_t$ and $\tau$ is the lag.

The fact that $\operatorname{K}_{XY}$ and $\operatorname{R}_{XY}$ depend only on the time difference $\tau$ is precisely the additional information (beyond being individually wide-sense stationary) conveyed by the requirement that $(X_t, Y_t)$ are jointly wide-sense stationary.

The cross-correlation of a pair of jointly wide-sense stationary stochastic processes can be estimated by averaging the product of samples measured from one process and samples measured from the other (and its time shifts).

For a large number of samples, the average converges to the true cross-correlation.
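
A minimal sketch of this sample-average estimator, assuming real-valued, jointly wide-sense stationary sequences (the helper name and test signals are illustrative):

```python
import numpy as np

def estimate_cross_correlation(x, y, max_lag):
    """Estimate R_xy(tau) = E[x[t] * y[t + tau]] for tau = -max_lag..max_lag
    by averaging products of samples from the two sequences."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = min(len(x), len(y))
    lags = np.arange(-max_lag, max_lag + 1)
    r = np.empty(len(lags))
    for i, tau in enumerate(lags):
        if tau >= 0:
            prods = x[: n - tau] * y[tau:n]
        else:
            prods = x[-tau:n] * y[: n + tau]
        r[i] = prods.mean()   # the average converges to R_xy(tau) as n grows
    return lags, r

# Example: y is a noisy copy of x delayed by 5 samples.
rng = np.random.default_rng(1)
x = rng.normal(size=50_000)
y = np.concatenate([np.zeros(5), x[:-5]]) + 0.1 * rng.normal(size=50_000)
lags, r = estimate_cross_correlation(x, y, max_lag=10)
print(lags[np.argmax(r)])   # 5
```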

It is common practice in some disciplines (e.g. statistics and time series analysis) to normalize the cross-correlation function to get a time-dependent Pearson correlation coefficient.

However, in other disciplines (e.g. engineering) the normalization is usually dropped and the terms "cross-correlation" and "cross-covariance" are used interchangeably.

For jointly wide-sense stationary stochastic processes, the definition of the normalized cross-correlation is $\rho_{XY}(\tau) = \frac{\operatorname{K}_{XY}(\tau)}{\sigma_X \sigma_Y} = \frac{\operatorname{E}\left[\left(X_t - \mu_X\right)\overline{\left(Y_{t+\tau} - \mu_Y\right)}\right]}{\sigma_X \sigma_Y}$. If $\rho_{XY}$ is well defined, its value must lie in the range $[-1, 1]$, with $1$ indicating perfect correlation and $-1$ indicating perfect anticorrelation.

The normalization is important both because the interpretation of the autocorrelation as a correlation provides a scale-free measure of the strength of statistical dependence, and because the normalization has an effect on the statistical properties of the estimated autocorrelations.
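
As a sketch of the normalized definition given above (function name illustrative, real-valued wide-sense stationary signals assumed), one can standardize the series once and then average lagged products to obtain lag-dependent Pearson coefficients in $[-1, 1]$:

```python
import numpy as np

def normalized_cross_correlation(x, y, max_lag):
    """rho_xy(tau) = E[(x[t]-mu_x)(y[t+tau]-mu_y)] / (sigma_x * sigma_y),
    estimated from samples for tau = -max_lag..max_lag."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    x = (x - x.mean()) / x.std()   # standardize once (stationarity assumption)
    y = (y - y.mean()) / y.std()
    n = min(len(x), len(y))
    lags = np.arange(-max_lag, max_lag + 1)
    rho = np.empty(len(lags))
    for i, tau in enumerate(lags):
        if tau >= 0:
            rho[i] = np.mean(x[: n - tau] * y[tau:n])
        else:
            rho[i] = np.mean(x[-tau:n] * y[: n + tau])
    return lags, rho   # values lie (approximately) in [-1, 1]

rng = np.random.default_rng(2)
x = rng.normal(size=20_000)
y = 0.8 * x + 0.6 * rng.normal(size=20_000)   # correlated at lag 0
lags, rho = normalized_cross_correlation(x, y, max_lag=3)
print(dict(zip(lags.tolist(), np.round(rho, 2))))   # peak near 0.8 at lag 0
```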

For jointly wide-sense stationary stochastic processes, the cross-correlation function has the following symmetry property:[11]: p.173 $\operatorname{R}_{XY}(\tau) = \overline{\operatorname{R}_{YX}(-\tau)}$.
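
A short derivation of this property from the definition $\operatorname{R}_{XY}(\tau) = \operatorname{E}\left[X_t \overline{Y_{t+\tau}}\right]$ used above (the intermediate steps are standard and are supplied here for clarity):

$$\operatorname{R}_{XY}(\tau) = \operatorname{E}\left[X_t \overline{Y_{t+\tau}}\right] = \overline{\operatorname{E}\left[Y_{t+\tau}\,\overline{X_t}\right]} = \overline{\operatorname{E}\left[Y_s\,\overline{X_{s-\tau}}\right]} = \overline{\operatorname{R}_{YX}(-\tau)}, \qquad \text{with } s = t + \tau.$$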

Cross-correlations are useful for determining the time delay between two signals, e.g., for determining time delays for the propagation of acoustic signals across a microphone array.
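
A sketch of this delay-estimation idea using SciPy's signal.correlate and signal.correlation_lags helpers; the two-microphone setup, sample rate, and delay value are illustrative assumptions, not from the original text.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(4)
fs = 8000                      # sample rate in Hz (illustrative)
true_delay = 120               # delay between the two microphones, in samples

s = rng.normal(size=fs)                                    # 1 s of a noise-like source
mic1 = s + 0.05 * rng.normal(size=fs)                      # first microphone
mic2 = np.concatenate([np.zeros(true_delay), s[:-true_delay]]) \
       + 0.05 * rng.normal(size=fs)                        # delayed copy at mic 2

# Cross-correlate and pick the lag where the correlation is largest.
corr = signal.correlate(mic2, mic1, mode="full")
lags = signal.correlation_lags(len(mic2), len(mic1), mode="full")
estimated_delay = lags[np.argmax(corr)]

print(estimated_delay)                       # ~ 120 samples
print(1000.0 * estimated_delay / fs, "ms")   # ~ 15 ms
```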

For image-processing applications in which the brightness of the image and template can vary due to lighting and exposure conditions, the images can first be normalized; this variant is known as zero-normalized cross-correlation (ZNCC). The normalization is typically done at every step by subtracting the mean and dividing by the standard deviation. That is, the cross-correlation of a template $t(x,y)$ with a subimage $f(x,y)$ is $\frac{1}{n}\sum_{x,y}\frac{1}{\sigma_f \sigma_t}\left(f(x,y) - \mu_f\right)\left(t(x,y) - \mu_t\right)$, where $n$ is the number of pixels in $t(x,y)$ and $f(x,y)$, $\mu_f$ is the average of $f$, and $\sigma_f$ is the standard deviation of $f$.

In functional analysis terms, this can be thought of as the dot product of two normalized vectors.

Thus, if $f$ and $t$ are real matrices, their normalized cross-correlation equals the cosine of the angle between the unit vectors $F$ and $T$, being thus $1$ if and only if $F$ equals $T$ multiplied by a positive scalar.
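
A brief sketch of this interpretation (the arrays and names are illustrative): flattening two real matrices, removing their means, and scaling to unit length reduces the zero-normalized cross-correlation at zero shift to a plain dot product, i.e. the cosine of the angle between the two unit vectors.

```python
import numpy as np

rng = np.random.default_rng(5)
f = rng.normal(size=(8, 8))
t = 2.5 * f + 0.3 * rng.normal(size=(8, 8))   # roughly a positively scaled copy of f

# Flatten, remove the mean, and scale to unit length: these are the unit vectors.
F = (f - f.mean()).ravel()
F /= np.linalg.norm(F)
T = (t - t.mean()).ravel()
T /= np.linalg.norm(T)

# The zero-normalized cross-correlation is just their dot product,
# i.e. the cosine of the angle between F and T.
print(round(float(F @ T), 3))   # close to 1, since t is roughly a positive scalar times f
```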

Normalized correlation is one of the methods used for template matching, a process used for finding instances of a pattern or object within an image.

Normalized cross-correlation (NCC) is similar to ZNCC, with the only difference being that the local mean value of the intensities is not subtracted: $\frac{1}{n}\sum_{x,y}\frac{1}{\sigma_f \sigma_t}\, f(x,y)\, t(x,y)$.
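
A sketch of ZNCC-based template matching following the formula above (the function names and the toy image are illustrative; a production implementation would use an optimized library routine rather than explicit Python loops):

```python
import numpy as np

def zncc(patch, template):
    """Zero-normalized cross-correlation of a patch with a same-sized template."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = patch.std() * template.std() * patch.size
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def match_template(image, template):
    """Slide the template over the image and return the ZNCC score map."""
    H, W = image.shape
    h, w = template.shape
    scores = np.full((H - h + 1, W - w + 1), -np.inf)
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            scores[y, x] = zncc(image[y:y + h, x:x + w], template)
    return scores

rng = np.random.default_rng(6)
image = rng.random((40, 40))
template = image[12:20, 25:33].copy() * 1.7 + 0.2   # brightness/contrast changed
scores = match_template(image, template)
print(np.unravel_index(np.argmax(scores), scores.shape))   # (12, 25)
```

Because ZNCC subtracts the local mean and divides by the local standard deviation, the match is found even though the template's brightness and contrast were altered.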

Caution must be applied when using the cross-correlation function, which assumes Gaussian variance, for nonlinear systems.

In certain circumstances, which depend on the properties of the input, cross-correlation between the input and output of a system with nonlinear dynamics can be completely blind to certain nonlinear effects.[14]

This problem arises because some quadratic moments can equal zero, and this can incorrectly suggest that there is little "correlation" (in the sense of statistical dependence) between two signals, when in fact the two signals are strongly related by nonlinear dynamics.
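
A small illustration of this effect (the signals and numbers are illustrative): for a zero-mean, symmetrically distributed input passed through a purely quadratic nonlinearity, the input–output cross-correlation is essentially zero at every lag, even though the output is completely determined by the input.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

x = rng.normal(size=n)   # zero-mean, symmetric input
y = x**2                 # output of a purely quadratic (nonlinear) system

# Normalized cross-correlation at a few lags: E[x[t] * y[t+tau]] involves the
# third moment of a symmetric distribution, so it is (approximately) zero.
for tau in range(0, 4):
    xs = x[: n - tau] if tau else x
    ys = y[tau:] if tau else y
    rho = np.corrcoef(xs, ys)[0, 1]
    print(tau, round(rho, 4))   # all close to 0.0 despite y being a function of x
```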

Figure: Visual comparison of convolution, cross-correlation, and autocorrelation. For the operations involving the function $f$, and assuming the height of $f$ is 1.0, the value of the result at 5 different points is indicated by the shaded area below each point. The vertical symmetry of $f$ is the reason $f * g$ and $f \star g$ are identical in this example.
Figure: Animation of how cross-correlation is calculated. The left graph shows a green function $G$ that is phase-shifted relative to the function $F$ by a time displacement of $\tau$. The middle graph shows the function $F$ and the phase-shifted $G$ represented together as a Lissajous curve. Integrating $F$ multiplied by the phase-shifted $G$ produces the right graph, the cross-correlation across all values of $\tau$.