Wiener–Khinchin theorem

In applied mathematics, the Wiener–Khinchin theorem or Wiener–Khintchine theorem, also known as the Wiener–Khinchin–Einstein theorem or the Khinchin–Kolmogorov theorem, states that the autocorrelation function of a wide-sense-stationary random process has a spectral decomposition given by the power spectral density of that process.[1][2][3][4][5][6][7]

Norbert Wiener proved this theorem for the case of a deterministic function in 1930;[8] Aleksandr Khinchin later formulated an analogous result for stationary stochastic processes and published that probabilistic analogue in 1934.[9][10]

Albert Einstein explained, without proofs, the idea in a brief two-page memo in 1914.[11]

For continuous time, the Wiener–Khinchin theorem says that if \(x(t)\) is a wide-sense-stationary random process whose autocorrelation function (sometimes called autocovariance), defined in terms of statistical expected value as

\[ r_{xx}(\tau) = \operatorname{E}\big[x(t)\, x^*(t-\tau)\big], \]

where the asterisk denotes complex conjugate, exists and is finite at every lag \(\tau\), then there exists a monotone function \(F(f)\) in the frequency domain \(-\infty < f < \infty\), or equivalently a non-negative Radon measure \(\mu\) on the frequency domain, such that

\[ r_{xx}(\tau) = \int_{-\infty}^{\infty} e^{2\pi i \tau f}\, dF(f) = \int_{-\infty}^{\infty} e^{2\pi i \tau f}\, \mu(df), \]

where the integral is a Riemann–Stieltjes integral.[1][13] This is a kind of spectral decomposition of the autocorrelation function. \(F\) is called the power spectral distribution function and is a statistical distribution function.

The Fourier transform of \(x(t)\) does not exist in general, because sample functions of stochastic processes are usually neither square-integrable nor absolutely integrable. Nor is \(r_{xx}(\tau)\) assumed to be absolutely integrable, so it need not have a Fourier transform either.

But if \(F(f)\) is absolutely continuous (e.g. if the process is purely indeterministic), then \(F\) is differentiable almost everywhere and we can write \(F(f) = \int_{-\infty}^{f} S(f')\, df'\). In this case, one determines \(S(f)\), the power spectral density of \(x(t)\), by taking the averaged derivative of \(F\). Because the left and right derivatives of \(F\) exist everywhere, we can put

\[ S(f) = \lim_{\varepsilon \to 0^{+}} \frac{1}{2} \left( \frac{F(f+\varepsilon) - F(f)}{\varepsilon} + \frac{F(f) - F(f-\varepsilon)}{\varepsilon} \right) \]

everywhere[14] (obtaining that \(F\) is the integral of its averaged derivative[15]), and the theorem simplifies to

\[ r_{xx}(\tau) = \int_{-\infty}^{\infty} S(f)\, e^{2\pi i \tau f}\, df. \]

If \(r_{xx}\) and \(S\) are "sufficiently nice" such that the Fourier inversion theorem is valid, the Wiener–Khinchin theorem takes the simple form of saying that \(r_{xx}\) and \(S\) are a Fourier-transform pair, and

\[ S(f) = \int_{-\infty}^{\infty} r_{xx}(\tau)\, e^{-2\pi i f \tau}\, d\tau. \]
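As an added concrete illustration (not part of the classical statement), an exponentially decaying autocorrelation and a Lorentzian spectral density form such a Fourier pair:

```latex
r_{xx}(\tau) = e^{-a|\tau|} \; (a > 0)
\quad\Longleftrightarrow\quad
S(f) = \int_{-\infty}^{\infty} e^{-a|\tau|}\, e^{-2\pi i f \tau}\, d\tau
     = \frac{2a}{a^{2} + 4\pi^{2} f^{2}}.
```

Here \(S(f)\) is non-negative and integrates to \(r_{xx}(0) = 1\), as a power spectral density of a unit-variance process must.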

For the discrete-time case, the power spectral density of the function with discrete values \(x[n]\) is

\[ S(f) = \sum_{k=-\infty}^{\infty} r_{xx}[k]\, e^{-2\pi i k f}, \]

where \(i\) is used to denote the imaginary unit (in engineering, sometimes the letter \(j\) is used instead) and

\[ r_{xx}[k] = \operatorname{E}\big[x[n]\, x^*[n-k]\big] \]

is the discrete autocorrelation function of \(x[n]\), defined in its deterministic or stochastic formulation.

Provided \(r_{xx}\) is absolutely summable, i.e.

\[ \sum_{k=-\infty}^{\infty} \big|r_{xx}[k]\big| < \infty, \]

the result of the theorem then can be written as the inverse relation

\[ r_{xx}[k] = \int_{-1/2}^{1/2} S(f)\, e^{2\pi i k f}\, df. \]

Being a discrete-time sequence, the spectral density is periodic in the frequency domain; for this reason, the domain of \(S\) is usually restricted to \([-1/2, 1/2)\) (note the interval is open from one side).
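The discrete-time pair can be checked numerically. The following sketch is an added illustration with an assumed AR(1)-style autocorrelation \(r[k] = \rho^{|k|}\) (absolutely summable for \(|\rho| < 1\)); the truncation lag and grid size are illustrative choices:

```python
import numpy as np

# Assumed example autocorrelation r[k] = rho^|k|, truncated at lag K
rho, K = 0.6, 50
k = np.arange(-K, K + 1)
r = rho ** np.abs(k)

# Forward direction: S(f) = sum_k r[k] e^{-2 pi i k f} on one period of f
f = np.linspace(-0.5, 0.5, 2001, endpoint=False)
S = (r[None, :] * np.exp(-2j * np.pi * np.outer(f, k))).sum(axis=1)

# A power spectral density must be real and non-negative
assert np.max(np.abs(S.imag)) < 1e-9 and S.real.min() > 0

# Inverse direction: r[k] = integral over one period of S(f) e^{2 pi i k f} df,
# approximated by a Riemann sum (effectively exact here, since S is periodic)
df = f[1] - f[0]
r_back = np.array([(S * np.exp(2j * np.pi * kk * f)).sum() * df
                   for kk in range(5)])
print(np.round(r_back.real, 4))    # recovers rho**k for k = 0..4
```

The Riemann sum over a full period recovers the truncated autocorrelation essentially to machine precision, because the complex exponentials are orthogonal on the uniform grid.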

The theorem is useful for analyzing linear time-invariant systems (LTI systems) when the inputs and outputs are not square-integrable, so their Fourier transforms do not exist.

A corollary is that the Fourier transform of the autocorrelation function of the output of an LTI system is equal to the product of the Fourier transform of the autocorrelation function of the system's input and the squared magnitude of the Fourier transform of the system's impulse response.

[16] This works even when the Fourier transforms of the input and output signals do not exist because these signals are not square-integrable, so the system inputs and outputs cannot be directly related by the Fourier transform of the impulse response.

Since the Fourier transform of the autocorrelation function of a signal is the power spectrum of the signal, this corollary is equivalent to saying that the power spectrum of the output is equal to the power spectrum of the input times the energy transfer function.

This corollary is used in the parametric method for power spectrum estimation.
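A minimal numerical sketch of this corollary (the filter, segment length, and averaging count below are illustrative assumptions, not from the text): unit-variance white noise has a flat power spectrum, so the averaged periodogram of the filtered output should track \(|H(f)|^2\), the squared magnitude of the filter's frequency response.

```python
import numpy as np

# Unit-variance white noise through a hypothetical short FIR filter h;
# the corollary predicts S_yy(f) = |H(f)|^2 * S_xx(f), with S_xx(f) = 1.
rng = np.random.default_rng(0)
h = np.array([0.5, 1.0, 0.5])           # hypothetical impulse response
N, trials = 256, 2000                   # segment length, averaging count

psd_y = np.zeros(N)
for _ in range(trials):
    x = rng.standard_normal(N + len(h) - 1)
    y = np.convolve(x, h, mode="valid")         # length-N steady-state output
    psd_y += np.abs(np.fft.fft(y)) ** 2 / N     # periodogram of this segment
psd_y /= trials                                 # averaged output periodogram

H = np.fft.fft(h, N)                    # frequency response on the DFT grid
predicted = np.abs(H) ** 2              # |H(f)|^2 * S_xx(f) with S_xx = 1

print(np.max(np.abs(psd_y - predicted)))   # small: estimate tracks |H|^2
```

Averaging many independent periodograms is what makes the estimate track the prediction; a single periodogram has standard deviation comparable to its mean at every frequency.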

In many textbooks and in much of the technical literature, it is tacitly assumed that Fourier inversion of the autocorrelation function and the power spectral density is valid, and the Wiener–Khinchin theorem is stated, very simply, as if it said that the Fourier transform of the autocorrelation function was equal to the power spectral density, ignoring all questions of convergence[17] (similar to Einstein's paper[11]).

But the theorem (as stated here) was applied by Norbert Wiener and Aleksandr Khinchin to the sample functions (signals) of wide-sense-stationary random processes, signals whose Fourier transforms do not exist.

Wiener's contribution was to make sense of the spectral decomposition of the autocorrelation function of a sample function of a wide-sense-stationary random process even when the integrals for the Fourier transform and Fourier inversion do not make sense.

Further complicating the issue is that the discrete Fourier transform always exists for digital, finite-length sequences, meaning that the theorem can be blindly applied to calculate autocorrelations of numerical sequences.

The relation of such discrete, sampled data to the underlying mathematical model is often misleading, and related errors can show up as a divergence when the sequence length is modified.
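The pitfall can be made concrete with a short sketch (an added illustration; the sequence and lengths are arbitrary): applying the DFT route naively yields the circular autocorrelation, while zero-padding to at least \(2N - 1\) samples recovers the ordinary linear autocorrelation.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(32)
N = len(x)

# Direct linear autocorrelation r[k] = sum_n x[n] x[n+k], for k = 0..N-1
r_linear = np.correlate(x, x, mode="full")[N - 1:]

# DFT route without padding: *circular* autocorrelation (wrap-around terms)
r_circ = np.fft.ifft(np.abs(np.fft.fft(x)) ** 2).real

# DFT route with zero-padding to 2N: matches the linear autocorrelation
r_pad = np.fft.ifft(np.abs(np.fft.fft(x, 2 * N)) ** 2).real[:N]

print(np.max(np.abs(r_circ - r_linear)))  # large: wrap-around error
print(np.max(np.abs(r_pad - r_linear)))   # ~ machine precision
```

The unpadded route computes a correlation on the circle of length \(N\), so samples from the end of the sequence leak into every nonzero lag; zero-padding separates the positive and negative lags and removes the wrap-around.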

Some authors refer to \(r_{xx}(\tau)\) as the autocovariance function; they then normalize it, by dividing by \(r_{xx}(0)\), to obtain what they refer to as the autocorrelation function.