In time series analysis, the moving-average model (MA model), also known as the moving-average process, is a common approach for modeling univariate time series.[1][2] The moving-average model specifies that the output variable depends linearly on the current and various past values of a stochastic (imperfectly predictable) error term.
Together with the autoregressive (AR) model, the moving-average model is a special case and key component of the more general ARMA and ARIMA models of time series,[3] which have a more complicated stochastic structure.
The moving-average model should not be confused with the moving average, a distinct concept despite some similarities.[1] The notation MA(q) refers to the moving-average model of order q:

X_t = μ + ε_t + θ_1 ε_{t−1} + ⋯ + θ_q ε_{t−q},

where μ is the mean of the series, θ_1, …, θ_q are the parameters of the model, and ε_t, ε_{t−1}, …, ε_{t−q} are white-noise error terms. The value of q is called the order of the MA model. This can be equivalently written in terms of the backshift operator B as[4]

X_t = μ + (1 + θ_1 B + ⋯ + θ_q B^q) ε_t.

Thus, a moving-average model is conceptually a linear regression of the current value of the series against current and previous (observed) white-noise error terms or random shocks.
The random shocks at each point are assumed to be mutually independent and to come from the same distribution, typically a normal distribution, with location at zero and constant scale.
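The definition above can be sketched directly in code. The following is a minimal numpy illustration (the function name and parameters are choices for this example, not part of any standard API): it draws i.i.d. normal shocks and forms each observation as the mean plus a weighted sum of the current and q previous shocks.

```python
import numpy as np

def simulate_ma(theta, mu, n, rng):
    """Simulate n observations of an MA(q) process
    X_t = mu + eps_t + theta_1*eps_{t-1} + ... + theta_q*eps_{t-q}."""
    q = len(theta)
    eps = rng.standard_normal(n + q)          # i.i.d. N(0, 1) shocks; the extra q supply the lagged terms
    coeffs = np.concatenate(([1.0], theta))   # (1, theta_1, ..., theta_q)
    # Each observation is a weighted sum of the current shock and the q previous ones.
    return mu + np.convolve(eps, coeffs, mode="valid")

rng = np.random.default_rng(0)
x = simulate_ma(theta=np.array([0.6, 0.3]), mu=10.0, n=100_000, rng=rng)
# With unit shock variance, the theoretical variance is 1 + sum(theta_i**2) = 1.45.
```

The sample mean and variance of a long simulated series should be close to μ and 1 + Σθ_i², which is a quick sanity check on the simulation.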
The moving-average model is essentially a finite impulse response filter applied to white noise, with some additional interpretation placed on it. The role of the random shocks in the MA model differs from their role in the autoregressive (AR) model in two ways.
First, they are propagated to future values of the time series directly: for example, ε_{t−1} appears directly on the right side of the equation for X_t. In contrast, in an AR model ε_{t−1} does not appear on the right side of the X_t equation, but it does appear on the right side of the X_{t−1} equation, and X_{t−1} appears on the right side of the X_t equation, giving only an indirect effect of ε_{t−1} on X_t. Second, in the MA model a shock affects X values only for the current period and q periods into the future; in contrast, in the AR model a shock affects X values infinitely far into the future.
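The second difference can be made concrete by comparing impulse responses, i.e. the effect of a single unit shock at t = 0 on later values. The sketch below (illustrative helper names, assuming unit shocks and an AR(1) comparison) shows that the MA response is exactly zero beyond lag q, while the AR response only decays geometrically.

```python
import numpy as np

def ma_impulse_response(theta, n):
    """Effect of a unit shock at t = 0 on X_t under an MA(q) model:
    the response is (1, theta_1, ..., theta_q) and exactly zero afterwards."""
    h = np.zeros(n)
    psi = np.concatenate(([1.0], theta))
    h[:len(psi)] = psi
    return h

def ar1_impulse_response(phi, n):
    """Effect of a unit shock at t = 0 on X_t under an AR(1) model
    X_t = phi*X_{t-1} + eps_t: the response phi**t decays but never reaches zero."""
    return phi ** np.arange(n)

ma_h = ma_impulse_response(np.array([0.6, 0.3]), n=10)   # zero from lag 3 onward
ar_h = ar1_impulse_response(0.6, n=10)                   # nonzero at every lag
```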
Fitting an MA model is more complicated than fitting an AR model because the lagged error terms are not observable. This means that iterative non-linear fitting procedures need to be used in place of linear least squares.
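One common approach is to reconstruct the unobserved shocks recursively and minimize a conditional sum of squares. The sketch below does this for an MA(1) model and estimates θ by a simple grid search, a stand-in for the iterative optimizers that statistical packages actually use (the function name, seed, and grid are choices for this example).

```python
import numpy as np

def css(x, theta):
    """Conditional sum of squares for an MA(1) model x_t = eps_t + theta*eps_{t-1}.
    The shocks are not observed, so they are reconstructed recursively
    (conditioning on eps_{-1} = 0), which makes the objective non-linear in theta."""
    eps = np.zeros_like(x)
    eps[0] = x[0]
    for t in range(1, len(x)):
        eps[t] = x[t] - theta * eps[t - 1]
    return np.sum(eps ** 2)

# Simulate a zero-mean MA(1) series with true theta = 0.5, then estimate theta
# by minimizing the CSS over a grid of candidate values.
rng = np.random.default_rng(1)
e = rng.standard_normal(5_001)
x = e[1:] + 0.5 * e[:-1]
grid = np.linspace(-0.9, 0.9, 181)
theta_hat = grid[np.argmin([css(x, th) for th in grid])]
```

With a series of this length the estimate lands close to the true value of 0.5; note that the objective is a non-linear function of θ even though the model itself is linear in the shocks.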
Moving average models are linear combinations of past white noise terms, while autoregressive models are linear combinations of past time series values.[6] ARMA models are more complicated than pure AR and MA models, as they combine both autoregressive and moving average components.[5] The autocorrelation function (ACF) of an MA(q) process is zero at lag q + 1 and greater.
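This cutoff property follows directly from the model: with ψ = (1, θ_1, …, θ_q) and unit shock variance, the autocovariance at lag k is γ_k = Σ_i ψ_i ψ_{i+k}, which is exactly zero once k > q. A small sketch (illustrative helper name) computes this theoretical ACF for an MA(2) model:

```python
import numpy as np

def ma_theoretical_acf(theta, max_lag):
    """Theoretical ACF of an MA(q) process with unit shock variance.
    Autocovariance: gamma_k = sum_i psi_i*psi_{i+k}, psi = (1, theta_1, ..., theta_q);
    it is exactly zero for every lag k > q."""
    psi = np.concatenate(([1.0], theta))
    m = len(psi)
    gamma = np.array([psi[:m - k] @ psi[k:] if k < m else 0.0
                      for k in range(max_lag + 1)])
    return gamma / gamma[0]   # normalize by the variance gamma_0

acf = ma_theoretical_acf(np.array([0.6, 0.3]), max_lag=6)
# For theta = (0.6, 0.3): acf = (1, 0.78/1.45, 0.30/1.45, 0, 0, 0, 0)
```

In practice this is why the sample ACF is used as a diagnostic for choosing q: estimated autocorrelations beyond lag q should be statistically indistinguishable from zero.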
Autoregressive integrated moving average (ARIMA) models are an alternative to segmented regression that can also be used for fitting a moving-average model.[7] This article incorporates public domain material from the National Institute of Standards and Technology.