Variance inflation factor

In statistics, the variance inflation factor (VIF) is the ratio of the variance of a parameter estimate when fitting a full model that includes other parameters to the variance of that estimate when the model is fit with the parameter alone.[1] The VIF provides an index that measures how much the variance (the square of the estimate's standard deviation) of an estimated regression coefficient is increased because of collinearity.

Cuthbert Daniel claims to have invented the concept behind the variance inflation factor, but did not come up with the name.[2]

Consider the following linear model with k independent variables:

$$ Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \cdots + \beta_k X_k + \varepsilon. $$

The standard error of the estimate of $\beta_j$ is the square root of the $(j+1)$-th diagonal element of $s^2 (X'X)^{-1}$, where $s$ is the root mean squared error (RMSE); note that $\mathrm{RMSE}^2$ is a consistent estimator of $\sigma^2$, the true variance of the error term. $X$ is the regression design matrix, a matrix such that $X_{i,\,j+1}$ is the value of the $j$th independent variable for the $i$th case or observation, and such that $X_{i,1}$, the predictor associated with the intercept term, equals 1 for all $i$.

It turns out that the square of this standard error, the estimated variance of the estimate of $\beta_j$, can be equivalently expressed as:[3][4]

$$ \widehat{\operatorname{var}}(\hat{\beta}_j) = \frac{s^2}{(n-1)\,\widehat{\operatorname{var}}(X_j)} \cdot \frac{1}{1 - R_j^2}, $$

where $R_j^2$ is the multiple $R^2$ for the regression of $X_j$ on the other covariates (a regression that does not involve the response variable $Y$) and $\widehat{\operatorname{var}}(X_j) = \frac{1}{n-1} \sum_{i=1}^{n} (X_{ij} - \bar{X}_j)^2$ is the sample variance of $X_j$.

This identity separates the influences of several distinct factors on the variance of the coefficient estimate: greater scatter in the data around the regression surface (larger $s^2$) increases the variance proportionately, while a larger sample size $n$ and greater variability in the covariate itself (larger $\widehat{\operatorname{var}}(X_j)$) decrease it. The remaining term, $1 / (1 - R_j^2)$, is the VIF.

It reflects all other factors that influence the uncertainty in the coefficient estimates.

The VIF equals 1 when the vector $X_j$ is orthogonal to each column of the design matrix for the regression of $X_j$ on the other covariates. By contrast, the VIF is greater than 1 when the vector $X_j$ is not orthogonal to all columns of that design matrix.
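The identity above can be checked numerically. Below is a minimal NumPy sketch on synthetic data (the variable names and data-generating process are illustrative, not from the source); it computes the estimated variance of $\hat\beta_1$ directly from the diagonal of $s^2(X'X)^{-1}$ and again via the identity. The degrees-of-freedom-corrected residual variance is used for $s^2$; since the same $s^2$ enters both expressions, the equality holds under either normalization.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic, deliberately collinear predictors.
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 0.7 * x1 + 0.7 * x2 + rng.normal(scale=0.5, size=n)
y = 1.0 + 2.0 * x1 - x2 + 0.5 * x3 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2, x3])   # intercept in column 1
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s2 = resid @ resid / (n - X.shape[1])           # estimate of the error variance

# Direct route: (j+1)-th diagonal element of s^2 (X'X)^{-1}, here j = 1.
var_direct = s2 * np.linalg.inv(X.T @ X)[1, 1]

# Identity route: s^2 / ((n-1) var(X_1)) * 1 / (1 - R_1^2).
others = np.column_stack([np.ones(n), x2, x3])
gamma, *_ = np.linalg.lstsq(others, x1, rcond=None)
rss = np.sum((x1 - others @ gamma) ** 2)
tss = np.sum((x1 - x1.mean()) ** 2)             # (n-1) * sample variance of X_1
R2 = 1.0 - rss / tss
var_identity = s2 / tss / (1.0 - R2)

print(np.isclose(var_direct, var_identity))     # True
```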

Finally, note that the VIF is invariant to the scaling of the variables (that is, we could scale each variable Xj by a constant cj without changing the VIF).
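A quick sketch of that invariance, under the same kind of synthetic setup: rescaling a predictor multiplies both the total and residual sums of squares of the auxiliary regression by the same constant, so their ratio, which is the VIF, is unchanged.

```python
import numpy as np

def vif(X, j):
    """VIF of column j of a design matrix X (first column = intercept),
    computed as TSS / RSS of the regression of X_j on the other columns."""
    others = np.delete(X, j, axis=1)
    gamma, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
    rss = np.sum((X[:, j] - others @ gamma) ** 2)
    tss = np.sum((X[:, j] - X[:, j].mean()) ** 2)
    return tss / rss                              # equals 1 / (1 - R_j^2)

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(scale=0.6, size=n)

X = np.column_stack([np.ones(n), x1, x2])
X_scaled = np.column_stack([np.ones(n), 100.0 * x1, x2])  # x1 in new units

print(vif(X, 1), vif(X_scaled, 1))                # identical VIFs
```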

By using the Schur complement, the element in the first row and first column of $(X'X)^{-1}$ can be written explicitly. Taking, without loss of generality, the covariate of interest to occupy the first column and writing $X = [\,x_1 \;\; X_{-1}\,]$, where $X_{-1}$ collects the remaining columns (including the intercept), the block-inverse formula gives

$$ \left[(X'X)^{-1}\right]_{1,1} = \left( x_1' x_1 - x_1' X_{-1} (X_{-1}' X_{-1})^{-1} X_{-1}' x_1 \right)^{-1}. $$

The quantity in parentheses is the residual sum of squares from regressing $x_1$ on the other covariates, which equals $(n-1)\,\widehat{\operatorname{var}}(X_1)\,(1 - R_1^2)$; multiplying by $s^2$ therefore recovers the identity above.
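This block-inverse fact is likewise easy to verify numerically. In the sketch below (synthetic data, illustrative names) the covariate of interest occupies the first column, matching the derivation.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 150
z1 = rng.normal(size=n)
z2 = rng.normal(size=n)
x1 = z1 + 0.5 * z2                                 # covariate of interest

# Design matrix with x1 first, then the intercept and the other covariate.
X = np.column_stack([x1, np.ones(n), z2])

# (1,1) element of (X'X)^{-1} ...
top_left = np.linalg.inv(X.T @ X)[0, 0]

# ... equals the reciprocal of the RSS from regressing x1 on the rest.
rest = X[:, 1:]
gamma, *_ = np.linalg.lstsq(rest, x1, rcond=None)
rss = np.sum((x1 - rest @ gamma) ** 2)

print(np.isclose(top_left, 1.0 / rss))             # True
```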

We can calculate k different VIFs (one for each $X_i$) in three steps.

First, we run an ordinary least squares regression that has $X_i$ as a function of all the other explanatory variables in the first equation. For $i = 1$, for example, this regression would be

$$ X_1 = \alpha_0 + \alpha_2 X_2 + \alpha_3 X_3 + \cdots + \alpha_k X_k + e, $$

where $\alpha_0$ is a constant and $e$ is the error term.

Then, we calculate the VIF for $\hat\beta_i$ with the following formula:

$$ \mathrm{VIF}_i = \frac{1}{1 - R_i^2}, $$

where $R_i^2$ is the coefficient of determination of the regression equation in step one, with $X_i$ on the left-hand side and all the other predictor variables on the right-hand side.

Third, we analyze the magnitude of multicollinearity by considering the size of $\mathrm{VIF}_i$; a common rule of thumb is that a VIF above 10 indicates high multicollinearity.
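The three steps translate directly into code. Below is a minimal NumPy sketch on synthetic data (names are illustrative); for a library routine, statsmodels provides variance_inflation_factor in statsmodels.stats.outliers_influence, which computes the same per-column quantity.

```python
import numpy as np

def vifs(X):
    """Step-by-step VIFs for a design matrix X whose first column is the
    intercept; returns one VIF per non-intercept column."""
    out = {}
    for i in range(1, X.shape[1]):
        # Step one: regress X_i on all the other explanatory variables.
        others = np.delete(X, i, axis=1)
        gamma, *_ = np.linalg.lstsq(others, X[:, i], rcond=None)
        rss = np.sum((X[:, i] - others @ gamma) ** 2)
        tss = np.sum((X[:, i] - X[:, i].mean()) ** 2)
        # Step two: VIF_i = 1 / (1 - R_i^2), with R_i^2 = 1 - RSS/TSS.
        out[i] = 1.0 / (1.0 - (1.0 - rss / tss))
    return out

rng = np.random.default_rng(3)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 0.9 * x1 + 0.3 * x2 + rng.normal(scale=0.4, size=n)  # nearly collinear
X = np.column_stack([np.ones(n), x1, x2, x3])

# Step three: inspect the magnitudes; x1 and x3 should show inflated VIFs.
print(vifs(X))
```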

However, any VIF greater than 1 means that the variance of the corresponding slope estimates is inflated to some degree; there is no cutoff below which inflation is absent.

As a result, when two or more non-orthogonal variables (i.e., variables whose correlation is not 0) are included in a multiple regression, they alter one another's estimated slopes, the standard errors of those slopes, and the corresponding p-values, because there is shared variance between the predictors that cannot be uniquely attributed to any one of them.[7]

Some software instead calculates the tolerance, which is simply the reciprocal of the VIF.

The square root of the variance inflation factor indicates how much larger the standard error of a coefficient is, compared with what it would be if that variable were uncorrelated with the other predictor variables in the model. For example, a VIF of 4 means the standard error is twice as large as it would be under zero correlation.