Backfitting algorithm

In statistics, the backfitting algorithm is a simple iterative procedure used to fit a generalized additive model.

It was introduced in 1985 by Leo Breiman and Jerome Friedman along with generalized additive models.

In most cases, the backfitting algorithm is equivalent to the Gauss–Seidel method, an algorithm used for solving a certain linear system of equations.

Additive models are a class of non-parametric regression models of the form

$$Y_i = \alpha + \sum_{j=1}^{p} f_j(X_{ij}) + \epsilon_i,$$

where each $X_1, X_2, \ldots, X_p$ is a variable in our $p$-dimensional predictor $X$, and $Y$ is our outcome variable. $\epsilon$ represents our inherent error, which is assumed to have mean zero. The $f_j$ represent unspecified smooth functions of a single $X_j$. Given the flexibility in the $f_j$, we typically do not have a unique solution: $\alpha$ is left unidentifiable, as one can add any constant to any of the $f_j$ and subtract that value from $\alpha$. It is common to rectify this by constraining

$$\sum_{i=1}^{N} f_j(X_{ij}) = 0 \quad \text{for all } j,$$

leaving

$$\alpha = \frac{1}{N} \sum_{i=1}^{N} Y_i$$

necessarily.

The backfitting algorithm is then:

  Initialize $\hat{\alpha} = \frac{1}{N} \sum_{i=1}^{N} y_i$ and $\hat{f}_j \equiv 0$ for all $j$
  Do until the $\hat{f}_j$ converge:
    For each predictor $j$:
      (a) $\hat{f}_j \leftarrow \text{Smooth}\left[ \left\{ y_i - \hat{\alpha} - \sum_{k \neq j} \hat{f}_k(x_{ik}) \right\}_{i=1}^{N} \right]$ (backfitting step)
      (b) $\hat{f}_j \leftarrow \hat{f}_j - \frac{1}{N} \sum_{i=1}^{N} \hat{f}_j(x_{ij})$ (mean centering of the estimated function)

where $\text{Smooth}$ is our smoothing operator. This is typically chosen to be a cubic spline smoother but can be any other appropriate fitting operation, such as local polynomial regression, kernel smoothing methods, or more complex operators such as surface smoothers for second- and higher-order interactions.

In theory, step (b) in the algorithm is not needed, as the function estimates are constrained to sum to zero.

However, due to numerical issues this might become a problem in practice.[1]
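To make the procedure concrete, here is a minimal sketch of the algorithm in Python. The running-mean smoother, the function names, and the toy data are assumptions made for this sketch; in practice one would use a cubic spline smoother as noted above.

```python
import numpy as np

def moving_average_smoother(x, r, window=11):
    """Crude running-mean smoother of r against x (a stand-in for a spline)."""
    order = np.argsort(x)
    kernel = np.ones(window) / window
    smoothed = np.convolve(r[order], kernel, mode="same")
    out = np.empty_like(smoothed)
    out[order] = smoothed          # undo the sort
    return out

def backfit(X, y, n_iter=20):
    """Basic backfitting: cycle over predictors, smoothing partial residuals."""
    n, p = X.shape
    alpha = y.mean()               # alpha-hat = mean of y
    f = np.zeros((p, n))           # f_j-hat initialized to zero
    for _ in range(n_iter):
        for j in range(p):
            # (a) smooth the partial residuals against the j-th predictor
            partial = y - alpha - f.sum(axis=0) + f[j]
            f[j] = moving_average_smoother(X[:, j], partial)
            # (b) mean-center the estimate for identifiability
            f[j] -= f[j].mean()
    return alpha, f

# Toy usage: y = sin(x1) + 0.5*x2^2 + noise (hypothetical data)
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.1, 200)
alpha, f = backfit(X, y)
```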

If we consider the problem of minimizing the expected squared error

$$\min E\left[ Y - \left( \alpha + \sum_{j=1}^{p} f_j(X_j) \right) \right]^2,$$

there exists a unique solution by the theory of projections, given by

$$f_i(X_i) = E\left[ Y - \left( \alpha + \sum_{j \neq i} f_j(X_j) \right) \,\Big|\, X_i \right]$$

for $i = 1, 2, \ldots, p$. This gives the matrix interpretation

$$\begin{pmatrix} I & P_1 & \cdots & P_1 \\ P_2 & I & \cdots & P_2 \\ \vdots & & \ddots & \vdots \\ P_p & \cdots & P_p & I \end{pmatrix} \begin{pmatrix} f_1(X_1) \\ f_2(X_2) \\ \vdots \\ f_p(X_p) \end{pmatrix} = \begin{pmatrix} P_1 Y \\ P_2 Y \\ \vdots \\ P_p Y \end{pmatrix},$$

where $P_i(\cdot) = E(\cdot \mid X_i)$. In this context we can imagine a smoother matrix, $S_i$, which approximates our $P_i$ and gives an estimate, $S_i y$, of $E(y \mid X_i)$:

$$\begin{pmatrix} I & S_1 & \cdots & S_1 \\ S_2 & I & \cdots & S_2 \\ \vdots & & \ddots & \vdots \\ S_p & \cdots & S_p & I \end{pmatrix} \begin{pmatrix} f_1 \\ f_2 \\ \vdots \\ f_p \end{pmatrix} = \begin{pmatrix} S_1 y \\ S_2 y \\ \vdots \\ S_p y \end{pmatrix},$$

or, in abbreviated form,

$$\hat{S} f = Q y.$$

An exact solution of this system is infeasible to calculate for large $np$, so the iterative technique of backfitting is used.
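For small problems the block system can be assembled and solved directly, which makes the infeasibility for large $np$ concrete: the assembled matrix has $(np)^2$ entries. A sketch, assuming the smoother matrices are given as dense NumPy arrays (the function name and setup are illustrative, not from the original):

```python
import numpy as np

def solve_block_system(smoothers, y):
    """Assemble the (np x np) system S-hat f = Q y from n x n smoother
    matrices S_1, ..., S_p and solve it directly. Feasible only for small n*p."""
    p = len(smoothers)
    n = y.shape[0]
    S_hat = np.zeros((n * p, n * p))
    Qy = np.zeros(n * p)
    for i, S_i in enumerate(smoothers):
        for j in range(p):
            # Row i of blocks: identity on the diagonal, S_i off the diagonal.
            S_hat[i*n:(i+1)*n, j*n:(j+1)*n] = np.eye(n) if i == j else S_i
        Qy[i*n:(i+1)*n] = S_i @ y   # right-hand side block (Q y)_i = S_i y
    # Least squares, since S-hat can be singular (solutions need not be unique).
    f = np.linalg.lstsq(S_hat, Qy, rcond=None)[0]
    return [f[i*n:(i+1)*n] for i in range(p)]   # estimates f_1, ..., f_p
```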

We take initial guesses $\hat{f}_i^{(0)}$ and update each $\hat{f}_i$ in turn to be the smoothed fit for the residuals of all the others:

$$\hat{f}_i \leftarrow S_i \left[ y - \sum_{j \neq i} \hat{f}_j \right],$$

where each $\hat{f}_j$ on the right-hand side is its most recent estimate. Looking at the abbreviated form, it is easy to see the backfitting algorithm as equivalent to the Gauss–Seidel method for linear smoothing operators $S_i$. Following [2], we can formulate the backfitting algorithm explicitly for the two-dimensional case, where the system reduces to

$$f_1 = S_1(Y - f_2), \qquad f_2 = S_2(Y - f_1).$$

If we denote by $\hat{f}_1^{(i)}$ the estimate of $f_1$ in the $i$th updating step, the backfitting steps are

$$\hat{f}_1^{(i)} = S_1\left[ Y - \hat{f}_2^{(i-1)} \right], \qquad \hat{f}_2^{(i)} = S_2\left[ Y - \hat{f}_1^{(i)} \right].$$

By induction we get

$$\hat{f}_1^{(i)} = Y - \sum_{\alpha=0}^{i-1} (S_1 S_2)^{\alpha} (I - S_1) Y - (S_1 S_2)^{i-1} S_1 \hat{f}_2^{(0)}$$

and

$$\hat{f}_2^{(i)} = S_2 \left[ I - \sum_{\alpha=0}^{i-1} (S_1 S_2)^{\alpha} (I - S_1) \right] Y - S_2 (S_1 S_2)^{i-1} S_1 \hat{f}_2^{(0)}.$$

If we set $\hat{f}_2^{(0)} = 0$ and assume $\|S_1 S_2\| < 1$, so that the geometric series converges, the limits are obtained by letting $i \to \infty$ and by directly plugging out from $\hat{f}_2^{(i)} = S_2\left[ Y - \hat{f}_1^{(i)} \right]$:

$$\hat{f}_1^{(\infty)} = Y - \sum_{\alpha=0}^{\infty} (S_1 S_2)^{\alpha} (I - S_1) Y,$$

$$\hat{f}_2^{(\infty)} = S_2 \left[ Y - \hat{f}_1^{(\infty)} \right].$$

We can check this is a solution to the problem, i.e. that $\hat{f}_1^{(i)}$ and $\hat{f}_2^{(i)}$ converge to $f_1$ and $f_2$ correspondingly, by plugging these expressions into the original equations.
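These limit formulas can also be checked numerically. A sketch, using randomly generated symmetric matrices scaled so that $\|S_1 S_2\| < 1$ (an assumption made purely so the series converges; real smoother matrices would come from the chosen smoothers):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
Y = rng.normal(size=n)
I = np.eye(n)

def random_smoother():
    # Symmetric PSD matrix scaled to spectral norm 0.5, so ||S1 S2|| < 1.
    A = rng.normal(size=(n, n))
    A = A @ A.T
    return 0.5 * A / np.linalg.norm(A, 2)

S1, S2 = random_smoother(), random_smoother()

# Iterate the backfitting steps with f2^(0) = 0.
f1, f2 = np.zeros(n), np.zeros(n)
for _ in range(200):
    f1 = S1 @ (Y - f2)
    f2 = S2 @ (Y - f1)

# Closed form: sum_a (S1 S2)^a (I - S1) Y = (I - S1 S2)^{-1} (I - S1) Y,
# the geometric series evaluated exactly.
f1_inf = Y - np.linalg.solve(I - S1 @ S2, (I - S1) @ Y)
f2_inf = S2 @ (Y - f1_inf)

print(np.allclose(f1, f1_inf), np.allclose(f2, f2_inf))  # expect: True True
```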

The choice of when to stop the algorithm is arbitrary and it is hard to know a priori how long reaching a specific convergence threshold will take.

Also, the final model depends on the order in which the predictor variables $X_j$ are fit.

As well, the solution found by the backfitting procedure is non-unique: if $b$ is a vector such that $\hat{S} b = 0$, then whenever $\hat{f}$ is a solution, so is $\hat{f} + \alpha b$ for any $\alpha \in \mathbb{R}$.

A modification of the backfitting algorithm involving projections onto the eigenspace of S can remedy this problem.

We can modify the backfitting algorithm to make it easier to provide a unique solution. Let $\mathcal{V}_1(S_i)$ be the space spanned by all the eigenvectors of $S_i$ that correspond to eigenvalue 1. Then any $b$ satisfying $\hat{S} b = 0$ has $b_i \in \mathcal{V}_1(S_i)$ for all $i = 1, \ldots, p$ and $b_1 + \cdots + b_p = 0$. Now if we take $A$ to be a matrix that projects orthogonally onto $\mathcal{V}_1(S_1) + \cdots + \mathcal{V}_1(S_p)$, we get the following modified backfitting algorithm:

  Initialize $\hat{\alpha} = \frac{1}{N} \sum_{i=1}^{N} y_i$, $\hat{f}_j \equiv 0$ for all $j$, and $\hat{f}_+ = \hat{\alpha} + \hat{f}_1 + \cdots + \hat{f}_p$
  Do until the $\hat{f}_j$ converge:
    Regress $y - \hat{f}_+$ onto the space $\mathcal{V}_1(S_1) + \cdots + \mathcal{V}_1(S_p)$, setting $a = A(y - \hat{f}_+)$
    For each predictor $j$:
      Apply the backfitting update to $y - a$ using the smoothing operator $(I - A) S_j$, yielding a new estimate $\hat{f}_j$, and update $\hat{f}_+$
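A sketch of the key new ingredient, the orthogonal projector $A$ onto $\mathcal{V}_1(S_1) + \cdots + \mathcal{V}_1(S_p)$, assuming dense smoother matrices and an illustrative numerical tolerance (the function names are hypothetical):

```python
import numpy as np

def eigenspace_one(S, tol=1e-8):
    """Basis for V1(S): the span of eigenvectors of S with eigenvalue 1."""
    eigvals, eigvecs = np.linalg.eig(S)
    keep = np.abs(eigvals - 1.0) < tol
    return np.real(eigvecs[:, keep])

def projector_onto_sum(smoothers, tol=1e-8):
    """Orthogonal projector A onto V1(S_1) + ... + V1(S_p)."""
    basis = np.hstack([eigenspace_one(S, tol) for S in smoothers])
    # Orthonormalize via SVD to handle overlap between the subspaces.
    U, s, _ = np.linalg.svd(basis, full_matrices=False)
    U_r = U[:, s > tol]
    return U_r @ U_r.T
```

For example, a smoother that preserves constant functions has the constant vector in its $\mathcal{V}_1$, so $A$ absorbs the mean component; smoothing with $(I - A) S_j$ then removes exactly the directions along which solutions were non-unique.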