In statistics, an additive model (AM) is a nonparametric regression method.
It was suggested by Jerome H. Friedman and Werner Stuetzle (1981)[1] and is an essential part of the ACE algorithm.
The AM uses a one-dimensional smoother to build a restricted class of nonparametric regression models.
Because of this, it is less affected by the curse of dimensionality than a p-dimensional smoother.
Furthermore, the AM is more flexible than a standard linear model while remaining more interpretable than a general regression surface, at the cost of some approximation error.
As with many other machine-learning methods, the AM faces problems of model selection, overfitting, and multicollinearity.
Given a data set $\{ y_i,\, x_{i1}, \ldots, x_{ip} \}_{i=1}^{n}$ of n statistical units, where $x_{i1}, \ldots, x_{ip}$ represent predictors and $y_i$ is the outcome, the additive model takes the form

$$\mathbb{E}[y_i \mid x_{i1}, \ldots, x_{ip}] = \beta_0 + \sum_{j=1}^{p} f_j(x_{ij})$$

or

$$Y = \beta_0 + \sum_{j=1}^{p} f_j(X_j) + \varepsilon,$$

where $\mathbb{E}[\varepsilon] = 0$, $\operatorname{Var}(\varepsilon) = \sigma^2$ and $\mathbb{E}[f_j(X_j)] = 0$. The functions $f_j$ are unknown smooth functions fit from the data.
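As a concrete illustration (the intercept and component functions here are invented for the example, not taken from any source), a two-predictor additive model for the conditional mean might combine a sine term in one predictor with a quadratic term in the other. The key structural property is that the regression surface decomposes into one-dimensional pieces:

```python
import numpy as np

# Hypothetical additive mean function with beta0 = 1, f1(x) = sin(x),
# f2(x) = x**2 (component functions chosen purely for illustration).
def additive_mean(x1, x2):
    return 1.0 + np.sin(x1) + x2 ** 2

# Because the terms are additive, changing x2 only shifts the curve
# in x1 up or down; it never changes the curve's shape. This absence
# of interactions is what lets one-dimensional smoothers fit each f_j.
print(additive_mean(0.0, 2.0))  # 1 + 0 + 4 = 5.0
```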
Fitting the AM (i.e. the functions $f_j$) can be done using the backfitting algorithm proposed by Andreas Buja, Trevor Hastie and Robert Tibshirani (1989).
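A minimal sketch of backfitting, using a crude nearest-neighbour running mean in place of the spline or kernel smoothers normally used (the smoother, window size, and iteration count are illustrative choices, not part of the cited algorithm's specification): each pass cycles through the predictors, smooths the partial residuals against one predictor at a time, and re-centres the estimate for identifiability.

```python
import numpy as np

def running_mean_smoother(x, r, window=25):
    """Estimate E[r | x] by averaging r over the `window` nearest points
    in x-order -- a stand-in for a proper one-dimensional smoother."""
    order = np.argsort(x)
    r_sorted = r[order]
    pad = window // 2
    padded = np.pad(r_sorted, pad, mode="edge")
    kernel = np.ones(window) / window
    smoothed_sorted = np.convolve(padded, kernel, mode="valid")[: len(x)]
    fitted = np.empty_like(r)
    fitted[order] = smoothed_sorted  # undo the sort
    return fitted

def backfit(X, y, n_iter=20, window=25):
    """Backfitting for y = b0 + sum_j f_j(x_j) + eps.
    Returns the intercept b0 and an array f where f[j] holds the fitted
    component f_j evaluated at the observed x_{ij}."""
    n, p = X.shape
    b0 = y.mean()                     # intercept absorbs the overall level
    f = np.zeros((p, n))
    for _ in range(n_iter):
        for j in range(p):
            # partial residuals: remove every component except f_j
            partial = y - b0 - f.sum(axis=0) + f[j]
            f[j] = running_mean_smoother(X[:, j], partial, window)
            f[j] -= f[j].mean()       # centre each f_j (E[f_j] = 0)
    return b0, f
```

Swapping the running mean for a smoothing spline or local linear smoother recovers the usual practical variants without changing the loop structure.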