Some authors use a slightly stronger assumption of additive noise, Y = m(X) + ε, where the random variable ε is the noise term, with mean 0. Without the assumption that m belongs to a specific parametric family of functions, it is impossible to get an unbiased estimate for m; however, most estimators are consistent under suitable conditions.
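A short simulation may make the additive-noise model concrete. The regression function m below is an illustrative choice (the estimator would not know it); the only structural assumptions are that the noise is added to m(X) and has mean zero.

```python
import numpy as np

# Hypothetical example: draw data from the additive-noise model
# Y = m(X) + eps, with E[eps] = 0.  The choice m(x) = sin(2*pi*x)
# and the noise level 0.3 are illustrative, not part of the model.
rng = np.random.default_rng(0)

def m(x):
    return np.sin(2 * np.pi * x)

n = 500
X = rng.uniform(0.0, 1.0, size=n)
eps = rng.normal(0.0, 0.3, size=n)  # zero-mean noise term
Y = m(X) + eps
```

A nonparametric estimator is then fit to the pairs (X, Y) without assuming any parametric form for m.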
In Gaussian process regression, also known as kriging, a Gaussian prior is assumed for the regression curve. The errors are assumed to have a multivariate normal distribution, and the regression curve is estimated by its posterior mode.
The Gaussian prior may depend on unknown hyperparameters, which are usually estimated via empirical Bayes.
Smoothing splines have an interpretation as the posterior mode of a Gaussian process regression.
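Because both the prior and the error distribution are Gaussian, the posterior over the curve is itself Gaussian, so its mode coincides with its mean and has a closed form. The following sketch computes that posterior mean under a squared-exponential covariance; the length scale, signal variance, and noise variance are illustrative fixed values, not hyperparameters estimated via empirical Bayes.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=0.2, signal_var=1.0):
    # Squared-exponential (RBF) covariance between two sets of 1-D inputs.
    d2 = (a[:, None] - b[None, :]) ** 2
    return signal_var * np.exp(-0.5 * d2 / length_scale ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise_var=0.01):
    # Posterior mean = K_* (K + sigma^2 I)^{-1} y; for a Gaussian
    # posterior this is also the posterior mode.
    K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    K_star = rbf_kernel(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

rng = np.random.default_rng(1)
x_train = np.linspace(0.0, 1.0, 30)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0.0, 0.1, 30)
x_test = np.array([0.25, 0.75])
mean = gp_posterior_mean(x_train, y_train, x_test)
```

At x = 0.25 and x = 0.75 the underlying curve takes the values 1 and -1, and the posterior mean should land near them given 30 lightly noisy observations.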
Kernel regression estimates the continuous dependent variable from a limited set of data points by convolving the data points' locations with a kernel function—approximately speaking, the kernel function specifies how to "blur" the influence of the data points so that their values can be used to predict the value for nearby locations.
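The "blurring" description above corresponds to estimators such as the Nadaraya-Watson estimator, in which each prediction is a weighted average of the observed responses, with weights given by a kernel centred on the query point. A minimal sketch, using a Gaussian kernel and an illustrative bandwidth h:

```python
import numpy as np

def nadaraya_watson(x_query, x_data, y_data, h=0.1):
    # Gaussian kernel weight for every (query, data) pair; each data
    # point's influence is "blurred" over a neighbourhood of width ~h.
    w = np.exp(-0.5 * ((x_query[:, None] - x_data[None, :]) / h) ** 2)
    # Prediction = kernel-weighted average of the observed responses.
    return (w @ y_data) / w.sum(axis=1)

rng = np.random.default_rng(2)
x = rng.uniform(0.0, 1.0, 200)
y = np.cos(2 * np.pi * x) + rng.normal(0.0, 0.1, 200)
pred = nadaraya_watson(np.array([0.0, 0.5]), x, y)
```

The bandwidth h plays the role of the smoothing parameter: a small h lets each prediction depend only on very nearby points, while a large h averages over most of the data.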
Although the original Classification And Regression Tree (CART) formulation applied only to predicting univariate data, the framework can be used to predict multivariate data, including time series.
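The core step in CART-style regression is a greedy split search: choose the threshold on a feature that minimises the summed squared error of the two resulting leaf means. The sketch below performs only the first (depth-1) split on a single feature; a real CART implementation would recurse on each leaf.

```python
import numpy as np

def best_split(x, y):
    # Try every midpoint between consecutive sorted x values and keep
    # the threshold with the smallest total within-leaf squared error.
    order = np.argsort(x)
    x, y = x[order], y[order]
    best_sse, best_thr = np.inf, None
    for i in range(1, len(x)):
        left, right = y[:i], y[i:]
        sse = ((left - left.mean()) ** 2).sum() \
            + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best_thr = sse, (x[i - 1] + x[i]) / 2
    return best_thr

# Illustrative data with an obvious jump between x = 3 and x = 10.
x = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])
y = np.array([0.1, 0.0, 0.2, 5.1, 5.0, 4.9])
split = best_split(x, y)  # threshold between the two clusters
```

Predicting the leaf mean on each side of the chosen threshold gives the familiar piecewise-constant regression-tree fit.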