[3] Self-organising kernel adaptive filters, which use iterative convex LMS error minimisation, address some of the statistical and practical issues that arise in non-linear models but not in the linear case.
[4] Regularisation is a particularly important feature of non-linear models, and it is also often used in linear adaptive filters to reduce statistical uncertainty.
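To make the combination of iterative LMS updates and regularisation concrete, the following is a minimal sketch of a kernel LMS (KLMS) filter with a leakage term acting as regularisation. The function names, step size, leakage factor, and kernel width are illustrative choices, not taken from the cited works.

```python
import numpy as np

def gaussian_kernel(x, c, sigma=0.3):
    # Gaussian (RBF) kernel between input x and a stored centre c
    return np.exp(-np.sum((x - c) ** 2) / (2 * sigma ** 2))

def klms(X, d, eta=0.2, lam=0.01, sigma=0.3):
    """Kernel LMS with a leakage (regularisation) term.

    X : (N, m) array of inputs; d : (N,) desired outputs.
    eta is the step size; lam is an illustrative leakage factor
    that shrinks stored coefficients at every step.
    """
    centres, alphas, errors = [], [], []
    for x, target in zip(X, d):
        # prediction from the current kernel expansion
        y = sum(a * gaussian_kernel(x, c, sigma)
                for a, c in zip(alphas, centres))
        e = target - y
        # leaky update: shrink existing coefficients, then add a new centre
        alphas = [(1 - eta * lam) * a for a in alphas]
        centres.append(x)
        alphas.append(eta * e)
        errors.append(e)
    return centres, alphas, errors

# toy non-linear system identification: learn d = sin(3x) + noise
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
d = np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(200)
_, _, errs = klms(X, d)
```

Each incoming sample both updates the prediction error and grows the kernel expansion, which is the "self-organising" aspect; the leakage term keeps the stored coefficients bounded, a simple stand-in for the regularisation discussed above.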
Iterative gradient descent, as typically used in adaptive filters, has also gained popularity in offline batch-mode support vector machine learning because of its computational efficiency when processing large data sets.
For both time-series and batch processing, such methods are reported [5] to handle over 100,000 training examples using as little as 10 kB of RAM.
Data sets of this size are challenging for the original formulations of support vector machines and other kernel methods, which relied on constrained optimisation using linear or quadratic programming techniques.
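The stochastic sub-gradient approach that replaces quadratic programming can be sketched as follows. This is a Pegasos-style linear SVM trained by gradient descent; the parameter values and toy data are illustrative assumptions, and each update touches only one example, so memory use stays constant regardless of data set size.

```python
import numpy as np

def pegasos_svm(X, y, lam=0.01, epochs=5, seed=0):
    """Pegasos-style stochastic sub-gradient descent for a linear SVM.

    Minimises lam/2 * ||w||^2 plus the average hinge loss.
    Each update costs O(features), independent of the number
    of training examples, unlike QP-based formulations.
    """
    rng = np.random.default_rng(seed)
    n, m = X.shape
    w, t = np.zeros(m), 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)       # decaying step size
            margin = y[i] * (w @ X[i])
            w *= (1 - eta * lam)        # regularisation shrinkage
            if margin < 1:              # sub-gradient of the hinge loss
                w += eta * y[i] * X[i]
    return w

# linearly separable toy data: two Gaussian clusters, labels +1 / -1
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(2, 1, (100, 2)), rng.normal(-2, 1, (100, 2))])
y = np.hstack([np.ones(100), -np.ones(100)])
w = pegasos_svm(X, y)
acc = np.mean(np.sign(X @ w) == y)
```

Because the loop streams through examples one at a time, the same routine can be run over data far larger than memory, which is the property the reported 100,000-example workloads exploit.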