Elastic net regularization

In statistics and, in particular, in the fitting of linear or logistic regression models, the elastic net is a regularized regression method that linearly combines the L1 and L2 penalties of the lasso and ridge methods.

In practice, elastic net regularization is often more accurate than either method alone with regard to reconstruction.[1]

The elastic net method overcomes the limitations of the LASSO (least absolute shrinkage and selection operator) method, which uses a penalty function based on

$\|\beta\|_1 = \sum_{j=1}^{p} |\beta_j|.$

Use of this penalty function has several limitations. For example, in the "large p, small n" case (high-dimensional data with few examples), the LASSO selects at most n variables before it saturates. Also, if there is a group of highly correlated variables, the LASSO tends to select one variable from the group and ignore the others.

To overcome these limitations, the elastic net adds a quadratic part to the penalty ($\|\beta\|^2$), which when used alone is ridge regression (known also as Tikhonov regularization).

The estimates from the elastic net method are defined by

$\hat{\beta} \equiv \underset{\beta}{\operatorname{argmin}} \left( \|y - X\beta\|^2 + \lambda_2 \|\beta\|^2 + \lambda_1 \|\beta\|_1 \right).$

The quadratic penalty term makes the loss function strongly convex, and it therefore has a unique minimum.
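As a concrete illustration, this minimization can be carried out with scikit-learn's ElasticNet estimator. Note that scikit-learn uses an equivalent but differently scaled parameterization (a single strength alpha and a mixing weight l1_ratio in place of $\lambda_1, \lambda_2$, plus a $1/(2n)$ factor on the data-fit term); the synthetic data and penalty values below are invented for the sketch.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

# Synthetic data: y depends only on the first two of five features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=100)

# scikit-learn's combined penalty is
#   alpha * l1_ratio * ||beta||_1 + 0.5 * alpha * (1 - l1_ratio) * ||beta||^2,
# which plays the role of lambda_1 ||beta||_1 + lambda_2 ||beta||^2 above
# (up to the 1/(2n) scaling of the squared-error term).
model = ElasticNet(alpha=0.1, l1_ratio=0.5)
model.fit(X, y)
print(model.coef_)  # large weights on the first two features, rest near zero
```

The fitted coefficients are shrunk toward zero relative to ordinary least squares, and the irrelevant features receive coefficients at or near exactly zero, reflecting the sparsity induced by the L1 part of the penalty.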

The elastic net method includes the LASSO and ridge regression: in other words, each of them is a special case where $\lambda_1 = \lambda,\ \lambda_2 = 0$ (LASSO) or $\lambda_1 = 0,\ \lambda_2 = \lambda$ (ridge regression).
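The LASSO special case can be checked directly in scikit-learn, where the mixing parameter l1_ratio plays the role of the split between $\lambda_1$ and $\lambda_2$; the data and alpha value below are invented for the sketch.

```python
import numpy as np
from sklearn.linear_model import ElasticNet, Lasso

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, 0.0, -1.0]) + 0.05 * rng.normal(size=50)

# l1_ratio=1.0 leaves only the L1 term (lambda_2 = 0): the elastic net
# reduces to the LASSO, and the two estimators give the same coefficients.
# (Conversely, l1_ratio=0.0 would leave only the quadratic ridge penalty.)
enet_l1 = ElasticNet(alpha=0.1, l1_ratio=1.0).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)
print(np.allclose(enet_l1.coef_, lasso.coef_))  # True
```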

Meanwhile, the naive version of the elastic net method finds an estimator in a two-stage procedure: first, for each fixed $\lambda_2$, it finds the ridge regression coefficients, and then does a LASSO-type shrinkage.

This kind of estimation incurs a double amount of shrinkage, which leads to increased bias and poor predictions.
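One way to sketch the naive estimator and its correction is the augmented-data trick of Zou and Hastie: appending $\sqrt{\lambda_2}\,I_p$ rows to $X$ and zeros to $y$ converts the ridge part of the penalty into extra least-squares rows, so an ordinary lasso on the augmented data solves the naive elastic net objective; the corrected elastic net then rescales the result by $(1 + \lambda_2)$ to undo the double shrinkage. The data, lam2, and alpha values below are invented for the sketch, and scikit-learn's Lasso scaling of the penalty is noted in the comments.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n, p = 80, 10
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.5] + [0.0] * (p - 2))
y = X @ beta_true + 0.1 * rng.normal(size=n)

lam2 = 1.0  # ridge weight lambda_2 (illustrative value)

# Augmented data: ||y_aug - X_aug b||^2 = ||y - X b||^2 + lam2 * ||b||^2,
# so a plain lasso on (X_aug, y_aug) minimizes the naive elastic net
# objective ||y - X b||^2 + lam2 ||b||^2 + lam1 ||b||_1.
X_aug = np.vstack([X, np.sqrt(lam2) * np.eye(p)])
y_aug = np.concatenate([y, np.zeros(p)])

# scikit-learn's Lasso minimizes (1/(2m))||y_aug - X_aug b||^2 + alpha ||b||_1
# with m = n + p rows, so alpha here corresponds to lam1 / (2 * (n + p)).
naive = Lasso(alpha=0.05, fit_intercept=False).fit(X_aug, y_aug).coef_

# The naive estimate is shrunk twice (once by the ridge rows, once by the
# lasso); the corrected elastic net rescales it by (1 + lam2).
corrected = (1 + lam2) * naive
print(naive[:2], corrected[:2])
```

The rescaling step is what distinguishes the elastic net estimator from its naive two-stage counterpart: it restores the magnitude lost to the second round of shrinkage while keeping the sparsity pattern the lasso stage selected.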

It has been shown that the elastic net can be reduced to the linear support vector machine.[8] The authors showed that for every instance of the elastic net, an artificial binary classification problem can be constructed such that the hyper-plane solution of a linear support vector machine (SVM) is identical to the solution $\beta$ (after re-scaling).

The reduction immediately enables the use of highly optimized SVM solvers for elastic net problems.

Some authors have referred to the transformation as Support Vector Elastic Net (SVEN), and provided the following MATLAB pseudo-code: