Lasso = SVM
02 / 2012

An Equivalence between the Lasso and Support Vector Machines

(Showing a reduction between instances of each)

Abstract:
We investigate the relation between two fundamental tools in machine learning, namely the support vector machine (SVM) for classification and the Lasso technique used in regression. We show that the resulting optimization problems are equivalent, in the following sense: given any instance of an l2-loss soft-margin (or hard-margin) SVM, we construct a Lasso instance having the same optimal solutions, and vice versa.
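To make one direction of the reduction concrete, the following sketch rewrites a Lasso instance as an SVM instance over the unit simplex. It uses the constrained form of the Lasso and generic notation (X, y, w, Z are illustrative symbols, not necessarily those of the paper):

    \min_{\|w\|_1 \le 1} \|Xw - y\|_2^2
      \;=\; \min_{x \in \Delta_{2d}} \big\| [X, -X]\, x - y \big\|_2^2
      \;=\; \min_{x \in \Delta_{2d}} \big\| \big([X, -X] - y \mathbf{1}^\top\big)\, x \big\|_2^2

Here \Delta_{2d} is the unit simplex, the Lasso variable is recovered as w = [I, -I]\,x, and the last step uses \mathbf{1}^\top x = 1 on the simplex. The final problem asks for the point of minimum norm in the convex hull of the columns z_i = \pm X_i - y, which is the polytope-distance formulation of the hard-margin SVM.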

In consequence, many existing optimization algorithms for both SVMs and the Lasso can also be applied to the respective other problem instances. The equivalence also allows many known theoretical insights for the SVM and the Lasso to be translated between the two settings. One such implication gives a simple kernelized version of the Lasso, analogous to the kernels used in the SVM setting. Another consequence is that the sparsity of a Lasso solution equals the number of support vectors of the corresponding SVM instance, and that one can use screening rules to prune the set of support vectors. Furthermore, we can relate sublinear time algorithms for the two problems, and give a new such algorithm variant for the Lasso.
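As a small numerical illustration of this correspondence, the sketch below (Python, with illustrative variable names and a plain Frank-Wolfe solver rather than any method from the paper) builds the SVM instance from a Lasso instance, solves it over the simplex, and reads off a feasible Lasso solution whose objective value coincides with the SVM objective:

    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 30, 10
    X = rng.standard_normal((n, d))
    y = rng.standard_normal(n)

    # Lasso instance (constrained form): min ||X w - y||^2  s.t.  ||w||_1 <= 1.
    # Reduction: the SVM instance has columns z_i = +/- X_i - y, and asks for
    # the minimum-norm point in their convex hull (simplex variable x).
    Z = np.hstack([X, -X]) - y[:, None]          # shape (n, 2d)

    # Frank-Wolfe on f(x) = ||Z x||^2 over the unit simplex.
    x = np.zeros(2 * d)
    x[0] = 1.0
    for t in range(2000):
        grad = 2.0 * Z.T @ (Z @ x)               # gradient of ||Z x||^2
        s = np.zeros(2 * d)
        s[np.argmin(grad)] = 1.0                 # best simplex vertex
        gamma = 2.0 / (t + 2.0)                  # standard step size
        x = (1 - gamma) * x + gamma * s

    # Recover the Lasso variable from the simplex variable.
    w = x[:d] - x[d:]
    print("||w||_1         :", np.abs(w).sum())           # <= 1 by construction
    print("SVM objective   :", np.sum((Z @ x) ** 2))
    print("Lasso objective :", np.sum((X @ w - y) ** 2))   # matches the SVM value

Each Frank-Wolfe step adds at most one column of Z to the support of x, so the recovered w stays sparse as well, mirroring the correspondence between Lasso sparsity and the number of support vectors described above.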

Talk Slides - Connections between the Lasso and Support Vector Machines
Invited plenary presentation, 8 July 2013, at
ROKS 2013 - International Workshop on Advances in Regularization, Optimization, Kernel Methods and Support Vector Machines: Theory and Applications, Leuven, Belgium

Book Chapter: Jaggi, M. (2014). An Equivalence between the Lasso and Support Vector Machines. In Regularization, Optimization, Kernels, and Support Vector Machines (pp. 1–26). Chapman and Hall/CRC.