Gradient LASSO for feature selection

LASSO (Least Absolute Shrinkage and Selection Operator) is a useful tool for achieving shrinkage and variable selection simultaneously. Because LASSO uses the L1 penalty, its optimization typically relies on quadratic programming (QP) or general non-linear programming, which is known to be computationally intensive. In this paper, we propose a gradient descent algorithm for LASSO. Although the final result is slightly less accurate, the proposed algorithm is computationally simpler than QP or non-linear programming and can therefore be applied to large-scale problems. We provide the convergence rate of the algorithm and illustrate it on simulated models as well as real data sets.
Yongdai Kim, Jinseog Kim
Type: Conference
Year: 2004
Where: ICML
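The abstract does not spell out the authors' gradient LASSO algorithm. As a minimal illustrative sketch of a gradient-based alternative to QP for the L1-penalized least-squares problem, the snippet below uses a standard proximal-gradient (ISTA-style, soft-thresholding) update; the function names, step-size choice, and synthetic data are assumptions for illustration, not the paper's implementation.

```python
# Illustrative sketch (not the paper's exact algorithm):
# proximal-gradient solver for  min_beta 0.5*||y - X beta||^2 + lam*||beta||_1
import numpy as np

def soft_threshold(v, t):
    """Soft-thresholding operator, the prox of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_proximal_gradient(X, y, lam, n_iter=500):
    """Gradient-based LASSO solver; step size 1/L with L the largest
    eigenvalue of X^T X (Lipschitz constant of the smooth loss)."""
    n, p = X.shape
    beta = np.zeros(p)
    L = np.linalg.eigvalsh(X.T @ X).max()
    step = 1.0 / L
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)                    # gradient of the smooth part
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

# Hypothetical usage on synthetic data with a sparse true coefficient vector.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
true_beta = np.zeros(20)
true_beta[:3] = [2.0, -1.5, 1.0]
y = X @ true_beta + 0.1 * rng.standard_normal(100)
print(np.round(lasso_proximal_gradient(X, y, lam=5.0), 2))
```

Each iteration needs only a gradient evaluation and an elementwise soft-threshold, which is why gradient-style methods scale to problems where a generic QP solver would be impractical.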