

Comparison of sparse least squares support vector regressors trained in primal and dual

In our previous work, we developed sparse least squares support vector regressors (sparse LS SVRs) trained in the primal form in the reduced empirical feature space. In this paper we develop sparse LS SVRs trained in the dual form in the empirical feature space. Namely, the support vectors that span the reduced empirical feature space are first selected by Cholesky factorization, and then the LS SVR is trained in the dual form by solving a set of linear equations. We compare the computational cost of the LS SVRs in the primal and dual forms and show that, if the dimension of the reduced empirical feature space is almost equal to the number of training data, the dual form is faster. The primal form, however, is computationally more stable: for a large margin parameter the coefficient matrix of the dual form becomes nearly singular. We verify these results by computer experiments on several benchmark data sets.
Shigeo Abe
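
The dual-form training described in the abstract amounts to selecting, via pivoted Cholesky factorization of the kernel matrix, a subset of training samples whose kernel columns span the reduced empirical feature space, and then solving a small linear system for the dual variables. The sketch below illustrates this pipeline in Python; it is a minimal illustration, not the paper's exact formulation: the RBF kernel, the tolerance `tol`, the regularization parameter `C`, and the restriction of the dual system to the selected support vectors are all assumptions made for this example.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Z (assumed kernel)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def select_support_vectors(K, tol=1e-6):
    """Greedy pivoted Cholesky factorization of the kernel matrix.
    Pivots whose residual diagonal exceeds tol are kept; the selected
    indices approximately span the reduced empirical feature space."""
    n = K.shape[0]
    diag = K.diagonal().copy()
    L = np.zeros((n, n))
    perm = []
    for j in range(n):
        p = int(np.argmax(diag))
        if diag[p] < tol:          # remaining columns are (nearly) dependent
            break
        perm.append(p)
        L[:, j] = (K[:, p] - L[:, :j] @ L[p, :j]) / np.sqrt(diag[p])
        diag -= L[:, j] ** 2
        diag[p] = -np.inf          # never pick the same pivot twice
    return np.array(perm)

def train_dual_ls_svr(X, y, gamma=1.0, C=10.0, tol=1e-6):
    """Sparse LS SVR trained in the dual form (simplified illustration:
    the dual linear system is restricted to the selected support vectors)."""
    K = rbf_kernel(X, X, gamma)
    sv = select_support_vectors(K, tol)     # indices of support vectors
    Ksv = K[np.ix_(sv, sv)]                 # m x m kernel block
    m = len(sv)
    # Dual system: [[0, 1^T], [1, Ksv + I/C]] [b; alpha] = [0; y_sv]
    A = np.zeros((m + 1, m + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = Ksv + np.eye(m) / C
    rhs = np.concatenate(([0.0], y[sv]))
    sol = np.linalg.solve(A, rhs)
    return sv, sol[1:], sol[0]              # support vectors, alpha, bias

def predict(X_train, sv, alpha, b, X_test, gamma=1.0):
    return rbf_kernel(X_test, X_train[sv], gamma) @ alpha + b

# Toy usage: fit a noisy sine curve (illustrative data only)
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
sv, alpha, b = train_dual_ls_svr(X, y, gamma=0.5, C=10.0)
y_hat = predict(X, sv, alpha, b, X[:5], gamma=0.5)
```

Note how the coefficient matrix contains the block Ksv + I/C: as the margin parameter C grows, the regularizing term I/C vanishes and the block approaches the bare kernel matrix, which can be nearly singular. This is consistent with the abstract's observation that the dual form loses numerical stability for large margin parameters.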
Type: Conference
Year: 2008
Where: ESANN
Authors: Shigeo Abe