
ICANN 2007, Springer

Sparse Least Squares Support Vector Regressors Trained in the Reduced Empirical Feature Space

Abstract. In this paper we discuss sparse least squares support vector regressors (sparse LS SVRs) defined in the reduced empirical feature space, which is a subspace spanned by the mapped training data. Namely, we define an LS SVR in the primal form in the empirical feature space, which reduces training to solving a set of linear equations. The independent components of the empirical feature space are obtained by deleting dependent components during the Cholesky factorization of the kernel matrix. These independent components are associated with support vectors, and by controlling the threshold of the Cholesky factorization we obtain a sparse LS SVR. For linear kernels the number of support vectors is at most the number of input variables, and if we use the input axes as support vectors, the primal and dual forms are equivalent. Through computer experiments we show that we can reduce the number of support vectors without deteriorating the generalization ability.
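To make the idea concrete, the following is a minimal NumPy sketch of training an LS SVR in the primal in a reduced empirical feature space: a thresholded, pivoted incomplete Cholesky factorization of the kernel matrix selects the independent components (the pivot points play the role of support vectors), and the regressor is then obtained by solving one linear system. The pivoted variant of the factorization, the RBF kernel, the function names (pivoted_ichol, fit_sparse_ls_svr), and the hyperparameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def pivoted_ichol(K, tol=1e-6):
    # Pivoted incomplete Cholesky: stop once the largest remaining
    # pivot drops below tol.  The selected pivot indices correspond
    # to the independent components (support vectors); tol plays the
    # role of the factorization threshold described in the abstract.
    n = K.shape[0]
    d = np.diag(K).astype(float).copy()   # residual diagonal
    L = np.zeros((n, n))
    piv = []
    for j in range(n):
        i = int(np.argmax(d))
        if d[i] < tol:
            break
        piv.append(i)
        L[:, j] = (K[:, i] - L[:, :j] @ L[i, :j]) / np.sqrt(d[i])
        d -= L[:, j] ** 2                 # update residual diagonal
    return L[:, :len(piv)], piv

def fit_sparse_ls_svr(X, y, gamma=1.0, C=10.0, tol=1e-6):
    # Train the LS SVR in the primal: rows of L are the coordinates
    # of the training data in the reduced empirical feature space,
    # so the optimality conditions form one (r+1)x(r+1) linear system.
    K = rbf_kernel(X, X, gamma)
    H, piv = pivoted_ichol(K, tol)        # n x r empirical features
    n, r = H.shape
    one = np.ones(n)
    A = np.zeros((r + 1, r + 1))
    A[:r, :r] = H.T @ H + np.eye(r) / C   # regularized normal equations
    A[:r, r] = H.T @ one
    A[r, :r] = one @ H
    A[r, r] = n
    sol = np.linalg.solve(A, np.concatenate([H.T @ y, [one @ y]]))
    w, b = sol[:r], sol[r]
    Ls = H[piv, :]                        # r x r factor over the pivots
    return w, b, piv, Ls

def predict(X_test, X_train, w, b, piv, Ls, gamma=1.0):
    # Map test points into the empirical feature space via the pivot
    # kernels, h(x) = Ls^{-1} k_S(x), then apply the linear model.
    k_s = rbf_kernel(X_test, X_train[np.asarray(piv)], gamma)
    H_test = np.linalg.solve(Ls, k_s.T).T
    return H_test @ w + b
```

A small usage example on noisy sinc data (hypothetical settings): lowering tol admits more pivots, so the threshold directly trades the number of support vectors against approximation quality, matching the sparsity control described above.

```python
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(100)
w, b, piv, Ls = fit_sparse_ls_svr(X, y, gamma=0.5, C=100.0, tol=1e-3)
print(len(piv), "support vectors out of", len(X))
```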
Type: Conference
Year: 2007
Where: ICANN
Authors: Shigeo Abe, Kenta Onishi