
ICIP 2009, IEEE

Efficient reduction of support vectors in kernel-based methods

Kernel-based methods such as the support vector machine (SVM) achieve high classification performance. However, classification becomes time-consuming as the number of vectors supporting the classifier grows. In this paper, we propose a method that reduces the computational cost of classification by kernel-based methods while retaining high performance. Using linear algebra on the kernel Gram matrix of the support vectors (SVs), the method efficiently prunes, at low computational cost, redundant SVs that are unnecessary for constructing the classifier. The pruning is based on evaluating the performance of the SVM classifier formed by the reduced set of SVs. Classification experiments with SVM on various datasets demonstrate the feasibility of the evaluation criterion and the effectiveness of the proposed method.
Takumi Kobayashi, Nobuyuki Otsu
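
The abstract describes pruning SVs that are (nearly) linearly dependent in the kernel feature space and re-expressing the decision function with the remaining SVs via the kernel Gram matrix. The snippet below is a minimal sketch of that general idea for an RBF-kernel SVC from scikit-learn; the greedy residual test, the tolerance tol, and the helper names prune_support_vectors and reduced_decision_function are illustrative assumptions, not the authors' exact procedure.

import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel


def prune_support_vectors(svm, gamma, tol=1e-3):
    """Greedily drop SVs whose feature-space image is nearly spanned by the
    remaining SVs, then refit the dual coefficients by least squares on the
    kernel Gram matrix so the decision function is approximately preserved.
    (Sketch only; not the method from the paper.)"""
    sv = svm.support_vectors_                 # (n_sv, d)
    coef = svm.dual_coef_.ravel()             # signed dual coefficients, (n_sv,)
    K = rbf_kernel(sv, sv, gamma=gamma)       # Gram matrix of the SVs

    keep = list(range(len(sv)))
    for i in range(len(sv) - 1, -1, -1):
        rest = [j for j in keep if j != i]
        if not rest:
            break
        # Squared residual of projecting phi(sv_i) onto span{phi(sv_j): j in rest}
        K_rr = K[np.ix_(rest, rest)]
        K_ri = K[rest, i]
        beta, *_ = np.linalg.lstsq(K_rr, K_ri, rcond=None)
        residual = K[i, i] - K_ri @ beta
        if residual < tol:                    # SV i is (nearly) redundant: prune it
            keep = rest

    # Re-express the decision function with the kept SVs:
    # minimize ||sum_i coef_i phi(sv_i) - sum_j c_j phi(sv_j)||^2 over c
    K_kk = K[np.ix_(keep, keep)]
    new_coef, *_ = np.linalg.lstsq(K_kk, K[keep, :] @ coef, rcond=None)
    return sv[keep], new_coef


def reduced_decision_function(X, kept_sv, new_coef, intercept, gamma):
    """Evaluate the pruned classifier: f(x) = sum_j c_j k(x, sv_j) + b."""
    return rbf_kernel(X, kept_sv, gamma=gamma) @ new_coef + intercept

For instance, after fitting clf = SVC(kernel='rbf', gamma=0.5).fit(X, y), one could call prune_support_vectors(clf, gamma=0.5) and compare the pruned decision values against clf.decision_function(X) to check how much accuracy is retained with fewer SVs.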
Type Conference
Year 2009
Where ICIP
Authors Takumi Kobayashi, Nobuyuki Otsu