ECAI 2004, Springer

A Generalized Quadratic Loss for Support Vector Machines

The standard SVM formulation for binary classification is based on the hinge loss, where errors are treated as uncorrelated. As a consequence, local information in the feature space that could be exploited to improve the prediction model is disregarded. In this paper we address this problem by defining a generalized quadratic loss in which the co-occurrence of errors is weighted according to a kernel similarity measure in the feature space. In particular, the proposed approach weights pairs of errors according to the distribution of the related patterns in the feature space. The generalized quadratic loss also incorporates target information, so as to penalize errors on pairs of patterns that are similar and belong to the same class. We show that, when the co-occurrence error matrix is invertible, the resulting dual problem can be expressed as a hard-margin SVM in a different feature space. We compare our approach against a standard SVM on several binary classification tasks. Experimental results o...
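As a rough illustration of the idea (a sketch only, not the paper's exact formulation), the usual sum of independent slack penalties can be replaced by a quadratic form ξᵀSξ, where a hypothetical coupling matrix S weights pairs of slacks by kernel similarity and label agreement, so that joint errors on similar, same-class patterns are penalized more heavily:

```python
import numpy as np

def generalized_quadratic_loss(xi, K, y):
    """Illustrative co-occurrence-weighted quadratic loss.

    xi : nonnegative slack variables, shape (n,)
    K  : kernel (similarity) matrix, shape (n, n)
    y  : labels in {-1, +1}, shape (n,)

    One plausible coupling matrix (hypothetical, for illustration):
    S_ij = y_i * y_j * K(x_i, x_j), so errors on similar patterns of
    the same class reinforce each other in the penalty xi^T S xi.
    """
    S = K * np.outer(y, y)
    return float(xi @ S @ xi)

# When S is the identity (errors uncoupled), the loss reduces to the
# plain sum of squared slacks, recovering an L2-loss-style penalty.
```

Note that when the coupling matrix is invertible, the abstract's observation applies: the dual can be rewritten as a hard-margin SVM in a feature space modified by S⁻¹.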
Filippo Portera, Alessandro Sperduti
Added 01 Jul 2010
Updated 01 Jul 2010
Type Conference
Year 2004
Where ECAI
Authors Filippo Portera, Alessandro Sperduti