KDD 2008 (ACM)

A sequential dual method for large scale multi-class linear svms

Efficient training of direct multi-class formulations of linear Support Vector Machines is very useful in applications such as text classification with a huge number of examples as well as features. This paper presents a fast dual method for this training. The main idea is to sequentially traverse through the training set and optimize the dual variables associated with one example at a time. The speed of training is enhanced further by shrinking and cooling heuristics. Experiments indicate that our method is much faster than state-of-the-art solvers such as bundle, cutting plane and exponentiated gradient methods.

Categories and Subject Descriptors: I.5.2 [Pattern Recognition]: Design Methodology--Classifier design and evaluation

General Terms: Algorithms, Performance, Experimentation
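The sequential idea can be illustrated with a small sketch. The Python code below sweeps over the training set and adjusts the dual variables of one example at a time on the Crammer-Singer dual of a linear multi-class SVM. It uses a simplified two-variable update per example (true class vs. most violating class) rather than the paper's exact per-example sub-problem solver, and it omits the shrinking and cooling heuristics; the function name sequential_dual_sketch and all parameter defaults are illustrative assumptions, not taken from the paper.

import numpy as np

def sequential_dual_sketch(X, y, num_classes, C=1.0, outer_iters=10, seed=0):
    # Illustrative coordinate ascent on the Crammer-Singer dual of a linear
    # multi-class SVM: sweep over the training set and update the dual
    # variables of one example at a time.  Simplified two-variable update
    # per example; not the paper's exact sub-problem solver, and no
    # shrinking or cooling heuristics.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = np.zeros((num_classes, d))        # W[m] = sum_i alpha[i, m] * X[i]
    alpha = np.zeros((n, num_classes))    # per-example duals, sum_m alpha[i, m] = 0
    sq_norms = np.einsum('ij,ij->i', X, X)

    for _ in range(outer_iters):
        for i in rng.permutation(n):      # sequential pass over the examples
            if sq_norms[i] == 0.0:
                continue
            xi, yi = X[i], int(y[i])
            scores = W @ xi
            competing = scores.copy()
            competing[yi] = -np.inf
            m = int(np.argmax(competing))  # most violating competing class
            # Exact minimizer of the dual along the chosen two-variable direction ...
            step = (1.0 - (scores[yi] - scores[m])) / (2.0 * sq_norms[i])
            # ... projected onto the box constraints alpha[i, yi] <= C, alpha[i, m] <= 0.
            step = float(np.clip(step, alpha[i, m], C - alpha[i, yi]))
            if step != 0.0:
                alpha[i, yi] += step
                alpha[i, m] -= step
                W[yi] += step * xi
                W[m] -= step * xi
    return W                              # predict with argmax_m of W @ x

Prediction then reduces to argmax_m w_m^T x, and each per-example update touches only the two weight vectors involved, which is what keeps a pass over the data cheap for sparse, high-dimensional text features.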
Type: Conference
Year: 2008
Where: KDD
Authors: S. Sathiya Keerthi, S. Sundararajan, Kai-Wei Chang, Cho-Jui Hsieh, Chih-Jen Lin