Learning the Kernel Matrix with Semi-Definite Programming

Kernel-based learning algorithms work by embedding the data into a Euclidean space and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information is contained in the so-called kernel matrix, a symmetric, positive semidefinite matrix that encodes the relative positions of all points. Specifying this matrix amounts to specifying the geometry of the embedding space and inducing a notion of similarity in the input space, which are classical model selection problems in machine learning. In this paper we show how the kernel matrix can be learned from data via Semi-Definite Programming techniques. When applied to a kernel matrix associated with both training and test data, this gives a powerful transductive algorithm: using the labelled part of the data, one can learn an "optimal" embedding for the unlabelled part as well. The induced similarity bet...
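
The paper's full formulation is not reproduced in this abstract, so the sketch below (in Python, using cvxpy and scikit-learn) illustrates only the general idea: the kernel matrix over both labelled and unlabelled points is parameterized as a non-negative combination of fixed base kernels, and the weights are chosen by a convex program. The toy data, the choice of base kernels, the kernel-target-alignment objective, and the trace normalization are all assumptions made for illustration, not the paper's exact margin-based SDP.

import numpy as np
import cvxpy as cp
from sklearn.metrics.pairwise import linear_kernel, polynomial_kernel, rbf_kernel

# Hypothetical toy data: 60 points, the first 40 labelled (training) and the
# last 20 unlabelled (test), mirroring the transductive setting above.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
n_train, n_test = 40, 20
y = np.sign(X[:n_train, 0] + 0.1 * rng.normal(size=n_train))

# Base kernels computed over ALL points (labelled and unlabelled together).
base_kernels = [linear_kernel(X), polynomial_kernel(X, degree=2), rbf_kernel(X, gamma=0.5)]

# Learn non-negative combination weights mu; non-negativity keeps the
# combined kernel matrix positive semidefinite automatically.
mu = cp.Variable(len(base_kernels), nonneg=True)
K = sum(mu[i] * base_kernels[i] for i in range(len(base_kernels)))
K_train = K[:n_train, :n_train]

# Surrogate objective (an assumption for this sketch): maximize the alignment
# of the training block of K with the label matrix y y^T, under a trace
# constraint on the full matrix that fixes the scale of the embedding.
objective = cp.Maximize(cp.sum(cp.multiply(K_train, np.outer(y, y))))
constraints = [cp.trace(K) == n_train + n_test]

problem = cp.Problem(objective, constraints)
problem.solve()
print("learned kernel weights:", mu.value)

Because the weights enter the objective and constraint linearly, this particular surrogate reduces to a small linear program; the paper's margin-based criteria lead to richer semidefinite or quadratically constrained programs over the same kind of parameterization.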
Type: Conference
Year: 2002
Where: ICML
Authors: Gert R. G. Lanckriet, Nello Cristianini, Peter L. Bartlett, Laurent El Ghaoui, Michael I. Jordan