Smooth Optimization for Effective Multiple Kernel Learning

Multiple Kernel Learning (MKL) can be formulated as a convex-concave min-max optimization problem whose saddle point corresponds to the optimal MKL solution. Most MKL methods impose L1-norm simplex constraints on the kernel combination weights, which leads to optimization of a non-smooth function of the kernel weights. These methods usually split the optimization into two alternating cycles: one optimizes the kernel combination weights, and the other updates the SVM parameters. Despite their efficiency, they tend to discard informative complementary kernels. To improve accuracy, we introduce smoothness into the optimization procedure. Furthermore, we transform the problem into a single smooth convex optimization problem and employ Nesterov's method to solve it efficiently. Experiments on benchmark data sets demonstrate that the proposed algorithm clearly improves current MKL methods in a num...
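
For context, the convex-concave min-max formulation referenced in the abstract typically takes the standard MKL dual form below (a sketch in common notation with kernel matrices K_k, weights p, and SVM dual variables alpha; the paper's exact formulation may differ):

\[
\min_{p \in \Delta} \; \max_{\alpha \in Q} \;\; \mathbf{1}^\top \alpha \;-\; \frac{1}{2}\,(\alpha \circ y)^\top \Big( \sum_{k=1}^{m} p_k K_k \Big) (\alpha \circ y),
\]
\[
\Delta = \Big\{ p \in \mathbb{R}^m : p_k \ge 0,\; \sum_{k=1}^{m} p_k = 1 \Big\}, \qquad
Q = \big\{ \alpha \in \mathbb{R}^n : 0 \le \alpha_i \le C,\; \alpha^\top y = 0 \big\},
\]

where \circ denotes the element-wise product. For a fixed alpha the objective is linear in p, so J(p) = \max_{\alpha \in Q}(\cdot) is a pointwise maximum of linear functions: convex but non-smooth over the L1 simplex \Delta, which is the non-smoothness that the smoothing step removes before an accelerated first-order method is applied.

The following minimal sketch (Python, hypothetical names; it omits the smoothing of the MKL objective and the projection onto the simplex) illustrates the generic Nesterov accelerated gradient iteration mentioned in the abstract, not the paper's specific algorithm:

import numpy as np

def nesterov_accelerated_gradient(grad, x0, lipschitz, n_iters=200):
    # Generic Nesterov acceleration for a smooth convex objective whose
    # gradient is `lipschitz`-Lipschitz. Illustrative only; the paper's
    # method works on a smoothed MKL objective with a simplex constraint.
    x_prev = x0.copy()
    x = x0.copy()
    step = 1.0 / lipschitz
    for t in range(1, n_iters + 1):
        # look-ahead (momentum) point
        y = x + (t - 1.0) / (t + 2.0) * (x - x_prev)
        x_prev = x
        # gradient step taken at the look-ahead point
        x = y - step * grad(y)
    return x

On smooth convex problems this iteration attains the O(1/t^2) convergence rate characteristic of Nesterov's method, compared with O(1/t) for plain gradient descent, which is what makes the smooth reformulation attractive.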
Type Conference
Year 2010
Where AAAI
Authors Zenglin Xu, Rong Jin, Shenghuo Zhu, Michael R. Lyu, Irwin King