We devise a boosting approach to classification and regression based on column generation using a mixture of kernels. Traditional kernel methods construct models from a single positive semi-definite kernel, with the kernel type predefined and its parameters chosen from a set of candidates according to cross-validation performance. Our approach instead builds models as mixtures drawn from a library of kernel models, and our algorithm automatically determines which kernels appear in the final model. We employ 1-norm and 2-norm regularization to restrict the ensemble of kernel models. The proposed method produces sparser solutions; hence it can handle larger problems and significantly reduces testing time. By extending the column generation (CG) optimization previously developed for linear programs with 1-norm regularization to quadratic programs with 2-norm regularization, we are able to solve many learning formulations by leveraging various algorithms for constr...
Jinbo Bi, Tong Zhang, Kristin P. Bennett
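To make the idea concrete, here is a minimal sketch of column-generation-style boosting over a kernel library. This is an illustration under assumptions, not the authors' exact formulation: each candidate "column" is a kernel function evaluated at a training point, drawn from a library of RBF kernels with several widths, and each boosting round greedily adds the column most correlated with the current residual, mimicking how CG adds the most violated column. All function names, kernel choices, and parameters here are hypothetical.

```python
import numpy as np

def rbf(X, Z, gamma):
    # Gaussian (RBF) kernel matrix between row sets X and Z.
    d = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(60, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=60)

# Kernel library: RBF kernels at several widths (a "mixture of kernels").
# Stacking their columns gives one wide dictionary to select from.
gammas = [0.5, 2.0, 10.0]
columns = np.hstack([rbf(X, X, g) for g in gammas])  # shape (60, 180)

coef = np.zeros(columns.shape[1])
residual = y.copy()
for _ in range(25):                        # boosting / CG rounds
    scores = columns.T @ residual
    j = int(np.argmax(np.abs(scores)))     # most "violated" column
    col = columns[:, j]
    step = (col @ residual) / (col @ col)  # exact line search on this column
    coef[j] += step
    residual -= step * col

# At most 25 columns are nonzero, so the model stays sparse.
print("nonzero columns:", int((coef != 0).sum()), "of", coef.size)
print("residual RMSE:", float(np.sqrt((residual ** 2).mean())))
```

The sparsity comes from the greedy selection itself: only the columns chosen during the rounds receive nonzero weight, which is the same effect the 1-norm regularization in the paper is designed to induce.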