We develop a new multiclass classification method that reduces the multiclass problem to a single binary classifier (SBC). Our method constructs the binary problem by embedding smaller binary problems into a single space. A good embedding allows for large-margin classification. We show that constructing such an embedding can be reduced to the task of learning linear combinations of kernels. We provide a bound on the generalization error of the multiclass classifier obtained with our construction and outline the conditions for its consistency. Our empirical evaluation of the new method indicates that it outperforms one-vs-all, all-pairs, and the error-correcting output coding scheme, at least when the number of classes is small.

Key words: multiclass classification, support vector machines, multiple kernel learning
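To make the reduction idea concrete, the following is a minimal sketch of a Kesler-style construction: each example is paired with every class code, the pairs form a single binary problem, and prediction scores all codes and picks the best one. It assumes fixed one-hot class codes rather than the learned kernel-based embedding described above, and the helper names (`make_sbc_pairs`, `predict_multiclass`) and the use of scikit-learn's `SVC` are illustrative assumptions, not the paper's construction.

```python
# Illustrative sketch only: reduce a K-class problem to ONE binary classifier
# by pairing each example with every class code. Codes here are fixed one-hot
# vectors; the paper instead learns the embedding via combinations of kernels.
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split


def make_sbc_pairs(X, y, codes):
    """Pair every example with every class code; label +1 iff the code matches y."""
    n, K = len(X), len(codes)
    Z = np.hstack([np.repeat(X, K, axis=0), np.tile(codes, (n, 1))])
    labels = np.where(np.repeat(y, K) == np.tile(np.arange(K), n), 1, -1)
    return Z, labels


def predict_multiclass(clf, X, codes):
    """Score each (example, code) pair and return the highest-scoring class."""
    n, K = len(X), len(codes)
    Z = np.hstack([np.repeat(X, K, axis=0), np.tile(codes, (n, 1))])
    scores = clf.decision_function(Z).reshape(n, K)
    return scores.argmax(axis=1)


X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

K = len(np.unique(y))
codes = np.eye(K)  # fixed one-hot class codes (not learned, unlike the paper)

Z_tr, b_tr = make_sbc_pairs(X_tr, y_tr, codes)
clf = SVC(kernel="rbf", gamma="scale").fit(Z_tr, b_tr)  # the single binary classifier

pred = predict_multiclass(clf, X_te, codes)
print("accuracy:", np.mean(pred == y_te))
```

In this toy version the quality of the reduction hinges entirely on how the class codes are embedded alongside the inputs, which is exactly the component the paper proposes to learn via linear combinations of kernels.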