In recent years, support vector machines (SVMs) have been successfully applied to a large number of classification problems. Training an SVM, usually posed as a quadratic programming (QP) problem, often becomes challenging for large data sets due to high memory requirements and slow convergence. We propose to apply boosting to Platt's Sequential Minimal Optimization (SMO) algorithm and to use the resulting Boost-SMO method to speed up and scale up SVM training. Experiments on three commonly used benchmark data sets show that Boost-SMO achieves classification accuracy comparable to that of conventional SMO while being a factor of 3 to 10 faster. On larger data sets, the speed-up could easily reach orders of magnitude.