Abstract— A distributed online learning framework for support vector machines (SVMs) is presented and analyzed. First, the generic binary classification problem is decomposed into multiple relaxed subproblems. Then, each of these is solved iteratively through parallel update algorithms with minimal communication overhead. This computation can be performed in parallel by individual processing units, such as separate computers or processor cores, each possibly having access to only a subset of the data. Convergence properties of continuous- and discrete-time variants of the proposed parallel update schemes are studied. A sufficient condition is derived under which synchronous and asynchronous gradient algorithms converge to the approximate solution. Subsequently, a class of stochastic update algorithms, which may arise due to distortions in the information flow between units, is shown to be globally stable under similar sufficient conditions. Active set methods are utilized to decrea...
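The synchronous parallel gradient scheme summarized in the abstract can be sketched roughly as follows. This is a simplified illustration under assumed details, not the paper's exact algorithm: a linear SVM with hinge loss, data sharded across two units, each unit computing a local subgradient and a shared iterate updated with the averaged subgradient. The function names (`local_subgradient`, `distributed_svm`) and the 1/t step size are illustrative choices.

```python
import numpy as np

def local_subgradient(w, X, y, lam):
    # Subgradient of (lam/2)*||w||^2 + mean hinge loss on this unit's shard.
    margins = y * (X @ w)
    violators = margins < 1.0
    return lam * w - (X[violators] * y[violators, None]).sum(axis=0) / len(y)

def distributed_svm(shards, lam=0.1, steps=200, eta0=1.0):
    # Synchronous update: every unit evaluates a subgradient on its own
    # shard; the averaged subgradient drives one shared iterate.
    d = shards[0][0].shape[1]
    w = np.zeros(d)
    for t in range(1, steps + 1):
        grads = [local_subgradient(w, X, y, lam) for X, y in shards]
        w -= (eta0 / t) * np.mean(grads, axis=0)
    return w

# Toy linearly separable data, split between two processing units.
rng = np.random.default_rng(0)
y = rng.choice([-1.0, 1.0], size=200)
X = rng.normal(size=(200, 2)) + 2.0 * y[:, None]
shards = [(X[:100], y[:100]), (X[100:], y[100:])]
w = distributed_svm(shards)
accuracy = np.mean(np.sign(X @ w) == y)
```

In an asynchronous or stochastic variant of the kind the abstract analyzes, the individual `grads` entries would be computed at stale iterates or perturbed in transit, which is why the convergence and stability conditions matter.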