In this work we show that randomized (block) coordinate descent methods can be accelerated by parallelization when applied to the problem of minimizing the sum of a partially sepa...
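The abstract is cut off above, but as a rough illustration of the parallel randomized (block) coordinate descent idea it describes, here is a minimal sketch on a least-squares objective f(x) = 0.5·||Ax − b||². The block size tau and the conservative 1/(tau·L_i) step sizes are my own illustrative choices, not the paper's separability-dependent step sizes.

```python
import numpy as np

def parallel_coordinate_descent(A, b, tau=4, iters=500, seed=0):
    """Sketch of parallel randomized coordinate descent on
    f(x) = 0.5 * ||A x - b||^2. Each iteration updates a random subset
    of tau coordinates simultaneously (in a real implementation, in
    parallel). Step sizes use per-coordinate Lipschitz constants
    L_i = ||A[:, i]||^2, damped by tau as a naive safeguard for the
    simultaneous updates; the paper's step sizes instead exploit the
    degree of partial separability, which this toy problem ignores.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    L = (A ** 2).sum(axis=0)                    # per-coordinate Lipschitz constants
    x = np.zeros(n)
    r = A @ x - b                               # residual, maintained incrementally
    for _ in range(iters):
        S = rng.choice(n, size=tau, replace=False)   # random coordinate block
        g = A[:, S].T @ r                       # partial gradients for the block
        step = g / (tau * L[S])                 # conservative parallel step
        x[S] -= step
        r -= A[:, S] @ step                     # keep residual up to date
    return x

A = np.random.default_rng(1).standard_normal((50, 10))
b = A @ np.ones(10)
x = parallel_coordinate_descent(A, b)
print(np.linalg.norm(A @ x - b))                # should be small
```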
We consider a class of matrix spectral norm approximation problems for finding an affine combination of given matrices having the minimal spectral norm subject to some prescribed ...
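A generic instance of this problem class can be written out as follows (the notation is mine, and the paper's exact constraint set is cut off above):

\[
\min_{x \in \Omega} \; \Bigl\| A_0 + \sum_{i=1}^{m} x_i A_i \Bigr\|_2,
\]

where \(A_0, A_1, \dots, A_m\) are the given matrices, \(\|\cdot\|_2\) is the spectral norm (largest singular value), and \(\Omega \subseteq \mathbb{R}^m\) encodes the prescribed constraints; for a genuine affine combination one would include \(\sum_{i=1}^{m} x_i = 1\) in \(\Omega\).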
The alternating direction method of multipliers (ADMM) is now widely used in many fields, and its convergence was proved when two blocks of variables are alternately updated. It...
Caihua Chen, Bingsheng He, Yinyu Ye, Xiaoming Yuan
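For reference, here is a minimal sketch of the classical two-block ADMM whose convergence the abstract alludes to, applied to the lasso problem min 0.5·||Ax − b||² + λ||z||₁ subject to x = z. The example problem and the parameter choices (λ, ρ) are mine, not the paper's.

```python
import numpy as np

def admm_lasso(A, b, lam=0.1, rho=1.0, iters=200):
    """Two-block ADMM for  min_x 0.5*||Ax - b||^2 + lam*||z||_1  s.t. x = z.
    The two blocks x and z are updated alternately, followed by the scaled
    dual variable u; this is the two-block setting with classical
    convergence guarantees.
    """
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    # x-update solves (A^T A + rho I) x = A^T b + rho (z - u)
    AtA = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(AtA, Atb + rho * (z - u))                    # block 1
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)  # block 2: soft-threshold
        u = u + x - z                                                    # dual update
    return z
```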
We improve a recent guarantee of Bach and Moulines on the linear convergence of SGD for smooth and strongly convex objectives, reducing a quadratic dependence on the strong convex...
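As a rough illustration, here is a minimal SGD sketch on a least-squares objective with row-norm-weighted (importance) sampling; the constant step size and the sampling distribution shown are illustrative choices of mine, not necessarily the exact scheme analyzed in the paper.

```python
import numpy as np

def sgd_importance(A, b, iters=5000, seed=0):
    """SGD on f(w) = (0.5/m) * ||A w - b||^2 with importance sampling:
    row i is drawn with probability p_i proportional to ||a_i||^2, and the
    stochastic gradient is reweighted by 1/(m p_i) to stay unbiased.
    Weighted sampling of this kind can trade a worst-case conditioning
    dependence for an average one. With the step size below, the update
    coincides with the randomized Kaczmarz projection step.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    norms2 = (A ** 2).sum(axis=1)
    p = norms2 / norms2.sum()
    w = np.zeros(n)
    eta = 1.0 / norms2.mean()                      # illustrative constant step
    for _ in range(iters):
        i = rng.choice(m, p=p)
        g = (A[i] @ w - b[i]) * A[i] / (m * p[i])  # unbiased gradient estimate
        w -= eta * g
    return w
```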
We introduce computable a-priori and a-posteriori error bounds for optimality and feasibility of a point generated as the rounding of an optimal point of the LP relaxation of a mi...
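In the spirit of the abstract, here is a toy a-posteriori check, assuming a minimization MILP: the LP relaxation value lower-bounds the MILP optimum, so for any feasible rounded point x_r the quantity c^T x_r − c^T x_LP bounds its optimality gap, and feasibility can be checked directly. The small instance and the use of scipy are my own illustration, not the paper's bounds.

```python
import numpy as np
from scipy.optimize import linprog

# Toy MILP:  min c^T x  s.t.  A_ub x <= b_ub,  x >= 0,  x integer.
c = np.array([-1.0, -2.0])
A_ub = np.array([[1.0, 1.0], [3.0, 1.0]])
b_ub = np.array([4.2, 7.5])

# 1) Solve the LP relaxation (integrality dropped).
lp = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
x_lp, v_lp = lp.x, lp.fun

# 2) Round the relaxed optimum to an integer point.
x_r = np.floor(x_lp)            # floor keeps x >= 0 and, here, A_ub x <= b_ub

# 3) A-posteriori certificates:
feas_violation = np.maximum(A_ub @ x_r - b_ub, 0.0)  # all zeros iff feasible
gap_bound = c @ x_r - v_lp      # optimality gap of x_r is at most this,
                                # since v_lp lower-bounds the MILP optimum
print(x_r, feas_violation, gap_bound)
```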
The standard algorithms for solving large-scale convex-concave saddle point problems, or, more generally, variational inequalities with monotone operators, are proximal-type algor...
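For concreteness, here is a minimal instance of such a proximal-type algorithm: the extragradient method applied to the bilinear saddle point min_x max_y x^T A y, a classical monotone variational inequality. The problem instance and step size are mine.

```python
import numpy as np

def extragradient_bilinear(A, eta=None, iters=2000):
    """Extragradient method for min_x max_y x^T A y. The associated
    monotone operator is F(x, y) = (A y, -A^T x); plain gradient
    descent-ascent diverges on this problem, while the extra
    extrapolation step makes the iteration converge for eta < 1/||A||_2.
    """
    n, m = A.shape
    if eta is None:
        eta = 0.5 / np.linalg.norm(A, 2)
    x, y = np.ones(n), np.ones(m)
    F = lambda x, y: (A @ y, -A.T @ x)
    for _ in range(iters):
        gx, gy = F(x, y)
        xh, yh = x - eta * gx, y - eta * gy   # extrapolation (half) step
        gx, gy = F(xh, yh)
        x, y = x - eta * gx, y - eta * gy     # step using the midpoint operator
    return x, y

A = np.random.default_rng(0).standard_normal((5, 5))
x, y = extragradient_bilinear(A)
print(np.linalg.norm(x), np.linalg.norm(y))   # both should shrink toward 0
```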
We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runt...
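The accelerated inner-outer scheme is more than a few lines, but the underlying dual coordinate ascent step has a simple closed form for ridge regression. Below is a minimal plain (non-proximal, non-accelerated) SDCA sketch in my own notation, to convey the basic update the paper builds on.

```python
import numpy as np

def sdca_ridge(X, y, lam=0.1, epochs=20, seed=0):
    """Plain stochastic dual coordinate ascent for ridge regression
    min_w (1/n) * sum_i 0.5*(x_i^T w - y_i)^2 + (lam/2)*||w||^2.
    Each step maximizes the dual over one coordinate alpha_i in closed
    form, while the primal iterate is maintained via
    w = X^T alpha / (lam * n). (The paper's proximal and accelerated
    variants wrap extra machinery around this basic update.)
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Closed-form single-coordinate dual maximization (squared loss).
            delta = (y[i] - X[i] @ w - alpha[i]) / (1.0 + (X[i] @ X[i]) / (lam * n))
            alpha[i] += delta
            w += delta * X[i] / (lam * n)     # keep primal-dual link exact
    return w
```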
In this paper, we generalize Nesterov’s well-known accelerated gradient (AG) method, originally designed for convex smooth optimization, to solve nonconvex and possibly stoch...
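A minimal sketch of the deterministic, unconstrained accelerated gradient iteration in its classical momentum form, run here on a nonconvex test function. The test function, step size, and momentum schedule are my own choices, not the paper's scheme; on nonconvex objectives the convex O(1/k²) rate no longer applies, and the question is what can still be guaranteed (e.g., convergence to stationarity).

```python
import numpy as np

def accelerated_gradient(grad, x0, eta=0.1, iters=300):
    """Nesterov-style accelerated gradient in momentum form:
    evaluate the gradient at an extrapolated point y_k, then step."""
    x_prev = x = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        beta = (k - 1) / (k + 2)             # standard momentum schedule
        y = x + beta * (x - x_prev)          # extrapolation
        x_prev, x = x, y - eta * grad(y)     # gradient step at y
    return x

# Nonconvex test: f(x) = 0.25*||x||^4 - 0.5*||x||^2 (illustrative choice);
# its stationary points are the origin and the unit sphere.
grad = lambda x: (x @ x) * x - x
x = accelerated_gradient(grad, np.array([2.0, -1.5]))
print(x, np.linalg.norm((x @ x) * x - x))    # gradient norm should be small
```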