
Sub-Sampled Newton Methods II: Local Convergence Rates

Many data-fitting applications require the solution of an optimization problem involving a sum of a large number of functions of a high-dimensional parameter. Here, we consider the problem of minimizing a sum of n functions over a convex constraint set X ⊆ Rp, where both n and p are large. In such problems, sub-sampling as a way to reduce n can offer substantial computational savings while maintaining the original convergence properties. Within the context of second-order methods, we first give quantitative local convergence results for variants of Newton’s method in which the Hessian is uniformly sub-sampled. Using random matrix concentration inequalities, one can sub-sample in a way that preserves the curvature information. Using such a sub-sampling strategy, we establish locally Q-linear and Q-superlinear convergence rates. We also give additional convergence results for the case where the sub-sampled Hessian is regularized, either by modifying its spectrum or by Levenberg-type regularization....
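The following is a minimal illustrative sketch (not the authors' code) of the idea described in the abstract: a Newton iteration that uses the full gradient but a uniformly sub-sampled Hessian. The least-squares objective, problem sizes, and sample size are all assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic problem: f_i(x) = 0.5*(a_i @ x - b_i)^2, so
# F(x) = (1/n) * sum_i f_i(x). Sizes here are illustrative.
n, p = 5000, 20
A = rng.standard_normal((n, p))
x_true = rng.standard_normal(p)
b = A @ x_true

def full_gradient(x):
    # grad F(x) = (1/n) * A^T (A x - b)
    return A.T @ (A @ x - b) / n

def subsampled_hessian(sample_size):
    # Uniformly sample indices and average the per-function Hessians:
    # H_S = (1/|S|) * sum_{i in S} a_i a_i^T
    idx = rng.choice(n, size=sample_size, replace=False)
    A_S = A[idx]
    return A_S.T @ A_S / sample_size

x = np.zeros(p)
for _ in range(20):
    H = subsampled_hessian(sample_size=500)
    # Small ridge term guards against a rank-deficient sample
    step = np.linalg.solve(H + 1e-8 * np.eye(p), full_gradient(x))
    x -= step

print(np.linalg.norm(x - x_true))  # error shrinks rapidly across iterations
```

Because the sampled Hessian concentrates around the full Hessian, each step contracts the error by a factor governed by the sample size, which is the local Q-linear behavior the abstract refers to; growing the sample across iterations is what yields Q-superlinear rates.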
Farbod Roosta-Khorasani, Michael W. Mahoney
Added 31 Mar 2016
Updated 31 Mar 2016
Type Journal
Year 2016
Where CORR
Authors Farbod Roosta-Khorasani, Michael W. Mahoney