Frank E. Curtis, Xiaocun Que

Abstract  We present a line search algorithm for minimizing nonconvex and/or nonsmooth objective functions. The algorithm is a hybrid of a standard Broyden-Fletcher-Goldfarb-Shanno (BFGS) method and an adaptive gradient sampling (GS) method. The BFGS strategy is employed because it typically yields fast convergence to the vicinity of a stationary point, while the adaptive GS strategy ensures that convergence continues to such a point. Under loose assumptions, we prove that the algorithm converges globally with probability one. The algorithm has been implemented in C++, and the results of numerical experiments are presented to illustrate the efficacy of the proposed method.

Keywords  nonsmooth optimization, nonconvex optimization, unconstrained optimization, quasi-Newton methods, gradient sampling, line search methods

Mathematics Subject Classification (2000)  49M05 · 65K05 · 65K10 · 90C26 · 90C30 · 90C53 · 93B40
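To make the gradient sampling idea concrete, the following is a minimal illustrative sketch of one GS step, not the authors' C++ implementation or their hybrid BFGS-GS method: gradients are sampled at randomly perturbed points near the iterate, an approximate minimum-norm element of their convex hull is computed (here by a simple Frank-Wolfe iteration over simplex weights, a hypothetical simplification), and the iterate moves along its negation. The test objective f and all parameter values are assumptions for illustration.

```python
import random

def f(x):
    # Simple nonsmooth, convex test objective: |x1| + x2^2 (an assumption).
    return abs(x[0]) + x[1] ** 2

def grad(x):
    # A (sub)gradient of f; away from x1 = 0 this is the true gradient.
    return [1.0 if x[0] >= 0 else -1.0, 2.0 * x[1]]

def min_norm_hull(gs, iters=200):
    """Approximate the minimum-norm point of conv{gs} by Frank-Wolfe
    on the weight simplex, minimizing q(w) = ||sum_i w_i g_i||^2."""
    n, d = len(gs), len(gs[0])
    w = [1.0 / n] * n
    for k in range(iters):
        g_bar = [sum(w[i] * gs[i][j] for i in range(n)) for j in range(d)]
        # Linear minimization step: pick the vertex g_v minimizing <g_bar, g_v>.
        scores = [sum(gb * gi for gb, gi in zip(g_bar, g)) for g in gs]
        v = scores.index(min(scores))
        step = 2.0 / (k + 2)  # standard Frank-Wolfe step size
        w = [(1.0 - step) * wi for wi in w]
        w[v] += step
    return [sum(w[i] * gs[i][j] for i in range(n)) for j in range(d)]

def gs_step(x, radius=0.1, samples=10, alpha=0.5):
    # Sample gradients at the iterate and at nearby random points,
    # then step against the (approximate) min-norm hull element.
    pts = [x] + [[xi + random.uniform(-radius, radius) for xi in x]
                 for _ in range(samples)]
    g_bar = min_norm_hull([grad(p) for p in pts])
    return [xi - alpha * gi for xi, gi in zip(x, g_bar)]

random.seed(0)
x = [1.0, 1.0]
for _ in range(50):
    x = gs_step(x)
print(f(x))  # final objective; should be well below f([1, 1]) = 2
```

In the paper's actual method, this sampling-based direction is combined with BFGS steps and a line search; the fixed step size `alpha` above is a placeholder for that machinery.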