DOI: 10.1287/moor.2023.1390

A Superlinearly Convergent Subgradient Method for Sharp Semismooth Problems

Vasileios Charisopoulos, Damek Davis

Subgradient methods comprise a fundamental class of nonsmooth optimization algorithms. Classical results show that certain subgradient methods converge sublinearly for general Lipschitz convex functions and linearly for convex functions that grow sharply away from their solution sets; recent work has extended these results to certain nonconvex problems. In this work, we seek to improve the complexity of these algorithms by asking the following question: is it possible to design a superlinearly convergent subgradient method? We provide a positive answer to this question for a broad class of sharp semismooth functions.
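To make the classical baseline concrete, the following is a minimal sketch of the Polyak subgradient step, which attains the linear rate mentioned above for sharp convex functions when the optimal value is known. This is an illustration of the standard method only, not the superlinearly convergent scheme proposed in the paper; the test function f(x) = ||x||_1 (a sharp convex function with minimum value 0) and all names are chosen for this example.

```python
import numpy as np

def polyak_subgradient(f, subgrad, f_star, x0, iters=50):
    """Classical Polyak subgradient method:
        x_{k+1} = x_k - ((f(x_k) - f*) / ||g_k||^2) * g_k.
    Sharpness of f (growth proportional to distance from the solution
    set) yields linear convergence of this scheme; this sketch is the
    baseline, not the paper's superlinear method."""
    x = np.asarray(x0, dtype=float)
    history = []  # optimality gaps f(x_k) - f* after each step
    for _ in range(iters):
        g = subgrad(x)
        gap = f(x) - f_star
        if gap == 0 or not np.any(g):
            break  # already optimal (or no descent information)
        x = x - (gap / np.dot(g, g)) * g
        history.append(f(x) - f_star)
    return x, history

# Sharp convex test function: f(x) = ||x||_1, minimized at x = 0.
f = lambda x: np.sum(np.abs(x))
subgrad = lambda x: np.sign(x)  # a valid subgradient of the l1 norm

x, hist = polyak_subgradient(
    f, subgrad, f_star=0.0,
    x0=np.array([3.0, -1.0, 2.0, 0.5, -4.0]), iters=200,
)
```

The gap sequence `hist` decays geometrically, illustrating the linear rate that sharpness buys the classical method; the paper's contribution is a subgradient scheme that improves this to superlinear convergence.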

Funding: The research of D. Davis was supported by the Division of Mathematical Sciences [Grant 2047637] and the Alfred P. Sloan Foundation.