Global rates of convergence for nonconvex optimization on manifolds

Nicolas Boumal, P.-A. Absil, Coralia Cartis

Research output: Contribution to journal › Article › peer-review

137 Scopus citations

Abstract

We consider the minimization of a cost function f on a manifold M using Riemannian gradient descent and Riemannian trust regions (RTR). We focus on satisfying necessary optimality conditions within a tolerance ε. Specifically, we show that, under Lipschitz-type assumptions on the pullbacks of f to the tangent spaces of M, both of these algorithms produce points with Riemannian gradient smaller than ε in O(1/ε²) iterations. Furthermore, RTR returns a point where, in addition, the least eigenvalue of the Riemannian Hessian is larger than −ε in O(1/ε³) iterations. There are no assumptions on initialization. The rates match their (sharp) unconstrained counterparts as a function of the accuracy ε (up to constants) and hence are sharp in that sense. These are the first deterministic results for global rates of convergence to approximate first- and second-order Karush-Kuhn-Tucker points on manifolds. They apply in particular to optimization constrained to compact submanifolds of ℝⁿ, under simpler assumptions.
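
To make the setting concrete, here is a minimal sketch of Riemannian gradient descent on the unit sphere (a compact submanifold of ℝⁿ), stopping once the Riemannian gradient norm falls below a tolerance ε as in the first-order guarantee above. The Rayleigh-quotient cost, fixed step size, and metric-projection retraction are illustrative choices for this sketch, not taken from the paper.

```python
import numpy as np

def riemannian_gradient_descent(A, x0, step=0.1, eps=1e-6, max_iter=10_000):
    """Minimize f(x) = x^T A x over the unit sphere with a fixed step size.

    The Riemannian gradient is the orthogonal projection of the Euclidean
    gradient onto the tangent space at x; renormalization serves as the
    retraction. Iteration stops when the Riemannian gradient norm < eps.
    """
    x = x0 / np.linalg.norm(x0)
    for k in range(max_iter):
        egrad = 2 * A @ x                     # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x       # project onto tangent space at x
        if np.linalg.norm(rgrad) < eps:       # approximate first-order point
            break
        x = x - step * rgrad                  # move along the tangent direction
        x = x / np.linalg.norm(x)             # retract back onto the sphere
    return x, k

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 50))
    A = (M + M.T) / 2                         # symmetric cost matrix
    x, iters = riemannian_gradient_descent(A, rng.standard_normal(50))
    print(f"stopped after {iters} iterations, f(x) = {x @ A @ x:.4f}")
```

With a sufficiently small constant step size, the number of iterations needed to reach a point with Riemannian gradient norm below ε scales as O(1/ε²), matching the rate stated in the abstract.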

Original language: English (US)
Pages (from-to): 1-33
Number of pages: 33
Journal: IMA Journal of Numerical Analysis
Volume: 39
Issue number: 1
DOIs
State: Published - Jan 1 2019

All Science Journal Classification (ASJC) codes

  • General Mathematics
  • Computational Mathematics
  • Applied Mathematics

Keywords

  • Complexity
  • Gradient descent
  • Optimization on manifolds
  • Riemannian optimization
  • Trust-region method
