Adaptive regularization with cubics on manifolds

Naman Agarwal, Nicolas Boumal, Brian Bullins, Coralia Cartis

Research output: Contribution to journal › Article › peer-review

16 Scopus citations


Adaptive regularization with cubics (ARC) is an algorithm for unconstrained, non-convex optimization. Akin to the trust-region method, its iterations can be thought of as approximate, safeguarded Newton steps. For cost functions with Lipschitz continuous Hessian, ARC has optimal iteration complexity, in the sense that it produces an iterate with gradient smaller than ε in O(1/ε^1.5) iterations. For the same price, it can also guarantee a Hessian with smallest eigenvalue larger than -ε. In this paper, we study a generalization of ARC to optimization on Riemannian manifolds. In particular, we generalize the iteration complexity results to this richer framework. Our central contribution lies in the identification of appropriate manifold-specific assumptions that allow us to secure these complexity guarantees both when using the exponential map and when using a general retraction. A substantial part of the paper is devoted to studying these assumptions—relevant beyond ARC—and providing user-friendly sufficient conditions for them. Numerical experiments are encouraging.
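To illustrate the mechanism the abstract describes, here is a minimal Euclidean (flat-space) sketch of an ARC-style loop, not the paper's Riemannian algorithm. Each iteration builds the cubic-regularized model m(s) = f(x) + ⟨g, s⟩ + ½⟨Hs, s⟩ + (σ/3)‖s‖³, takes the exact minimizer of that model along the steepest-descent direction (a Cauchy-point step, a deliberately crude subproblem solver), and then accepts or rejects the step and adapts σ, trust-region style. The function `arc_minimize`, its parameter choices (acceptance threshold 0.1, σ update factors, σ floor), and the Cauchy-point subproblem solver are all illustrative assumptions, not the authors' method:

```python
import numpy as np

def arc_minimize(f, grad, hess, x0, sigma0=1.0, tol=1e-6, max_iter=500):
    """Sketch of an ARC-like loop in Euclidean space.

    Minimizes the cubic model m(s) = f(x) + <g,s> + 0.5 <H s, s>
    + (sigma/3) ||s||^3 exactly along d = -g (Cauchy point), then
    accepts/rejects the step and adapts sigma. Hypothetical constants.
    """
    x = np.asarray(x0, dtype=float)
    sigma = sigma0
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        H = hess(x)
        d = -g
        # Restricted to s = t*d, the model is phi(t) = f(x) + c*t + (b/2)*t^2
        # + (a/3)*t^3 with the coefficients below; phi'(t) = c + b*t + a*t^2.
        a = sigma * gnorm ** 3       # sigma * ||d||^3  (> 0)
        b = d @ (H @ d)              # curvature along d (any sign)
        c = -gnorm ** 2              # <g, d> < 0, so phi decreases at t = 0
        # Positive root of phi'(t) = 0, in a cancellation-free form.
        t = -2.0 * c / (b + np.sqrt(b * b - 4.0 * a * c))
        s = t * d
        # Predicted decrease m(0) - m(s) > 0 since t minimizes phi.
        model_decrease = -(c * t + 0.5 * b * t * t + (a / 3.0) * t ** 3)
        rho = (f(x) - f(x + s)) / model_decrease
        if rho > 0.1:
            # Successful step: move and relax the regularization.
            x = x + s
            sigma = max(sigma / 2.0, 1e-8)
        else:
            # Unsuccessful step: stay put and regularize more heavily.
            sigma *= 2.0
    return x
```

The accept/reject test on the ratio rho of actual to predicted decrease is the "safeguarding" analogous to a trust-region radius update: σ plays the role of an inverse trust-region radius. Note that this Cauchy-point subproblem solver only yields first-order guarantees; the O(1/ε^1.5) rate discussed in the abstract requires solving the cubic subproblem more accurately.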

Original language: English (US)
Pages (from-to): 85-134
Number of pages: 50
Journal: Mathematical Programming
Issue number: 1
State: Published - Jul 2021
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Software
  • General Mathematics


Keywords

  • Complexity
  • Cubic regularization
  • Lipschitz regularity
  • Newton’s method
  • Optimization on manifolds


