First-order methods almost always avoid strict saddle points

Jason D. Lee, Ioannis Panageas, Georgios Piliouras, Max Simchowitz, Michael I. Jordan, Benjamin Recht

Research output: Contribution to journal › Article › peer-review

106 Scopus citations

Abstract

We establish that first-order methods avoid strict saddle points for almost all initializations. Our results apply to a wide variety of first-order methods, including (manifold) gradient descent, block coordinate descent, mirror descent, and variants thereof. The connecting thread is that such algorithms can be studied from a dynamical systems perspective, in which appropriate instantiations of the Stable Manifold Theorem allow for a global stability analysis. Thus, neither access to second-order derivative information nor randomness beyond initialization is necessary to provably avoid strict saddle points.
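The phenomenon described in the abstract can be illustrated numerically. Below is a minimal Python sketch (not taken from the paper): plain gradient descent on f(x, y) = x² − y², whose only critical point, the origin, is a strict saddle (Hessian eigenvalues +2 and −2). For a generic random initialization, the iterates contract toward zero in the stable x-direction but move away from the saddle along the unstable y-direction, consistent with the paper's claim.

```python
import numpy as np

# Illustrative example: gradient descent on f(x, y) = x^2 - y^2.
# The origin is a strict saddle point: the Hessian has eigenvalues
# +2 (stable direction x) and -2 (unstable direction y).

def grad(p):
    """Gradient of f(x, y) = x^2 - y^2."""
    x, y = p
    return np.array([2.0 * x, -2.0 * y])

def gradient_descent(p0, step=0.1, iters=100):
    p = np.array(p0, dtype=float)
    for _ in range(iters):
        p = p - step * grad(p)
    return p

rng = np.random.default_rng(0)
p0 = rng.normal(scale=0.01, size=2)   # generic random initialization
p = gradient_descent(p0)

# The x-coordinate shrinks by a factor of 0.8 per step, while |y|
# grows by a factor of 1.2 per step, so the iterate escapes the
# saddle rather than converging to it.
print(p)
```

Only initializations lying exactly on the stable manifold (here, the x-axis) converge to the saddle; that set has measure zero, which is the sense in which first-order methods "almost always" avoid strict saddles.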

Original language: English (US)
Pages (from-to): 311-337
Number of pages: 27
Journal: Mathematical Programming
Volume: 176
Issue number: 1-2
State: Published - Jul 1 2019
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Software
  • General Mathematics

Keywords

  • Dynamical systems
  • Gradient descent
  • Local minimum
  • Saddle points
  • Smooth optimization
