Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval

Yuxin Chen, Yuejie Chi, Jianqing Fan, Cong Ma

Research output: Contribution to journal › Article › peer-review

125 Scopus citations

Abstract

This paper considers the problem of solving systems of quadratic equations, namely, recovering an object of interest x^♮ ∈ ℝ^n from m quadratic equations/samples y_i = (a_i^⊤ x^♮)^2, 1 ≤ i ≤ m. This problem, also known as phase retrieval, spans multiple domains including the physical sciences and machine learning. We investigate the efficacy of gradient descent (or Wirtinger flow) designed for the nonconvex least-squares problem. We prove that under Gaussian designs, gradient descent, when randomly initialized, yields an ϵ-accurate solution in O(log n + log(1/ϵ)) iterations given nearly minimal samples, thus achieving near-optimal computational and sample complexities at once. This provides the first global convergence guarantee for vanilla gradient descent applied to phase retrieval, without the need for (i) carefully designed initialization, (ii) sample splitting, or (iii) sophisticated saddle-point escaping schemes. All of this is achieved by exploiting statistical models in the analysis of optimization algorithms, via a leave-one-out approach that decouples the statistical dependence between the gradient descent iterates and the data.
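
As a concrete illustration of the procedure described above, the following is a minimal sketch of randomly initialized gradient descent on the nonconvex least-squares objective f(x) = (1/4m) Σ_i ((a_i^⊤ x)^2 − y_i)^2 under a real-valued Gaussian design. The problem sizes, step size, iteration count, and unit normalization of x^♮ are illustrative assumptions, not the paper's tuned constants.

```python
# A minimal sketch, assuming a real-valued Gaussian design: randomly
# initialized gradient descent (Wirtinger flow) on the nonconvex
# least-squares objective f(x) = (1/4m) * sum_i ((a_i^T x)^2 - y_i)^2.
# Step size, iteration count, and problem sizes are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

n, m = 100, 1000                          # signal dimension, sample size
x_nat = rng.standard_normal(n)
x_nat /= np.linalg.norm(x_nat)            # unit-norm ground truth x^natural
A = rng.standard_normal((m, n))           # Gaussian design vectors a_i (rows)
y = (A @ x_nat) ** 2                      # samples y_i = (a_i^T x^natural)^2

def grad(x):
    """Gradient of f: (1/m) * sum_i ((a_i^T x)^2 - y_i) (a_i^T x) a_i."""
    r = A @ x
    return A.T @ ((r ** 2 - y) * r) / m

x = rng.standard_normal(n) / np.sqrt(n)   # random init x_0 ~ N(0, n^{-1} I_n)
eta = 0.1                                 # assumed constant step size
for _ in range(500):
    x -= eta * grad(x)

# The signal is identifiable only up to a global sign flip.
err = min(np.linalg.norm(x - x_nat), np.linalg.norm(x + x_nat))
print(f"relative error: {err:.2e}")
```

Consistent with the O(log n + log(1/ϵ)) guarantee, on synthetic instances of this kind the iterates typically leave the neighborhood of the random starting point within a number of steps growing logarithmically in n and then converge linearly to ±x^♮.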

Original language: English (US)
Pages (from-to): 5-37
Number of pages: 33
Journal: Mathematical Programming
Volume: 176
Issue number: 1-2
DOIs
State: Published - Jul 1 2019

All Science Journal Classification (ASJC) codes

  • Software
  • General Mathematics
