TY - JOUR

T1 - Gradient descent with random initialization

T2 - fast global convergence for nonconvex phase retrieval

AU - Chen, Yuxin

AU - Chi, Yuejie

AU - Fan, Jianqing

AU - Ma, Cong

N1 - Funding Information:
Acknowledgements Y. Chen is supported in part by the AFOSR YIP award FA9550-19-1-0030, by the ARO grant W911NF-18-1-0303, by the ONR grant N00014-19-1-2120, and by the Princeton SEAS innovation award. Y. Chi is supported in part by AFOSR under the grant FA9550-15-1-0205, by ONR under the grant N00014-18-1-2142, by ARO under the grant W911NF-18-1-0303, and by NSF under the grants CAREER ECCS-1818571 and CCF-1806154. J. Fan is supported in part by NSF grants DMS-1662139 and DMS-1712591 and NIH grant 2R01-GM072611-13.
Publisher Copyright:
© 2019, Springer-Verlag GmbH Germany, part of Springer Nature and Mathematical Optimization Society.

PY - 2019/7/1

Y1 - 2019/7/1

N2 - This paper considers the problem of solving systems of quadratic equations, namely, recovering an object of interest x^♮ ∈ R^n from m quadratic equations/samples y_i = (a_i^⊤ x^♮)^2, 1 ≤ i ≤ m. This problem, also dubbed as phase retrieval, spans multiple domains including physical sciences and machine learning. We investigate the efficacy of gradient descent (or Wirtinger flow) designed for the nonconvex least squares problem. We prove that under Gaussian designs, gradient descent—when randomly initialized—yields an ϵ-accurate solution in O(log n + log(1/ϵ)) iterations given nearly minimal samples, thus achieving near-optimal computational and sample complexities at once. This provides the first global convergence guarantee concerning vanilla gradient descent for phase retrieval, without the need of (i) carefully-designed initialization, (ii) sample splitting, or (iii) sophisticated saddle-point escaping schemes. All of these are achieved by exploiting the statistical models in analyzing optimization algorithms, via a leave-one-out approach that enables the decoupling of certain statistical dependency between the gradient descent iterates and the data.

AB - This paper considers the problem of solving systems of quadratic equations, namely, recovering an object of interest x^♮ ∈ R^n from m quadratic equations/samples y_i = (a_i^⊤ x^♮)^2, 1 ≤ i ≤ m. This problem, also dubbed as phase retrieval, spans multiple domains including physical sciences and machine learning. We investigate the efficacy of gradient descent (or Wirtinger flow) designed for the nonconvex least squares problem. We prove that under Gaussian designs, gradient descent—when randomly initialized—yields an ϵ-accurate solution in O(log n + log(1/ϵ)) iterations given nearly minimal samples, thus achieving near-optimal computational and sample complexities at once. This provides the first global convergence guarantee concerning vanilla gradient descent for phase retrieval, without the need of (i) carefully-designed initialization, (ii) sample splitting, or (iii) sophisticated saddle-point escaping schemes. All of these are achieved by exploiting the statistical models in analyzing optimization algorithms, via a leave-one-out approach that enables the decoupling of certain statistical dependency between the gradient descent iterates and the data.

UR - http://www.scopus.com/inward/record.url?scp=85061183633&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85061183633&partnerID=8YFLogxK

U2 - 10.1007/s10107-019-01363-6

DO - 10.1007/s10107-019-01363-6

M3 - Article

C2 - 33833473

AN - SCOPUS:85061183633

SN - 0025-5610

VL - 176

SP - 5

EP - 37

JO - Mathematical Programming

JF - Mathematical Programming

IS - 1-2

ER -