Noisy matrix completion: Understanding statistical guarantees for convex relaxation via nonconvex optimization

Yuxin Chen, Yuejie Chi, Jianqing Fan, Cong Ma, Yuling Yan

Research output: Contribution to journal › Article › peer-review

Abstract

This paper studies noisy low-rank matrix completion: given partial and noisy entries of a large low-rank matrix, the goal is to estimate the underlying matrix faithfully and efficiently. Arguably one of the most popular paradigms to tackle this problem is convex relaxation, which achieves remarkable efficacy in practice. However, the theoretical support of this approach is still far from optimal in the noisy setting, falling short of explaining its empirical success. We make progress towards demystifying the practical efficacy of convex relaxation vis-à-vis random noise. When the rank and the condition number of the unknown matrix are bounded by a constant, we demonstrate that the convex programming approach achieves near-optimal estimation errors (in terms of the Euclidean loss, the entrywise loss, and the spectral norm loss) for a wide range of noise levels. All of this is enabled by bridging convex relaxation with the nonconvex Burer-Monteiro approach, a seemingly distinct algorithmic paradigm that is provably robust against noise. More specifically, we show that an approximate critical point of the nonconvex formulation serves as an extremely tight approximation of the convex solution, thus allowing us to transfer the desired statistical guarantees of the nonconvex approach to its convex counterpart.
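The bridge described in the abstract rests on the nonconvex Burer-Monteiro formulation, which replaces the unknown matrix by a product of two low-rank factors and minimizes a least-squares loss over the observed entries. The following numpy sketch illustrates that formulation with plain gradient descent; it is an illustration only, not the paper's exact algorithm, and the problem sizes, sampling rate, noise level, step size, and warm start are all assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 50, 2  # illustrative sizes; the paper's theory treats rank r = O(1)

# Ground-truth rank-r matrix M* = U* V*^T
U_star = rng.standard_normal((n, r))
V_star = rng.standard_normal((n, r))
M_star = U_star @ V_star.T

# Observe each entry independently with probability p, corrupted by noise
p, sigma = 0.5, 0.01
mask = rng.random((n, n)) < p
M_obs = M_star + sigma * rng.standard_normal((n, n))

# Burer-Monteiro: optimize over the factors (X, Y) by gradient descent on
# f(X, Y) = (1/2) * || P_Omega(X Y^T - M_obs) ||_F^2,
# where P_Omega keeps only the observed entries.
X = U_star + 0.1 * rng.standard_normal((n, r))  # warm start near the truth
Y = V_star + 0.1 * rng.standard_normal((n, r))
eta = 0.2 / (p * n)  # small step size, scaled by the sampling rate and size
for _ in range(500):
    resid = mask * (X @ Y.T - M_obs)  # residual on observed entries only
    X, Y = X - eta * (resid @ Y), Y - eta * (resid.T @ X)

# Relative estimation error of the recovered low-rank matrix
rel_err = np.linalg.norm(X @ Y.T - M_star) / np.linalg.norm(M_star)
```

An (approximate) critical point of this nonconvex objective is the object the paper connects to the convex (nuclear-norm-regularized) solution; the sketch above merely shows how such a point is computed in practice.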

Original language: English (US)
Pages (from-to): 3098-3121
Number of pages: 24
Journal: SIAM Journal on Optimization
Volume: 30
Issue number: 4
DOIs
State: Published - 2020

All Science Journal Classification (ASJC) codes

  • Software
  • Theoretical Computer Science

Keywords

  • Burer-Monteiro approach
  • Convex relaxation
  • Matrix completion
  • Minimaxity
  • Nonconvex optimization
  • Stability

