Nonconvex Low-Rank Tensor Completion from Noisy Data

Changxiao Cai, Gen Li, H. Vincent Poor, Yuxin Chen

Research output: Contribution to journal › Article › peer-review

14 Scopus citations


We study a noisy tensor completion problem of broad practical interest, namely, the reconstruction of a low-rank tensor from highly incomplete and randomly corrupted observations of its entries. While a variety of prior work has been dedicated to this problem, existing algorithms either are computationally too expensive for large-scale applications or come with suboptimal statistical guarantees. Focusing on "incoherent" and well-conditioned tensors of a constant canonical polyadic rank, we propose a two-stage nonconvex algorithm—(vanilla) gradient descent following a rough initialization—that achieves the best of both worlds. Specifically, the proposed nonconvex algorithm faithfully completes the tensor and retrieves all individual tensor factors within nearly linear time, while at the same time enjoying near-optimal statistical guarantees (i.e., minimal sample complexity and optimal estimation accuracy). The estimation errors are evenly spread out across all entries, thus achieving optimal ℓ∞ statistical accuracy. We also discuss how to extend our approach to accommodate asymmetric tensors. The insight conveyed through our analysis of nonconvex optimization might have implications for other tensor estimation problems.
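The two-stage procedure described in the abstract—a rough initialization followed by (vanilla) gradient descent over the observed entries—can be sketched as follows for a symmetric order-3 tensor of constant CP rank. This is a minimal illustrative sketch, not the authors' implementation: the function names, the inverse-probability rescaling, and the crude sign/scale correction in the initialization are all assumptions made for the example.

```python
import numpy as np

def cp_tensor(U):
    """Symmetric order-3 CP tensor sum_r u_r ⊗ u_r ⊗ u_r from a d×r factor matrix."""
    return np.einsum('ir,jr,kr->ijk', U, U, U)

def spectral_init(T_obs, mask, r):
    """Rough initialization (illustrative): top-r eigenvectors of the Gram matrix
    of the rescaled mode-1 unfolding, with a crude sign/scale correction."""
    d = T_obs.shape[0]
    p = mask.mean()                       # empirical sampling rate
    T_scaled = T_obs * mask / p           # inverse-probability rescaling
    M = T_scaled.reshape(d, -1)           # mode-1 unfolding, d × d^2
    _, V = np.linalg.eigh(M @ M.T)        # eigenvalues ascending
    U0 = np.empty((d, r))
    for j in range(r):
        v = V[:, -(j + 1)]                # j-th leading eigenvector
        lam = np.einsum('ijk,i,j,k->', T_scaled, v, v, v)  # rough CP weight
        U0[:, j] = np.sign(lam) * abs(lam) ** (1.0 / 3.0) * v
    return U0

def gradient_descent(T_obs, mask, U0, eta=0.05, iters=300):
    """Vanilla gradient descent on the squared loss over observed entries."""
    U = U0.copy()
    for _ in range(iters):
        R = mask * (cp_tensor(U) - T_obs)             # residual on observed entries
        G = (np.einsum('ajk,jr,kr->ar', R, U, U)      # gradient: three symmetric terms
             + np.einsum('iak,ir,kr->ar', R, U, U)
             + np.einsum('ija,ir,jr->ar', R, U, U))
        U -= eta * G
    return U
```

On a small synthetic instance (random unit-norm factors, half the entries observed), the initialization lands in a neighborhood of the true factors and the gradient iterations drive the observed-entry loss down; step size and iteration count here are ad hoc choices for the sketch, not the theoretically prescribed ones.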

Original language: English (US)
Pages (from-to): 1219-1237
Number of pages: 19
Journal: Operations Research
Issue number: 2
State: Published - Mar 1 2022
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Management Science and Operations Research


Keywords

  • entrywise statistical guarantees
  • gradient descent
  • minimaxity
  • nonconvex optimization
  • spectral methods
  • tensor completion


