Nonconvex low-rank symmetric tensor completion from noisy data

Changxiao Cai, Gen Li, H. Vincent Poor, Yuxin Chen

Research output: Contribution to journal › Conference article › peer-review

32 Scopus citations


We study a completion problem of broad practical interest: the reconstruction of a low-rank symmetric tensor from highly incomplete and randomly corrupted observations of its entries. While a variety of prior work has been dedicated to this problem, existing algorithms either are computationally too expensive for large-scale applications, or come with sub-optimal statistical guarantees. Focusing on "incoherent" and well-conditioned tensors of a constant CP rank, we propose a two-stage nonconvex algorithm, (vanilla) gradient descent following a rough initialization, that achieves the best of both worlds. Specifically, the proposed nonconvex algorithm faithfully completes the tensor and retrieves individual tensor factors within nearly linear time, while at the same time enjoying near-optimal statistical guarantees (i.e., minimal sample complexity and optimal ℓ2 and ℓ∞ statistical accuracy). The insights conveyed through our analysis of nonconvex optimization might have implications for other tensor estimation problems.
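The abstract's two-stage recipe — a rough initialization followed by vanilla gradient descent on a symmetric CP factorization over the observed entries — can be illustrated with a small NumPy sketch. This is not the paper's algorithm: all sizes, the step size, and the sampling rate are illustrative choices, and the paper's tailored spectral initialization is replaced here by a perturbation of the ground truth simply to play the role of a "rough" starting point.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 15, 2                      # toy dimension and CP rank (illustrative)

# ground-truth symmetric rank-r tensor T = sum_s u_s (x) u_s (x) u_s
U_star = rng.standard_normal((d, r)) / np.sqrt(d)
T = np.einsum('is,js,ks->ijk', U_star, U_star, U_star)

p = 0.3                           # each entry observed independently w.p. p
mask = rng.random((d, d, d)) < p
Y = np.where(mask, T + 0.001 * rng.standard_normal((d, d, d)), 0.0)

def grad(U):
    """Gradient of f(U) = 0.5 * ||P_Omega(T_hat(U) - Y)||_F^2."""
    R = mask * (np.einsum('is,js,ks->ijk', U, U, U) - Y)  # residual on Omega
    # three terms from differentiating the symmetric CP map w.r.t. each mode
    return (np.einsum('ajk,js,ks->as', R, U, U)
            + np.einsum('iak,is,ks->as', R, U, U)
            + np.einsum('ija,is,js->as', R, U, U))

# stand-in for the paper's spectral initialization: a rough (~20%-off) start
U = U_star + 0.2 * rng.standard_normal((d, r)) / np.sqrt(d)

eta = 0.3                         # constant step size (heuristic)
for _ in range(800):
    U -= eta * grad(U)

T_hat = np.einsum('is,js,ks->ijk', U, U, U)
rel_err = np.linalg.norm(T_hat - T) / np.linalg.norm(T)
```

With roughly 1,000 observed entries against 30 free parameters, plain gradient descent from the rough start drives the relative recovery error down toward the noise floor; the paper's contribution is showing that this simple scheme succeeds with near-minimal samples and optimal entrywise accuracy.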

Original language: English (US)
Journal: Advances in Neural Information Processing Systems
State: Published - 2019
Event: 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019 - Vancouver, Canada
Duration: Dec 8, 2019 - Dec 14, 2019

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

