TY - JOUR
T1 - Nonconvex low-rank symmetric tensor completion from noisy data
AU - Cai, Changxiao
AU - Li, Gen
AU - Poor, H. Vincent
AU - Chen, Yuxin
N1 - Funding Information:
Y. Chen is supported in part by the AFOSR YIP award FA9550-19-1-0030, by the ONR grant N00014-19-1-2120, by the ARO grant W911NF-18-1-0303, by the NSF grants CCF-1907661 and IIS-1900140. H. V. Poor is supported in part by the NSF grant DMS-1736417.
Publisher Copyright:
© 2019 Neural Information Processing Systems Foundation. All rights reserved.
PY - 2019
Y1 - 2019
N2 - We study a completion problem of broad practical interest: the reconstruction of a low-rank symmetric tensor from highly incomplete and randomly corrupted observations of its entries. While a variety of prior work has been dedicated to this problem, existing algorithms are either computationally too expensive for large-scale applications or come with sub-optimal statistical guarantees. Focusing on “incoherent” and well-conditioned tensors of a constant CP rank, we propose a two-stage nonconvex algorithm - (vanilla) gradient descent following a rough initialization - that achieves the best of both worlds. Specifically, the proposed nonconvex algorithm faithfully completes the tensor and retrieves individual tensor factors within nearly linear time, while at the same time enjoying near-optimal statistical guarantees (i.e., minimal sample complexity and optimal ℓ2 and ℓ∞ statistical accuracy). The insights conveyed through our analysis of nonconvex optimization might have implications for other tensor estimation problems.
AB - We study a completion problem of broad practical interest: the reconstruction of a low-rank symmetric tensor from highly incomplete and randomly corrupted observations of its entries. While a variety of prior work has been dedicated to this problem, existing algorithms are either computationally too expensive for large-scale applications or come with sub-optimal statistical guarantees. Focusing on “incoherent” and well-conditioned tensors of a constant CP rank, we propose a two-stage nonconvex algorithm - (vanilla) gradient descent following a rough initialization - that achieves the best of both worlds. Specifically, the proposed nonconvex algorithm faithfully completes the tensor and retrieves individual tensor factors within nearly linear time, while at the same time enjoying near-optimal statistical guarantees (i.e., minimal sample complexity and optimal ℓ2 and ℓ∞ statistical accuracy). The insights conveyed through our analysis of nonconvex optimization might have implications for other tensor estimation problems.
UR - http://www.scopus.com/inward/record.url?scp=85077760042&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85077760042&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85077760042
SN - 1049-5258
VL - 32
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
T2 - 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019
Y2 - 8 December 2019 through 14 December 2019
ER -