TY - GEN
T1 - On the statistical limits of convex relaxations
T2 - 33rd International Conference on Machine Learning, ICML 2016
AU - Wang, Zhaoran
AU - Gu, Quanquan
AU - Liu, Han
PY - 2016
Y1 - 2016
N2 - Many high-dimensional sparse learning problems are formulated as nonconvex optimization. A popular approach to solving these nonconvex optimization problems is through convex relaxations such as linear and semidefinite programming. In this paper, we study the statistical limits of convex relaxations. In particular, we consider two problems: mean estimation for a sparse principal submatrix and edge probability estimation for the stochastic block model. We exploit the sum-of-squares relaxation hierarchy to sharply characterize the limits of a broad class of convex relaxations. Our results show that statistical optimality must be compromised to achieve computational tractability using convex relaxations. Compared with existing results on computational lower bounds for statistical problems, which consider general polynomial-time algorithms and rely on computational hardness hypotheses for problems such as planted clique detection, our theory focuses on a broad class of convex relaxations and does not rely on unproven hypotheses.
AB - Many high-dimensional sparse learning problems are formulated as nonconvex optimization. A popular approach to solving these nonconvex optimization problems is through convex relaxations such as linear and semidefinite programming. In this paper, we study the statistical limits of convex relaxations. In particular, we consider two problems: mean estimation for a sparse principal submatrix and edge probability estimation for the stochastic block model. We exploit the sum-of-squares relaxation hierarchy to sharply characterize the limits of a broad class of convex relaxations. Our results show that statistical optimality must be compromised to achieve computational tractability using convex relaxations. Compared with existing results on computational lower bounds for statistical problems, which consider general polynomial-time algorithms and rely on computational hardness hypotheses for problems such as planted clique detection, our theory focuses on a broad class of convex relaxations and does not rely on unproven hypotheses.
UR - http://www.scopus.com/inward/record.url?scp=84998890724&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84998890724&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:84998890724
T3 - 33rd International Conference on Machine Learning, ICML 2016
SP - 2055
EP - 2067
BT - 33rd International Conference on Machine Learning, ICML 2016
A2 - Weinberger, Kilian Q.
A2 - Balcan, Maria Florina
PB - International Machine Learning Society (IMLS)
Y2 - 19 June 2016 through 24 June 2016
ER -