TY - JOUR
T1 - Non-convex statistical optimization for sparse tensor graphical model
AU - Sun, Wei
AU - Wang, Zhaoran
AU - Liu, Han
AU - Cheng, Guang
N1 - Funding Information:
We would like to thank the anonymous reviewers for their helpful comments. Han Liu is grateful for the support of NSF CAREER Award DMS1454377, NSF IIS1408910, NSF IIS1332109, NIH R01MH102339, NIH R01GM083084, and NIH R01HG06841. Guang Cheng's research is sponsored by NSF CAREER Award DMS1151692, NSF DMS1418042, a Simons Fellowship in Mathematics, ONR N00014-15-1-2331, and a grant from the Indiana Clinical and Translational Sciences Institute.
PY - 2015
Y1 - 2015
AB - We consider the estimation of sparse graphical models that characterize the dependency structure of high-dimensional tensor-valued data. To facilitate the estimation of the precision matrix corresponding to each way of the tensor, we assume the data follow a tensor normal distribution whose covariance has a Kronecker product structure. The penalized maximum likelihood estimation of this model involves minimizing a non-convex objective function. Despite the non-convexity of this estimation problem, we prove that an alternating minimization algorithm, which iteratively estimates each sparse precision matrix while fixing the others, attains an estimator with the optimal statistical rate of convergence as well as consistent graph recovery. Notably, such an estimator achieves estimation consistency with only one tensor sample, a guarantee not established in previous work. Our theoretical results are backed by thorough numerical studies.
UR - http://www.scopus.com/inward/record.url?scp=84965106699&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84965106699&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:84965106699
SN - 1049-5258
VL - 2015-January
SP - 1081
EP - 1089
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
T2 - 29th Annual Conference on Neural Information Processing Systems, NIPS 2015
Y2 - 7 December 2015 through 12 December 2015
ER -
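
The abstract describes an alternating minimization scheme that updates one sparse precision matrix per tensor mode while holding the others fixed. Below is a minimal sketch of that idea, assuming NumPy/SciPy/scikit-learn; the use of graphical_lasso as the per-mode sub-solver, the mode-k covariance normalization, and all function and variable names are illustrative assumptions, not the authors' implementation.

    # Sketch: alternating sparse precision estimation across tensor modes.
    import numpy as np
    from scipy.linalg import sqrtm
    from sklearn.covariance import graphical_lasso

    def mode_unfold(T, k):
        """Mode-k matricization: rows index mode k, columns index the rest."""
        return np.moveaxis(T, k, 0).reshape(T.shape[k], -1)

    def mode_multiply(T, M, k):
        """Multiply tensor T along mode k by the matrix M."""
        Tk = np.moveaxis(T, k, 0)
        out = np.tensordot(M, Tk, axes=(1, 0))
        return np.moveaxis(out, 0, k)

    def alternating_glasso(samples, alphas, n_iter=5):
        """samples: array of shape (n, m_1, ..., m_K); alphas: per-mode penalties."""
        n, *dims = samples.shape
        K = len(dims)
        Omegas = [np.eye(m) for m in dims]          # start from identity precisions
        for _ in range(n_iter):
            for k in range(K):
                # Whiten every mode except k with the current precision estimates,
                # then form a mode-k sample covariance from the unfolded tensors.
                S = np.zeros((dims[k], dims[k]))
                for X in samples:
                    V = X
                    for j in range(K):
                        if j != k:
                            V = mode_multiply(V, np.real(sqrtm(Omegas[j])), j)
                    Vk = mode_unfold(V, k)
                    S += Vk @ Vk.T
                S *= dims[k] / (n * np.prod(dims))
                # Sparse precision update for mode k with the others held fixed.
                _, Omegas[k] = graphical_lasso(S, alpha=alphas[k])
        return Omegas

For example, alternating_glasso(np.random.randn(10, 5, 6, 4), alphas=[0.1, 0.1, 0.1]) would return three sparse precision estimates, one per mode; the identity initialization and the fixed iteration count are arbitrary choices for this sketch.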