TY - JOUR
T1 - Spectral Methods for Data Science
T2 - A Statistical Perspective
AU - Chen, Yuxin
AU - Chi, Yuejie
AU - Fan, Jianqing
AU - Ma, Cong
N1 - Funding Information:
The authors thank the Editor-in-Chief Prof. Michael Jordan for his encouragement, and the publisher Mike Casey for his editorial help. We are deeply indebted to our wonderful collaborators who have contributed significantly to, and helped shape our perspectives on, the materials presented herein, including Emmanuel Abbe, Changxiao Cai, Emmanuel Candès, Yanxi Chen, Chen Cheng, Yonina Eldar, Yingying Fan, Haoyu Fu, Andrea Goldsmith, Leonidas Guibas, Qixing Huang, Govinda Kamath, Tracy Ke, Gen Li, Yuanxin Li, Yingbin Liang, Yuan Liao, Junwei Lu, Yue Lu, Jinchi Lv, Vincent Monardo, H. Vincent Poor, Changho Suh, Tian Tong, David Tse, Bingyan Wang, Kaizheng Wang, Weichen Wang, Yuting Wei, Yuling Yan, Zhuoran Yang, Huishuai Zhang, Yuchen Zhou, Yiqiao Zhong, and Ziwei Zhu. We owe our particular gratitude to Yuling Yan, who has generously helped with most materials presented in Sections 4.6-4.7. We also thank Bingyan Wang and Chen Dan for their helpful comments about an early version of this monograph, Changxiao Cai for his help in producing Figure 3.5, and Kaizheng Wang for his help in generating Figure 4.2. We gratefully acknowledge the generous financial support of multiple agencies. More specifically, Y. Chen acknowledges the support by the AFOSR YIP award FA9550-19-1-0030, the ONR grant N00014-19-1-2120, the ARO YIP award W911NF-20-1-0097, the ARO grant W911NF-18-1-0303, the NSF grants CCF-1907661, IIS-1900140, IIS-2100158 and DMS-2014279, and the Princeton SEAS innovation award; Y. Chi has been supported in part by the ONR under the grants N00014-18-1-2142 and N00014-19-1-2404, by the ARO under the grant W911NF-18-1-0303, and by the NSF under the grants CAREER ECCS-1818571, CCF-1901199, CCF-1806154, CCF-2007911, CCF-2106778 and ECCS-2126634; and J. Fan has been supported in part by the ONR grant N00014-19-1-2120, the NSF grants DMS-1662139, DMS-1712591, DMS-2053832, DMS-2052926, and the NIH grant R01-GM072611, and the Princeton SEAS innovation award.
Part of this work was done while Y. Chen was visiting the Simons Institute for the Theory of Computing. Last but not least, this work would not have come into existence without the continuing support of our families, especially during the difficult time of the COVID-19 pandemic, when this monograph was completed. Y. Chen thanks Yuting Wei for bringing love and encouragement every day during the writing of this monograph. Y. Chi is deeply grateful to her parents, husband, and daughter for being the silver lining in the pandemic. J. Fan gratefully enjoys the company of his wife and daughters and thanks them for their compassionate support. C. Ma thanks Xinyi Liu for her unfailing support, and Pidan the Cat for bringing surprises and joy every day. This monograph is dedicated to them.
Publisher Copyright:
© 2021 Now Publishers Inc. All rights reserved.
PY - 2021/10/21
Y1 - 2021/10/21
N2 - Spectral methods have emerged as a simple yet surprisingly effective approach for extracting information from massive, noisy and incomplete data. In a nutshell, spectral methods refer to a collection of algorithms built upon the eigenvalues (resp. singular values) and eigenvectors (resp. singular vectors) of some properly designed matrices constructed from data. A diverse array of applications has been found in machine learning, imaging science, financial and econometric modeling, and signal processing, including recommendation systems, community detection, ranking, structured matrix recovery, tensor data estimation, joint shape matching, blind deconvolution, financial investments, risk management, treatment evaluations, and causal inference, amongst others. Due to their simplicity and effectiveness, spectral methods are not only used as stand-alone estimators, but are also frequently employed to facilitate other more sophisticated algorithms to enhance performance. While the study of spectral methods can be traced back to classical matrix perturbation theory and the method of moments, the past decade has witnessed tremendous theoretical advances in demystifying their efficacy through the lens of statistical modeling, with the aid of concentration inequalities and non-asymptotic random matrix theory. This monograph aims to present a systematic, comprehensive, yet accessible introduction to spectral methods from a modern statistical perspective, highlighting their algorithmic implications in diverse large-scale applications. In particular, our exposition gravitates around several central questions that span various applications: how to characterize the sample efficiency of spectral methods in reaching a target level of statistical accuracy, and how to assess their stability in the face of random noise, missing data, and adversarial corruptions?
In addition to conventional ℓ2 perturbation analysis, we present a systematic ℓ∞ and ℓ2,∞ perturbation theory for eigenspaces and singular subspaces, which has only recently become available owing to a powerful "leave-one-out" analysis framework.
AB - Spectral methods have emerged as a simple yet surprisingly effective approach for extracting information from massive, noisy and incomplete data. In a nutshell, spectral methods refer to a collection of algorithms built upon the eigenvalues (resp. singular values) and eigenvectors (resp. singular vectors) of some properly designed matrices constructed from data. A diverse array of applications has been found in machine learning, imaging science, financial and econometric modeling, and signal processing, including recommendation systems, community detection, ranking, structured matrix recovery, tensor data estimation, joint shape matching, blind deconvolution, financial investments, risk management, treatment evaluations, and causal inference, amongst others. Due to their simplicity and effectiveness, spectral methods are not only used as stand-alone estimators, but are also frequently employed to facilitate other more sophisticated algorithms to enhance performance. While the study of spectral methods can be traced back to classical matrix perturbation theory and the method of moments, the past decade has witnessed tremendous theoretical advances in demystifying their efficacy through the lens of statistical modeling, with the aid of concentration inequalities and non-asymptotic random matrix theory. This monograph aims to present a systematic, comprehensive, yet accessible introduction to spectral methods from a modern statistical perspective, highlighting their algorithmic implications in diverse large-scale applications. In particular, our exposition gravitates around several central questions that span various applications: how to characterize the sample efficiency of spectral methods in reaching a target level of statistical accuracy, and how to assess their stability in the face of random noise, missing data, and adversarial corruptions?
In addition to conventional ℓ2 perturbation analysis, we present a systematic ℓ∞ and ℓ2,∞ perturbation theory for eigenspaces and singular subspaces, which has only recently become available owing to a powerful "leave-one-out" analysis framework.
UR - http://www.scopus.com/inward/record.url?scp=85130231493&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85130231493&partnerID=8YFLogxK
U2 - 10.1561/2200000079
DO - 10.1561/2200000079
M3 - Article
AN - SCOPUS:85130231493
SN - 1935-8237
VL - 14
SP - 566
EP - 806
JO - Foundations and Trends in Machine Learning
JF - Foundations and Trends in Machine Learning
IS - 5
ER -