An ℓ∞ eigenvector perturbation bound and its application to robust covariance estimation

Jianqing Fan, Weichen Wang, Yiqiao Zhong

Research output: Contribution to journal › Article › peer-review

58 Scopus citations

Abstract

In statistics and machine learning, we are interested in the eigenvectors (or singular vectors) of certain matrices (e.g., covariance matrices, data matrices). However, those matrices are usually perturbed by noise or statistical errors, arising either from random sampling or from structural patterns. The Davis-Kahan sin θ theorem is often used to bound the difference between the eigenvectors of a matrix A and those of a perturbed matrix Ã = A + E, in terms of the ℓ₂ norm. In this paper, we prove that when A is a low-rank and incoherent matrix, the ℓ∞ norm perturbation bound of singular vectors (or eigenvectors in the symmetric case) is smaller by a factor of √d₁ or √d₂ for left and right vectors, where d₁ and d₂ are the matrix dimensions. The power of this new perturbation result is shown in robust covariance estimation, particularly when random variables have heavy tails. There, we propose new robust covariance estimators and establish their asymptotic properties using the newly developed perturbation bound. Our theoretical results are verified through extensive numerical experiments.
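As a rough illustration of the abstract's claim, the following minimal numerical sketch (not taken from the paper; the rank-one incoherent signal, eigengap 5, noise scale 0.02, and dimension d = 2000 are all illustrative assumptions) compares the ℓ₂ and ℓ∞ errors of the leading eigenvector of a low-rank incoherent matrix under symmetric Gaussian noise, alongside a Davis-Kahan-type ℓ₂ bound:

    import numpy as np

    rng = np.random.default_rng(0)
    d = 2000
    # Incoherent rank-one signal: every entry of u has magnitude 1/sqrt(d).
    u = rng.choice([-1.0, 1.0], size=d) / np.sqrt(d)
    A = 5.0 * np.outer(u, u)              # low-rank, incoherent, eigengap 5

    # Symmetric Gaussian noise perturbation E.
    G = rng.normal(scale=0.02, size=(d, d))
    E = (G + G.T) / 2.0
    A_tilde = A + E

    # Leading eigenvector of A is u itself; compute that of A_tilde.
    v = u
    w = np.linalg.eigh(A_tilde)[1][:, -1]
    w *= np.sign(w @ v)                   # resolve the sign ambiguity

    l2_err = np.linalg.norm(w - v)
    linf_err = np.max(np.abs(w - v))
    dk_bound = 2.0 * np.linalg.norm(E, 2) / 5.0  # Davis-Kahan-type l2 bound

    print(f"l2 error            : {l2_err:.4f}")
    print(f"l-inf error         : {linf_err:.4f}")
    print(f"l2 error / sqrt(d)  : {l2_err / np.sqrt(d):.6f}")
    print(f"Davis-Kahan l2 bound: {dk_bound:.4f}")

Under a setup of this kind, the ℓ∞ error should track the ℓ₂ error divided by roughly √d, which is the order of improvement the paper quantifies for incoherent low-rank matrices.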

Original language: English (US)
Pages (from-to): 1-42
Number of pages: 42
Journal: Journal of Machine Learning Research
Volume: 18
State: Published - Apr 1 2018
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence
  • Control and Systems Engineering
  • Statistics and Probability

Keywords

  • Approximate factor model
  • Incoherence
  • Low-rank matrices
  • Matrix perturbation theory
  • Sparsity
