Abstract
Consider a sample of a centered random vector with unit covariance matrix. We show that under certain regularity assumptions, and up to a natural scaling, the smallest and the largest eigenvalues of the empirical covariance matrix converge, when the dimension and the sample size both tend to infinity, to the left and right edges of the Marchenko–Pastur distribution. The assumptions are related to tails of norms of orthogonal projections. They cover isotropic log-concave random vectors as well as random vectors with i.i.d. coordinates under almost optimal moment conditions. The method is a refinement of the rank-one update approach used by Srivastava and Vershynin to produce non-asymptotic quantitative estimates. In other words, we provide a new proof of the Bai–Yin theorem using basic tools from probability theory and linear algebra, together with a new extension of this theorem to random matrices with dependent entries.
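As a hedged numerical illustration (not part of the paper), the limit stated in the abstract can be checked by simulation in the simplest i.i.d. Gaussian case: for a p × n matrix X with i.i.d. standard normal entries, the extreme eigenvalues of the empirical covariance matrix (1/n)XX^T should lie near the Marchenko–Pastur edges (1 − √y)² and (1 + √y)² with y = p/n. The dimensions p = 500 and n = 2000 below are arbitrary choices for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions; the theorem concerns p, n -> infinity with p/n -> y.
p, n = 500, 2000
y = p / n

# n centered i.i.d. standard Gaussian samples in R^p (unit covariance),
# the simplest case covered by the paper's moment conditions.
X = rng.standard_normal((p, n))

# Empirical covariance matrix with the natural 1/n scaling.
S = X @ X.T / n

# eigvalsh returns eigenvalues in ascending order.
eigvals = np.linalg.eigvalsh(S)
print("smallest eigenvalue:", eigvals[0])
print("largest eigenvalue: ", eigvals[-1])

# Marchenko-Pastur edges, the limits asserted by the Bai-Yin theorem.
print("MP left edge: ", (1 - np.sqrt(y)) ** 2)
print("MP right edge:", (1 + np.sqrt(y)) ** 2)
```

For these values (y = 0.25) the printed extremes should land close to the edges 0.25 and 2.25, with finite-size fluctuations shrinking as p and n grow.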
| Original language | English (US) |
|---|---|
| Pages (from-to) | 847-889 |
| Number of pages | 43 |
| Journal | Probability Theory and Related Fields |
| Volume | 170 |
| Issue number | 3-4 |
| DOIs | |
| State | Published - Apr 2018 |
All Science Journal Classification (ASJC) codes
- Analysis
- Statistics and Probability
- Statistics, Probability and Uncertainty
Keywords
- Convex body
- Covariance matrix
- Dependence
- Log-concave distribution
- Operator norm
- Random matrix
- Sherman–Morrison formula
- Singular value
- Thin-shell inequality