## Abstract

This paper addresses two fundamental challenges arising in eigenvector estimation and inference for a low-rank matrix observed with noise: (1) how to estimate an unknown eigenvector when the eigen-gap (i.e., the spacing between the associated eigenvalue and the rest of the spectrum) is particularly small; (2) how to perform estimation and inference on linear functionals of an eigenvector, a sort of "fine-grained" statistical reasoning that goes far beyond the usual $\ell_2$ analysis. We investigate these challenges in a setting where the unknown $n \times n$ matrix is symmetric and the additive noise matrix contains independent (and non-symmetric) entries. Based on the eigen-decomposition of the asymmetric data matrix, we propose estimation and uncertainty quantification procedures for an unknown eigenvector, which further allow us to reason about linear functionals of that eigenvector. The proposed procedures and the accompanying theory enjoy several important features: (1) they are distribution-free (i.e., no prior knowledge of the noise distributions is needed); (2) they adapt to heteroscedastic noise; (3) they are minimax optimal under Gaussian noise. Along the way, we establish valid procedures for constructing confidence intervals for the unknown eigenvalues. All of this is guaranteed even in the presence of a small eigen-gap (up to $O(\sqrt{n/\mathrm{poly}\log(n)}\,)$ times smaller than the requirement in prior theory), which goes significantly beyond what generic matrix perturbation theory has to offer.
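The central device described above, eigen-decomposing the raw asymmetric data matrix rather than a symmetrized version, can be illustrated with a minimal numerical sketch. All parameters below (rank-1 truth, the particular noise scale, sample size) are hypothetical choices for illustration, not values taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: rank-1 symmetric truth M = lam * u u^T.
n, lam, sigma = 300, 30.0, 1.0
u = rng.standard_normal(n)
u /= np.linalg.norm(u)
M = lam * np.outer(u, u)

# Additive noise with independent (non-symmetrized) entries,
# scaled so its spectral norm stays O(sigma).
H = (sigma / np.sqrt(n)) * rng.standard_normal((n, n))
A = M + H  # observed asymmetric data matrix

# Eigen-decompose the asymmetric matrix directly (no symmetrization).
vals, vecs = np.linalg.eig(A)
k = int(np.argmax(vals.real))      # leading eigenvalue: largest real part
u_hat = np.real(vecs[:, k])
u_hat /= np.linalg.norm(u_hat)

corr = abs(u_hat @ u)              # alignment with the true eigenvector
lam_hat = vals[k].real             # eigenvalue estimate
```

In this toy regime the leading eigenvector of `A` aligns closely with `u`, and `lam_hat` lands near `lam`; the paper's contribution is the fine-grained theory (linear functionals, confidence intervals, small eigen-gaps) behind this phenomenon, which the sketch does not attempt to reproduce.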

| Original language | English (US) |
|---|---|
| Pages (from-to) | 7380-7419 |
| Number of pages | 40 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 67 |
| Issue number | 11 |
| State | Published - Nov 1 2021 |

## All Science Journal Classification (ASJC) codes

- Information Systems
- Computer Science Applications
- Library and Information Sciences

## Keywords

- Eigen-gap
- confidence interval
- heteroscedasticity
- linear form of eigenvectors
- uncertainty quantification