Statistical Inference for High-Dimensional Matrix-Variate Factor Models

Elynn Y. Chen, Jianqing Fan

Research output: Contribution to journal › Article › peer-review

Abstract

This article considers the estimation and inference of the low-rank components in high-dimensional matrix-variate factor models, where each dimension of the matrix-variates (p × q) is comparable to or greater than the number of observations (T). We propose an estimation method called α-PCA that preserves the matrix structure and aggregates the mean and contemporary covariance through a hyper-parameter α. We develop an inferential theory, establishing consistency, rates of convergence, and limiting distributions under general conditions that allow for correlations across time, rows, or columns of the noise. We provide both theoretical and empirical guidance for choosing the best α, depending on the use-case criteria. Simulation results demonstrate that the asymptotic results adequately approximate the finite-sample properties, and α-PCA compares favorably with existing methods. Finally, we illustrate its applications with a real numeric dataset and two real image datasets. In all applications, the proposed estimation procedure outperforms previous methods in explained variance under out-of-sample 10-fold cross-validation. Supplementary materials for this article are available online.
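As a rough illustration of the aggregation idea described above, the sketch below combines the sample mean and the contemporary covariance of matrix-variate observations through a weight (1 + α) and extracts leading eigenvectors as estimated row loadings. The function name, the exact normalization by pq, and the √p scaling are assumptions for illustration, not the authors' exact estimator; consult the article for the precise definition.

```python
import numpy as np

def alpha_pca_row_loadings(X, k, alpha=0.0):
    """Hypothetical sketch of an alpha-PCA-style row-loading estimate.

    X: array of shape (T, p, q) holding T matrix observations.
    k: number of row factors to extract.
    alpha: hyper-parameter weighting the mean term.
    """
    T, p, q = X.shape
    Xbar = X.mean(axis=0)  # sample mean matrix (p x q)
    # Contemporary covariance aggregated over columns of each observation.
    cov = sum((Xt - Xbar) @ (Xt - Xbar).T for Xt in X) / T
    # Weighted aggregation of mean outer product and covariance
    # (normalization by p*q is an assumed convention here).
    M = ((1.0 + alpha) * Xbar @ Xbar.T + cov) / (p * q)
    eigvals, eigvecs = np.linalg.eigh(M)  # eigenvalues in ascending order
    # Top-k eigenvectors, scaled by sqrt(p) per a common factor-model convention.
    return np.sqrt(p) * eigvecs[:, ::-1][:, :k]

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 8, 6))  # T = 50 observations of 8 x 6 matrices
R_hat = alpha_pca_row_loadings(X, k=2, alpha=0.5)
print(R_hat.shape)  # (8, 2)
```

Column loadings would be obtained symmetrically by applying the same construction to the transposed observations.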

Original language: English (US)
Journal: Journal of the American Statistical Association
DOIs
State: Accepted/In press - 2021

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Keywords

  • Asymptotic normality
  • Factor models
  • High-dimension
  • Latent low rank
  • Matrix-variate

