CANONICAL THRESHOLDING FOR NONSPARSE HIGH-DIMENSIONAL LINEAR REGRESSION

Igor Silin, Jianqing Fan

Research output: Contribution to journal › Article › peer-review

Abstract

We consider a high-dimensional linear regression problem. Unlike many papers on the topic, we do not require sparsity of the regression coefficients; instead, our main structural assumption is a decay of the eigenvalues of the covariance matrix of the data. We propose a new family of estimators, called canonical thresholding estimators, which pick the largest regression coefficients in the canonical form. The estimators admit an explicit form and can be linked to LASSO and Principal Component Regression (PCR). A theoretical analysis is provided for both fixed-design and random-design settings. The obtained bounds on the mean squared error and the prediction error of a specific estimator from the family allow us to state clearly sufficient conditions on the decay of eigenvalues that ensure convergence. In addition, we promote the use of relative errors, which are strongly linked with the out-of-sample R². The study of these relative errors leads to a new concept of joint effective dimension, which incorporates the covariance of the data and the regression coefficients simultaneously and describes the complexity of a linear regression problem. Some minimax lower bounds are established to showcase the optimality of our procedure. Numerical simulations confirm the good performance of the proposed estimators compared to previously developed methods.
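To give a rough sense of the idea the abstract describes, the following is a hypothetical sketch, not the paper's exact estimator: rotate the design into the eigenbasis ("canonical form") of the sample covariance, estimate per-component coefficients as in PCR, and keep only components whose scaled coefficients exceed a threshold. The threshold level `tau`, the scaling, and the simulated decay profile are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 50

# Key structural assumption from the abstract: decaying covariance eigenvalues.
eigvals = 1.0 / np.arange(1, p + 1) ** 2
X = rng.standard_normal((n, p)) * np.sqrt(eigvals)  # diagonal covariance for simplicity
beta = rng.standard_normal(p)                        # dense (nonsparse) coefficients
y = X @ beta + 0.1 * rng.standard_normal(n)

# Rotate into the eigenbasis of the sample covariance (the "canonical form").
S = X.T @ X / n
w, U = np.linalg.eigh(S)
order = np.argsort(w)[::-1]          # sort eigenvalues in decreasing order
w, U = w[order], U[:, order]
Z = X @ U                             # principal-component scores

# Per-component least-squares coefficients in the canonical basis, as in PCR.
theta = (Z.T @ y) / (n * w)

# Thresholding step (illustrative): keep components with large scaled coefficients,
# rather than the top-k components a vanilla PCR would keep.
tau = 0.05                            # hypothetical threshold level
keep = np.abs(theta) * np.sqrt(w) > tau
beta_hat = U @ np.where(keep, theta, 0.0)
```

Unlike PCR's hard cutoff at a fixed number of components, this kind of rule can retain a deep component whose coefficient happens to be large while discarding a leading component that carries little signal.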

Original language: English (US)
Pages (from-to): 460-486
Number of pages: 27
Journal: Annals of Statistics
Volume: 50
Issue number: 1
DOIs
State: Published - Feb 2022
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Keywords

  • Covariance eigenvalues decay
  • High-dimensional linear regression
  • Principal component regression
  • Relative errors
  • Thresholding
