Factor Augmented Sparse Throughput Deep ReLU Neural Networks for High Dimensional Regression

Jianqing Fan, Yihong Gu

Research output: Contribution to journal › Article › peer-review



This article introduces a Factor Augmented Sparse Throughput (FAST) model that uses both latent factors and sparse idiosyncratic components for nonparametric regression, and it subsumes many popular statistical models. The FAST model bridges factor models on one end and sparse nonparametric models on the other. It encompasses structured nonparametric models such as factor augmented additive models and sparse low-dimensional nonparametric interaction models, and it covers cases where the covariates do not admit a factor structure. The model allows us to conduct high-dimensional nonparametric model selection for both strongly dependent and weakly dependent covariates and hence contributes to interpretable machine learning, particularly to feature selection for neural networks. Using diversified projections to estimate the latent factor space, we apply truncated deep ReLU networks to nonparametric factor regression without regularization and to the more general FAST model with nonconvex regularization, yielding the factor augmented regression neural network (FAR-NN) and FAST-NN estimators, respectively. We show that the FAR-NN and FAST-NN estimators adapt to the unknown low-dimensional structure of hierarchical composition models and attain nonasymptotic minimax rates. We also study statistical learning for the factor augmented sparse additive model using a more specific neural network architecture. Our results apply to weakly dependent cases without factor structures. In proving the main technical result for FAST-NN, we establish a new deep ReLU network approximation result that contributes to the foundations of neural network theory. Numerical studies further support our theory and methods. Supplementary materials for this article are available online.
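The two-step strategy the abstract describes — first estimate the latent factor space, then regress the response on the estimated factors with a ReLU network — can be sketched as follows. This is a toy illustration on simulated data, not the authors' implementation: plain PCA stands in for their diversified projections, the one-hidden-layer network and all dimensions, sample sizes, and learning rates are arbitrary illustrative choices, and no truncation or nonconvex regularization is applied.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a factor model: x_i = B f_i + u_i, y_i = m(f_i) + noise
n, p, r = 500, 50, 2                      # samples, covariates, latent factors
F = rng.normal(size=(n, r))               # latent factors (unobserved)
B = rng.normal(size=(p, r))               # factor loading matrix
X = F @ B.T + 0.1 * rng.normal(size=(n, p))
y = np.sin(F[:, 0]) + F[:, 1] ** 2 + 0.1 * rng.normal(size=n)

# Step 1: estimate the factor space from X. Top-r principal components are
# used here as a simple stand-in for the paper's diversified projections;
# the estimated factors recover the true ones only up to rotation.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
F_hat = X @ Vt[:r].T
F_hat /= F_hat.std(axis=0)                # standardize for stable training

# Step 2: fit a small one-hidden-layer ReLU network on the estimated factors
# with plain full-batch gradient descent on the mean squared error.
h, lr = 32, 1e-2                          # hidden width, step size (ad hoc)
W1 = 0.5 * rng.normal(size=(r, h))
b1 = np.zeros(h)
w2 = 0.5 * rng.normal(size=h)
b2 = 0.0
for _ in range(2000):
    Z = F_hat @ W1 + b1                   # pre-activations
    A = np.maximum(Z, 0.0)                # ReLU
    pred = A @ w2 + b2
    g = 2.0 * (pred - y) / n              # dL/dpred
    grad_w2 = A.T @ g
    grad_b2 = g.sum()
    gA = np.outer(g, w2) * (Z > 0)        # backprop through the ReLU
    grad_W1 = F_hat.T @ gA
    grad_b1 = gA.sum(axis=0)
    w2 -= lr * grad_w2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

mse = float(np.mean((pred - y) ** 2))
print(round(mse, 3))                      # in-sample MSE after training
```

Because the network is fit on estimated rather than true factors, any factor rotation introduced by the projection step is absorbed by the network, which is why the two-step estimator can still adapt to the low-dimensional structure.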

Original language: English (US)
Journal: Journal of the American Statistical Association
State: Accepted/In press - 2023
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty


Keywords

  • Approximability of ReLU network
  • Factor model
  • Hierarchical composition model
  • High-dimensional nonparametric model selection
  • Minimax optimal rates


