Large Scale Prediction with Decision Trees

Jason M. Klusowski, Peter M. Tian

Research output: Contribution to journal › Article › peer-review


This article shows that decision trees constructed with Classification and Regression Trees (CART) and C4.5 methodology are consistent for regression and classification tasks, even when the number of predictor variables grows sub-exponentially with the sample size, under natural 0-norm and 1-norm sparsity constraints. The theory applies to a wide range of models, including (ordinary or logistic) additive regression models with component functions that are continuous, of bounded variation, or, more generally, Borel measurable. Consistency holds for arbitrary joint distributions of the predictor variables, thereby accommodating continuous, discrete, and/or dependent data. Finally, we show that these qualitative properties of individual trees are inherited by Breiman’s random forests. A key step in the analysis is the establishment of an oracle inequality, which allows for a precise characterization of the goodness of fit and complexity tradeoff for a mis-specified model. Supplementary materials for this article are available online.
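The setting the abstract describes — regression with many predictors but a sparse additive signal — can be illustrated with off-the-shelf CART-style trees and random forests. The sketch below is not from the article; it simply fits scikit-learn's `DecisionTreeRegressor` and `RandomForestRegressor` to simulated data in which only 3 of 200 predictors carry signal, the regime where the paper's consistency theory applies. The sample size, dimension, and component functions are arbitrary choices for illustration.

```python
# Illustrative sketch (not the paper's method): CART trees and a random forest
# on a sparse additive regression model with many irrelevant predictors.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n, p = 2000, 200  # many predictors relative to the signal dimension

X = rng.uniform(-1.0, 1.0, size=(n, p))
# Sparse additive signal: only the first three coordinates matter.
signal = np.sin(np.pi * X[:, 0]) + np.abs(X[:, 1]) + X[:, 2] ** 2
y = signal + rng.normal(scale=0.1, size=n)

# A single CART-style tree (greedy axis-aligned splits, as in the article).
tree = DecisionTreeRegressor(min_samples_leaf=20, random_state=0).fit(X, y)

# Breiman's random forest: an ensemble of such trees on bootstrap samples.
forest = RandomForestRegressor(
    n_estimators=100, min_samples_leaf=5, random_state=0, n_jobs=-1
).fit(X, y)

# Evaluate on fresh data from the same distribution (noiseless targets).
X_test = rng.uniform(-1.0, 1.0, size=(1000, p))
y_test = (np.sin(np.pi * X_test[:, 0]) + np.abs(X_test[:, 1])
          + X_test[:, 2] ** 2)
print("single tree R^2:  ", tree.score(X_test, y_test))
print("random forest R^2:", forest.score(X_test, y_test))
```

Despite the 197 noise coordinates, the greedy splitting criterion tends to select the informative coordinates, and the forest typically improves on the single tree, consistent with the abstract's claim that the trees' qualitative properties are inherited by random forests.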

Original language: English (US)
Pages (from-to): 525-537
Number of pages: 13
Journal: Journal of the American Statistical Association
Issue number: 545
State: Published - 2024
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty


Keywords

  • C4.5
  • CART
  • Interpretable machine learning
  • Oracle inequality
  • Random forests

