Abstract
This paper studies the statistical convergence and consistency of regularized boosting methods when the samples need not be independent and identically distributed but may instead come from stationary, weakly dependent sequences. Consistency is proven for the composite classifiers obtained by regularizing through a restriction on the 1-norm of the base classifiers' weights. The less restrictive sampling assumptions are reflected in the consistency result through a generalized condition on the growth of the regularization parameter: the weaker the sample dependence, the faster the regularization parameter may grow with the sample size. A consistency result is also provided for data-dependent choices of the regularization parameter.
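To make the setup concrete, the following is a minimal Python sketch of boosting under a 1-norm constraint of the kind the abstract describes. It is not the paper's algorithm: the exponential loss, the stump base class, the function names (`l1_constrained_boosting`, `lam_schedule`), and the growth exponent `gamma` are all illustrative assumptions. The paper's actual condition ties the admissible growth of the budget with the sample size to the decay rate of the beta-mixing coefficients; the schedule below only gestures at that dependence.

```python
import numpy as np

def stump(feature, threshold):
    """Factory for a decision stump with outputs in {-1, +1}."""
    return lambda X: np.where(X[:, feature] > threshold, 1.0, -1.0)

def l1_constrained_boosting(X, y, learners, lam, step=0.05, max_rounds=1000):
    """Greedy boosting on the exponential loss, stopped before the
    1-norm of the combination weights can exceed the budget lam."""
    preds = np.array([h(X) for h in learners])  # (m, n) base predictions
    f = np.zeros(len(y))                        # combined score on the sample
    w = np.zeros(len(learners))                 # combination weights

    for _ in range(max_rounds):
        if np.abs(w).sum() + step > lam:
            break                               # 1-norm budget lam exhausted
        grad = -y * np.exp(-y * f)              # gradient of the exponential loss
        scores = preds @ grad                   # descent value of each learner
        j = int(np.argmax(np.abs(scores)))
        if scores[j] == 0:
            break                               # no descent direction left
        s = -np.sign(scores[j])                 # signed coordinate step
        w[j] += s * step
        f += s * step * preds[j]
    return w

# Hypothetical schedule lam_n = n**gamma: the slower the beta-mixing
# coefficients decay (i.e., the stronger the dependence), the smaller
# gamma must be; gamma = 0.1 is a placeholder, not a rate from the paper.
def lam_schedule(n, gamma=0.1):
    return n ** gamma

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = np.sign(X[:, 0] + 0.3 * rng.normal(size=500))
learners = [stump(f, t) for f in (0, 1) for t in np.linspace(-2.0, 2.0, 21)]
w = l1_constrained_boosting(X, y, learners, lam=lam_schedule(len(y)))
print("1-norm of weights:", np.abs(w).sum())
```

The conservative stopping test keeps the returned combination inside the 1-norm ball of radius `lam`, mirroring the constrained class of composite classifiers over which the consistency result is stated.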
| Original language | English (US) |
| --- | --- |
| Article number | 6650087 |
| Pages (from-to) | 651-660 |
| Number of pages | 10 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 60 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 2014 |
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences
Keywords
- Bayes-risk consistency
- beta-mixing
- boosting
- classification
- dependent data
- empirical processes
- memory
- non-iid
- penalized model selection
- regularization