Convergence and consistency of regularized Boosting algorithms with stationary β-mixing observations

Aurélie C. Lozano, Sanjeev R. Kulkarni, Robert E. Schapire

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

We study the statistical convergence and consistency of regularized Boosting methods, where the samples are not independent and identically distributed (i.i.d.) but come from empirical processes of stationary β-mixing sequences. Utilizing a technique that constructs a sequence of independent blocks close in distribution to the original samples, we prove the consistency of the composite classifiers resulting from a regularization achieved by restricting the 1-norm of the base classifiers' weights. When compared to the i.i.d. case, the nature of sampling manifests in the consistency result only through generalization of the original condition on the growth of the regularization parameter.
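The abstract's "independent blocks" technique can be illustrated with a minimal sketch (not code from the paper, and with hypothetical names): the stationary sequence is cut into consecutive blocks, and only alternating blocks are retained; under β-mixing, the retained blocks are close in distribution to truly independent blocks, which lets i.i.d.-style arguments be applied to dependent data.

```python
def independent_blocks(samples, block_len):
    """Split `samples` into consecutive blocks of length `block_len`
    and keep every other block (the 'head' blocks).

    The discarded alternating blocks provide the temporal separation
    that, under beta-mixing, makes the kept blocks nearly independent.
    """
    blocks = [samples[i:i + block_len]
              for i in range(0, len(samples) - block_len + 1, block_len)]
    return blocks[::2]

# Example: 12 observations, block length 2 -> three retained head blocks.
heads = independent_blocks(list(range(12)), 2)
print(heads)  # [[0, 1], [4, 5], [8, 9]]
```

In the actual analysis the block length grows with the sample size at a rate tied to the mixing coefficients; this sketch only shows the partition-and-retain step.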

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 18 - Proceedings of the 2005 Conference
Pages: 819-826
Number of pages: 8
State: Published - Dec 1 2005
Event: 2005 Annual Conference on Neural Information Processing Systems, NIPS 2005 - Vancouver, BC, Canada
Duration: Dec 5 2005 - Dec 8 2005

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Other

Other: 2005 Annual Conference on Neural Information Processing Systems, NIPS 2005
Country: Canada
City: Vancouver, BC
Period: 12/5/05 - 12/8/05

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
