How boosting the margin can also boost classifier complexity

Lev Reyzin, Robert E. Schapire

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

67 Scopus citations

Abstract

Boosting methods are known to usually not overfit training data, even as the size of the generated classifiers becomes large. Schapire et al. attempted to explain this phenomenon in terms of the margins the classifier achieves on training examples. Later, however, Breiman cast serious doubt on this explanation by introducing a boosting algorithm, arc-gv, that can generate a better margin distribution than AdaBoost and yet performs worse. In this paper, we take a close look at Breiman's compelling but puzzling results. Although we can reproduce his main finding, we find that the poorer performance of arc-gv can be explained by the increased complexity of the base classifiers it uses, an explanation supported by our experiments and entirely consistent with the margins theory. Thus, we find that maximizing the margins is desirable, but not necessarily at the expense of other factors, especially base-classifier complexity.
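For readers unfamiliar with the margins in question, the sketch below illustrates them on a toy problem; it is not the paper's experimental setup. It runs AdaBoost with simple threshold stumps on a hypothetical 1-D dataset and computes the normalized training margins, y·f(x)/Σ|α|, that the margins theory of Schapire et al. refers to.

```python
import math

# Toy 1-D dataset (illustrative only): points x with labels y in {-1, +1}.
X = [0.5, 1.5, 2.5, 3.5, 4.5, 5.5]
y = [1, 1, -1, -1, 1, 1]

def stump(theta, sign):
    """Base classifier: predicts `sign` if x > theta, else -sign."""
    return lambda x: sign if x > theta else -sign

# Candidate stumps: integer thresholds, both polarities.
candidates = [stump(t, s) for t in range(6) for s in (1, -1)]

def adaboost(X, y, T):
    n = len(X)
    w = [1.0 / n] * n              # example weights, initially uniform
    hs, alphas = [], []
    for _ in range(T):
        # Choose the stump with the lowest weighted training error.
        errs = [sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi)
                for h in candidates]
        eps, h = min(zip(errs, candidates), key=lambda p: p[0])
        eps = max(eps, 1e-10)      # guard against log(0)
        alpha = 0.5 * math.log((1 - eps) / eps)   # AdaBoost's weight for h
        hs.append(h)
        alphas.append(alpha)
        # Reweight: up-weight mistakes, down-weight correct examples.
        w = [wi * math.exp(-alpha * yi * h(xi))
             for wi, xi, yi in zip(w, X, y)]
        Z = sum(w)
        w = [wi / Z for wi in w]
    return hs, alphas

def margin(x_i, y_i, hs, alphas):
    """Normalized margin y * f(x) / sum(|alpha|); lies in [-1, 1]."""
    f = sum(a * h(x_i) for a, h in zip(alphas, hs))
    return y_i * f / sum(abs(a) for a in alphas)

hs, alphas = adaboost(X, y, T=10)
margins = [margin(xi, yi, hs, alphas) for xi, yi in zip(X, y)]
```

After a few rounds the combined classifier fits the toy data and every training margin is positive; the debate the paper addresses is whether pushing these margins still higher (as arc-gv does) should, by itself, improve test performance.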

Original language: English (US)
Title of host publication: ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
Pages: 753-760
Number of pages: 8
State: Published - Oct 6 2006
Event: ICML 2006: 23rd International Conference on Machine Learning - Pittsburgh, PA, United States
Duration: Jun 25 2006 - Jun 29 2006

Publication series

Name: ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
Volume: 2006

Other

Other: ICML 2006: 23rd International Conference on Machine Learning
Country: United States
City: Pittsburgh, PA
Period: 6/25/06 - 6/29/06

All Science Journal Classification (ASJC) codes

  • Engineering (all)
