Abstract
Boosting is an approach to machine learning based on the idea of creating a highly accurate prediction rule by combining many relatively weak and inaccurate rules. The AdaBoost algorithm of Freund and Schapire was the first practical boosting algorithm, and remains one of the most widely used and studied, with applications in numerous fields. This chapter aims to review some of the many perspectives and analyses of AdaBoost that have been applied to explain or understand it as a learning method, with comparisons of both the strengths and weaknesses of the various approaches.
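To make the weighted-voting idea concrete, here is a minimal sketch of discrete AdaBoost with single-feature threshold "stumps" as the weak rules. It is an illustrative implementation under assumed conventions (labels in {-1, +1}, the helper names `train_adaboost` and `predict_adaboost` are made up here), not code from the chapter itself.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=50):
    """Discrete AdaBoost with one-feature threshold stumps; y in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)   # example weights, initially uniform
    ensemble = []             # list of (alpha, feature, threshold, sign)

    for _ in range(n_rounds):
        # Pick the stump (feature, threshold, sign) with lowest weighted error.
        best, best_err = None, np.inf
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for sign in (+1, -1):
                    pred = sign * np.where(X[:, j] <= thr, 1, -1)
                    err = np.sum(w[pred != y])
                    if err < best_err:
                        best_err, best = err, (j, thr, sign)

        # Weight of this weak rule: larger when its weighted error is smaller.
        eps = np.clip(best_err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - eps) / eps)

        j, thr, sign = best
        pred = sign * np.where(X[:, j] <= thr, 1, -1)

        # Reweight: misclassified examples gain weight, correct ones lose it.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()

        ensemble.append((alpha, j, thr, sign))
    return ensemble

def predict_adaboost(ensemble, X):
    """Final rule: sign of the weighted vote of the weak rules."""
    score = np.zeros(X.shape[0])
    for alpha, j, thr, sign in ensemble:
        score += alpha * sign * np.where(X[:, j] <= thr, 1, -1)
    return np.where(score >= 0, 1, -1)
```

Each round the weak learner is pulled toward the examples the current combined rule gets wrong, which is how many weak, inaccurate rules accumulate into a single highly accurate one.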
| Original language | English (US) |
| --- | --- |
| Title of host publication | Empirical Inference |
| Subtitle of host publication | Festschrift in Honor of Vladimir N. Vapnik |
| Publisher | Springer Berlin Heidelberg |
| Pages | 37-52 |
| Number of pages | 16 |
| ISBN (Electronic) | 9783642411366 |
| ISBN (Print) | 9783642411359 |
| DOIs | |
| State | Published - Jan 1 2013 |
| Externally published | Yes |
All Science Journal Classification (ASJC) codes
- General Computer Science