Analysis of boosting algorithms using the smooth margin function

Cynthia Rudin, Robert E. Schapire, Ingrid Daubechies

Research output: Contribution to journal › Article › peer-review

19 Scopus citations


We introduce a useful tool for analyzing boosting algorithms called the "smooth margin function," a differentiable approximation of the usual margin for boosting algorithms. We present two boosting algorithms based on this smooth margin, "coordinate ascent boosting" and "approximate coordinate ascent boosting," which are similar to Freund and Schapire's AdaBoost algorithm and Breiman's arc-gv algorithm. We give convergence rates to the maximum margin solution for both of our algorithms and for arc-gv. We then study AdaBoost's convergence properties using the smooth margin function. We precisely bound the margin attained by AdaBoost when the edges of the weak classifiers fall within a specified range. This shows that a previous bound proved by Rätsch and Warmuth is exactly tight. Furthermore, we use the smooth margin to capture explicit properties of AdaBoost in cases where cyclic behavior occurs.
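The smooth margin described in the abstract can be written, following the paper's setup, as G(λ) = −ln(Σᵢ exp(−(Mλ)ᵢ)) / Σⱼ λⱼ, where M is the matrix with entries Mᵢⱼ = yᵢhⱼ(xᵢ) and λ holds the nonnegative weak-classifier weights. A minimal NumPy sketch of this definition and of how it lower-approximates the usual (normalized minimum) margin — the toy matrix and weights below are illustrative assumptions, not from the paper:

```python
import numpy as np

def true_margin(M, lam):
    """Normalized minimum margin: min_i (M @ lam)_i / ||lam||_1."""
    return np.min(M @ lam) / np.sum(lam)

def smooth_margin(M, lam):
    """Smooth margin G(lam) = -ln(sum_i exp(-(M lam)_i)) / ||lam||_1,
    a differentiable approximation of the minimum margin."""
    margins = M @ lam
    return -np.log(np.sum(np.exp(-margins))) / np.sum(lam)

# Toy data (hypothetical): M[i, j] = y_i * h_j(x_i) for 4 training
# points and 3 weak classifiers, with nonnegative weights lam.
M = np.array([[ 1.0,  1.0, -1.0],
              [ 1.0, -1.0,  1.0],
              [-1.0,  1.0,  1.0],
              [ 1.0,  1.0,  1.0]])
lam = np.array([1.0, 2.0, 0.5])

g  = smooth_margin(M, lam)
mu = true_margin(M, lam)
# G never exceeds the true margin, and sits at most ln(m)/||lam||_1 below it:
assert mu - np.log(M.shape[0]) / np.sum(lam) <= g <= mu
```

The sandwich bound checked in the last line follows directly from exp(−min mᵢ) ≤ Σᵢ exp(−mᵢ) ≤ m·exp(−min mᵢ); it is what makes the smooth margin a useful differentiable surrogate for the margin being maximized.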

Original language: English (US)
Pages (from-to): 2723-2768
Number of pages: 46
Journal: Annals of Statistics
Issue number: 6
State: Published - Dec 2007

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty


Keywords
  • AdaBoost
  • Arc-gv
  • Boosting
  • Convergence rates
  • Coordinate descent
  • Large margin classification


