Boosting based on a smooth margin

Cynthia Rudin, Robert E. Schapire, Ingrid Daubechies

Research output: Contribution to journal · Conference article · peer-review

17 Scopus citations

Abstract

We study two boosting algorithms, Coordinate Ascent Boosting and Approximate Coordinate Ascent Boosting, which are explicitly designed to produce maximum margins. To derive these algorithms, we introduce a smooth approximation of the margin that one can maximize in order to produce a maximum margin classifier. Our first algorithm is simply coordinate ascent on this function, involving a line search at each step. We then make a simple approximation of this line search to reveal our second algorithm. These algorithms are proven to asymptotically achieve maximum margins, and we provide two convergence rate calculations. The second calculation yields a faster rate of convergence than the first, although the first gives a more explicit (still fast) rate. These algorithms are very similar to AdaBoost in that they are based on coordinate ascent, easy to implement, and empirically tend to converge faster than other boosting algorithms. Finally, we attempt to understand AdaBoost in terms of our smooth margin, focusing on cases where AdaBoost exhibits cyclic behavior.
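The smooth margin and the coordinate ascent step described above can be sketched in a few lines. The following is a minimal illustrative sketch, not the paper's exact algorithm: it assumes a precomputed matrix M with entries M[i, j] = y_i * h_j(x_i) (the margin of weak classifier j on example i), defines a smooth margin of the form G(lam) = -ln(sum_i exp(-(M lam)_i)) / ||lam||_1, and replaces the paper's line search with a crude grid search over step sizes.

```python
import numpy as np

def smooth_margin(lam, M):
    """Smooth margin G(lam) = -ln(sum_i exp(-(M @ lam)_i)) / ||lam||_1.

    M[i, j] = y_i * h_j(x_i) is assumed notation: the margin of weak
    classifier j on training example i. lam holds the (nonnegative)
    coefficients of the combined classifier.
    """
    margins = M @ lam
    # log-sum-exp trick for numerical stability
    m = (-margins).max()
    return -(m + np.log(np.exp(-margins - m).sum())) / lam.sum()

def coordinate_ascent_boost(M, n_rounds=50):
    """Coordinate ascent on the smooth margin: each round picks the
    weak classifier with the largest edge under the current example
    weights, then searches for a step size that increases G.

    The grid search below is a stand-in for the exact line search of
    the first algorithm in the paper.
    """
    n, p = M.shape
    lam = np.zeros(p)
    lam[0] = 1e-3  # small positive start so ||lam||_1 > 0
    for _ in range(n_rounds):
        d = np.exp(-(M @ lam))
        d /= d.sum()                # weights on the training examples
        j = int(np.argmax(d @ M))   # coordinate (classifier) with largest edge
        alphas = np.linspace(1e-4, 2.0, 200)
        vals = [smooth_margin(lam + a * np.eye(p)[j], M) for a in alphas]
        lam[j] += alphas[int(np.argmax(vals))]
    return lam / lam.sum()          # normalized coefficient vector
```

On a small separable toy problem, the normalized output attains a strictly positive minimum margin, which is the quantity these algorithms are designed to maximize.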

Original language: English (US)
Pages (from-to): 502-517
Number of pages: 16
Journal: Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)
Volume: 3120
DOIs
State: Published - 2004
Externally published: Yes
Event: 17th Annual Conference on Learning Theory, COLT 2004 - Banff, Canada
Duration: Jul 1 2004 to Jul 4 2004

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • General Computer Science
