Margin-based ranking and an equivalence between AdaBoost and RankBoost

Cynthia Rudin, Robert E. Schapire

Research output: Contribution to journal › Article › peer-review

Abstract

We study boosting algorithms for learning to rank. We give a general margin-based bound for ranking based on covering numbers for the hypothesis space. Our bound suggests that algorithms that maximize the ranking margin will generalize well. We then describe a new algorithm, smooth margin ranking, that precisely converges to a maximum-ranking-margin solution. The algorithm is a modification of RankBoost, analogous to "approximate coordinate ascent boosting." Finally, we prove that AdaBoost and RankBoost are equally good for the problems of bipartite ranking and classification in terms of their asymptotic behavior on the training set. Under natural conditions, AdaBoost achieves an area under the ROC curve that is as good as RankBoost's; furthermore, RankBoost, when given a specific intercept, achieves a misclassification error that is as good as AdaBoost's. This may help explain the empirical observations of Cortes and Mohri, and of Caruana and Niculescu-Mizil, about the excellent performance of AdaBoost as a bipartite ranking algorithm, as measured by the area under the ROC curve.
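
To illustrate the empirical observation the abstract refers to, the following Python sketch (assuming scikit-learn is available; the synthetic dataset and hyperparameters are illustrative choices, not from the paper) trains AdaBoost on a bipartite, i.e. binary-labeled, problem and evaluates its real-valued scores by area under the ROC curve, the ranking metric under which the paper compares AdaBoost with RankBoost:

    # Minimal sketch (assumes scikit-learn): using AdaBoost's real-valued
    # scores as a bipartite ranker, evaluated by area under the ROC curve.
    # The dataset and hyperparameters are illustrative, not from the paper.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # AdaBoost over decision stumps; decision_function returns the
    # real-valued combined score of the ensemble for each example.
    model = AdaBoostClassifier(n_estimators=200, random_state=0)
    model.fit(X_tr, y_tr)
    scores = model.decision_function(X_te)

    # AUC measures how often the scores rank a positive example above a
    # negative one, which is the bipartite ranking quality discussed above.
    print("Test AUC:", roc_auc_score(y_te, scores))

Note that only the ordering of the scores matters for the AUC; the intercept mentioned in the abstract affects misclassification error, not ranking quality.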

Original language: English (US)
Pages (from-to): 2193-2232
Number of pages: 40
Journal: Journal of Machine Learning Research
Volume: 10
State: Published - 2009
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Software
  • Artificial Intelligence
  • Control and Systems Engineering
  • Statistics and Probability

Keywords

  • AdaBoost
  • Area under the ROC curve
  • Generalization bounds
  • RankBoost
  • Ranking
