Using and combining predictors that specialize

Yoav Freund, Robert E. Schapire, Yoram Singer, Manfred K. Warmuth

Research output: Contribution to journal › Conference article › peer-review

166 Scopus citations

Abstract

We study online learning algorithms that predict by combining the predictions of several subordinate prediction algorithms, sometimes called 'experts.' These simple algorithms belong to the multiplicative weights family of algorithms. The performance of these algorithms degrades only logarithmically with the number of experts, making them particularly useful in applications where the number of experts is very large. However, in applications such as text categorization, it is often natural for some of the experts to abstain from making predictions on some of the instances. We show how to transform algorithms that assume that all experts are always awake to algorithms that do not require this assumption. We also show how to derive corresponding loss bounds. Our method is very general, and can be applied to a large family of online learning algorithms. We also give applications to various prediction models including decision graphs and 'switching' experts.
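
To make the abstract's idea concrete, here is a minimal Python sketch (not the paper's exact pseudocode) of a specialists, or "sleeping experts," forecaster under the absolute loss: predictions lie in [0, 1], None marks an abstention, and the function name, learning rate eta, and data layout are all illustrative assumptions.

    import math

    def specialists_forecaster(expert_preds, outcomes, eta=2.0):
        # expert_preds[t][i]: expert i's prediction in [0, 1] at round t,
        # or None if the expert abstains on that round.
        n = len(expert_preds[0])
        w = [1.0 / n] * n  # one weight per expert, starting uniform
        total_loss = 0.0
        for preds, y in zip(expert_preds, outcomes):
            awake = [i for i, p in enumerate(preds) if p is not None]
            if not awake:
                continue  # every specialist abstained; nothing to combine
            awake_mass = sum(w[i] for i in awake)
            # Predict with the weighted average of the awake experts only.
            y_hat = sum(w[i] * preds[i] for i in awake) / awake_mass
            total_loss += abs(y_hat - y)
            # Multiplicatively penalize each awake expert by its loss, then
            # rescale so the awake pool's total weight is unchanged; the
            # sleeping experts are effectively charged the algorithm's own
            # loss, so their relative weight is untouched.
            updated = {i: w[i] * math.exp(-eta * abs(preds[i] - y))
                       for i in awake}
            z = sum(updated.values())
            for i in awake:
                w[i] = updated[i] * awake_mass / z
        return w, total_loss

The rescaling step is the heart of the transformation described above: because an asleep expert's share of the total weight never changes, the per-round analysis of the always-awake multiplicative weights algorithm carries over to the rounds on which each specialist is actually awake.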

Original language: English (US)
Pages (from-to): 334-343
Number of pages: 10
Journal: Conference Proceedings of the Annual ACM Symposium on Theory of Computing
State: Published - 1997
Event: 29th Annual ACM Symposium on Theory of Computing - El Paso, TX, USA
Duration: May 4, 1997 - May 6, 1997

All Science Journal Classification (ASJC) codes

  • Software
