Online multiclass learning by interclass hypothesis sharing

Michael Fink, Shai Shalev-Shwartz, Yoram Singer, Shimon Ullman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We describe a general framework for online multiclass learning based on the notion of hypothesis sharing. In our framework, sets of classes are associated with hypotheses; thus, all classes within a given set share the same hypothesis. This framework includes as special cases commonly used constructions for multiclass categorization, such as allocating a unique hypothesis for each class and allocating a single common hypothesis for all classes. We generalize the multiclass Perceptron to our framework and derive a unifying mistake bound analysis. Our construction naturally extends to settings where the number of classes is not known in advance but, rather, is revealed during the online learning process. We demonstrate the merits of our approach by comparing it to previous methods on both synthetic and natural datasets.
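The core idea above can be illustrated with a minimal sketch. This is not the paper's exact construction (which also handles label-dependent feature maps and a growing class set); it assumes plain linear hypotheses over label-independent features, with a hypothetical `class_to_hyp` mapping that assigns each class to a (possibly shared) weight vector. The two special cases mentioned in the abstract correspond to an identity mapping (one hypothesis per class, the standard multiclass Perceptron) and a constant mapping (one hypothesis for all classes).

```python
import numpy as np

class SharedHypothesisPerceptron:
    """Sketch of an online multiclass Perceptron with interclass
    hypothesis sharing: classes mapped to the same hypothesis index
    share one weight vector."""

    def __init__(self, n_features, class_to_hyp):
        # class_to_hyp: dict mapping each class label to a hypothesis index
        self.class_to_hyp = class_to_hyp
        n_hyp = len(set(class_to_hyp.values()))
        self.W = np.zeros((n_hyp, n_features))

    def predict(self, x):
        # Each class is scored by its associated (possibly shared) hypothesis.
        scores = {c: self.W[h] @ x for c, h in self.class_to_hyp.items()}
        return max(scores, key=scores.get)

    def update(self, x, y):
        # Mistake-driven Perceptron update applied to the shared hypotheses.
        y_hat = self.predict(x)
        hy, hp = self.class_to_hyp[y], self.class_to_hyp[y_hat]
        if y_hat != y and hy != hp:
            # If the true and predicted classes share one hypothesis,
            # the two updates would cancel, so we skip them.
            self.W[hy] += x
            self.W[hp] -= x
        return y_hat
```

For example, with `class_to_hyp = {0: 0, 1: 0, 2: 1}`, classes 0 and 1 share one weight vector while class 2 has its own; a mistake between classes 0 and 1 leaves the weights unchanged, exactly as sharing implies.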

Original language: English (US)
Title of host publication: ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
Pages: 313-320
Number of pages: 8
Volume: 2006
State: Published - Oct 6 2006
Externally published: Yes
Event: ICML 2006: 23rd International Conference on Machine Learning - Pittsburgh, PA, United States
Duration: Jun 25, 2006 - Jun 29, 2006

All Science Journal Classification (ASJC) codes

  • Engineering (all)

Cite this

Fink, M., Shalev-Shwartz, S., Singer, Y., & Ullman, S. (2006). Online multiclass learning by interclass hypothesis sharing. In ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning (Vol. 2006, pp. 313-320).