Abstract
We describe a general framework for online multiclass learning based on the notion of hypothesis sharing. In our framework, sets of classes are associated with hypotheses; thus, all classes within a given set share the same hypothesis. This framework includes as special cases commonly used constructions for multiclass categorization, such as allocating a unique hypothesis to each class and allocating a single common hypothesis to all classes. We generalize the multiclass Perceptron to our framework and derive a unifying mistake bound analysis. Our construction naturally extends to settings where the number of classes is not known in advance but, rather, is revealed during the online learning process. We demonstrate the merits of our approach by comparing it to previous methods on both synthetic and natural datasets.
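To illustrate the idea of hypothesis sharing described in the abstract, the following is a minimal sketch (not the authors' exact algorithm or notation) of a multiclass Perceptron in which several classes can share one weight vector. The class `SharedHypothesisPerceptron` and the mapping `class_to_hyp` are illustrative assumptions introduced here for the example.

```python
import numpy as np

class SharedHypothesisPerceptron:
    """Sketch of a multiclass Perceptron with shared hypotheses.

    `class_to_hyp` maps each class label to the index of the weight
    vector (hypothesis) that its set of classes shares. This mapping is
    a hypothetical construct for illustration, not the paper's API.
    """

    def __init__(self, dim, class_to_hyp):
        self.class_to_hyp = dict(class_to_hyp)       # class label -> hypothesis index
        n_hyp = max(self.class_to_hyp.values()) + 1
        self.W = np.zeros((n_hyp, dim))              # one weight vector per hypothesis

    def _score(self, x, y):
        # Each class is scored by the hypothesis its set shares.
        return self.W[self.class_to_hyp[y]] @ x

    def predict(self, x, classes):
        return max(classes, key=lambda y: self._score(x, y))

    def update(self, x, y_true, classes):
        # Standard multiclass Perceptron step applied to shared vectors:
        # on a mistake, promote the correct class's hypothesis and demote
        # the predicted class's hypothesis. Note that in this sketch,
        # classes sharing the same hypothesis receive identical scores,
        # so mistakes between them leave the weights unchanged.
        y_pred = self.predict(x, classes)
        if y_pred != y_true:
            self.W[self.class_to_hyp[y_true]] += x
            self.W[self.class_to_hyp[y_pred]] -= x
        return y_pred

# Tiny usage example: classes "a" and "b" share hypothesis 0, "c" uses hypothesis 1.
if __name__ == "__main__":
    model = SharedHypothesisPerceptron(dim=3, class_to_hyp={"a": 0, "b": 0, "c": 1})
    x = np.array([1.0, 0.0, 2.0])
    model.update(x, y_true="c", classes=["a", "b", "c"])
```

Because classes are looked up through `class_to_hyp`, a newly revealed class can be handled at runtime by assigning it to an existing hypothesis set (or, with a small extension, to a fresh weight vector), which mirrors the setting where the number of classes is not known in advance.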
Original language | English (US) |
---|---|
Title of host publication | ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning |
Pages | 313-320 |
Number of pages | 8 |
Volume | 2006 |
State | Published - Oct 6 2006 |
Externally published | Yes |
Event | ICML 2006: 23rd International Conference on Machine Learning - Pittsburgh, PA, United States (Jun 25 2006 → Jun 29 2006) |
Other
Other | ICML 2006: 23rd International Conference on Machine Learning |
---|---|
Country/Territory | United States |
City | Pittsburgh, PA |
Period | 6/25/06 → 6/29/06 |
All Science Journal Classification (ASJC) codes
- Engineering (all)