Support Vector Machines on a budget

Ofer Dekel, Yoram Singer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

24 Scopus citations

Abstract

The standard Support Vector Machine formulation does not provide its user with the ability to explicitly control the number of support vectors used to define the generated classifier. We present a modified version of SVM that allows the user to set a budget parameter B and focuses on minimizing the loss attained by the B worst-classified examples while ignoring the remaining examples. This idea can be used to derive sparse versions of both L1-SVM and L2-SVM. Technically, we obtain these new SVM variants by replacing the 1-norm in the standard SVM formulation with various interpolation-norms. We also adapt the SMO optimization algorithm to our setting and report on some preliminary experimental results.
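To make the idea concrete, here is a minimal sketch of the kind of objective the abstract describes: regularization plus the hinge loss summed over only the B worst-classified examples. The function name, the use of a linear model, and the L1 hinge loss are illustrative assumptions; the paper derives its variants via interpolation norms and an adapted SMO solver, which this sketch does not reproduce.

```python
import numpy as np

def budget_svm_objective(w, X, y, B, C=1.0):
    """Illustrative budget-SVM objective (an assumption, not the
    paper's exact formulation): 0.5 ||w||^2 plus C times the hinge
    loss of the B worst-classified examples; the rest are ignored."""
    margins = y * (X @ w)                    # signed margins y_i <w, x_i>
    losses = np.maximum(0.0, 1.0 - margins)  # hinge loss per example
    worst_B = np.sort(losses)[-B:]           # keep only the B largest losses
    return 0.5 * w @ w + C * worst_B.sum()
```

With B equal to the number of training examples this reduces to the usual L1-SVM objective; shrinking B makes the objective depend on fewer examples, which is what yields sparser classifiers.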

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 19 - Proceedings of the 2006 Conference
Pages: 345-352
Number of pages: 8
State: Published - Dec 1 2007
Externally published: Yes
Event: 20th Annual Conference on Neural Information Processing Systems, NIPS 2006 - Vancouver, BC, Canada
Duration: Dec 4 2006 - Dec 7 2006

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258


All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
