Boosting with structural sparsity

John Duchi, Yoram Singer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We derive generalizations of AdaBoost and related gradient-based coordinate descent methods that incorporate sparsity-promoting penalties for the norm of the predictor that is being learned. The end result is a family of coordinate descent algorithms that integrate forward feature induction and back-pruning through regularization and give an automatic stopping criterion for feature induction. We study penalties based on the ℓ1, ℓ2, and ℓ∞ norms of the predictor and introduce mixed-norm penalties that build upon the initial penalties. The mixed-norm regularizers facilitate structural sparsity in parameter space, which is a useful property in multiclass prediction and other related tasks. We report empirical results that demonstrate the power of our approach in building accurate and structurally sparse models.
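
Although this record contains only the abstract, the idea it describes can be illustrated in a few lines. The snippet below is a minimal, hypothetical sketch, not the update rules derived in the paper: it runs greedy coordinate descent on the average exponential (AdaBoost-style) loss with an ℓ1 penalty, uses soft-thresholding as the back-pruning step, and stops automatically once no coordinate violates the ℓ1 optimality conditions. The function names, step size, penalty weight, and synthetic data are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * |x|: shrinks x toward zero (back-pruning)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def l1_coordinate_boost(H, y, lam=0.05, step=0.5, max_iters=500, tol=1e-6):
    """Greedy l1-penalized coordinate descent on the average exponential loss.

    H : (n, d) array of weak-learner predictions in [-1, 1]
    y : (n,) array of labels in {-1, +1}
    (Hypothetical sketch; not the paper's derived closed-form updates.)
    """
    n, d = H.shape
    w = np.zeros(d)
    for _ in range(max_iters):
        q = np.exp(-y * (H @ w))                        # per-example weights
        grad = -(H * (q * y)[:, None]).sum(axis=0) / n  # gradient of the loss
        # Optimality violation of each coordinate for loss + lam * ||w||_1.
        viol = np.where(w == 0.0,
                        np.maximum(np.abs(grad) - lam, 0.0),
                        np.abs(grad + lam * np.sign(w)))
        j = int(np.argmax(viol))
        if viol[j] < tol:          # automatic stopping: no coordinate helps
            break
        # Proximal coordinate step: induct feature j, then shrink it toward zero.
        w[j] = soft_threshold(w[j] - step * grad[j], step * lam)
    return w

# Toy usage on synthetic data: only a few weak learners carry signal.
rng = np.random.default_rng(0)
H = np.sign(rng.standard_normal((200, 50)))
y = np.sign(H[:, :3].sum(axis=1) + 0.1)
w = l1_coordinate_boost(H, y)
print("selected (nonzero) coordinates:", np.flatnonzero(w))
```

Mixed-norm penalties (e.g. ℓ1/ℓ∞ over groups of weights shared across classes) would replace the per-coordinate soft-thresholding above with a group-wise shrinkage step, which is what yields the structural sparsity the abstract refers to.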

Original language: English (US)
Title of host publication: Proceedings of the 26th Annual International Conference on Machine Learning, ICML'09
State: Published - 2009
Event: 26th Annual International Conference on Machine Learning, ICML'09 - Montreal, QC, Canada
Duration: Jun 14 2009 - Jun 18 2009

Publication series

Name: ACM International Conference Proceeding Series
Volume: 382

All Science Journal Classification (ASJC) codes

  • Software
  • Human-Computer Interaction
  • Computer Vision and Pattern Recognition
  • Computer Networks and Communications
