Sparse boosting

Zhen James Xiang, Peter Jeffrey Ramadge

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Scopus citations

Abstract

We propose a boosting algorithm that seeks to minimize the AdaBoost exponential loss of a composite classifier using only a sparse set of base classifiers. The proposed algorithm is computationally efficient and in test examples produces composite classifiers that are sparser than, and generalize as well as, those produced by AdaBoost. The algorithm can be viewed as a coordinate descent method for the l1-regularized AdaBoost exponential loss function.
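The abstract's description (coordinate descent on the l1-regularized exponential loss) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function name `sparse_boost`, the greedy coordinate-selection rule, and the closed-form one-dimensional update for ±1-valued base classifiers are all assumptions made for the sketch.

```python
import numpy as np

# Hedged sketch of coordinate descent on the l1-regularized AdaBoost
# exponential loss
#   L(a) = sum_i exp(-y_i * sum_j a_j h_j(x_i)) + lam * sum_j |a_j|
# for +/-1-valued base classifiers. Each round picks the coordinate with
# the largest weighted edge; the l1 penalty acts as a soft threshold, so
# coordinates whose edge is below lam are never activated (sparsity).

def sparse_boost(H, y, lam=0.1, n_rounds=50):
    """H: (n_samples, n_learners) matrix of +/-1 base-classifier outputs.
    y: (n_samples,) labels in {-1, +1}. Returns coefficient vector a."""
    n, m = H.shape
    a = np.zeros(m)
    margins = np.zeros(n)                 # f(x_i) = sum_j a_j h_j(x_i)
    for _ in range(n_rounds):
        w = np.exp(-y * margins)          # exponential-loss weights
        agree = (H * y[:, None] > 0).astype(float)
        Wp = w @ agree                    # weight on correctly classified points
        Wm = w.sum() - Wp                 # weight on misclassified points
        edge = Wp - Wm
        j = int(np.argmax(np.abs(edge)))
        if np.abs(edge[j]) <= lam:        # soft threshold: no coordinate helps
            break
        s = np.sign(edge[j])              # orient so the step is positive
        wp, wm = (Wp[j], Wm[j]) if s > 0 else (Wm[j], Wp[j])
        wm = max(wm, 1e-12)               # guard against a perfect classifier
        # Closed-form minimizer of wp*e^{-d} + wm*e^{d} + lam*d over d >= 0
        # (simplification: assumes the update does not cross zero).
        u = (-lam + np.sqrt(lam**2 + 4.0 * wp * wm)) / (2.0 * wm)
        d = s * np.log(max(u, 1.0))
        a[j] += d
        margins += d * H[:, j]
    return a
```

With a larger `lam` the soft threshold stops more coordinates from activating, trading training loss for a sparser composite classifier; `lam = 0` recovers an unregularized coordinate-descent view of AdaBoost.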

Original language: English (US)
Title of host publication: 2009 IEEE International Conference on Acoustics, Speech, and Signal Processing - Proceedings, ICASSP 2009
Pages: 1625-1628
Number of pages: 4
DOIs
State: Published - 2009
Event: 2009 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2009 - Taipei, Taiwan, Province of China
Duration: Apr 19, 2009 - Apr 24, 2009

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
ISSN (Print): 1520-6149

Other

Other: 2009 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2009
Country/Territory: Taiwan, Province of China
City: Taipei
Period: 4/19/09 - 4/24/09

All Science Journal Classification (ASJC) codes

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering

Keywords

  • Algorithms
  • Optimization methods
  • Pattern classification
  • Signal representations
