Online dictionary learning for sparse coding

Julien Mairal, Francis Bach, Jean Ponce, Guillermo Sapiro

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1214 Scopus citations

Abstract

Sparse coding - that is, modelling data vectors as sparse linear combinations of basis elements - is widely used in machine learning, neuroscience, signal processing, and statistics. This paper focuses on learning the basis set, also called the dictionary, to adapt it to specific data, an approach that has recently proven to be very effective for signal reconstruction and classification in the audio and image processing domains. This paper proposes a new online optimization algorithm for dictionary learning, based on stochastic approximations, which scales up gracefully to large datasets with millions of training samples. A proof of convergence is presented, along with experiments on natural images demonstrating that the algorithm runs faster and produces better dictionaries than classical batch algorithms on both small and large datasets.
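The abstract's key idea - alternating a sparse-coding step with a stochastic, running-statistics dictionary update - can be sketched in NumPy. This is a simplified illustration, not the authors' reference implementation: the sparse-coding step here uses plain ISTA rather than the LARS-Lasso solver used in the paper, and all function names and parameters (`lam`, `n_passes`, `n_atoms`) are illustrative choices.

```python
import numpy as np

def soft_threshold(v, lam):
    # Elementwise soft-thresholding, the proximal operator of the l1 norm.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def sparse_code(D, x, lam, n_iter=50):
    # ISTA: a simple solver for min_a 0.5*||x - D a||^2 + lam*||a||_1.
    # (The paper uses LARS-Lasso; ISTA is used here for brevity.)
    L = np.linalg.norm(D, 2) ** 2  # Lipschitz constant of the smooth part
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        a = soft_threshold(a + D.T @ (x - D @ a) / L, lam / L)
    return a

def online_dictionary_learning(X, n_atoms, lam=0.1, n_passes=1, seed=0):
    # Online scheme in the spirit of the paper: accumulate sufficient
    # statistics A and B, then refine the dictionary one column at a time.
    rng = np.random.default_rng(seed)
    m, n = X.shape                      # m signals of dimension n
    D = rng.standard_normal((n, n_atoms))
    D /= np.linalg.norm(D, axis=0)      # unit-norm atoms
    A = np.zeros((n_atoms, n_atoms))    # running sum of alpha alpha^T
    B = np.zeros((n, n_atoms))          # running sum of x alpha^T
    for _ in range(n_passes):
        for i in rng.permutation(m):
            x = X[i]
            alpha = sparse_code(D, x, lam)
            A += np.outer(alpha, alpha)
            B += np.outer(x, alpha)
            # Block coordinate descent on the dictionary columns, keeping
            # each atom inside the unit ball.
            for j in range(n_atoms):
                if A[j, j] < 1e-10:
                    continue  # atom unused so far; leave it unchanged
                u = (B[:, j] - D @ A[:, j]) / A[j, j] + D[:, j]
                D[:, j] = u / max(np.linalg.norm(u), 1.0)
    return D
```

Because each sample only updates the small statistics `A` and `B`, the per-iteration cost is independent of the number of training samples seen so far, which is what lets the method scale to millions of signals.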

Original language: English (US)
Title of host publication: Proceedings of the 26th International Conference On Machine Learning, ICML 2009
Publisher: Omnipress
Pages: 689-696
Number of pages: 8
ISBN (Print): 9781605585161
State: Published - 2009
Externally published: Yes
Event: 26th International Conference On Machine Learning, ICML 2009 - Montreal, QC, Canada
Duration: Jun 14 2009 - Jun 18 2009

Publication series

Name: Proceedings of the 26th International Conference On Machine Learning, ICML 2009

Other

Other: 26th International Conference On Machine Learning, ICML 2009
Country/Territory: Canada
City: Montreal, QC
Period: 6/14/09 - 6/18/09

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Computer Networks and Communications
  • Software
