Efficient projections onto the ℓ1-ball for learning in high dimensions

John Duchi, Shai Shalev-Shwartz, Yoram Singer, Tushar Chandra

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1047 Scopus citations

Abstract

We describe efficient algorithms for projecting a vector onto the ℓ1-ball. We present two methods for projection. The first performs exact projection in O(n) expected time, where n is the dimension of the space. The second works on vectors, k of whose elements are perturbed outside the ℓ1-ball, projecting them in O(k log(n)) time. This setting is especially useful for online learning in sparse feature spaces such as text categorization applications. We demonstrate the merits and effectiveness of our algorithms in numerous batch and online learning tasks. We show that variants of stochastic gradient projection methods augmented with our efficient projection procedures outperform interior point methods, which are considered state-of-the-art optimization techniques. We also show that in online settings gradient updates with ℓ1 projections outperform the exponentiated gradient algorithm while obtaining models with high degrees of sparsity.
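As a concrete illustration of the exact projection the abstract refers to, the following is a minimal NumPy sketch of the standard sort-and-threshold Euclidean projection onto the ℓ1-ball. The function name project_l1_ball is an illustrative choice, and this sort-based variant runs in O(n log n); the paper's exact method achieves O(n) expected time by replacing the full sort with a randomized-pivot selection.

```python
import numpy as np

def project_l1_ball(v, z=1.0):
    """Euclidean projection of v onto the l1-ball of radius z.

    Sort-based O(n log n) sketch; the paper's exact algorithm attains
    O(n) expected time via randomized pivoting instead of a full sort.
    """
    assert z > 0
    if np.abs(v).sum() <= z:
        return v.copy()                       # already inside the ball
    mu = np.sort(np.abs(v))[::-1]             # magnitudes, descending
    cumsum = np.cumsum(mu)
    # Largest index rho with mu[rho] - (cumsum[rho] - z) / (rho + 1) > 0
    rho = np.nonzero(mu - (cumsum - z) / np.arange(1, v.size + 1) > 0)[0][-1]
    theta = (cumsum[rho] - z) / (rho + 1.0)   # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

# Example: pull a gradient-updated iterate back onto the unit l1-ball.
w = project_l1_ball(np.array([0.8, -0.5, 0.3, -0.1]), z=1.0)
print(w, np.abs(w).sum())  # prints [0.6 -0.3 0.1 0.] with l1 norm 1.0
```

In the stochastic gradient projection setting the abstract describes, such a routine would be called after each gradient step to keep the iterate feasible, which is what induces the sparse solutions mentioned above.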

Original language: English (US)
Title of host publication: Proceedings of the 25th International Conference on Machine Learning
Pages: 272-279
Number of pages: 8
State: Published - 2008
Event: 25th International Conference on Machine Learning - Helsinki, Finland
Duration: Jul 5 2008 - Jul 9 2008

Publication series

Name: Proceedings of the 25th International Conference on Machine Learning

Other

Other: 25th International Conference on Machine Learning
Country/Territory: Finland
City: Helsinki
Period: 7/5/08 - 7/9/08

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Human-Computer Interaction
  • Software
