Algorithms for non-negative matrix factorization

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5277 Scopus citations

Abstract

Non-negative matrix factorization (NMF) has previously been shown to be a useful decomposition for multivariate data. Two different multiplicative algorithms for NMF are analyzed. They differ only slightly in the multiplicative factor used in the update rules. One algorithm can be shown to minimize the conventional least squares error while the other minimizes the generalized Kullback-Leibler divergence. The monotonic convergence of both algorithms can be proven using an auxiliary function analogous to that used for proving convergence of the Expectation-Maximization algorithm. The algorithms can also be interpreted as diagonally rescaled gradient descent, where the rescaling factor is optimally chosen to ensure convergence.
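
The abstract does not reproduce the update rules themselves. As a rough illustration, the sketch below implements the two multiplicative variants described above in NumPy, assuming the standard Lee-Seung form of the updates; the function names, random initialization, and the small eps safeguard against division by zero are illustrative choices, not taken from this record. The element-wise multiplicative factor applied in each step plays the role of the optimally chosen rescaling in the diagonally rescaled gradient descent interpretation mentioned in the abstract.

    import numpy as np

    def nmf_least_squares(V, rank, n_iter=200, eps=1e-10, seed=0):
        """Multiplicative updates that decrease the least squares error ||V - WH||_F^2."""
        rng = np.random.default_rng(seed)
        m, n = V.shape
        W = rng.random((m, rank)) + eps
        H = rng.random((rank, n)) + eps
        for _ in range(n_iter):
            H *= (W.T @ V) / (W.T @ W @ H + eps)   # H <- H * (W^T V) / (W^T W H)
            W *= (V @ H.T) / (W @ H @ H.T + eps)   # W <- W * (V H^T) / (W H H^T)
        return W, H

    def nmf_kl(V, rank, n_iter=200, eps=1e-10, seed=0):
        """Multiplicative updates that decrease the generalized KL divergence D(V || WH)."""
        rng = np.random.default_rng(seed)
        m, n = V.shape
        W = rng.random((m, rank)) + eps
        H = rng.random((rank, n)) + eps
        for _ in range(n_iter):
            # Each factor is rescaled by a ratio involving V / (WH), normalized by column/row sums.
            H *= (W.T @ (V / (W @ H + eps))) / (W.sum(axis=0)[:, None] + eps)
            W *= ((V / (W @ H + eps)) @ H.T) / (H.sum(axis=1)[None, :] + eps)
        return W, H

    # Example usage on a small random non-negative matrix.
    V = np.abs(np.random.default_rng(1).random((20, 30)))
    W, H = nmf_least_squares(V, rank=5)
    print(np.linalg.norm(V - W @ H))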

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 13 - Proceedings of the 2000 Conference, NIPS 2000
Publisher: Neural Information Processing Systems Foundation
ISBN (Print): 0262122413, 9780262122412
State: Published - 2001
Event: 14th Annual Neural Information Processing Systems Conference, NIPS 2000 - Denver, CO, United States
Duration: Nov 27, 2000 - Dec 2, 2000

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Other

Other: 14th Annual Neural Information Processing Systems Conference, NIPS 2000
Country/Territory: United States
City: Denver, CO
Period: 11/27/00 - 12/2/00

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
