### Abstract

Non-negative matrix factorization (NMF) has previously been shown to be a useful decomposition for multivariate data. Two different multiplicative algorithms for NMF are analyzed. They differ only slightly in the multiplicative factor used in the update rules. One algorithm can be shown to minimize the conventional least squares error while the other minimizes the generalized Kullback-Leibler divergence. The monotonic convergence of both algorithms can be proven using an auxiliary function analogous to that used for proving convergence of the Expectation-Maximization algorithm. The algorithms can also be interpreted as diagonally rescaled gradient descent, where the rescaling factor is optimally chosen to ensure convergence.
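The two multiplicative algorithms described in the abstract can be sketched directly. Below is a minimal NumPy implementation of both update rules: the least-squares version, which multiplies by $(W^TV)/(W^TWH)$ and $(VH^T)/(WHH^T)$, and the generalized Kullback-Leibler version. The function names, iteration count, and the small `eps` guard against division by zero are illustrative choices, not part of the paper.

```python
import numpy as np

def nmf_ls(V, r, n_iter=200, eps=1e-9, seed=0):
    """Multiplicative updates minimizing the least squares error ||V - WH||_F^2.

    H <- H * (W^T V) / (W^T W H)
    W <- W * (V H^T) / (W H H^T)
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r))
    H = rng.random((r, m))
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

def nmf_kl(V, r, n_iter=200, eps=1e-9, seed=0):
    """Multiplicative updates minimizing the generalized KL divergence D(V || WH)."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r))
    H = rng.random((r, m))
    for _ in range(n_iter):
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.sum(axis=0, keepdims=True).T + eps)
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (H.sum(axis=1, keepdims=True).T + eps)
    return W, H
```

Because each update multiplies the current factor entry-wise by a non-negative ratio, non-negativity of `W` and `H` is preserved automatically, and the paper's auxiliary-function argument guarantees the objective never increases between iterations.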

| Original language | English (US) |
|---|---|
| Title of host publication | Advances in Neural Information Processing Systems 13 - Proceedings of the 2000 Conference, NIPS 2000 |
| Publisher | Neural Information Processing Systems Foundation |
| ISBN (Print) | 0262122413, 9780262122412 |
| State | Published - Jan 1 2001 |
| Event | 14th Annual Neural Information Processing Systems Conference, NIPS 2000 - Denver, CO, United States. Duration: Nov 27 2000 → Dec 2 2000 |

### Publication series

| Name | Advances in Neural Information Processing Systems |
|---|---|
| ISSN (Print) | 1049-5258 |

### Other

| Other | 14th Annual Neural Information Processing Systems Conference, NIPS 2000 |
|---|---|
| Country | United States |
| City | Denver, CO |
| Period | 11/27/00 → 12/2/00 |

### All Science Journal Classification (ASJC) codes

- Computer Networks and Communications
- Information Systems
- Signal Processing

## Fingerprint

Research topics of 'Algorithms for non-negative matrix factorization'.

## Cite this

*Advances in Neural Information Processing Systems 13 - Proceedings of the 2000 Conference, NIPS 2000* (Advances in Neural Information Processing Systems). Neural Information Processing Systems Foundation.