Entropy and inference, revisited

Ilya Nemenman, Fariel Shafee, William Bialek

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

99 Scopus citations

Abstract

We study properties of popular near-uniform (Dirichlet) priors for learning undersampled probability distributions on discrete nonmetric spaces and show that they lead to disastrous results. However, an Occam-style phase space argument expands the priors into their infinite mixture and resolves most of the observed problems. This leads to a surprisingly good estimator of entropies of discrete distributions.
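The pathology the abstract alludes to can be seen directly: under a symmetric Dirichlet prior with concentration parameter β on K bins, the entropy of sampled distributions is sharply concentrated around a β-dependent value, so a fixed-β prior effectively decides the entropy before any data arrive. The sketch below (illustrative only, not the authors' code; bin count, β values, and sample sizes are arbitrary choices) demonstrates this concentration numerically:

```python
import numpy as np

# Illustrative sketch: entropies of distributions drawn from a symmetric
# Dirichlet prior Dir(beta) on K bins cluster tightly around a value set
# by beta alone -- the concentration that makes a single fixed-beta prior
# a poor choice for learning entropies from undersampled data.
rng = np.random.default_rng(0)
K = 1000  # number of bins (arbitrary, for illustration)

def entropy(p):
    """Shannon entropy in nats, ignoring zero-probability bins."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

for beta in (0.02, 1.0):
    samples = rng.dirichlet(np.full(K, beta), size=200)
    H = np.array([entropy(p) for p in samples])
    print(f"beta={beta}: mean H = {H.mean():.2f} nats, "
          f"std = {H.std():.3f}  (max possible: log K = {np.log(K):.2f})")
```

For each β the spread of sampled entropies is tiny compared to the possible range [0, log K], and different β values pin the entropy to very different typical values. Averaging over β (the "infinite mixture" of the abstract) spreads the prior back out over entropy values, which is the idea behind the resulting estimator.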

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 14 - Proceedings of the 2001 Conference, NIPS 2001
Publisher: Neural information processing systems foundation
ISBN (Print): 0262042088, 9780262042086
State: Published - Jan 1 2002
Event: 15th Annual Neural Information Processing Systems Conference, NIPS 2001 - Vancouver, BC, Canada
Duration: Dec 3 2001 - Dec 8 2001

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Other

Other: 15th Annual Neural Information Processing Systems Conference, NIPS 2001
Country: Canada
City: Vancouver, BC
Period: 12/3/01 - 12/8/01

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing


Cite this

    Nemenman, I., Shafee, F., & Bialek, W. (2002). Entropy and inference, revisited. In Advances in Neural Information Processing Systems 14 - Proceedings of the 2001 Conference, NIPS 2001 (Advances in Neural Information Processing Systems). Neural information processing systems foundation.