Conditional entropy and error probability

Siu Wai Ho, Sergio Verdú

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Fano's inequality relates the error probability and the conditional entropy of a finitely-valued random variable X given another random variable Y. However, it is not necessarily tight when the marginal distribution of X is fixed. In this paper, we consider both finite and countably infinite alphabets. A tight upper bound on the conditional entropy of X given Y is given in terms of the error probability and the marginal distribution of X. A new lower bound on the conditional entropy is also found for countably infinite alphabets. The equivalence of the reliability criteria of vanishing error probability and vanishing conditional entropy is established in wide generality.
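
For context, the classical finite-alphabet form of Fano's inequality (the standard textbook statement, not the refined bound derived in the paper) can be written as

H(X \mid Y) \le h(P_e) + P_e \log(|\mathcal{X}| - 1),

where P_e = \Pr[\hat{X}(Y) \ne X] is the error probability of an estimator \hat{X}(Y), h(\cdot) is the binary entropy function, and |\mathcal{X}| is the alphabet size. The paper's upper bound refines this by using the full marginal distribution of X rather than only the alphabet size, and it covers countably infinite alphabets, for which the right-hand side above is infinite and hence uninformative.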

Original language: English (US)
Title of host publication: Proceedings - 2008 IEEE International Symposium on Information Theory, ISIT 2008
Pages: 1622-1626
Number of pages: 5
DOIs
State: Published - 2008
Event: 2008 IEEE International Symposium on Information Theory, ISIT 2008 - Toronto, ON, Canada
Duration: Jul 6, 2008 - Jul 11, 2008

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
ISSN (Print): 2157-8101

Other

Other: 2008 IEEE International Symposium on Information Theory, ISIT 2008
Country/Territory: Canada
City: Toronto, ON
Period: 7/6/08 - 7/11/08

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics
