Generalizing the Fano Inequality

Te Sun Han, Sergio Verdú

Research output: Contribution to journal › Article › peer-review

84 Scopus citations


The Fano inequality gives a lower bound on the mutual information between two random variables that take values on an M-element set, provided at least one of the random variables is equiprobable. We show several simple lower bounds on mutual information that do not require this restriction. In particular, this can be accomplished by replacing log M with the infinite-order Rényi entropy in the Fano inequality. Applications to hypothesis testing are exhibited, along with bounds on mutual information in terms of the a priori and a posteriori error probabilities.
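As a quick numerical illustration (not taken from the paper), the classical Fano inequality H(X|Y) ≤ h(Pe) + Pe·log(M−1) and the infinite-order Rényi entropy H∞(X) = −log maxₓ P(x), the quantity that replaces log M in the generalized bounds, can be checked for a small joint distribution. The joint pmf below is an arbitrary assumption chosen for the sketch.

```python
import math

# Illustrative joint pmf P(x, y) over a 3-element alphabet for both X and Y.
# (Arbitrary numbers, not from the paper.)
P = [[0.20, 0.05, 0.05],
     [0.05, 0.25, 0.05],
     [0.02, 0.03, 0.30]]
M = 3

px = [sum(row) for row in P]                                  # marginal of X
py = [sum(P[x][y] for x in range(M)) for y in range(M)]       # marginal of Y

def h2(p):
    """Binary entropy in bits; h2(0) = h2(1) = 0."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Conditional entropy H(X|Y) in bits.
H_X_given_Y = -sum(P[x][y] * math.log2(P[x][y] / py[y])
                   for x in range(M) for y in range(M) if P[x][y] > 0)

# Error probability of the MAP estimator x_hat(y) = argmax_x P(x, y).
Pe = 1.0 - sum(max(P[x][y] for x in range(M)) for y in range(M))

# Classical Fano inequality: H(X|Y) <= h2(Pe) + Pe * log2(M - 1).
fano_rhs = h2(Pe) + Pe * math.log2(M - 1)
assert H_X_given_Y <= fano_rhs

# Infinite-order Renyi entropy H_inf(X) = -log2 max_x P(x); it never exceeds
# the Shannon entropy H(X), which in turn never exceeds log2(M).
H_inf = -math.log2(max(px))
H_X = -sum(p * math.log2(p) for p in px if p > 0)
assert H_inf <= H_X <= math.log2(M)
```

Because H∞(X) ≤ H(X) ≤ log M, with equality throughout only for an equiprobable X, replacing log M by H∞(X) is what allows bounds of this type to be stated without the equiprobability assumption.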

Original language: English (US)
Pages (from-to): 1247-1251
Number of pages: 5
Journal: IEEE Transactions on Information Theory
Issue number: 4
State: Published - Jul 1994

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences


Keywords

  • Fano inequality
  • Shannon theory
  • hypothesis testing
  • mutual information


