A Lower Bound on the Probability of Error in Multihypothesis Testing

H. Vincent Poor, Sergio Verdú

Research output: Contribution to journal › Letter › peer-review

31 Scopus citations


Consider two random variables X and Y, where X is finitely (or countably-infinitely) valued, and where Y is arbitrary. Let ε denote the minimum probability of error incurred in estimating X from Y. It is shown that ε ≥ (1 − α) P[π(X|Y) ≤ α] for every α ∈ [0, 1), where π(X|Y) denotes the posterior probability of X given Y. This bound finds information-theoretic applications in the proof of converse channel coding theorems. It generalizes and strengthens previous lower bounds due to Shannon, and to Verdú and Han.
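The stated inequality can be checked numerically. The sketch below assumes the standard form of the Poor–Verdú bound, ε ≥ (1 − α) P[π(X|Y) ≤ α] for α ∈ [0, 1), and uses a randomly generated joint distribution as a hypothetical example; ε is computed as the error probability of the MAP estimator of X from Y.

```python
import numpy as np

# Hypothetical joint distribution P(x, y): 3 hypotheses, 4 observation values.
rng = np.random.default_rng(0)
P = rng.random((3, 4))
P /= P.sum()

# Minimum probability of error (MAP rule): eps = 1 - sum_y max_x P(x, y).
eps = 1.0 - P.max(axis=0).sum()

# Posterior pi(x | y) = P(x, y) / P(y), evaluated at every (x, y) pair.
Py = P.sum(axis=0)
post = P / Py

# Verify eps >= (1 - alpha) * Pr[pi(X|Y) <= alpha] on a grid of alpha values.
for alpha in np.linspace(0.0, 0.99, 100):
    lower = (1.0 - alpha) * P[post <= alpha].sum()
    assert eps >= lower - 1e-12
print("bound holds for all alpha on the grid")
```

The pointwise argument behind the check: for each y, either every posterior mass at or below α sums to at most 1 − max_x π(x|y), or the maximum posterior itself is at most α; in both cases the per-observation error 1 − max_x π(x|y) dominates (1 − α) times the posterior mass in {π ≤ α}.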

Original language: English (US)
Pages (from-to): 1992-1994
Number of pages: 3
Journal: IEEE Transactions on Information Theory
Issue number: 6
State: Published - Nov 1995

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences


Keywords
  • Converse Channel Coding Theorem
  • Hypothesis testing
  • Shannon theory
  • probability of error


