Abstract
Consider two random variables X and Y, where X is finitely (or countably infinitely) valued and Y is arbitrary. Let ε denote the minimum probability of error incurred in estimating X from Y. It is shown that ε ≥ (1 − α) P[π(X|Y) ≤ α] for every α ∈ [0, 1), where π(X|Y) denotes the posterior probability of X given Y. This bound finds information-theoretic applications in the proof of converse channel coding theorems. It generalizes and strengthens previous lower bounds due to Shannon and to Verdú and Han.
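As a quick numerical illustration of the bound (a sketch added here, not part of the paper), the following Python snippet builds a small toy joint distribution for (X, Y), computes the exact minimum error probability ε attained by the MAP estimator, and evaluates the right-hand side (1 − α) P[π(X|Y) ≤ α] over a grid of α. The toy distribution, variable names, and grid are assumptions chosen purely for illustration.

```python
import numpy as np

# Toy joint distribution of (X, Y); X takes 3 values, Y takes 4 values.
# The distribution below is an arbitrary illustrative choice, not from the paper.
rng = np.random.default_rng(0)
joint = rng.random((3, 4))
joint /= joint.sum()                    # P(X = x, Y = y)

p_y = joint.sum(axis=0)                 # marginal P(Y = y)
posterior = joint / p_y                 # pi(x | y) = P(X = x | Y = y)

# The minimum probability of error is achieved by the MAP estimator of X from Y:
# eps = 1 - sum_y P(Y = y) * max_x pi(x | y).
eps = 1.0 - np.sum(p_y * posterior.max(axis=0))

# Right-hand side of the bound, (1 - alpha) * P[pi(X|Y) <= alpha], where the
# probability is taken over the joint law of (X, Y).
alphas = np.linspace(0.0, 0.99, 100)
rhs = np.array([(1.0 - a) * joint[posterior <= a].sum() for a in alphas])

print(f"minimum error probability eps = {eps:.4f}")
print(f"best lower bound over alpha   = {rhs.max():.4f}")
assert np.all(rhs <= eps + 1e-12), "the bound holds for every alpha in [0, 1)"
```

Maximizing the right-hand side over α gives the tightest version of the bound for a given joint distribution; in this sketch the assertion simply checks that no choice of α exceeds the true MAP error.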
| Original language | English (US) |
|---|---|
| Pages (from-to) | 1992-1994 |
| Number of pages | 3 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 41 |
| Issue number | 6 |
| DOIs | |
| State | Published - Nov 1995 |
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences
Keywords
- Converse channel coding theorem
- Hypothesis testing
- Shannon theory
- Probability of error