Abstract
The Fano inequality gives a lower bound on the mutual information between two random variables that take values on an M-element set, provided at least one of the random variables is equiprobable. We show several simple lower bounds on mutual information that do not assume such a restriction. In particular, such a bound can be obtained by replacing log M with the infinite-order Rényi entropy in the Fano inequality. Applications to hypothesis testing are exhibited, along with bounds on mutual information in terms of the a priori and a posteriori error probabilities.
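For concreteness, a sketch of the two bounds the abstract refers to, in standard notation; the precise statement, conditions, and proof are in the paper itself. Here $P_e$ denotes the probability of error in estimating one variable from the other, $h(\cdot)$ the binary entropy function, and $H_\infty(X)$ the infinite-order Rényi entropy (min-entropy); the second display is only the substitution the abstract names, not the paper's exact result.

```latex
% Classical Fano inequality with X uniform on an M-element set:
% H(X|Y) <= h(P_e) + P_e log(M-1), and since H(X) = log M,
\[
  I(X;Y) \;\ge\; \log M \;-\; h(P_e) \;-\; P_e \log(M-1),
\]
% where h(p) = -p log p - (1-p) log(1-p).
% The relaxation described in the abstract (X no longer assumed uniform),
% obtained by replacing log M with the infinite-order Renyi entropy:
\[
  I(X;Y) \;\ge\; H_\infty(X) \;-\; h(P_e) \;-\; P_e \log(M-1),
  \qquad H_\infty(X) \;=\; -\log \max_x P_X(x).
\]
```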
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1247-1251 |
| Number of pages | 5 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 40 |
| Issue number | 4 |
| DOIs | |
| State | Published - Jul 1994 |
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences
Keywords
- Fano inequality
- Shannon theory
- hypothesis testing
- mutual information