Fano's inequality relates the error probability of guessing a finitely valued random variable X from another random variable Y to the conditional entropy of X given Y. It is not necessarily tight when the marginal distribution of X is fixed. This paper gives a tight upper bound on the conditional entropy of X given Y in terms of the error probability and the marginal distribution of X. A new lower bound on the conditional entropy is also found for countably infinite alphabets. The relationship between the reliability criteria of vanishing error probability and vanishing conditional entropy is discussed, and a strengthened form of the Schur-concavity of entropy, valid for finite or countably infinite random variables, is given.
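As a concrete illustration of the inequality the paper sharpens, the sketch below numerically checks Fano's bound H(X|Y) ≤ h(Pe) + Pe·log₂(|𝒳| − 1), where Pe is the minimum (MAP) error probability and h is the binary entropy. The joint distribution is an arbitrary illustrative assumption, not taken from the paper:

```python
import math

# Illustrative joint pmf p(x, y) over a 3-by-2 alphabet (assumed values).
p = {
    (0, 0): 0.30, (0, 1): 0.10,
    (1, 0): 0.10, (1, 1): 0.25,
    (2, 0): 0.05, (2, 1): 0.20,
}
xs = sorted({x for x, _ in p})
ys = sorted({y for _, y in p})

def h2(q):
    """Binary entropy in bits; h2(0) = h2(1) = 0."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

# Conditional entropy H(X|Y) = -sum_{x,y} p(x,y) log2 p(x|y), in bits.
H_XgY = 0.0
for y in ys:
    py = sum(p[(x, y)] for x in xs)
    for x in xs:
        pxy = p[(x, y)]
        if pxy > 0:
            H_XgY -= pxy * math.log2(pxy / py)

# Minimum error probability: the MAP guess picks argmax_x p(x, y) for each y.
Pe = 1.0 - sum(max(p[(x, y)] for x in xs) for y in ys)

# Fano's inequality: H(X|Y) <= h2(Pe) + Pe * log2(|X| - 1).
fano_bound = h2(Pe) + Pe * math.log2(len(xs) - 1)
print(f"H(X|Y) = {H_XgY:.4f}, Pe = {Pe:.2f}, Fano bound = {fano_bound:.4f}")
assert H_XgY <= fano_bound
```

The gap between H(X|Y) and the bound is what motivates the paper's tighter bound, which exploits the fixed marginal of X rather than only its alphabet size.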
Keywords
- Fano's inequality
- Shannon theory
- majorization theory