Abstract
This paper gives upper and lower bounds on the minimum error probability of Bayesian M-ary hypothesis testing in terms of the Arimoto-Rényi conditional entropy of an arbitrary order α. The improved tightness of these bounds over their specialized versions with the Shannon conditional entropy (α = 1) is demonstrated. In particular, in the case where M is finite, we show how to generalize Fano's inequality under both the conventional and list-decoding settings. As a counterpart to the generalized Fano's inequality, and allowing M to be infinite, a lower bound on the Arimoto-Rényi conditional entropy is derived as a function of the minimum error probability. Explicit upper and lower bounds on the minimum error probability are obtained as functions of the Arimoto-Rényi conditional entropy for both positive and negative α. Furthermore, we give upper bounds on the minimum error probability as functions of the Rényi divergence. In the setup of discrete memoryless channels, we analyze the exponential decay of the Arimoto-Rényi conditional entropy of the transmitted codeword given the channel output, averaged over a random-coding ensemble.
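For context, the following is a minimal sketch of the quantities named in the abstract, using standard definitions from the information-theory literature; the notation (e.g. ε_{X|Y} for the minimum error probability) is an assumption on our part and is not quoted from the paper itself.

```latex
% Arimoto–Rényi conditional entropy of order \alpha \in (0,1)\cup(1,\infty)
% for a discrete X given Y (standard definition from the literature;
% notation assumed, not quoted from the paper):
H_\alpha(X \mid Y)
  = \frac{\alpha}{1-\alpha}
    \log \sum_{y} P_Y(y)
    \biggl( \sum_{x} P_{X \mid Y}^{\alpha}(x \mid y) \biggr)^{1/\alpha}

% As \alpha \to 1 this recovers the Shannon conditional entropy H(X|Y),
% for which the classical Fano inequality (the \alpha = 1 case that the
% abstract generalizes) bounds H(X|Y) via the minimum error probability
% \varepsilon_{X \mid Y} and the binary entropy function h(\cdot):
H(X \mid Y) \le h(\varepsilon_{X \mid Y})
              + \varepsilon_{X \mid Y} \log (M - 1)
```

One standard fact that motivates working with general α: in the limit α → ∞, the Arimoto-Rényi conditional entropy determines the minimum error probability exactly, via H_∞(X|Y) = log 1/(1 - ε_{X|Y}), which is one reason bounds of order α ≠ 1 can be strictly tighter than their Shannon (α = 1) counterparts.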
| Original language | English (US) |
|---|---|
| Pages (from-to) | 4-25 |
| Number of pages | 22 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 64 |
| Issue number | 1 |
| DOIs | |
| State | Published - 2018 |
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences
Keywords
- Arimoto-Rényi conditional entropy
- Bayesian minimum probability of error
- Chernoff information
- Fano's inequality
- List decoding
- M-ary hypothesis testing
- Random coding
- Rényi divergence
- Rényi entropy