Arimoto-Rényi conditional entropy and Bayesian M-ary hypothesis testing

Igal Sason, Sergio Verdú

Research output: Contribution to journal › Article › peer-review

29 Scopus citations

Abstract

This paper gives upper and lower bounds on the minimum error probability of Bayesian M-ary hypothesis testing in terms of the Arimoto-Rényi conditional entropy of an arbitrary order α. The improved tightness of these bounds over their specialized versions with the Shannon conditional entropy (α = 1) is demonstrated. In particular, in the case where M is finite, we show how to generalize Fano's inequality under both the conventional and list-decision settings. As a counterpart to the generalized Fano's inequality, allowing M to be infinite, a lower bound on the Arimoto-Rényi conditional entropy is derived as a function of the minimum error probability. Explicit upper and lower bounds on the minimum error probability are obtained as a function of the Arimoto-Rényi conditional entropy for both positive and negative α. Furthermore, we give upper bounds on the minimum error probability as functions of the Rényi divergence. In the setup of discrete memoryless channels, we analyze the exponentially vanishing decay of the Arimoto-Rényi conditional entropy of the transmitted codeword given the channel output when averaged over a random-coding ensemble.
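The connection between the Arimoto-Rényi conditional entropy and the minimum error probability can be illustrated numerically. The sketch below (the joint pmf is hypothetical; the paper provides no numerical example) computes H_α(X|Y) for a toy distribution and checks the known limiting identity ε = 1 − exp(−H_∞(X|Y)), where ε is the Bayesian minimum error probability, by taking a large order α.

```python
import math

# Hypothetical joint pmf P(x, y) over X in {0, 1, 2}, Y in {0, 1}
# (illustrative toy example only; not from the paper).
P = [[0.30, 0.10],
     [0.05, 0.25],
     [0.10, 0.20]]

def arimoto_renyi_cond_entropy(P, alpha):
    """Arimoto-Renyi conditional entropy H_alpha(X|Y) in nats,
    for alpha > 0, alpha != 1:
    H_alpha(X|Y) = (alpha/(1-alpha)) * log sum_y P_Y(y) * (sum_x P(x|y)^alpha)^(1/alpha)
    """
    p_y = [sum(row[y] for row in P) for y in range(len(P[0]))]
    total = 0.0
    for y, py in enumerate(p_y):
        inner = sum((row[y] / py) ** alpha for row in P)
        total += py * inner ** (1.0 / alpha)
    return (alpha / (1.0 - alpha)) * math.log(total)

def min_error_prob(P):
    """Bayesian minimum error probability of the MAP decision rule:
    eps = 1 - sum_y max_x P(x, y)."""
    return 1.0 - sum(max(row[y] for row in P) for y in range(len(P[0])))

eps = min_error_prob(P)
# As alpha -> infinity, H_alpha(X|Y) -> -log(1 - eps), so
# 1 - exp(-H_alpha(X|Y)) should approach eps for large alpha.
h_large = arimoto_renyi_cond_entropy(P, 500.0)
print(eps, 1.0 - math.exp(-h_large))
```

For this pmf the MAP rule yields ε = 0.45, and 1 − exp(−H_α(X|Y)) at α = 500 agrees with it to within about 1%, consistent with the min-entropy (α = ∞) endpoint of the bounds discussed in the abstract.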

Original language: English (US)
Pages (from-to): 4-25
Number of pages: 22
Journal: IEEE Transactions on Information Theory
Volume: 64
Issue number: 1
DOIs
State: Published - Jan 1 2018

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

