Arimoto-Rényi conditional entropy and Bayesian hypothesis testing

Igal Sason, Sergio Verdú

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

This paper gives upper and lower bounds on the minimum error probability of Bayesian M-ary hypothesis testing in terms of the Arimoto-Rényi conditional entropy of an arbitrary order α. These bounds are shown to be tighter than their specialized versions based on the Shannon conditional entropy (α = 1). In particular, when M is finite, it is shown how to generalize Fano's inequality under both the conventional and list-decision settings. As a counterpart to the generalized Fano inequality, and allowing M to be infinite, a lower bound on the Arimoto-Rényi conditional entropy is derived as a function of the minimum error probability. Explicit upper and lower bounds on the minimum error probability are also obtained as functions of the Arimoto-Rényi conditional entropy.
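
Since this record contains no formulas, the following minimal Python sketch (not taken from the paper) illustrates the two central quantities the abstract relates: the Arimoto-Rényi conditional entropy H_α(X|Y) and the minimum (MAP) error probability of Bayesian M-ary hypothesis testing. The toy joint distribution and function names are hypothetical; only the standard definitions are used, along with two well-known special cases that the paper's bounds generalize: the identity H_∞(X|Y) = −log(1 − ε) and the classical Fano inequality at α = 1.

```python
# Minimal sketch (not from the paper): Arimoto-Renyi conditional entropy
# H_alpha(X|Y) and the MAP minimum error probability for a toy joint pmf,
# with two standard sanity checks:
#   alpha -> inf :  H_inf(X|Y) = -log(1 - eps), eps = MAP error probability
#   alpha  = 1   :  Shannon case, classical Fano: H(X|Y) <= h(eps) + eps*log(M-1)
import numpy as np

def arimoto_renyi_cond_entropy(p_xy, alpha):
    """H_alpha(X|Y) in bits for a joint pmf p_xy (rows: x, columns: y)."""
    p_y = p_xy.sum(axis=0)
    if np.isinf(alpha):
        # H_inf(X|Y) = -log sum_y max_x P_XY(x, y)
        return -np.log2(p_xy.max(axis=0).sum())
    if alpha == 1.0:
        # Shannon conditional entropy H(X|Y)
        with np.errstate(divide="ignore", invalid="ignore"):
            terms = np.where(p_xy > 0, p_xy * np.log2(p_xy / p_y), 0.0)
        return -terms.sum()
    # Arimoto's definition:
    # (alpha/(1-alpha)) * log sum_y (sum_x P_XY(x, y)^alpha)^(1/alpha)
    inner = (p_xy ** alpha).sum(axis=0) ** (1.0 / alpha)
    return (alpha / (1.0 - alpha)) * np.log2(inner.sum())

# Hypothetical joint pmf over M = 3 hypotheses (rows) and 3 observations (columns).
p_xy = np.array([[0.30, 0.05, 0.05],
                 [0.05, 0.25, 0.05],
                 [0.05, 0.05, 0.15]])
M = p_xy.shape[0]

# Minimum Bayesian error probability, achieved by the MAP decision rule:
# eps = 1 - sum_y max_x P_XY(x, y).
eps = 1.0 - p_xy.max(axis=0).sum()

for alpha in (0.5, 1.0, 2.0, np.inf):
    print(f"H_{alpha}(X|Y) = {arimoto_renyi_cond_entropy(p_xy, alpha):.4f} bits")

# Check the alpha -> inf identity and the classical (alpha = 1) Fano bound.
assert np.isclose(arimoto_renyi_cond_entropy(p_xy, np.inf), -np.log2(1 - eps))
h_eps = -eps * np.log2(eps) - (1 - eps) * np.log2(1 - eps)  # binary entropy
fano = h_eps + eps * np.log2(M - 1)
print(f"eps = {eps:.4f}; Fano bound on H(X|Y): {fano:.4f} bits")
assert arimoto_renyi_cond_entropy(p_xy, 1.0) <= fano + 1e-12
```

For this toy distribution the MAP error is ε = 0.3, and the printed values decrease monotonically in α, consistent with the general ordering H_α(X|Y) ≥ H_β(X|Y) for α ≤ β; the paper's contribution is to bound ε above and below by explicit functions of H_α(X|Y) for all orders α, not just the two endpoint cases checked here.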

Original language: English (US)
Title of host publication: 2017 IEEE International Symposium on Information Theory, ISIT 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2965-2969
Number of pages: 5
ISBN (Electronic): 9781509040964
DOIs
State: Published - Aug 9 2017
Event: 2017 IEEE International Symposium on Information Theory, ISIT 2017 - Aachen, Germany
Duration: Jun 25 2017 - Jun 30 2017

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
ISSN (Print): 2157-8095

Other

Other: 2017 IEEE International Symposium on Information Theory, ISIT 2017
Country/Territory: Germany
City: Aachen
Period: 6/25/17 - 6/30/17

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics

Keywords

  • Arimoto-Rényi conditional entropy
  • Fano's inequality
  • Hypothesis testing
  • Information measures
  • Minimum probability of error
  • Rényi divergence
