Upper bounds on the relative entropy and Rényi divergence as a function of total variation distance for finite alphabets

Igal Sason, Sergio Verdú

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

14 Scopus citations

Abstract

A new upper bound on the relative entropy is derived as a function of the total variation distance for probability measures defined on a common finite alphabet. The bound improves a previously reported bound by Csiszár and Talata. It is further extended to an upper bound on the Rényi divergence of an arbitrary non-negative order (including ∞) as a function of the total variation distance.
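As an illustrative sketch only (the paper's own improved bounds are not reproduced here), the three quantities the abstract relates — relative entropy, Rényi divergence of order α, and total variation distance on a finite alphabet — can be computed directly. The classical Pinsker inequality, which bounds relative entropy from below by total variation, is used as a sanity check; the paper derives bounds in the opposite (upper-bound) direction, which require the finite-alphabet assumption. The distributions `p` and `q` below are arbitrary examples, not taken from the paper.

```python
import math

def total_variation(p, q):
    """Total variation distance |P - Q| = (1/2) * sum_i |p_i - q_i|."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    """Relative entropy D(P||Q) in nats; assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P||Q) in nats, for alpha >= 0, alpha != 1.

    alpha = math.inf gives the limiting value log max_i (p_i / q_i).
    """
    if alpha == math.inf:
        return max(math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (1.0 / (alpha - 1.0)) * math.log(
        sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q) if pi > 0)
    )

# Example distributions on a 3-letter alphabet (hypothetical values).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

tv = total_variation(p, q)
d = kl_divergence(p, q)

# Pinsker's classical lower bound: D(P||Q) >= 2 * |P - Q|^2 (in nats).
assert d >= 2 * tv ** 2

# Rényi divergence is non-decreasing in alpha; order 1 recovers D(P||Q).
assert renyi_divergence(p, q, 0.5) <= d
assert d <= renyi_divergence(p, q, 2.0) <= renyi_divergence(p, q, math.inf)
```

Note that any upper bound on D(P‖Q) in terms of total variation alone is impossible on general alphabets (the divergence can be infinite while the distance stays small), which is why the finite-alphabet restriction in the title matters.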

Original language: English (US)
Title of host publication: ITW 2015 - 2015 IEEE Information Theory Workshop
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 214-218
Number of pages: 5
ISBN (Electronic): 9781467378529
DOI: 10.1109/ITWF.2015.7360766
State: Published - Dec 17 2015
Event: IEEE Information Theory Workshop, ITW 2015 - Jeju Island, Korea, Republic of
Duration: Oct 11 2015 - Oct 15 2015

Publication series

Name: ITW 2015 - 2015 IEEE Information Theory Workshop

All Science Journal Classification (ASJC) codes

  • Information Systems


  • Cite this

    Sason, I., & Verdú, S. (2015). Upper bounds on the relative entropy and Rényi divergence as a function of total variation distance for finite alphabets. In ITW 2015 - 2015 IEEE Information Theory Workshop (pp. 214-218). [7360766] (ITW 2015 - 2015 IEEE Information Theory Workshop). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ITWF.2015.7360766