Upper bounds on the relative entropy and Rényi divergence as a function of total variation distance for finite alphabets

Igal Sason, Sergio Verdú

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

23 Scopus citations

Abstract

A new upper bound on the relative entropy is derived as a function of the total variation distance for probability measures defined on a common finite alphabet. The bound improves a previously reported bound by Csiszár and Talata. It is further extended to an upper bound on the Rényi divergence of an arbitrary non-negative order (including ∞) as a function of the total variation distance.
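The abstract concerns the relationship between relative entropy and total variation distance on a finite alphabet; the paper's bound runs in the opposite direction to Pinsker's inequality (listed in the keywords), which lower-bounds relative entropy by total variation. A minimal numerical sketch of the two quantities and a check of Pinsker's inequality (in nats, D(P‖Q) ≥ 2δ²) — the distributions below are illustrative, not from the paper:

```python
import numpy as np

def relative_entropy(p, q):
    """D(P||Q) in nats for distributions on a common finite alphabet."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def total_variation(p, q):
    """Total variation distance: half the L1 distance between P and Q."""
    return 0.5 * float(np.sum(np.abs(np.asarray(p, dtype=float)
                                     - np.asarray(q, dtype=float))))

# Illustrative distributions on a 3-letter alphabet
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

d = relative_entropy(p, q)
tv = total_variation(p, q)
assert d >= 2 * tv**2  # Pinsker's inequality (natural-log units)
```

Upper bounds of the kind derived in the paper additionally depend on the alphabet through quantities such as the smallest positive mass of Q, which is why they are stated for finite alphabets.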

Original language: English (US)
Title of host publication: ITW 2015 - 2015 IEEE Information Theory Workshop
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 214-218
Number of pages: 5
ISBN (Electronic): 9781467378529
DOIs
State: Published - Dec 17 2015
Event: IEEE Information Theory Workshop, ITW 2015 - Jeju Island, Korea, Republic of
Duration: Oct 11 2015 - Oct 15 2015

Publication series

Name: ITW 2015 - 2015 IEEE Information Theory Workshop

Other

Other: IEEE Information Theory Workshop, ITW 2015
Country/Territory: Korea, Republic of
City: Jeju Island
Period: 10/11/15 - 10/15/15

All Science Journal Classification (ASJC) codes

  • Information Systems

Keywords

  • Pinsker's inequality
  • Rényi divergence
  • relative entropy
  • relative information
  • total variation distance
