TY - JOUR
T1 - F-Divergence Inequalities
AU - Sason, Igal
AU - Verdú, Sergio
N1 - Funding Information:
This work has been supported by the Israel Science Foundation (ISF) under Grant 12/12, by NSF Grant CCF-1016625, by the Center for Science of Information, an NSF Science and Technology Center under Grant CCF-0939370, and by ARO under MURI Grant W911NF-15-1-0479.
Publisher Copyright:
© 1963-2012 IEEE.
PY - 2016/11
Y1 - 2016/11
N2 - This paper develops systematic approaches to obtain f-divergence inequalities, dealing with pairs of probability measures defined on arbitrary alphabets. Functional domination is one such approach, where special emphasis is placed on finding the best possible constant upper bounding a ratio of f-divergences. Another approach used for the derivation of bounds among f-divergences relies on moment inequalities and the logarithmic-convexity property, which results in tight bounds on the relative entropy and Bhattacharyya distance in terms of χ² divergences. A rich variety of bounds are shown to hold under boundedness assumptions on the relative information. Special attention is devoted to the total variation distance and its relation to the relative information and relative entropy, including 'reverse Pinsker inequalities,' as well as to the Eγ divergence, which generalizes the total variation distance. Pinsker's inequality is extended to this type of f-divergence, a result which leads to an inequality linking the relative entropy and relative information spectrum. Integral expressions of the Rényi divergence in terms of the relative information spectrum are derived, leading to bounds on the Rényi divergence in terms of either the variational distance or relative entropy.
AB - This paper develops systematic approaches to obtain f-divergence inequalities, dealing with pairs of probability measures defined on arbitrary alphabets. Functional domination is one such approach, where special emphasis is placed on finding the best possible constant upper bounding a ratio of f-divergences. Another approach used for the derivation of bounds among f-divergences relies on moment inequalities and the logarithmic-convexity property, which results in tight bounds on the relative entropy and Bhattacharyya distance in terms of χ² divergences. A rich variety of bounds are shown to hold under boundedness assumptions on the relative information. Special attention is devoted to the total variation distance and its relation to the relative information and relative entropy, including 'reverse Pinsker inequalities,' as well as to the Eγ divergence, which generalizes the total variation distance. Pinsker's inequality is extended to this type of f-divergence, a result which leads to an inequality linking the relative entropy and relative information spectrum. Integral expressions of the Rényi divergence in terms of the relative information spectrum are derived, leading to bounds on the Rényi divergence in terms of either the variational distance or relative entropy.
KW - Pinsker's inequality
KW - relative entropy
KW - Rényi divergence
KW - f-divergence
KW - relative information
KW - total variation distance
UR - http://www.scopus.com/inward/record.url?scp=85027055882&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85027055882&partnerID=8YFLogxK
U2 - 10.1109/TIT.2016.2603151
DO - 10.1109/TIT.2016.2603151
M3 - Article
AN - SCOPUS:85027055882
SN - 0018-9448
VL - 62
SP - 5973
EP - 6006
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 11
M1 - 7552457
ER -
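
Note: the abstract presumes familiarity with the f-divergence and with Pinsker's inequality. A minimal LaTeX sketch of both follows, under conventions the record itself does not fix and which are assumed here: f convex on (0, ∞) with f(1) = 0, P absolutely continuous with respect to Q, and |P − Q| the unnormalized total variation distance taking values in [0, 2].

% Sketch of the two definitions behind the abstract, under the
% assumed conventions stated above (not taken from the record).
\[
  D_f(P \,\|\, Q) \;=\; \int f\!\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right) \mathrm{d}Q ,
  \qquad
  |P - Q| \;=\; \int \left| \mathrm{d}P - \mathrm{d}Q \right| .
\]
% Relative entropy is the f-divergence with f(t) = t \log t, and
% Pinsker's inequality (the baseline that the paper extends to the
% E_\gamma divergence) reads, in nats,
\[
  D(P \,\|\, Q) \;\ge\; \tfrac{1}{2}\, |P - Q|^{2} .
\]

When relative entropy is measured in bits rather than nats, the right side of the last display picks up a factor of log e.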