TY - JOUR
T1 - Mismatched estimation and relative entropy
AU - Verdú, Sergio
N1 - Funding Information:
Manuscript received July 21, 2009; revised March 21, 2010. Date of current version July 14, 2010. This work was supported in part by the National Science Foundation under Grants CCF-0635154 and CCF-0728445. The material in this paper was presented in part at the IEEE International Symposium on Information Theory, Seoul, South Korea, June 2009. The author is with the Department of Electrical Engineering, Princeton University, Princeton, NJ 08544 USA (e-mail: [email protected]). Communicated by H. Yamamoto, Associate Editor for Shannon Theory. Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org. Digital Object Identifier 10.1109/TIT.2010.2050800 1For brevity, throughout the paper, information units are nats, and logarithms are natural.
PY - 2010/8
Y1 - 2010/8
N2 - A random variable with distribution P is observed in Gaussian noise and is estimated by a mismatched minimum mean-square estimator that assumes that the distribution is Q, instead of P. This paper shows that the integral over all signal-to-noise ratios (SNRs) of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy D(P ∥ Q) (in nats). This representation of relative entropy can be generalized to nonreal-valued random variables, and can be particularized to give new general representations of mutual information in terms of conditional means. Inspired by the new representation, we also propose a definition of free relative entropy which fills a gap in, and is consistent with, the literature on free probability.
AB - A random variable with distribution P is observed in Gaussian noise and is estimated by a mismatched minimum mean-square estimator that assumes that the distribution is Q, instead of P. This paper shows that the integral over all signal-to-noise ratios (SNRs) of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy D(P ∥ Q) (in nats). This representation of relative entropy can be generalized to nonreal-valued random variables, and can be particularized to give new general representations of mutual information in terms of conditional means. Inspired by the new representation, we also propose a definition of free relative entropy which fills a gap in, and is consistent with, the literature on free probability.
KW - Divergence
KW - Shannon theory
KW - free probability
KW - minimum mean-square error (MMSE) estimation
KW - mutual information
KW - relative entropy
KW - statistics
UR - http://www.scopus.com/inward/record.url?scp=77954590943&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=77954590943&partnerID=8YFLogxK
U2 - 10.1109/TIT.2010.2050800
DO - 10.1109/TIT.2010.2050800
M3 - Article
AN - SCOPUS:77954590943
SN - 0018-9448
VL - 56
SP - 3712
EP - 3720
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 8
M1 - 5508632
ER -