Abstract
A random variable with distribution P is observed in Gaussian noise and is estimated by a mismatched minimum mean-square error estimator that assumes the distribution is Q instead of P. This paper shows that the integral over all signal-to-noise ratios (SNRs) of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy D(P ∥ Q) (in nats). This representation of relative entropy can be generalized to nonreal-valued random variables, and can be particularized to give new general representations of mutual information in terms of conditional means. Inspired by the new representation, we also propose a definition of free relative entropy, which fills a gap in, and is consistent with, the literature on free probability.
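As a brief sketch of the identity stated in the abstract, written in notation introduced here for illustration (the symbols γ, Y_γ, mmse_P, and mse_{P,Q} are our labels and may differ from the paper's): let X ~ P be observed as Y_γ = √γ X + N with N ~ N(0,1) independent of X, where γ ≥ 0 is the SNR.

```latex
% Matched (optimal) and mismatched mean-square errors at SNR \gamma,
% with the outer expectation taken under X \sim P in both cases; the
% conditional-mean estimator in the second expression is computed as if X \sim Q.
\mathrm{mmse}_P(\gamma)    = \mathbb{E}\bigl[(X - \mathbb{E}_P[X \mid Y_\gamma])^2\bigr],
\qquad
\mathrm{mse}_{P,Q}(\gamma) = \mathbb{E}\bigl[(X - \mathbb{E}_Q[X \mid Y_\gamma])^2\bigr].

% Identity described in the abstract (relative entropy measured in nats):
% the integrated excess mean-square error equals twice the relative entropy.
D(P \,\|\, Q) \;=\; \frac{1}{2} \int_0^\infty
  \bigl[\mathrm{mse}_{P,Q}(\gamma) - \mathrm{mmse}_P(\gamma)\bigr]\, d\gamma .
```

For instance, when P and Q are both Gaussian, both conditional means are affine in Y_γ, so the integrand is available in closed form and the identity can be checked against the usual formula for the relative entropy between two Gaussians.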
| Original language | English (US) |
| --- | --- |
| Article number | 5508632 |
| Pages (from-to) | 3712-3720 |
| Number of pages | 9 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 56 |
| Issue number | 8 |
| DOIs | |
| State | Published - Aug 2010 |
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences
Keywords
- Divergence
- Shannon theory
- free probability
- minimum mean-square error (MMSE) estimation
- mutual information
- relative entropy
- statistics