Mismatched estimation and relative entropy

Sergio Verdú

Research output: Contribution to journal › Article › peer-review

Abstract

A random variable with distribution P is observed in Gaussian noise and is estimated by a mismatched minimum mean-square estimator that assumes that the distribution is Q, instead of P. This paper shows that the integral over all signal-to-noise ratios (SNRs) of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy D(P ∥ Q) (in nats). This representation of relative entropy can be generalized to non-real-valued random variables, and can be particularized to give new general representations of mutual information in terms of conditional means. Inspired by the new representation, we also propose a definition of free relative entropy which fills a gap in, and is consistent with, the literature on free probability.
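In the paper's setting, X ∼ P is observed as Y = √γ·X + N with standard Gaussian noise N, and the claim is that the excess mean-square error of the mismatched conditional-mean estimator E_Q[X | Y], integrated over all SNRs γ, equals 2 D(P ∥ Q) in nats. The sketch below checks this numerically in the Gaussian special case P = N(0, v_P), Q = N(0, v_Q), where both conditional means are linear and every quantity has a closed form; the specific variance values and function names are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.integrate import quad

# Gaussian special case: X ~ P = N(0, vp) is observed as
# Y = sqrt(snr) * X + N with N ~ N(0, 1), but estimated with the
# conditional mean computed under Q = N(0, vq).
vp, vq = 1.0, 2.5   # variances of P and Q (illustrative values)

def mse_matched(snr):
    # MMSE of the matched estimator E_P[X | Y] for Gaussian P.
    return vp / (1.0 + snr * vp)

def mse_mismatched(snr):
    # Mismatched estimator E_Q[X | Y] = a * Y with
    # a = sqrt(snr) * vq / (1 + snr * vq).  Its error under P is
    # (1 - a*sqrt(snr)) * X - a * N, so the MSE is
    # vp * (1 - a*sqrt(snr))**2 + a**2.
    a = np.sqrt(snr) * vq / (1.0 + snr * vq)
    return vp * (1.0 - a * np.sqrt(snr)) ** 2 + a ** 2

# Integral over all SNRs of the excess mean-square error ...
integral, _ = quad(lambda s: mse_mismatched(s) - mse_matched(s),
                   0.0, np.inf)

# ... compared against twice the relative entropy D(P || Q) in nats,
# which for zero-mean Gaussians is (1/2)(vp/vq - 1 - ln(vp/vq)).
d_pq = 0.5 * (vp / vq - 1.0 - np.log(vp / vq))

print(f"integral of excess MSE : {integral:.6f}")
print(f"2 * D(P || Q) in nats  : {2.0 * d_pq:.6f}")
```

With v_P = 1 and v_Q = 2.5, both printed values should agree up to quadrature error (≈ 0.3163 nats), consistent with the identity stated in the abstract.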

Original language: English (US)
Article number: 5508632
Pages (from-to): 3712-3720
Number of pages: 9
Journal: IEEE Transactions on Information Theory
Volume: 56
Issue number: 8
DOIs
State: Published - Aug 2010

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

Keywords

  • Divergence
  • Shannon theory
  • free probability
  • minimum mean-square error (MMSE) estimation
  • mutual information
  • relative entropy
  • statistics
