Mismatched estimation and relative entropy

Sergio Verdú

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

A random variable with distribution P is observed in Gaussian noise and is estimated by a minimum mean-square estimator that assumes that the distribution is Q. This paper shows that the integral over all signal-to-noise ratios of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy D(P||Q). This representation of relative entropy can be generalized to non-real-valued random variables, and can be particularized to give a new general representation of mutual information in terms of conditional means. Inspired by the new representation, we also propose a definition of free relative entropy which fills a gap in, and is consistent with, the literature on free probability.
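
The identity stated in the abstract can be sanity-checked numerically in a special case. The sketch below is not from the paper: it assumes both P and Q are zero-mean Gaussian with illustrative variances sP and sQ, for which the mismatched conditional-mean estimator is linear and both the excess mean-square error and D(P||Q) have closed forms; all names are illustrative.

# Numerical check of the abstract's identity
#   int_0^inf [ mse_Q(snr) - mmse_P(snr) ] d(snr) = 2 D(P||Q)
# in the special case P = N(0, sP), Q = N(0, sQ).
import numpy as np
from scipy.integrate import quad

sP, sQ = 2.0, 0.5  # variances of the true prior P and the mismatched prior Q

def mmse_P(snr):
    # MMSE of X ~ N(0, sP) from the observation Y = sqrt(snr)*X + N, N ~ N(0, 1)
    return sP / (1.0 + snr * sP)

def mse_Q(snr):
    # Mean-square error, under the true prior P, of the estimator that
    # wrongly assumes the prior is Q; since Q is Gaussian, that estimator
    # is linear: x_hat = sqrt(snr)*sQ/(1 + snr*sQ) * Y.
    a = np.sqrt(snr) * sQ / (1.0 + snr * sQ)
    return (1.0 - a * np.sqrt(snr)) ** 2 * sP + a ** 2

excess, _ = quad(lambda g: mse_Q(g) - mmse_P(g), 0.0, np.inf)
two_D = sP / sQ - 1.0 - np.log(sP / sQ)  # 2 D(P||Q) for zero-mean Gaussians

print(f"integral of excess MSE: {excess:.6f}")
print(f"2 D(P||Q):              {two_D:.6f}")

With sP = 2 and sQ = 0.5, both printed values come out to about 1.6137 (= 3 - ln 4), matching the stated identity in this special case.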

Original language: English (US)
Title of host publication: 2009 IEEE International Symposium on Information Theory, ISIT 2009
Pages: 809-813
Number of pages: 5
State: Published - 2009
Event: 2009 IEEE International Symposium on Information Theory, ISIT 2009 - Seoul, Korea, Republic of
Duration: Jun 28 2009 - Jul 3 2009

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
ISSN (Print): 2157-8102

Other

Other: 2009 IEEE International Symposium on Information Theory, ISIT 2009
Country/Territory: Korea, Republic of
City: Seoul
Period: 6/28/09 - 7/3/09

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics
