Mutual information and MMSE in Gaussian channels

Dongning Guo, Shlomo Shamai, Sergio Verdú

Research output: Contribution to journal › Conference article › peer-review

Abstract

Consider arbitrarily distributed input signals observed in additive Gaussian noise. A new fundamental relationship is found between the input-output mutual information and the minimum mean-square error (MMSE) in estimating the input given the output: the derivative of the mutual information (in nats) with respect to the signal-to-noise ratio (SNR) is equal to half the MMSE. This identity holds for both scalar and vector signals, as well as for discrete- and continuous-time noncausal MMSE estimation (smoothing). A consequence of the result is a new relationship in continuous-time nonlinear filtering: regardless of the input statistics, the causal MMSE achieved at signal-to-noise ratio snr is equal to the expected value of the noncausal MMSE achieved with a channel whose SNR is drawn uniformly at random between 0 and snr.
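
In the paper's notation the channel is Y = \sqrt{snr}\, X + N with N standard Gaussian, and the central identity can be written compactly. The display and the Gaussian-input check below are a standard illustration added here for concreteness; the symbols I(snr), mmse(snr), and cmmse(snr) are notational assumptions, not quoted from the abstract:

\[
\frac{\mathrm{d}}{\mathrm{d}\,\mathsf{snr}}\, I(\mathsf{snr}) \;=\; \frac{1}{2}\,\mathrm{mmse}(\mathsf{snr}),
\qquad
\mathrm{mmse}(\mathsf{snr}) \;=\; \mathbb{E}\!\left[\big(X - \mathbb{E}[X \mid Y]\big)^{2}\right].
\]

For standard Gaussian input X ~ N(0, 1), I(snr) = (1/2) ln(1 + snr) nats and mmse(snr) = 1/(1 + snr), so dI/dsnr = 1/(2(1 + snr)) = mmse(snr)/2, as the identity requires. The causal-filtering consequence then reads cmmse(snr) = (1/snr) \int_0^{snr} \mathrm{mmse}(\gamma)\, \mathrm{d}\gamma, which for this input evaluates to ln(1 + snr)/snr.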

Original language: English (US)
Pages (from-to): 347
Number of pages: 1
Journal: IEEE International Symposium on Information Theory - Proceedings
State: Published - 2004
Event: 2004 IEEE International Symposium on Information Theory - Chicago, IL, United States
Duration: Jun 27, 2004 - Jul 2, 2004

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics
