Functional properties of minimum mean-square error and mutual information

Yihong Wu, Sergio Verdú

Research output: Contribution to journal › Article › peer-review

97 Scopus citations


We show that the minimum mean-square error (MMSE) is a concave functional of the input-output joint distribution and explore its various regularity properties. In the case of additive Gaussian noise, the MMSE is shown to be weakly continuous in the input distribution and Lipschitz continuous with respect to the quadratic Wasserstein distance for peak-limited inputs. Regularity properties of mutual information are also obtained, and several applications to information theory and the central limit theorem are discussed.
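To make the concavity property concrete, here is a minimal numerical sketch (not from the paper; the function name `mmse`, the integration grid, and the example input distributions are illustrative assumptions). It estimates the MMSE of a discrete input observed in unit-variance additive Gaussian noise by numerical integration, then checks that the MMSE of a mixture of two input distributions is at least the corresponding mixture of their MMSEs, which is what concavity in the input-output joint distribution implies when the channel is fixed:

```python
import numpy as np

def mmse(support, probs, y_grid):
    """MMSE of estimating X from Y = X + N, N ~ N(0, 1), where X is
    discrete on `support` with probabilities `probs`. Computed as
    E[Var(X | Y)] by numerical integration over the y grid."""
    # Gaussian likelihoods phi(y - x_i), shape (len(y_grid), len(support))
    lik = np.exp(-0.5 * (y_grid[:, None] - support[None, :]) ** 2) / np.sqrt(2 * np.pi)
    f_y = lik @ probs                          # output density f_Y(y)
    post = lik * probs / f_y[:, None]          # posterior P(X = x_i | Y = y)
    cond_mean = post @ support                 # E[X | Y = y]
    cond_var = post @ (support ** 2) - cond_mean ** 2   # Var(X | Y = y)
    dy = y_grid[1] - y_grid[0]
    return float(np.sum(cond_var * f_y) * dy)  # E[Var(X | Y)] = MMSE

support = np.array([-1.0, 0.0, 1.0])
y = np.linspace(-8.0, 8.0, 4001)
p0 = np.array([0.5, 0.0, 0.5])    # symmetric binary input
p1 = np.array([0.0, 1.0, 0.0])    # deterministic input (MMSE = 0)
lam = 0.5
p_mix = lam * p1 + (1 - lam) * p0

# Concavity: the MMSE of the mixed input dominates the mixture of MMSEs
# (small tolerance guards against quadrature round-off).
lhs = mmse(support, p_mix, y)
rhs = lam * mmse(support, p1, y) + (1 - lam) * mmse(support, p0, y)
assert lhs + 1e-9 >= rhs
```

Intuitively, a genie revealing which mixture component generated the input can only help the estimator, so the mixed input is at least as hard to estimate as the average of the two components; this is the operational content of the concavity result.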

Original language: English (US)
Article number: 6084749
Pages (from-to): 1289-1301
Number of pages: 13
Journal: IEEE Transactions on Information Theory
Issue number: 3
State: Published - Mar 2012

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences


Keywords

  • Bayesian statistics
  • Gaussian noise
  • central limit theorem
  • minimum mean-square error (MMSE)
  • mutual information
  • non-Gaussian noise

