Representation of mutual information via input estimates

Daniel P. Palomar, Sergio Verdú

Research output: Contribution to journal › Article › peer-review

50 Scopus citations

Abstract

A relationship between information theory and estimation theory was recently shown for the Gaussian channel, relating the derivative of the mutual information to the minimum mean-square error (MMSE). This paper generalizes the link between information theory and estimation theory to arbitrary channels, giving representations of the derivative of mutual information as a function of the conditional marginal input distributions given the outputs. We illustrate the use of this representation in the efficient numerical computation of the mutual information achieved by inputs such as specific codes or natural language.
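The Gaussian-channel result the abstract refers to is the I-MMSE identity of Guo, Shamai, and Verdú: for Y = √snr · X + N with N ~ N(0, 1) independent of X, the derivative of the mutual information (in nats) with respect to the signal-to-noise ratio equals half the MMSE of estimating the input from the output:

\[ \frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\, I\bigl(X;\, \sqrt{\mathrm{snr}}\, X + N\bigr) \;=\; \frac{1}{2}\,\mathbb{E}\!\left[ \bigl(X - \mathbb{E}[X \mid Y]\bigr)^{2} \right]. \]

To illustrate the kind of numerical computation the abstract mentions, the sketch below estimates I(X; Y) by Monte Carlo for equiprobable BPSK over an AWGN channel, using the posterior probability of the input given each output (the "input estimate"). This is a minimal illustration under assumptions made here (BPSK inputs, real AWGN channel, the hypothetical function name bpsk_awgn_mi), not the algorithm developed in the paper.

```python
import numpy as np


def bpsk_awgn_mi(snr, n=200_000, seed=0):
    """Monte Carlo estimate of I(X; Y) in bits for equiprobable BPSK
    over the real AWGN channel Y = sqrt(snr) * X + N, N ~ N(0, 1).

    H(X | Y) is averaged from the posterior P(X = +1 | y) at each
    simulated output (the soft input estimate), and
    I(X; Y) = H(X) - H(X | Y) with H(X) = 1 bit.
    """
    rng = np.random.default_rng(seed)
    x = rng.choice([-1.0, 1.0], size=n)            # equiprobable inputs
    y = np.sqrt(snr) * x + rng.standard_normal(n)  # channel outputs
    llr = 2.0 * np.sqrt(snr) * y                   # log P(y|+1) / P(y|-1)
    p = 0.5 * (1.0 + np.tanh(0.5 * llr))           # stable logistic: P(X=+1 | y)
    p = np.clip(p, 1e-12, 1.0 - 1e-12)             # guard against log(0)
    h_cond = -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))
    return 1.0 - h_cond.mean()


if __name__ == "__main__":
    for snr_db in (-5, 0, 5, 10):
        snr = 10.0 ** (snr_db / 10.0)
        print(f"SNR {snr_db:>3} dB: I(X;Y) ~ {bpsk_awgn_mi(snr):.4f} bits")
```

Averaging the binary entropy of the posterior over many channel outputs converges to H(X | Y), so the estimate approaches I(X; Y) as the sample size grows; the same posterior-based recipe extends to inputs drawn from codes or text, which is the use case the abstract highlights.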

Original language: English (US)
Pages (from-to): 453-470
Number of pages: 18
Journal: IEEE Transactions on Information Theory
Volume: 53
Issue number: 2
DOIs
State: Published - Feb 2007

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

Keywords

  • Computation of mutual information
  • Extrinsic information
  • Input estimation
  • Low-density parity-check (LDPC) codes
  • Minimum mean square error (MMSE)
  • Mutual information
  • Soft channel decoding
