Estimation-theoretic representation of mutual information

Daniel P. Palomar, Sergio Verdú

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

A fundamental relationship between information theory and estimation theory was recently unveiled for the Gaussian channel, relating the derivative of mutual information to the minimum mean-square error (MMSE). This paper generalizes this fundamental link between information theory and estimation theory to arbitrary channels, in particular encompassing the discrete memoryless channel (DMC). In addition to the intrinsic theoretical interest of such a result, it naturally leads to an efficient numerical computation of mutual information in cases where it was previously infeasible, such as with LDPC codes.
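The Gaussian-channel relationship the abstract refers to states that the derivative of mutual information with respect to SNR equals half the MMSE (in nats). This can be checked numerically; the following is a minimal sketch for equiprobable BPSK inputs, where the function names `bpsk_mi` and `bpsk_mmse` and the quadrature grid are illustrative choices, not part of the paper:

```python
import numpy as np

# Gaussian channel Y = sqrt(snr)*X + N, with N ~ N(0, 1) and an
# equiprobable BPSK input X in {-1, +1}.  The closed-form integrals
# below (in nats) are standard for this input; we approximate them
# with simple fixed-grid quadrature.
_Y = np.linspace(-12.0, 12.0, 4001)
_PHI = np.exp(-_Y**2 / 2) / np.sqrt(2 * np.pi)  # standard normal pdf
_DY = _Y[1] - _Y[0]

def bpsk_mi(snr):
    """Mutual information I(X; Y) in nats at the given SNR."""
    integrand = _PHI * np.log(np.cosh(snr - np.sqrt(snr) * _Y))
    return snr - np.sum(integrand) * _DY

def bpsk_mmse(snr):
    """MMSE of estimating X from Y at the given SNR."""
    integrand = _PHI * np.tanh(snr - np.sqrt(snr) * _Y)
    return 1.0 - np.sum(integrand) * _DY

# Check dI/dsnr = mmse/2 with a central finite difference at snr = 1.
snr, h = 1.0, 1e-4
dI = (bpsk_mi(snr + h) - bpsk_mi(snr - h)) / (2 * h)
print(dI, bpsk_mmse(snr) / 2)  # the two values agree closely
```

Differentiating the mutual-information curve thus never requires more than an MMSE computation, which is the practical payoff of the identity.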

Original language: English (US)
Title of host publication: 43rd Annual Allerton Conference on Communication, Control and Computing 2005
Publisher: University of Illinois at Urbana-Champaign, Coordinated Science Laboratory and Department of Computer and Electrical Engineering
Pages: 2013-2022
Number of pages: 10
ISBN (Electronic): 9781604234916
State: Published - 2005
Event: 43rd Annual Allerton Conference on Communication, Control and Computing 2005 - Monticello, United States
Duration: Sep 28 2005 - Sep 30 2005

Publication series

Name: 43rd Annual Allerton Conference on Communication, Control and Computing 2005
Volume: 4

Other

Other: 43rd Annual Allerton Conference on Communication, Control and Computing 2005
Country/Territory: United States
City: Monticello
Period: 9/28/05 - 9/30/05

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Computer Science Applications
