Mutual information and conditional mean estimation in Poisson channels

Dongning Guo, Shlomo Shamai, Sergio Verdu

Research output: Contribution to journal › Article › peer-review

68 Scopus citations

Abstract

Following the discovery of a fundamental connection between information measures and estimation measures in Gaussian channels, this paper explores the counterpart of those results in Poisson channels. In the continuous-time setting, the received signal is a doubly stochastic Poisson point process whose rate is equal to the input signal plus a dark current. It is found that, regardless of the statistics of the input, the derivative of the input-output mutual information with respect to the intensity of the additive dark current can be expressed as the expected difference between the logarithm of the input and the logarithm of its noncausal conditional mean estimate. The same holds for the derivative with respect to input scaling, but with the logarithmic function replaced by x log x. Similar relationships hold for discrete-time versions of the channel where the outputs are Poisson random variables conditioned on the input symbols.
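The discrete-time relationship described in the abstract can be checked numerically. For a channel Y ~ Poisson(X + λ), the derivative of the mutual information I(X;Y) with respect to the dark current λ should equal E[log(X + λ)] − E[log E[X + λ | Y]]. The sketch below compares that estimation-theoretic expression against a finite-difference derivative of I(X;Y); the binary input distribution, dark-current value, and truncation limit are illustrative choices, not taken from the paper.

```python
import math

# Hedged numerical sketch: discrete-time Poisson channel Y ~ Poisson(X + lam),
# with a binary input X (values/prior chosen arbitrarily for illustration).
# Checks the identity stated in the abstract:
#   dI/d(lam) = E[log(X+lam)] - E[log E[X+lam | Y]]
# against a central finite difference of the mutual information.

def poisson_pmf(k, mu):
    return math.exp(-mu) * mu**k / math.factorial(k)

X_VALS = [0.5, 2.0]   # illustrative input alphabet
P_X = [0.4, 0.6]      # illustrative prior
Y_MAX = 80            # truncation; the Poisson tail beyond this is negligible here

def mutual_info(lam):
    """I(X;Y) in nats for Y ~ Poisson(X + lam)."""
    I = 0.0
    for y in range(Y_MAX):
        p_y = sum(px * poisson_pmf(y, x + lam) for x, px in zip(X_VALS, P_X))
        for x, px in zip(X_VALS, P_X):
            p_yx = poisson_pmf(y, x + lam)
            if p_yx > 0.0 and p_y > 0.0:
                I += px * p_yx * math.log(p_yx / p_y)
    return I

def estimation_side(lam):
    """E[log(X+lam)] - E[log E[X+lam | Y]]."""
    term1 = sum(px * math.log(x + lam) for x, px in zip(X_VALS, P_X))
    term2 = 0.0
    for y in range(Y_MAX):
        joint = [px * poisson_pmf(y, x + lam) for x, px in zip(X_VALS, P_X)]
        p_y = sum(joint)
        if p_y == 0.0:
            continue
        # Noncausal conditional mean estimate of X + lam given Y = y
        cond_mean = sum(j * (x + lam) for j, x in zip(joint, X_VALS)) / p_y
        term2 += p_y * math.log(cond_mean)
    return term1 - term2

lam, h = 1.0, 1e-4
fd = (mutual_info(lam + h) - mutual_info(lam - h)) / (2 * h)
print(fd, estimation_side(lam))  # the two values should agree closely
```

Note that the derivative is negative, as expected: by Jensen's inequality the conditional-mean term dominates, reflecting that additional dark current can only degrade the mutual information.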

Original language: English (US)
Pages (from-to): 1837-1849
Number of pages: 13
Journal: IEEE Transactions on Information Theory
Volume: 54
Issue number: 5
DOIs
State: Published - May 1 2008

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

Keywords

  • Mutual information
  • Nonlinear filtering
  • Optimal estimation
  • Point process
  • Poisson process
  • Smoothing

