Mutual information and conditional mean estimation in Poisson channels

Dongning Guo, Sergio Verdú, Shlomo Shamai

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

11 Scopus citations

Abstract

Following the recent discovery of new connections between information and estimation in Gaussian channels, this paper reports parallel results in the Poisson regime. Both scalar and continuous-time Poisson channels are considered. It is found that, regardless of the statistics of the input, the derivative of the input-output mutual information with respect to the dark current can be expressed as the expected difference between the logarithm of the input and the logarithm of its conditional mean estimate (noncausal in the continuous-time case). The same is true for the derivative with respect to input scaling, but with the logarithmic function replaced by x log x.
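The scalar relation described in the abstract can be illustrated numerically. The sketch below uses our own notation and a model we assume from the abstract's mention of dark current (not taken from the paper itself): a channel Y ~ Poisson(X + lam) with dark current lam and a two-point input X. It computes the conditional mean estimate E[X + lam | Y] by Bayes' rule, then the gap E[log(X + lam)] - E[log E[X + lam | Y]], which is nonpositive by Jensen's inequality — consistent with mutual information decreasing as dark current grows.

```python
import math

# Minimal numerical sketch (our notation, not the paper's): scalar Poisson
# channel Y ~ Poisson(X + lam), where lam is the dark current and the input X
# takes the values 1.0 or 3.0 with equal probability.

def poisson_pmf(k, mean):
    """P(Y = k) for Y ~ Poisson(mean)."""
    return math.exp(-mean) * mean ** k / math.factorial(k)

def conditional_mean(y, lam=0.5, inputs=(1.0, 3.0)):
    """Conditional mean estimate E[X + lam | Y = y], via Bayes' rule
    over the equiprobable two-point input."""
    weights = [poisson_pmf(y, x + lam) for x in inputs]
    return sum(w * (x + lam) for w, x in zip(weights, inputs)) / sum(weights)

def jensen_gap(lam=0.5, inputs=(1.0, 3.0), y_max=60):
    """E[log(X + lam)] - E[log E[X + lam | Y]].

    Nonpositive by Jensen's inequality (log is concave); per the abstract,
    an expected difference of this form gives the derivative of the mutual
    information with respect to the dark current (sign conventions may
    differ from the paper). The sum over Y is truncated at y_max, where
    the Poisson tail is negligible for these parameters.
    """
    e_log_input = sum(0.5 * math.log(x + lam) for x in inputs)
    e_log_estimate = sum(
        sum(0.5 * poisson_pmf(y, x + lam) for x in inputs)  # P(Y = y)
        * math.log(conditional_mean(y, lam, inputs))
        for y in range(y_max + 1)
    )
    return e_log_input - e_log_estimate
```

The estimator here is the full posterior mean, i.e. the noncausal estimate in the continuous-time analogy; the gap is strictly negative because the input is not determined by the output.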

Original language: English (US)
Title of host publication: 2004 IEEE Information Theory Workshop - Proceedings, ITW
Pages: 265-270
Number of pages: 6
State: Published - Dec 1 2004
Event: 2004 IEEE Information Theory Workshop - Proceedings, ITW - San Antonio, TX, United States
Duration: Oct 24 2004 - Oct 29 2004

Publication series

Name: 2004 IEEE Information Theory Workshop - Proceedings, ITW

Other

Other: 2004 IEEE Information Theory Workshop - Proceedings, ITW
Country: United States
City: San Antonio, TX
Period: 10/24/04 - 10/29/04

All Science Journal Classification (ASJC) codes

  • Engineering (all)


  • Cite this

    Guo, D., Verdú, S., & Shamai, S. (2004). Mutual information and conditional mean estimation in Poisson channels. In 2004 IEEE Information Theory Workshop - Proceedings, ITW (pp. 265-270). (2004 IEEE Information Theory Workshop - Proceedings, ITW).