Universal estimation of information measures

Sergio Verdu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Scopus citations

Abstract

In this presentation I will give an overview of the state of the art in universal estimation of entropy, divergence, and mutual information, with emphasis on recent algorithms we have proposed with H. Cai, S. Kulkarni, and Q. Wang. These algorithms converge to the desired quantities without any knowledge of the statistical properties of the observed data, under conditions such as stationary ergodicity in the case of discrete processes and memorylessness in the case of analog data. A sampling of the literature on this topic is given below.
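As a point of reference for the estimation problem the abstract describes (and not the algorithms proposed in the talk), the simplest consistent approach for i.i.d. finite-alphabet data is the plug-in estimator: compute the empirical distribution of the observed symbols and evaluate its Shannon entropy. A minimal sketch, with the function name `plugin_entropy` chosen here for illustration:

```python
from collections import Counter
from math import log2

def plugin_entropy(samples):
    """Plug-in (empirical) entropy estimate, in bits.

    Evaluates the Shannon entropy of the empirical distribution of the
    samples. Consistent for i.i.d. finite-alphabet sources; the universal
    estimators surveyed in the talk target broader classes, such as
    stationary ergodic discrete processes.
    """
    n = len(samples)
    counts = Counter(samples)
    # H = -sum p(x) log2 p(x) over observed symbols
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A fair coin has 1 bit of entropy; a perfectly balanced sample recovers it.
print(plugin_entropy("HTHT" * 250))  # 1.0
```

For dependent data this estimator is biased; that gap is what motivates universal schemes (e.g., those based on universal compression) that converge without distributional assumptions.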

Original language: English (US)
Title of host publication: Proceedings of the IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, ITW2005
Number of pages: 1
DOIs
State: Published - Dec 1 2005
Event: IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, ITW2005 - Rotorua, New Zealand
Duration: Aug 29 2005 - Sep 1 2005

Publication series

Name: Proceedings of the IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, ITW2005

Other

Other: IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, ITW2005
Country: New Zealand
City: Rotorua
Period: 8/29/05 - 9/1/05

All Science Journal Classification (ASJC) codes

  • Engineering (all)


  • Cite this

    Verdu, S. (2005). Universal estimation of information measures. In Proceedings of the IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, ITW2005 [1531895] (Proceedings of the IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, ITW2005). https://doi.org/10.1109/ITW.2005.1531895