TY - GEN

T1 - Universal estimation of information measures

AU - Verdu, Sergio

PY - 2005/12/1

Y1 - 2005/12/1

N2 - In this presentation I will give an overview of the state of the art in universal estimation of entropy, divergence, and mutual information, with emphasis on recent algorithms we have proposed with H. Cai, S. Kulkarni, and Q. Wang. These algorithms converge to the desired quantities without any knowledge of the statistical properties of the observed data, under conditions such as stationary ergodicity in the case of discrete processes and memorylessness in the case of analog data. A sampling of the literature on this topic is given below.

AB - In this presentation I will give an overview of the state of the art in universal estimation of entropy, divergence, and mutual information, with emphasis on recent algorithms we have proposed with H. Cai, S. Kulkarni, and Q. Wang. These algorithms converge to the desired quantities without any knowledge of the statistical properties of the observed data, under conditions such as stationary ergodicity in the case of discrete processes and memorylessness in the case of analog data. A sampling of the literature on this topic is given below.

UR - http://www.scopus.com/inward/record.url?scp=33749075209&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=33749075209&partnerID=8YFLogxK

U2 - 10.1109/ITW.2005.1531895

DO - 10.1109/ITW.2005.1531895

M3 - Conference contribution

AN - SCOPUS:33749075209

SN - 078039481X

SN - 9780780394810

T3 - Proceedings of the IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, ITW2005

BT - Proceedings of the IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, ITW2005

T2 - IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, ITW2005

Y2 - 29 August 2005 through 1 September 2005

ER -