### Abstract

In this presentation I will give an overview of the state of the art in universal estimation of entropy, divergence, and mutual information, with emphasis on recent algorithms we have proposed with H. Cai, S. Kulkarni, and Q. Wang. These algorithms converge to the desired quantities without any knowledge of the statistical properties of the observed data, under conditions such as stationarity and ergodicity in the case of discrete processes, and memorylessness in the case of analog data. A sampling of the literature on this topic is given below.
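To make the setting concrete, here is a minimal sketch of the simplest universal estimator of this kind, the plug-in (empirical-distribution) entropy estimator for finite-alphabet data. This is an illustration of the problem, not the authors' algorithm; the function name and interface are assumptions.

```python
from collections import Counter
from math import log2

def plugin_entropy(samples):
    """Plug-in estimate of Shannon entropy in bits per symbol.

    Uses only the empirical frequencies of the observed symbols,
    with no prior knowledge of the source distribution; it is
    consistent for i.i.d. finite-alphabet data.
    """
    n = len(samples)
    counts = Counter(samples)
    # H_hat = -sum_x p_hat(x) * log2(p_hat(x))
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A balanced binary sequence: empirical entropy is 1 bit per symbol.
print(plugin_entropy([0, 1, 0, 1, 0, 1, 0, 1]))  # → 1.0
```

More sophisticated universal estimators (e.g. those based on data compression or nearest-neighbor statistics) improve on this baseline for sources with memory and for analog data, which is the subject of the talk.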

Original language | English (US)
---|---
Title of host publication | Proceedings of the IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, ITW2005
Number of pages | 1
DOIs | https://doi.org/10.1109/ITW.2005.1531895
State | Published - Dec 1 2005
Event | IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, ITW2005 - Rotorua, New Zealand; Duration: Aug 29 2005 → Sep 1 2005

### Publication series

Name | Proceedings of the IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, ITW2005
---|---

### Other

Other | IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, ITW2005
---|---
Country | New Zealand
City | Rotorua
Period | 8/29/05 → 9/1/05

### All Science Journal Classification (ASJC) codes

- Engineering (all)


## Cite this

Verdu, S. (2005). Universal estimation of information measures. In *Proceedings of the IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, ITW2005* [1531895] (Proceedings of the IEEE ITSOC Information Theory Workshop 2005 on Coding and Complexity, ITW2005). https://doi.org/10.1109/ITW.2005.1531895