### Abstract

A random variable with distribution P is observed in Gaussian noise and is estimated by a minimum mean-square estimator that assumes the distribution is Q. This paper shows that the integral over all signal-to-noise ratios of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy D(P||Q). This representation of relative entropy can be generalized to non-real-valued random variables, and can be particularized to give a new general representation of mutual information in terms of conditional means. Inspired by the new representation, we also propose a definition of free relative entropy, which fills a gap in, and is consistent with, the literature on free probability.
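The identity in the abstract can be checked in closed form for a simple Gaussian case. The sketch below is a hypothetical illustration, not from the paper: it takes the true prior P = N(0, 1) and the mismatched prior Q = N(m, 1), uses the standard conditional-mean formulas for estimation in Gaussian noise, and numerically integrates the excess mean-square error over all SNRs, comparing the result with 2 D(P||Q) = m².

```python
import numpy as np
from scipy.integrate import quad

# Illustrative Gaussian example (assumption, not taken from the paper):
# true prior P = N(0, 1), mismatched prior Q = N(m, 1), unit-variance noise.
m = 1.5

def mmse_matched(snr):
    # MMSE of the correct conditional-mean estimator for X ~ N(0, 1)
    # observed as Y = sqrt(snr) * X + N, N ~ N(0, 1).
    return 1.0 / (1.0 + snr)

def mse_mismatched(snr):
    # MSE of the conditional-mean estimator derived under Q = N(m, 1),
    # evaluated when X actually follows P = N(0, 1); a short calculation
    # gives (1 + m^2 + snr) / (1 + snr)^2, i.e. an excess of m^2/(1+snr)^2.
    return (1.0 + m**2 + snr) / (1.0 + snr) ** 2

# Integrate the excess MSE over all signal-to-noise ratios.
excess_integral, _ = quad(lambda g: mse_mismatched(g) - mmse_matched(g),
                          0, np.inf)

# Relative entropy between the two priors: D(N(0,1) || N(m,1)) = m^2 / 2.
relative_entropy = m**2 / 2

print(excess_integral, 2 * relative_entropy)  # both equal m^2 = 2.25
```

The integrand here reduces to m²/(1 + γ)², whose integral over γ ∈ [0, ∞) is m², matching twice the relative entropy, in line with the paper's general result.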

| Original language | English (US) |
|---|---|
| Title of host publication | 2009 IEEE International Symposium on Information Theory, ISIT 2009 |
| Pages | 809-813 |
| Number of pages | 5 |
| DOIs | https://doi.org/10.1109/ISIT.2009.5205651 |
| State | Published - Nov 19 2009 |
| Event | 2009 IEEE International Symposium on Information Theory, ISIT 2009 - Seoul, Korea, Republic of; Duration: Jun 28 2009 → Jul 3 2009 |

### Publication series

| Name | IEEE International Symposium on Information Theory - Proceedings |
|---|---|
| ISSN (Print) | 2157-8102 |

### Other

| Other | 2009 IEEE International Symposium on Information Theory, ISIT 2009 |
|---|---|
| Country | Korea, Republic of |
| City | Seoul |
| Period | 6/28/09 → 7/3/09 |

### All Science Journal Classification (ASJC) codes

- Theoretical Computer Science
- Information Systems
- Modeling and Simulation
- Applied Mathematics


## Cite this

*2009 IEEE International Symposium on Information Theory, ISIT 2009* (pp. 809-813). [5205651] (IEEE International Symposium on Information Theory - Proceedings). https://doi.org/10.1109/ISIT.2009.5205651