Abstract
We show that the minimum mean-square error (MMSE) is a concave functional of the input-output joint distribution and explore several of its regularity properties. In the case of additive Gaussian noise, the MMSE is shown to be weakly continuous in the input distribution and Lipschitz continuous with respect to the quadratic Wasserstein distance for peak-limited inputs. Regularity properties of mutual information are also obtained. Several applications to information theory and the central limit theorem are discussed.
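As a minimal numerical illustration of the concavity claim (the setup and function names below are my own assumptions, not code from the paper): for a finite joint pmf of (X, Y), the MMSE E[(X − E[X|Y])²] can be computed directly, and a mixture of two joint distributions should have MMSE at least the corresponding convex combination of the individual MMSEs, since the MMSE is a pointwise minimum over estimators f of E[(X − f(Y))²], which is linear in the joint distribution.

```python
import numpy as np

# Illustrative sketch (assumed discrete setup, not from the paper):
# MMSE of X given Y for a finite joint pmf P[i, j] = Pr(X = x[i], Y = y_j).
def mmse(P, x):
    """Return E[(X - E[X|Y])^2] under the joint pmf P."""
    p_y = P.sum(axis=0)                                   # marginal of Y
    cond_mean = np.divide(x @ P, p_y,                     # E[X | Y = y_j]
                          out=np.zeros_like(p_y), where=p_y > 0)
    return float((((x[:, None] - cond_mean[None, :]) ** 2) * P).sum())

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 4)                             # support of X

def random_joint_pmf():
    P = rng.random((4, 5))
    return P / P.sum()

P, Q, lam = random_joint_pmf(), random_joint_pmf(), 0.3
mix = lam * P + (1 - lam) * Q
# Concavity of the MMSE in the joint distribution:
# mmse(mix) >= lam * mmse(P) + (1 - lam) * mmse(Q)
print(mmse(mix, x) >= lam * mmse(P, x) + (1 - lam) * mmse(Q, x))
```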
| Original language | English (US) |
| --- | --- |
| Article number | 6084749 |
| Pages (from-to) | 1289-1301 |
| Number of pages | 13 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 58 |
| Issue number | 3 |
| DOIs | |
| State | Published - Mar 2012 |
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences
Keywords
- Bayesian statistics
- Gaussian noise
- central limit theorem
- minimum mean-square error (MMSE)
- mutual information
- non-Gaussian noise