Abstract
Upper and lower bounds on the minimum mean square error for additive noise channels are derived when the input distribution is constrained to be close to a Gaussian reference distribution in terms of the Kullback-Leibler divergence. The upper bound is tight and is attained by a Gaussian distribution whose mean is identical to that of the reference distribution and whose covariance matrix is defined implicitly via a system of non-linear equations. The estimator that attains the upper bound is identified as a minimax optimal estimator that is robust against deviations from the assumed prior. The lower bound provides an alternative to well-known inequalities in estimation and information theory (such as the Cramér-Rao lower bound, Stam's inequality, or the entropy power inequality) that is potentially tighter and defined for a larger class of input distributions. Several examples of applications in signal processing and information theory illustrate the usefulness of the proposed bounds in practice.
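To make the setting concrete, the following is a minimal sketch of the constrained MMSE problem described in the abstract; the notation (P_0 for the Gaussian reference, ε for the KL radius, L and U for the bounds) is assumed here for readability and may differ from the paper's own symbols.

```latex
% Illustrative sketch only: the symbols P_0, \epsilon, L(\epsilon), U(\epsilon)
% are assumptions made here, not necessarily the paper's notation.
% Setting: additive noise channel Y = X + N, with the noise N independent of X,
% and the input law P_X constrained to a KL ball around a Gaussian reference P_0.
\[
  L(\epsilon)
  \;\le\;
  \operatorname{mmse}(X \mid Y)
  \;=\;
  \mathbb{E}\!\left[ \bigl\lVert X - \mathbb{E}[X \mid Y] \bigr\rVert^{2} \right]
  \;\le\;
  U(\epsilon)
  \qquad
  \text{for all } P_X \text{ such that } D_{\mathrm{KL}}(P_X \,\Vert\, P_0) \le \epsilon .
\]
% Per the abstract, U(\epsilon) is tight and attained by a Gaussian input with the
% reference mean and a covariance defined implicitly by a system of non-linear equations.
```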
| Original language | English (US) |
|---|---|
| Article number | 8890879 |
| Pages (from-to) | 6352-6367 |
| Number of pages | 16 |
| Journal | IEEE Transactions on Signal Processing |
| Volume | 67 |
| Issue number | 24 |
| DOIs | |
| State | Published - Dec 15 2019 |
| Externally published | Yes |
All Science Journal Classification (ASJC) codes
- Signal Processing
- Electrical and Electronic Engineering
Keywords
- Cramér-Rao bound
- MMSE bounds
- Stam's inequality
- entropy power inequality
- minimax optimization
- robust estimation