TY - JOUR

T1 - Estimation in Gaussian noise: Properties of the minimum mean-square error

AU - Guo, Dongning

AU - Wu, Yihong

AU - Shamai, Shlomo

AU - Verdú, Sergio

N1 - Funding Information:
Manuscript received June 19, 2008; revised April 19, 2010; accepted August 21, 2010. Date of current version March 16, 2011. This work was supported in part by the National Science Foundation (NSF) under Grants CCF-0644344 and CCF-0635154 and in part by the Binational U.S.-Israel Scientific Foundation. This work was presented in part at the IEEE International Symposium on Information Theory, Toronto, ON, Canada, July 2008.

PY - 2011/4

Y1 - 2011/4

N2 - Consider the minimum mean-square error (MMSE) of estimating an arbitrary random variable from its observation contaminated by Gaussian noise. The MMSE can be regarded as a function of the signal-to-noise ratio (SNR) as well as a functional of the input distribution (of the random variable to be estimated). It is shown that the MMSE is concave in the input distribution at any given SNR. For a given input distribution, the MMSE is found to be infinitely differentiable at all positive SNR, and in fact a real analytic function in SNR under mild conditions. The key to these regularity results is that the posterior distribution conditioned on the observation through Gaussian channels always decays at least as quickly as some Gaussian density. Furthermore, simple expressions for the first three derivatives of the MMSE with respect to the SNR are obtained. It is also shown that, as functions of the SNR, the curves for the MMSE of a Gaussian input and that of a non-Gaussian input cross at most once over all SNRs. These properties lead to simple proofs of the facts that Gaussian inputs achieve both the secrecy capacity of scalar Gaussian wiretap channels and the capacity of scalar Gaussian broadcast channels, as well as a simple proof of the entropy power inequality in the special case where one of the variables is Gaussian.

KW - Entropy

KW - Gaussian broadcast channel

KW - Gaussian noise

KW - Gaussian wiretap channel

KW - estimation

KW - minimum mean square error (MMSE)

KW - mutual information

UR - http://www.scopus.com/inward/record.url?scp=79952832483&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=79952832483&partnerID=8YFLogxK

DO - 10.1109/TIT.2011.2111010

M3 - Article

AN - SCOPUS:79952832483

VL - 57

SP - 2371

EP - 2385

JO - IEEE Transactions on Information Theory

JF - IEEE Transactions on Information Theory

SN - 0018-9448

IS - 4

M1 - 5730572

ER -