TY - JOUR
T1 - A simple proof of the entropy-power inequality
AU - Verdú, Sergio
AU - Guo, Dongning
N1 - Funding Information:
Manuscript received July 13, 2005. This work was supported in part by the National Science Foundation under Grants NCR-0074277 and CCR-0312879. S. Verdú is with the Department of Electrical Engineering, Princeton University, Princeton, NJ 08544 USA (e-mail: [email protected]). D. Guo is with the Department of Electrical Engineering and Computer Science, Northwestern University, Evanston, IL 60208 USA (e-mail: [email protected]). Communicated by Y. Steinberg, Associate Editor for Shannon Theory.
PY - 2006/5
Y1 - 2006/5
N2 - This correspondence gives a simple proof of Shannon's entropy-power inequality (EPI) using the relationship between mutual information and minimum mean-square error (MMSE) in Gaussian channels.
AB - This correspondence gives a simple proof of Shannon's entropy-power inequality (EPI) using the relationship between mutual information and minimum mean-square error (MMSE) in Gaussian channels.
KW - Differential entropy
KW - Entropy-power inequality (EPI)
KW - Minimum mean-square error (MMSE)
UR - http://www.scopus.com/inward/record.url?scp=33646020998&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=33646020998&partnerID=8YFLogxK
U2 - 10.1109/TIT.2006.872978
DO - 10.1109/TIT.2006.872978
M3 - Article
AN - SCOPUS:33646020998
SN - 0018-9448
VL - 52
SP - 2165
EP - 2166
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 5
ER -