TY - GEN
T1 - Worst additive noise
T2 - 2014 IEEE 28th Convention of Electrical and Electronics Engineers in Israel, IEEEI 2014
AU - Bustin, Ronit
AU - Poor, H. Vincent
AU - Shamai, Shlomo
N1 - Publisher Copyright:
© 2015 IEEE. All rights reserved.
PY - 2014
Y1 - 2014
N2 - The "worst additive noise" problem is considered. The problem refers to an additive channel in which the input is known to some extent. It is further assumed that the noise consists of an additive Gaussian component and an additive component of arbitrary distribution. The question is: which distribution of the additive noise minimizes the mutual information between the input and the output? Two settings of this problem are considered. In the first setting, a Gaussian input with a given covariance matrix is considered, and it is shown that the problem can be handled within the framework of the Guo, Shamai and Verdú I-MMSE relationship. This framework gives a simple derivation of Diggavi and Cover's result that, under a covariance constraint, the "worst additive noise" distribution is Gaussian; that is, Gaussian noise minimizes the input-output mutual information when the input is Gaussian. The I-MMSE framework further shows that, for a Gaussian input, under any constraint on the noise distribution that does not preclude a Gaussian distribution, the "worst" distribution is a Gaussian distribution complying with the constraint. In the second setting, it is assumed that the input contains a codeword from an optimal (capacity-achieving) point-to-point codebook, and it is shown, for a subset of SNRs, that the minimum mutual information is obtained when the additive signal is Gaussian-like up to a given SNR.
AB - The "worst additive noise" problem is considered. The problem refers to an additive channel in which the input is known to some extent. It is further assumed that the noise consists of an additive Gaussian component and an additive component of arbitrary distribution. The question is: which distribution of the additive noise minimizes the mutual information between the input and the output? Two settings of this problem are considered. In the first setting, a Gaussian input with a given covariance matrix is considered, and it is shown that the problem can be handled within the framework of the Guo, Shamai and Verdú I-MMSE relationship. This framework gives a simple derivation of Diggavi and Cover's result that, under a covariance constraint, the "worst additive noise" distribution is Gaussian; that is, Gaussian noise minimizes the input-output mutual information when the input is Gaussian. The I-MMSE framework further shows that, for a Gaussian input, under any constraint on the noise distribution that does not preclude a Gaussian distribution, the "worst" distribution is a Gaussian distribution complying with the constraint. In the second setting, it is assumed that the input contains a codeword from an optimal (capacity-achieving) point-to-point codebook, and it is shown, for a subset of SNRs, that the minimum mutual information is obtained when the additive signal is Gaussian-like up to a given SNR.
UR - http://www.scopus.com/inward/record.url?scp=84941248141&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84941248141&partnerID=8YFLogxK
U2 - 10.1109/EEEI.2014.7005759
DO - 10.1109/EEEI.2014.7005759
M3 - Conference contribution
AN - SCOPUS:84941248141
T3 - 2014 IEEE 28th Convention of Electrical and Electronics Engineers in Israel, IEEEI 2014
BT - 2014 IEEE 28th Convention of Electrical and Electronics Engineers in Israel, IEEEI 2014
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 3 December 2014 through 5 December 2014
ER -