The "worst additive noise" problem is considered. The problem concerns an additive channel whose input is known to some extent, and whose noise consists of an additive Gaussian component plus an additive component of arbitrary distribution. The question is: which distribution of the additive noise minimizes the mutual information between the input and the output? Two settings of this problem are considered. In the first setting the input is Gaussian with a given covariance matrix, and it is shown that the problem can be handled within the framework of the Guo, Shamai and Verdú I-MMSE relationship. This framework yields a simple derivation of Diggavi and Cover's result that, under a covariance constraint, the "worst additive noise" distribution is Gaussian; that is, Gaussian noise minimizes the input-output mutual information when the input is Gaussian. The I-MMSE framework further shows that, for a Gaussian input and any constraint on the noise distribution that does not exclude a Gaussian distribution, the "worst" distribution is a Gaussian distribution complying with the constraint. In the second setting the input is assumed to contain a codeword from a capacity-achieving point-to-point codebook, and it is shown, for a subset of SNRs, that the minimum mutual information is attained when the additive signal is Gaussian-like up to a given SNR.
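As a side illustration (not part of the abstract itself), the I-MMSE relationship invoked above, dI/dsnr = (1/2)·mmse(snr), can be verified numerically in the simplest scalar Gaussian case, where both the mutual information and the MMSE have closed forms. This is only a minimal sanity-check sketch under that assumption; the function names are illustrative and do not come from the paper.

```python
# Minimal numerical sketch of the Guo-Shamai-Verdu I-MMSE relationship
# for the scalar Gaussian channel Y = sqrt(snr)*X + N,
# with X ~ N(0, 1) and N ~ N(0, 1). In this special case:
#   I(snr)    = 0.5 * ln(1 + snr)   (in nats)
#   mmse(snr) = 1 / (1 + snr)
# The I-MMSE identity states dI/dsnr = 0.5 * mmse(snr).

import math

def mutual_information(snr):
    # closed-form mutual information for Gaussian input, in nats
    return 0.5 * math.log(1.0 + snr)

def mmse(snr):
    # closed-form minimum mean-square error of estimating X from Y
    return 1.0 / (1.0 + snr)

def di_dsnr(snr, h=1e-6):
    # central finite-difference approximation of the derivative of I
    return (mutual_information(snr + h) - mutual_information(snr - h)) / (2 * h)

for snr in (0.5, 1.0, 4.0):
    lhs = di_dsnr(snr)
    rhs = 0.5 * mmse(snr)
    assert abs(lhs - rhs) < 1e-6
    print(f"snr={snr}: dI/dsnr ~ {lhs:.6f}, 0.5*mmse = {rhs:.6f}")
```

The check confirms the identity only for the Gaussian-input, Gaussian-noise case; the paper's contribution is in exploiting the same relationship for the non-Gaussian noise component.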