Second-order asymptotics of mutual information

Viacheslav V. Prelov, Sergio Verdú

Research output: Contribution to journal › Article › peer-review

88 Scopus citations


A formula for the second-order expansion of the input-output mutual information of multidimensional channels as the signal-to-noise ratio (SNR) goes to zero is obtained. While the additive noise is assumed to be Gaussian, we deal with very general classes of input and channel distributions. As special cases, these channel models include fading channels, channels with random parameters, and channels with almost Gaussian noise. When the channel is unknown at the receiver, the second term in the asymptotic expansion depends not only on the covariance matrix of the input signal but also on the fourth mixed moments of its components. The study of the second-order asymptotics of mutual information finds application in the analysis of the bandwidth-power tradeoff achieved by various signaling strategies in the wideband regime.
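As a hedged illustration of the kind of expansion the abstract describes (not the paper's general formula, which covers non-Gaussian inputs and unknown channels): for the scalar Gaussian channel Y = √snr·X + N with unit-variance Gaussian input and noise, the exact mutual information is ½ log(1 + snr) nats, whose second-order Taylor expansion at snr → 0 is snr/2 − snr²/4. The sketch below checks that the expansion matches the exact value to O(snr³):

```python
import math

def mi_gaussian(snr):
    # Exact mutual information (in nats) of the scalar Gaussian channel
    # Y = sqrt(snr)*X + N with X, N ~ N(0, 1).
    return 0.5 * math.log1p(snr)

def mi_second_order(snr):
    # Second-order expansion around snr = 0:
    # (1/2)*log(1 + snr) = snr/2 - snr^2/4 + O(snr^3).
    return snr / 2 - snr ** 2 / 4

snr = 1e-2
exact = mi_gaussian(snr)
approx = mi_second_order(snr)
# The discrepancy is approximately snr^3 / 6, i.e. negligible at low SNR.
print(exact, approx, abs(exact - approx))
```

For this special case the second-order coefficient is −1/4; the paper's contribution is characterizing how that coefficient changes with the input's covariance and fourth mixed moments under much more general channel models.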

Original language: English (US)
Pages (from-to): 1567-1580
Number of pages: 14
Journal: IEEE Transactions on Information Theory
Issue number: 8
State: Published - Aug 2004

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

