Relative entropy at the channel output of a capacity-achieving code

Yury Polyanskiy, Sergio Verdú

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

In this paper we establish a new inequality tying together the coding rate, the probability of error, and the relative entropy between the output distribution induced by the code and an auxiliary output distribution. This inequality is then used to show the strong converse, and to prove that the output distribution of a code must be close, in relative entropy, to the capacity-achieving output distribution (for DMCs and the AWGN channel). One of the key tools in our analysis is concentration of measure (isoperimetry).
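
To make the abstract concrete, the flavor of the result can be sketched as follows. This is a paraphrase, not the paper's exact theorem statement; the symbols are assumptions chosen for illustration: M codewords, blocklength n, channel capacity C, code-induced output distribution P_{Y^n}, and capacity-achieving output distribution P^*_{Y^n}. Roughly,

\[
D\bigl(P_{Y^n} \,\big\|\, P^*_{Y^n}\bigr) \;\le\; nC - \log M + o(n),
\]

where the o(n) term hides constants depending on the error probability. Consequently, for any capacity-achieving code sequence, i.e. one with \log M = nC - o(n), the normalized relative entropy \tfrac{1}{n} D(P_{Y^n} \| P^*_{Y^n}) vanishes as n grows, which is the sense in which the code's output distribution must be close to the capacity-achieving one.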

Original language: English (US)
Title of host publication: 2011 49th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2011
Pages: 52-59
Number of pages: 8
DOIs
State: Published - 2011
Event: 2011 49th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2011 - Monticello, IL, United States
Duration: Sep 28, 2011 - Sep 30, 2011

Publication series

Name: 2011 49th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2011

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Control and Systems Engineering

Keywords

  • Shannon theory
  • additive white Gaussian noise
  • concentration of measure
  • discrete memoryless channels
  • empirical output statistics
  • general channels
  • information measures
  • strong converse
