TY - GEN
T1 - Relative entropy at the channel output of a capacity-achieving code
AU - Polyanskiy, Yury
AU - Verdú, Sergio
PY - 2011
Y1 - 2011
AB - In this paper we establish a new inequality tying together the coding rate, the probability of error, and the relative entropy between the channel output distribution and the auxiliary output distribution. This inequality is then used to prove the strong converse, and to show that the output distribution of a code must be close, in relative entropy, to the capacity-achieving output distribution (for the DMC and the AWGN channel). One of the key tools in our analysis is concentration of measure (isoperimetry).
KW - Shannon theory
KW - additive white Gaussian noise
KW - concentration of measure
KW - discrete memoryless channels
KW - empirical output statistics
KW - general channels
KW - information measures
KW - strong converse
UR - http://www.scopus.com/inward/record.url?scp=84856110936&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84856110936&partnerID=8YFLogxK
DO - 10.1109/Allerton.2011.6120149
M3 - Conference contribution
AN - SCOPUS:84856110936
SN - 9781457718168
T3 - 2011 49th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2011
SP - 52
EP - 59
BT - 2011 49th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2011
T2 - 2011 49th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2011
Y2 - 28 September 2011 through 30 September 2011
ER -