## Abstract

Let the kth-order empirical distribution of a code be defined as the proportion of length-k strings, taken anywhere in the codebook, equal to any given k-string. We show that for any fixed k, the kth-order empirical distribution of any good code (i.e., a code approaching capacity with vanishing probability of error) converges, in the sense of divergence, to the set of input distributions that maximize the input/output mutual information of k channel uses. This statement is proved for discrete memoryless channels as well as for a large class of channels with memory. If k grows logarithmically (or faster) with blocklength, the result no longer holds for certain good codes, whereas for other good codes the result can be shown for k growing as fast as a certain fraction of blocklength.
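The quantities in the abstract can be illustrated with a toy sketch (an assumed example, not taken from the paper): compute the kth-order empirical distribution of a small binary codebook and its divergence from a reference product distribution, here the uniform input distribution that is capacity-achieving for a binary symmetric channel.

```python
from collections import Counter
from math import log

def empirical_kth_order(codebook, k):
    """Proportion of length-k windows, over all positions in all codewords,
    equal to each k-string (the kth-order empirical distribution)."""
    counts = Counter()
    total = 0
    for word in codebook:
        for i in range(len(word) - k + 1):
            counts[word[i:i + k]] += 1
            total += 1
    return {s: c / total for s, c in counts.items()}

def kl_divergence(p, q):
    """Divergence D(p || q) in nats; assumes support(p) is contained in support(q)."""
    return sum(px * log(px / q[s]) for s, px in p.items() if px > 0)

# Hypothetical toy codebook (illustration only).
codebook = ["0101", "1010", "0011", "1100"]
p = empirical_kth_order(codebook, 1)
q = {"0": 0.5, "1": 0.5}  # uniform input distribution, capacity-achieving for a BSC
print(kl_divergence(p, q))  # 0.0: this balanced codebook matches q exactly
```

The paper's result concerns the asymptotic behavior of this divergence for capacity-approaching code sequences as blocklength grows, with k either fixed or growing with blocklength.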

| Original language | English (US) |
|---|---|
| Pages (from-to) | 836-846 |
| Number of pages | 11 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 43 |
| Issue number | 3 |
| DOIs | |
| State | Published - 1997 |

## All Science Journal Classification (ASJC) codes

- Information Systems
- Computer Science Applications
- Library and Information Sciences

## Keywords

- Approximation of output statistics
- Channel capacity
- Discrete memoryless channels
- Divergence
- Error-correcting codes
- Gaussian channels
- Shannon theory