Abstract
Finding the input distribution that maximizes mutual information leads not only to the capacity of the channel but also to engineering insights that tell the designer what good codes should look like. This is due to the folk theorem: the empirical distribution of any good code (i.e., one approaching capacity with vanishing probability of error) maximizes mutual information. This paper formalizes and proves this statement.
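For orientation, the statement can be summarized as follows; the notation below is a standard, assumed formulation and is not taken from the paper itself, so it should be read as a minimal sketch rather than the paper's own definitions.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\DeclareMathOperator*{\argmax}{arg\,max}
\begin{document}
% Channel capacity as a mutual-information maximization
% (assumed standard notation, not quoted from the paper):
\[
  C = \max_{P_X} I(X;Y),
  \qquad
  P_X^{*} = \argmax_{P_X} I(X;Y).
\]
% Folk theorem (informal statement): for any sequence of codes whose rate
% approaches C with vanishing probability of error, the empirical
% distribution of the transmitted codewords converges to the
% capacity-achieving input distribution P_X^*:
\[
  \hat{P}_{x^n}(a) \;=\; \frac{1}{n}\sum_{i=1}^{n}\mathbf{1}\{x_i = a\}
  \;\longrightarrow\; P_X^{*}(a)
  \quad\text{for every input letter } a.
\]
\end{document}
```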
| Original language | English (US) |
|---|---|
| Pages | 13 |
| Number of pages | 1 |
| State | Published - 1995 |
| Event | Proceedings of the 1995 IEEE International Symposium on Information Theory - Whistler, BC, Canada. Duration: Sep 17, 1995 → Sep 22, 1995 |
Other
| Other | Proceedings of the 1995 IEEE International Symposium on Information Theory |
|---|---|
| City | Whistler, BC, Canada |
| Period | 9/17/95 → 9/22/95 |
All Science Journal Classification (ASJC) codes
- Theoretical Computer Science
- Information Systems
- Modeling and Simulation
- Applied Mathematics