Abstract
We study new formulae, based on Lyapunov exponents, for the entropy, mutual information, and capacity of finite-state, discrete-time Markov channels, and we develop a method for computing mutual information and entropy directly via continuous-state-space Markov chains. We show that the entropy rate of a symbol sequence equals the largest Lyapunov exponent of a product of random matrices. We then develop a continuous-state-space Markov chain formulation that allows us to compute entropy rates directly as expectations with respect to the chain's stationary distribution, and we show that this stationary distribution is a continuous function of the input symbol dynamics. This continuity allows the channel capacity to be expressed in terms of Lyapunov exponents.
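One standard way to make the abstract's central identity concrete: for a hidden Markov output process Y with underlying transition matrix P and observation probabilities b_j(y) = Pr(Y = y | X = j), define the random matrices M(y) with entries M(y)_{ij} = P_{ij} b_j(y). Then H(Y) = -λ, where λ = lim (1/n) log ‖M(Y_1) M(Y_2) ⋯ M(Y_n)‖ is the largest Lyapunov exponent of the random matrix product. The sketch below is a hypothetical illustration, not taken from the paper: it estimates this exponent by Monte Carlo for an assumed two-state Markov input observed through a binary symmetric channel (the matrices `P`, `B`, and all parameters are illustrative choices). The normalized forward vector `v` it maintains is the prediction filter, i.e. a realization of the continuous-state-space Markov chain whose stationary expectation the abstract refers to.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example (not from the paper): a 2-state Markov input
# observed through a binary symmetric channel with crossover eps.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])      # input transition matrix, P[i, j] = Pr(X' = j | X = i)
eps = 0.1
B = np.array([[1 - eps, eps],   # B[x, y] = Pr(Y = y | X = x)
              [eps, 1 - eps]])

# Stationary distribution of the input chain (left eigenvector of P for eigenvalue 1).
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

# Simulate the hidden chain X and its noisy observations Y.
n = 100_000
x = np.empty(n, dtype=int)
y = np.empty(n, dtype=int)
x[0] = rng.choice(2, p=pi)
y[0] = rng.choice(2, p=B[x[0]])
for t in range(1, n):
    x[t] = rng.choice(2, p=P[x[t - 1]])
    y[t] = rng.choice(2, p=B[x[t]])

# Random matrices M(y)_{ij} = P_{ij} * Pr(Y = y | X = j): the entropy rate
# of Y is minus the top Lyapunov exponent of their product along Y_1..Y_n.
M = {yy: P * B[:, yy][None, :] for yy in (0, 1)}

# Normalized forward recursion: v is the prediction filter (a Markov chain
# on the probability simplex); the accumulated log-normalizers converge to
# (1/n) log Pr(Y_1..Y_n), i.e. the Lyapunov exponent, by Shannon-McMillan-Breiman.
v = pi * B[:, y[0]]             # joint Pr(X_0 = i, Y_0 = y_0)
log_prob = np.log(v.sum())
v /= v.sum()
for t in range(1, n):
    v = v @ M[y[t]]
    s = v.sum()                 # Pr(Y_t = y_t | Y_0..Y_{t-1})
    log_prob += np.log(s)
    v /= s

print(f"entropy rate estimate: {-log_prob / n:.4f} nats")
```

Longer runs tighten the estimate, since the per-step log-normalizers average to the Lyapunov exponent almost surely; the seed, run length `n`, and channel parameters above are all assumptions made for the sake of the sketch.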
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 289 |
| Number of pages | 1 |
| Journal | IEEE International Symposium on Information Theory - Proceedings |
| DOIs | |
| State | Published - 2003 |
| Externally published | Yes |
| Event | Proceedings 2003 IEEE International Symposium on Information Theory (ISIT), Yokohama, Japan; Jun 29 – Jul 4, 2003 |
All Science Journal Classification (ASJC) codes
- Theoretical Computer Science
- Information Systems
- Modeling and Simulation
- Applied Mathematics