Capacity of finite state Markov channels with general inputs

Tim Holliday, Andrea Goldsmith, Peter Glynn

Research output: Contribution to journal › Conference article › peer-review

11 Scopus citations

Abstract

We study new formulae based on Lyapunov exponents for entropy, mutual information, and capacity of finite state discrete time Markov channels. We also develop a method for directly computing mutual information and entropy using continuous state space Markov chains. We show that the entropy rate for a symbol sequence is equal to the primary Lyapunov exponent for a product of random matrices. We then develop a continuous state space Markov chain formulation that allows us to directly compute entropy rates as expectations with respect to the Markov chain's stationary distribution. We also show that the stationary distribution is a continuous function of the input symbol dynamics. This continuity allows the channel capacity to be written in terms of Lyapunov exponents.
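The abstract's central identity — that the entropy rate of the output symbol sequence equals the top Lyapunov exponent of a product of random matrices — can be illustrated numerically. The standard estimator iterates the random matrix product on a test vector with renormalization to avoid overflow, accumulating the log growth rate. The sketch below is a minimal illustration of that estimator, not the paper's method: the sampler `sample_matrix` and the diagonal demo matrices are hypothetical stand-ins (for a channel one would instead use the observation-conditioned transition matrices A(y)).

```python
import numpy as np

def top_lyapunov(sample_matrix, n_steps=100_000, dim=2, seed=0):
    """Estimate the top Lyapunov exponent
        lambda = lim (1/n) log ||A_n ... A_2 A_1 v||
    for a product of i.i.d. random matrices, by iterated
    multiplication with renormalization at every step."""
    rng = np.random.default_rng(seed)
    v = np.ones(dim) / np.sqrt(dim)   # arbitrary nonzero start vector
    log_sum = 0.0
    for _ in range(n_steps):
        v = sample_matrix(rng) @ v    # apply one random matrix
        norm = np.linalg.norm(v)
        log_sum += np.log(norm)       # accumulate the log growth
        v /= norm                     # renormalize to prevent overflow
    return log_sum / n_steps

# Toy check: matrices a*I with a drawn uniformly from {2, 1/2}
# have top Lyapunov exponent E[log a] = (log 2 + log 1/2)/2 = 0.
lam_est = top_lyapunov(lambda rng: rng.choice([2.0, 0.5]) * np.eye(2))
```

For the channel setting in the paper, the negative of this exponent (for the matrices generated by the observed symbols) gives the entropy rate of the symbol sequence.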

Original language: English (US)
Pages (from-to): 289
Number of pages: 1
Journal: IEEE International Symposium on Information Theory - Proceedings
DOIs
State: Published - 2003
Externally published: Yes
Event: Proceedings 2003 IEEE International Symposium on Information Theory (ISIT) - Yokohama, Japan
Duration: Jun 29 2003 – Jul 4 2003

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics

