Capacity, mutual information, and coding for finite-state Markov channels

Andrea Goldsmith, Pravin Varaiya

Research output: Contribution to conference › Paper › peer-review

Abstract

The Finite-State Markov Channel (FSMC) is a discrete-time varying channel whose variation is determined by a finite-state Markov process. We obtain the FSMC capacity as a function of the channel state probability conditioned on all past inputs and outputs, and the channel state probability conditioned on all past outputs alone. We also show that when the channel inputs are i.i.d., both conditional probabilities converge in distribution. In this case, the maximum mutual information of the FSMC, Iiid, is determined from these limit distributions. A class of channels for which Iiid equals Shannon capacity is also defined. Next, we consider coding techniques for these channels. We propose a decision-feedback decoding algorithm that uses the channel's Markovian structure to determine the maximum-likelihood input sequence. We show that, for a particular class of FSMCs, this decoding scheme preserves the inherent channel capacity. We also present numerical results for the capacity and cutoff rate of a two-state variable-noise channel with 4-PSK modulation using the decision-feedback decoder.
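To make the setting concrete, here is a minimal sketch of an FSMC of the kind the abstract describes: a two-state variable-noise channel (modeled here as a binary symmetric channel in each state, rather than the paper's 4-PSK system), together with the recursive Bayesian update of the channel state probability conditioned on all past inputs and outputs. All parameter values, and the names `update_belief` and `simulate`, are our own illustrative choices, not the paper's.

```python
import random

# Hypothetical two-state "variable noise" channel:
# state 0 = low noise, state 1 = high noise (labels are ours, not the paper's).
P = [[0.95, 0.05],   # transition matrix of the hidden state Markov chain
     [0.10, 0.90]]
EPS = [0.01, 0.30]   # BSC crossover probability in each state

def update_belief(pi, x, y):
    """One step of the recursive estimate of P(state | past inputs and outputs):
    a Bayes update on the observed (input, output) pair, followed by a
    one-step prediction through the state transition matrix."""
    # likelihood of observing output y given input x in each state
    like = [EPS[s] if y != x else 1.0 - EPS[s] for s in range(2)]
    post = [pi[s] * like[s] for s in range(2)]
    z = sum(post)
    post = [p / z for p in post]
    # predict the next state distribution
    return [sum(post[s] * P[s][t] for s in range(2)) for t in range(2)]

def simulate(n, seed=0):
    """Drive the channel with i.i.d. uniform bits and track the state belief."""
    rng = random.Random(seed)
    state, pi = 0, [0.5, 0.5]
    beliefs = []
    for _ in range(n):
        x = rng.randint(0, 1)                   # i.i.d. uniform input bit
        y = x ^ (rng.random() < EPS[state])     # BSC output in current state
        pi = update_belief(pi, x, y)
        beliefs.append(pi)
        # advance the hidden channel state
        state = 0 if rng.random() < P[state][0] else 1
    return beliefs
```

Because the inputs are i.i.d., the abstract's convergence result says the distribution of this belief vector settles to a limit, from which Iiid can be computed; the sketch above only tracks the belief, it does not compute the mutual information.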

Original language: English (US)
State: Published - 1994
Externally published: Yes
Event: Proceedings of the 1994 IEEE International Symposium on Information Theory - Trondheim, Norway
Duration: Jun 27, 1994 - Jul 1, 1994


All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics

