Abstract
The Finite-State Markov Channel (FSMC) is a discrete time-varying channel whose variation is determined by a finite-state Markov process. We obtain the FSMC capacity as a function of the channel state probability conditioned on all past inputs and outputs, and of the channel state probability conditioned on all past outputs alone. We also show that when the channel inputs are i.i.d., both conditional probabilities converge in distribution. In this case, the maximum mutual information of the FSMC, I_iid, is determined from these limit distributions. A class of channels for which I_iid equals the Shannon capacity is also defined. Next, we consider coding techniques for these channels. We propose a decision-feedback decoding algorithm that uses the channel's Markovian structure to determine the maximum-likelihood input sequence. We show that, for a particular class of FSMCs, this decoding scheme preserves the inherent channel capacity. We also present numerical results for the capacity and cutoff rate of a two-state variable-noise channel with 4-PSK modulation using the decision-feedback decoder.
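The conditional channel state probability mentioned in the abstract (the state distribution given all past inputs and outputs) is the quantity that drives both the capacity expressions and the decision-feedback decoder. As a rough illustration only, and not the paper's construction, the sketch below simulates a two-state variable-noise channel modeled as a binary symmetric channel whose crossover probability depends on a hidden Markov state, and tracks that conditional state probability with a Bayesian forward recursion under i.i.d. inputs. The transition matrix, crossover probabilities, and all names are illustrative assumptions.

```python
import numpy as np

# Illustrative two-state FSMC (not taken from the paper): each hidden state is a
# binary symmetric channel with its own crossover probability ("variable noise").
P = np.array([[0.95, 0.05],      # P[i, j] = Pr(next state = j | current state = i)
              [0.10, 0.90]])
eps = np.array([0.01, 0.20])     # BSC crossover probability in the "good" / "bad" state

rng = np.random.default_rng(0)

def simulate(n_symbols=20):
    """Drive the channel with i.i.d. uniform binary inputs and track the channel
    state probability conditioned on all past inputs and outputs."""
    state = 0
    belief = np.array([0.5, 0.5])            # Pr(state | past inputs and outputs)
    for _ in range(n_symbols):
        x = rng.integers(2)                  # i.i.d. channel input
        y = x ^ (rng.random() < eps[state])  # output of the current BSC state
        # Likelihood of observing (x, y) under each possible state.
        lik = np.where(x == y, 1 - eps, eps)
        # Bayes update of the state belief, then one-step prediction through P.
        belief = belief * lik
        belief /= belief.sum()
        belief = belief @ P
        state = rng.choice(2, p=P[state])    # channel moves to its next state
        yield x, int(y), belief

for x, y, b in simulate():
    print(f"x={x} y={y}  Pr(good state)={b[0]:.3f}")
```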
Original language | English (US)
---|---
State | Published - 1994
Externally published | Yes
Event | Proceedings of the 1994 IEEE International Symposium on Information Theory - Trondheim, Norway; Duration: Jun 27 1994 → Jul 1 1994
Other
Other | Proceedings of the 1994 IEEE International Symposium on Information Theory
---|---
City | Trondheim, Norway
Period | 6/27/94 → 7/1/94
All Science Journal Classification (ASJC) codes
- Theoretical Computer Science
- Information Systems
- Modeling and Simulation
- Applied Mathematics