Abstract
The Finite-State Markov Channel (FSMC) is a discrete time-varying channel whose variation is determined by a finite-state Markov process. These channels have memory due to the Markov channel variation. We obtain the FSMC capacity as a function of the conditional channel state probability. We also show that for i.i.d. channel inputs, this conditional probability converges weakly, and the channel's mutual information is then a closed-form continuous function of the input distribution. We next consider coding for FSMC's. In general, the complexity of maximum-likelihood decoding grows exponentially with the channel memory length. Therefore, in practice, interleaving and memoryless channel codes are used. This technique results in some performance loss relative to the inherent capacity of channels with memory. We propose a maximum-likelihood decision-feedback decoder with complexity that is independent of the channel memory. We calculate the capacity and cutoff rate of our technique, and show that it preserves the capacity of certain FSMC's. We also compare the performance of the decision-feedback decoder with that of interleaving and memoryless channel coding on a fading channel with 4PSK modulation.
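The paper itself contains no code; purely as an illustrative sketch, the snippet below assumes a two-state Gilbert-Elliott FSMC (each state a binary symmetric channel with its own crossover probability) and shows the standard forward recursion that tracks the conditional channel state probability referred to in the abstract. All parameter values and function names here are made up for the example and are not taken from the paper.

```python
import numpy as np

# Hypothetical two-state Gilbert-Elliott FSMC: each state is a binary
# symmetric channel (BSC) with its own crossover probability, and the
# state evolves as a Markov chain independent of the channel input.
P = np.array([[0.95, 0.05],    # state transition matrix P[s, s']
              [0.10, 0.90]])
eps = np.array([0.01, 0.30])   # BSC crossover probability in each state

rng = np.random.default_rng(0)

def simulate(x):
    """Pass the binary input sequence x through the FSMC; return outputs."""
    s = rng.choice(2)                      # random initial state
    y = np.empty_like(x)
    for n, xn in enumerate(x):
        flip = rng.random() < eps[s]       # channel error in current state
        y[n] = xn ^ flip
        s = rng.choice(2, p=P[s])          # Markov state transition
    return y

def state_filter(x, y, pi0=np.array([0.5, 0.5])):
    """Forward recursion for the conditional channel state probability
    pi_n(s) = Pr(S_n = s | inputs and outputs up to time n), the kind of
    quantity the FSMC capacity expression is written in terms of."""
    pi = pi0.copy()
    history = [pi.copy()]
    for xn, yn in zip(x, y):
        lik = np.where(xn == yn, 1 - eps, eps)   # p(y_n | x_n, s) per state
        pi = pi * lik                             # incorporate observation
        pi = pi / pi.sum()                        # normalize
        pi = pi @ P                               # predict next channel state
        history.append(pi.copy())
    return np.array(history)

x = rng.integers(0, 2, size=20)
y = simulate(x)
print(state_filter(x, y))
```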
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 868-886 |
| Number of pages | 19 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 42 |
| Issue number | 3 |
| DOIs | |
| State | Published - 1996 |
| Externally published | Yes |
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences
Keywords
- Capacity
- Decision-feedback maximum-likelihood decoding
- Finite-state Markov channels
- Mutual information