TY - JOUR
T1 - Capacity of finite state channels based on Lyapunov exponents of random matrices
AU - Holliday, Tim
AU - Goldsmith, Andrea
AU - Glynn, Peter
N1 - Funding Information:
Manuscript received July 18, 2003; revised October 19, 2005. This work was supported by the National Science Foundation under Grant CMS-0120912 and by the Office of Naval Research under Grant N00014-05-0168. The material in this paper was presented in part at the IEEE International Symposium on Information Theory, Yokohama, Japan, June/July 2003. T. Holliday is with Princeton University, Princeton, NJ 08544 USA (e-mail: [email protected]). A. Goldsmith and P. Glynn are with Stanford University, Stanford, CA 94305 USA (e-mail: [email protected]; [email protected]). Communicated by A. Kavčić, Associate Editor for Detection and Estimation. Digital Object Identifier 10.1109/TIT.2006.878230
PY - 2006/8
Y1 - 2006/8
N2 - The finite-state Markov channel (FSMC) is a time-varying channel whose states are characterized by a finite-state Markov chain. These channels have infinite memory, which complicates their capacity analysis. We develop a new method to characterize the capacity of these channels based on Lyapunov exponents. Specifically, we show that the input, output, and conditional entropies for this channel are equivalent to the largest Lyapunov exponents for a particular class of random matrix products. We then show that the Lyapunov exponents can be expressed as expectations with respect to the stationary distributions of a class of continuous state-space Markov chains. This class of Markov chains, which is closely related to the prediction filter in hidden Markov models, is shown to be nonirreducible. Hence, much of the standard theory for continuous state-space Markov chains cannot be applied to establish the existence and uniqueness of stationary distributions, nor do we have direct access to a central limit theorem (CLT). To address these shortcomings, we utilize several results from the theory of random matrix products and Lyapunov exponents. The stationary distributions for this class of Markov chains are shown to be unique and continuous functions of the input symbol probabilities, provided that the input sequence has finite memory. These properties allow us to express mutual information and channel capacity in terms of Lyapunov exponents. We then leverage this connection between entropy and Lyapunov exponents to develop a rigorous theory for computing or approximating entropy and mutual information for finite-state channels with dependent inputs. We develop a method for directly computing the entropy of finite-state channels that does not rely on simulation and establish its convergence. We also obtain a new asymptotically tight lower bound for entropy based on norms of random matrix products. In addition, we prove a new functional CLT for sample entropy and apply this theorem to characterize the error in simulated estimates of entropy. Finally, we present numerical examples of mutual information computation for intersymbol interference (ISI) channels and observe the capacity benefits of adding memory to the input sequence for such channels.
AB - The finite-state Markov channel (FSMC) is a time-varying channel whose states are characterized by a finite-state Markov chain. These channels have infinite memory, which complicates their capacity analysis. We develop a new method to characterize the capacity of these channels based on Lyapunov exponents. Specifically, we show that the input, output, and conditional entropies for this channel are equivalent to the largest Lyapunov exponents for a particular class of random matrix products. We then show that the Lyapunov exponents can be expressed as expectations with respect to the stationary distributions of a class of continuous state-space Markov chains. This class of Markov chains, which is closely related to the prediction filter in hidden Markov models, is shown to be nonirreducible. Hence, much of the standard theory for continuous state-space Markov chains cannot be applied to establish the existence and uniqueness of stationary distributions, nor do we have direct access to a central limit theorem (CLT). To address these shortcomings, we utilize several results from the theory of random matrix products and Lyapunov exponents. The stationary distributions for this class of Markov chains are shown to be unique and continuous functions of the input symbol probabilities, provided that the input sequence has finite memory. These properties allow us to express mutual information and channel capacity in terms of Lyapunov exponents. We then leverage this connection between entropy and Lyapunov exponents to develop a rigorous theory for computing or approximating entropy and mutual information for finite-state channels with dependent inputs. We develop a method for directly computing the entropy of finite-state channels that does not rely on simulation and establish its convergence. We also obtain a new asymptotically tight lower bound for entropy based on norms of random matrix products. In addition, we prove a new functional CLT for sample entropy and apply this theorem to characterize the error in simulated estimates of entropy. Finally, we present numerical examples of mutual information computation for intersymbol interference (ISI) channels and observe the capacity benefits of adding memory to the input sequence for such channels.
KW - Finite-state channel
KW - Hidden Markov model
KW - Lyapunov exponent
KW - Random matrices
KW - Shannon capacity
UR - http://www.scopus.com/inward/record.url?scp=33746645650&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=33746645650&partnerID=8YFLogxK
U2 - 10.1109/TIT.2006.878230
DO - 10.1109/TIT.2006.878230
M3 - Article
AN - SCOPUS:33746645650
SN - 0018-9448
VL - 52
SP - 3509
EP - 3532
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 8
ER -