TY - GEN
T1 - Estimation of entropy rate and Rényi entropy rate for Markov chains
AU - Kamath, Sudeep
AU - Verdú, Sergio
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/8/10
Y1 - 2016/8/10
N2 - Estimation of the entropy rate of a stochastic process with unknown statistics from a single sample path is a classical problem in information theory. While universal estimators for general families of processes exist, the estimates have not been accompanied by guarantees for fixed-length sample paths. We provide finite-sample bounds on the convergence of a plug-in-type estimator for the entropy rate of a Markov chain in terms of its alphabet size and its mixing properties. We also discuss Rényi entropy rate estimation for reversible Markov chains.
AB - Estimation of the entropy rate of a stochastic process with unknown statistics from a single sample path is a classical problem in information theory. While universal estimators for general families of processes exist, the estimates have not been accompanied by guarantees for fixed-length sample paths. We provide finite-sample bounds on the convergence of a plug-in-type estimator for the entropy rate of a Markov chain in terms of its alphabet size and its mixing properties. We also discuss Rényi entropy rate estimation for reversible Markov chains.
UR - http://www.scopus.com/inward/record.url?scp=84985992228&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84985992228&partnerID=8YFLogxK
U2 - 10.1109/ISIT.2016.7541386
DO - 10.1109/ISIT.2016.7541386
M3 - Conference contribution
AN - SCOPUS:84985992228
T3 - IEEE International Symposium on Information Theory - Proceedings
SP - 685
EP - 689
BT - Proceedings - ISIT 2016; 2016 IEEE International Symposium on Information Theory
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2016 IEEE International Symposium on Information Theory, ISIT 2016
Y2 - 10 July 2016 through 15 July 2016
ER -