TY - CPAPER
T1 - Capacity definitions of general channels with receiver side information
AU - Effros, Michelle
AU - Goldsmith, Andrea
AU - Liang, Yifan
PY - 2007
Y1 - 2007
N2 - We consider three capacity definitions for general channels with channel side information at the receiver, where the channel is modeled as a sequence of finite-dimensional conditional distributions that are not necessarily stationary, ergodic, or information stable. The Shannon capacity is the highest rate asymptotically achievable with arbitrarily small error probability. The outage capacity is the highest rate asymptotically achievable with a given probability of decoder-recognized outage. The expected capacity is the highest expected rate asymptotically achievable with a single encoder and multiple decoders, where the channel side information determines the decoder in use. Expected capacity equals Shannon capacity for channels governed by a stationary ergodic random process but is typically greater for general channels. These alternative definitions essentially relax the constraint that all transmitted information must be decoded at the receiver. We derive expressions for these capacity definitions in terms of the information density. Examples are also provided to demonstrate their implications.
UR - http://www.scopus.com/inward/record.url?scp=51249094680&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=51249094680&partnerID=8YFLogxK
U2 - 10.1109/ISIT.2007.4557342
DO - 10.1109/ISIT.2007.4557342
M3 - Conference contribution
AN - SCOPUS:51249094680
SN - 1424414296
SN - 9781424414291
T3 - IEEE International Symposium on Information Theory - Proceedings
SP - 921
EP - 925
BT - Proceedings - 2007 IEEE International Symposium on Information Theory, ISIT 2007
T2 - 2007 IEEE International Symposium on Information Theory, ISIT 2007
Y2 - 24 June 2007 through 29 June 2007
ER -