TY - JOUR
T1 - Generalizing capacity
T2 - New definitions and capacity theorems for composite channels
AU - Effros, Michelle
AU - Goldsmith, Andrea
AU - Liang, Yifan
N1 - Funding Information:
Manuscript received April 24, 2008; revised February 28, 2010. Current version published June 16, 2010. This work was supported by the DARPA IT-MANET program under grant number 1105741-1-TFIND. The material in this paper was presented in part at the IEEE International Symposium on Information Theory, Cambridge, MA, August 1998; the IEEE International Symposium on Information Theory, Nice, France, June 2007; and the IEEE Information Theory Workshop, Lake Tahoe, CA, September 2007. M. Effros is with the Department of Electrical Engineering, California Institute of Technology, Pasadena, CA 91125 USA (e-mail: [email protected]). A. Goldsmith is with the Department of Electrical Engineering, Stanford University, Stanford, CA 94305 USA (e-mail: [email protected]). Y. (E.) Liang is with Goldman, Sachs & Co., New York, NY 10004-1950 USA (e-mail: [email protected]). Communicated by G. Kramer, Associate Editor for Shannon Theory. Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org. Digital Object Identifier 10.1109/TIT.2010.2048456
PY - 2010/7
Y1 - 2010/7
N2 - We consider three capacity definitions for composite channels with channel side information at the receiver. A composite channel consists of a collection of different channels with a distribution characterizing the probability that each channel is in operation. The Shannon capacity of a channel is the highest rate asymptotically achievable with arbitrarily small error probability. Under this definition, the transmission strategy used to achieve the capacity must achieve arbitrarily small error probability for all channels in the collection comprising the composite channel. The resulting capacity is dominated by the worst channel in its collection, no matter how unlikely that channel is. We, therefore, broaden the definition of capacity to allow for some outage. The capacity versus outage is the highest rate asymptotically achievable with a given probability of decoder-recognized outage. The expected capacity is the highest average rate asymptotically achievable with a single encoder and multiple decoders, where channel side information determines the channel in use. The expected capacity is a generalization of capacity versus outage since codes designed for capacity versus outage decode at one of two rates (rate zero when the channel is in outage and the target rate otherwise) while codes designed for expected capacity can decode at many rates. Expected capacity equals Shannon capacity for channels governed by a stationary ergodic random process but is typically greater for general channels. The capacity versus outage and expected capacity definitions relax the constraint that all transmitted information must be decoded at the receiver. We derive channel coding theorems for these capacity definitions through information density and provide numerical examples to highlight their connections and differences. We also discuss the implications of these alternative capacity definitions for end-to-end distortion, source-channel coding, and separation.
KW - Capacity versus outage
KW - Composite channel
KW - Expected capacity
KW - Information density
KW - Separation
KW - Shannon capacity
UR - http://www.scopus.com/inward/record.url?scp=77953771347&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=77953771347&partnerID=8YFLogxK
U2 - 10.1109/TIT.2010.2048456
DO - 10.1109/TIT.2010.2048456
M3 - Article
AN - SCOPUS:77953771347
SN - 0018-9448
VL - 56
SP - 3069
EP - 3087
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 7
M1 - 5485015
ER -