Abstract
A formula for the capacity of arbitrary single-user channels without feedback (not necessarily information stable, stationary, etc.) is proved. Capacity is shown to equal the supremum, over all input processes, of the input-output inf-information rate, defined as the liminf in probability of the normalized information density. The key to this result is a new converse approach based on a simple new lower bound on the error probability of m-ary hypothesis tests among equiprobable hypotheses. A necessary and sufficient condition for the validity of the strong converse is given, as well as general expressions for ε-capacity.
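The capacity formula stated verbally above can be sketched in standard notation as follows (the symbols here follow the usual conventions for this result and are not quoted from the abstract itself):

```latex
% Information density of the pair (X^n, Y^n):
i_{X^n W^n}(a^n; b^n) \;=\; \log \frac{P_{Y^n \mid X^n}(b^n \mid a^n)}{P_{Y^n}(b^n)}

% Inf-information rate: the liminf in probability of the
% normalized information density
\underline{I}(X;Y) \;=\; \operatorname*{liminf\ in\ prob.}_{n \to \infty} \;
  \frac{1}{n}\, i_{X^n W^n}(X^n; Y^n)

% Capacity without feedback: supremum over all input processes X
C \;=\; \sup_{X} \; \underline{I}(X;Y)
```

Here "liminf in probability" of a sequence of random variables denotes the largest constant that the sequence exceeds, asymptotically, with probability one.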
Field | Value
---|---
Original language | English (US)
Pages (from-to) | 1147-1157
Number of pages | 11
Journal | IEEE Transactions on Information Theory
Volume | 40
Issue number | 4
DOIs | 
State | Published - Jul 1994
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences
Keywords
- Shannon theory
- channel capacity
- channel coding theorem
- channels with memory
- strong converse