Approximation theory of output statistics

Te Sun Han, Sergio Verdú

Research output: Contribution to journal › Article › peer-review

457 Scopus citations


Given a channel and an input process, the minimum randomness of those input processes whose output statistics approximate the original output statistics with arbitrary accuracy is studied. The notion of resolvability of a channel is introduced, defined as the number of random bits required per channel use in order to generate an input that achieves arbitrarily accurate approximation of the output statistics for any given input process. A general formula for resolvability that holds regardless of the channel memory structure is obtained. It is shown that, for most channels, resolvability is equal to the Shannon capacity. By-products of the analysis are a general formula for the minimum achievable (fixed-length) source coding rate of any finite-alphabet source, and a strong converse of the identification coding theorem, which holds for any channel that satisfies the strong converse of the channel coding theorem.
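The resolvability idea in the abstract can be illustrated numerically. The following sketch (not from the paper; the channel, parameters, and function names are illustrative assumptions) drives a binary symmetric channel with inputs drawn uniformly from a restricted codebook and measures how far the resulting output statistics are from the true output distribution, here the uniform distribution induced by an i.i.d. uniform input.

```python
import itertools

# Hedged sketch: resolvability on a BSC(p). With an i.i.d. uniform input,
# the true output distribution over n channel uses is uniform on {0,1}^n.
# We approximate it by an input that is uniform over M codewords
# (log2(M)/n random bits per channel use) and compute the total
# variation distance to the true output statistics.

def bsc_output_dist(codewords, n, p):
    """Output distribution of a BSC(p) when the input is uniform
    over the given list of length-n binary codewords."""
    M = len(codewords)
    dist = {}
    for y in itertools.product((0, 1), repeat=n):
        q = 0.0
        for x in codewords:
            d = sum(a != b for a, b in zip(x, y))  # Hamming distance
            q += p**d * (1 - p)**(n - d)
        dist[y] = q / M
    return dist

def tv_to_uniform(dist, n):
    """Total variation distance to the uniform distribution on {0,1}^n."""
    u = 2.0**-n
    return 0.5 * sum(abs(q - u) for q in dist.values())

n, p = 6, 0.1
all_inputs = list(itertools.product((0, 1), repeat=n))

# Zero input randomness (a single fixed codeword): poor approximation.
tv_one = tv_to_uniform(bsc_output_dist([(0,) * n], n, p), n)
# Full input randomness (all 2^n codewords): output statistics match exactly.
tv_full = tv_to_uniform(bsc_output_dist(all_inputs, n, p), n)

print(f"TV with 1 codeword:    {tv_one:.4f}")
print(f"TV with 2^n codewords: {tv_full:.4f}")
```

The interesting regime, which the paper characterizes, is in between: how small log2(M)/n can be while the total variation still vanishes as n grows; the general formula shows this rate equals the Shannon capacity for most channels.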

Original language: English (US)
Pages (from-to): 752-772
Number of pages: 21
Journal: IEEE Transactions on Information Theory
Issue number: 3
State: Published - 1993

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

