Distributed channel synthesis

Paul Cuff

Research output: Contribution to journal › Article › peer-review

147 Scopus citations


Two familiar notions of correlation are rediscovered as the extreme operating points for distributed synthesis of a discrete memoryless channel, in which a stochastic channel output is generated based on a compressed description of the channel input. Wyner's common information is the minimum description rate needed. However, when common randomness independent of the input is available, the necessary description rate reduces to Shannon's mutual information. This paper characterizes the optimal tradeoff between the amount of common randomness used and the required rate of description. We also include a number of related derivations, including the effect of limited local randomness, rate requirements for secrecy, applications to game theory, and new insights into common information duality. Our proof makes use of a soft covering lemma, known in the literature for its role in quantifying the resolvability of a channel. The direct proof (achievability) constructs a feasible joint distribution over all parts of the system using a soft covering, from which the behavior of the encoder and decoder is inferred, with no explicit reference to joint typicality or binning. Of auxiliary interest, this paper also generalizes and strengthens this soft covering tool.
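To make the extreme operating points concrete: the abstract states that with unlimited common randomness the required description rate drops to Shannon's mutual information I(X;Y). The following is a minimal sketch, not code from the paper; the joint distribution (a uniform input through a binary symmetric channel with crossover 0.1) is a made-up illustrative example.

```python
import math

def mutual_information(p_xy):
    """I(X;Y) in bits for a joint pmf given as a dict {(x, y): prob}."""
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    return sum(p * math.log2(p / (p_x[x] * p_y[y]))
               for (x, y), p in p_xy.items() if p > 0)

# Illustrative joint pmf: uniform X through a BSC with crossover 0.1.
# With unlimited common randomness, I(X;Y) = 1 - h(0.1) bits is the
# minimum description rate; without it, Wyner's common information
# (which is at least I(X;Y)) is needed instead.
eps = 0.1
p_xy = {(0, 0): 0.5 * (1 - eps), (0, 1): 0.5 * eps,
        (1, 0): 0.5 * eps, (1, 1): 0.5 * (1 - eps)}
rate = mutual_information(p_xy)  # ≈ 0.531 bits
```

The gap between this value and Wyner's common information is exactly the tradeoff region the paper characterizes.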

Original language: English (US)
Article number: 6584816
Pages (from-to): 7071-7096
Number of pages: 26
Journal: IEEE Transactions on Information Theory
Issue number: 11
State: Published - 2013

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

Keywords


  • Channel simulation
  • channel synthesis
  • common information
  • random number generator
  • resolvability
  • reverse Shannon theorem
  • soft covering
  • total variation distance

