Communication requirements for generating correlated random variables

Paul Cuff

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

62 Scopus citations

Abstract

Two familiar notions of correlation are rediscovered as extreme operating points for simulating a discrete memoryless channel, in which a channel output is generated based only on a description of the channel input. Wyner's "common information" coincides with the minimum description rate needed. However, when common randomness independent of the input is available, the necessary description rate reduces to Shannon's mutual information. This work characterizes the optimal tradeoff between the amount of common randomness used and the required rate of description.
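For readers scanning the abstract, the two extreme operating points can be written out explicitly. The following is a sketch in standard information-theoretic notation; the auxiliary variable U and the Markov-chain constraint come from Wyner's usual definition of common information and are assumptions not spelled out in the abstract itself (X denotes the channel input, Y the simulated channel output). With no common randomness the minimum description rate is Wyner's common information, and with unlimited common randomness it drops to Shannon's mutual information:

% Sketch of the extreme points of the rate tradeoff described in the abstract
\[
  R_{\min}\big|_{\text{no common randomness}}
    \;=\; C(X;Y)
    \;=\; \min_{p(u \mid x,y)\,:\; X - U - Y} I(X,Y;U),
  \qquad
  R_{\min}\big|_{\text{unlimited common randomness}}
    \;=\; I(X;Y).
\]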

Original language: English (US)
Title of host publication: Proceedings - 2008 IEEE International Symposium on Information Theory, ISIT 2008
Pages: 1393-1397
Number of pages: 5
DOIs
State: Published - 2008
Event: 2008 IEEE International Symposium on Information Theory, ISIT 2008 - Toronto, ON, Canada
Duration: Jul 6 2008 - Jul 11 2008

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
ISSN (Print): 2157-8101

Other

Other: 2008 IEEE International Symposium on Information Theory, ISIT 2008
Country/Territory: Canada
City: Toronto, ON
Period: 7/6/08 - 7/11/08

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics
