Erasure entropy

Sergio Verdú, Tsachy Weissman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

8 Scopus citations


We define the erasure entropy of a collection of random variables as the sum of the entropies of the individual variables, each conditioned on all the rest. The erasure entropy rate of a source is defined as the limit of the normalized erasure entropy. Erasure entropy measures the information content carried by each symbol given its context. In the setup of a source observed through an erasure channel, we offer an operational characterization of the erasure entropy rate as the minimal number of bits per erasure required to recover the erased information in the limit of small erasure probability. When we allow recovery of the erased symbols within a prescribed degree of distortion, the fundamental tradeoff is described by the erasure rate-distortion function, which we characterize. When no additional encoded information is available, the erased information is reconstructed solely on the basis of its context by a denoiser. Connections between erasure entropy and discrete denoising are also explored.
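The definition in the abstract can be illustrated numerically. Below is a minimal sketch (the function names are my own, not from the paper) that computes the erasure entropy H⁻(Xⁿ) = Σᵢ H(Xᵢ | rest) of a small joint pmf, using the identity H(Xᵢ | rest) = H(Xⁿ) − H(Xⁿ without Xᵢ):

```python
import math

def joint_entropy(pmf):
    """Shannon entropy (bits) of a joint pmf given as {outcome tuple: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal_excluding(pmf, i):
    """Marginal pmf of all variables except the i-th."""
    out = {}
    for x, p in pmf.items():
        key = x[:i] + x[i + 1:]
        out[key] = out.get(key, 0.0) + p
    return out

def erasure_entropy(pmf):
    """H^-(X^n) = sum over i of H(X_i | rest), with
    H(X_i | rest) = H(X^n) - H(all variables except X_i)."""
    n = len(next(iter(pmf)))
    h_joint = joint_entropy(pmf)
    return sum(h_joint - joint_entropy(marginal_excluding(pmf, i))
               for i in range(n))

# Two independent fair bits: context tells us nothing, so each symbol
# still carries 1 bit and the erasure entropy equals the joint entropy.
iid = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(erasure_entropy(iid))   # 2.0

# Two perfectly correlated bits: each symbol is determined by its context,
# so the erasure entropy is 0 even though the joint entropy is 1 bit.
corr = {(0, 0): 0.5, (1, 1): 0.5}
print(erasure_entropy(corr))  # 0.0
```

The second example shows why erasure entropy can fall below Shannon entropy: conditioning each symbol on its full context discounts information that the context already supplies, which is exactly the quantity relevant to recovering isolated erasures.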

Original language: English (US)
Title of host publication: Proceedings - 2006 IEEE International Symposium on Information Theory, ISIT 2006
Number of pages: 5
State: Published - 2006
Event: 2006 IEEE International Symposium on Information Theory, ISIT 2006 - Seattle, WA, United States
Duration: Jul 9, 2006 – Jul 14, 2006

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
ISSN (Print): 2157-8101


Other: 2006 IEEE International Symposium on Information Theory, ISIT 2006
Country/Territory: United States
City: Seattle, WA

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics


Keywords

  • Data compression
  • Discrete denoising
  • Entropy
  • Erasure channels
  • Markov processes
  • Rate-distortion theory
  • Shannon theory

