A new entropy power inequality for integer-valued random variables

Saeid Haghighatshoar, Emmanuel Abbe, I. Emre Telatar

Research output: Contribution to journal › Article › peer-review



The entropy power inequality (EPI) yields lower bounds on the differential entropy of the sum of two independent real-valued random variables in terms of the individual entropies. Versions of the EPI for discrete random variables have been obtained for special families of distributions, with the differential entropy replaced by the discrete entropy, but no universal inequality is known (beyond trivial ones). More recently, the sumset theory for the entropy function has yielded a sharp inequality H(X+X′) − H(X) ≥ 1/2 − o(1) when X and X′ are independent identically distributed (i.i.d.) with high entropy. This paper provides the inequality H(X+X′) − H(X) ≥ g(H(X)), where X and X′ are arbitrary i.i.d. integer-valued random variables and where g is a universal strictly positive function on ℝ+ satisfying g(0) = 0. Extensions to nonidentically distributed random variables and to conditional entropies are also obtained.
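The quantity the abstract bounds can be illustrated numerically: for i.i.d. integer-valued X and X′, the distribution of X + X′ is the self-convolution of the pmf of X, and the entropy gap H(X+X′) − H(X) can be computed directly. The sketch below is illustrative only (the choice of a uniform pmf is an assumption, not taken from the paper) and checks that the gap exceeds the 1/2-bit threshold mentioned in the abstract for this high-entropy example.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits of a probability vector p."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Illustrative pmf of X on {0,...,7}; X' is an i.i.d. copy, so the
# law of X + X' is the convolution of the pmf with itself.
p = np.ones(8) / 8.0
p_sum = np.convolve(p, p)  # pmf of X + X' on {0,...,14}

gap = entropy_bits(p_sum) - entropy_bits(p)
print(f"H(X)            = {entropy_bits(p):.4f} bits")
print(f"H(X+X') - H(X)  = {gap:.4f} bits")
```

For this uniform example H(X) = 3 bits and the gap is roughly 0.70 bits, consistent with a strictly positive g(H(X)).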

Original language: English (US)
Article number: 6797921
Pages (from-to): 3787-3796
Number of pages: 10
Journal: IEEE Transactions on Information Theory
Issue number: 7
State: Published - Jul 2014

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences


Keywords

  • Entropy inequalities
  • Mrs. Gerber's Lemma
  • Shannon sumset theory
  • doubling constant
  • entropy power inequality

