MMSE Bounds under Kullback-Leibler Divergence Constraints on the Joint Input-Output Distribution

Michael Fauß, H. Vincent Poor, Alex Dytso

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper proposes new lower and upper bounds on the minimum mean squared error (MMSE). The key idea is to minimize/maximize the MMSE subject to the constraint that the joint distribution of the input and output lies in a Kullback-Leibler divergence ball centered at some Gaussian distribution. The bounds are shown to hold for a larger family of distributions than the Cramér-Rao bound and to be sharper than the Cramér-Rao bound in some regimes.
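As background for the abstract, the sketch below is a minimal Monte Carlo illustration of the MMSE at the Gaussian center of such a KL ball, not an implementation of the paper's bounds: for the scalar Gaussian channel Y = sqrt(snr)·X + N with X, N standard normal, the conditional mean estimator is linear and the MMSE has the closed form 1/(1 + snr), which the Cramér-Rao bound matches in this case. The channel model and parameter values are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
snr = 2.0        # illustrative signal-to-noise ratio (assumption)
n = 200_000      # number of Monte Carlo samples

# Scalar Gaussian channel: Y = sqrt(snr) * X + N with X, N ~ N(0, 1).
x = rng.standard_normal(n)
y = np.sqrt(snr) * x + rng.standard_normal(n)

# For this jointly Gaussian pair the conditional mean is linear,
# E[X | Y] = sqrt(snr) / (1 + snr) * Y, and the MMSE is 1 / (1 + snr).
x_hat = np.sqrt(snr) / (1.0 + snr) * y
mmse_mc = np.mean((x - x_hat) ** 2)   # Monte Carlo estimate of the MMSE
mmse_closed = 1.0 / (1.0 + snr)       # closed-form MMSE for the Gaussian case

print(mmse_mc, mmse_closed)
```

For non-Gaussian joint distributions inside the KL ball the MMSE no longer has this closed form, which is where bounds of the kind the paper studies become useful.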

Original language: English (US)
Title of host publication: Conference Record of the 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020
Editors: Michael B. Matthews
Publisher: IEEE Computer Society
Pages: 1477-1478
Number of pages: 2
ISBN (Electronic): 9780738131269
DOIs
State: Published - Nov 1 2020
Externally published: Yes
Event: 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020 - Pacific Grove, United States
Duration: Nov 1 2020 - Nov 5 2020

Publication series

Name: Conference Record - Asilomar Conference on Signals, Systems and Computers
Volume: 2020-November
ISSN (Print): 1058-6393

Conference

Conference: 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020
Country/Territory: United States
City: Pacific Grove
Period: 11/1/20 - 11/5/20

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Computer Networks and Communications
