A Kullback-Leibler Divergence Variant of the Bayesian Cramér-Rao Bound

Michael Fauß, Alex Dytso, H. Vincent Poor

Research output: Contribution to journal › Article › peer-review

Abstract

This paper proposes a Bayesian Cramér-Rao type lower bound on the minimum mean square error (MMSE). The key idea is to minimize the MMSE subject to the constraint that the joint distribution of the input and output lies in a Kullback–Leibler divergence ball centered at a Gaussian reference distribution. The bound is tight and is attained by a Gaussian distribution whose mean is identical to that of the reference distribution and whose covariance matrix is determined by a scalar parameter that can be obtained by finding the unique root of a simple function. Examples of applications in signal processing and information theory illustrate the usefulness of the proposed bound in practice.
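The root-finding step mentioned in the abstract is easy to carry out numerically. The sketch below is illustrative only and is not the paper's own construction: it assumes, for the sake of a concrete example, that the least favorable distribution inflates the reference covariance by a scalar factor s, in which case the KL divergence from N(mu, s*Sigma) to N(mu, Sigma) collapses to (d/2)(s - 1 - ln s), and the unique root above 1 at a given ball radius eps can be found with Brent's method. The names kl_scale_gap, d, and eps are made up for this example and do not come from the paper.

    import numpy as np
    from scipy.optimize import brentq

    def kl_scale_gap(s, d, eps):
        # KL( N(mu, s*Sigma) || N(mu, Sigma) ) - eps for a scalar s > 0.
        # With a shared mean and covariances s*Sigma vs. Sigma, the KL
        # divergence reduces to (d/2) * (s - 1 - ln s), independent of
        # mu and Sigma; d is the dimension. (Illustrative assumption,
        # not the paper's actual root equation.)
        return 0.5 * d * (s - 1.0 - np.log(s)) - eps

    # Example: dimension d = 3 and a KL ball of radius eps = 0.1.
    d, eps = 3, 0.1

    # (d/2)(s - 1 - ln s) vanishes at s = 1 and is strictly increasing
    # for s > 1, so the inflated-covariance root is unique; bracket it
    # and solve with Brent's method.
    s_star = brentq(kl_scale_gap, 1.0 + 1e-12, 100.0, args=(d, eps))
    print(f"covariance scaling s* = {s_star:.6f}")

Under this illustrative setup, the scaling s* shrinks toward 1 as the dimension d grows for a fixed eps, since the same KL budget is spread over more coordinates.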

Original language: English (US)
Article number: 108933
Journal: Signal Processing
Volume: 207
DOIs
State: Published - Jun 2023
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering

Keywords

  • Cramér–Rao bound
  • Kullback–Leibler divergence
  • MMSE bounds
  • information inequalities
