Bayesian Risk with Bregman Loss: A Cramér-Rao Type Bound and Linear Estimation

Alex Dytso, Michael Fauß, H. Vincent Poor

Research output: Contribution to journal › Article › peer-review



A general class of lower bounds on the Bayesian risk is derived for the case in which the underlying loss function is a Bregman divergence. This class can be viewed as an extension of the Weinstein-Weiss family of bounds for the mean squared error and relies on a variational characterization of the Bayesian risk. This approach yields a version of the Cramér-Rao bound tailored to a given Bregman divergence; the generalization reduces to the classical Cramér-Rao bound when the loss function is the squared Euclidean norm. To assess the tightness of the new lower bounds, the paper also develops upper bounds on the Bayesian risk based on optimal linear estimators. The effectiveness of the new bounds is evaluated in the Poisson noise setting.
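As background for the abstract above, a minimal illustrative sketch (not from the paper) of a Bregman divergence D_φ(p, q) = φ(p) − φ(q) − ⟨∇φ(q), p − q⟩ may help: choosing φ(x) = ‖x‖² recovers the squared Euclidean loss underlying the classical Cramér-Rao bound, while φ(x) = Σ xᵢ log xᵢ yields the generalized Kullback-Leibler divergence. The function names below are illustrative choices, not the paper's notation.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, p, q):
    """D_phi(p, q) = phi(p) - phi(q) - <grad_phi(q), p - q>."""
    return phi(p) - phi(q) - np.dot(grad_phi(q), p - q)

# Case 1: phi(x) = ||x||^2 recovers the squared Euclidean loss.
sq = lambda x: np.dot(x, x)
sq_grad = lambda x: 2.0 * x

p = np.array([1.0, 2.0])
q = np.array([0.5, 1.0])
d_sq = bregman_divergence(sq, sq_grad, p, q)
assert np.isclose(d_sq, np.sum((p - q) ** 2))  # equals squared error, 1.25

# Case 2: phi(x) = sum x log x (negative entropy) gives the
# generalized KL divergence: sum p log(p/q) - sum p + sum q.
negent = lambda x: np.sum(x * np.log(x))
negent_grad = lambda x: np.log(x) + 1.0
d_kl = bregman_divergence(negent, negent_grad, p, q)
assert np.isclose(d_kl, np.sum(p * np.log(p / q)) - np.sum(p) + np.sum(q))
```

Both identities follow directly from expanding the definition of D_φ for the chosen φ; the same template covers the whole Bregman family considered in the paper.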

Original language: English (US)
Pages (from-to): 1985-2000
Number of pages: 16
Journal: IEEE Transactions on Information Theory
Issue number: 3
State: Published - Mar 1 2022
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences


Keywords

  • Bregman divergence
  • Cramér-Rao
  • Gaussian noise
  • Linear estimation
  • Poisson noise
  • Minimum mean squared error (MMSE)

