Maximin performance of binary-input channels with uncertain noise distributions

Andrew L. McKellips, Sergio Verdú

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

We consider uncertainty classes of noise distributions defined by a bound on the divergence with respect to a nominal noise distribution. The noise that maximizes the minimum error probability for binary-input channels is found. The effect of the reduction in uncertainty brought about by knowledge of the signal-to-noise ratio is also studied. The particular class of Gaussian nominal distributions provides an analysis tool for near-Gaussian channels. Asymptotic behavior of the least favorable noise distribution and resulting error probability are studied in a variety of scenarios, namely: asymptotically small divergence with and without power constraint; asymptotically large divergence with and without power constraint; and asymptotically large signal-to-noise ratio.
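The quantities in the abstract can be illustrated with standard closed-form expressions: the nominal error probability of equiprobable antipodal signaling in Gaussian noise is Q(√SNR), and the divergence defining the uncertainty class is the Kullback-Leibler divergence with respect to the nominal. The sketch below is illustrative only and is not the paper's maximin construction; the function names and the Gaussian-vs-Gaussian divergence formula are standard facts, not taken from the article.

```python
import math

def q_function(x: float) -> float:
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def nominal_error_probability(snr: float) -> float:
    """Minimum error probability of equiprobable antipodal signaling
    (+A vs -A) in zero-mean Gaussian noise, with SNR = A^2 / sigma^2:
    Pe = Q(sqrt(SNR))."""
    return q_function(math.sqrt(snr))

def gaussian_kl(mu1: float, var1: float, mu2: float, var2: float) -> float:
    """Kullback-Leibler divergence D(N(mu1,var1) || N(mu2,var2)) in nats.
    In the paper's setting, a bound D(P || P_nominal) <= epsilon defines
    the uncertainty class of admissible noise distributions P."""
    return 0.5 * (math.log(var2 / var1)
                  + (var1 + (mu1 - mu2) ** 2) / var2
                  - 1.0)
```

For example, at SNR = 1 the nominal error probability is Q(1) ≈ 0.159, and a candidate Gaussian noise law identical to the nominal has divergence 0, so it lies in every uncertainty class.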

Original language: English (US)
Pages (from-to): 947-972
Number of pages: 26
Journal: IEEE Transactions on Information Theory
Volume: 44
Issue number: 3
State: Published - 1998

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

Keywords

  • Detection
  • Gaussian error probability
  • Hypothesis testing
  • Kullback-Leibler divergence
  • Least favorable noise

