Abstract
We consider uncertainty classes of noise distributions defined by a bound on their divergence with respect to a nominal noise distribution. The noise distribution that maximizes the minimum error probability for binary-input channels is found. The effect of the reduction in uncertainty brought about by knowledge of the signal-to-noise ratio is also studied. The particular class of Gaussian nominal distributions provides an analysis tool for near-Gaussian channels. The asymptotic behavior of the least favorable noise distribution and of the resulting error probability is studied in a variety of scenarios: asymptotically small divergence, with and without a power constraint; asymptotically large divergence, with and without a power constraint; and asymptotically large signal-to-noise ratio.
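As a rough numerical illustration of the setting only (not the paper's construction, whose least favorable noise is in general non-Gaussian), the sketch below restricts the divergence ball around a unit-variance Gaussian nominal to variance-inflated Gaussians and grid-searches for the worst error probability of antipodal signaling under maximum-likelihood detection. The divergence budget `eps`, the SNR value, and the helper names are illustrative assumptions, not quantities from the paper.

```python
import numpy as np
from scipy.stats import norm

def kl_gauss(sigma2, sigma2_nom=1.0):
    """KL divergence D(N(0, sigma2) || N(0, sigma2_nom)) in nats."""
    r = sigma2 / sigma2_nom
    return 0.5 * (r - 1.0 - np.log(r))

def error_prob_antipodal(snr_db, sigma2):
    """Error probability of ML detection of +/-s in N(0, sigma2) noise,
    with s set so that s^2 equals the nominal SNR (nominal variance 1)."""
    s = np.sqrt(10 ** (snr_db / 10.0))
    return norm.sf(s / np.sqrt(sigma2))  # Q(s / sigma)

# Grid search over variance-inflated Gaussians inside the divergence ball.
eps = 0.05          # divergence budget in nats (illustrative value)
snr_db = 6.0        # nominal signal-to-noise ratio (illustrative value)
sigmas2 = np.linspace(1.0, 3.0, 2001)
feasible = sigmas2[kl_gauss(sigmas2) <= eps]
worst = max(error_prob_antipodal(snr_db, v) for v in feasible)
print(f"nominal Pe = {error_prob_antipodal(snr_db, 1.0):.3e}, "
      f"worst Gaussian-in-ball Pe = {worst:.3e}")
```

Within this restricted Gaussian family, the worst-case error probability grows with the divergence budget and approaches the nominal value Q(s/sigma) as the budget shrinks, loosely mirroring the small- and large-divergence asymptotics analyzed in the paper.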
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 947-972 |
| Number of pages | 26 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 44 |
| Issue number | 3 |
| DOIs | |
| State | Published - 1998 |
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences
Keywords
- Detection
- Gaussian error probability
- Hypothesis testing
- Kullback-Leibler divergence
- Least favorable noise