Abstract
Additive-noise channels with binary inputs and zero-threshold detection are considered. We study worst-case noise under the maximum-error-probability criterion, with constraints on both power and divergence relative to a given symmetric nominal noise distribution. Particular attention is focused on (a) Gaussian nominal distributions and (b) the asymptotic growth of the worst-case error probability as the divergence tolerance tends to zero.
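The setting above can be illustrated with a minimal sketch, assuming binary antipodal inputs ±A, zero-threshold detection, and a zero-mean Gaussian nominal noise distribution; under these assumptions the nominal error probability is Q(A/σ), and the divergence constraint can be evaluated in closed form between zero-mean Gaussians. The function names and the choice of a Gaussian perturbation are illustrative, not taken from the paper.

```python
import math

def q_function(x):
    # Gaussian tail probability Q(x) = P(N(0,1) > x)
    return 0.5 * math.erfc(x / math.sqrt(2))

def error_probability(amplitude, sigma):
    # Error probability for binary antipodal inputs +/-A with
    # zero-threshold detection and zero-mean Gaussian noise of
    # standard deviation sigma: P_e = Q(A / sigma).
    return q_function(amplitude / sigma)

def kl_zero_mean_gaussians(sigma_actual, sigma_nominal):
    # Kullback-Leibler divergence D(N(0, sa^2) || N(0, sn^2))
    # between two zero-mean Gaussian densities.
    r = (sigma_actual / sigma_nominal) ** 2
    return 0.5 * (r - 1.0 - math.log(r))
```

For example, `error_probability(1.0, 1.0)` gives Q(1) ≈ 0.1587, and `kl_zero_mean_gaussians(s, s)` is 0 for any s > 0, so a small divergence tolerance confines the noise to a neighborhood of the nominal distribution.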
| Original language | English (US) |
|---|---|
| Pages (from-to) | 1256-1264 |
| Number of pages | 9 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 43 |
| Issue number | 4 |
| DOIs | |
| State | Published - 1997 |
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences
Keywords
- Detection
- Gaussian error probability
- Hypothesis testing
- Kullback-Leibler divergence
- Least favorable noise