### Abstract

Consider two random variables X and Y, where X is finitely (or countably-infinitely) valued, and where Y is arbitrary. Let ∊ denote the minimum probability of error incurred in estimating X from Y. It is shown that ∊ ≥ (1 − α) P[π(X|Y) ≤ α] for every α ∈ [0, 1), where π(X|Y) denotes the posterior probability of X given Y. This bound finds information-theoretic applications in the proof of converse channel coding theorems. It generalizes and strengthens previous lower bounds due to Shannon, and to Verdú and Han.
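The bound stated in the abstract can be checked numerically on a small discrete example. The sketch below (assuming the bound in the form ∊ ≥ (1 − α) P[π(X|Y) ≤ α] for α ∈ [0, 1); the joint pmf is a hypothetical toy example, not from the paper) computes the minimum error probability of the MAP estimator and verifies that the lower bound holds for several values of α:

```python
import numpy as np

# Hypothetical joint pmf P(x, y) for a 3-ary X and 4-ary Y
# (rows index x, columns index y); not taken from the paper.
P = np.array([
    [0.10, 0.05, 0.10, 0.05],
    [0.05, 0.15, 0.05, 0.10],
    [0.10, 0.05, 0.15, 0.05],
])
assert np.isclose(P.sum(), 1.0)

Py = P.sum(axis=0)        # marginal of Y
post = P / Py             # posterior pi(x | y); each column sums to 1

# Minimum probability of error: the MAP rule guesses argmax_x pi(x|y),
# so eps = 1 - E[max_x pi(x|Y)].
eps = 1.0 - (Py * post.max(axis=0)).sum()

def lower_bound(alpha):
    """(1 - alpha) * P[pi(X|Y) <= alpha], the abstract's bound."""
    mask = post <= alpha  # (x, y) pairs where the true posterior is small
    return (1.0 - alpha) * P[mask].sum()

for alpha in (0.3, 0.5, 0.7, 0.9):
    assert lower_bound(alpha) <= eps + 1e-12
```

For this joint pmf the MAP error probability is ∊ = 0.5, and every tested α yields a bound at or below that value, as the inequality requires.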

| Original language | English (US) |
|---|---|
| Pages (from-to) | 1992-1994 |
| Number of pages | 3 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 41 |
| Issue number | 6 |
| DOIs | 10.1109/18.476322 |
| State | Published - Nov 1995 |

### All Science Journal Classification (ASJC) codes

- Information Systems
- Computer Science Applications
- Library and Information Sciences

### Keywords

- Converse Channel Coding Theorem
- Hypothesis testing
- Shannon theory
- probability of error


## Cite this

Poor, H. V., & Verdú, S. (1995). A Lower Bound on the Probability of Error in Multihypothesis Testing. *IEEE Transactions on Information Theory*, *41*(6), 1992-1994. https://doi.org/10.1109/18.476322