### Abstract

Over binary-input channels, the uniform input distribution is a universal prior, in the sense that it maximizes the worst-case mutual information over all binary-input channels and achieves at least 94.2% of the capacity. In this paper, we address a similar question: we look for the best collection of finitely many a posteriori metrics that maximizes the worst-case mismatched mutual information achieved by decoding with these metrics (instead of an optimal decoder, such as the Maximum Likelihood (ML) decoder tuned to the true channel). It is shown that for binary-input, binary-output channels, two metrics suffice to achieve the same performance as an optimal decoder. In particular, this implies that there exists a decoder which is generalized linear and achieves at least 94.2% of the compound capacity on any compound set, without knowledge of the underlying set.
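The 94.2% figure in the abstract can be checked numerically on a concrete channel. The sketch below (an illustration, not code from the paper) computes the mutual information of a Z-channel under the uniform input prior, finds the capacity by a grid search over input priors, and verifies that the uniform prior attains more than 94.2% of capacity; the function names and the choice of crossover probability `eps` are the author's own for this example.

```python
import math

def mutual_information(p, eps):
    """I(X;Y) for a Z-channel: input 0 is received noiselessly;
    input 1 flips to 0 with probability eps.  p = P(X=1)."""
    def h2(q):  # binary entropy in bits
        return 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)
    py1 = p * (1 - eps)            # P(Y=1)
    return h2(py1) - p * h2(eps)   # I(X;Y) = H(Y) - H(Y|X)

eps = 0.5
# Capacity via a fine grid search over the input prior P(X=1).
capacity = max(mutual_information(k / 10000, eps) for k in range(10001))
uniform = mutual_information(0.5, eps)
ratio = uniform / capacity
print(f"uniform/capacity = {ratio:.4f}")  # above the 0.942 universal bound
```

For this Z-channel the capacity-achieving prior puts probability 0.4 on input 1, yet the uniform prior still recovers about 97% of capacity, consistent with the worst-case guarantee quoted in the abstract.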

| Original language | English (US) |
| --- | --- |
| Title of host publication | 2010 IEEE Information Theory Workshop, ITW 2010 - Proceedings |
| DOIs | https://doi.org/10.1109/CIG.2010.5592854 |
| State | Published - 2010 |
| Event | 2010 IEEE Information Theory Workshop, ITW 2010 - Dublin, Ireland. Duration: Aug 30 2010 → Sep 3 2010 |

### Publication series

| Name | 2010 IEEE Information Theory Workshop, ITW 2010 - Proceedings |
| --- | --- |

### Other

| Other | 2010 IEEE Information Theory Workshop, ITW 2010 |
| --- | --- |
| Country | Ireland |
| City | Dublin |
| Period | 8/30/10 → 9/3/10 |

### All Science Journal Classification (ASJC) codes

- Information Systems
- Applied Mathematics

## Fingerprint

Research topics of 'Universal a posteriori metrics game'.

## Cite this

Universal a posteriori metrics game. In *2010 IEEE Information Theory Workshop, ITW 2010 - Proceedings* [5592854]. https://doi.org/10.1109/CIG.2010.5592854