## Abstract

Suppose we are given a random source and want to use it as a random number generator; at what rate can we generate fair bits from it? We address this question in an information-theoretic setting by allowing for some arbitrarily small but nonzero deviation from “ideal” random bits. We prove our results with three different measures of approximation between the ideal and the obtained probability distributions: the variational distance, the d-bar distance, and the normalized divergence. Two different contexts are studied: fixed-length and variable-length random number generation. The fixed-length results of this paper provide an operational characterization of the inf-entropy rate of a source, defined in Han and Verdú [1], and the variable-length results characterize the liminf of the entropy rate, thereby establishing a pleasing duality with the fundamental limits of source coding. A feature of our results is that we do not restrict ourselves to ergodic or to stationary sources.
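As a concrete (and much simpler) illustration of the problem the abstract poses, consider von Neumann's classic trick for extracting fair bits from an i.i.d. biased coin: read the source in pairs, output 0 for the pair 01, output 1 for 10, and discard 00 and 11. This example is not from the paper itself; it is a minimal sketch, assuming an i.i.d. Bernoulli source, of what "generating fair bits from a random source" means operationally. (Its rate, p(1-p) output bits per input bit, falls short of the entropy limits that the paper characterizes.)

```python
def von_neumann_extract(bits):
    """Extract unbiased bits from an i.i.d. biased bit sequence.

    Pairs are mapped 01 -> 0 and 10 -> 1; the pairs 00 and 11 are
    discarded. If the input bits are i.i.d. with any fixed bias,
    the output bits are i.i.d. fair coin flips.
    """
    out = []
    for a, b in zip(bits[0::2], bits[1::2]):
        if a != b:  # keep only the unequal pairs, which are equiprobable
            out.append(a)
    return out

# Example: pairs (0,1), (1,0), (0,0), (1,1), (1,0)
# yield 0, 1, (skip), (skip), 1.
print(von_neumann_extract([0, 1, 1, 0, 0, 0, 1, 1, 1, 0]))  # → [0, 1, 1]
```

The discarded pairs are what keep the extraction rate below the source entropy; the paper's results concern the best achievable rates when such losses are driven to zero asymptotically.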

| Original language | English (US) |
|---|---|
| Pages (from-to) | 1322-1332 |
| Number of pages | 11 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 41 |
| Issue number | 5 |
| DOIs | |
| State | Published - Sep 1995 |

## All Science Journal Classification (ASJC) codes

- Information Systems
- Computer Science Applications
- Library and Information Sciences

## Keywords

- Shannon theory
- approximation theory
- entropy
- fixed-length source coding
- random number generation
- variable-length source coding