Abstract
We study the randomness necessary for the simulation of a random process with given distributions, in terms of the finite-precision resolvability of the process. Finite-precision resolvability is defined as the minimal random-bit rate required by the simulator as a function of the accuracy with which the distributions are replicated. The accuracy is quantified by means of various measures: variational distance, divergence, Ornstein, Prohorov and related measures of distance between the distributions of random processes. In the case of Ornstein, Prohorov and other distances of the Kantorovich-Vasershtein type, we show that the finite-precision resolvability is equal to the rate-distortion function with a fidelity criterion derived from the accuracy measure. This connection leads to new results on nonstationary rate-distortion theory. In the case of variational distance, the resolvability of stationary ergodic processes is shown to equal entropy rate regardless of the allowed accuracy. In the case of normalized divergence, explicit expressions for finite-precision resolvability are obtained in many cases of interest; and connections with data compression with minimum probability of block error are shown.
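As an illustrative aside (not part of the paper itself), two of the accuracy measures named above — variational distance and divergence — can be sketched for finite-alphabet distributions. The function names and the two-point example below are assumptions chosen for illustration; the comment about a k-bit simulator paraphrases the finite-precision setting under the standard convention that k fair bits yield probabilities that are multiples of 2^(-k).

```python
import numpy as np

def variational_distance(p, q):
    """Variational (total variation) distance between finite distributions:
    d(P, Q) = (1/2) * sum_x |P(x) - Q(x)|."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return 0.5 * float(np.abs(p - q).sum())

def divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) in bits,
    with the convention 0 * log(0/q) = 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# A simulator driven by k fair random bits can only realize distributions
# whose probabilities are multiples of 2^(-k); the measures above quantify
# how closely such a distribution replicates a target.
p = [0.3, 0.7]    # target distribution (illustrative)
q = [0.25, 0.75]  # closest distribution realizable with 2 random bits
print(variational_distance(p, q))  # 0.05
print(divergence(p, q))
```

The example shows the basic trade-off the abstract formalizes: finer replication of the target (smaller distance or divergence) generally requires more random bits per sample.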
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 63-86 |
| Number of pages | 24 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 42 |
| Issue number | 1 |
| State | Published - 1996 |
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences
Keywords
- Data compression
- Divergence
- Ornstein distance
- Prohorov distance
- Rate-distortion theory
- Resolvability
- Shannon theory
- Simulation complexity
- Variational distance