### Abstract

Let X^n be a sequence drawn from a discrete memoryless source, and let Y^n be the corresponding reconstruction sequence output by a good rate-distortion code. This paper establishes a property of the joint distribution of (X^n, Y^n). It is shown that for D > 0, the input-output statistics of an R(D)-achieving rate-distortion code converge (in normalized relative entropy) to the output-input statistics of a discrete memoryless channel (DMC). The DMC is 'backward' in that it is a channel from the reconstruction alphabet to the source alphabet. It is also shown that the property does not necessarily hold when normalized relative entropy is replaced by variational distance.
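As a concrete illustration of the backward-channel view (not taken from the paper itself): for a Bernoulli(p) source under Hamming distortion, the classical R(D)-achieving test channel is exactly a backward BSC — the source X is the reconstruction Y passed through a binary symmetric channel with crossover probability D. The sketch below, a hedged example with illustrative helper names, builds that joint distribution and checks that it recovers the source marginal and hits distortion D.

```python
import math

def h(q):
    """Binary entropy in bits."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def rate_distortion_bernoulli(p, D):
    """R(D) = h(p) - h(D) for a Bernoulli(p) source with Hamming
    distortion, valid for 0 <= D < min(p, 1 - p)."""
    return h(p) - h(D)

def backward_bsc_joint(p, D):
    """Joint pmf P(x, y) induced by the R(D)-achieving test channel:
    X = Y xor Z with Z ~ Bernoulli(D) independent of Y.
    The reconstruction marginal q = P(Y=1) is chosen so that
    P(X=1) = p:  p = q(1-D) + (1-q)D  =>  q = (p - D) / (1 - 2D)."""
    q = (p - D) / (1 - 2 * D)
    joint = {}
    for y, py in ((0, 1 - q), (1, q)):
        for x in (0, 1):
            # Backward channel P(x | y) is a BSC with crossover D.
            flip = D if x != y else 1 - D
            joint[(x, y)] = py * flip
    return joint

p, D = 0.4, 0.1
joint = backward_bsc_joint(p, D)
px1 = joint[(1, 0)] + joint[(1, 1)]       # marginal P(X=1), should equal p
avg_dist = joint[(0, 1)] + joint[(1, 0)]  # E[d(X, Y)], should equal D
rate = rate_distortion_bernoulli(p, D)
```

The paper's result says that a good n-letter code's empirical input-output statistics approach the n-fold product of such a backward DMC in normalized relative entropy; this single-letter computation is just the limiting object for the binary-Hamming case.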

Original language | English (US)
---|---
Title of host publication | 2013 IEEE Information Theory Workshop, ITW 2013
DOIs | 10.1109/ITW.2013.6691321
State | Published - Dec 1 2013
Event | 2013 IEEE Information Theory Workshop, ITW 2013 - Seville, Spain. Duration: Sep 9 2013 → Sep 13 2013

### Publication series

Name | 2013 IEEE Information Theory Workshop, ITW 2013
---|---

### Other

Other | 2013 IEEE Information Theory Workshop, ITW 2013
---|---
Country | Spain
City | Seville
Period | 9/9/13 → 9/13/13

### All Science Journal Classification (ASJC) codes

- Information Systems

## Cite this

*A connection between good rate-distortion codes and backward DMCs*. In *2013 IEEE Information Theory Workshop, ITW 2013* [6691321]. https://doi.org/10.1109/ITW.2013.6691321