Abstract
This paper studies the fundamental limits of the minimum average length of lossless and lossy variable-length compression when a nonzero error probability ε is allowed. For lossless compression, we give nonasymptotic bounds on the minimum average length in terms of Erokhin's rate-distortion function, and we use those bounds to obtain a Gaussian approximation of the speed of approach to the limit, which is quite accurate for all but small blocklengths: (1 − ε)kH(S) − (kV(S)/(2π))^{1/2} exp[−(Q^{-1}(ε))²/2], where Q^{-1}(·) is the functional inverse of the standard Gaussian complementary cumulative distribution function and V(S) is the source dispersion. A nonzero error probability thus not only reduces the asymptotically achievable rate by a factor of 1 − ε, but this asymptotic limit is approached from below, i.e., larger source dispersions and shorter blocklengths are beneficial. Variable-length lossy compression under an excess-distortion constraint is shown to exhibit similar properties.
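The Gaussian approximation above can be evaluated numerically. The sketch below (not from the paper; the Bernoulli parameters and function names are illustrative choices) computes (1 − ε)kH(S) − (kV(S)/(2π))^{1/2} exp[−(Q^{-1}(ε))²/2] for a memoryless Bernoulli(p) source, for which H(S) is the binary entropy and V(S) = Var[−log₂ P(S)] is the varentropy:

```python
from math import log2, sqrt, pi, exp
from statistics import NormalDist

def bernoulli_source(p):
    """Entropy H(S) (bits) and dispersion V(S) of a Bernoulli(p) source."""
    h = -p * log2(p) - (1 - p) * log2(1 - p)
    # varentropy: Var[-log2 P(S)] = p(1-p) * (log2((1-p)/p))^2
    v = p * (1 - p) * log2((1 - p) / p) ** 2
    return h, v

def gaussian_approx_length(k, eps, H, V):
    """Gaussian approximation to the minimum average compressed length (bits):
    (1 - eps)*k*H - sqrt(k*V/(2*pi)) * exp(-(Q^{-1}(eps))^2 / 2)."""
    q_inv = NormalDist().inv_cdf(1 - eps)  # Q^{-1}(eps), via the Gaussian quantile
    return (1 - eps) * k * H - sqrt(k * V / (2 * pi)) * exp(-q_inv ** 2 / 2)

H, V = bernoulli_source(0.11)
approx = gaussian_approx_length(1000, 0.05, H, V)
print(f"H = {H:.4f} bits, V = {V:.4f}, approx. length = {approx:.1f} bits")
```

Note that the approximation lies below (1 − ε)kH(S), consistent with the limit being approached from below.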
Original language | English (US)
---|---
Article number | 7115096
Pages (from-to) | 4316-4330
Number of pages | 15
Journal | IEEE Transactions on Information Theory
Volume | 61
Issue number | 8
DOIs |
State | Published - Aug 2015
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences
Keywords
- Shannon theory
- Variable-length compression
- dispersion
- finite-blocklength regime
- lossless compression
- lossy compression
- rate-distortion theory
- single-shot