TY - GEN

T1 - Cascade multiterminal source coding

AU - Cuff, Paul

AU - Su, Han I.

AU - El Gamal, Abbas

N1 - Copyright:
Copyright 2013 Elsevier B.V., All rights reserved.

PY - 2009

Y1 - 2009

N2 - We investigate distributed source coding of two correlated sources X and Y where messages are passed to a decoder in a cascade fashion. The encoder of X sends a message at rate R1 to the encoder of Y. The encoder of Y then sends a message to the decoder at rate R2 based both on Y and on the message it received about X. The decoder's task is to estimate a function of X and Y. For example, we consider the minimum mean squared-error distortion when encoding the sum of jointly Gaussian random variables under these constraints. We also characterize the rates needed to reconstruct a function of X and Y losslessly. Our general contribution toward understanding the limits of the cascade multiterminal source coding network is in the form of inner and outer bounds on the achievable rate region for satisfying a distortion constraint for an arbitrary distortion function d(x, y, z). The inner bound makes use of a balance between two encoding tactics: relaying the information about X and recompressing the information about X jointly with Y. In the Gaussian case, a threshold is discovered for identifying which of the two extreme strategies optimizes the inner bound. Relaying outperforms recompressing the sum at the relay for some rate pairs if the variance of X is greater than the variance of Y.

AB - We investigate distributed source coding of two correlated sources X and Y where messages are passed to a decoder in a cascade fashion. The encoder of X sends a message at rate R1 to the encoder of Y. The encoder of Y then sends a message to the decoder at rate R2 based both on Y and on the message it received about X. The decoder's task is to estimate a function of X and Y. For example, we consider the minimum mean squared-error distortion when encoding the sum of jointly Gaussian random variables under these constraints. We also characterize the rates needed to reconstruct a function of X and Y losslessly. Our general contribution toward understanding the limits of the cascade multiterminal source coding network is in the form of inner and outer bounds on the achievable rate region for satisfying a distortion constraint for an arbitrary distortion function d(x, y, z). The inner bound makes use of a balance between two encoding tactics: relaying the information about X and recompressing the information about X jointly with Y. In the Gaussian case, a threshold is discovered for identifying which of the two extreme strategies optimizes the inner bound. Relaying outperforms recompressing the sum at the relay for some rate pairs if the variance of X is greater than the variance of Y.

UR - http://www.scopus.com/inward/record.url?scp=70449516074&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=70449516074&partnerID=8YFLogxK

U2 - 10.1109/ISIT.2009.5205989

DO - 10.1109/ISIT.2009.5205989

M3 - Conference contribution

AN - SCOPUS:70449516074

SN - 9781424443130

T3 - IEEE International Symposium on Information Theory - Proceedings

SP - 1199

EP - 1203

BT - 2009 IEEE International Symposium on Information Theory, ISIT 2009

T2 - 2009 IEEE International Symposium on Information Theory, ISIT 2009

Y2 - 28 June 2009 through 3 July 2009

ER -