TY - GEN

T1 - How to compress interactive communication

AU - Barak, Boaz

AU - Braverman, Mark

AU - Chen, Xi

AU - Rao, Anup

PY - 2010

Y1 - 2010

N2 - We describe new ways to simulate 2-party communication protocols to get protocols with potentially smaller communication. We show that every communication protocol that communicates C bits and reveals I bits of information about the inputs to the participating parties can be simulated by a new protocol involving at most Õ(√CI) bits of communication. If the protocol reveals I bits of information about the inputs to an observer that watches the communication in the protocol, we show how to carry out the simulation with Õ(I) bits of communication. These results lead to a direct sum theorem for randomized communication complexity. Ignoring polylogarithmic factors, we show that for worst case computation, computing n copies of a function requires √n times the communication required for computing one copy of the function. For average case complexity, given any distribution μ on inputs, computing n copies of the function on n inputs sampled independently according to μ requires √n times the communication for computing one copy. If μ is a product distribution, computing n copies on n independent inputs sampled according to μ requires n times the communication required for computing the function. We also study the complexity of computing the sum (or parity) of n evaluations of f, and obtain results analogous to those above.

KW - communication complexity

KW - compression

KW - direct sum

KW - information theory

UR - http://www.scopus.com/inward/record.url?scp=77954724787&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=77954724787&partnerID=8YFLogxK

U2 - 10.1145/1806689.1806701

M3 - Conference contribution

AN - SCOPUS:77954724787

SN - 9781605588179

T3 - Proceedings of the Annual ACM Symposium on Theory of Computing

SP - 67

EP - 76

BT - STOC'10 - Proceedings of the 2010 ACM International Symposium on Theory of Computing

T2 - 42nd ACM Symposium on Theory of Computing, STOC 2010

Y2 - 5 June 2010 through 8 June 2010

ER -