## Abstract

This paper provides the first general technique for proving information lower bounds on two-party unbounded-rounds communication problems. We show that the discrepancy lower bound, which applies to randomized communication complexity, also applies to information complexity. More precisely, if the discrepancy of a two-party function f with respect to a distribution μ is Disc_μ f, then any two-party randomized protocol computing f must reveal at least Ω(log(1/Disc_μ f)) bits of information to the participants. As a corollary, we obtain that any two-party protocol for computing a random function on {0,1}^n × {0,1}^n must reveal Ω(n) bits of information to the participants. In addition, we prove that the discrepancy of the Greater-Than function is Ω(1/n), which provides an alternative proof to the recent proof of Viola (Proceedings of the twenty-fourth annual ACM-SIAM symposium on discrete algorithms, SODA 2013, New Orleans, LA, USA, 6–8 Jan 2013, pp 632–651, 2013) of the Ω(log n) lower bound on the communication complexity of this well-studied function and, combined with our main result, proves the tight Ω(log n) lower bound on its information complexity. The proof of our main result develops a new simulation procedure that may be of independent interest. In a follow-up breakthrough work of Kerenidis et al. (53rd annual IEEE symposium on foundations of computer science, FOCS 2012, New Brunswick, NJ, USA, 20–23 Oct 2012, pp 500–509, 2012), our simulation procedure served as a building block towards a proof that almost all known lower bound techniques for communication complexity (and not just discrepancy) apply to information complexity as well.
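As an illustration of the quantity the abstract bounds (not of the paper's proof technique), the discrepancy of f with respect to μ is the maximum, over all combinatorial rectangles S × T, of the bias |Σ_{x∈S, y∈T} μ(x, y)·(−1)^{f(x,y)}|. The following sketch, with hypothetical helper names of my own, computes it by brute force for the uniform distribution on a small domain; a constant function has discrepancy 1, while a balanced function like XOR has much smaller discrepancy, and a small Disc_μ f is what yields a large Ω(log(1/Disc_μ f)) information bound.

```python
from itertools import chain, combinations

def all_subsets(universe):
    """All subsets of `universe`, including the empty set."""
    items = list(universe)
    return chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1)
    )

def discrepancy(f, n):
    """Brute-force Disc_mu(f) for the uniform distribution mu on [n] x [n]:
    the maximum over all combinatorial rectangles S x T of
    |sum_{x in S, y in T} mu(x, y) * (-1)^f(x, y)|."""
    mu = 1.0 / (n * n)
    best = 0.0
    for s in all_subsets(range(n)):
        for t in all_subsets(range(n)):
            bias = sum(mu * (1 if f(x, y) == 0 else -1)
                       for x in s for y in t)
            best = max(best, abs(bias))
    return best

# A constant function is maximally "rectangular": S = T = [n] has bias 1.
print(discrepancy(lambda x, y: 0, 4))            # 1.0

# XOR ("checkerboard") is balanced: each rectangle's bias factors into a
# row-sum times a column-sum, so no rectangle sees a large imbalance.
print(discrepancy(lambda x, y: (x + y) % 2, 4))  # 0.25
```

Enumerating all 2^n × 2^n rectangles is only feasible for tiny domains; the point is the definition, not an efficient algorithm.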

| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 846-864 |
| Number of pages | 19 |
| Journal | Algorithmica |
| Volume | 76 |
| Issue number | 3 |
| DOIs | |
| State | Published - Nov 1 2016 |

## All Science Journal Classification (ASJC) codes

- Computer Science (all)
- Computer Science Applications
- Applied Mathematics