Information complexity is computable

Mark Braverman, Jon Schneider

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Scopus citations

Abstract

The information complexity of a function f is the minimum amount of information Alice and Bob need to exchange to compute the function f. In this paper we provide an algorithm for approximating the information complexity of an arbitrary function f to within any additive error ϵ > 0, thus resolving an open question as to whether information complexity is computable. In the process, we give the first explicit upper bound on the rate at which the information complexity of f restricted to b-bit protocols converges to the (unrestricted) information complexity of f.
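In notation (a minimal sketch, assuming the standard internal information cost IC(π) of a protocol π and writing CC(π) for its communication cost; the explicit convergence rate is given in the paper, not here):

\[
  \mathrm{IC}(f) \;=\; \inf_{\pi \text{ computes } f} \mathrm{IC}(\pi),
  \qquad
  \mathrm{IC}_b(f) \;=\; \inf_{\pi \text{ computes } f,\ \mathrm{CC}(\pi) \le b} \mathrm{IC}(\pi),
  \qquad
  \mathrm{IC}_b(f) \;\downarrow\; \mathrm{IC}(f) \ \text{ as } b \to \infty.
\]

Roughly speaking, given ϵ > 0 the algorithm uses the explicit convergence bound to choose b large enough that IC_b(f) is within ϵ of IC(f), and then approximates IC_b(f) by optimizing over a sufficiently fine discretization of b-bit protocols.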

Original language: English (US)
Title of host publication: 43rd International Colloquium on Automata, Languages, and Programming, ICALP 2016
Editors: Yuval Rabani, Ioannis Chatzigiannakis, Davide Sangiorgi, Michael Mitzenmacher
Publisher: Schloss Dagstuhl - Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing
ISBN (Electronic): 9783959770132
DOIs
State: Published - Aug 1 2016
Externally published: Yes
Event: 43rd International Colloquium on Automata, Languages, and Programming, ICALP 2016 - Rome, Italy
Duration: Jul 12 2016 - Jul 15 2016

Publication series

Name: Leibniz International Proceedings in Informatics, LIPIcs
Volume: 55
ISSN (Print): 1868-8969

Other

Other: 43rd International Colloquium on Automata, Languages, and Programming, ICALP 2016
Country/Territory: Italy
City: Rome
Period: 7/12/16 - 7/15/16

All Science Journal Classification (ASJC) codes

  • Software

Keywords

  • Communication complexity
  • Convergence rate
  • Information complexity
