TY - GEN

T1 - An information complexity approach to extended formulations

AU - Braverman, Mark

AU - Moitra, Ankur

PY - 2013

Y1 - 2013

N2 - We prove an unconditional lower bound that any linear program that achieves an O(n^{1-ε}) approximation for clique has size 2^{Ω(n^ε)}. There has been considerable recent interest in proving unconditional lower bounds against any linear program. Fiorini et al. [13] proved that there is no polynomial sized linear program for traveling salesman. Braun et al. [7] proved that there is no polynomial sized O(n^{1/2-ε})-approximate linear program for clique. Here we prove an optimal and unconditional lower bound against linear programs for clique that matches Håstad's [15] celebrated hardness result. Interestingly, the techniques used to prove such lower bounds have closely followed the progression of techniques used in communication complexity. Here we develop an information theoretic framework to approach these questions, and we use it to prove our main result. Also we resolve a related question: How many bits of communication are needed to get ε-advantage over random guessing for disjointness? Kalyanasundaram and Schnitger [18] proved that a protocol that gets constant advantage requires Ω(n) bits of communication. This result in conjunction with amplification implies that any protocol that gets ε-advantage requires Ω(ε²n) bits of communication. Here we improve this bound to Ω(εn), which is optimal for any ε > 0.

AB - We prove an unconditional lower bound that any linear program that achieves an O(n^{1-ε}) approximation for clique has size 2^{Ω(n^ε)}. There has been considerable recent interest in proving unconditional lower bounds against any linear program. Fiorini et al. [13] proved that there is no polynomial sized linear program for traveling salesman. Braun et al. [7] proved that there is no polynomial sized O(n^{1/2-ε})-approximate linear program for clique. Here we prove an optimal and unconditional lower bound against linear programs for clique that matches Håstad's [15] celebrated hardness result. Interestingly, the techniques used to prove such lower bounds have closely followed the progression of techniques used in communication complexity. Here we develop an information theoretic framework to approach these questions, and we use it to prove our main result. Also we resolve a related question: How many bits of communication are needed to get ε-advantage over random guessing for disjointness? Kalyanasundaram and Schnitger [18] proved that a protocol that gets constant advantage requires Ω(n) bits of communication. This result in conjunction with amplification implies that any protocol that gets ε-advantage requires Ω(ε²n) bits of communication. Here we improve this bound to Ω(εn), which is optimal for any ε > 0.

KW - Communication complexity

KW - Extended formulations

KW - Nonnegative rank

UR - http://www.scopus.com/inward/record.url?scp=84879809023&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84879809023&partnerID=8YFLogxK

U2 - 10.1145/2488608.2488629

DO - 10.1145/2488608.2488629

M3 - Conference contribution

AN - SCOPUS:84879809023

SN - 9781450320290

T3 - Proceedings of the Annual ACM Symposium on Theory of Computing

SP - 161

EP - 170

BT - STOC 2013 - Proceedings of the 2013 ACM Symposium on Theory of Computing

T2 - 45th Annual ACM Symposium on Theory of Computing, STOC 2013

Y2 - 1 June 2013 through 4 June 2013

ER -