### Abstract

We consider an online vector balancing question where T vectors, chosen from an arbitrary distribution over [-1,1]^{n}, arrive one-by-one and must be immediately given a ± sign. The goal is to keep the discrepancy - the ℓ_{∞}-norm of any signed prefix-sum - as small as possible. A concrete example of this question is the online interval discrepancy problem, where T points are sampled one-by-one uniformly in the unit interval [0,1], and the goal is to immediately color them ± such that every sub-interval always remains nearly balanced. As random coloring incurs Ω(T^{1/2}) discrepancy, while the worst-case offline bounds are Θ(√(n log(T/n))) for vector balancing and 1 for interval balancing, a natural question is whether one can (nearly) match the offline bounds in the online setting for these problems. One must utilize the stochasticity, as in the worst-case scenario it is known that the discrepancy is Ω(T^{1/2}) for any online algorithm. In a special case of online vector balancing, Bansal and Spencer [BS19] recently showed an O(√n log T) bound when each coordinate is chosen independently. When there are dependencies among the coordinates, as in the interval discrepancy problem, the problem becomes much more challenging, as evidenced by a recent work of Jiang, Kulkarni, and Singla [JKS19] that gives a non-trivial O(T^{1/log log T}) bound for online interval discrepancy. Although this beats random coloring, it is still far from the offline bound. In this work, we introduce a new framework that allows us to handle online vector balancing even when the input distribution has dependencies across coordinates. In particular, this lets us obtain a poly(n, log T) bound for online vector balancing under arbitrary input distributions, and a polylog(T) bound for online interval discrepancy. Our framework is powerful enough to capture other well-studied geometric discrepancy problems; e.g., we obtain a poly(log^{d}(T)) bound for the online d-dimensional Tusnády's problem.
All our bounds are tight up to polynomial factors. A key new technical ingredient in our work is an anti-concentration inequality for sums of pairwise uncorrelated random variables, which might also be of independent interest.
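To make the setup concrete, the following small Python simulation (an illustration only, not the paper's algorithm) contrasts random coloring with a naive greedy heuristic that signs each arriving vector so as to minimize the ℓ_{∞}-norm of the current prefix-sum; the function name `online_signs` and the greedy rule are hypothetical choices for this sketch.

```python
import random

def online_signs(vectors, greedy=True, seed=0):
    """Assign ± signs online and return the maximum discrepancy seen,
    i.e. the largest ℓ∞-norm attained by any signed prefix-sum.

    greedy=True uses a naive heuristic (pick the sign that keeps the
    new prefix-sum's ℓ∞-norm smaller); greedy=False colors randomly.
    This is a toy illustration, not the algorithm from the paper.
    """
    rng = random.Random(seed)
    n = len(vectors[0])
    prefix = [0.0] * n
    max_disc = 0.0
    for v in vectors:
        if greedy:
            norm_plus = max(abs(p + x) for p, x in zip(prefix, v))
            norm_minus = max(abs(p - x) for p, x in zip(prefix, v))
            sign = 1 if norm_plus <= norm_minus else -1
        else:
            sign = rng.choice([1, -1])
        prefix = [p + sign * x for p, x in zip(prefix, v)]
        max_disc = max(max_disc, max(abs(p) for p in prefix))
    return max_disc

# T vectors drawn i.i.d. uniformly from [-1,1]^n.
rng = random.Random(42)
T, n = 2000, 4
vecs = [[rng.uniform(-1, 1) for _ in range(n)] for _ in range(T)]
greedy_disc = online_signs(vecs, greedy=True)
random_disc = online_signs(vecs, greedy=False)
```

On such inputs the random coloring's discrepancy grows on the order of √T, while even this simple greedy rule keeps the prefix-sums far more balanced, matching the gap between random coloring and the bounds discussed above.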

Original language | English (US)
---|---
Title of host publication | STOC 2020 - Proceedings of the 52nd Annual ACM SIGACT Symposium on Theory of Computing
Editors | Konstantin Makarychev, Yury Makarychev, Madhur Tulsiani, Gautam Kamath, Julia Chuzhoy
Publisher | Association for Computing Machinery
Pages | 1139-1152
Number of pages | 14
ISBN (Electronic) | 9781450369794
DOIs | https://doi.org/10.1145/3357713.3384280
State | Published - Jun 8 2020
Event | 52nd Annual ACM SIGACT Symposium on Theory of Computing, STOC 2020 - Chicago, United States. Duration: Jun 22 2020 → Jun 26 2020

### Publication series

Name | Proceedings of the Annual ACM Symposium on Theory of Computing
---|---
ISSN (Print) | 0737-8017

### Conference

Conference | 52nd Annual ACM SIGACT Symposium on Theory of Computing, STOC 2020
---|---
Country | United States
City | Chicago
Period | 6/22/20 → 6/26/20

### All Science Journal Classification (ASJC) codes

- Software

### Keywords

- Anti-concentration
- Envy minimization
- Geometric discrepancy
- Online vector balancing

## Fingerprint

Dive into the research topics of 'Online vector balancing and geometric discrepancy'.

## Cite this

Online vector balancing and geometric discrepancy. In *STOC 2020 - Proceedings of the 52nd Annual ACM SIGACT Symposium on Theory of Computing* (pp. 1139-1152). (Proceedings of the Annual ACM Symposium on Theory of Computing). Association for Computing Machinery. https://doi.org/10.1145/3357713.3384280