Irregularities of distribution, derandomization, and complexity theory

Research output: Chapter in Book/Report/Conference proceeding · Conference contribution

Abstract

In 1935, van der Corput asked the following question: Given an infinite sequence of reals in [0, 1], define D(n) = sup_{0 ≤ x ≤ 1} | |Sn ∩ [0, x]| − nx |, where Sn consists of the first n elements of the sequence. Is it possible for D(n) to stay in O(1)? Many years later, Schmidt proved that D(n) can never be in o(log n). In other words, there are limitations on how well the discrete distribution, x → |Sn ∩ [0, x]|, can simulate the continuous one, x → nx. The study of this intriguing phenomenon and its numerous variants related to the irregularities of distributions has given rise to discrepancy theory. The relevance of the subject to complexity theory is most evident in the study of probabilistic algorithms. Suppose that we feed a probabilistic algorithm not with a perfectly random sequence of bits (as is usually required) but one that is only pseudorandom or even deterministic. Should performance necessarily suffer? In particular, suppose that one could trade an exponential-size probability space for one of polynomial size without letting the algorithm realize the change. This form of derandomization can be expressed by saying that a very large distribution can be simulated by a small one for the purpose of the algorithm. Put differently, there exists a measure with respect to which the two distributions have low discrepancy. The study of discrepancy theory predates complexity theory, and a wealth of mathematical techniques can be brought to bear to prove nontrivial derandomization results. The pipeline of ideas that flows from discrepancy theory to complexity theory constitutes the discrepancy method. We give a few examples in this survey. A more thorough treatment is given in our book [15]. We also briefly discuss the relevance of the discrepancy method to complexity lower bounds.
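
To make the quantity D(n) concrete, the following is a small illustrative sketch (ours, not from the paper): a direct computation of D(n) for a finite prefix, using the standard closed-form for the supremum of the step function x → |Sn ∩ [0, x]| − nx over sorted points, applied to the base-2 van der Corput sequence, whose discrepancy is known to grow like O(log n), matching Schmidt's Omega(log n) lower bound up to a constant. The helper names discrepancy and van_der_corput are illustrative.

    def discrepancy(points):
        """D(n) = sup_{0 <= x <= 1} | |S_n intersect [0, x]| - n*x | for S_n = points."""
        s = sorted(points)
        n = len(s)
        d = 0.0
        for i, x in enumerate(s, start=1):
            # Just at or after the i-th sorted point the count is i; just before it, i - 1.
            # The supremum of the piecewise-linear gap is attained at one of these corners.
            d = max(d, abs(i - n * x), abs((i - 1) - n * x))
        return d

    def van_der_corput(n, base=2):
        """First n terms of the van der Corput sequence (radical inverse in the given base)."""
        seq = []
        for k in range(1, n + 1):
            q, denom, v = k, 1, 0.0
            while q:
                q, r = divmod(q, base)
                denom *= base
                v += r / denom
            seq.append(v)
        return seq

    if __name__ == "__main__":
        # D(n) for the van der Corput sequence grows roughly like log n,
        # consistent with Schmidt's lower bound being tight for this sequence.
        for n in (16, 256, 4096, 65536):
            print(n, round(discrepancy(van_der_corput(n)), 3))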

Original language: English (US)
Title of host publication: FST TCS 2000
Subtitle of host publication: Foundations of Software Technology and Theoretical Computer Science - 20th Conference, Proceedings
Editors: Sanjiv Kapoor, Sanjiva Prasad
Publisher: Springer Verlag
Pages: 46-54
Number of pages: 9
ISBN (Print): 3540414134, 9783540414131
DOIs
State: Published - 2000
Event: 20th Conference on Foundations of Software Technology and Theoretical Computer Science, FST TCS 2000 - New Delhi, India
Duration: Dec 13, 2000 - Dec 15, 2000

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 1974
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 20th Conference on Foundations of Software Technology and Theoretical Computer Science, FST TCS 2000
Country/Territory: India
City: New Delhi
Period: 12/13/00 - 12/15/00

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • General Computer Science
