TY - GEN
T1 - Irregularities of distribution, derandomization, and complexity theory
AU - Chazelle, Bernard
N1 - Publisher Copyright:
© Springer-Verlag Berlin Heidelberg 2000.
PY - 2000
Y1 - 2000
N2 - In 1935, van der Corput asked the following question: Given an infinite sequence of reals in [0, 1], define D(n) = sup_{0 ≤ x ≤ 1} | |Sn ∩ [0, x]| − nx |, where Sn consists of the first n elements in the sequence. Is it possible for D(n) to stay in O(1)? Many years later, Schmidt proved that D(n) can never be in o(log n). In other words, there are limitations on how well the discrete distribution, x → |Sn ∩ [0, x]|, can simulate the continuous one, x → nx. The study of this intriguing phenomenon and its numerous variants related to irregularities of distribution has given rise to discrepancy theory. The relevance of the subject to complexity theory is most evident in the study of probabilistic algorithms. Suppose that we feed a probabilistic algorithm not with a perfectly random sequence of bits (as is usually required) but with one that is only pseudorandom or even deterministic. Should performance necessarily suffer? In particular, suppose that one could trade an exponential-size probability space for one of polynomial size without letting the algorithm realize the change. This form of derandomization can be expressed by saying that a very large distribution can be simulated by a small one for the purposes of the algorithm. Put differently, there exists a measure with respect to which the two distributions have low discrepancy. The study of discrepancy theory predates complexity theory, and a wealth of mathematical techniques can be brought to bear to prove nontrivial derandomization results. The pipeline of ideas that flows from discrepancy theory to complexity theory constitutes the discrepancy method. We give a few examples in this survey. A more thorough treatment is given in our book [15]. We also briefly discuss the relevance of the discrepancy method to complexity lower bounds.
AB - In 1935, van der Corput asked the following question: Given an infinite sequence of reals in [0, 1], define D(n) = sup_{0 ≤ x ≤ 1} | |Sn ∩ [0, x]| − nx |, where Sn consists of the first n elements in the sequence. Is it possible for D(n) to stay in O(1)? Many years later, Schmidt proved that D(n) can never be in o(log n). In other words, there are limitations on how well the discrete distribution, x → |Sn ∩ [0, x]|, can simulate the continuous one, x → nx. The study of this intriguing phenomenon and its numerous variants related to irregularities of distribution has given rise to discrepancy theory. The relevance of the subject to complexity theory is most evident in the study of probabilistic algorithms. Suppose that we feed a probabilistic algorithm not with a perfectly random sequence of bits (as is usually required) but with one that is only pseudorandom or even deterministic. Should performance necessarily suffer? In particular, suppose that one could trade an exponential-size probability space for one of polynomial size without letting the algorithm realize the change. This form of derandomization can be expressed by saying that a very large distribution can be simulated by a small one for the purposes of the algorithm. Put differently, there exists a measure with respect to which the two distributions have low discrepancy. The study of discrepancy theory predates complexity theory, and a wealth of mathematical techniques can be brought to bear to prove nontrivial derandomization results. The pipeline of ideas that flows from discrepancy theory to complexity theory constitutes the discrepancy method. We give a few examples in this survey. A more thorough treatment is given in our book [15]. We also briefly discuss the relevance of the discrepancy method to complexity lower bounds.
UR - http://www.scopus.com/inward/record.url?scp=84947724521&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84947724521&partnerID=8YFLogxK
U2 - 10.1007/3-540-44450-5_3
DO - 10.1007/3-540-44450-5_3
M3 - Conference contribution
AN - SCOPUS:84947724521
SN - 3540414134
SN - 9783540414131
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 46
EP - 54
BT - FST TCS 2000
A2 - Kapoor, Sanjiv
A2 - Prasad, Sanjiva
PB - Springer Verlag
T2 - 20th Conference on Foundations of Software Technology and Theoretical Computer Science, FST TCS 2000
Y2 - 13 December 2000 through 15 December 2000
ER -