Extractor-Based Time-Space Lower Bounds for Learning

Sumegha Garg, Ran Raz, Avishay Tal

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

29 Scopus citations


A matrix M : A × X → {−1, 1} corresponds to the following learning problem: an unknown element x ∈ X is chosen uniformly at random, and a learner tries to learn x from a stream of samples (a_1, b_1), (a_2, b_2), …, where for every i, a_i ∈ A is chosen uniformly at random and b_i = M(a_i, x). Assume that k, ℓ, r are such that any submatrix of M with at least 2^(−k)·|A| rows and at least 2^(−ℓ)·|X| columns has bias of at most 2^(−r). We show that any learning algorithm for the learning problem corresponding to M requires either a memory of size at least Ω(k · ℓ) or at least 2^(Ω(r)) samples. The result holds even if the learner has an exponentially small success probability (of 2^(−Ω(r))). In particular, this shows that for a large class of learning problems, any learning algorithm requires either a memory of size at least Ω((log |X|) · (log |A|)) or an exponential number of samples, achieving a tight Ω((log |X|) · (log |A|)) lower bound on the size of the memory, rather than the bound of Ω(min{(log |X|)^2, (log |A|)^2}) obtained in previous works by Raz [FOCS'17] and Moshkovitz and Moshkovitz [ITCS'18]. Moreover, our result implies all previous memory-samples lower bounds, as well as a number of new applications. Our proof builds on the work of Raz [FOCS'17], which gave a general technique for proving memory-samples lower bounds.
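A canonical instance of this framework is parity learning, where A = X = {0, 1}^n and M(a, x) = (−1)^(⟨a, x⟩ mod 2). The sketch below simulates the sample stream described in the abstract for that instance; it is an illustration of the learning model only (the function names are hypothetical), not of the paper's lower-bound proof.

```python
import random

def parity_matrix_entry(a, x):
    """M(a, x) = (-1)^(<a, x> mod 2) for bit-vectors a and x."""
    inner = sum(ai * xi for ai, xi in zip(a, x)) % 2
    return 1 if inner == 0 else -1

def sample_stream(x, num_samples, rng=random):
    """Yield samples (a_i, b_i), where each a_i is drawn uniformly
    from {0, 1}^n and b_i = M(a_i, x), as in the learning model."""
    n = len(x)
    for _ in range(num_samples):
        a = tuple(rng.randrange(2) for _ in range(n))
        yield a, parity_matrix_entry(a, x)

# An unbounded-memory learner could store n linearly independent
# samples and recover x by Gaussian elimination over GF(2); the
# theorem rules this out for memory-bounded learners with few samples.
```

For this matrix, every large submatrix has small bias (a standard extractor property of the inner-product function), so the theorem yields the known memory-samples tradeoff for parity learning.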

Original language: English (US)
Title of host publication: STOC 2018 - Proceedings of the 50th Annual ACM SIGACT Symposium on Theory of Computing
Editors: Monika Henzinger, David Kempe, Ilias Diakonikolas
Publisher: Association for Computing Machinery
Number of pages: 14
ISBN (Electronic): 9781450355599
State: Published - Jun 20 2018
Event: 50th Annual ACM Symposium on Theory of Computing, STOC 2018 - Los Angeles, United States
Duration: Jun 25 2018 - Jun 29 2018

Publication series

Name: Proceedings of the Annual ACM Symposium on Theory of Computing
ISSN (Print): 0737-8017


Other: 50th Annual ACM Symposium on Theory of Computing, STOC 2018
Country/Territory: United States
City: Los Angeles

All Science Journal Classification (ASJC) codes

  • Software


Keywords

  • Branching programs
  • Extractors
  • Learning from samples
  • Time-space tradeoff


