The Gradient Complexity of Linear Regression

Mark Braverman, Elad Hazan, Max Simchowitz, Blake Woodworth

Research output: Contribution to journal › Conference article › peer-review


Abstract

We investigate the computational complexity of several basic linear algebra primitives, including largest eigenvector computation and linear regression, in the computational model that allows access to the data via a matrix-vector product oracle. We show that for polynomial accuracy, Θ(d) calls to the oracle are necessary and sufficient even for a randomized algorithm. Our lower bound is based on a reduction to estimating the least eigenvalue of a random Wishart matrix. This simple distribution enables a concise proof, leveraging a few key properties of the random Wishart ensemble.
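To make the access model concrete, here is a minimal NumPy sketch (an illustration, not the paper's construction): the hidden data is a random Wishart sample W = GᵀG, and the algorithm may touch W only through a wrapper that counts matrix-vector product queries. The MatVecOracle class, the power-iteration routine, and all parameter choices below are illustrative assumptions; note also that power iteration estimates the largest eigenvalue, whereas the paper's lower-bound instance concerns the least eigenvalue of the Wishart ensemble.

```python
import numpy as np

class MatVecOracle:
    """Hypothetical wrapper: access to a hidden matrix only via
    matrix-vector products, with every query counted."""
    def __init__(self, M):
        self._M = M
        self.calls = 0

    def __call__(self, v):
        self.calls += 1
        return self._M @ v

rng = np.random.default_rng(0)
d = 50

# Random Wishart matrix W = G^T G, with G having i.i.d. N(0,1) entries.
G = rng.standard_normal((d, d))
W = G.T @ G

oracle = MatVecOracle(W)

# Power iteration: one oracle call per step, converging to the
# top eigenvector of the hidden matrix.
v = rng.standard_normal(d)
v /= np.linalg.norm(v)
for _ in range(200):
    w = oracle(v)
    v = w / np.linalg.norm(w)

# Rayleigh quotient (one extra oracle call) estimates lambda_max.
lam_max_est = v @ oracle(v)

print(f"oracle calls used: {oracle.calls}")
print(f"estimated lambda_max: {lam_max_est:.4f}")
print(f"true lambda_max:      {np.linalg.eigvalsh(W)[-1]:.4f}")
```

In this model the only cost charged to the algorithm is `oracle.calls`; the paper's Θ(d) result says that, for polynomial accuracy on such eigenvalue problems, on the order of d queries are both necessary and sufficient, even for randomized algorithms.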

Original language: English (US)
Pages (from-to): 627-647
Number of pages: 21
Journal: Proceedings of Machine Learning Research
Volume: 125
State: Published - 2020
Event: 33rd Conference on Learning Theory, COLT 2020 - Virtual, Online, Austria
Duration: Jul 9, 2020 - Jul 12, 2020

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

