Communication-efficient sparse regression

Jason D. Lee, Qiang Liu, Yuekai Sun, Jonathan E. Taylor

Research output: Contribution to journal › Article

25 Scopus citations

Abstract

We devise a communication-efficient approach to distributed sparse regression in the high-dimensional setting. The key idea is to average "debiased" or "desparsified" lasso estimators. We show that the approach converges at the same rate as the lasso as long as the dataset is not split across too many machines, and that it consistently estimates the support under weaker conditions than the lasso. On the computational side, we propose a new parallel and computationally efficient algorithm to compute the approximate inverse covariance required in the debiasing approach when the dataset is split across samples. We further extend the approach to generalized linear models.
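The core idea in the abstract can be sketched in a few lines: each machine fits a lasso on its local shard, applies a one-step debiasing correction using an approximate inverse covariance, and the center averages the corrected estimates. The sketch below is a minimal toy illustration, not the paper's algorithm: it uses a plain coordinate-descent lasso, and it substitutes the identity matrix for the paper's estimated approximate inverse covariance (which is exact only for the isotropic Gaussian design simulated here). All function names and parameter choices are illustrative assumptions.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent lasso for (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    r = y - X @ b
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]                      # remove coordinate j
            rho = X[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * b[j]                      # add back updated value
    return b

def debiased_lasso(X, y, lam, Theta):
    """One-step debiasing: b_hat + Theta @ X^T (y - X b_hat) / n."""
    n = X.shape[0]
    b = lasso_cd(X, y, lam)
    return b + Theta @ X.T @ (y - X @ b) / n

# Toy data: a 5-sparse signal, samples split across m machines.
rng = np.random.default_rng(0)
n, p, m = 1200, 50, 4
beta = np.zeros(p)
beta[:5] = 1.0
X = rng.standard_normal((n, p))
y = X @ beta + 0.5 * rng.standard_normal(n)

# Assumption: with an isotropic Gaussian design the population inverse
# covariance is the identity, so Theta = I stands in for the paper's
# parallel approximate-inverse-covariance estimate.
Theta = np.eye(p)
lam = 0.5 * np.sqrt(np.log(p) / (n // m))            # standard lasso scaling

# Each machine debiases its local lasso; the center averages the results.
chunks = np.array_split(np.arange(n), m)
avg = np.mean([debiased_lasso(X[idx], y[idx], lam, Theta) for idx in chunks],
              axis=0)

# Thresholding the averaged (non-sparse) estimate recovers the support.
support = np.flatnonzero(np.abs(avg) > 0.5)
```

Note that the averaged estimator is dense; the hard-thresholding step at the end is what restores sparsity and support recovery, mirroring the two-stage structure described in the abstract.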

Original language: English (US)
Pages (from-to): 1-30
Number of pages: 30
Journal: Journal of Machine Learning Research
Volume: 18
State: Published - Jan 1 2017
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence

Keywords

  • Averaging
  • Debiasing
  • Distributed sparse regression
  • High-dimensional statistics
  • Lasso


Cite this

Lee, J. D., Liu, Q., Sun, Y., & Taylor, J. E. (2017). Communication-efficient sparse regression. Journal of Machine Learning Research, 18, 1-30.