Efficient Randomized Subspace Embeddings for Distributed Optimization Under a Communication Budget

Rajarshi Saha, Mert Pilanci, Andrea J. Goldsmith

Research output: Contribution to journal › Article › peer-review

1 Scopus citation


We study first-order optimization algorithms under the constraint that the descent direction is quantized using a pre-specified budget of R bits per dimension, where R ∈ (0, ∞). We propose computationally efficient optimization algorithms with convergence rates matching the information-theoretic performance lower bounds for: (i) Smooth and Strongly-Convex objectives with access to an Exact Gradient oracle, as well as (ii) General Convex and Non-Smooth objectives with access to a Noisy Subgradient oracle. The crux of these algorithms is a polynomial-complexity source coding scheme that embeds a vector into a random subspace before quantizing it. These embeddings are such that, with high probability, their projection along any of the canonical directions of the transform space is small. As a consequence, quantizing these embeddings and then applying an inverse transform to the original space yields a source coding method with optimal covering efficiency while utilizing just R bits per dimension. Our algorithms guarantee optimality for arbitrary values of the bit budget R, which includes both the sub-linear budget regime (R < 1) and the high-budget regime (R ≥ 1), while requiring O(n²) multiplications, where n is the dimension. We also propose an efficient relaxation of this coding scheme using Hadamard subspaces that requires near-linear time, i.e., O(n log n) additions. Furthermore, we show that the utility of our proposed embeddings can be extended to significantly improve the performance of gradient sparsification schemes. Numerical simulations validate our theoretical claims. Our implementations are available at https://github.com/rajarshisaha95/DistOptConstrComm.
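As an informal illustration of the embed-then-quantize idea in the abstract, the sketch below applies a Haar-random orthonormal transform, quantizes each coordinate of the embedding with a plain R-bit uniform scalar quantizer, and inverts the transform. The function names, the clipping constant `c`, and the simple quantizer are our own assumptions for illustration; the paper's actual coding scheme is more refined (it achieves optimal covering efficiency and also handles the sub-linear regime R < 1).

```python
import numpy as np

def random_orthonormal(n, rng):
    """Haar-random orthonormal matrix via QR of a Gaussian matrix."""
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

def embed_quantize_decode(x, R, c=3.0, seed=0):
    """Embed x into a random orthonormal subspace, quantize each
    coordinate with R bits (assumes integer R >= 1 here), then apply
    the inverse transform. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    n = x.size
    Q = random_orthonormal(n, rng)
    y = Q @ x
    # After a random rotation, each |y_i| is ~ ||x|| / sqrt(n) with
    # high probability, so a fixed dynamic range covers almost all mass.
    rad = c * np.linalg.norm(x) / np.sqrt(n)
    levels = 2 ** int(R)
    step = 2.0 * rad / levels
    idx = np.clip(np.floor((y + rad) / step), 0, levels - 1)  # R-bit codes
    y_hat = -rad + (idx + 0.5) * step                         # dequantize
    return Q.T @ y_hat                                        # inverse transform

x = np.random.default_rng(1).standard_normal(256)
x_hat = embed_quantize_decode(x, R=4)
err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
```

The key effect of the random embedding is that no coordinate carries an outsized share of the vector's energy, so a uniform quantizer with a modest dynamic range works well; quantizing `x` directly would require a much larger range (or more bits) to handle occasional large coordinates.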

Original language: English (US)
Pages (from-to): 183-196
Number of pages: 14
Journal: IEEE Journal on Selected Areas in Information Theory
Issue number: 2
State: Published - Jun 1 2022

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Media Technology
  • Artificial Intelligence
  • Applied Mathematics


Keywords

  • bit-budget constraint
  • distributed optimization
  • error feedback
  • gradient quantization
  • Hadamard subspace
  • Kashin embeddings
  • random orthonormal subspace


