Distributed Sub-gradient Algorithms with Limited Communications

Stefano Rini, Milind Rao, Andrea Goldsmith

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We consider the distributed convex optimization setting in which nodes in a network collectively find the minimum of a function using only local communications and computations. Various subgradient algorithms have been developed for this setting when the global function factorizes as the sum of local functions assigned to the nodes in the network, including the distributed (i) online gradient descent, (ii) Nesterov gradient descent, and (iii) dual averaging algorithms. Generally speaking, these subgradient algorithms assume that, in each communication round, nodes can exchange messages of size comparable to that of the optimization variable. For many high-dimensional optimization problems, this communication requirement exceeds the capabilities of the communication network supporting the distributed optimization. For this reason, we propose a dimensionality reduction technique based on random projections that adapts these distributed optimization algorithms to networks with communication links of limited capacity. In particular, we analyze the performance of the proposed dimensionality reduction technique for the three algorithms (i)-(iii) above. Numerical simulations are presented that illustrate the performance of the proposed approach for these three algorithms.
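The idea of reducing per-round communication via random projections can be illustrated with a minimal sketch. This is not the paper's algorithm or analysis; it is a toy simulation under assumed simplifications: quadratic local objectives, exact network-wide averaging of the compressed messages, and a Gaussian sketching matrix (scaled so the lifted update is unbiased) that all nodes regenerate each round from a shared seed. Each node transmits only k numbers per round instead of the full d-dimensional subgradient.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, n_nodes, T = 50, 10, 4, 500   # ambient dimension, sketch size, nodes, rounds
eta = 0.1                            # step size

# Assumed local objectives f_i(x) = 0.5 * ||x - b_i||^2; the minimizer of
# sum_i f_i is the mean of the b_i, so the optimum is known in closed form.
b = rng.normal(size=(n_nodes, d))
x_star = b.mean(axis=0)

x = np.zeros((n_nodes, d))           # each node's local iterate
for t in range(T):
    # Shared random projection, refreshed every round (all nodes use the
    # same seed, so no extra communication is needed to agree on A).
    # Entries N(0, 1/k) give E[A.T @ A] = I, so the lifted step is unbiased.
    A = rng.normal(size=(k, d)) / np.sqrt(k)
    grads = x - b                    # local (sub)gradients of f_i at x_i
    compressed = grads @ A.T         # each node sends only k numbers, not d
    avg_low = compressed.mean(axis=0)  # averaging over the network (here exact)
    lifted = A.T @ avg_low           # lift back to dimension d
    x = x - eta * lifted             # identical update applied at every node

err = np.linalg.norm(x[0] - x_star)  # distance of node 0's iterate to optimum
```

Because the projection is unbiased and redrawn each round, the sketching noise is multiplicative in the distance to the optimum, so the iterates still contract toward `x_star` despite each message being five times smaller than the gradient.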

Original language: English (US)
Title of host publication: Conference Record - 53rd Asilomar Conference on Signals, Systems and Computers, ACSSC 2019
Editors: Michael B. Matthews
Publisher: IEEE Computer Society
Pages: 2171-2175
Number of pages: 5
ISBN (Electronic): 9781728143002
State: Published - Nov 2019
Externally published: Yes
Event: 53rd Asilomar Conference on Signals, Systems and Computers, ACSSC 2019 - Pacific Grove, United States
Duration: Nov 3 2019 - Nov 6 2019

Publication series

Name: Conference Record - Asilomar Conference on Signals, Systems and Computers
Volume: 2019-November
ISSN (Print): 1058-6393

Conference

Conference: 53rd Asilomar Conference on Signals, Systems and Computers, ACSSC 2019
Country/Territory: United States
City: Pacific Grove
Period: 11/3/19 - 11/6/19

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Computer Networks and Communications

Keywords

  • Decentralized methods
  • Distributed convex optimization
  • Dual averaging
  • Gradient descent methods
  • Nesterov gradient acceleration
  • Random projections

