Distributed Stochastic Optimization with Random Communication and Computational Delays: Optimal Policies and Performance Analysis

Siyuan Yu, Wei Chen, H. Vincent Poor

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citation

Abstract

Distributed stochastic optimization has attracted considerable attention due to its potential for scaling computational resources, reducing training time, and helping protect user privacy in decentralized machine learning. However, stragglers and limited bandwidth may induce random computational and communication delays, thereby severely hindering the optimization or learning process. As a result, we are interested in the optimal policies and their performance analysis for latency-aware distributed Stochastic Gradient Descent (SGD). To understand the effects of gradient staleness and gradient error in distributed optimization, both of which may determine the convergence time, we present a unified framework based on a stochastic delay differential equation to characterize the random convergence time. Interestingly, we find that the average convergence time is much more sensitive to the gradient staleness than to the gradient error. To provide further insights, we show that the time cost of fully asynchronous SGD is approximately determined by the product of the gradient staleness and the 2-norm of the Hessian matrix of the objective function. Moreover, small staleness may slightly accelerate SGD, while large staleness will cause it to diverge.
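To illustrate the staleness effect discussed in the abstract, below is a minimal sketch (not the paper's framework) of delayed-gradient SGD on a simple quadratic objective. The objective, step size, and fixed-staleness model are illustrative assumptions; it only shows how, for a fixed step size, increasing staleness degrades and eventually breaks convergence, consistent with the observation that large staleness leads to divergence.

import numpy as np

def stale_sgd(H, x0, lr=0.05, tau=2, steps=200):
    """Gradient descent on f(x) = 0.5 * x^T H x where each update
    uses the gradient evaluated at the iterate from tau steps ago."""
    history = [np.array(x0, dtype=float)]
    x = np.array(x0, dtype=float)
    for k in range(steps):
        # Use a stale iterate; fall back to the initial point if k < tau.
        x_stale = history[max(0, k - tau)]
        grad = H @ x_stale            # gradient of the quadratic objective
        x = x - lr * grad
        history.append(x.copy())
    return np.linalg.norm(x)

# Hessian with 2-norm 10; larger staleness with the same step size
# eventually violates the stability condition and the iterates diverge.
H = np.diag([1.0, 10.0])
for tau in (0, 2, 8):
    print(f"tau={tau}: final ||x|| = {stale_sgd(H, [1.0, 1.0], tau=tau):.3e}")

In this toy setting, tau = 0 and tau = 2 converge while tau = 8 diverges, reflecting the interplay between staleness and the Hessian norm highlighted in the abstract.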

Original language: English (US)
Title of host publication: ICC 2024 - IEEE International Conference on Communications
Editors: Matthew Valenti, David Reed, Melissa Torres
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3791-3796
Number of pages: 6
ISBN (Electronic): 9781728190549
DOIs
State: Published - 2024
Externally published: Yes
Event: 59th Annual IEEE International Conference on Communications, ICC 2024 - Denver, United States
Duration: Jun 9 2024 - Jun 13 2024

Publication series

Name: IEEE International Conference on Communications
ISSN (Print): 1550-3607

Conference

Conference: 59th Annual IEEE International Conference on Communications, ICC 2024
Country/Territory: United States
City: Denver
Period: 6/9/24 - 6/13/24

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Electrical and Electronic Engineering

Keywords

  • Asynchronous optimization
  • federated learning
  • gradient staleness
  • stochastic gradient descent
