Communication-Efficient Distributed Learning: An Overview

Xuanyu Cao, Tamer Basar, Suhas Diggavi, Yonina C. Eldar, Khaled B. Letaief, H. Vincent Poor, Junshan Zhang

Research output: Contribution to journal › Review article › peer-review

12 Scopus citations


Distributed learning is envisioned as the bedrock of next-generation intelligent networks, where intelligent agents, such as mobile devices, robots, and sensors, exchange information with each other or with a parameter server to train machine learning models collaboratively without uploading raw data to a central entity for centralized processing. By utilizing the computation and communication capabilities of individual agents, the distributed learning paradigm can mitigate the burden on central processors and help preserve the data privacy of users. Despite its promising applications, a downside of distributed learning is its need for iterative information exchange over wireless channels, which may incur communication overhead that is unaffordable in many practical systems with limited radio resources such as energy and bandwidth. To overcome this communication bottleneck, there is an urgent need for communication-efficient distributed learning algorithms that simultaneously reduce communication cost and achieve satisfactory learning/optimization performance. In this paper, we present a comprehensive survey of prevailing methodologies for communication-efficient distributed learning, including reduction of the number of communication rounds, compression and quantization of the exchanged information, radio resource management for efficient learning, and game-theoretic mechanisms incentivizing user participation. We also point out potential directions for future research to further enhance the communication efficiency of distributed learning in various scenarios.
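To make the "compression and quantization of the exchanged information" family of methods concrete, below is a minimal, hypothetical sketch (not taken from the paper) of two standard compressors that agents can apply to local gradients before transmission: top-k sparsification and 1-bit sign quantization with mean-magnitude scaling. Function names and parameters are illustrative assumptions.

```python
import numpy as np

def top_k_sparsify(grad, k):
    """Keep only the k largest-magnitude entries of a gradient vector.

    The remaining entries are zeroed, so an agent only needs to
    communicate k (index, value) pairs instead of the full vector.
    """
    sparse = np.zeros_like(grad)
    idx = np.argsort(np.abs(grad))[-k:]  # indices of the k largest magnitudes
    sparse[idx] = grad[idx]
    return sparse

def quantize_sign(grad):
    """1-bit (sign) quantization.

    Each entry is reduced to its sign, scaled by the mean magnitude
    of the vector so the compressed gradient keeps a comparable norm.
    """
    scale = np.mean(np.abs(grad))
    return scale * np.sign(grad)

# Illustrative use: each agent compresses its local gradient
# before sending it to the parameter server.
rng = np.random.default_rng(0)
g = rng.normal(size=10)
g_sparse = top_k_sparsify(g, k=3)  # 3 nonzero entries to transmit
g_quant = quantize_sign(g)         # 1 bit per entry plus one scale
```

In practice, such compressors are typically combined with error-feedback (accumulating the compression residual locally) to preserve convergence, one of the design issues the survey discusses.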

Original language: English (US)
Pages (from-to): 851-873
Number of pages: 23
Journal: IEEE Journal on Selected Areas in Communications
Issue number: 4
State: Published - Apr 1 2023
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Electrical and Electronic Engineering
  • Computer Networks and Communications


Keywords
  • Distributed learning
  • communication efficiency
  • compression
  • event-triggering
  • incentive mechanisms
  • meta-learning
  • multitask learning
  • online learning
  • quantization
  • resource allocation
  • single-task learning
  • sparsification


