Differentially Private Federated Learning: Algorithm, Analysis and Optimization

Kang Wei, Jun Li, Chuan Ma, Ming Ding, H. Vincent Poor

Research output: Chapter in Book/Report/Conference proceeding › Chapter

3 Scopus citations


Federated learning (FL), a type of collaborative machine learning framework, can help protect users’ private data while training useful models from that data. Nevertheless, privacy leakage may still occur through analysis of the parameters exchanged between the central server and clients, e.g., the weights and biases of deep neural networks. In this chapter, to effectively prevent such information leakage, we investigate a differential privacy mechanism in which artificial noise is added to the parameters at the clients’ side before uploading. Moreover, we propose a K-client random scheduling policy, in which K clients are randomly selected from a total of N clients to participate in each communication round. Furthermore, a theoretical convergence bound on the loss function of the trained FL model is derived. In detail, for a fixed privacy level, the theoretical bound reveals that there exists an optimal number of clients K that achieves the best convergence performance, due to the tradeoff between the volume of user data and the variance of the aggregated artificial noise. To optimize this tradeoff, we further provide a differentially private FL based client selection (DP-FedCS) algorithm, which dynamically selects the number of training clients. Our experimental results validate our theoretical conclusions and also show that the proposed algorithm can effectively improve both FL training efficiency and FL model quality for a given privacy protection level.
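The two ingredients the abstract describes — client-side noise perturbation of parameters before upload, and K-client random scheduling — can be sketched as below. This is a minimal illustrative sketch, not the chapter's exact algorithm: the clipping step, noise standard deviation, and function names are assumptions chosen for a typical Gaussian-mechanism setup.

```python
import numpy as np

def clip_and_perturb(params, clip_norm, noise_std, rng):
    """Illustrative client-side step: clip the parameter vector to a
    norm bound, then add Gaussian noise before uploading.
    (clip_norm and noise_std are assumed hyperparameters.)"""
    flat = np.concatenate([p.ravel() for p in params])
    scale = min(1.0, clip_norm / (np.linalg.norm(flat) + 1e-12))
    return [p * scale + rng.normal(0.0, noise_std, size=p.shape)
            for p in params]

def random_schedule(num_clients, k, rng):
    """K-client random scheduling: select K of the N clients
    (without replacement) for one communication round."""
    return rng.choice(num_clients, size=k, replace=False)

# Example round: pick K=4 of N=10 clients, then perturb one client's update.
rng = np.random.default_rng(0)
selected = random_schedule(10, 4, rng)
noisy_update = clip_and_perturb([np.ones((2, 2))],
                                clip_norm=1.0, noise_std=0.1, rng=rng)
```

In a full FL round, the server would average only the `K` selected clients' noisy updates; the tradeoff the chapter analyzes arises because larger `K` contributes more data per round but also aggregates more noise variance.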

Original language: English (US)
Title of host publication: Studies in Computational Intelligence
Publisher: Springer Science and Business Media Deutschland GmbH
Number of pages: 28
State: Published - 2021

Publication series

Name: Studies in Computational Intelligence
ISSN (Print): 1860-949X
ISSN (Electronic): 1860-9503

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence


Keywords

  • Client selection
  • Convergence performance
  • Differential privacy
  • Federated learning


