User-Level Privacy-Preserving Federated Learning: Analysis and Performance Optimization

Kang Wei, Jun Li, Ming Ding, Chuan Ma, Hang Su, Bo Zhang, H. Vincent Poor

Research output: Contribution to journal › Article › peer-review

15 Scopus citations


Federated learning (FL), as a type of collaborative machine learning framework, is capable of preserving the private data of mobile terminals (MTs) while training useful models from that data. Nevertheless, it is still possible for a curious server to infer private information from the shared models uploaded by MTs. To address this problem, we first make use of the concept of local differential privacy (LDP) and propose a user-level differential privacy (UDP) algorithm that adds artificial noise to the shared models before uploading them to servers. According to our analysis, the UDP framework can realize $(\epsilon_{i}, \delta_{i})$-LDP for the $i$-th MT with adjustable privacy protection levels by varying the variances of the artificial noise processes. We then derive a theoretical convergence upper bound for the UDP algorithm, which reveals that there exists an optimal number of communication rounds to achieve the best learning performance. More importantly, we propose a communication rounds discounting (CRD) method, which achieves a much better trade-off between the computational complexity of searching and the convergence performance than the heuristic search method. Extensive experiments indicate that our UDP algorithm using the proposed CRD method can effectively improve both the training efficiency and model quality for the given privacy protection levels.
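The core idea described in the abstract can be illustrated with a short sketch: each MT clips its local model update and perturbs it with Gaussian noise before uploading, so the server only ever sees a noisy version. The function name, clipping threshold, and the standard Gaussian-mechanism noise calibration below are illustrative assumptions for this sketch; the paper derives its own variance choices for the $(\epsilon_{i}, \delta_{i})$-LDP guarantee.

```python
import numpy as np

def perturb_update(weights, clip_norm, epsilon, delta, rng=None):
    """Clip a local model update and add Gaussian noise before upload.

    Illustrative sketch of the user-level DP idea: sensitivity is
    bounded by L2 clipping, and the noise scale follows the standard
    Gaussian mechanism (an assumption; not necessarily the paper's
    exact variance calibration).
    """
    rng = rng or np.random.default_rng()
    w = np.asarray(weights, dtype=float)
    # Bound the update's sensitivity by clipping its L2 norm.
    norm = np.linalg.norm(w)
    if norm > clip_norm:
        w = w * (clip_norm / norm)
    # Standard Gaussian-mechanism noise scale for (epsilon, delta)-DP.
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return w + rng.normal(0.0, sigma, size=w.shape)
```

A larger noise variance (smaller epsilon) gives stronger privacy for that MT at the cost of noisier aggregated models, which is exactly the privacy–utility trade-off the convergence analysis quantifies.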

Original language: English (US)
Journal: IEEE Transactions on Mobile Computing
State: Accepted/In press - 2021

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Networks and Communications
  • Electrical and Electronic Engineering


Keywords

  • Computational modeling
  • Convergence
  • Data models
  • Differential privacy
  • Federated learning
  • Privacy
  • Servers
  • Training
  • communication round
  • differential privacy
  • mobile edge computing


