On differential privacy for federated learning in wireless systems with multiple base stations

Nima Tavangaran, Mingzhe Chen, Zhaohui Yang, José Mairton B. Da Silva, H. Vincent Poor

Research output: Contribution to journal › Article › peer-review

Abstract

In this work, we consider a federated learning model in a wireless system with multiple base stations and inter-cell interference. We apply a differentially private scheme to transmit information from users to their corresponding base stations during the learning phase. We characterize the convergence behavior of the learning process by deriving an upper bound on its optimality gap. Furthermore, we formulate an optimization problem to reduce this upper bound and the total privacy leakage. To find locally optimal solutions of this problem, we first propose an algorithm that schedules the resource blocks and users. We then extend this scheme to further reduce the total privacy leakage by optimizing the differential privacy artificial noise. We apply the solutions of these two procedures as parameters of a federated learning system in which each user is equipped with a classifier and most communication cells have fewer resource blocks than users. The simulation results show that our proposed scheduler improves the average prediction accuracy by more than 6% compared with a random scheduler. Furthermore, its extended version with the noise optimizer significantly reduces the amount of privacy leakage.
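To make the differentially private transmission step concrete, the following is a minimal sketch of the standard Gaussian mechanism applied to a user's local model update before it is sent to its base station. The clipping bound, noise multiplier, and all function and variable names are illustrative assumptions for this sketch, not the paper's exact formulation or notation.

    # Minimal sketch (assumed Gaussian mechanism): a user clips its local model
    # update and adds artificial Gaussian noise before uplink transmission.
    import numpy as np

    def privatize_update(update, clip_norm=1.0, noise_multiplier=1.0, rng=None):
        """Clip a local update to bound its L2 sensitivity, then add Gaussian noise."""
        rng = np.random.default_rng() if rng is None else rng
        norm = np.linalg.norm(update)
        clipped = update * min(1.0, clip_norm / (norm + 1e-12))  # enforce ||update|| <= clip_norm
        noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
        return clipped + noise

    # Example: a user perturbs its update before sending it to its base station.
    local_update = np.random.randn(10)
    noisy_update = privatize_update(local_update, clip_norm=1.0, noise_multiplier=1.2)

In this sketch, a larger noise multiplier strengthens the privacy guarantee at the cost of a larger optimality gap, which is the trade-off the paper's noise optimization is designed to balance.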

Original language: English (US)
Journal: IET Communications
DOIs
State: Accepted/In press - 2024
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Electrical and Electronic Engineering

Keywords

  • 6G
  • data privacy
  • federated learning
  • optimization
  • scheduling
  • wireless channels
