Abstract
With the rapid proliferation of smart mobile devices, federated learning (FL) has been widely considered for distributed model training in wireless networks. However, data heterogeneity, e.g., non-independent and identically distributed (non-IID) data and different training dataset sizes among clients, poses major challenges to wireless FL. Limited communication resources complicate the fair scheduling required for training on heterogeneous data and further degrade overall performance. To address this issue, this paper focuses on performance analysis and optimization for wireless FL that accounts for data heterogeneity jointly with wireless resource allocation. Specifically, we first develop a closed-form expression for an upper bound on the FL loss function, with particular emphasis on data heterogeneity described by a dataset size vector and a data divergence vector. We then formulate the loss function minimization problem under constraints on long-term energy consumption and latency, and jointly optimize client scheduling, uplink transmission power, channel allocation, and the number of local epochs. Next, via the Lyapunov drift technique, we transform the optimization problem into a series of tractable per-round problems. Extensive experiments on real-world datasets demonstrate that our method outperforms other benchmarks in terms of learning accuracy and energy consumption.
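To make the Lyapunov-based decomposition mentioned above concrete, the sketch below illustrates the general drift-plus-penalty pattern for per-round client scheduling under a long-term energy constraint. It is a minimal, assumed illustration only: the function names, the greedy selection rule, the scoring of clients by a learning-benefit proxy, and the parameters (`V`, `energy_budget`, `max_clients`) are hypothetical and do not reproduce the paper's actual formulation or joint optimization over power, channels, and local epochs.

```python
import numpy as np

def schedule_round(loss_scores, energy_costs, queues, energy_budget,
                   V=1.0, max_clients=5):
    """Illustrative drift-plus-penalty client selection for one FL round.

    loss_scores[k]  : assumed proxy for the learning benefit of scheduling
                      client k (e.g., driven by dataset size and divergence).
    energy_costs[k] : energy client k would spend if scheduled this round.
    queues[k]       : virtual energy-deficit queue for client k.
    energy_budget   : per-round average energy target per client.
    V               : trade-off weight between learning benefit and drift.
    """
    # Per-client "drift + penalty" score: a large virtual queue penalizes
    # scheduling energy-hungry clients; V weights the learning objective.
    scores = V * loss_scores - queues * energy_costs

    # Greedy selection of the best-scoring clients, limited by the number
    # of available channels (assumed constraint for this sketch).
    ranked = np.argsort(scores)[::-1][:max_clients]
    selected = [k for k in ranked if scores[k] > 0]

    # Update virtual queues: they grow whenever the energy consumed in this
    # round exceeds the long-term per-round budget, and shrink otherwise.
    served = np.zeros_like(energy_costs)
    served[selected] = energy_costs[selected]
    new_queues = np.maximum(queues + served - energy_budget, 0.0)
    return selected, new_queues

# Example usage with synthetic per-client statistics.
K = 10
rng = np.random.default_rng(0)
queues = np.zeros(K)
selected, queues = schedule_round(rng.random(K), rng.random(K), queues,
                                  energy_budget=0.3, V=2.0, max_clients=4)
```

The key design point the sketch tries to convey is that the long-term energy constraint is replaced by virtual queues, so each round only requires solving a myopic scheduling problem whose score balances learning benefit against accumulated energy deficits.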
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 7728-7744 |
| Number of pages | 17 |
| Journal | IEEE Transactions on Wireless Communications |
| Volume | 23 |
| Issue number | 7 |
| DOIs | |
| State | Published - 2024 |
| Externally published | Yes |
All Science Journal Classification (ASJC) codes
- Computer Science Applications
- Electrical and Electronic Engineering
- Applied Mathematics
Keywords
- Federated learning
- client scheduling
- data heterogeneity
- wireless resource allocation