Exploring the Privacy-Energy Consumption Tradeoff for Split Federated Learning

Joohyung Lee, Mohamed Seif, Jungchan Cho, H. Vincent Poor

Research output: Contribution to journal › Article › peer-review


Split Federated Learning (SFL) has recently emerged as a promising distributed learning technology that combines the strengths of federated and split learning. In this approach, clients train only part of the model, termed the client-side model, thereby alleviating their computational burden, and they can accelerate convergence by synchronizing these client-side models. Consequently, SFL has received significant attention from both industry and academia, with diverse applications in 6G networks. However, alongside these considerable benefits, SFL introduces additional communication overhead when interacting with servers, and its frequent interactions raise several privacy concerns. In this context, the choice of the cut layer in SFL, which splits the model into client- and server-side models, can substantially affect clients' energy consumption and privacy, because it determines both the training burden and the output of the client-side models. Extensive analysis of cut layer selection is therefore required, and this aspect deserves careful consideration. This study provides a comprehensive overview of the SFL process and reviews the state of the art. We thoroughly analyze energy consumption and privacy with respect to cut layer selection in SFL, considering the influence of various system parameters on the selection strategy. Moreover, we provide an illustrative example of cut layer selection that minimizes the risk of the server reconstructing clients' raw data while keeping energy consumption within a required budget, highlighting the resulting trade-offs. We also discuss other control variables that can be optimized jointly with cut layer selection. Finally, we highlight open challenges in this field, which are promising avenues for future research and development.
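The split described above can be illustrated with a minimal sketch. The snippet below is not the paper's method; it is a toy NumPy model with hypothetical layer sizes, showing how a chosen cut layer divides a forward pass between a client (which sends the intermediate "smashed" activations) and a server (which completes the computation). A deeper cut increases client-side compute (and hence energy use) but exposes a more abstract representation to the server.

```python
import numpy as np

# Hypothetical 4-layer MLP; sizes are illustrative assumptions only.
rng = np.random.default_rng(0)
layer_sizes = [8, 16, 16, 4]
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def relu(x):
    return np.maximum(x, 0.0)

def client_forward(x, cut):
    """Client computes only the first `cut` layers and sends the
    resulting activations ("smashed data") to the server."""
    for w in weights[:cut]:
        x = relu(x @ w)
    return x

def server_forward(smashed, cut):
    """Server completes the forward pass from the cut layer onward."""
    x = smashed
    for w in weights[cut:]:
        x = relu(x @ w)
    return x

x = rng.standard_normal((1, 8))
for cut in range(1, len(weights)):
    smashed = client_forward(x, cut)
    out = server_forward(smashed, cut)
    print(cut, smashed.shape, out.shape)
```

Regardless of where the cut is placed, the composed forward pass is identical; only the division of work (and what the server observes) changes, which is exactly the trade-off the cut layer selection navigates.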

Original language: English (US)
Pages (from-to): 1
Number of pages: 1
Journal: IEEE Network
State: Accepted/In press - 2024

All Science Journal Classification (ASJC) codes

  • Software
  • Information Systems
  • Hardware and Architecture
  • Computer Networks and Communications

Keywords


  • Analytical models
  • Computational modeling
  • Data models
  • Energy consumption
  • Privacy
  • Servers
  • Training


