Broadband Communications, Networks, and Systems. 13th EAI International Conference, BROADNETS 2022, Virtual Event, March 12-13, 2023 Proceedings

Research Article

Client Selection Based on Diversity Scaling for Federated Learning on Non-IID Data

Cite
  • @INPROCEEDINGS{10.1007/978-3-031-40467-2_8,
        author={Yuechao Ren and Atul Sajjanhar and Shang Gao and Seng Loke},
        title={Client Selection Based on Diversity Scaling for Federated Learning on Non-IID Data},
        proceedings={Broadband Communications, Networks, and Systems. 13th EAI International Conference, BROADNETS 2022, Virtual Event, March 12-13, 2023 Proceedings},
        proceedings_a={BROADNETS},
        year={2023},
        month={7},
        keywords={Federated Learning, Diversity Scaling, Convergence, Client Selection},
        doi={10.1007/978-3-031-40467-2_8}
    }
    
Yuechao Ren1,*, Atul Sajjanhar1, Shang Gao1, Seng Loke1
  • 1: Deakin University, 221 Burwood Hwy, Burwood
*Contact email: renyue@deakin.edu.au

Abstract

In a wireless Federated Learning (FL) system, clients train local models on the datasets held by their IoT devices. The resulting local models are uploaded to the FL server, which aggregates them into a global model and then broadcasts the model back to the clients for further training. Because clients are heterogeneous, client selection plays an important role in determining the overall training time. Traditionally, the maximum number of clients is selected, provided each can derive and upload its local model before the deadline of each global iteration. However, selecting more clients not only increases the clients' energy consumption but may also be unnecessary, since using fewer clients in early global iterations and more clients in later iterations has been shown to improve model accuracy. To address this issue, this paper proposes a client selection scheme that dynamically adjusts and optimizes the trade-off between maximizing the number of selected clients and minimizing the total communication cost between the clients and the server. By comparing the data diversity of clients, the scheme selects the clients most suitable for global convergence. A Diversity Scaling Node Selection framework (FedDS) is implemented to dynamically change the selection weight of each node based on the degree of non-IID data diversity. Results show that the proposed FedDS speeds up the FL convergence rate compared to FedAvg with random node selection.
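The abstract does not spell out FedDS's exact scoring rule, but the idea of weighting each client's selection probability by its data diversity can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes label-distribution entropy as the diversity proxy, a hypothetical `scale` parameter controlling how strongly diversity biases selection, and softmax-style normalization of the weights.

```python
import math
import random

def label_entropy(label_counts):
    """Shannon entropy of a client's label distribution (proxy for data diversity)."""
    total = sum(label_counts)
    probs = [c / total for c in label_counts if c > 0]
    return -sum(p * math.log(p) for p in probs)

def diversity_weights(clients_label_counts, scale=1.0):
    """Normalized selection weights, biased toward higher-entropy (more diverse)
    clients; scale=0 recovers uniform random selection as in FedAvg."""
    scores = [math.exp(scale * label_entropy(c)) for c in clients_label_counts]
    z = sum(scores)
    return [s / z for s in scores]

def select_clients(clients_label_counts, k, scale=1.0, rng=None):
    """Sample k distinct clients with probability proportional to diversity weight."""
    rng = rng or random.Random()
    weights = diversity_weights(clients_label_counts, scale)
    ids = list(range(len(clients_label_counts)))
    chosen = []
    for _ in range(k):
        pick = rng.choices(ids, weights=[weights[i] for i in ids], k=1)[0]
        chosen.append(pick)
        ids.remove(pick)
    return chosen
```

Under this sketch, a client holding a balanced label mix (e.g. counts `[10, 10]`) receives a higher selection weight than one holding a single class (`[20, 0]`), which captures the stated goal of favoring clients whose data best aids global convergence on non-IID distributions.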

Keywords
Federated Learning, Diversity Scaling, Convergence, Client Selection
Published
2023-07-30
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-40467-2_8
Copyright © 2023–2025 ICST
Indexed in: EBSCO, ProQuest, DBLP, DOAJ, Portico