Quality, Reliability, Security and Robustness in Heterogeneous Systems. 19th EAI International Conference, QShine 2023, Shenzhen, China, October 8 – 9, 2023, Proceedings, Part II

Research Article

Entrofuse: Clustered Federated Learning Through Entropy Approach

Cite

BibTeX:

    @INPROCEEDINGS{10.1007/978-3-031-65123-6_6,
        author={Kaifei Tu and Wenhao Yuan and Xuehe Wang},
        title={Entrofuse: Clustered Federated Learning Through Entropy Approach},
        proceedings={Quality, Reliability, Security and Robustness in Heterogeneous Systems. 19th EAI International Conference, QShine 2023, Shenzhen, China, October 8 -- 9, 2023, Proceedings, Part II},
        proceedings_a={QSHINE PART 2},
        year={2024},
        month={8},
        keywords={federated learning, cluster, entropy},
        doi={10.1007/978-3-031-65123-6_6}
    }

Plain Text:

    Kaifei Tu, Wenhao Yuan, Xuehe Wang. Entrofuse: Clustered Federated Learning Through Entropy Approach. QSHINE PART 2, Springer, 2024. DOI: 10.1007/978-3-031-65123-6_6
Kaifei Tu1, Wenhao Yuan1, Xuehe Wang1,*
  • 1: School of Artificial Intelligence, Sun Yat-sen University
*Contact email: wangxuehe@mail.sysu.edu.cn

Abstract

Conventional machine learning methods typically rely on collecting vast quantities of data, which often results in serious private-information leakage and a heavy communication burden. To tackle this challenge, Federated Learning (FL) has recently been proposed as a novel paradigm of distributed machine learning. Under the FL framework, clients cooperatively train a shared global model on their own data and transmit only the model parameters to the central server, keeping their private data localized. However, FL still faces limitations, particularly when confronting non-independent and non-identically distributed (Non-IID) data, which degrades model performance. In light of these concerns, we propose an entropy-based clustered federated learning model named Entrofuse, which partitions clients into clusters characterized by their data distributions; model training is then performed within each cluster. As the distribution of data samples is hard to acquire directly, we adopt the Kernel Density Estimation (KDE) method to estimate the data distributions of heterogeneous clients. Our approach takes into account both the entropy and the vector angle of the model parameters, and we prove the rationality of the method through rigorous theoretical analysis. Experimental results show that the proposed method is superior to the non-clustered case on the EMNIST dataset, improving accuracy by 10% to 12%.
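The abstract names two ingredients, KDE-based entropy estimation and a model-parameter angle criterion, without reproducing the algorithm itself. The Python sketch below illustrates how such a clustering step could look under stated assumptions; it is not the authors' implementation. The function names (estimate_entropy, param_angle, cluster_clients) and the tolerances entropy_tol and angle_tol are hypothetical; only scipy's gaussian_kde and standard NumPy calls are real APIs.

    # Hypothetical sketch of an Entrofuse-style clustering step; not the authors' code.
    # Each client's data entropy is estimated via KDE, and clients whose entropy gap
    # and model-parameter angle are both small are grouped into the same cluster.
    import numpy as np
    from scipy.stats import gaussian_kde

    def estimate_entropy(samples):
        """Resubstitution estimate of differential entropy: H ~ -mean(log p_hat(x_i))."""
        kde = gaussian_kde(samples)           # kernel density estimate of the client data
        return -np.mean(kde.logpdf(samples))

    def param_angle(u, v):
        """Angle between two flattened model-parameter vectors."""
        cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)
        return np.arccos(np.clip(cos, -1.0, 1.0))

    def cluster_clients(entropies, params, entropy_tol=0.2, angle_tol=0.5):
        """Greedy grouping: a client joins the first cluster whose representative
        is close in both entropy and parameter angle (tolerances are illustrative)."""
        clusters = []
        for i in range(len(entropies)):
            for c in clusters:
                rep = c[0]
                if (abs(entropies[i] - entropies[rep]) < entropy_tol
                        and param_angle(params[i], params[rep]) < angle_tol):
                    c.append(i)
                    break
            else:  # no existing cluster matched: start a new one
                clusters.append([i])
        return clusters

    # Toy usage: six clients drawn from two different distributions.
    rng = np.random.default_rng(0)
    data = [rng.normal(0, 1, 200) for _ in range(3)] + [rng.normal(5, 3, 200) for _ in range(3)]
    params = [rng.normal(0, 1, 10) for _ in range(6)]  # stand-ins for local model updates
    print(cluster_clients([estimate_entropy(d) for d in data], params))

In the paper, training then proceeds within each cluster rather than over a single global model; the greedy rule above is just one plausible way to combine the two criteria the abstract mentions.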

Keywords
federated learning, cluster, entropy
Published
2024-08-20
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-65123-6_6