
Research Article
FedNDA: Enhancing Federated Learning with Noisy Client Detection and Robust Aggregation
@ARTICLE{10.4108/eetinis.v12i3.8720,
  author={Tuan Dung Kieu and Charles Fonbonne and Trung-Kien Tran and Thi-Lan Le and Hai Vu and Huu-Thanh Nguyen and Thanh-Hai Tran},
  title={FedNDA: Enhancing Federated Learning with Noisy Client Detection and Robust Aggregation},
  journal={EAI Endorsed Transactions on Industrial Networks and Intelligent Systems},
  volume={12},
  number={3},
  publisher={EAI},
  journal_a={INIS},
  year={2025},
  month={7},
  keywords={Federated learning, Deep learning, Noisy clients, Non-IID, Class imbalance},
  doi={10.4108/eetinis.v12i3.8720}
}
Tuan Dung Kieu
Charles Fonbonne
Trung-Kien Tran
Thi-Lan Le
Hai Vu
Huu-Thanh Nguyen
Thanh-Hai Tran
Year: 2025
INIS
EAI
DOI: 10.4108/eetinis.v12i3.8720
Abstract
Federated learning is a decentralized learning paradigm that enables multiple clients to collaboratively train a global model while preserving the privacy of their local data. Although federated learning enhances data privacy, it faces challenges related to data quality and client behavior. A fundamental issue is the presence of noisy labels at certain clients, which degrades the global model's performance. To address this problem, this paper introduces a Federated learning framework with Noisy client Detection and robust Aggregation, FedNDA. In the first stage, FedNDA detects noisy clients by analyzing the distribution of their local losses, since a noisy client exhibits a loss distribution distinct from that of clean clients. To handle the class imbalance issue in local data, we use per-class losses instead of the total loss. We then assign each client a noisiness score, computed as the Earth Mover's Distance between the client's per-class loss distribution and the average distribution of all clean clients. This noisiness metric is more sensitive for detecting noisy clients than conventional metrics such as the Euclidean distance or the L1 norm. The noisiness score is then transferred to the server and used in the aggregation function to prioritize clean clients while reducing the influence of noisy clients. Experimental results demonstrate that FedNDA outperforms FedAvg and FedNoRo by 4.68% and 3.6% on the CIFAR-10 dataset, and by 10.65% and 0.48% on the ICH dataset, respectively, in a highly noisy setting.
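Since the abstract only outlines the detection and aggregation steps in words, the following minimal Python sketch illustrates one possible reading of them: an Earth Mover's Distance noisiness score over per-class loss distributions, followed by a down-weighted aggregation. The helper names (per_class_losses, noisiness_scores, aggregate), the exponential down-weighting, and the use of scipy.stats.wasserstein_distance are illustrative assumptions, not the authors' implementation.

import numpy as np
from scipy.stats import wasserstein_distance

def noisiness_scores(per_class_losses, clean_client_ids):
    """per_class_losses: dict client_id -> array of shape (num_classes,)
    holding the client's average local loss per class (assumed positive)."""
    num_classes = len(next(iter(per_class_losses.values())))
    support = np.arange(num_classes)

    # Reference: average per-class losses of the clean clients, normalized
    # so it can be treated as a discrete distribution over classes.
    clean_avg = np.mean([per_class_losses[c] for c in clean_client_ids], axis=0)
    clean_dist = clean_avg / clean_avg.sum()

    scores = {}
    for cid, losses in per_class_losses.items():
        client_dist = losses / losses.sum()
        # Earth Mover's Distance between the client's per-class loss
        # distribution and the clean-client reference distribution.
        scores[cid] = wasserstein_distance(support, support,
                                           u_weights=client_dist,
                                           v_weights=clean_dist)
    return scores

def aggregate(client_weights, scores, beta=5.0):
    """Weighted averaging of client model parameters that reduces the
    influence of clients with high noisiness scores (the exponential
    down-weighting with temperature beta is an assumption for illustration).
    client_weights: dict client_id -> dict param_name -> np.ndarray."""
    coeffs = {cid: np.exp(-beta * s) for cid, s in scores.items()}
    total = sum(coeffs.values())
    agg = {}
    for name in next(iter(client_weights.values())):
        agg[name] = sum((coeffs[cid] / total) * client_weights[cid][name]
                        for cid in client_weights)
    return agg

In this reading, a clean client's per-class loss distribution stays close to the reference, so its score is near zero and its aggregation coefficient stays near one, while a client with label noise drifts away and is progressively down-weighted.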
Copyright © 2025 Tuan Dung Kieu et al., licensed to EAI. This is an open access article distributed under the terms of the CC BY-NC-SA 4.0 license, which permits copying, redistributing, remixing, transforming, and building upon the material in any medium so long as the original work is properly cited.