Collaborative Computing: Networking, Applications and Worksharing. 19th EAI International Conference, CollaborateCom 2023, Corfu Island, Greece, October 4-6, 2023, Proceedings, Part III

Research Article

FedECCR: Federated Learning Method with Encoding Comparison and Classification Rectification

Cite (BibTeX)
@INPROCEEDINGS{10.1007/978-3-031-54531-3_4,
    author={Yan Zeng and Hui Zheng and Xin Wang and Beibei Zhang and Mingyao Zhou and Jilin Zhang and YongJian Ren},
    title={FedECCR: Federated Learning Method with Encoding Comparison and Classification Rectification},
    proceedings={Collaborative Computing: Networking, Applications and Worksharing. 19th EAI International Conference, CollaborateCom 2023, Corfu Island, Greece, October 4-6, 2023, Proceedings, Part III},
    proceedings_a={COLLABORATECOM PART 3},
    publisher={Springer},
    year={2024},
    month={2},
    keywords={Federated Learning, Data Heterogeneity, Prototypical Learning, Contrastive Learning},
    doi={10.1007/978-3-031-54531-3_4}
}
    
Yan Zeng1, Hui Zheng1, Xin Wang1, Beibei Zhang2, Mingyao Zhou3,*, Jilin Zhang1, YongJian Ren1
  • 1: School of Computer Science and Technology, Hangzhou Dianzi University
  • 2: Zhejiang Lab
  • 3: Hangzhou Huawei Communication Technology Co., Ltd.
*Contact email: zhoumingyao@huawei.com

Abstract

Federated learning is a distributed training paradigm in which dispersed clients, coordinated by a central server, jointly construct a global model from multi-party data while preserving privacy. In practical applications, however, data distributions across clients are highly skewed, so the optimization directions of the client models diverge; the resulting model bias reduces the accuracy of the global model. Existing methods either compute and transmit a large amount of auxiliary information to correct the clients' optimization directions, or only loosely constrain client-model deviation end-to-end, ignoring targeted treatment of the model's internal structure, so their improvements are limited. To address these problems, we propose FedECCR, a federated optimization algorithm based on encoding contrast and classification rectification. FedECCR splits the model into an encoder and a classifier, trains the encoder with prototype-based contrastive learning, and applies unbiased rectification to the classifier. This approach notably improves the accuracy of the global model while maintaining low communication costs. We conducted experiments on multiple datasets to evaluate the validity of our method; the quantified results show that FedECCR improves global classification accuracy by approximately 1% to 6% over FedAvg, FedProx, and MOON.
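The prototype-contrastive idea mentioned in the abstract can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: the function names, the cosine-similarity InfoNCE form of the loss, and the temperature parameter `tau` are assumptions for illustration only; FedECCR's actual loss and classifier-rectification step are defined in the paper.

```python
import math

def class_prototypes(embeddings, labels):
    """Mean encoder output per class -- the class 'prototype'."""
    sums, counts = {}, {}
    for z, y in zip(embeddings, labels):
        acc = sums.setdefault(y, [0.0] * len(z))
        for i, v in enumerate(z):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def prototype_contrastive_loss(z, y, prototypes, tau=0.5):
    """InfoNCE-style loss: pull embedding z toward its own class
    prototype and push it away from the other classes' prototypes."""
    sims = {c: math.exp(cosine(z, p) / tau) for c, p in prototypes.items()}
    return -math.log(sims[y] / sum(sims.values()))
```

An embedding close to its class prototype yields a small loss, while the same embedding paired with the wrong label yields a large one, which is the gradient signal that counteracts client drift under skewed data.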

Keywords
Federated Learning, Data Heterogeneity, Prototypical Learning, Contrastive Learning
Published
2024-02-23
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-54531-3_4
Copyright © 2023–2025 ICST