Mobile and Ubiquitous Systems: Computing, Networking and Services. 20th EAI International Conference, MobiQuitous 2023, Melbourne, VIC, Australia, November 14–17, 2023, Proceedings, Part I

Research Article

FedGCS: Addressing Class Imbalance in Long-Tail Federated Learning

Cite
@INPROCEEDINGS{10.1007/978-3-031-63989-0_11,
    author={Guozheng Liu and Wei Zhang and Huiling Shi and Lizhuang Tan and Chang Tang and Meihong Yang},
    title={FedGCS: Addressing Class Imbalance in Long-Tail Federated Learning},
    proceedings={Mobile and Ubiquitous Systems: Computing, Networking and Services. 20th EAI International Conference, MobiQuitous 2023, Melbourne, VIC, Australia, November 14--17, 2023, Proceedings, Part I},
    proceedings_a={MOBIQUITOUS},
    year={2024},
    month={7},
    keywords={Federated learning, Long-tail learning, non-IID},
    doi={10.1007/978-3-031-63989-0_11}
}
Guozheng Liu1, Wei Zhang1,*, Huiling Shi1, Lizhuang Tan1, Chang Tang1, Meihong Yang1
  • 1: Key Laboratory of Computing Power Network and Information Security, Ministry of Education, Shandong Computer Science Center (National Supercomputer Center in Jinan)
*Contact email: wzhang@sdas.org

Abstract

Federated learning is a privacy-preserving distributed machine learning paradigm that enables clients to cooperatively train a shared model without exposing their raw data. Differences in the distribution and quantity of training data across clients pose significant challenges, such as data heterogeneity and class imbalance, which can greatly degrade the performance of the shared model. Although many methods have been proposed to mitigate the harmful effects of non-IID data, existing solutions usually perform poorly on tail classes because they neglect the long-tail distribution. We present FedGCS, a long-tail federated learning framework that addresses both global and local class imbalance by using class-generic features to compensate for class-specific ones. Specifically, clients separate features from the training data based on the class activation map and selectively fuse the separated class-specific and class-generic features to restore the distribution of the tail classes. We also design a loss function, TailDistillation Loss, to lessen the classifier's bias towards the head classes. To evaluate the effectiveness of FedGCS, we adapt multiple benchmark datasets to the long-tail federated learning setting. Experiments show that FedGCS is effective and outperforms previous approaches.
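The abstract's core idea, restoring tail-class distributions by recombining class-generic and class-specific features, can be sketched in a toy form. Note this is a minimal illustration, not the paper's implementation: the CAM-based separation step, the feature dimensions, and the `fuse` helper below are all hypothetical stand-ins, since the page gives only the abstract.

```python
import numpy as np

# Hypothetical sketch of "generic compensates for specific":
# we assume each feature vector is already separated (the paper uses
# class activation maps for this; that step is omitted here) into a
# class-generic part and a class-specific part, and we synthesize extra
# tail-class samples by pairing the tail sample's specific part with
# generic parts borrowed from head-class samples.

rng = np.random.default_rng(0)
DIM_GENERIC, DIM_SPECIFIC = 8, 4  # assumed split of the feature vector

def fuse(generic_part, specific_part):
    """Recombine a class-generic part with a class-specific part."""
    return np.concatenate([generic_part, specific_part])

# Hypothetical features: many head-class samples, one tail-class sample.
head_features = rng.normal(size=(100, DIM_GENERIC + DIM_SPECIFIC))
tail_feature = rng.normal(size=DIM_GENERIC + DIM_SPECIFIC)

# Augment the tail class: keep its class-specific part fixed while
# varying the class-generic part, enriching the tail distribution.
synthetic_tail = np.stack([
    fuse(h[:DIM_GENERIC], tail_feature[DIM_GENERIC:])
    for h in head_features
])
print(synthetic_tail.shape)  # (100, 12)
```

Every synthesized sample retains the tail class's identity (its specific part) while inheriting diversity from head-class generic parts, which is the intuition the abstract describes.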

Keywords
Federated learning, Long-tail learning, non-IID
Published
2024-07-19
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-63989-0_11
Copyright © 2023–2025 ICST
Indexed in: EBSCO, ProQuest, DBLP, DOAJ, Portico
