Wireless Internet. 15th EAI International Conference, WiCON 2022, Virtual Event, November 2022, Proceedings

Research Article

Addressing Class Imbalance in Federated Learning via Collaborative GAN-Based Up-Sampling

Cite
BibTeX
  • @INPROCEEDINGS{10.1007/978-3-031-27041-3_15,
        author={Can Zhang and Xuefeng Liu and Shaojie Tang and Jianwei Niu and Tao Ren and Quanquan Hu},
        title={Addressing Class Imbalance in Federated Learning via Collaborative GAN-Based Up-Sampling},
        proceedings={Wireless Internet. 15th EAI International Conference, WiCON 2022, Virtual Event, November 2022, Proceedings},
        proceedings_a={WICON},
        year={2023},
        month={2},
        keywords={Federated learning, Class imbalance, Collaborative up-sampling, Generative adversarial networks},
        doi={10.1007/978-3-031-27041-3_15}
    }
    
Can Zhang, Xuefeng Liu*, Shaojie Tang, Jianwei Niu, Tao Ren, Quanquan Hu
    *Contact email: liu_xuefeng@buaa.edu.cn

    Abstract

    Federated learning (FL) is an emerging learning framework that enables decentralized devices to collaboratively train a model without leaking their data to each other. One common problem in FL is class imbalance, in which either the distribution or the quantity of the training data varies across devices. In the presence of class imbalance, the performance of the final model can be negatively affected. A straightforward approach to address class imbalance is up-sampling, by which data of minority classes in each device are augmented independently. However, this up-sampling approach does not allow devices to help each other, and therefore its effectiveness can be greatly compromised. In this paper, we propose FED-CGU, a collaborative GAN-based up-sampling strategy in FL. In FED-CGU, devices can help each other during up-sampling by collaboratively training a GAN model that augments data for each device. In addition, some advanced designs of FED-CGU are proposed, including dynamically determining the number of augmented data in each device and selecting complementary devices that can better help each other. We test FED-CGU on benchmark datasets including Fashion-MNIST and CIFAR-10. Experimental results demonstrate that FED-CGU outperforms state-of-the-art algorithms.
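
    The abstract does not include implementation details; the following is a minimal, illustrative PyTorch sketch of the general idea it describes: clients jointly train a class-conditional GAN by averaging its weights server-side (FedAvg-style), and each client then uses the shared generator to synthesize samples for its locally scarce classes. All module names, network sizes, and the random placeholder data below are assumptions made for illustration, not the paper's FED-CGU implementation.

    # Minimal sketch (not the authors' code): collaboratively trained
    # conditional GAN used for minority-class up-sampling in FL.
    import copy
    import torch
    import torch.nn as nn

    NUM_CLASSES, LATENT_DIM, FEAT_DIM = 10, 32, 784  # e.g. flattened Fashion-MNIST

    class CondGenerator(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(LATENT_DIM + NUM_CLASSES, 256), nn.ReLU(),
                nn.Linear(256, FEAT_DIM), nn.Tanh())
        def forward(self, z, labels):
            y = torch.nn.functional.one_hot(labels, NUM_CLASSES).float()
            return self.net(torch.cat([z, y], dim=1))

    class CondDiscriminator(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(FEAT_DIM + NUM_CLASSES, 256), nn.LeakyReLU(0.2),
                nn.Linear(256, 1))
        def forward(self, x, labels):
            y = torch.nn.functional.one_hot(labels, NUM_CLASSES).float()
            return self.net(torch.cat([x, y], dim=1))

    def local_gan_step(gen, disc, x, y, steps=1):
        """One client's local GAN update on its (imbalanced) data."""
        bce = nn.BCEWithLogitsLoss()
        opt_g = torch.optim.Adam(gen.parameters(), lr=2e-4)
        opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4)
        for _ in range(steps):
            z = torch.randn(x.size(0), LATENT_DIM)
            fake = gen(z, y)
            # Discriminator: real vs. generated samples of the same classes.
            d_loss = (bce(disc(x, y), torch.ones(x.size(0), 1))
                      + bce(disc(fake.detach(), y), torch.zeros(x.size(0), 1)))
            opt_d.zero_grad(); d_loss.backward(); opt_d.step()
            # Generator: try to fool the discriminator.
            g_loss = bce(disc(fake, y), torch.ones(x.size(0), 1))
            opt_g.zero_grad(); g_loss.backward(); opt_g.step()

    def fed_avg(state_dicts):
        """Server-side averaging of client model weights."""
        avg = copy.deepcopy(state_dicts[0])
        for k in avg:
            avg[k] = torch.stack([sd[k].float() for sd in state_dicts]).mean(dim=0)
        return avg

    # One simulated round with 3 clients holding placeholder (random) local data.
    clients = [(torch.randn(64, FEAT_DIM), torch.randint(0, NUM_CLASSES, (64,)))
               for _ in range(3)]
    global_gen, global_disc = CondGenerator(), CondDiscriminator()
    gen_states, disc_states = [], []
    for x, y in clients:
        gen, disc = copy.deepcopy(global_gen), copy.deepcopy(global_disc)
        local_gan_step(gen, disc, x, y)
        gen_states.append(gen.state_dict()); disc_states.append(disc.state_dict())
    global_gen.load_state_dict(fed_avg(gen_states))
    global_disc.load_state_dict(fed_avg(disc_states))

    # Each client then up-samples its minority classes with the shared generator.
    minority_labels = torch.full((32,), 3, dtype=torch.long)  # e.g. class 3 is locally scarce
    synthetic = global_gen(torch.randn(32, LATENT_DIM), minority_labels)

    In this sketch every client contributes to the generator, so a device can obtain synthetic samples for classes it holds few or no examples of; how many samples to generate per class and which devices to pair are the questions the paper's further designs address.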

    Keywords
    Federated learning, Class imbalance, Collaborative up-sampling, Generative adversarial networks
    Published
    2023-02-18
    Appears in
    SpringerLink
    http://dx.doi.org/10.1007/978-3-031-27041-3_15
