Collaborative Computing: Networking, Applications and Worksharing. 18th EAI International Conference, CollaborateCom 2022, Hangzhou, China, October 15-16, 2022, Proceedings, Part I

Research Article

FedCL: An Efficient Federated Unsupervised Learning for Model Sharing in IoT

Cite
@INPROCEEDINGS{10.1007/978-3-031-24383-7_7,
    author={Chen Zhao and Zhipeng Gao and Qian Wang and Zijia Mo and Xinlei Yu},
    title={FedCL: An Efficient Federated Unsupervised Learning for Model Sharing in IoT},
    proceedings={Collaborative Computing: Networking, Applications and Worksharing. 18th EAI International Conference, CollaborateCom 2022, Hangzhou, China, October 15-16, 2022, Proceedings, Part I},
    proceedings_a={COLLABORATECOM},
    year={2023},
    month={1},
    keywords={Federated learning; Internet of things; Self-supervised learning; Unsupervised learning},
    doi={10.1007/978-3-031-24383-7_7}
}
Chen Zhao1, Zhipeng Gao1,*, Qian Wang, Zijia Mo1, Xinlei Yu1
  • 1: State Key Laboratory of Networking and Switching Technology
*Contact email: gaozhipeng@bupt.edu.cn

Abstract

Federated Learning (FL) continues to make significant advances in enabling model sharing while preserving privacy. However, existing methods are of limited utility in Internet of Things (IoT) scenarios, as they either depend heavily on high-quality labeled data or perform well only under idealized conditions that are rarely found in practical applications. A natural problem, then, is how to leverage unlabeled data across multiple clients to optimize a shared model. To address this shortcoming, we propose Federated Contrastive Learning (FedCL), an efficient federated learning method for unsupervised image classification. FedCL proceeds in three steps: distributed federated pretraining of the local models using contrastive learning, supervised fine-tuning on a server with a small amount of labeled data, and distillation with unlabeled examples on each client to refine and transfer personalized knowledge. Extensive experiments show that our method outperforms all baseline methods by large margins, achieving 69.32% top-1 accuracy on CIFAR-10, 85.75% on SVHN, and 74.64% on Mini-ImageNet while using only 1% of the labels.
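The three-step pipeline described in the abstract can be sketched schematically. The toy example below is not the authors' implementation: the contrastive, fine-tuning, and distillation objectives are all least-squares stand-ins on linear models, and every function name, shape, and hyperparameter is an assumption chosen only to make the orchestration (client pretraining, FedAvg aggregation, server fine-tuning, client distillation) concrete and runnable.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, N_CLIENTS = 8, 4

def local_contrastive_step(w, x, lr=0.05):
    """Toy stand-in for contrastive pretraining: nudge the linear
    projection w so two noisy views of each sample embed similarly."""
    v1 = x + rng.normal(0, 0.01, x.shape)
    v2 = x + rng.normal(0, 0.01, x.shape)
    diff = (v1 - v2) @ w                      # embedding gap between views
    grad = (v1 - v2).T @ diff / len(x)        # gradient of ||(v1 - v2) w||^2
    return w - lr * grad

def fedavg(ws):
    """FedAvg-style aggregation: element-wise mean of client weights."""
    return np.mean(ws, axis=0)

def finetune(w, x, y, lr=0.05, steps=20):
    """Server-side supervised fine-tuning on a small labeled set
    (least-squares regression as a placeholder objective)."""
    for _ in range(steps):
        w = w - lr * x.T @ (x @ w - y) / len(x)
    return w

def distill(student, teacher, x, lr=0.05, steps=20):
    """Client-side distillation: fit the student to the (frozen)
    teacher's outputs on local unlabeled data."""
    for _ in range(steps):
        student = student - lr * x.T @ (x @ student - x @ teacher) / len(x)
    return student

# Step 1: distributed contrastive pretraining, then aggregation.
client_data = [rng.normal(size=(32, DIM)) for _ in range(N_CLIENTS)]
clients = [local_contrastive_step(rng.normal(size=(DIM, 2)), x)
           for x in client_data]
global_w = fedavg(clients)

# Step 2: fine-tune on the server with few labels (1% in the paper).
x_lab, y_lab = rng.normal(size=(8, DIM)), rng.normal(size=(8, 2))
global_w = finetune(global_w, x_lab, y_lab)

# Step 3: each client distills the fine-tuned model on its unlabeled data.
personalized = [distill(w, global_w, x) for w, x in zip(clients, client_data)]
print(len(personalized), personalized[0].shape)  # → 4 (8, 2)
```

Distillation pulls each client's weights toward the fine-tuned global model on the client's own data distribution, which is the mechanism the abstract credits for transferring personalized knowledge.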

Keywords
Federated learning, Internet of things, Self-supervised learning, Unsupervised learning
Published
2023-01-25
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-24383-7_7
Copyright © 2022–2025 ICST
