Quality, Reliability, Security and Robustness in Heterogeneous Systems. 19th EAI International Conference, QShine 2023, Shenzhen, China, October 8 – 9, 2023, Proceedings, Part I

Research Article

Trusted Personalized Federated Learning Based on Differential Privacy

Cite (BibTeX)
@INPROCEEDINGS{10.1007/978-3-031-65126-7_28,
    author={Ruixin Liu and Zhenquan Qin and Xi Cheng and Rui Zhang and Jianbo Zheng},
    title={Trusted Personalized Federated Learning Based on Differential Privacy},
    proceedings={Quality, Reliability, Security and Robustness in Heterogeneous Systems. 19th EAI International Conference, QShine 2023, Shenzhen, China, October 8 -- 9, 2023, Proceedings, Part I},
    proceedings_a={QSHINE},
    year={2024},
    month={8},
    keywords={Federated learning, Personalized federated learning, Differential privacy},
    doi={10.1007/978-3-031-65126-7_28}
}
Ruixin Liu1, Zhenquan Qin1, Xi Cheng1, Rui Zhang2, Jianbo Zheng2,*
  • 1: Dalian University of Technology, Dalian
  • 2: Artificial Intelligence Research Institute, Shenzhen MSU-BIT University, Shenzhen
*Contact email: jianbo.zheng@smbu.edu.cn

Abstract

As an emerging machine learning paradigm, federated learning keeps user data stored locally while breaking down data silos, thereby protecting the privacy of training data. However, in practical applications, the training data across different user terminals are often non-independent and identically distributed (non-IID). Moreover, federated learning alone does not provide strict privacy protection: attackers can infer information about the training data from the model parameters or gradients that are uploaded or downloaded, so the transmission process carries a significant risk of privacy leakage. We therefore design a personalized federated learning scheme with privacy protection capability. Since non-IID data have a negative impact on model training, we obtain a personalized model for each client by combining transfer learning and knowledge distillation. We then propose a personalized federated learning algorithm with privacy protection based on the Gaussian mechanism in differential privacy and compare it with the traditional Laplace mechanism. Experimental results demonstrate that our proposed method achieves better accuracy and stronger privacy protection.
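The abstract contrasts the Gaussian mechanism, which gives (ε, δ)-differential privacy, with the Laplace mechanism, which gives pure ε-differential privacy. As an illustrative sketch only (not the authors' implementation; the function names, the clipping strategy, and the parameter choices are assumptions), a client's model update could be clipped and noised before upload roughly as follows:

```python
import numpy as np


def gaussian_sigma(sensitivity: float, epsilon: float, delta: float) -> float:
    """Standard noise calibration for the (epsilon, delta)-DP Gaussian mechanism."""
    return sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon


def privatize_update(update, sensitivity, epsilon, delta=1e-5,
                     mechanism="gaussian", rng=None):
    """Clip a client update to bounded norm, then add calibrated noise.

    Note: the Gaussian mechanism is calibrated to L2 sensitivity, while the
    Laplace mechanism is properly calibrated to L1 sensitivity; this sketch
    clips in the corresponding norm for each case.
    """
    rng = np.random.default_rng() if rng is None else rng
    update = np.asarray(update, dtype=float)

    if mechanism == "gaussian":
        # Bound L2 sensitivity by norm clipping, then add Gaussian noise.
        norm = np.linalg.norm(update)
        clipped = update * min(1.0, sensitivity / max(norm, 1e-12))
        noise = rng.normal(0.0, gaussian_sigma(sensitivity, epsilon, delta),
                           size=update.shape)
    else:
        # Laplace mechanism: bound L1 sensitivity, scale = sensitivity / epsilon.
        norm = np.abs(update).sum()
        clipped = update * min(1.0, sensitivity / max(norm, 1e-12))
        noise = rng.laplace(0.0, sensitivity / epsilon, size=update.shape)

    return clipped + noise
```

For a fixed privacy budget, the Gaussian mechanism typically injects less heavy-tailed noise than Laplace at the cost of the small failure probability δ, which is one plausible reason it can yield better model accuracy in this setting.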

Keywords
Federated learning, Personalized federated learning, Differential privacy
Published
2024-08-20
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-65126-7_28