Mobile Multimedia Communications. 16th EAI International Conference, MobiMedia 2023, Guilin, China, July 22-24, 2023, Proceedings

Research Article

Why Not Model Privacy?: An Efficient and Practical Scheme for Federated Learning Model Security

  • @INPROCEEDINGS{10.1007/978-3-031-60347-1_33,
        author={Wang Shuai and Renwan Bi and Youliang Tian and Jinbo Xiong},
        title={Why Not Model Privacy?: An Efficient and Practical Scheme for Federated Learning Model Security},
        proceedings={Mobile Multimedia Communications. 16th EAI International Conference, MobiMedia 2023, Guilin, China, July 22-24, 2023, Proceedings},
        proceedings_a={MOBIMEDIA},
        year={2024},
        month={10},
      keywords={privacy-preserving federated learning, homomorphic encryption, model security},
        doi={10.1007/978-3-031-60347-1_33}
    }
    
Wang Shuai1, Renwan Bi2, Youliang Tian1,*, Jinbo Xiong2
  • 1: The State Key Laboratory of Public Big Data, College of Computer Science and Technology, Guizhou University
  • 2: The Fujian Provincial Key Laboratory of Network Security and Cryptology, College of Computer and Cyber Security, Fujian Normal University
*Contact email: youliangtian@163.com

Abstract

Privacy-Preserving Federated Learning (PPFL), a new paradigm for secure and efficient federated learning, offers secure aggregation without loss of accuracy. However, PPFL focuses on protecting the privacy of the training data, while the privacy of the model itself is equally important: when federated learning is deployed in real-world scenarios, sharing models among institutions increases the risk of model privacy leakage and economic loss. To address these issues, we design PriM, a model privacy-preserving scheme based on fully homomorphic encryption. Specifically, we adopt the CKKS fully homomorphic encryption scheme to guarantee the security of the model; because CKKS supports both floating-point and vector (packed) encryption, it avoids the significant overhead of encrypting values one at a time, as in prior homomorphic-encryption-based PPFL schemes. We then keep the model computation confidential, so that no model information is revealed to the participating entities. Finally, experiments show that our scheme matches the efficiency of the baseline algorithm FedAvg while being more practical.
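The aggregation step the abstract refers to can be illustrated with plain FedAvg, the baseline the scheme is compared against. The sketch below is illustrative and not taken from the paper: it computes the weighted average of client model updates in plaintext, where in PriM the update vectors would instead be CKKS ciphertexts, so the same sums run homomorphically and the aggregator never sees the plaintext model.

```python
def fedavg(client_updates, client_sizes):
    """Weighted average of client model updates (plaintext FedAvg).

    client_updates: list of per-client parameter vectors (lists of floats).
    client_sizes:   number of local training samples per client, used as
                    the aggregation weight.

    In a CKKS-based scheme such as PriM, each update vector would be a
    packed ciphertext, and the scaled additions below would be performed
    on ciphertexts without decryption.
    """
    total = sum(client_sizes)
    dim = len(client_updates[0])
    aggregate = [0.0] * dim
    for update, n in zip(client_updates, client_sizes):
        weight = n / total
        for i, v in enumerate(update):
            aggregate[i] += weight * v
    return aggregate

# Two clients with 2-parameter models, holding 100 and 300 samples.
global_update = fedavg([[1.0, 2.0], [5.0, 6.0]], [100, 300])
# → [4.0, 5.0]
```

Note that packed (vector) encryption matters here: encoding a whole parameter vector into one ciphertext means the per-round cost is a handful of ciphertext operations, rather than one encryption and one homomorphic addition per scalar parameter.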

Keywords
privacy-preserving federated learning, homomorphic encryption, model security
Published
2024-10-25
Appears in
SpringerLink
http://dx.doi.org/10.1007/978-3-031-60347-1_33
Copyright © 2023–2025 ICST