
Research Article
FL-DP: Differential Private Federated Neural Network
@INPROCEEDINGS{10.1007/978-3-030-96791-8_20,
  author={Muhammad Maaz Irfan and Lin Wang and Sheraz Ali and Shan Jing and Chuan Zhao},
  title={FL-DP: Differential Private Federated Neural Network},
  proceedings={Security and Privacy in New Computing Environments. 4th EAI International Conference, SPNCE 2021, Virtual Event, December 10-11, 2021, Proceedings},
  proceedings_a={SPNCE},
  year={2022},
  month={3},
  keywords={Federated learning; Data integrity; Privacy-preserving; Differential privacy},
  doi={10.1007/978-3-030-96791-8_20}
}
- Muhammad Maaz Irfan
- Lin Wang
- Sheraz Ali
- Shan Jing
- Chuan Zhao
Year: 2022
FL-DP: Differential Private Federated Neural Network
SPNCE
Springer
DOI: 10.1007/978-3-030-96791-8_20
Abstract
The rapid development of the Internet and machine learning has brought convenience and comfort to users' lives. However, the large amounts of data used to train machine learning models, combined with various attacks and leaks of sensitive data, have made personal privacy a growing concern. In the era of big data, anyone's information can be stolen, which leaves many people uneasy. We propose a new approach called FL-DP (Federated Learning Based on Differential Privacy). Built on differential privacy, this approach effectively restricts an adversary's access to the client model and thereby limits data leakage. In this framework, we use the Laplace mechanism of differential privacy, which ensures that all operations, including the server-side aggregation process, are secure and leak no information about the training data. We also consider adding noise at multiple stages of preprocessing so that client-side data remains secure during training. Our approach not only provides client-level privacy but also balances efficiency and privacy. Our evaluation shows that the approach is highly scalable and can be applied to most machine-learning-based applications.
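To make the idea concrete, the following minimal Python sketch illustrates the general pattern the abstract describes: each client clips its model update, perturbs it with Laplace noise calibrated to the clipping bound, and the server averages the noised updates. This is our own illustrative sketch, not code from the paper; the names laplace_perturb, clip_norm, epsilon, and aggregate, and the choice of L1 clipping, are assumptions made for illustration.

```python
# Illustrative sketch (not the authors' implementation) of Laplace-noised
# client updates followed by server-side averaging.
import numpy as np

def laplace_perturb(update, clip_norm=1.0, epsilon=1.0):
    """Clip a client's update and add Laplace noise (assumed L1 sensitivity = clip_norm)."""
    # Bound the L1 norm of the update so the sensitivity of the release is fixed.
    norm = np.sum(np.abs(update))
    if norm > clip_norm:
        update = update * (clip_norm / norm)
    # Laplace mechanism: noise scale b = sensitivity / epsilon.
    scale = clip_norm / epsilon
    noise = np.random.laplace(loc=0.0, scale=scale, size=update.shape)
    return update + noise

def aggregate(client_updates):
    """Server-side averaging of the already-noised client updates."""
    return np.mean(np.stack(client_updates), axis=0)

# Usage: three clients each send a noised update; the server averages them.
client_updates = [np.random.randn(10) * 0.1 for _ in range(3)]
noised = [laplace_perturb(u, clip_norm=1.0, epsilon=0.5) for u in client_updates]
global_step = aggregate(noised)
```

Because noise is added on the client side before transmission, the server only ever sees perturbed updates, which matches the abstract's claim that the aggregation step leaks no information about the raw training data.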