Research Article
Exploring the Privacy Bound for Differential Privacy: From Theory to Practice
@ARTICLE{10.4108/eai.8-4-2019.157414, author={Xianmang He and Yuan Hong and Yindong Chen}, title={Exploring the Privacy Bound for Differential Privacy: From Theory to Practice}, journal={EAI Endorsed Transactions on Security and Safety}, volume={5}, number={18}, publisher={EAI}, journal_a={SESA}, year={2019}, month={1}, keywords={Differential Privacy, Inference, Privacy Bound}, doi={10.4108/eai.8-4-2019.157414} }
Authors: Xianmang He, Yuan Hong, Yindong Chen
Year: 2019
Journal: EAI Endorsed Transactions on Security and Safety (SESA)
Publisher: EAI
DOI: 10.4108/eai.8-4-2019.157414
Abstract
Data privacy has attracted significant interest in both the database theory and security communities over the past few decades. Differential privacy has emerged as a new paradigm for rigorous privacy protection regardless of the adversary's prior knowledge. However, the meaning of the privacy bound ε, and how to select an appropriate ε, may still be unclear to general data owners. More recently, some approaches have been proposed to derive upper bounds on ε for specified privacy risks. Unfortunately, these upper bounds suffer from some deficiencies (e.g., the bound depends on the data size, or may be too large), which greatly limits their applicability. To remedy this problem, we propose a novel approach that converts the privacy bound ε of differential privacy into privacy risks understandable to generic users, and we present an in-depth theoretical analysis of it. Finally, we have conducted experiments to demonstrate the effectiveness of our model.
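For intuition, the sketch below shows one standard way to translate ε into an inference risk a data owner can read directly. It is the usual Bayesian reading of ε-differential privacy for a binary membership hypothesis, not necessarily the specific bound derived in this paper; the prior p and the notation M, D, D' are assumptions of the sketch.

% Illustrative sketch (assumed notation): under \varepsilon-differential privacy,
% a mechanism M on neighboring databases D (record present) and D' (record absent)
% satisfies, for every output o,
%   \Pr[M(D)=o] \le e^{\varepsilon}\,\Pr[M(D')=o].
% For an adversary with prior belief p that the record is present, Bayes' rule
% bounds the posterior after observing o:
\[
  \Pr[\text{present} \mid o]
  \;=\;
  \frac{p\,\Pr[M(D)=o]}{p\,\Pr[M(D)=o] + (1-p)\,\Pr[M(D')=o]}
  \;\le\;
  \frac{p\,e^{\varepsilon}}{p\,e^{\varepsilon} + (1-p)}.
\]
% Example: with p = 0.5 and \varepsilon = 1, the posterior is at most
% e/(e+1) \approx 0.731, i.e., a concrete privacy risk expressed as a probability.

A conversion of this form is what makes ε interpretable: instead of an abstract parameter, the data owner sees the maximum degree to which an adversary's belief about any individual's presence can shift.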
Copyright © 2019 Xianmang He et al., licensed to EAI. This is an open access article distributed under the terms of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/), which permits unlimited use, distribution and reproduction in any medium so long as the original work is properly cited.