
Title: A Programming Framework for Differential Privacy with Accuracy Concentration Bounds
Publication Type: Conference Paper
Year of Publication: 2020
Authors: Lobo-Vesga, E., Russo, A., Gaboardi, M.
Conference Name: 2020 IEEE Symposium on Security and Privacy (SP)
Date Published: May 2020
Keywords: Accuracy, accuracy concentration, Cognition, composability, concentration bounds, data analyses results, Data analysis, data privacy, Databases, Differential privacy, Functional programming, Haskell, Human Behavior, privacy, private data analyses, Programming, programming differentially private analyses, programming languages, pubcrawl, reasoning, Resiliency, Scalability, Tools
Abstract: Differential privacy offers a formal framework for reasoning about the privacy and accuracy of computations on private data, together with a rich set of building blocks for constructing private data analyses. When carefully calibrated, these analyses simultaneously guarantee the privacy of the individuals contributing their data and the accuracy of the results, allowing useful properties about the population to be inferred. The compositional nature of differential privacy has motivated the design and implementation of several programming languages that help data analysts program differentially private analyses. However, most of the programming languages for differential privacy proposed so far support reasoning about privacy but not about the accuracy of data analyses. To overcome this limitation, in this work we present DPella, a programming framework that gives data analysts support for reasoning about privacy, accuracy, and their trade-offs. The distinguishing feature of DPella is a novel component that statically tracks the accuracy of different data analyses. To obtain tighter accuracy estimates, this component leverages taint analysis to automatically infer statistical independence of the different noise quantities added to guarantee privacy. We evaluate our approach by implementing several classical queries from the literature and showing how data analysts can determine the best way to calibrate privacy to meet accuracy requirements.
DOI: 10.1109/SP40000.2020.00086
Citation Key: lobo-vesga_programming_2020
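The abstract's central idea, deriving accuracy concentration bounds for noisy analyses, can be illustrated with a small sketch. This is not DPella's API (DPella is a Haskell framework); it is a minimal Python illustration, under the standard Laplace-mechanism setting, of the kind of tail bound an accuracy tracker computes: a per-query error bound P(|noise| > alpha) <= beta, and a simple union-bound composition over several queries, which is sound but loose precisely because it ignores the statistical independence that DPella's taint analysis is designed to exploit.

```python
import math
import random

def sample_laplace(scale: float, rng: random.Random) -> float:
    """Draw Laplace(0, scale) noise via inverse-CDF sampling."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def laplace_alpha(scale: float, beta: float) -> float:
    """Smallest alpha with P(|Lap(scale)| > alpha) <= beta.
    From the Laplace tail: P(|X| > alpha) = exp(-alpha / scale)."""
    return scale * math.log(1.0 / beta)

def union_bound_alpha(scales, beta: float) -> float:
    """Accuracy bound for a sum of k noisy terms that ignores
    independence: split the failure probability beta across terms."""
    k = len(scales)
    return sum(laplace_alpha(s, beta / k) for s in scales)

rng = random.Random(0)
eps, beta = 0.5, 0.05
scale = 1.0 / eps  # Laplace scale for an eps-DP, sensitivity-1 count

# Single query: the tail bound is tight, so roughly a beta fraction
# of samples exceed alpha.
alpha = laplace_alpha(scale, beta)
n = 20_000
exceed = sum(abs(sample_laplace(scale, rng)) > alpha for _ in range(n)) / n
print(f"single query: P(error > {alpha:.2f}) ~= {exceed:.3f} (target {beta})")

# Four queries summed: the union bound still holds, but empirically the
# error exceeds it far less often than beta -- exploiting the terms'
# independence (as DPella does) would justify a much tighter alpha.
scales = [scale] * 4
alpha_k = union_bound_alpha(scales, beta)
exceed_k = sum(
    abs(sum(sample_laplace(s, rng) for s in scales)) > alpha_k
    for _ in range(n)
) / n
print(f"4-query sum: P(error > {alpha_k:.2f}) ~= {exceed_k:.3f} (<= {beta})")
```

The gap between the loose union bound and the observed error of the summed queries is exactly the slack that an independence-aware analysis can reclaim.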