CPL - Chalmers Publication Library

Differential privacy for Bayesian inference through posterior sampling

Christos Dimitrakakis (Department of Computer Science and Engineering, Computer Science, Algorithms (Chalmers)); Blaine E. N. Nelson; Zuhe Zhang; Aikaterini Mitrokotsa (Department of Computer Science and Engineering, Networks and Systems (Chalmers)); Benjamin I. P. Rubinstein
Journal of Machine Learning Research (ISSN 1532-4435). Vol. 18 (2017), 1 March 2017.
[Article, peer-reviewed scientific]

Differential privacy formalises privacy-preserving mechanisms that provide access to a database. Can Bayesian inference be used directly to provide private access to data? The answer is yes: under certain conditions on the prior, sampling from the posterior distribution can lead to a desired level of privacy and utility. For a uniform treatment, we define differential privacy over arbitrary data set metrics, outcome spaces and distribution families. This allows us to also deal with non-i.i.d. or non-tabular data sets. We then prove bounds on the sensitivity of the posterior to the data, which delivers a measure of robustness. We also show how to use posterior sampling to provide differentially private responses to queries, within a decision-theoretic framework. Finally, we provide bounds on the utility of answers to queries and on the ability of an adversary to distinguish between data sets. The latter are complemented by a novel use of Le Cam's method to obtain lower bounds on distinguishability. Our results hold for arbitrary metrics, including those for the common definition of differential privacy. For specific choices of the metric, we give a number of examples satisfying our assumptions. © 2017 C. Dimitrakakis, B. Nelson, Z. Zhang, A. Mitrokotsa, B. I. P. Rubinstein.
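The abstract's central idea, releasing a single draw from the posterior as the private response to a query, can be illustrated with a minimal sketch. This is not the paper's construction or its privacy analysis; the Beta-Bernoulli model, the prior parameters and the function name below are assumptions chosen only to show the mechanism's shape.

```python
import numpy as np

def posterior_sample_release(data, alpha=1.0, beta=1.0, rng=None):
    """Release one draw from the Beta posterior of a Bernoulli parameter.

    Illustrative only: the paper shows that, under suitable conditions on
    the prior, such posterior draws satisfy a (generalised) differential
    privacy guarantee; the concrete Beta(alpha, beta) prior here is an
    assumption for the sketch.
    """
    rng = np.random.default_rng() if rng is None else rng
    successes = int(np.sum(data))
    failures = len(data) - successes
    # The single posterior sample is the private answer to the query.
    return rng.beta(alpha + successes, beta + failures)

# Hypothetical usage: two neighbouring data sets differing in one record.
x = np.array([1, 0, 1, 1, 0, 1])
y = np.array([1, 0, 1, 1, 0, 0])
print(posterior_sample_release(x), posterior_sample_release(y))
```

Because the response is a random draw rather than the posterior itself, similar data sets induce similar response distributions, which is the property the paper quantifies via sensitivity bounds over arbitrary data set metrics.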

Keywords (engineering controlled terms): Bayesian networks; Data privacy; Bayesian inference; Decision-theoretic; Differential privacies; Distinguishability; Lower bounds; Posterior distributions; Privacy preserving; Tabular data. Engineering main heading: Inference engines.



This record was created 2017-04-19. Last modified 2017-06-14.
CPL Pubid: 248881