Poster
Differential Privacy Guarantees of Markov Chain Monte Carlo Algorithms
Andrea Bertazzi · Tim Johnston · Gareth Roberts · Alain Oliviero Durmus
East Exhibition Hall A-B #E-1300
Differential privacy is a framework for certifying that a statistical procedure is not too sensitive to individual components of the data. Specifically, it considers statistics that are randomised in such a way that changing a single entry in the data set changes the distribution of the (random) statistic only by a suitably small amount. One can then prove mathematically that the (random) statistic does not reveal too much information about any one data point.

In this paper we investigate the differential privacy of a class of algorithms widely used in statistics and optimisation: Markov chain Monte Carlo (MCMC) methods for sampling from Bayesian posterior distributions, which are often used to derive additional information from data given a certain amount of prior information. We first show that the idealised procedure (sampling exactly from the true posterior) and its implementable approximation (the MCMC method) must broadly agree in their differential privacy whenever the numerical approximation is sufficiently accurate. We then show that certain implementable approximations (MCMC methods) are differentially private under weaker (non-convex) technical assumptions than previously considered in the literature.
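For concreteness, the standard (epsilon, delta)-differential-privacy condition sketched informally above can be written as follows; the notation here is supplied for illustration and is not taken from the abstract itself.

```latex
% A randomised mechanism M is (\varepsilon, \delta)-differentially private if,
% for all data sets D, D' differing in a single entry and all measurable
% output sets S,
\[
  \mathbb{P}\bigl(M(D) \in S\bigr)
    \;\le\; e^{\varepsilon}\, \mathbb{P}\bigl(M(D') \in S\bigr) + \delta .
\]
% Smaller \varepsilon and \delta mean the output distribution reveals less
% about any single data point.
```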
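As an illustration of the class of implementable approximations discussed, below is a minimal sketch of one widely used MCMC scheme, the unadjusted Langevin algorithm. The abstract does not single out a specific method, so the function grad_log_posterior, the step size, and the iteration count are all placeholder assumptions.

```python
import numpy as np

def ula_sample(grad_log_posterior, theta0, step_size=1e-2, n_steps=1000, rng=None):
    """Unadjusted Langevin algorithm (illustrative sketch only):
    theta_{k+1} = theta_k + h * grad log pi(theta_k) + sqrt(2h) * xi_k,
    with xi_k standard Gaussian noise."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(theta.shape)
        theta = theta + step_size * grad_log_posterior(theta) \
                + np.sqrt(2.0 * step_size) * noise
    # The injected Gaussian noise is the source of the randomisation that
    # differential-privacy analyses of such samplers exploit.
    return theta  # a single (approximate) posterior draw

# Toy usage: standard Gaussian posterior, so grad log pi(theta) = -theta.
draw = ula_sample(lambda th: -th, theta0=np.zeros(2))
```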