The composition theorem for differential privacy

Peter Kairouz, Sewoong Oh, Pramod Viswanath

Research output: Chapter in Book/Report/Conference proceeding, Conference contribution

222 Scopus citations

Abstract

Sequential querying of differentially private mechanisms degrades the overall privacy level. In this paper, we answer the fundamental question of characterizing the level of overall privacy degradation as a function of the number of queries and the privacy levels maintained by each privatization mechanism. Our solution is complete: we prove an upper bound on the overall privacy level and construct a sequence of privatization mechanisms that achieves this bound. The key innovation is the introduction of an operational interpretation of differential privacy (involving hypothesis testing) and the use of new data processing inequalities. Our result improves over the state-of-the-art and has immediate applications to several problems studied in the literature.
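
For context, a hedged sketch of the standard bounds this result is measured against (the well-known basic and advanced composition theorems, not the paper's exact optimal characterization): composing $k$ mechanisms that are each $(\epsilon, \delta)$-differentially private yields $(k\epsilon, k\delta)$-differential privacy by basic composition, while advanced composition gives, for any $\delta' > 0$, an overall guarantee of

\[
  \tilde{\epsilon} \;=\; \sqrt{2k \ln(1/\delta')}\,\epsilon \;+\; k\,\epsilon\,\bigl(e^{\epsilon} - 1\bigr),
  \qquad
  \tilde{\delta} \;=\; k\delta + \delta'.
\]

The contribution of this paper is to replace such bounds with the exact privacy level of the composed mechanism and to construct mechanisms that attain it.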

Original language: English (US)
Title of host publication: 32nd International Conference on Machine Learning, ICML 2015
Editors: David Blei, Francis Bach
Publisher: International Machine Learning Society (IMLS)
Pages: 1376-1385
Number of pages: 10
ISBN (Electronic): 9781510810587
State: Published - 2015
Externally published: Yes
Event: 32nd International Conference on Machine Learning, ICML 2015 - Lille, France
Duration: Jul 6 2015 - Jul 11 2015

Publication series

Name: 32nd International Conference on Machine Learning, ICML 2015
Volume: 2

Other

Other: 32nd International Conference on Machine Learning, ICML 2015
Country/Territory: France
City: Lille
Period: 7/6/15 - 7/11/15

All Science Journal Classification (ASJC) codes

  • Human-Computer Interaction
  • Computer Science Applications
