The composition theorem for differential privacy

Peter Kairouz, Sewoong Oh, Pramod Viswanath

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

194 Scopus citations


Sequential querying of differentially private mechanisms degrades the overall privacy level. In this paper, we answer the fundamental question of characterizing the level of overall privacy degradation as a function of the number of queries and the privacy levels maintained by each privatization mechanism. Our solution is complete: we prove an upper bound on the overall privacy level and construct a sequence of privatization mechanisms that achieves this bound. The key innovation is the introduction of an operational interpretation of differential privacy (involving hypothesis testing) and the use of new data processing inequalities. Our result improves over the state-of-the-art and has immediate applications to several problems studied in the literature.
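To make the degradation concrete, the two standard composition bounds that the paper improves upon can be computed directly. The sketch below (an illustration of prior bounds, not the paper's optimal composition theorem) implements basic composition, under which k-fold composition of an (ε, δ)-differentially private mechanism is (kε, kδ)-DP, and the advanced composition bound of Dwork, Rothblum, and Vadhan:

```python
import math

def basic_composition(eps, delta, k):
    """Basic composition: k adaptive runs of an (eps, delta)-DP
    mechanism are (k*eps, k*delta)-DP."""
    return k * eps, k * delta

def advanced_composition(eps, delta, k, delta_prime):
    """Advanced composition (Dwork-Rothblum-Vadhan): for any
    delta_prime > 0, k runs are (eps_prime, k*delta + delta_prime)-DP
    with eps_prime = eps*sqrt(2k ln(1/delta_prime)) + k*eps*(e^eps - 1)."""
    eps_prime = (eps * math.sqrt(2 * k * math.log(1 / delta_prime))
                 + k * eps * math.expm1(eps))
    return eps_prime, k * delta + delta_prime

# For small eps and many queries, the advanced bound's sqrt(k) scaling
# beats basic composition's linear scaling:
b_eps, _ = basic_composition(0.1, 1e-6, 50)       # eps grows as k*eps = 5.0
a_eps, _ = advanced_composition(0.1, 1e-6, 50, 1e-5)  # roughly 3.9 here
```

The result of this paper tightens the advanced bound further, characterizing the exact privacy level achievable under composition.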

Original language: English (US)
Title of host publication: 32nd International Conference on Machine Learning, ICML 2015
Editors: David Blei, Francis Bach
Publisher: International Machine Learning Society (IMLS)
Number of pages: 10
ISBN (Electronic): 9781510810587
State: Published - 2015
Externally published: Yes
Event: 32nd International Conference on Machine Learning, ICML 2015 - Lille, France
Duration: Jul 6 2015 to Jul 11 2015

Publication series

Name: 32nd International Conference on Machine Learning, ICML 2015


All Science Journal Classification (ASJC) codes

  • Human-Computer Interaction
  • Computer Science Applications


