Dependent differential privacy for correlated data

Jun Zhao, Junshan Zhang, H. Vincent Poor

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Many organizations maintain large collections of personal information, and sharing such data may endanger the privacy of the users whose records are shared. To rigorously evaluate the privacy and utility of data publishing, the notion of differential privacy (DP) has received much attention. However, DP has been shown to work poorly for databases with tuple correlations, which arise from behavioral, social, and genetic relationships between users: DP masks only the presence of the records received from each user, not the statistical trends that may reveal information about a user. This gap degrades the privacy users can expect in practice, since an adversary can combine a query response with knowledge of the tuple correlation to learn about a user's data. To extend differential privacy to correlated data, prior work has investigated various privacy metrics. These metrics, however, either assume specific background knowledge on the part of the adversary or lack effective and general mechanisms for achieving them. To overcome these limitations, this paper formalizes the notion of dependent differential privacy (DDP), which guarantees that, under any tuple correlation, almost no sensitive information about any user is leaked by answering a query. The DDP guarantee applies to arbitrary data correlations and is independent of the adversary's knowledge. It is shown that DDP can be quantitatively deduced from DP with a stronger privacy parameter, where the gap between the privacy parameters of DDP and DP depends on the correlation between data tuples. Further, various mechanisms for achieving DDP are presented. These mechanisms are computationally efficient and achieve higher utility than mechanisms introduced in prior work to address tuple correlations. As a representative example, for data correlations modeled by an n-tuple Markov chain and a query with constant global sensitivity, the noise added by the Laplace mechanism proposed here does not scale with n, whereas the noise added by the state-of-the-art mechanism of Liu et al. scales linearly with n, so the proposed mechanism achieves higher utility.
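To make the reduction described in the abstract concrete, the following minimal Python sketch shows how a DDP guarantee could be obtained by running a standard epsilon-DP Laplace mechanism with a stronger (smaller) privacy parameter. The function names and the multiplicative correlation_factor are hypothetical illustrations, not the authors' exact construction; the abstract's claim that the noise does not scale with n for Markov-chain correlations is modeled here as the assumption that correlation_factor stays bounded as n grows.

    import numpy as np

    # Standard epsilon-DP Laplace mechanism: perturb the true query answer
    # with Laplace noise of scale (global sensitivity / epsilon).
    def laplace_mechanism(true_answer, global_sensitivity, epsilon):
        scale = global_sensitivity / epsilon
        return true_answer + np.random.laplace(loc=0.0, scale=scale)

    # Hypothetical DDP wrapper (illustrative assumption, not the paper's
    # exact mechanism): achieve an epsilon_ddp-DDP guarantee by running the
    # Laplace mechanism with the tightened DP parameter
    # epsilon_ddp / correlation_factor, mirroring the paper's reduction of
    # DDP to DP with a stronger privacy parameter. correlation_factor >= 1
    # encodes the strength of tuple correlation; per the abstract, for an
    # n-tuple Markov chain the required tightening does not grow with n,
    # so neither does the noise scale.
    def ddp_laplace(true_answer, global_sensitivity, epsilon_ddp,
                    correlation_factor):
        epsilon_dp = epsilon_ddp / correlation_factor
        return laplace_mechanism(true_answer, global_sensitivity, epsilon_dp)

    # Usage: a counting query (global sensitivity 1) over correlated tuples.
    print(ddp_laplace(true_answer=42.0, global_sensitivity=1.0,
                      epsilon_ddp=1.0, correlation_factor=2.0))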

Original language: English (US)
Title of host publication: 2017 IEEE Globecom Workshops, GC Wkshps 2017 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1-7
Number of pages: 7
ISBN (Electronic): 9781538639207
DOIs
State: Published - Jul 2 2017
Event: 2017 IEEE Global Telecommunications Conference, GC 2017 - Singapore, Singapore
Duration: Dec 4 2017 - Dec 8 2017

Publication series

Name: 2017 IEEE Globecom Workshops, GC Wkshps 2017 - Proceedings
Volume: 2018-January

Other

Other: 2017 IEEE Global Telecommunications Conference, GC 2017
Country/Territory: Singapore
City: Singapore
Period: 12/4/17 - 12/8/17

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Computer Science Applications
  • Hardware and Architecture
  • Safety, Risk, Reliability and Quality

Keywords

  • Data correlation
  • Differential privacy
  • Laplace mechanisms
  • Markov chains
