TY - GEN
T1 - Differential Privacy Under Multiple Selections
AU - Goel, Ashish
AU - Jiang, Zhihao
AU - Korolova, Aleksandra
AU - Munagala, Kamesh
AU - Sarmasarkar, Sahasrajit
N1 - Publisher Copyright:
© Ashish Goel, Zhihao Jiang, Aleksandra Korolova, Kamesh Munagala, and Sahasrajit Sarmasarkar.
PY - 2025/6/3
Y1 - 2025/6/3
N2 - We consider the setting where a user with sensitive features wishes to obtain a recommendation from a server in a differentially private fashion. We propose a “multi-selection” architecture in which the server can send back multiple recommendations and the user chooses the one among these that best matches their private features. When the user feature is one-dimensional – on an infinite line – and the accuracy measure is defined w.r.t. some increasing function h(.) of the distance on the line, we precisely characterize the optimal mechanism that satisfies differential privacy. The specification of the optimal mechanism includes both the distribution of the noise that the user adds to their private value and the algorithm used by the server to determine the set of results to send back as a response. We show that the Laplace distribution is an optimal noise distribution in this setting. Furthermore, we show that this optimal mechanism results in an error that is inversely proportional to the number of results returned when the function h(.) is the identity function.
AB - We consider the setting where a user with sensitive features wishes to obtain a recommendation from a server in a differentially private fashion. We propose a “multi-selection” architecture in which the server can send back multiple recommendations and the user chooses the one among these that best matches their private features. When the user feature is one-dimensional – on an infinite line – and the accuracy measure is defined w.r.t. some increasing function h(.) of the distance on the line, we precisely characterize the optimal mechanism that satisfies differential privacy. The specification of the optimal mechanism includes both the distribution of the noise that the user adds to their private value and the algorithm used by the server to determine the set of results to send back as a response. We show that the Laplace distribution is an optimal noise distribution in this setting. Furthermore, we show that this optimal mechanism results in an error that is inversely proportional to the number of results returned when the function h(.) is the identity function.
KW - Differential Privacy
KW - Mechanism Design
KW - Multi-Selection
UR - https://www.scopus.com/pages/publications/105007976294
UR - https://www.scopus.com/inward/citedby.url?scp=105007976294&partnerID=8YFLogxK
U2 - 10.4230/LIPIcs.FORC.2025.8
DO - 10.4230/LIPIcs.FORC.2025.8
M3 - Conference contribution
AN - SCOPUS:105007976294
T3 - Leibniz International Proceedings in Informatics, LIPIcs
BT - 6th Symposium on Foundations of Responsible Computing, FORC 2025
A2 - Bun, Mark
PB - Schloss Dagstuhl - Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing
T2 - 6th Symposium on Foundations of Responsible Computing, FORC 2025
Y2 - 4 June 2025 through 6 June 2025
ER -