With the recent surge of social networks such as Facebook, new forms of recommendations have become possible: recommendations that rely on one's social connections in order to make personalized recommendations of ads, content, products, and people. Since recommendations may use sensitive information, it is speculated that these recommendations are associated with privacy risks. The main contribution of this work is in formalizing trade-offs between accuracy and privacy of personalized social recommendations. We study whether "social recommendations", or recommendations that are solely based on a user's social network, can be made without disclosing sensitive links in the social graph. More precisely, we quantify the loss in utility when existing recommendation algorithms are modified to satisfy a strong notion of privacy, called differential privacy. We prove lower bounds on the minimum loss in utility for any recommendation algorithm that is differentially private. We then adapt two privacy preserving algorithms from the differential privacy literature to the problem of social recommendations, and analyze their performance in comparison to our lower bounds, both analytically and experimentally. We show that good private social recommendations are feasible only for a small subset of the users in the social network or for a lenient setting of privacy parameters.
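To give a concrete sense of the kind of algorithm being adapted: a standard building block from the differential privacy literature is the exponential mechanism, which selects a recommendation with probability that grows with its utility score while bounding how much any single edge in the social graph can shift the output distribution. The sketch below is illustrative only; the graph, the common-friends utility, and all names are hypothetical and are not taken from the paper's actual constructions.

```python
import math
import random

def exponential_mechanism(candidates, utility, epsilon, sensitivity=1.0):
    """Differentially private selection: sample a candidate with
    probability proportional to exp(epsilon * utility / (2 * sensitivity))."""
    scores = [utility(c) for c in candidates]
    max_s = max(scores)  # shift scores for numerical stability; cancels on normalization
    weights = [math.exp(epsilon * (s - max_s) / (2.0 * sensitivity)) for s in scores]
    return random.choices(candidates, weights=weights, k=1)[0]

# Hypothetical toy social graph (adjacency sets). Utility of recommending v
# to user "u" is the number of common friends, a common proxy score.
graph = {
    "u": {"a", "b"},
    "a": {"u", "c"},
    "b": {"u", "c"},
    "c": {"a", "b", "d"},
    "d": {"c"},
}
candidates = [v for v in graph if v != "u" and v not in graph["u"]]
rec = exponential_mechanism(candidates, lambda v: len(graph["u"] & graph[v]), epsilon=0.5)
```

With a small epsilon the output distribution is close to uniform over candidates (strong privacy, low accuracy); as epsilon grows, the high-utility candidate dominates, which mirrors the accuracy/privacy trade-off the abstract describes.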