About
Activity
-
I am grateful for the opportunity to participate in this event and the Future of Privacy Forum's initiative to support the standardization and use of…
Liked by Ashwin Machanavajjhala
-
New & scary paper by Ryan Steed, Diana Qing, and Steven Wu that presents a novel application of reconstruction attacks on Census data: identifying…
Liked by Ashwin Machanavajjhala
-
Happy UOOM Day! While many Americans are off for Independence Day break, companies across the digital advertising industry shouldn't forget that as…
Liked by Ashwin Machanavajjhala
Experience & Education
Publications
-
A Rigorous and Customizable Framework for Privacy
PODS 2012
In this paper we introduce a new and general privacy framework called Pufferfish. The Pufferfish framework can be used to create new privacy definitions that are customized to the needs of a given application. The goal of Pufferfish is to allow experts in an application domain, who frequently do not have expertise in privacy, to develop rigorous privacy definitions for their data sharing needs. In addition to this, the Pufferfish framework can also be used to study existing privacy definitions.
We illustrate the benefits with several applications of this privacy framework: we use it to formalize and prove the statement that differential privacy assumes independence between records, we use it to define and study the notion of composition in a broader context than before, we show how to apply it to protect unbounded continuous attributes and aggregate information, and we show how to use it to rigorously account for prior data releases.
No Free Lunch in Data Privacy
SIGMOD
Differential privacy is a powerful tool for providing privacy preserving noisy query answers over statistical databases. It guarantees that the distribution of noisy query answers changes very little with the addition or deletion of any tuple. It is frequently accompanied by popularized claims that it provides privacy without any assumptions about the data and that it protects against attackers who know all but one record. In this paper we critically analyze the privacy protections offered by differential privacy. First, we use a no-free-lunch theorem, which defines non-privacy as a game, to argue that it is not possible to provide privacy and utility without making assumptions about how the data are generated. Then we explain where assumptions are needed. We argue that privacy of an individual is preserved when it is possible to limit the inference of an attacker about the participation of the individual in the data generating process. This is different from limiting the inference about the presence of a tuple (for example, Bob's participation in a social network may cause edges to form between pairs of his friends, so that it affects more than just the tuple labeled as "Bob"). The definition of evidence of participation, in turn, depends on how the data are generated -- this is how assumptions enter the picture. We explain these ideas using examples from social network research as well as tabular data for which deterministic statistics have been previously released. In both cases the notion of participation varies, the use of differential privacy can lead to privacy breaches, and differential privacy does not always adequately limit inference about participation.
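The guarantee summarized above (noisy query answers whose distribution changes very little when any one tuple is added or removed) can be made concrete with a toy sketch of the classic Laplace mechanism for a counting query. This is an illustrative sketch, not code from the paper; the function names, sample data, and epsilon values are assumptions:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(rows, predicate, epsilon: float) -> float:
    """epsilon-differentially-private count.

    A counting query has sensitivity 1 (adding or removing one row
    changes the true count by at most 1), so Laplace noise with scale
    1/epsilon satisfies epsilon-differential privacy -- under the
    implicit independence assumption this paper critically examines.
    """
    true_count = sum(1 for r in rows if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical data: how many individuals are 40 or older?
ages = [23, 37, 41, 29, 52, 61, 34]
noisy = private_count(ages, lambda a: a >= 40, epsilon=0.5)
```

Smaller epsilon means larger noise and stronger protection; the paper's point is that even this guarantee limits inference about a *tuple*, not necessarily about an individual's *participation* when records are correlated.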
Personalized Social Recommendations -- Accurate or Private?
VLDB 2011
With the recent surge of social networks like Facebook, new forms of recommendations have become possible -- personalized recommendations of ads, content, and even new friend and product connections based on one's social interactions. Since recommendations may use sensitive social information, it is speculated that these recommendations are associated with privacy risks. The main contribution of this work is in formalizing these expected trade-offs between the accuracy and privacy of personalized social recommendations.
In this paper, we study whether "social recommendations", or recommendations that are solely based on a user's social network, can be made without disclosing sensitive links in the social graph. More precisely, we quantify the loss in utility when existing recommendation algorithms are modified to satisfy a strong notion of privacy, called differential privacy. We prove lower bounds on the minimum loss in utility for any recommendation algorithm that is differentially private. We adapt two privacy preserving algorithms from the differential privacy literature to the problem of social recommendations, and analyze their performance in comparison to the lower bounds, both analytically and experimentally. We show that good private social recommendations are feasible only for a small subset of the users in the social network or for a lenient setting of privacy parameters.
More activity by Ashwin
-
This month's #PETed features Gerome Miklau, co-founder and CEO of Tumult Labs, who will join us to explain how we have successfully used…
Liked by Ashwin Machanavajjhala
-
4 YEARS OF POST IBM RETIREMENT ACTIVITIES Today, 30 June 2024, marks the 4th anniversary of my IBM Retirement (last day at IBM was 30 June 2020)…
Liked by Ashwin Machanavajjhala
-
Why #AI folks need a broad based Intro to #AI 👉 As I go around giving talks/tutorials on the planning and…
Liked by Ashwin Machanavajjhala
-
In case you're at ICES VII (International Conference on Establishment Statistics) today, come say hi! I'm presenting a software demo on how to…
Liked by Ashwin Machanavajjhala
-
I am excited to see this article from AdExchanger by Allison Schiff about differential privacy becoming more mainstream in industry, not just being…
Liked by Ashwin Machanavajjhala
-
Heartfelt congratulations to John Abowd for receiving the ACM Policy Award for « transformative work in modernizing the US Census Bureau’s processing…
Liked by Ashwin Machanavajjhala
-
We finished our Summit this afternoon with insights from FTC Division of Privacy and Identity Protection Attorney Ronnie Solomon and privacy…
Liked by Ashwin Machanavajjhala
-
InfoSum is thrilled to be attending The NAI Summit 2024 in New York next week! InfoSum’s Chief Operating Officer, Lauren Wetzel, will be in action…
Liked by Ashwin Machanavajjhala
-
Style is a distributional property; correctness is an instance-level property. LLMs (and GenAI) learn and…
Liked by Ashwin Machanavajjhala
-
If you're a privacy engineer, please consider participating in our study!
Liked by Ashwin Machanavajjhala
-
If you're at the IAPP - International Association of Privacy Professionals Global Privacy Summit today, don't miss the panel on deploying…
Liked by Ashwin Machanavajjhala
-
Data nerds: Urban's Financial Well-Being Data Hub is hosting a discussion with top experts in the field focused on how we can responsibly use private…
Liked by Ashwin Machanavajjhala
-
So, about this "truly anonymous" synthetic data… 😬 I've been telling anyone who'd listen that ad hoc approaches to anonymization should be assumed…
Liked by Ashwin Machanavajjhala
Other similar profiles
- Gerome Miklau
- Philip Bohannon
- Luke Hartman
- Damien Desfontaines, Staff Scientist, differential privacy
- Samuel Haney, Scientist at Tumult Labs
- Hari Stephen Kumar
- Camille Landau
- David Pujol, Scientist at Tumult Labs
- Charles Estes, Software Engineer, Tumult Labs
- Anamay Chaturvedi, Postdoc