Privacy & Data Archives - Center for Democracy and Technology
https://cdt.org/area-of-focus/privacy-data/

CDT Joins Call for SNAP Payment Processors to Refuse USDA Data Requests
https://cdt.org/insights/cdt-joins-call-for-snap-payment-processors-to-refuse-usda-data-requests/
Tue, 13 May 2025 21:09:56 +0000

The post CDT Joins Call for SNAP Payment Processors to Refuse USDA Data Requests appeared first on Center for Democracy and Technology.

This week, the Center for Democracy & Technology (CDT) joined Protect Democracy and the Electronic Privacy Information Center (EPIC) in calling on the private companies that process Supplemental Nutrition Assistance Program (SNAP) payments to refuse the federal government’s unprecedented, and likely illegal, request to access sensitive information about tens of millions of Americans who receive this life-saving benefit.

For over 60 years, the U.S. Department of Agriculture (USDA) has funded states to administer SNAP. In that time, the federal government has never requested access to the personal data of all program recipients, who are primarily low-income families and disabled or older adults. Forcing states to turn over, for unknown purposes, data collected to administer a program that feeds millions of low-income, disabled, and older people is an alarming data privacy threat, one that will create a chilling effect deterring Americans from accessing life-saving benefits.

In this letter, we urge SNAP payment processors to stand up for privacy and refuse to facilitate this broad and dangerous attempt at government overreach.

Read the full letter.

CDT Submits Comments Outlining Dangers of SSA About-Face Blocking Vulnerable Beneficiaries from Accessing Critical Benefits
https://cdt.org/insights/cdt-outlines-dangers-of-ssa-about-face-blocking-vulnerable-beneficiaries-from-accessing-critical-benefits/
Tue, 13 May 2025 14:32:32 +0000

Despite initially heeding an outpouring of concerns, many around accessibility for disabled beneficiaries, the Social Security Administration (SSA) appears to be forging ahead with plans to require in-person visits or access to an online account to complete certain phone-based transactions.

This about-face will block some of SSA’s most vulnerable beneficiaries from accessing critical benefits, including disabled and/or older people who disproportionately rely on telephone services. Though we appreciate SSA’s attention to the integrity of its programs, attempts to address fraud cannot make programs inaccessible to beneficiaries.

CDT has filed comments outlining the dangers of this approach to people with disabilities and older adults who depend on the SSA-administered benefits that they are entitled to receive.

Read the full comments.

Op-Ed – DOGE & Disability Rights: Three Key Tech Policy Concerns
https://cdt.org/insights/op-ed-doge-disability-rights-three-key-tech-policy-concerns/
Mon, 12 May 2025 18:49:19 +0000

This op-ed – authored by CDT’s Ariana Aboulafia – first appeared in Tech Policy Press on May 12, 2025. A portion of the text has been pasted below.

Three months into the Trump administration, the Department of Government Efficiency (DOGE) has wreaked havoc on the United States federal government and on many individuals who rely on government services. This includes people with disabilities, who have been impacted by cuts to education programs, chaos at the Social Security Administration (SSA), the shuttering of digital services programs that focused on accessibility, and even mandatory return-to-office policies for federal workers, among other things. Despite the announcement that Elon Musk will soon be “stepping back” from his role at DOGE, there’s no indication that the agency will stop its crusade, regardless of the costs to everyday people. And it will continue to use technology to do it.

I currently lead one of the only projects in the US that focuses on how technology (such as AI tools and algorithmic systems) impacts people with disabilities. From my vantage, it is clear that DOGE’s underlying ableist rhetoric both informs and forecasts its work, while its violations of data privacy and expansive use of AI without proper oversight have already harmed disabled people, and will continue to do so.

Read the full text.

CDT and the Leadership Conference Release New Analysis of Immigration, DOGE, and Data Privacy
https://cdt.org/insights/cdt-and-the-leadership-conference-release-new-analysis-of-immigration-doge-and-data-privacy/
Mon, 12 May 2025 13:59:00 +0000

In March, CDT and the Leadership Conference’s Center for Civil Rights and Technology released a fact sheet examining some of the core issues related to the Department of Government Efficiency’s (DOGE) access to and use of sensitive information held by federal agencies. Since we released this analysis, not only has DOGE increased its efforts to access sensitive information across the federal government, but DOGE and federal law enforcement authorities have specifically sought to repurpose administrative data for immigration-related uses. 

As the federal government seeks to rapidly expand the use of sensitive data to target immigrants, CDT and the Leadership Conference developed a follow-up explainer that analyzes the issues surrounding federal immigration authorities and DOGE’s access and use of administrative data for immigration-related activities. This new explainer details:

  • The types of administrative data held by federal agencies, 
  • Examples of how federal administrative data is being repurposed for immigration-related efforts, 
  • The legal protections of federal administrative data and law enforcement exceptions, 
  • The impacts of government data access and use on immigrants and society, and
  • The unanswered questions about and potential future changes to the federal government’s access, use, and sharing of administrative data for immigration-related purposes. 

Repurposing federal administrative data for immigration-related activities may have widespread and significant impacts on the lives of U.S. citizens and non-citizen immigrants alike. Ensuring transparency into the actions of DOGE and federal immigration authorities is a critical step towards protecting and safeguarding data privacy for everyone.

Read the full analysis.

EU Tech Policy Brief: May 2025
https://cdt.org/insights/eu-tech-policy-brief-may-2025/
Wed, 07 May 2025 00:01:11 +0000

Welcome back to the Centre for Democracy & Technology Europe’s Tech Policy Brief! This edition covers the most pressing technology and internet policy issues under debate in Europe and gives CDT’s perspective on their impact on digital rights. To sign up for CDT Europe’s AI newsletter, please visit our website. Do not hesitate to contact our team in Brussels.

👁 Security, Surveillance & Human Rights

Building Global Spyware Standards with the Pall Mall Process

As international attention focuses on misuses of commercial spyware, the Pall Mall Process continues to gather momentum. This joint initiative, led by France and the United Kingdom, seeks to establish international guiding principles for the development, sale, and use of commercial cyber intrusion capabilities (CCICs). 

At the Process’s second conference in Paris earlier this month, Programme Director Silvia Lorenzo Perez joined global stakeholders as the process concluded with the adoption of a Pall Mall Code of Practice for States. The Code has been endorsed by 25 countries to date, including 18 EU Member States. It sets out commitments for state action regarding the development, facilitation, acquisition, and deployment of CCICs. It also outlines good practices and regulatory recommendations to promote responsible state conduct in the use of CCICs. 

Pall Mall Process annual event in Paris.

CDT Europe will soon publish a comprehensive assessment of the official document to provide deeper insights into its implications. In parallel, and as part of our ongoing work to advance spyware regulation within the EU, CDT Europe is leading preparation of the sixth edition of the civil society roundtable series, “Lifting the Veil – Advancing Spyware Regulation in the EU,” on 13 May. Stakeholders will discuss what meaningful action should look like in the EU, following the political commitments made by the Member States that endorsed the Pall Mall Code of Practice.

CSOs Urge Swedish Parliament to Reject Legislation Undermining Encryption

CDT Europe joined a coalition of civil society organisations, including members of the Global Encryption Coalition, in an open letter urging the Swedish Parliament to reject proposed legislation that would weaken encryption. This legislation, if enacted, would greatly undermine the security and privacy of Swedish citizens, companies, and institutions. Despite its intention to combat serious crime, the legislation’s dangerous approach would instead create vulnerabilities that criminals and other malicious actors could readily exploit, leaving Sweden’s citizens and institutions less safe than before.

The proposed legislation would particularly harm those who rely on encryption the most, including journalists, activists, survivors of domestic violence, and marginalised communities. Human rights organisations have consistently highlighted encryption’s critical role in safeguarding privacy and free expression. Weakening encryption would also pose a national security threat, as even the Swedish Armed Forces rely on encrypted tools like Signal for secure communication.

Recommended read: Ofcom, Global Titles and Mobile Network Security, Measures to Address Misuse of Global Titles

 💬 Online Expression & Civic Space

DSA Civil Society Coordination Group Meets with the ODS Bodies Network

Earlier this month, the DSA Civil Society Coordination Group met with the Out-of-Court Dispute Settlement (ODS) Bodies Network for the first time to explore ways to collaborate. Under Article 21 of the Digital Services Act (DSA), ODS Bodies are to provide independent resolution of disputes between users and online platforms. As these bodies start forming and seeking certification, their role in helping users access redress and offering insights into platform compliance is becoming more important.

The meeting introduced the ODS Network’s mission: to encourage cooperation among certified bodies, promote best practices for data-sharing, and engage with platforms and regulators. Civil society organisations, which often support users who have faced harms on platforms, discussed how they could help identify cases that could be referred to ODS Bodies. In return, records from ODS Bodies could become a valuable resource for tracking systemic risks and holding platforms accountable under the DSA.

The discussion further focused on how to raise user awareness of redress options, make ODS procedures more accessible, and strengthen data reporting practices. Participants also outlined next steps for working more closely together, particularly around identifying the types of data that could best support civil society’s efforts to monitor risks and support enforcement actions by the European Commission.

Asha Allen Joins Euphoria Podcast to Discuss Civil Society in the EU

Civil society is under pressure, and now more than ever, solidarity and resilience are vital. These are the resounding conclusions of the latest episode of the podcast Euphoria, featuring CDT Europe’s Secretary General Asha Allen. Asha joined Arianna and Federico from EU&U to unpack the current state of human rights and the growing threats faced by civil society in Europe and beyond. With key EU legislation like the AI Act and Digital Services Act becoming increasingly politicised, they explored how to defend democracy, safeguard fundamental rights, and shape a digital future that truly serves its citizens. Listen now to discover how cross-movement collaboration and rights-based tech policy can help counter rising authoritarianism.

CDT Europe Secretary General Asha Allen speaking with podcasters Federico Terreni and Arianna Labasin from EU&U at the Euphoria Podcast recording.

Recommended read: FEPs, Silenced, censored, resisting: feminist struggles in the digital age

⚖ Equity and Data

EU AI Act Explainer — AI at Work

In the fourth part of our series on the AI Act and its implications for human rights, we examine the deployment of AI systems in the workplace and the AI Act’s specific obligations aimed at ensuring the protection of workers. In particular, we assess which of the prohibited AI practices could become relevant for the workplace and where potential loopholes and gaps lie. We also focus on the obligations of providers and deployers of high-risk AI systems, which could increase protection of workers from harms caused by automated monitoring and decision-making systems. Finally, we examine to what extent the remedies and enforcement mechanisms foreseen by the AI Act can be a useful tool for workers and their representatives to claim their rights. Overall, we find that the AI Act’s approach to allow more favourable legislation in the employment sector to apply is a positive step. Nevertheless, the regulation itself has only limited potential to protect workers’ rights.

CSOs Express Concern with Withdrawal of AI Liability Directive

CDT Europe joined a coalition of civil society organisations in sending an open letter to European Commission Executive Vice-President Virkkunen and Commissioner McGrath, expressing deep concern over the Commission’s recent decision to withdraw the proposed Artificial Intelligence Liability Directive (AILD) and stressing the urgent need to immediately begin preparatory work on a new, robust liability framework. We argued that the proposal is necessary because individuals seeking compensation for AI-induced harm will need to prove that damage was caused by a faulty AI system, which would be an insurmountable burden without a liability framework. 

Programme Director Laura Lazaro Cabrera also participated in a working lunch hosted by The Nine to discuss the latest trends and developments in AI policy following the Paris AI Summit. Among other aspects, Laura tackled the deregulatory approach taken by the European Commission, the importance of countering industry narratives, and the fundamental rights concerns underlying some of the key features of the AI Act.

Equity and Data Programme Director Laura Lazaro Cabrera speaking on a panel at the “Post-Paris AI Summit: Key Trends and Policies” event hosted by The Nine.

Recommended read: Tech Policy Press, Human Rights are Universal, Not Optional: Don’t Undermine the EU AI Act with a Faulty Code of Practice

🆕 New Team Member!

Marcel Mir Teijeiro, AI Policy Fellow in CDT Europe’s Equity and Data programme.

CDT Europe’s team keeps growing! At the beginning of April, we welcomed Marcel Mir Teijeiro as the Equity and Data programme’s new AI Policy Fellow. He’ll work on the implementation of the AI Act and on CDT Europe’s advocacy to protect the right to effective remedy for AI-induced harms. Previously, Marcel participated in the Code of Practice multistakeholder process for General-Purpose AI Models, advising rights-holder groups across the cultural and creative industries on transparency and intellectual property aspects. A Spanish-qualified lawyer, he also helped develop a hash-based technical solution for training dataset disclosure, shared with the AI Office, the U.S. National Institute of Standards and Technology, and the UK AI Safety Institute. We are excited to have him on board, and look forward to working with him!

⏫ Upcoming Events

Tech Policy in 2025: Where Does Europe Stand?: On May 15, CDT Europe and Tech Policy Press are co-hosting an evening of drinks and informal discussion, “Tech Policy in 2025: Where Does Europe Stand?”. It will be an opportunity to connect with fellow tech policy enthusiasts, share ideas, and figure out what the future holds for tech regulation in Europe. The event is currently sold out, but you can still join the waitlist in case some spots open up! 

Lifting the Veil – Advancing Spyware Regulation in the EU: CDT Europe, together with the Open Government Partnership, is hosting the sixth edition of the Civil Society Roundtable Series: “Lifting the Veil – Advancing Spyware Regulation in the EU.” The roundtable will gather representatives from EU Member States, EU institutions, and international bodies, alongside civil society organisations, technologists, legal scholars, and human rights defenders, for an in-depth exchange on the future of spyware regulation. Participation is invitation-only, so if you think you can contribute to the conversation, feel free to reach out at eu@cdt.org.

CPDP.ai 2025: From 21 to 23 May, CDT Europe will participate in the 18th CPDP.ai International Conference. Each year, CPDP gathers academics, lawyers, practitioners, policymakers, industry, and civil society from all over the world in Brussels, offering them an arena to exchange ideas and discuss the latest emerging issues and trends. This year, CDT Europe will host two workshops, on AI and on spyware, and our Secretary General Asha Allen will speak on a panel on the intersection of the DSA and online gender-based violence. You can still register to attend the conference.

CDT Supports Strong Privacy Bill in Maine
https://cdt.org/insights/cdt-supports-strong-privacy-bill-in-maine/
Tue, 06 May 2025 18:36:46 +0000

On May 5, CDT testified in support of a strong privacy bill in Maine that properly focused on data minimization as its core privacy protection. In our testimony, we noted a few places that could use improvement, such as broadening the minimization protections, narrowing the “knowledge” standard, and adding the private right of action back into the bill. We also opposed three other privacy bills; two of them would be harmful for privacy and would generally allow companies to continue engaging in the same privacy practices they’ve long engaged in, practices that place substantial burdens on individuals.

Read the full testimony.

CDT Endorses State Data Privacy Act Model Bill from EPIC & Consumer Reports
https://cdt.org/insights/cdt-endorses-state-data-privacy-act-model-bill-from-epic-consumer-reports/
Mon, 05 May 2025 16:42:37 +0000

The Center for Democracy & Technology (CDT) recently endorsed this updated model state comprehensive privacy bill from EPIC and Consumer Reports, which emphasizes data minimization, civil rights, and a private right of action.

CDT endorsed the compromise model bill because it properly focuses on strong data minimization and civil rights protections, and includes a private right of action. States that enact it would truly protect the privacy of their residents.

Read the full model bill.

CDT Submits Comments to Representative Lori Trahan on Updating the Privacy Act of 1974
https://cdt.org/insights/cdt-submits-comments-to-representative-lori-trahan-on-updating-the-privacy-act-of-1974/
Wed, 30 Apr 2025 04:01:00 +0000

On April 30, the Center for Democracy & Technology (CDT) submitted comments to Representative Lori Trahan about reforming the Privacy Act of 1974 to address advances in technology and emerging threats to federal government data privacy. Our comments highlight potential privacy harms related to federal government data practices and provide an overview of CDT’s nearly two decades of advocacy on the Privacy Act.

We urge Congress to address gaps in the Privacy Act, including by:

  • Updating the definition of “system of records,” 
  • Limiting the “routine use” exemption, 
  • Expanding the Privacy Act to cover non-U.S. persons, and 
  • Strengthening privacy notices.

Read the full comments.

You’ll Probably Be Protected: Explaining Differential Privacy Guarantees
https://cdt.org/insights/youll-probably-be-protected-explaining-differential-privacy-guarantees/
Tue, 22 Apr 2025 18:13:06 +0000

By Rachel Cummings, Associate Professor of Industrial Engineering and Operations Research, Columbia University and Priyanka Nanayakkara, Postdoctoral Fellow, Harvard Center for Research on Computation and Society

Disclaimer: The views expressed by CDT’s Non-Resident Fellows are their own and do not necessarily reflect the policy, position, or views of CDT.

Data collection is ubiquitous. Data are useful for a variety of purposes, from supporting research to helping allocate political representation. It benefits society to enable data use for such purposes, but it’s also important to protect people’s privacy in the process. Organizations across industry and government are increasingly turning to differential privacy (DP), an approach to privacy-preserving data analysis that limits how much information about an individual is learned from an analysis. Chances are DP has been used to provide a privacy guarantee for an analysis of your data: Companies like Google, Apple, Meta, Microsoft, and Uber, as well as government agencies like the U.S. Census Bureau have all used it in the past several years.

Not all differential privacy systems are created equal, though. The strength of privacy protections offered by DP depends on a “privacy loss budget” parameter, called epsilon. Epsilon is a measure of the amount of information “leaked” about individuals from the use of their data. This value can be chosen to be anything from zero to infinity, where smaller epsilon values correspond to stronger privacy protections. Privacy protections can vary wildly according to how epsilon is set: bigger epsilons can leak much more information about individuals. For example, when epsilon is 0.1, an observer or attacker is 1.1 times more likely to learn something about you than if they had never seen your data. If epsilon is 10, this becomes 22,000 times more likely.

Despite epsilon’s importance as an indicator of privacy risk, it is seldom communicated to the people whose personal data are used by technology companies and other large organizations. This is in part because epsilon is difficult to reason about, even among experts. It is a unitless and contextless parameter, making it challenging to map onto real-world outcomes. Furthermore, it specifies probabilistic guarantees, meaning people must reason under uncertainty to fully grasp its implications. Not explaining epsilon to people who are deciding whether to share their data under DP leaves them ill-informed about the protections they are being offered.
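These multipliers come straight from DP’s definition, which bounds how much more likely any conclusion about you can become when your data are included: the factor is e^ε. A quick sketch of that relationship (standard DP math, not code from the study):

```python
import math

# DP bounds the likelihood ratio of any conclusion, with vs. without your
# data, by exp(epsilon): the source of the 1.1x and ~22,000x figures above.
for eps in (0.1, 0.5, 1.0, 10.0):
    print(f"epsilon = {eps:>4}: at most {math.exp(eps):,.1f}x more likely")
```

At ε = 0.1 the factor is about 1.1; at ε = 10 it is roughly 22,000, which is why small budgets matter so much.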

In an attempt to remedy this information asymmetry, we set out, in work published in 2023 with Gabriel Kaptchuk, Elissa M. Redmiles, and Mary Anne Smart, to design explanation methods for epsilon that empower people to make informed data-sharing decisions. Specifically, we wanted our methods to increase people’s:

  • Objective Risk Comprehension: Understanding of numeric risks associated with sharing data
  • Subjective Privacy Understanding: Self-rated feelings of understanding the privacy protections offered
  • Self-Efficacy: Self-rated feelings of having enough information and confidence making data-sharing decisions

Explanation Methods for Epsilon

We developed and evaluated three portable explanation methods for epsilon: an odds-based text method, an odds-based visualization method, and an example-based method. Each method provides information about what a data subject can expect to happen if they share or do not share data.

As a concrete scenario, imagine that your company sent a survey to all employees, asking whether you feel adequately supported by your manager. Further imagine that you want to respond NO, but are worried that your manager would retaliate against you if they found out. Your company wants to protect your privacy in this process, and will only send your manager a differentially-private version of the number of NO responses. How should your company communicate about the privacy guarantees you are receiving in this process, and about the epsilon value that is used?

The “odds-based” methods present probabilities of your manager believing that you responded NO on the survey, when you share or don’t share data. We present probabilities as frequencies (e.g., 10 out of 100) rather than percentages (e.g., 10%) because prior research has found that frequency-framed probabilities support people in making more accurate probabilistic judgments. The first odds-based method uses text, as follows:

Odds-Based Text Method
If you do not share data, 39 out of 100 potential DP outputs will lead your manager to believe you responded NO.

If you share data, 61 out of 100 potential DP outputs will lead your manager to believe you responded NO.
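How a given ε turns into odds like these depends on the mechanism. As an illustration only (a randomized-response-style model we assume for this sketch, not necessarily the authors’ exact computation), a mechanism that reports the true answer with probability e^ε/(1+e^ε) yields odds close to the 39/61 split above at ε = 0.5:

```python
import math

def report_odds(eps: float) -> tuple[int, int]:
    """Out of 100 potential outputs, how many suggest you answered NO,
    without vs. with your data (illustrative randomized-response model)."""
    p_true = math.exp(eps) / (1.0 + math.exp(eps))  # chance the true answer shows through
    return round(100 * (1 - p_true)), round(100 * p_true)

print(report_odds(0.5))  # (38, 62), close to the study's 39/61
```

The ratio 62:38 stays below e^0.5 ≈ 1.65, as DP requires; the study’s exact 39/61 figures reflect its own mechanism and output model.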

The second odds-based method adds icon arrays – a frequency-framed visualization technique for depicting probabilities – to the text (see Figure 1):

Odds-Based Visualization Method

Figure 1. Odds-Based Visualization Method. This explanation assumes 𝟄 = 0.5. Source: Rachel Cummings and Priyanka Nanayakkara.

Alternatively, research in usability suggests that concrete examples may help people understand security and privacy concepts. Hence, our third “example-based” method shows potential outputs from the DP algorithm (i.e., results of the analysis under DP) with data sharing and without (See Figure 2):

Example-Based Method

Figure 2. Example-Based Method. Because DP adds random noise to the output, the reported number of NOs may not be a whole number and could even be negative. This explanation assumes ε = 0.5. Source: Rachel Cummings and Priyanka Nanayakkara.
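To see why the example outputs can be fractional or even negative, here is a sketch that draws a few Laplace-noised counts with and without a hypothetical respondent's NO. The function names and the count of 11 other NO responses are invented for illustration, not taken from the paper.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def sample_outputs(true_count: int, epsilon: float, k: int = 5) -> list[float]:
    """Draw k example DP outputs for a count query (sensitivity 1)."""
    scale = 1.0 / epsilon
    return [round(true_count + laplace_noise(scale), 1) for _ in range(k)]

random.seed(1)
others = 11  # hypothetical: NO responses among the other employees
print("If you share:      ", sample_outputs(others + 1, 0.5))
print("If you don't share:", sample_outputs(others, 0.5))
```

Because the noise is continuous and symmetric around zero, small true counts can be pushed below zero, which is the behavior the figure caption warns about.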

How Well Do Our Methods Work?

We evaluated our methods in a vignette survey study with 963 people. We presented them with the workplace scenario described above, showing our communication methods under varying epsilon values to capture a wide range of protections. Participants then decided whether or not to share data, and answered additional questions measuring their objective risk comprehension, their subjective understanding of the privacy protections, and their self-efficacy.

We compared these responses against two baseline explanations. In the No-Privacy Control, people received no privacy protections, and their manager would certainly see their response if they chose to share data. In the No-Epsilon Control, people saw an explanation of DP from prior work that gave a high-level description without explaining epsilon.

We found that the odds-based visualization method improved participants’ objective risk comprehension over the No-Privacy Control, while the example-based method decreased comprehension. In other words, participants tended to answer more questions correctly with the odds-based visualization method and fewer questions correctly with the example-based method. Furthermore, both odds-based methods improved feelings of having enough information compared to the No-Epsilon Control, suggesting that people may feel empowered by having explicit information about epsilon.

Interestingly, we found that participants were more likely to share data when given one of our methods than when given the No-Epsilon Control. For example, participants were nearly twice as likely to share data with the odds-based visualization method as with the No-Epsilon Control. Finally, as expected, participants were sensitive to changes in epsilon: as epsilon increased (i.e., as privacy protections weakened), participants were less likely to share data.
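This sensitivity to epsilon matches the underlying math: the worst-case factor by which a DP output can shift an observer's odds about an individual grows exponentially in epsilon. A small illustrative sketch (the helper name is invented, not from the paper):

```python
import math

def max_belief_ratio(epsilon: float) -> float:
    """Upper bound on how much an epsilon-DP output can multiply any
    observer's odds about an individual's response."""
    return math.exp(epsilon)

for eps in [0.1, 0.5, 1.0, 2.0]:
    print(f"epsilon={eps}: odds can change by at most {max_belief_ratio(eps):.2f}x")
```

Even a modest increase in epsilon, say from 0.5 to 2.0, lets an observer's odds shift by a much larger factor, so it is reasonable for participants to share less as epsilon grows.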

Overall, the odds-based methods were most effective. Both improved objective risk comprehension (measured by two yes/no questions about risks with and without data sharing), subjective privacy understanding, and feelings of having enough information relative to the example-based method.

The Future of Explaining Differential Privacy

Our work suggests that odds-based methods are promising for explaining epsilon to data subjects. While probabilistic information is often sidestepped in public-facing explanations of epsilon, we hope that organizations deploying DP will soon increase transparency by using methods like ours to give people concrete, accessible information about epsilon. Beyond explaining epsilon, several other factors could help people assess protections, such as which “model” of DP is used, a question that recent work (including ours) has begun to tackle.

More broadly, we hope that clearer explanations of DP will enable broader, responsible adoption of this privacy technique. As we highlight, understanding the epsilon parameter and the privacy guarantees it provides is a critical part of transparency around DP. Our goal is that, with clear explanations of privacy protections, people will be more empowered to weigh in on decisions about their data. Clearer communication will also build trust in this technology, allow more organizations to begin using it in practice, and unlock its potential for societally valuable data analysis.

The post You’ll Probably Be Protected: Explaining Differential Privacy Guarantees appeared first on Center for Democracy and Technology.

CDT Supports Two Massachusetts Comprehensive Privacy Bills https://cdt.org/insights/cdt-supports-two-massachusetts-comprehensive-privacy-bills/ Tue, 15 Apr 2025 18:28:31 +0000

The post CDT Supports Two Massachusetts Comprehensive Privacy Bills appeared first on Center for Democracy and Technology.

Eric Null, Co-Director of the Privacy & Data Project at the Center for Democracy & Technology (CDT), provided testimony recently before the General Court of Massachusetts’s hearing of the Joint Committee on Advanced Information Technology, the Internet, and Cybersecurity.

Eric testified in support of H.78, Massachusetts Consumer Data Privacy Act, and H.104/S.29/S.45, Massachusetts Data Privacy Act.

CDT supported these two strong comprehensive privacy bills in Massachusetts, both of which would provide meaningful data minimization, civil rights protections, and enforcement. The testimony provides context and detail for why those protections are important to include in a privacy law. 

Read the full testimony.
