Government Surveillance Archives - Center for Democracy and Technology
https://cdt.org/area-of-focus/government-surveillance/

EU Tech Policy Brief: May 2025
https://cdt.org/insights/eu-tech-policy-brief-may-2025/ - Wed, 07 May 2025

Welcome back to the Centre for Democracy & Technology Europe‘s Tech Policy Brief! This edition covers the most pressing technology and internet policy issues under debate in Europe and gives CDT’s perspective on the impact to digital rights. To sign up for CDT Europe’s AI newsletter, please visit our website. Do not hesitate to contact our team in Brussels.

👁 Security, Surveillance & Human Rights

Building Global Spyware Standards with the Pall Mall Process

As international attention focuses on misuses of commercial spyware, the Pall Mall Process continues to gather momentum. This joint initiative, led by France and the United Kingdom, seeks to establish international guiding principles for the development, sale, and use of commercial cyber intrusion capabilities (CCICs). 

At the Process’s second conference in Paris earlier this month, Programme Director Silvia Lorenzo Perez joined global stakeholders as the process concluded with the adoption of a Pall Mall Code of Practice for States. The Code has been endorsed by 25 countries to date, including 18 EU Member States. It sets out commitments for state action regarding the development, facilitation, acquisition, and deployment of CCICs. It also outlines good practices and regulatory recommendations to promote responsible state conduct in the use of CCICs. 

Pall Mall Process annual event in Paris.

CDT Europe will soon publish a comprehensive assessment of the official document to provide deeper insights into its implications. In parallel, and as part of our ongoing work to advance spyware regulation within the EU, CDT Europe is leading preparation of the sixth edition of the civil society roundtable series, “Lifting the Veil – Advancing Spyware Regulation in the EU,” on 13 May. Stakeholders will discuss what meaningful action should look like in the EU, following the political commitments made by the Member States that endorsed the Pall Mall Code of Practice.

CSOs Urge Swedish Parliament to Reject Legislation Undermining Encryption

CDT Europe joined a coalition of civil society organisations, including members of the Global Encryption Coalition, in an open letter urging the Swedish Parliament to reject proposed legislation that would weaken encryption. If enacted, the legislation would greatly undermine the security and privacy of Swedish citizens, companies, and institutions. Though intended to combat serious crime, its dangerous approach would instead create vulnerabilities that criminals and other malicious actors could readily exploit, leaving Sweden's citizens and institutions less safe than before. The proposed legislation would particularly harm those who rely on encryption the most, including journalists, activists, survivors of domestic violence, and marginalised communities. Human rights organisations have consistently highlighted encryption's critical role in safeguarding privacy and free expression. Weakening encryption would also pose a national security threat, as even the Swedish Armed Forces rely on encrypted tools like Signal for secure communication.

Recommended read: Ofcom, Global Titles and Mobile Network Security: Measures to Address Misuse of Global Titles

 💬 Online Expression & Civic Space

DSA Civil Society Coordination Group Meets with the ODS Bodies Network

Earlier this month, the DSA Civil Society Coordination Group met with the Out-of-Court Dispute Settlement (ODS) Bodies Network for the first time to explore ways to collaborate. Under Article 21 of the Digital Services Act (DSA), ODS Bodies are to provide independent resolution of disputes between users and online platforms. As these bodies start forming and seeking certification, their role in helping users access redress and offering insights into platform compliance is becoming more important.

The meeting introduced the ODS Network’s mission: to encourage cooperation among certified bodies, promote best practices for data-sharing, and engage with platforms and regulators. Civil society organisations, which often support users who have faced harms on platforms, discussed how they could help identify cases that could be referred to ODS Bodies. In return, records from ODS Bodies could become a valuable resource for tracking systemic risks and holding platforms accountable under the DSA.

The discussion further focused on how to raise user awareness of redress options, make ODS procedures more accessible, and strengthen data reporting practices. Participants also outlined next steps for working more closely together, particularly around identifying the types of data that could best support civil society’s efforts to monitor risks and support enforcement actions by the European Commission.

Asha Allen Joins Euphoria Podcast to Discuss Civil Society in the EU

Civil society is under pressure, and now more than ever, solidarity and resilience are vital. These are the resounding conclusions of the latest episode of the podcast Euphoria, featuring CDT Europe’s Secretary General Asha Allen. Asha joined Arianna and Federico from EU&U to unpack the current state of human rights and the growing threats faced by civil society in Europe and beyond. With key EU legislation like the AI Act and Digital Services Act becoming increasingly politicised, they explored how to defend democracy, safeguard fundamental rights, and shape a digital future that truly serves its citizens. Listen now to discover how cross-movement collaboration and rights-based tech policy can help counter rising authoritarianism.

CDT Europe Secretary General Asha Allen speaking with podcasters Federico Terreni and Arianna Labasin from EU&U at the Euphoria Podcast recording.

Recommended read: FEPS, Silenced, censored, resisting: feminist struggles in the digital age

⚖ Equity and Data

EU AI Act Explainer — AI at Work

In the fourth part of our series on the AI Act and its implications for human rights, we examine the deployment of AI systems in the workplace and the AI Act's specific obligations aimed at protecting workers. In particular, we assess which of the prohibited AI practices could become relevant in the workplace and where potential loopholes and gaps lie. We also focus on the obligations of providers and deployers of high-risk AI systems, which could strengthen the protection of workers from harms caused by automated monitoring and decision-making systems. Finally, we examine to what extent the remedies and enforcement mechanisms foreseen by the AI Act can be a useful tool for workers and their representatives to claim their rights. Overall, we find that the AI Act's approach of allowing more favourable legislation in the employment sector to apply is a positive step. Nevertheless, the regulation itself has only limited potential to protect workers' rights.

CSOs Express Concern with Withdrawal of AI Liability Directive

CDT Europe joined a coalition of civil society organisations in sending an open letter to European Commission Executive Vice-President Virkkunen and Commissioner McGrath, expressing deep concern over the Commission’s recent decision to withdraw the proposed Artificial Intelligence Liability Directive (AILD) and stressing the urgent need to immediately begin preparatory work on a new, robust liability framework. We argued that the proposal is necessary because individuals seeking compensation for AI-induced harm will need to prove that damage was caused by a faulty AI system, which would be an insurmountable burden without a liability framework. 

Programme Director Laura Lazaro Cabrera also participated in a working lunch hosted by The Nine to discuss the latest trends and developments in AI policy following the Paris AI Summit. Among other aspects, Laura tackled the deregulatory approach taken by the European Commission, the importance of countering industry narratives, and the fundamental rights concerns underlying some of the key features of the AI Act.

Equity and Data Programme Director Laura Lazaro Cabrera speaking on a panel at the “Post-Paris AI Summit: Key Trends and Policies” event hosted by The Nine.

Recommended read: Tech Policy Press, Human Rights are Universal, Not Optional: Don’t Undermine the EU AI Act with a Faulty Code of Practice

🆕 New Team Member!

Marcel Mir Teijeiro, AI Policy Fellow in CDT Europe’s Equity and Data programme.

CDT Europe’s team keeps growing! At the beginning of April, we welcomed Marcel Mir Teijeiro as the Equity and Data programme’s new AI Policy Fellow. He’ll work on the implementation of the AI Act and on CDT Europe’s advocacy to protect the right to effective remedy for AI-induced harms. Previously, Marcel participated in the Code of Practice multistakeholder process for General-Purpose AI Models, advising rights-holder groups across the cultural and creative industries on transparency and intellectual property aspects. A Spanish-qualified lawyer, he also helped develop a hash-based technical solution for training dataset disclosure, shared with the AI Office, the U.S. National Institute of Standards and Technology, and the UK AI Safety Institute. We are excited to have him on board, and look forward to working with him!
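Hash-based dataset disclosure is mentioned only in passing above, so as a rough sketch of the general idea (the function names and data below are invented for illustration, not taken from the actual proposal): a developer could publish cryptographic fingerprints of training items, letting a rights-holder check whether a specific work was included without the dataset's contents being revealed.

```python
import hashlib

def dataset_fingerprints(items):
    """Return a set of SHA-256 fingerprints, one per training item.

    Publishing fingerprints rather than the items themselves lets outside
    parties test membership without seeing the dataset's contents.
    """
    return {hashlib.sha256(item).hexdigest() for item in items}

def was_included(work, fingerprints):
    """Check whether a specific work appears among the published fingerprints."""
    return hashlib.sha256(work).hexdigest() in fingerprints

# Toy corpus standing in for a training dataset.
corpus = [b"article one", b"article two", b"article three"]
published = dataset_fingerprints(corpus)

print(was_included(b"article two", published))  # a work that was in the corpus
print(was_included(b"my novel", published))     # a work that was not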

🗞 In the Press

⏫ Upcoming Events

Tech Policy in 2025: Where Does Europe Stand?: On May 15, CDT Europe and Tech Policy Press are co-hosting an evening of drinks and informal discussion, “Tech Policy in 2025: Where Does Europe Stand?”. It will be an opportunity to connect with fellow tech policy enthusiasts, share ideas, and figure out what the future holds for tech regulation in Europe. The event is currently sold out, but you can still join the waitlist in case some spots open up! 

Lifting the Veil – Advancing Spyware Regulation in the EU: CDT Europe, together with the Open Government Partnership, is hosting the sixth edition of the Civil Society Roundtable Series: “Lifting the Veil – Advancing Spyware Regulation in the EU.” The roundtable will gather representatives from EU Member States, EU institutions, and international bodies alongside civil society organisations, technologists, legal scholars, and human rights defenders for an in-depth exchange on the future of spyware regulation. Participation is invitation-only, so if you think you can contribute to the conversation, feel free to reach out at eu@cdt.org.

CPDP.ai 2025: From 21 to 23 May, CDT Europe will participate in the 18th edition of the CPDP.ai International Conference. Each year, CPDP gathers academics, lawyers, practitioners, policymakers, industry, and civil society from all over the world in Brussels, offering them an arena to exchange ideas and discuss the latest emerging issues and trends. This year, CDT Europe will host two workshops, on AI and on spyware, and our Secretary General Asha Allen will speak on a panel on the intersection of the DSA and online gender-based violence. You can still register to attend the conference.

CDT Opposes Trump Administration Initiative to Routinely Collect Social Media Identifiers from Applicants for Immigration Benefits
https://cdt.org/insights/cdt-opposes-trump-administration-initiative-to-routinely-collect-social-media-identifiers-from-applicants-for-immigration-benefits/ - Mon, 05 May 2025

These comments were co-authored by CDT Intern Jacob Smith. 

Today, CDT submitted comments opposing USCIS’ initiative to routinely collect social media identifiers from applicants for a wide variety of immigration benefits, ranging from asylum to naturalization. USCIS plans to collect social media identifiers to further its viewpoint-based immigration enforcement policy, which will punish and deport individuals on the basis of their constitutionally-protected expression and chill the lawful speech of citizens and noncitizens alike. CDT’s comments document the Trump Administration’s unconstitutional and punitive immigration enforcement actions against lawful residents who exercised their rights to speech and protest. Social media surveillance furthered through USCIS’ proposed social media identifier collection would fly in the face of our First Amendment values and chill valuable expression. The negative consequences of these policies will be made worse through the use of imprecise AI tools that are bound to fail, exacerbating the chilling and punitive effects of the administration’s unlawful policies.

Read the full comments.

CDT Stands Up for Taxpayer Privacy
https://cdt.org/insights/cdt-stands-up-for-taxpayer-privacy/ - Wed, 16 Apr 2025

The Center for Democracy & Technology has joined over 270 other organizations in a letter calling on Congress to stand up for taxpayer privacy just as millions of Americans are filing their tax returns. The letter decries a new Memorandum of Understanding (MOU) pursuant to which the Internal Revenue Service will share with the Department of Homeland Security taxpayer information regarding as many as seven million taxpayers that DHS suspects are undocumented. Taxpayers will have no prior notice that their information is being shared, and no opportunity to challenge the sharing of their information on a case-by-case basis before it is shared.

As stated in the letter, which was quarterbacked by the civil rights and advocacy NGO UnidosUS, the IRS-DHS MOU “… poses an unprecedented threat to taxpayer privacy protections that have been respected on a bipartisan basis for nearly 50 years.” Taxpayer information is protected by law against disclosure, and immigration enforcement is not a recognized exception to those protections.  We are calling for Congress to conduct oversight hearings, demand release of the MOU without redactions, and demand that the Treasury Department explain its novel interpretation of the law. 

Taxpayer privacy encourages taxpayer compliance. As CDT has pointed out, use of taxpayer information for immigration enforcement will create a huge disincentive for undocumented people to pay taxes, and will drive them further into the informal labor sector, where they are vulnerable to abuse. This will cost the Treasury billions in lost tax revenue. The IRS had urged undocumented people to file tax returns, and to encourage them to do so, gave assurances that information submitted for tax purposes would not be used for immigration enforcement. The IRS has reneged on those assurances, calling into question other taxpayer privacy commitments — including those imposed by law. 

Read the full letter.

CDT Submits Statement Urging House Judiciary to Close Loopholes on Warrantless Government Surveillance
https://cdt.org/insights/cdt-submits-statement-urging-house-judiciary-to-close-loopholes-on-warrantless-government-surveillance/ - Tue, 15 Apr 2025

On April 8th, the House Judiciary Subcommittee on Crime and Federal Government Surveillance held a hearing to discuss warrantless surveillance issues. At the hearing, members and witnesses across the political spectrum highlighted the dangers of warrantless surveillance, and actions Congress should take to place checks on government surveillance powers.

On April 15th, CDT submitted a statement for the record, highlighting three critical issues:

  • Closing the “Backdoor Search Loophole” that allows warrantless queries for Americans’ communications collected through Section 702 of FISA;
  • Closing the “Data Broker Loophole” to stop law enforcement from buying data that should require a warrant to obtain; and
  • Restoring the independence of the Privacy and Civil Liberties Oversight Board (PCLOB), which is key to overseeing government surveillance and preventing abuse.

(Additionally, a witness at the hearing – Gene Schaerr, General Counsel for the Project for Privacy & Surveillance Accountability – included as part of his written testimony a copy of the recently-released CDT and PPSA joint issue brief, entitled Debunking Myths on the National Security Impact of Warrants for U.S. Person Queries.)

Read the full statement of record.

Automated Tools for Social Media Monitoring Irrevocably Chill Millions of Noncitizens’ Expression
https://cdt.org/insights/automated-tools-for-social-media-monitoring-irrevocably-chill-millions-of-noncitizens-expression/ - Tue, 15 Apr 2025

Last week, USCIS stated its plans to routinely screen applicants’ social media activity for alleged antisemitism when making immigration decisions in millions of cases, and announced that it is scouring the social media accounts of foreign students for speech that it deems potential grounds to revoke their legal status. Simultaneously, the Department of State has started using AI to enforce its “Catch and Revoke” policy and weed out “pro-Hamas” views among visa-holders, particularly students who have protested against Israel’s war in Gaza. 

This isn’t USCIS’s first foray into social media monitoring; the agency began collecting social media data in 2014. But it is the first time the government has used a previously obscure provision of immigration law to target a large group of noncitizens for removal based on their political opinions and activism that the Secretary of State has determined could have “potentially serious adverse foreign policy consequences.” The current Administration’s broad definitions of speech that could lead to visa revocation or application denial, and the questionable constitutionality of making immigration decisions based on viewpoint, raise concerns that will only be exacerbated by the use of flawed, error-prone social media monitoring technologies.

The American immigration system already subjects applicants to disproportionate invasions of privacy and surveillance, some applicants more than others. In the current Administration, immigration enforcement has been particularly aggressive and gone beyond the bounds of previous enforcement efforts, with agents bringing deportation proceedings against applicants on valid visas on the basis of their legally-protected speech, including authorship of op-eds, participation in protests, and, according to a real albeit now-deleted social media post by the Immigration and Customs Enforcement agency, their ideas. Noncitizens have long been aware of the government’s surveillance of their speech and their social media activity, which has deterred them from accessing essential services and speaking freely on a wide range of topics, including their experience with immigration authorities, labor conditions in their workplace, or even domestic violence.

What is happening now, however, is an unprecedented and calculated effort by the U.S. government to conduct surveillance of public speech and use the results to target for removal those who disagree with government policy. At the time of writing, over 1,000 student visas have been revoked according to the State Department, some of them for participation in First Amendment-protected activities. For example, one post-doctoral student at Georgetown reportedly had his visa revoked for posting in support of Palestine on social media, posts that a DHS spokesperson characterized as “spreading Hamas propaganda.” In a high-profile case from earlier this year, the former President of Costa Rica received an email from the U.S. government revoking his visa to the United States a few weeks after he criticized the government on social media, saying, “It has never been easy for a small country to disagree with the U.S. government, and even less so, when its president behaves like a Roman emperor, telling the rest of the world what to do.” All signs indicate that disagreement with this Administration’s viewpoints could lead to negative consequences for noncitizens seeking to enter or remain in this country in any capacity.

This expansion of ideological targeting is cast against the backdrop of an immigration system that faces, at times, a Sisyphean backlog of applications and insufficient oversight of enforcement decisions, which are only growing in this political climate. Mistakes are routinely made, and they have devastating consequences. To the extent oversight agencies did exist, including through entities such as the Department of Homeland Security’s Office for Civil Rights and Civil Liberties, they have been shuttered or undermined, which will make it all the more difficult to identify and fix errors and failures to provide due process.

Applicants have little recourse to seek remedy or appeal mistakes when they are made, instead having to choose among cautious over-compliance in the form of silence, potential retaliation, or self-deportation to avoid it all. Increased social media surveillance of noncitizens against this backdrop will compound existing inequities within the system, and will almost certainly further chill noncitizens’ ability to speak and participate freely in society for fear of running afoul of the Administration.

And that’s all before accounting for the problems with the tools that the government will use to conduct this monitoring. The automated tools used for this type of social media surveillance are likely to be based on keyword filters and machine learning models, including large language models such as those that underlie chatbots such as ChatGPT. These tools are subject to various flaws and limitations that will exacerbate the deprivation of individuals’ fundamental rights to free expression and due process. This litany of problems with automated social media analysis is so pronounced that DHS opted against using such a system during the first Trump administration. DHS’s concerns about erroneous enforcement and deportations may have disappeared, but the risks from this technology have not.

First, models may be trained with a particular bias. Social media monitoring systems are generally trained on selected keywords and data easily found on the web, such as data scraped from Reddit, Wikipedia, and other largely open-access sources, which over-index on the views and perspectives of a few. Keywords may be added to the training corpus to fit the domain of use, such as offering examples of what constitutes “anti-semitism” or threats to national security. Should the training data over-represent a particular set of views or designations of “foreign terrorists,” the model may over-flag speech by some individuals more than others. The Administration’s over-capacious definition of the term “antisemitic” may be weaponized during the training of these social media monitoring models, subjecting to greater scrutiny anyone who has engaged in speech with which the Administration disagrees on topics such as Israel-Palestine or campus protests related to military actions against Gaza, even where the speech is protected by the First Amendment.

Second, and relatedly, these tools struggle to parse context. While keyword filters and machine learning models may be able to identify the words or phrases they have been tasked to detect, they cannot parse the context in which a term is used, including such essential human expressions as humor, sarcasm, irony, and reclaimed language. We’ve written previously about how Facebook’s use of automated content analysis tools to enforce its Dangerous Organizations and Individuals policy erroneously flagged and took down all posts containing the word “shaheed” (Arabic for “martyr”), even when an individual was named Shaheed or the term was not being used in a way that glorified or approved of violence. Noncitizen journalists who cover protests or federal policy and post their articles on social media may be flagged and surveilled simply for doing their job. People named Isis have long been caught up in the fray and flagged by these automated technologies. Posts citing the “soup nazi” episode of Seinfeld may likewise be swept into this analysis. Models’ inability to parse context also limits their ability to conduct predictive analysis. Vendors procured by USCIS to conduct social media monitoring assert that they use AI to scan for “risky keywords” and identify persons of interest, but promises of predictive analysis likely rest on untested and discriminatory assumptions and burden the fundamental rights of everyone swept up by these tools. 
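The blindness to context described above is easy to demonstrate. The sketch below is an invented toy, not any vendor's actual system (the watchlist and posts are made up for illustration), but real keyword-based monitoring inherits the same failure mode: every match below is an innocent use of a watched word.

```python
# A minimal keyword filter of the kind discussed above, for illustration only.
WATCHLIST = {"isis", "nazi", "shaheed"}

def flags(post: str) -> set[str]:
    """Return the watchlist words found in a post, ignoring case and punctuation."""
    words = {w.strip(".,!?\"'").lower() for w in post.split()}
    return words & WATCHLIST

posts = [
    "Congratulations to my colleague Isis on her new job!",
    "Rewatching the soup nazi episode of Seinfeld tonight.",
    "My grandfather Shaheed turns 90 this week.",
]

# Every one of these harmless posts is flagged, because word matching
# cannot distinguish a person's name or a sitcom reference from a threat.
for post in posts:
    print(post, "->", flags(post))
```

A name, a Seinfeld reference, a grandfather's birthday: the filter treats all three as hits, which is precisely the context problem the paragraph describes.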

Finally, the systems will be especially error-prone in multilingual settings. New multilingual language models purport to work better across more languages, yet they are still trained primarily on English-language data, some machine-translated non-English data, and whatever other documents are available, often religious or government texts, all imperfect proxies for how individuals actually speak their languages online. Multilingual training data is likely to underrepresent terms frequently used by native speakers, including spoken regional dialects, slang, code-mixed terms, and “algospeak.” As a result, most models cannot parse the more informal ways people speak online, leading to erroneous outcomes when they analyze non-English speech.

There have already been countless instances where U.S. immigration enforcement agencies have used digital translation technologies in problematic ways, preventing individuals from accessing a fair process and even safety. For example, an automated translation tool led to an individual being erroneously denied asylum because it misunderstood that she was seeking safety from parental abuse, literally translating her description of her perpetrator, “el jefe,” as her boss rather than her father. An individual from Brazil was detained for six months over an incomplete asylum application because the translation tool ICE used rendered “Belo Horizonte” literally as “beautiful horizon” instead of recognizing it as a city in which the applicant had lived. Another automated content analysis system mistranslated “good morning” in Arabic as “attack them.” Widespread use of these error-prone systems to detect disfavored ideas will only exacerbate the discriminatory treatment of those who speak English as a second language.

Ultimately, the adoption of automated technologies to scan social media data will punish people for engaging in legal speech and result in more errors in an already flawed system. It will also chill the speech of millions of people in this country and abroad, impoverishing the global conversations that happen online. An applicant seeking to adjust their status or become a U.S. citizen, or even a U.S. citizen seeking to communicate with a noncitizen, will reasonably think twice before speaking freely or engaging in constitutionally-protected activities like protesting, simply because of the specter of social media surveillance. They already are.

When It Comes to Encryption, Back Doors Are Never Simple: Why UK Apple Users Won’t Have Encrypted Backups Anymore
https://cdt.org/insights/when-it-comes-to-encryption-back-doors-are-never-simple-why-uk-apple-users-wont-have-encrypted-backups-anymore/ - Tue, 08 Apr 2025

The post When It Comes to Encryption, Back Doors Are Never Simple: Why UK Apple Users Won’t Have Encrypted Backups Anymore appeared first on Center for Democracy and Technology.

Millions of Apple customers in the United Kingdom are losing access to an important end-to-end encryption tool protecting their personal data, after the company refused a reported UK government demand to build a back door into its system that would have allowed law enforcement to read personal data stored in the cloud. 

Advanced Data Protection, the service in question, allowed users to automatically store encrypted backups of files from their devices that not even Apple itself could access. And while there may be legitimate reasons for law enforcement to seek access to particular files, Apple correctly concluded that carving a new pathway through this wall of encryption would introduce a significant new vulnerability to Apple’s online storage system, one that would affect every user on the planet. 

Just as a new door into a home gives intruders an additional path inside, so too does a digital back door provide a new way for law enforcement, hackers, and unfriendly governments to access materials that are supposed to be protected. So, just like with a real door, digital engineers add a lock and key to keep things secure.

But anyone holding that key, whether a legitimate government actor, a repressive regime or a criminal hacker, can access users’ data for their own purposes. While Apple or another tech company would build protections into the system’s design, no company can guarantee that the keys would always remain safe from hackers or from government overreach. A key allowing access to so much data is a tremendously attractive target for bad actors, and if even a single hacker succeeds in accessing the key, all bets are off. Because of those risks, end-to-end encrypted backups rely on the principle that only the user has access to the keys.
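
The principle that only the user holds the keys can be illustrated with a minimal sketch. This is not Apple's actual design — the function name, parameters, and passphrase here are illustrative assumptions — but it shows the core idea: the decryption key is derived on the user's device from a secret only the user knows, so the provider stores ciphertext and public parameters (like a salt) but never the key itself.

```python
import hashlib
import os

def derive_backup_key(passphrase: str, salt: bytes) -> bytes:
    # PBKDF2 stretches the user's secret into a 256-bit key, on-device.
    # The provider never sees the passphrase or the derived key.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)

salt = os.urandom(16)  # stored server-side alongside the ciphertext; safe to disclose
key = derive_backup_key("correct horse battery staple", salt)

# Only the same secret reproduces the key; a guess does not.
assert key == derive_backup_key("correct horse battery staple", salt)
assert key != derive_backup_key("wrong guess", salt)
```

A back door, by contrast, would require some party other than the user to hold (or be able to reconstruct) that key — which is exactly the attractive target the paragraph above describes.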

Simply creating an additional access point would also introduce extra complexity to the cloud storage system, which in itself creates opportunities for errors to creep in that hackers could exploit. Apple has noted that, in practice, systems with backdoors are unlikely to provide the privacy and security guarantees users expect and demand, and that the risks of cloud data breaches are significant and impactful. While there may be a law enforcement interest in accessing certain backed-up files, undermining encryption for everyone's backed-up files would invite widespread criminal activity, from unlawful surveillance to theft of people's most intimate photos.

As an esteemed group of researchers noted about previous attempts to require back-door access to online systems, “The complexity of today’s Internet environment, with millions of apps and globally connected services, means that new law enforcement requirements are likely to introduce unanticipated, hard to detect security flaws.” Experts are clear that keys under doormats make us all less secure and will be widely abused.

And those are just the technical concerns.  Governments demanding access to those keys have different conceptions of the level of privacy their citizens should be allowed, and a back door built for a lawful purpose in one country could turn into a tool of repression in another. Likewise, regimes change, and if less-benevolent leaders take over, a surveillance system built for the “right” reasons could fall into the wrong hands.

While past proposed systems have differed in their technological details – and the apparent order to Apple in this case remains secret – one trend has been consistent. Whatever technologies are involved, from hidden access points to stored ("escrowed") access codes that allow decryption, to "ghost users" added into online conversations, systems for exceptional access inevitably get abused.

A case in point is the Athens Affair, in which the Greek government discovered that an unknown hacker had gained access to Vodafone's "lawful intercept" system to spy on the phone calls of journalists and Greek politicians – including the nation's then-president. More recently, the Salt Typhoon hackers gained unprecedented access to telecommunications systems in the US and other countries – including Internet and cellular telephone metadata, and even audio recordings of conversations involving presidential candidates – through the lawful access systems put in place to comply with statutory requirements.

Apple’s decision to cut off encrypted cloud storage in the UK is a dramatic move, but it’s also both principled and pragmatic. Complying with the UK government’s reported order would undermine the security of every Advanced Data Protection user around the world. When it comes to encryption, threats to security anywhere are threats to security everywhere. 

Broad Coalition Urges Sweden To Reject Draft Legislation Undermining Encryption https://cdt.org/insights/broad-coalition-urges-sweden-to-reject-draft-legislation-undermining-encryption/ Tue, 08 Apr 2025 07:50:55 +0000 https://cdt.org/?post_type=insight&p=108188 CDT Europe and 236 civil society organisations, companies, and cybersecurity experts, including members of the Global Encryption Coalition, hailing from 50 countries, are calling on the Swedish Parliament to reject proposed legislation that would undermine encryption, putting Swedish citizens, businesses, and institutions at greater risk. Though intended to combat crime, it would instead introduce vulnerabilities […]

The post Broad Coalition Urges Sweden To Reject Draft Legislation Undermining Encryption appeared first on Center for Democracy and Technology.

CDT Europe and 236 civil society organisations, companies, and cybersecurity experts, including members of the Global Encryption Coalition, hailing from 50 countries, are calling on the Swedish Parliament to reject proposed legislation that would undermine encryption, putting Swedish citizens, businesses, and institutions at greater risk. Though intended to combat crime, it would instead introduce vulnerabilities that cybercriminals and hostile actors could exploit, making Sweden less secure.

The legislation would require companies to store and provide law enforcement access to encrypted communications, effectively forcing them to create an encryption backdoor. Security experts, including the Swedish Armed Forces, warn that such backdoors weaken overall security, making private data vulnerable to cyberattacks and espionage. If passed, the law could lead encrypted service providers to reconsider their presence in the Swedish market rather than compromise user security.

This move would particularly endanger those who rely on encryption the most: journalists, activists, survivors of domestic violence, and marginalised communities. International human rights bodies have affirmed the essential role of encryption in protecting privacy and free expression. Weakening encryption would also threaten national security: even the Swedish Armed Forces endorse encrypted tools like Signal for the non-classified communications of national security professionals.

Sweden should prioritise modern, targeted investigative techniques that uphold digital security and encryption for all users, rather than approaches that risk undermining these protections. We urge the Parliament to reject this dangerous legislation and protect Sweden’s security, privacy, and digital future.

Read the full letter.

Debunking Myths on the National Security Impact of Warrants for U.S. Person Queries https://cdt.org/insights/debunking-myths-on-the-national-security-impact-of-warrants-for-u-s-person-queries/ Mon, 07 Apr 2025 16:40:21 +0000 https://cdt.org/?post_type=insight&p=108173 Co-authored with Gene Schaerr, General Counsel at the Project on Privacy and Surveillance Accountability [ PDF version ] Warrantless queries of Americans’ communications obtained via Section 702 of the Foreign Intelligence Surveillance Act (“FISA 702”) are antagonistic to the basic principle of the Fourth Amendment. Deliberately seeking to read Americans’ private communications – but without ever […]

The post Debunking Myths on the National Security Impact of Warrants for U.S. Person Queries appeared first on Center for Democracy and Technology.

Co-authored with Gene Schaerr, General Counsel at the Project on Privacy and Surveillance Accountability

[ PDF version ]

Warrantless queries of Americans’ communications obtained via Section 702 of the Foreign Intelligence Surveillance Act (“FISA 702”) are antagonistic to the basic principle of the Fourth Amendment. Deliberately seeking to read Americans’ private communications – but without ever showing evidence of wrongdoing or obtaining independent approval from a judge – violates the Constitution, disrespects American values, and opens the door to abuse.

Opponents of FISA reform nonetheless oppose requiring a warrant for U.S. person queries by claiming these queries provide huge value that would be disrupted by a warrant requirement. These claims are false – in reality, the proposed warrant rule has been carefully designed to account for the limited value that such queries provide.

MYTH #1: U.S. person queries are immensely important in a broad array of situations, making it dangerous to place restrictions on this important tool.

REALITY: Queries only provide value in a limited set of situations, and the warrant rule proposed in 2024 during the 118th Congress provides exceptions to account for all of them.

Opponents of reform frame U.S. person queries as frequently valuable across a wide set of national security goals and investigations, but the 2023–2024 debate over FISA 702 proved otherwise: Intelligence Community testimony, the President's Intelligence Advisory Board, and the Privacy and Civil Liberties Oversight Board (PCLOB) identified only a few distinct scenarios in which U.S. person queries provided value. And the proposed warrant rule includes exceptions that account for all of them. 

Under the 2024 proposal, a warrant would not be required 1) when there is consent, 2) to track malware, or 3) for metadata queries:

  • Cyber Attacks: Queries were most useful in the cybersecurity context, helping the government detect warning signs of future attacks and trace attacks back to their sources. But queries focused on cyberthreat signatures are explicitly exempt. Much of the cybersecurity value of queries focused on network traffic patterns; this involves metadata rather than content, and metadata queries are also exempt from the warrant rule. Most importantly, any U.S. company or critical infrastructure entity targeted for a cyberattack can simply consent to a query.
  • Foreign Plots: Queries were also described as useful in detecting and responding to foreign assassination and kidnapping plots. But once again, the consent exception directly accounts for this need. A targeted American would readily consent to such a query to enable government protection.  
  • Foreign Recruitment: Defenders of the status quo cited limited cases in which queries helped the government discover suspicious foreign contacts, assisting the government in investigating whether the U.S. person was a foreign target or foreign agent. But because metadata queries are exempt, a warrant rule would not inhibit the government’s ability to identify these contacts. The government has never shown one instance in which content queries were critical to advancing an investigation against a foreign agent. Besides, reading the private emails of an American being criminally investigated is exactly what warrants are required for. 

MYTH #2: U.S. person queries need to be done quickly and efficiently, and a warrant rule would slow the process down in a manner that endangers Americans’ lives.

REALITY: The government has never shown queries provide time-sensitive responses, and the warrant rule’s exceptions account for such a scenario if it ever did emerge. 

A common argument against surveillance reform is the “ticking time bomb” hypothetical in which there simply isn’t time to abide by due process and obtain court approval. But the government has never shown a situation in which query results were needed so quickly that obtaining a warrant would be infeasible. 

  • If a time-sensitive emergency ever did occur, the warrant rule explicitly accounts for it by including an exception for exigent circumstances. Contrary to this complaint's framing, the government has indicated that query results are used primarily during the early stages of investigations, or in queries run on targeted victims – in which case the consent exception makes a warrant unnecessary.

In short, the exigent circumstances, consent, and metadata exceptions to the proposed warrant requirement almost certainly address any legitimate concerns about the government's ability to respond to threats quickly.  

MYTH #3: Warrants are not feasible given the scale of U.S. person queries conducted; adding this rule would overwhelm intelligence agencies and the courts.

REALITY: By permitting warrantless metadata queries, the warrant rule ensures the government will not need to go to court frequently.

In 2023, the most recent year for which data is available, the FBI conducted queries for over 57,000 unique U.S. person terms, reflecting unacceptable government overreach and fishing efforts. However, most of these queries do not produce responsive results. Because the proposed warrant requirement would apply only when the government sought to access a communication’s content, it would weed out impropriety without straining intelligence agencies or the courts. 

  • Only 1.58 percent of the FBI’s U.S. person queries resulted in personnel accessing content, according to the FBI. Thus, even if queries continued to be conducted at the prior rate of 57,000 annually – an unlikely prospect, given that many of these queries were improper or broad fishing efforts – a warrant would be potentially applicable to less than 1,000 queries a year, less than 3 per day on average. And because the proposed warrant rule would permit warrantless metadata queries (and only require court approval to access content), agencies would be able to confirm when a query will yield a “hit” before devoting any time and effort to seeking a warrant.
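
The arithmetic behind that estimate can be checked directly, using only the figures cited above (57,000 annual query terms, a 1.58 percent content-access rate):

```python
# Figures reported in the brief above (FBI, 2023).
total_queries = 57_000        # unique U.S. person query terms per year
content_access_rate = 0.0158  # share of queries where personnel accessed content

per_year = total_queries * content_access_rate
per_day = per_year / 365

print(round(per_year))     # roughly 900 queries a year
print(round(per_day, 1))   # under 3 per day on average
```

Consistent with the text: fewer than 1,000 potentially warrant-triggering queries a year, i.e., fewer than three per day before any of the exceptions are applied.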

And even as to these 2-3 queries per day, most would fall under one of the exceptions to the warrant requirement described above. The FBI usually wouldn’t need 2-3 warrants each day; more likely it would need to obtain consent of 2-3 entities to help prevent a future cyberattack or foreign plot. And if adding a warrant requirement on this limited level would be too onerous for intelligence agencies or the courts, the solution would be to add personnel to cover that need, not to reject an important constitutional safeguard against abuse.

Americans’ basic rights should not be secondary to bureaucratic hurdles and staffing limits. The exceptions and exemptions built into the 2024 warrant proposal would allow the government to remain within the boundaries of the Constitution while also having the means to protect national security.

Read the full issue brief.

EU Tech Policy Brief: April 2025 https://cdt.org/insights/eu-tech-policy-brief-april-2025/ Tue, 01 Apr 2025 21:26:17 +0000 https://cdt.org/?post_type=insight&p=108123 Welcome back to the Centre for Democracy & Technology Europe‘s Tech Policy Brief! This edition covers the most pressing technology and internet policy issues under debate in Europe and gives CDT’s perspective on the impact to digital rights. To sign up for CDT Europe’s AI newsletter, please visit our website. Do not hesitate to contact […]

The post EU Tech Policy Brief: April 2025 appeared first on Center for Democracy and Technology.

Welcome back to the Centre for Democracy & Technology Europe‘s Tech Policy Brief! This edition covers the most pressing technology and internet policy issues under debate in Europe and gives CDT’s perspective on the impact to digital rights. To sign up for CDT Europe’s AI newsletter, please visit our website. Do not hesitate to contact our team in Brussels.

👁 Security, Surveillance & Human Rights

Citizen Lab Unveils Surveillance Abuses in Europe and Beyond                                       

The recent Citizen Lab report on the deployment of Paragon spyware in EU Member States – particularly Italy, and allegedly Cyprus and Denmark – highlights a concerning trend of surveillance targeting journalists, government opponents, and human rights defenders. Invasive monitoring of journalist Francesco Cancellato, members of the NGO Mediterranea Saving Humans, and human rights activist Yambio raises serious concerns about press freedom, fundamental rights, and the broader implications for democracy and the rule of law in the EU. 

The Italian government's denial that it authorised the surveillance, even as reports indicate otherwise, points to a lack of transparency and accountability. The Undersecretary to the Presidency of the Council of Ministers reportedly admitted that Italian intelligence services used Paragon spyware against Mediterranea activists, citing national security justifications. This admission highlights the urgent need for transparent oversight mechanisms and robust legal frameworks to prevent misuse of surveillance technologies. 

Graphic for Citizen Lab report, which reads, "Virtue or Vice? A First Look at Paragon's Proliferating Spyware Options". Graphic has a yellow background, and a grayscale hand reaching through great message bubbles.

The lack of decisive action at the European level in response to these findings is alarming. Efforts to initiate a plenary debate within the European Parliament have stalled due to insufficient political support, reflecting a broader pattern of inaction that threatens civic space and fundamental rights across the EU. This inertia is particularly concerning given parallel developments in France, Germany, and Austria, where legislative measures are being considered to legalise the use of surveillance technologies. In light of the European Parliament's PEGA Committee findings on Pegasus and equivalent spyware, it is imperative that EU institutions and Member States establish clear, rights-respecting policies governing the use of surveillance tools. The normalisation of intrusive surveillance without adequate safeguards poses a direct challenge to democratic principles and the protection of human rights within the EU.

Recommended read: Amnesty International, Serbia: Technical Briefing: Journalists targeted with Pegasus spyware

 💬 Online Expression & Civic Space

DSA Civil Society Coordination Group Publishes Analysis on DSA Risk Assessment Reports

Key elements of the Digital Services Act's (DSA) due diligence obligations for Very Large Online Platforms and Search Engines (VLOPs/VLOSEs) are the provisions on risk assessment and mitigation. Last November, VLOPs and VLOSEs published their first risk assessment reports, and the DSA Civil Society Coordination Group, convened and coordinated by CDT Europe, took the opportunity to assess them jointly. We identified both promising practices to adopt and critical gaps to address in order to improve future iterations of these reports and ensure meaningful DSA compliance.

Our analysis zooms in on key topics like online protection of minors, media pluralism, electoral integrity, and online gender-based violence. Importantly, we found that platforms have overwhelmingly focused on identifying and mitigating user-generated risks, and as a result focus less on risks stemming from the design of their services. In addition, platforms do not provide sufficient metrics and data to assess the effectiveness of the mitigation measures they employ. In our analysis, we describe what data and metrics future reports could reasonably include to achieve more meaningful transparency. 

Graphic with a blue background, with logo for the DSA Civil Society Coordination Group featuring members' logos. In black text, graphic reads, "Initial Analysis on the First Round of Risk Assessments Reports under the EU Digital Services Act".

CDT Europe’s David Klotsonis, lead author of the analysis, commented, “As the first attempt at DSA Risk Assessments, we didn’t expect perfection — but we did expect substance. Instead, these reports fall short as transparency tools, offering little new data on mitigation effectiveness or meaningful engagement with experts and affected communities. This is a chance for platforms to prove they take user safety seriously. To meet the DSA’s promise, they must provide real transparency and make civil society a key part of the risk assessment process. We are committed to providing constructive feedback and to fostering an ongoing dialogue.”

Recommended read: Tech Policy Press, A New Framework for Understanding Algorithmic Feeds and How to Fix Them 

⚖ Equity and Data

Code of Practice on General-Purpose AI Final Draft Falls Short

Following CDT Europe’s initial reaction to the release of the third Draft Code of Practice on General-Purpose AI (GPAI), we published a full analysis highlighting key concerns. One major issue is the Code’s narrow interpretation of the AI Act, which excludes fundamental rights risks from the list of selected risks that GPAI model providers must assess. Instead, assessing these risks is left as an option, and is only required if such risks are created by a model’s high-impact capabilities.

This approach stands in contrast to the growing international consensus, including the 2025 International AI Safety Report, which acknowledges the fundamental rights risks posed by GPAI. The Code also argues that existing legislation can better address these risks, but we push back on this claim. Laws like the General Data Protection Regulation, the Digital Services Act, and the Digital Markets Act lack the necessary tools to fully tackle these challenges.

Moreover, by making it optional to assess fundamental rights risks, the Code weakens some of its more promising provisions, such as requirements for external risk assessments and clear definitions of unacceptable risk tiers. 

In response to these concerns, we joined a coalition of civil society organisations in calling for a revised draft that explicitly includes fundamental rights risks in its risk taxonomy.

Global AI Standards Hub Summit 

At the inaugural global AI Standards Hub Summit, co-organised by the Alan Turing Institute, CDT Europe’s Laura Lazaro Cabrera spoke at a session exploring the role of fundamental rights in the development of international AI standards. Laura highlighted the importance of integrating sociotechnical expertise and meaningfully involving civil society actors to strengthen AI standards from a fundamental rights perspective. Laura emphasised the need to create dedicated spaces for civil society to participate in standards processes, tailored to the diversity of their contributions and resource limitations.  

Image featuring Programme Director for Equity and Data Laura Lazaro Cabrera speaking at a panel with three other panelists on the role of fundamental rights in standardisation, at the Global AI Standard Hub Summit

Recommended read: Tech Policy Press, Human Rights are Universal, Not Optional: Don’t Undermine the EU AI Act with a Faulty Code of Practice

🆕 Job Opportunities in Brussels: Join Our EU Team

We’re looking for two motivated individuals to join our Brussels office and support our mission to promote human rights in the digital age. 

The Operations & Finance Officer will play a key role in keeping our EU office running smoothly—managing budgets, coordinating logistics, and ensuring strong operational foundations for our advocacy work. 

We’re also seeking an EU Advocacy Intern to support our policy and advocacy efforts, with hands-on experience in research, event planning, and stakeholder engagement. 

Apply before 23 April 2025 by sending your cover letter and CV to hr@cdt.org. For more information, visit our website.

🗞 In the Press

⏫ Upcoming Event

Pall Mall Process Conference: On 3 and 4 April, our Director for Security and Surveillance Silvia Lorenzo Perez will participate in the annual Pall Mall Process Conference in Paris. 

CDT Joins Letter to Fix the TAKE IT DOWN Act to Protect Encryption https://cdt.org/insights/cdt-joins-letter-to-fix-the-take-it-down-act-to-protect-encryption/ Tue, 01 Apr 2025 18:49:43 +0000 https://cdt.org/?post_type=insight&p=108109 (Last updated on April 4, 2025) Today, CDT joined more than 30 other civil society organizations and cybersecurity experts in submitting a letter to the House of Representatives Committee on Energy & Commerce, urging the Committee to amend the TAKE IT DOWN Act to protect encryption.  As the letter points out, the TAKE IT DOWN […]

The post CDT Joins Letter to Fix the TAKE IT DOWN Act to Protect Encryption appeared first on Center for Democracy and Technology.

(Last updated on April 4, 2025)

Today, CDT joined more than 30 other civil society organizations and cybersecurity experts in submitting a letter to the House of Representatives Committee on Energy & Commerce, urging the Committee to amend the TAKE IT DOWN Act to protect encryption. 

As the letter points out, the TAKE IT DOWN Act aims to combat the distribution of nonconsensual intimate imagery, which is a laudable goal, but its overbroad requirements for content removal could force encrypted service providers to break encryption or implement invasive monitoring technologies, endangering cybersecurity and free speech. 

The letter calls on the members of the U.S. House of Representatives Committee on Energy and Commerce to amend the TAKE IT DOWN Act by adding encrypted services to the list of services that are already excluded from obligations under the Act.

The letter builds on CDT’s advocacy urging modifications to the bill to address its risks to users’ online speech and privacy rights.

Read the full letter here.
