U.S. Surveillance Archives - Center for Democracy and Technology
https://cdt.org/area-of-focus/government-surveillance/us-surveillance/

CDT Submits Statement Urging House Judiciary to Close Loopholes on Warrantless Government Surveillance
https://cdt.org/insights/cdt-submits-statement-urging-house-judiciary-to-close-loopholes-on-warrantless-government-surveillance/
Tue, 15 Apr 2025 21:05:00 +0000

On April 8th, the House Judiciary Subcommittee on Crime and Federal Government Surveillance held a hearing on warrantless surveillance. At the hearing, members and witnesses from across the political spectrum highlighted the dangers of warrantless surveillance and the actions Congress should take to place checks on government surveillance powers.

On April 15th, CDT submitted a statement for the record, highlighting three critical issues:

  • Closing the “Backdoor Search Loophole” that allows warrantless queries for Americans’ communications collected through Section 702 of FISA;
  • Closing the “Data Broker Loophole” to stop law enforcement from buying data that should require a warrant to obtain; and
  • Why restoring the independence of the Privacy and Civil Liberties Oversight Board (PCLOB) is key to oversight of government surveillance and preventing abuse.

(Additionally, a witness at the hearing – Gene Schaerr, General Counsel for the Project for Privacy & Surveillance Accountability – included as part of his written testimony a copy of the recently-released CDT and PPSA joint issue brief, entitled Debunking Myths on the National Security Impact of Warrants for U.S. Person Queries.)

Read the full statement for the record.

Automated Tools for Social Media Monitoring Irrevocably Chill Millions of Noncitizens’ Expression
https://cdt.org/insights/automated-tools-for-social-media-monitoring-irrevocably-chill-millions-of-noncitizens-expression/
Tue, 15 Apr 2025 20:17:08 +0000

Last week, USCIS announced plans to routinely screen applicants’ social media activity for alleged antisemitism when making immigration decisions in millions of cases, and said that it is scouring the social media accounts of foreign students for speech that it deems potential grounds to revoke their legal status. Simultaneously, the Department of State has started using AI to enforce its “Catch and Revoke” policy and weed out “pro-Hamas” views among visa-holders, particularly students who have protested against Israel’s war in Gaza.

This isn’t USCIS’s first foray into social media monitoring; the agency began collecting social media data in 2014. But it is the first time the government has used a previously obscure provision of immigration law to target a large group of noncitizens for removal based on political opinions and activism that the Secretary of State has determined could have “potentially serious adverse foreign policy consequences.” The current Administration’s broad definitions of the speech that could lead to visa revocation or application denial, and the questionable constitutionality of making immigration decisions based on viewpoint, raise concerns that will only be exacerbated by the use of flawed, error-prone social media monitoring technologies.

The American immigration system already subjects applicants to disproportionate invasions of privacy and surveillance, some applicants more than others. Under the current Administration, immigration enforcement has been particularly aggressive and gone beyond the bounds of previous enforcement efforts, with agents bringing deportation proceedings against applicants on valid visas on the basis of their legally protected speech, including authorship of op-eds, participation in protests, and, according to a genuine, since-deleted social media post by Immigration and Customs Enforcement (ICE), their ideas. Noncitizens have long been aware of the government’s surveillance of their speech and social media activity, which has deterred them from accessing essential services and speaking freely on a wide range of topics, including their experiences with immigration authorities, labor conditions in their workplaces, and even domestic violence.

What is happening now, however, is an unprecedented and calculated effort by the U.S. government to conduct surveillance of public speech and use the results to target for removal those who disagree with government policy. At the time of writing, over 1,000 student visas have been revoked according to the State Department, some for participation in First Amendment-protected activities. For example, one post-doctoral student at Georgetown reportedly had his visa revoked for posting in support of Palestine on social media, posts that a DHS spokesperson characterized as “spreading Hamas propaganda.” In a high-profile case from earlier this year, the former President of Costa Rica received an email from the U.S. government revoking his visa a few weeks after he criticized the government on social media, saying, “It has never been easy for a small country to disagree with the U.S. government, and even less so, when its president behaves like a Roman emperor, telling the rest of the world what to do.” All signs indicate that disagreement with this Administration’s viewpoints could lead to negative consequences for noncitizens seeking to enter or remain in this country in any capacity.

This expansion of ideological targeting is set against the backdrop of an immigration system already burdened by a Sisyphean backlog of applications and insufficient oversight of enforcement decisions, problems that are only growing in this political climate. Mistakes are routinely made, and they have devastating consequences. To the extent oversight bodies did exist, such as the Department of Homeland Security’s Office for Civil Rights and Civil Liberties, they have been shuttered or undermined, which will make it all the more difficult to identify and fix errors and failures to provide due process.

Applicants have little recourse to seek remedy or appeal mistakes when they are made, instead having to choose among cautious over-compliance in the form of silence, potential retaliation, or self-deportation to avoid it all. Increased social media surveillance of noncitizens against this backdrop will compound existing inequities within the system, and will almost certainly further chill noncitizens’ ability to speak and participate freely in society for fear of running afoul of the Administration.

And that’s all before accounting for the problems with the tools that the government will use to conduct this monitoring. The automated tools used for this type of social media surveillance are likely to be based on keyword filters and machine learning models, including large language models such as those that underlie chatbots such as ChatGPT. These tools are subject to various flaws and limitations that will exacerbate the deprivation of individuals’ fundamental rights to free expression and due process. This litany of problems with automated social media analysis is so pronounced that DHS opted against using such a system during the first Trump administration. DHS’s concerns about erroneous enforcement and deportations may have disappeared, but the risks from this technology have not.

First, models may be trained with a particular bias. Social media monitoring systems are generally trained on selected keywords and data easily found on the web, such as data scraped from Reddit, Wikipedia, and other largely open-access sources, which over-index on the views and perspectives of a few. Keywords may be added to the training corpus to fit the domain of use, such as offering examples of what constitutes “antisemitism” or threats to national security. Should the training data over-represent a particular set of views or designations of “foreign terrorists,” the model may disproportionately flag speech by some individuals over others. The Administration’s overly capacious definition of the term “antisemitic” may be weaponized during the training of these social media monitoring models, subjecting to greater scrutiny anyone who has engaged in speech with which the Administration disagrees on topics such as Israel-Palestine or campus protests related to military actions in Gaza, even where the speech is protected by the First Amendment.

Second, and relatedly, these tools struggle to parse context. While keyword filters and machine learning models may be able to identify the words or phrases they’ve been tasked to detect, they are unable to parse the context in which a term is used, including such essential human expressions as humor, sarcasm, irony, and reclaimed language. We’ve written previously about how the automated content analysis tools Facebook uses to enforce its Dangerous Organizations and Individuals policy erroneously flagged and took down all posts containing the word “shaheed” (“martyr” in Arabic), even when an individual was named Shaheed or the term was not being used in a way that glorified or approved of violence. Noncitizen journalists who cover protests or federal policy and post their articles on social media may also be flagged and surveilled simply for doing their jobs. People named Isis have long been caught up in the fray and flagged by these automated technologies. Posts by individuals citing the “soup nazi” episode of Seinfeld may likewise be swept up in this analysis, as the toy example below illustrates. Models’ inability to parse context will also limit their ability to conduct predictive analysis. Vendors procured by USCIS to conduct social media monitoring assert that they use AI to scan for “risky keywords” and identify persons of interest, but promises of predictive analysis likely rest on untested and discriminatory assumptions and burden the fundamental rights of all individuals swept up by these social media monitoring tools.
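To make that failure mode concrete, here is a deliberately simplified sketch of context-blind keyword matching. It is our own illustration, not any vendor’s actual system; the watchlist terms and example posts are hypothetical:

```python
import re

# Hypothetical watchlist; real systems use far larger term lists and ML scores.
FLAGGED_TERMS = {"shaheed", "isis", "nazi"}

def flag_post(text: str) -> list[str]:
    """Return watchlist terms appearing in a post, with no notion of context."""
    words = re.findall(r"[a-z']+", text.lower())
    return [word for word in words if word in FLAGGED_TERMS]

posts = [
    "My daughter Isis starts nursing school this fall!",      # a person's name
    "Rewatching the soup nazi episode of Seinfeld tonight.",  # a sitcom reference
    "New byline from Shaheed Ali, covering the protests.",    # a journalist's name
]

for post in posts:
    print(flag_post(post), "<-", post)
# Every one of these innocuous posts is flagged, because the filter
# matches strings, not meaning.
```

Swapping the keyword set for a machine learning classifier raises the sophistication but not the fundamental limitation: the system still scores surface features of text, not the speaker’s intent.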

Finally, the systems will be especially error-prone in multilingual settings. New multilingual language models purport to work better across more languages, yet are still trained primarily on English-language data, some machine-translated non-English data, and whatever other documents are readily available (often religious or governmental texts), all imperfect proxies for how individuals speak their languages online. Multilingual training data for models is likely to under-include terms frequently used by native speakers, including spoken regional dialects, slang, code-mixed terms, and “algospeak.” As a result, most models are unable to parse the more informal ways people speak online, leading to erroneous outcomes when models analyze non-English speech.

There have already been countless instances where U.S. immigration enforcement agencies have used digital translation technologies in problematic ways, denying individuals a fair process and even safety. For example, an automated translation tool caused an individual to be erroneously denied asylum because it mistranslated her account of seeking safety from parental abuse, rendering her abuser, “el jefe,” literally as her boss rather than her father. An individual from Brazil was detained for six months over a supposedly incomplete asylum application, because the translation tool ICE used rendered “Belo Horizonte” literally as “beautiful horizon” instead of identifying it as a city in which the applicant had lived. Another automated system used to conduct content analysis mistranslated “good morning” in Arabic as “attack them.” Widespread use of these error-prone systems to detect disfavored ideas will only exacerbate the discriminatory treatment of those who speak English as a second language.

Ultimately, the adoption of automated technologies to scan social media data will punish people for engaging in legal speech and result in more errors in an already flawed system. It will also chill the speech of millions of people in this country and abroad, impoverishing the global conversations that happen online. An applicant seeking to adjust their status or become a U.S. citizen, or even a U.S. citizen seeking to communicate with a noncitizen, will reasonably think twice before speaking freely or engaging in constitutionally-protected activities like protesting, simply because of the specter of social media surveillance. They already are.

When It Comes to Encryption, Back Doors Are Never Simple: Why UK Apple Users Won’t Have Encrypted Backups Anymore
https://cdt.org/insights/when-it-comes-to-encryption-back-doors-are-never-simple-why-uk-apple-users-wont-have-encrypted-backups-anymore/
Tue, 08 Apr 2025 20:29:20 +0000

Millions of Apple customers in the United Kingdom are losing access to an important end-to-end encryption tool protecting their personal data, after the company refused a reported UK government demand to build a back door into its system that would have allowed law enforcement to read personal data stored in the cloud. 

Advanced Data Protection, the service in question, allowed users to automatically store encrypted backups of files from their devices that not even Apple itself could access. And while there may be legitimate reasons for law enforcement to seek access to particular files, Apple correctly concluded that carving a new pathway through this wall of encryption would introduce a significant new vulnerability to Apple’s online storage system, one that would affect every user on the planet. 

Just as a new door into a home gives intruders an additional path inside, so too does a digital back door provide a new way for law enforcement, hackers, and unfriendly governments to access materials that are supposed to be protected. So, just like with a real door, digital engineers add a lock and key to keep things secure.

But anyone holding that key, whether a legitimate government actor, a repressive regime or a criminal hacker, can access users’ data for their own purposes. While Apple or another tech company would build protections into the system’s design, no company can guarantee that the keys would always remain safe from hackers or from government overreach. A key allowing access to so much data is a tremendously attractive target for bad actors, and if even a single hacker succeeds in accessing the key, all bets are off. Because of those risks, end-to-end encrypted backups rely on the principle that only the user has access to the keys.
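As a rough sketch of that principle, compare a backup encrypted only to a user-held key with one that is also encrypted to an escrowed “backdoor” key. This is our own minimal illustration using the Python cryptography package, not Apple’s actual design:

```python
from cryptography.fernet import Fernet  # pip install cryptography

secret = b"private photos, messages, notes"

# End-to-end model: only the user holds the key; the provider stores ciphertext.
user_key = Fernet.generate_key()
backup = Fernet(user_key).encrypt(secret)
# Without user_key, this blob is useless to the provider, hackers, or governments.

# Backdoor model: every backup is *also* encrypted to a single escrowed key.
escrow_key = Fernet.generate_key()  # held by the provider or a government
escrow_copy = Fernet(escrow_key).encrypt(secret)

# Whoever obtains escrow_key -- a court order, a rogue insider, or a thief who
# steals it once -- can decrypt every user's escrow copy:
print(Fernet(escrow_key).decrypt(escrow_copy))
```

The escrowed key is a single secret whose compromise exposes every user at once, which is exactly why it makes such an attractive target.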

Simply creating an additional access point would also introduce extra complexity to the cloud storage system, which in itself naturally creates opportunities for errors to creep in that hackers could exploit. Apple has noted that, in practice, systems with backdoors are unlikely to provide the privacy and security guarantees users expect and demand, and that the risks of cloud data breaches are significant and far-reaching. While there may be a law enforcement interest in accessing certain backed-up files, undermining encryption for everyone’s backed-up files risks enabling widespread criminal activity, from unlawful surveillance to access to people’s most intimate photos.

As an esteemed group of researchers noted about previous attempts to require back-door access to online systems, “The complexity of today’s Internet environment, with millions of apps and globally connected services, means that new law enforcement requirements are likely to introduce unanticipated, hard to detect security flaws.” Experts are clear that keys under doormats make us all less secure and will be widely abused.

And those are just the technical concerns.  Governments demanding access to those keys have different conceptions of the level of privacy their citizens should be allowed, and a back door built for a lawful purpose in one country could turn into a tool of repression in another. Likewise, regimes change, and if less-benevolent leaders take over, a surveillance system built for the “right” reasons could fall into the wrong hands.

While past proposed systems have differed in their technological details – and the apparent order to Apple in this case remains secret – one trend has been consistent. Whatever technologies are involved, from hidden access points, to stored (“escrowed”) access codes that allow decryption, to “ghost users” added into online conversations, systems for exceptional access inevitably get abused.

A case in point is the Athens Affair, in which the Greek government discovered that an unknown hacker had gained access to Vodafone’s “lawful intercept” system to spy on the phone calls of journalists and Greek politicians – including the nation’s then-president. More recently, the Salt Typhoon hackers gained unprecedented access to telecommunications systems in the US and other countries, including Internet and cellular telephone metadata and even audio recordings of conversations of presidential candidates, through the lawful access systems put in place to comply with statutory requirements.

Apple’s decision to cut off encrypted cloud storage in the UK is a dramatic move, but it’s also both principled and pragmatic. Complying with the UK government’s reported order would undermine the security of every Advanced Data Protection user around the world. When it comes to encryption, threats to security anywhere are threats to security everywhere. 

Debunking Myths on the National Security Impact of Warrants for U.S. Person Queries
https://cdt.org/insights/debunking-myths-on-the-national-security-impact-of-warrants-for-u-s-person-queries/
Mon, 07 Apr 2025 16:40:21 +0000

Co-authored with Gene Schaerr, General Counsel at the Project for Privacy & Surveillance Accountability

[ PDF version ]

Warrantless queries of Americans’ communications obtained via Section 702 of the Foreign Intelligence Surveillance Act (“FISA 702”) are antagonistic to the basic principle of the Fourth Amendment. Deliberately seeking to read Americans’ private communications – but without ever showing evidence of wrongdoing or obtaining independent approval from a judge – violates the Constitution, disrespects American values, and opens the door to abuse.

Opponents of FISA reform nonetheless oppose requiring a warrant for U.S. person queries by claiming that these queries provide enormous value that a warrant requirement would disrupt. These claims are false – in reality, the proposed warrant rule has been carefully designed to account for the limited value that such queries provide.

MYTH #1: U.S. person queries are immensely important in a broad array of situations, making it dangerous to place restrictions on this important tool.

REALITY: Queries only provide value in a limited set of situations, and the warrant rule proposed in 2024 during the 118th Congress provides exceptions to account for all of them.

Opponents of reform frame U.S. person queries as frequently valuable across a wide set of national security goals and investigations, but the 2023-2024 debate over FISA 702 proved this false: testimony from the Intelligence Community, the President’s Intelligence Advisory Board, and the Privacy and Civil Liberties Oversight Board (PCLOB) uncovered only a few distinct scenarios in which U.S. person queries provided value. And the proposed warrant rule includes exceptions that account for all of them.

Under the 2024 proposal, a warrant would not be required 1) when there is consent, 2) to track malware, or 3) for metadata queries:

  • Cyber Attacks: Queries were most useful in the cybersecurity context, helping the government detect warning signs of future attacks and trace attacks back to their sources. But queries focused on cyberthreat signatures are explicitly exempt. Much of the cybersecurity value of queries focused on network traffic patterns; this involves metadata rather than content, and metadata queries are also exempt from the warrant rule. Most importantly, any U.S. company or critical infrastructure entity targeted for a cyberattack can simply consent to a query.
  • Foreign Plots: Queries were also described as useful in detecting and responding to foreign assassination and kidnapping plots. But once again, the consent exception directly accounts for this need. A targeted American would obviously welcome such a query to enable government protection.
  • Foreign Recruitment: Defenders of the status quo cited limited cases in which queries helped the government discover suspicious foreign contacts, assisting the government in investigating whether the U.S. person was a foreign target or foreign agent. But because metadata queries are exempt, a warrant rule would not inhibit the government’s ability to identify these contacts. The government has never shown a single instance in which content queries were critical to advancing an investigation against a foreign agent. Besides, reading the private emails of an American under criminal investigation is exactly what warrants are required for.

MYTH #2: U.S. person queries need to be done quickly and efficiently, and a warrant rule would slow the process down in a manner that endangers Americans’ lives.

REALITY: The government has never shown queries provide time-sensitive responses, and the warrant rule’s exceptions account for such a scenario if it ever did emerge. 

A common argument against surveillance reform is the “ticking time bomb” hypothetical in which there simply isn’t time to abide by due process and obtain court approval. But the government has never shown a situation in which query results were needed so quickly that obtaining a warrant would be infeasible. 

  • If a time-sensitive emergency ever did occur, the warrant rule explicitly accounts for it by including an exception for exigent circumstances. Contrary to this argument’s framing, the government has indicated that query results are used primarily during the early stages of investigations, or with queries run on targeted victims – in which cases the consent exception makes a warrant unnecessary.

In short, the exigent circumstances, consent, and metadata exceptions to the proposed warrant requirement almost certainly address any legitimate concerns about the government’s ability to respond to threats quickly.

MYTH #3: Warrants are not feasible given the scale of U.S. person queries conducted; adding this rule would overwhelm intelligence agencies and the courts.

REALITY: By permitting warrantless metadata queries, the warrant rule ensures the government will not need to go to court frequently.

In 2023, the most recent year for which data is available, the FBI conducted queries for over 57,000 unique U.S. person terms, reflecting unacceptable government overreach and fishing efforts. However, most of these queries do not produce responsive results. Because the proposed warrant requirement would apply only when the government sought to access a communication’s content, it would weed out impropriety without straining intelligence agencies or the courts. 

  • Only 1.58 percent of the FBI’s U.S. person queries resulted in personnel accessing content, according to the FBI. Thus, even if queries continued at the prior rate of 57,000 annually – an unlikely prospect, given that many of these queries were improper or broad fishing efforts – a warrant would potentially apply to fewer than 1,000 queries a year, or fewer than 3 per day on average (a quick sanity check of this arithmetic appears below). And because the proposed warrant rule would permit warrantless metadata queries (and only require court approval to access content), agencies would be able to confirm when a query will yield a “hit” before devoting any time and effort to seeking a warrant.
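The back-of-the-envelope math behind those figures, using only the numbers cited above (a sketch, not official statistics):

```python
annual_query_terms = 57_000    # unique U.S. person query terms (FBI, 2023)
content_access_rate = 0.0158   # share of queries where personnel accessed content

content_queries = annual_query_terms * content_access_rate
print(f"~{content_queries:.0f} content-access queries per year")  # ~901
print(f"~{content_queries / 365:.1f} per day on average")         # ~2.5
```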

And even as to these 2-3 queries per day, most would fall under one of the exceptions to the warrant requirement described above. The FBI usually wouldn’t need 2-3 warrants each day; more likely, it would need to obtain the consent of 2-3 entities to help prevent a future cyberattack or foreign plot. And if adding a warrant requirement at this limited scale proved too onerous for intelligence agencies or the courts, the solution would be to add personnel to cover that need, not to reject an important constitutional safeguard against abuse.

Americans’ basic rights should not be secondary to bureaucratic hurdles and staffing limits. The exceptions and exemptions built into the 2024 warrant proposal would allow the government to remain within the boundaries of the Constitution while also having the means to protect national security.

Read the full issue brief.

Weaponizing Immigrant Tax Data: How IRS-DHS Cooperation Would Undermine Tax Compliance, Increase Burdens, and Threaten Data Privacy
https://cdt.org/insights/weaponizing-immigrant-tax-data-how-irs-dhs-cooperation-would-undermine-tax-compliance-increase-burdens-and-threaten-data-privacy/
Mon, 24 Mar 2025 20:39:17 +0000

Recent reports indicate the Internal Revenue Service (IRS) is nearing an agreement to allow immigration officials to use tax data to confirm the names and addresses of people suspected of being in the country illegally. If approved, the agreement would mark a “complete betrayal of 30 years of the government telling immigrants to file their taxes,” and a sharp reversal from just a few weeks ago. Late last month, the Department of Homeland Security (DHS) requested that the IRS divulge the home addresses of 700,000 people it suspected of being in the country illegally so that immigration authorities could locate them more easily, potentially for deportation proceedings. According to the Washington Post report that broke the news, IRS leaders initially refused the request, but the newly installed acting IRS commissioner, Melanie Krause, was interested in exploring options to fulfill the request without violating federal tax privacy laws. Even if this disclosure could be accomplished in a manner consistent with federal law, doing so would have severe effects.

Immigrants, including undocumented immigrants, pay hundreds of billions of dollars in federal and state taxes each year, despite the fact that they do not receive many of the federal benefits that these taxes fund. Disclosing their tax records to DHS for immigration enforcement would discourage tax compliance among immigrant communities, thereby weakening contributions to essential public programs and increasing burdens for U.S. citizens and nonimmigrant taxpayers. Furthermore, it would set a dangerous precedent for data privacy abuse, undermining the federal tax system and other federal programs more broadly.

The IRS was established to collect revenue and administer tax laws, not to serve as an arm of immigration enforcement. Upholding this simple distinction is crucial to maintaining an effective tax system that benefits all Americans.

Chilling Effects on Immigrant Tax Compliance 

Disclosing IRS records to DHS would broadly discourage tax compliance. Undocumented immigrants may stop filing their taxes altogether in order to protect themselves and their family members from efforts to deport them based on information they provide to the IRS. Additionally, U.S. citizens and legal residents in mixed-status households may avoid claiming undocumented dependents, avoid claiming certain tax credits, or stop filing taxes altogether to protect their loved ones who are undocumented. 

The IRS allows undocumented immigrants, visitors to the U.S. who earn income, and other people who do not qualify for Social Security numbers (SSNs) to file returns using Individual Taxpayer Identification Numbers (ITINs). ITINs do not provide legal immigration status or work authorization; they merely enable people without SSNs to pay taxes. ITINs also do not confer the public benefits that U.S. citizen taxpayers receive, like Social Security or Medicare benefits, even though ITIN holders contribute billions of dollars to those programs through federal payroll taxes.

Nevertheless, millions of ITIN holders, including undocumented immigrants, pay taxes every year. In fact, many undocumented immigrants do so primarily to demonstrate financial responsibility for future discretionary immigration proceedings. But if undocumented immigrants fear their tax data could be used against them for deportation proceedings, many may stop filing taxes altogether. This would lead to a significant tax revenue loss, especially in states with large immigrant populations, and would place increased tax burdens on U.S. citizens, Lawful Permanent Residents (i.e., Green Card holders), and nonimmigrant taxpayers, such as temporary visa holders.

Cascading Effects on U.S. Citizens and Nonimmigrant Taxpayers

If undocumented immigrants and other people stop filing taxes due to fear of their data being disclosed for immigration enforcement, the consequences would extend far beyond tax compliance and lost revenue. 

Disclosure of IRS records could drive more economic activity into the informal sector, where wages are often lower, worker protections are weaker, and businesses lose out on legal labor contributions. This shift would occur as undocumented immigrants opt for unreported employment to keep their personal information from being collected and disclosed for immigration enforcement purposes, and it would undermine fair labor practices, put downward pressure on wages, and create additional obstacles for businesses seeking to comply with employment laws and tax regulations.

Nonimmigrants, such as foreign workers with temporary visas, could also face unwarranted scrutiny, errors, and delays in tax processing as their information gets swept up in response to requests from immigration authorities, even though they are lawfully present. Bureaucratic challenges to fix these errors could impose additional financial and legal burdens and further undermine tax compliance.

Legal Uncertainties and Dangerous Precedent for Data Privacy

Federal tax privacy laws, particularly Section 6103 of the Internal Revenue Code, strictly limit the disclosure of tax return information to protect taxpayer confidentiality. While there are narrow exceptions for certain law enforcement purposes, immigration enforcement is not among them, raising serious legal questions about whether such data-sharing would even be permissible. Any attempt to reinterpret these protections to justify disclosure will face legal challenges from advocacy groups, state governments, or even members of Congress, setting the stage for prolonged court battles over the scope of taxpayers’ privacy rights.

Furthermore, if the IRS grants DHS access to tax records for immigration enforcement, other agencies may also seek similar exceptions to use confidential taxpayer data for non-tax purposes. This would erode long-standing data privacy protections put into the tax code to promote tax compliance. 

Federal tax privacy laws offer some of the strictest privacy protections for data held by the government, so if DHS is able to bypass them, that would likely be only the first domino to fall. DHS almost certainly will seek immigrants’ data maintained by other federal benefit programs, too. DHS may be able to vitiate federal privacy laws protecting that data one-by-one, with cascading effects on U.S. citizens. For example, immigrant parents may not complete applications for federal student aid (“FAFSA forms”) for their U.S. citizen children, and immigrant mothers may not seek nutrition assistance through the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) for their U.S. citizen children, for fear that participation in such programs (and the information provided or withheld on applications to participate, such as the absence of a social security number on a form) will be used to identify them for immigration enforcement actions. In both examples, it is U.S. citizens who will suffer the loss of educational opportunity and adequate nutrition.   

***

Protecting tax privacy is not about shielding individuals from immigration enforcement — it is about maintaining the integrity of data protections that promote compliance with tax laws. Undermining tax privacy by permitting the use of tax records for immigration enforcement could begin the process of undermining privacy laws that are key to public confidence in the programs that those laws promote. 

Secrets, Secrets Are No Fun: the United Kingdom’s Secret War on Encryption
https://cdt.org/insights/secrets-secrets-are-no-fun-the-united-kingdoms-secret-war-on-encryption/
Wed, 19 Mar 2025 17:10:59 +0000

Late last week, a secret tribunal in the U.K. reportedly held a secret hearing on an appeal by U.S. tech giant Apple of a secret order Apple reportedly received from the U.K. to compromise its users’ privacy and cybersecurity worldwide.

The British government is attacking encryption, and the casualties could include the privacy and cybersecurity of millions worldwide. The U.S. should demand that the U.K. withdraw its order, or else terminate the U.K.’s unique access to the troves of user data it obtains from U.S. tech companies.

The U.K. Ambushes Encryption

Recent reports suggest that the British Home Office has secretly issued a Technical Capability Notice (TCN) to Apple under the Investigatory Powers Act (IPA) of 2016, commonly known as the “Snoopers’ Charter,” compelling the company to introduce a backdoor into its end-to-end encrypted cloud storage service, “Advanced Data Protection” (ADP). The Snoopers’ Charter, which has long concerned CDT, prohibits the recipient of a TCN from disclosing the existence or contents of the notice to anyone without the permission of the Secretary of State, so Apple can neither confirm nor deny the existence of the demand.

Assuming the reports are true, such backdoor access would allow British officials to require Apple to provide in decrypted form content that any user — not only in the U.K., but worldwide — has uploaded to the cloud using ADP. This type of order has no known precedent in major democracies — for good reason. 

Introducing backdoors into end-to-end encryption means introducing systemic security flaws, as the U.K. knows. Across the world, cybersecurity experts agree that there is no way to provide government access to end-to-end encrypted data without breaking end-to-end encryption. News of the U.K. order to Apple sparked global alarm. Backdoors into encryption jeopardize all users’ privacy and cybersecurity because criminals specifically look to exploit these vulnerabilities. Nevertheless, the U.K. has decided to ambush encryption with its notice. As President Trump put it: “That’s something, you know, you hear about with China.”

In the case of Apple, the world’s second largest provider of mobile devices, introducing backdoor access into its encrypted cloud service would mean putting millions of users at risk. To make matters worse, the most harmful impact would fall on those who rely on encryption because they are already most vulnerable, including domestic violence survivors, LGBTQ+ persons, and others. These risks must not be tolerated.

Apple Fights Back in the Shadows

Rather than capitulate to the U.K.’s demand, Apple made the principled decision to cease offering ADP in the U.K., and it has reportedly appealed the notice to the Investigatory Powers Tribunal (IPT), which has the authority to review complaints against U.K. intelligence services. British law requires Apple to comply with the notice even while its appeal is pending. As a result, British authorities may insist that Apple build a backdoor to ADP even though it no longer offers ADP in the U.K. Apple may challenge such a fully extraterritorial mandate as disproportionate under applicable law.

To make matters worse — again — the entire review process is also shrouded in secrecy. Similar to how the recipient of a TCN is prohibited from disclosing the existence or contents of the notice, the Investigatory Powers Tribunal proceedings can be kept secret. This means the U.K. Home Office can place Apple, or any other service provider, under a strict gag order when it issues a TCN. The chilling result: the public does not know if other encrypted services have received such notices and, if so, which of them complied with those notices, putting user data at risk. 

This blatant lack of transparency severely inhibits public discourse, making it impossible for stakeholders — including cybersecurity experts, civil rights organizations, and the general public — to understand the full implications and challenge the U.K.’s policy. Apple may or may not be the first recipient of a notice that requires undermining encryption, but it’s unlikely to be the last. In any case, policies that affect millions of users and global cybersecurity ought not be fought out in the shadows. 

Another CLOUD Looms in the U.S.

Despite the U.K. Home Office issuing the TCN under its own domestic law, the U.S. is not without means to respond. The US-UK CLOUD Act Agreement (Agreement) entered into effect under the authority of the U.S. CLOUD Act and gives the U.S. substantial leverage over the U.K. in surveillance matters. 

The CLOUD Act allows U.S. providers to disclose user data directly to foreign states under the laws of those foreign states, subject to certain conditions. Those conditions include limiting disclosures to cases involving serious crimes, preventing disclosure of information about Americans or anyone physically located in the U.S., and, most importantly, requiring that the U.S. has entered an executive agreement with the requesting state certifying that the state’s laws and practices meet certain human rights standards. Countries with CLOUD Act agreements with the U.S. can bypass the cumbersome process under mutual legal assistance treaties (MLATs), as well as the probable cause requirement for compelled disclosure of communications content that applies in the MLAT context. Most importantly for the U.K., they can engage in real-time wiretapping of the users of U.S. tech companies, which MLAT processes and U.S. law do not otherwise permit. All CLOUD Act agreements are reciprocal, so the U.S. should enjoy the same benefits as partner states.

So far, the U.S. has entered into only two CLOUD Act agreements: one with Australia, and one with the U.K., which entered into force on October 3, 2022. So what can be done?

Light Through the CLOUD

The CLOUD Act, and the US-UK CLOUD Act Agreement, present a significant opportunity for the U.S. to meaningfully pressure the U.K. to withdraw its demand to Apple. By law, the US-UK CLOUD Act Agreement expires after five years unless renewed, meaning it will lapse in October 2027 absent renewal.

The U.S. Department of Justice quietly recertified the US-UK CLOUD Act Agreement in November 2024, around the Thanksgiving congressional recess. The recertification report sent to Congress, which is required by the Act, provides several key insights about the U.K.’s conduct under the Agreement, not least that the U.K. issued more than 20,000 requests to U.S. service providers – almost all of which included wiretapping surveillance – while the U.S. issued a mere 63 to British providers. This dramatic imbalance owes to the geographic concentration of major service providers in the U.S., but it also demonstrates the overwhelming importance of the Agreement to the U.K. and its relative lack of importance to the U.S., and provides a powerful lever for the U.S. to wield. After all, the Trump Administration could, under the terms of the Agreement, unilaterally terminate it without cause and with only 30 days’ notice.

The recertification report subtly hints that the DOJ knew about the TCN issued to Apple, or about other attacks on encryption in the U.K. The report states that although new laws in the U.K., such as the Investigatory Powers (Amendment) Act of 2024 that expanded surveillance authority under the IPA, did not violate the requirements of the CLOUD Act (per the DOJ), the DOJ had nonetheless “taken the opportunity […] to remind the U.K. of the statute’s requirement that the terms of the Agreement shall not create any obligation that providers be capable of decrypting data or limitation that prevents providers from decrypting data.” At a minimum, the DOJ should also have “taken the opportunity” to warn Congress that the U.K. was preparing to use newly acquired powers under British law to undermine the security of encrypted data belonging to Americans and to people around the world.

The U.S. Seeks Answers

Congress has, in fact, taken steps to leverage the CLOUD Act and the US-UK CLOUD Act Agreement to seek answers from top U.S. and U.K. officials. In a letter to the Director of National Intelligence (DNI), Tulsi Gabbard, Senator Ron Wyden (D-OR) and Representative Andy Biggs (R-AZ) urged the U.S. to “[give] the U.K. an ultimatum: back down from this dangerous attack on U.S. cybersecurity, or face serious consequences.” The letter also asked DNI Gabbard to provide Congress with unclassified answers to critical questions, like whether the Trump Administration had any awareness of the TCN.

In her response, DNI Gabbard expressed that she shared a “grave concern about the serious implications of the United Kingdom, or any foreign country, requiring Apple or any company to create a ‘backdoor’ that would allow access to Americans’ personal encrypted data.” She further noted that such a TCN would be a “clear and egregious violation of Americans’ privacy and civil liberties, and open up a serious vulnerability for cyber exploitation by adversarial actors,” while committing to using her office to investigate the matter further.

Most recently, a bipartisan group of members of Congress also urged the IPT to open its hearing to the public, and former Secretary of Homeland Security Michael Chertoff said the U.K. should reconsider its move to break encryption.

These actions are the appropriate first steps, but the DOJ should also weigh in and urge the U.K. to reverse course, and Congress should modify the CLOUD Act itself to preclude agreements with states whose laws authorize orders to compel decryption by providers of end-to-end encrypted services. Such providers cannot decrypt data or communications without introducing serious security vulnerabilities and, as Apple did here, could effectively be compelled to cease offering such services, to the detriment of cybersecurity in the U.S. and abroad. In the meantime, if the U.K. refuses to withdraw the order, the U.S. should terminate the Agreement.

***

The U.K.’s secret war on encryption threatens global cybersecurity and sets a dangerous precedent for government overreach. With secret orders, secret appeals, and secret hearings, the U.K. is undermining public trust and digital safety from the shadows. The U.S. must continue to demand transparency and accountability. If the U.K. refuses to back down, Congress and the Trump administration should take decisive action to protect the security of Americans’ data. Encryption is not just a policy debate: it is a fundamental pillar of people’s privacy and security, and it must be protected.

EU Tech Policy Brief: January 2025
https://cdt.org/insights/eu-tech-policy-brief-january-2024/
Wed, 05 Feb 2025 00:45:21 +0000

Welcome back to the Centre for Democracy & Technology Europe‘s Tech Policy Brief, where we highlight some of the most pressing technology and internet policy issues under debate in Europe, the U.S., and internationally, and give CDT’s perspective on the impact to digital rights. To sign up for this newsletter, or CDT Europe’s AI newsletter, please visit our website.

📢 2025 Team Update 

CDT Europe’s team is back together! We’re thrilled to kick off the new year with the full team back in action. This January, we welcomed two new team members: Joanna Tricoli, who joins the Security, Surveillance and Human Rights Programme as a Policy and Research Officer, and Magdalena Maier, who joins the Equity and Data Programme as a Legal and Advocacy Officer. Plus, our Secretary General, Asha Allen, has returned to the office – we’re so glad to have her back!

Full CDT Europe team is pictured at CDT Europe’s office in Brussels.

👁 Security, Surveillance & Human Rights

PCLOB Dismissals Put EU-U.S. Data Transfers At Risk

On 27 January, the Trump Administration dismissed three Democratic members of the Privacy and Civil Liberties Oversight Board (PCLOB), an independent government entity that facilitates transparency and accountability in U.S. surveillance. The dismissals cost the body its quorum, preventing it from commencing investigations or issuing reports on intelligence community activities that may threaten civil liberties. It is unclear when replacements will be appointed and operations will resume, but based on past instances, the process is likely to take a long time.

The PCLOB plays a crucial role in protecting privacy rights and keeping intelligence agencies in check. It is also a key part of the EU-U.S. Data Privacy Framework (DPF), established in 2023 after years of negotiations following the Court of Justice of the EU’s invalidation of Privacy Shield. The DPF provides EU citizens with rights to access, correct, or delete their data, and offers redress mechanisms including independent dispute resolution and arbitration. Under the Framework, the PCLOB is responsible for overseeing and ensuring U.S. intelligence follows key privacy and procedural safeguards. As we pointed out in a Lawfare piece, weakening this Oversight Board raises serious concerns about DPF’s validity, since the EU now faces greater challenges in ensuring that the U.S. upholds its commitments — with the entire DPF and transatlantic data flows at risk. 

Venice Commission Asks for Strict Spyware Regulations   

In its long-awaited report released last December, the Venice Commission addressed growing concerns about spyware use and the existing legislative frameworks regulating the technology across Council of Europe Member States. The report is based on the Commission’s examination of whether those laws provide enough oversight to protect fundamental rights, and was prepared in response to a request from the Parliamentary Assembly of the Council of Europe following revelations about concerning uses of Pegasus spyware.

In the report, the Commission emphasised the need for clear and strict regulations due to spyware’s unprecedented intrusiveness, which can interfere with the most intimate aspects of our daily lives. To prevent misuse, it laid out clear guidelines for when and how governments can use such spyware surveillance tools, to ensure that privacy rights are respected and abuse is prevented.

Recommended read: The Guardian, WhatsApp says journalists and civil society members were targets of Israeli spyware 

💬 Online Expression & Civic Space

Civil Society Aligns Priorities on DSA Implementation

Last Wednesday, CDT Europe hosted the annual DSA Civil Society Coordination Group in-person meeting at its office, bringing together 36 participants from across Europe to strategise and plan for 2025 on topics including Digital Services Act (DSA) enforcement.

DSA Coordination Group Meeting, hosted by CDT Europe in Brussels.

The day began with a focused workshop by the Recommender Systems Task Force on the role of recommender systems in annual DSA Risk Assessment reports, which Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) must complete to assess and mitigate the systemic risks posed by their services. The workshop addressed key challenges in interpreting these reports, particularly in the absence of data to substantiate claims about how effective mitigations are. 

That session was followed by a broader workshop on DSA Risk Assessments. With the first round of Risk Assessment and Audit reports now published, constructive civil society feedback on those reports can help improve each iteration, pushing towards the ultimate goal of meaningful transparency that better protects consumers and society at large.

Transparency and Accountability Are Needed from Online Platforms

Recently, at a multistakeholder event on DSA Risk Assessments, CDT Europe’s David Klotsonis facilitated a session on recommender systems. With the first round of Risk Assessment reports widely considered unsatisfactory by civil society, much of the conversation focused on how to foster greater and more meaningful transparency through these assessments. Participants highlighted that, without data to underpin the risk assessments, robust and informed evaluation by the public is impossible. Even in the absence of such data, however, the discussion underscored that consistent and meaningful engagement with relevant stakeholders, including those from digital rights organisations in the EU, remains crucial. Civil society reflections are key to making these reports more useful, and to driving the transparency and accountability necessary for better platform safety.

Recommended read: Tech Policy Press, Free Speech Was Never the Goal of Tech Billionaires. Power Was.

⚖ Equity and Data

CDT Europe Responds to EC Questionnaire on Prohibited AI Practices

CDT Europe participated in the public stakeholder consultation on which practices the AI Act prohibits, to inform the European Commission’s development of guidelines for practically implementing those prohibitions (which will apply beginning 2 February 2025). In our response, we highlighted that the prohibitions — as set out in the final AI Act text — should be further clarified to cover all potential scenarios where fundamental rights may be impacted. We also argued that exceptions to these prohibitions must be interpreted narrowly. 

Second Draft of the General-Purpose AI Code of Practice Raises Concerns

In December, the European Commission published the second draft of the General-Purpose AI (GPAI) Code of Practice (CoP). Despite significant changes and some improvements, several aspects of the draft continue to raise concerns among civil society. The systemic risk taxonomy, a key part of the draft that sets out the risks GPAI model providers must assess and mitigate, remains substantially unchanged. 

In earlier feedback, CDT Europe suggested key amendments to bring the draft in line with fundamental rights, such as including the risk to privacy or the prevalence of non-consensual intimate imagery and child sexual abuse material. On a different front, organisations representing rights-holders have called for critical revisions to the draft to avoid eroding EU copyright standards, noting that the CoP in its current form fails to require strict compliance with existing EU laws. 

Our comments on the second draft’s systemic risk taxonomy and its approach to fundamental rights are available on our website. CDT Europe will continue to engage with the process; the next draft is expected to be released, and simultaneously made available for comment by CoP participants, on 17 February.

EDPB Opinion on Personal Data and AI Models: How Consequential Is It?

In an early January IAPP panel, our Equity & Data Programme Director Laura Lazaro Cabrera discussed the role of the latest EDPB opinion on AI models and the General Data Protection Regulation (GDPR) in closing a long-running debate: does the tokenisation process underlying AI models prevent data processing, in the traditional sense, from taking place? Ultimately, that line of reasoning would take AI models entirely outside of the GDPR’s scope.

Equity & Data Programme Director Laura Lazaro Cabrera speaking at IAPP’s online panel on the latest EDPB Opinion on Personal Data and AI Models.

The panel unpacked the opinion’s nuances, noting that it allows for situations where a model could be considered legally anonymous — and thereby outside the GDPR’s scope — even when personal data could be extracted, so long as the likelihood of doing so using “reasonable means” is “insignificant”. As the panel highlighted, the opinion is strictly based on the GDPR and does not refer to the AI Act, but it will inevitably inform how regulators approach data protection risks in the AI field. Those risks are currently under discussion in several AI Act implementation processes, such as those for the GPAI Code of Practice and the forthcoming template for reporting on a model’s training data.

Recommended read: POLITICO, The EU’s AI bans come with big loopholes for police

🦋 Bluesky

We are on Bluesky! As more users join the platform (including tech policy thought leaders), we’re finding more exciting content, and we want you to be part of the conversation. Be sure to follow us at @cdteu.bsky.social, and follow our team here. We also created a starter pack of 30+ EU tech journalists, so you can catch the latest digital news from the Brussels bubble.

⏫ Upcoming Events 

AI Summit: On 10 and 11 February, France will host the Artificial Intelligence Action Summit, gathering heads of state and government, leaders of international organisations, CEOs, academics, NGOs, artists, and members of civil society to discuss the development of AI technologies across the world and their implications for human rights. CDT President Alexandra Reeve Givens and CDT Europe’s Equity & Data Programme Director Laura Lazaro Cabrera will attend the conference. Laura will make the closing remarks at an official side event to the Summit hosted by Renaissance Numérique. Registration is open here.

RightsCon: Our Security, Surveillance and Human Rights Programme Director Silvia Lorenzo Perez will participate in a panel discussion on spyware at RightsCon 2025, taking place from 24 to 27 February in Taipei. Each year, RightsCon convenes business leaders, policymakers, government representatives, technology experts, academics, journalists, and human rights advocates from around the world to tackle pressing issues at the intersection of human rights and technology.


CDT Leads Coalition Letter Condemning PCLOB Firings (31 January 2025)
https://cdt.org/insights/cdt-leads-coalition-letter-condemning-pclob-firings/

The Center for Democracy & Technology (CDT) led 27 civil society organizations in issuing a letter to Congressional leadership condemning the firing of three members of the Privacy and Civil Liberties Oversight Board (PCLOB). The Board has been an integral oversight entity that has debunked false claims about mass surveillance programs, prompted declassification of key facts about surveillance secrets, and spurred important legislative reforms.

The Board’s independence has been crucial to this work, and is threatened by these unjustified firings. The letter urges Congress to “re-double its own oversight activities given the harm to PCLOB, as well as to restore the Board’s independence, shield it from interference, and strengthen PCLOB so it can again perform its vital work protecting Americans’ rights and guarding against improper surveillance.”

The letter is pasted in full below, and is also available in PDF form.


Dear Speaker Johnson, Minority Leader Jeffries, Majority Leader Thune, Minority Leader Schumer, Chairman Jordan, Ranking Member Raskin, Chairman Grassley, and Ranking Member Durbin,

We write to strongly condemn the White House firing of three Privacy and Civil Liberties Oversight Board (PCLOB) members, which shattered the independence that is key to the Board’s effectiveness. We urge Congress to act expeditiously to restore that independence.

PCLOB was originally proposed by the 9/11 Commission and has existed for nearly two decades as a critical oversight entity for protecting rights and combating surveillance abuse. Its investigations and reports have debunked false claims by the intelligence community about mass surveillance programs, prompted declassification and disclosure of key facts about surveillance that had needlessly been kept secret, and spurred important legislative reforms.

Firing PCLOB members will significantly undermine the Board’s independence, and could make it impossible for it to conduct this type of effective oversight in the future. If at-will termination becomes acceptable, a President of either party will be able to block investigation of controversial or improper surveillance activities by removing any PCLOB member who begins to scrutinize conduct that the executive wants to keep hidden. The White House could kill any reports or findings from PCLOB it does not want issued, firing Board members to halt the release of information the White House wants covered up. Even the mere threat of firings would chill PCLOB from properly performing its duties, with members seeking to stay in the good graces of the White House rather than acting as a vigilant watchdog. It is for precisely this reason that Congress in 2007 removed a provision of PCLOB’s statutory charter indicating that its members “serve at the pleasure of the President.”

The effort to destroy PCLOB’s independence, and thereby undermine its basic effectiveness as an oversight entity, raises serious concerns over how the executive’s surveillance powers could be misused by this or future administrations.

Additionally, PCLOB plays a crucial role in the EU-US Data Privacy Framework, which permits the flow of personal data from Europe to the U.S. that is essential to the functioning of the U.S. tech economy. PCLOB is charged with ensuring that surveillance guidelines adopted as part of the Framework are consistent with its requirements. Weakening the PCLOB throws this oversight into doubt and heightens the risk that the Framework will fail court review in Europe.

We urge Congress to re-double its own oversight activities given the harm to PCLOB, as well as to restore the Board’s independence, shield it from interference, and strengthen PCLOB so it can again perform its vital work protecting Americans’ rights and guarding against improper surveillance.

Sincerely,

Access Now

Advocacy for Principled Action in Government 

American Civil Liberties Union 

American Governance Institute

Asian Americans Advancing Justice | AAJC 

Brennan Center for Justice at NYU School of Law

Center for Democracy & Technology

Center for Digital Democracy 

Citizens for Responsibility and Ethics in Washington (CREW) 

Common Cause 

Defending Rights & Dissent 

Demand Progress

Due Process Institute 

Electronic Frontier Foundation

Electronic Privacy Information Center (EPIC) 

Fight for the Future 

Free Press Action 

Freedom of the Press Foundation 

Global Network Initiative 

Government Information Watch 

Media Alliance 

Muslim Advocates

New America’s Open Technology Institute 

Oakland Privacy 

Project On Government Oversight 

Surveillance Technology Oversight Project 

Wikimedia Foundation

Press Release: CDT Condemns Brazen Attempt to Gut Independent Government Surveillance Watchdog (22 January 2025)
https://cdt.org/insights/press-release-cdt-condemns-brazen-attempt-to-gut-independent-government-surveillance-watchdog/

(WASHINGTON) — Earlier today, the New York Times reported that President Trump threatened to terminate Democratic Members of the Privacy and Civil Liberties Oversight Board (PCLOB) if they do not resign by tomorrow, January 23. Terminating these members would leave the Board without the quorum necessary to commence investigations and issue reports that are crucial to protecting civil liberties in governmental anti-terrorism programs.

Center for Democracy & Technology (CDT) President Alexandra Reeve Givens responded to the news with the following statement:

“President Trump’s attempt to expel members of the Privacy and Civil Liberties Oversight Board is a brazen effort to destroy an independent watchdog that has protected Americans and exposed surveillance abuse under Democratic and Republican administrations alike.

“PCLOB was created specifically to provide oversight over the kinds of government actions where the need for secrecy makes people most vulnerable to abuses of power.

“This effort to shoot the watchdog should set off alarm bells for how the President and his appointees seek to wield the government’s broad surveillance powers. And it could torpedo trans-Atlantic trade and data sharing agreements that depended on the PCLOB’s assurance of oversight when they were brokered.

“Congress cannot ignore this threat to erode a key agency charged with protecting Americans’ fundamental freedoms. The Senate must focus on this in the forthcoming confirmation votes for intelligence community leaders and take bipartisan action to ensure the PCLOB can continue its vital work with the independence that is critical to its operations.”

The Privacy and Civil Liberties Oversight Board has existed for almost two decades as a bipartisan, independent oversight entity. It has been essential in informing the public about misuse of intelligence surveillance and in debunking false claims about it. One of its earliest reports debunked claims from the Intelligence Community that an expiring provision of the USA PATRIOT Act had thwarted dozens of terrorist attacks; that provision was later allowed to expire. Last year, PCLOB issued a comprehensive report on FISA Section 702 that brought abuses of the law to public light. PCLOB inquiries into counterterrorism programs resulted in the declassification of scores of facts that drove Congressional hearings and spurred legislative reforms.

The Board’s creation was originally recommended by the 9/11 Commission, and it was established as an independent entity in 2007 as part of the Implementing Recommendations of the 9/11 Commission Act, which passed with an overwhelming bipartisan vote. The PCLOB Members President Trump seeks to remove each received unanimous bipartisan support when appointed, with the Senate confirming them by voice vote. The last time the Board lost its quorum was in the context of a re-organization, and it took the President and Congress more than four years to restore the quorum so the Board could function again.

###

The Center for Democracy & Technology (CDT) is the leading nonpartisan, nonprofit organization fighting to advance civil rights and civil liberties in the digital age. We shape technology policy, governance, and design with a focus on equity and democratic values. Established in 1994, CDT has been a trusted advocate for digital rights since the earliest days of the internet. The organization is headquartered in Washington, D.C., and has a Europe Office in Brussels, Belgium.

The Secret Law Key That Could Unlock a Pandora’s Box of Uncurtailed Government Surveillance (19 August 2024)
https://cdt.org/insights/the-secret-law-key-that-could-unlock-a-pandoras-box-of-uncurtailed-government-surveillance/

To resolve the debate over which entities can be compelled to disclose user information under FISA 702, the Senate version of the Intelligence Authorization Act takes an unprecedented approach: make it a secret. 

Co-authored with CDT Intern Divya Vatsa

Congress recently expanded the types of entities that can be compelled to assist with surveillance under Section 702 of the Foreign Intelligence Surveillance Act (FISA 702), dramatically increasing the scope of potential surveillance. At the time, key legislators promised to revisit the types of entities subject to FISA 702 directives in order to rein the expansion back in. The Senate has now proposed to do so, but in a way that would keep secret the scope of entities subject to FISA 702. Although narrowing the types of entities that can be compelled to assist with this surveillance is a necessary and positive step, relying on secret law to do so is a highly problematic approach. Congress should address it by pressing the Intelligence Community to declassify information that would permit Congress to legislate in the light of day.

Background

In response to the Intelligence Community (IC)’s request for more authority to obtain communications without a warrant, Congress passed FISA 702 in 2008. It authorizes the government to compel electronic communication service providers (ECSPs) to disclose specific users’ communications content and metadata, whether stored or in transit, so long as the person or entity targeted for surveillance is a non-U.S. person reasonably believed to be abroad. Importantly, because foreign individuals communicate with Americans, any online communications of Americans to or from FISA 702 targets are also disclosed to the government under FISA 702 surveillance without a warrant. 

The balance FISA 702 seeks to strike between national security interests on the one hand and privacy rights on the other hinges in part on the type of entity deemed to be an ECSP. The definition of ECSP in FISA 702 effectively limits the scope of surveillance authorized under the statute: if a company’s service does not fit within the definition of ECSP, the company cannot be compelled to assist with FISA 702 surveillance on that service. Further, the definition alerts the public to which services are subject to this broad surveillance authority, and alerts companies offering those services that they may be directed to release a user’s communications.

The Evolution of ECSP’s Definition

ECSP was first defined in FISA 702 to cover companies that directly facilitate and can access communications, like Google, AT&T, and Meta. In April of this year, however, the Reforming Intelligence and Securing America Act of 2024 (RISAA) drastically broadened the definition to include any entity or person that has access to equipment on which communications are stored or transmitted. With limited exceptions for restaurants, hotels, dwellings, and community facilities, any business that provides WiFi could qualify as an ECSP under the new definition.

This change was made in response to losses the IC suffered in cases decided in 2022 by the Foreign Intelligence Surveillance Court (FISC) and in 2023 by the Foreign Intelligence Surveillance Court of Review (FISCR). Those courts ruled that a company which had received a FISA directive did not properly fall under the definition of ECSP, and therefore could not be compelled to release data to the government. The name of the company and the type of service it offered were redacted from the decisions for national security reasons. The IC insisted that RISAA include an amendment expanding the ECSP definition to cover this entity, while disclosing neither the entity’s name nor the nature of the services it provides.

Congress accommodated the IC with an exceedingly broad definition, which immediately set off alarm bells among experts and civil society. In response to the public’s reaction, Congress included the limited exceptions for restaurants, hotels, and the like. Even with that limitation, however, the ECSP definition could be applied to any business landlord, leading to continued criticism. To blunt these concerns and help quickly shepherd RISAA through the Senate, Senator Mark Warner (D-VA), Chairman of the Senate Select Committee on Intelligence (SSCI), promised to revisit the FISA 702 definition of ECSP during SSCI’s consideration of the annual Intelligence Authorization Act (IAA).

Narrowing RISAA’s Expansion of ECSP, But Hiding Its Scope

To his credit, Senator Warner has followed through on his promise. Under his leadership, the SSCI voted to amend the IAA so that the expansion of the ECSP definition adopted in RISAA newly covers only the type of entity that was the subject of the FISC and FISCR decisions. Because this narrows the scope of FISA 702 surveillance, it improves the law, and its substance should be included in any compromise the House and Senate reach on the IAA.

The problem is that the identity of the entity discussed in those decisions is classified. And instead of calling for declassification so that the entity’s name or the type of service it offered could be publicly described in the bill, the SSCI allowed it to remain secret: the bill simply defines ECSPs as including the type of entity at issue in the FISC and FISCR decisions, without providing any further clarity.

The Problems with Secret Law

Thus, while substantively narrowing the types of entities that can receive FISA 702 directives, this update effectively hides from the public which businesses will be responsible for responding to those directives, and thus the full breadth of FISA 702 surveillance. The danger of concealing the breadth of surveillance is especially acute in the FISA 702 context because of the lack of a warrant process: no judge or independent arbiter authorizes FISA 702 directives, and therefore, unlike with other intrusive surveillance conducted in the U.S., no court assesses in advance whether the entity on which a directive is served must comply with it based on the ECSP definition. Court review after a directive is served will also be stymied: a company cannot know whether it falls outside the ECSP definition, and thus whether it has a basis for challenging the directive. The Senate version of the IAA indicates that companies would receive a “summary” of the services at issue in the FISC and FISCR decisions, but the summary could be vague and generalized because it need not meet any standard. Companies will thus be hard pressed to challenge directives in court — a process that is already onerous, expensive, and conducted in a closed proceeding.

In general, legislators and advocates have cautioned against secret law because it lacks public transparency and accountability. For example, the trend of agencies seeking legal advice from the DOJ’s Office of Legal Counsel (OLC) through informal written advice, rather than through formal opinions that the public can obtain through Freedom of Information Act requests, has been criticized for undermining public trust in agency actions. In addition, severe public backlash against secret interpretations of laws such as the Patriot Act led to the USA Freedom Act, which outlawed the bulk domestic surveillance that the Patriot Act had been secretly interpreted to authorize, and to the requirement that significant FISC opinions be made public.

This type of secrecy regarding surveillance conducted within the U.S. is without precedent. Other surveillance statutes clearly describe the entities that can be compelled to assist with surveillance. Further, secrecy regarding the permissible scope of surveillance conducted in the U.S. of people abroad is very different from other secrecy that has been tolerated in U.S. statutes. For example, Congress appropriates funds annually in a “Black Budget” in which the dollar amounts of individual line items in the appropriations bill (but not the total amount) are kept secret.

The IAA definition of ECSPs creates secret law in an unprecedented manner, at both a general and a specific level: it brings secret language directly into statutory text, codifying secret criteria that define the entities subject to compelled disclosure and compulsory government action. Establishing such criteria without basic public understanding of their bounds is worrying for democratic accountability and the protection of constitutional rights. For proper oversight to occur, Congress and external civil society stakeholders need to be able to evaluate the law. And to hold lawmakers accountable, voters need the ability to see and respond to the laws their representatives are creating.

Allowing the scope of surveillance to be defined in secret also creates a dangerous precedent for the IC to seek other secret laws that expand surveillance in the future, further amplifying the risk of abuse. If secret rules for when surveillance may occur become standard practice in the name of national security, public trust in privacy protections will erode, chilling free expression. The need for transparency in laws is particularly acute in the national security context: because the facts and particular applications of surveillance are likely to be classified in most if not all cases, the only public accountability comes from knowing the applicable law that governs national security activities. If those laws too are secret, national security agencies and officials can too easily act in ways that citizens would not countenance.

Transparency Instead of Secret Law

Congress must reconsider the definition of ECSP to more appropriately balance national security interests with the need for accountability and with privacy rights and expectations. The most straightforward path would be for the IC to declassify the type of entity described in the FISC and FISCR opinions, as 20 civil society groups, including CDT, recommended last month. This should happen now, so that Congress can account for this information in the ECSP definition in the IAA, or in other legislation to which it is attached. Declassification would be consistent with the Principles of Transparency for the Intelligence Community, which commit the IC to “Provide appropriate transparency to enhance public understanding of … the laws, directives, authorities and policies that govern the IC’s activities.” Nor is there any reason to think that publicly describing the type of entity would damage national security – after all, each of the other types of ECSPs subject to 702 is set forth in public law.

The SSCI’s move to narrow the scope of FISA 702 surveillance by narrowing the definition of ECSP in RISAA was a step in the right direction. But doing so by creating secret law, and a powerful precedent for more secret law in the future, is a highly problematic way to proceed. FISA 702 sunsets in April 2026 unless Congress reauthorizes the program. Failure by the IC to declassify the types of entities subject to FISA 702 surveillance could threaten its efforts to secure that reauthorization.
