State Privacy Resources Archives - Center for Democracy and Technology
https://cdt.org/area-of-focus/privacy-data/state-privacy-resources/

CDT Endorses State Data Privacy Act Model Bill from EPIC & Consumer Reports
https://cdt.org/insights/cdt-endorses-state-data-privacy-act-model-bill-from-epic-consumer-reports/ (May 5, 2025)

The Center for Democracy & Technology (CDT) recently endorsed an updated model state comprehensive privacy bill from EPIC and Consumer Reports that emphasizes data minimization, civil rights protections, and a private right of action.

CDT endorsed the compromise model bill because it properly focuses on strong data minimization and civil rights protections, and includes a private right of action. States that enact this model would genuinely protect their residents' privacy.

Read the full model bill.

CDT, Civil Society Oppose Inadequate West Virginia Privacy Bill
https://cdt.org/insights/cdt-civil-society-oppose-inadequate-west-virginia-privacy-bill/ (March 19, 2025)

This week, CDT joined a civil society letter opposing an industry-friendly privacy bill in West Virginia. The letter states the bill does not provide adequate protections for consumers, particularly around data minimization, targeted advertising, and enforcement.

From the letter:

The American Civil Liberties Union of West Virginia, Electronic Privacy Information Center (EPIC), Consumer Reports, Consumer Federation of America, and the Center for Democracy & Technology write in respectful opposition to H.B. 2987, relating to the Consumer Data Protection Act. The Consumer Data Protection Act seeks to provide to West Virginia consumers the right to know the information companies have collected about them, the right to access, correct, and delete that information, as well as the right to stop the disclosure of certain information to third parties. However, in its current form, it would do little to protect West Virginia consumers’ personal information, or to rein in major tech companies like Google and Facebook. The bill needs to be substantially improved before it is enacted; otherwise, it would risk locking in industry-friendly provisions that avoid actual reform.

Consumers currently possess very limited power to protect their personal information in the digital economy, while online businesses operate with virtually no limitations as to how they collect and process that information (so long as they note their behavior somewhere in their privacy policies). As a result, consumers’ every move is constantly tracked and often combined with offline activities to provide detailed insights into their most personal characteristics, including health conditions, political affiliations, religious beliefs, and even their precise geolocation. This information is sold as a matter of course, is used to deliver targeted advertising, facilitates differential pricing, and enables opaque algorithmic scoring.

At the same time, spending time online has become integral to modern life, with many individuals required to sign up for accounts with tech companies because of school, work, or simply out of a desire to connect with distant family and friends. Consumers are offered the illusory “choice” to consent to company data processing activities, but in reality this is an all-or-nothing decision; if you do not approve of any one of a company’s practices, your only choices are to either forgo the service altogether or acquiesce completely.

The full letter is available to read here.

States are Letting Us Down on Privacy
https://cdt.org/insights/states-are-letting-us-down-on-privacy/ (January 28, 2024)

On this International Privacy Day, It's Time to Acknowledge That States Are Enacting Industry-Friendly Privacy Legislation

States have been on a tear passing privacy legislation over the past several years, motivated at least in part by the lack of federal privacy protections. At least 13 states now have laws in place that provide some protections for online data. That’s good — states should be active in protecting privacy. States are the laboratories of democracy, and when we can, we should let them experiment with different policy approaches. States are adopting some beneficial provisions in their privacy bills, like data broker registries and user rights to access, correct, and delete data.

Generally, however, each state has simply adopted a minor variation of earlier state laws, and those laws so far lack one of the most meaningful privacy protections: data minimization. Data minimization ensures that companies collect only the data necessary to provide the product or service an individual requested. The strictest definition would disallow collecting any data beyond what is absolutely essential. More workable definitions also allow collection for other specified, allowable purposes, such as authenticating users, protecting against spam, performing system maintenance, or complying with legal obligations.
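
To make the concept concrete, here is a minimal sketch, not drawn from any statute, of what purpose-based minimization means in practice: collection is gated on an enumerated allowlist of purposes rather than on whatever a privacy policy happens to recite. The purpose names below are hypothetical.

    // Minimal sketch of purpose-gated data collection. The purpose names
    // are hypothetical, loosely mirroring the allowable purposes described
    // above; real statutes enumerate these precisely.
    const ALLOWED_PURPOSES = new Set([
      "provide_requested_service",
      "authenticate_user",
      "protect_against_spam",
      "system_maintenance",
      "legal_compliance",
    ]);

    function mayCollect(purpose: string): boolean {
      // Under minimization, collection turns on an enumerated purpose;
      // "it was disclosed in the privacy policy" is not on the list.
      return ALLOWED_PURPOSES.has(purpose);
    }

    // mayCollect("targeted_advertising") === false
    // mayCollect("authenticate_user")    === true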

Data minimization requirements place the privacy-protecting burden primarily on the companies that collect and exploit data, rather than on the already overburdened consumer. U.S. privacy law has developed largely through the Federal Trade Commission's authority to prevent "deceptive" practices, which has mostly yielded protections against misleading people. For years, however, there has been broad agreement that notice-and-consent has failed, in large part because people do not read or understand laborious, labyrinthine privacy policies.

Narrowing the categories of data that companies can collect matters because simply collecting and hoarding massive amounts of data produces a range of privacy harms: the company becomes a larger target for hackers and unauthorized access; breaches lead to downstream harms like identity theft; and data is put to unknown or secretive subsequent uses, such as sale to third parties that compile detailed individual profiles and use that data (particularly sensitive data) for targeted advertising, among other harms.

Reducing the data collected also protects against another significant harm: law enforcement access. Any data a company can access, law enforcement can also obtain. The Supreme Court's decision in Dobbs v. Jackson Women's Health Organization raised the salience of this concern, as people realized that any data that could identify whether a person sought or received an abortion (location data and communications data, among many others) could be accessed by law enforcement.

States have been letting us down on data minimization. The concept has been co-opted at the state level to mean something closer to this: companies cannot collect data for purposes they have not disclosed to the consumer. For instance, Section 6(a)(1)-(2) of Connecticut's privacy law states that a company shall "limit the collection of personal data to what is adequate, relevant[,] and reasonably necessary in relation to the purposes for which such data is processed, as disclosed to the consumer," and "not process personal data for purposes that are neither reasonably necessary to, nor compatible with, the disclosed purposes for which such personal data is processed, as disclosed to the consumer, unless the [company] obtains the consumer's consent." Connecticut's minimization requirement is not effective because it allows companies to continue collecting data for essentially any purpose stated in a privacy policy. That is already the law under the FTC Act's deception standard, and, as noted above, most consumers do not read privacy policies. Virginia's (Section 59.1-578(A)(1)) and Texas's (Section 541.101(a)(1), (b)(1)) privacy laws use nearly identical language, as does New Jersey's law (Section 9(a)(1)-(2)), which passed earlier this month. While some states require opt-in consent for processing sensitive data, those provisions are also insufficient because states often define sensitive data narrowly (see Virginia's definition, which is limited to children's data, demographic data, location, and biometrics).

The California Consumer Privacy Act has seemingly similar language, in Section 1798.100(c): a company's collection must be "reasonably necessary and proportionate to achieve the purposes for which the personal information was collected or processed, or for another disclosed purpose that is compatible with the context in which the personal information was collected, and not further processed in a manner that is incompatible with those purposes." However, the California Privacy Protection Agency subsequently adopted rules stating (Section 7002(d)(1)) that companies should collect the minimum information necessary to achieve the identified purpose. For instance, to mail a product and send an email confirmation, the only information needed is a physical address and an email address. Companies must also take into account (Section 7002(d)(2)) the potential negative impacts of collecting data, including that precise geolocation data may reveal sensitive information, such as visits to sensitive locations like health care providers. Colorado's privacy regulations, in rule 6.07(A), include a similar requirement that companies "determine the minimum [p]ersonal [d]ata that is necessary, adequate, and relevant for the express purpose."

Despite California's more detailed rules, most states have enacted language similar to the Connecticut law, which ultimately has little impact on company data practices—it is merely a continuation of the failed notice-and-consent regime. The language not only allows, but bakes into state law and policy, the privacy status quo that so many people disfavor. It also places essentially no burden on companies to curtail their data practices. In most of these states with "comprehensive privacy" laws, if a company wants to build profiles of all its customers, or sell all the data it collects to third parties to increase revenues, or hoover up every data point it can to train its internal artificial intelligence systems, all it has to do is state that purpose in a privacy policy.

People should not be satisfied with these laws. At the federal level, significant resources and discussion went into finding a reasonable approach to data minimization in the American Data Privacy and Protection Act (ADPPA). The bipartisan legislation included a strong minimization requirement: companies could collect sensitive data (broadly defined, unlike in most state laws, to include health, communications, contacts, financial information, biometric data, and other types of data) only to the extent strictly necessary to provide the product or service, or for one of several other specified allowable purposes. This requirement would have placed significant privacy obligations on companies themselves, forcing them to justify their data collection rather than continuing to place the burden on the shoulders of individuals.

Some states have pending, or have previously introduced, legislation that would provide similarly strong minimization requirements. Massachusetts is considering its own state-level ADPPA, the Massachusetts Data Privacy Protection Act, which CDT supports. Maine legislators have proposed a similar bill.

Luckily, existing state laws can be fixed. Laws can be amended and updated to reflect current practices and technologies. States that have already adopted privacy legislation should update those laws to provide stronger privacy protections by changing their minimization language to more closely reflect ADPPA, or the Massachusetts and Maine bills. States that have not yet passed legislation should look toward ensuring that any future potential privacy law includes similar language. If they don’t, states will only continue letting us down on privacy.

CDT Comments on Global Privacy Control and Colorado's Universal Opt-Out Mechanisms
https://cdt.org/insights/cdt-comments-on-global-privacy-control-and-colorados-universal-opt-out-mechanisms/ (December 14, 2023)

This week, CDT filed comments with the Colorado Attorney General regarding Colorado’s recognition of universal opt-out mechanisms as described by the Colorado Privacy Act. The 2021 law grants Colorado consumers important new rights with respect to their personal data, including the right to access, delete, and correct their personal data as well as the right to opt out of the sale of their personal data or its use for targeted advertising or certain kinds of profiling. 

However, it is impossible for consumers to keep track of every piece of software and every company they interact with on the web. To enjoy their rights under the CPA, Coloradans need privacy tools that help them automatically communicate their data privacy preferences to the websites they visit. Such opt-out control mechanisms should be standardized, to ease adoption by industry and to facilitate effective choices by consumers. The Global Privacy Control (GPC) is one such browser setting that allows a user to communicate their preference – and to exercise their legal rights – to opt out of sharing and sale of their personal information.
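
For a sense of how the signal works mechanically, here is a minimal sketch based on the GPC proposal: participating browsers send a Sec-GPC: 1 request header and expose a navigator.globalPrivacyControl property to scripts. The request shape below is illustrative, not any particular framework's API.

    // Minimal sketch of detecting the GPC signal, per the GPC proposal
    // (Sec-GPC request header; navigator.globalPrivacyControl property).
    // The IncomingRequest shape is illustrative, not a real framework API.
    interface IncomingRequest {
      headers: Record<string, string | undefined>;
    }

    // Server side: the proposal defines "1" as the only opt-out value.
    function gpcOptOut(req: IncomingRequest): boolean {
      return req.headers["sec-gpc"] === "1";
    }

    // Client side: feature-detect, since not all browsers ship GPC.
    declare const navigator: { globalPrivacyControl?: boolean };

    if (navigator.globalPrivacyControl === true) {
      // Treat the visitor as having opted out of sale/sharing, e.g.,
      // skip loading third-party ad and tracking scripts.
    }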

CDT strongly recommends that Colorado recognize GPC as a universal opt-out mechanism. We are encouraged to see incubation and standardization of the Global Privacy Control ongoing at the World Wide Web Consortium (W3C). W3C is an open, multistakeholder standard-setting body, and its standardization process enables vetting of the design, especially to ensure interoperability across browsers and websites. The Colorado Department of Law can aid this effort by collaborating with other regulatory bodies and providing guidance to the standardization process.

Read the full comments.

CDT Testifies to MA Legislature in Support of a Privacy Law Based on ADPPA
https://cdt.org/insights/cdt-testifies-to-ma-legislature-in-support-of-a-privacy-law-based-on-adppa/ (October 20, 2023)

On October 19, CDT Privacy & Data Co-Director Eric Null testified before the Massachusetts legislature in support of the Massachusetts Data Privacy Protection Act, which is modeled largely on the federal American Data Privacy and Protection Act (ADPPA), a bill CDT supported.

Massachusetts should pass a privacy law based on ADPPA because, first, ADPPA was extensively negotiated, received bipartisan support, and represented a reasonable compromise. Second, it was supported by both civil society and industry, unlike many state privacy bills, which are often written by industry, continue to permit intrusive data practices, and rely on the failed notice-and-consent model.

Last, passing comprehensive privacy legislation is a necessary component of AI governance. Without protections such as data minimization, civil rights protections, and algorithmic assessments, AI could turn into yet another privacy problem that is difficult to unwind. Passing the Massachusetts bill would help avoid those problems.

Read his full testimony.

CDT and Rights Groups File Amicus Brief in Texas Online Age Verification Case
https://cdt.org/insights/cdt-and-rights-groups-file-amicus-brief-in-texas-online-age-verification-case/ (September 28, 2023)

The Center for Democracy & Technology (CDT) joined the ACLU, EFF, FIRE, Media Coalition, and Tech Freedom in an amicus brief urging the Fifth Circuit to affirm the preliminary injunction against Texas’s online age verification law. HB 1181 would require websites that host adult content to conduct age verification of every visitor to their sites and to display several “Texas Health and Human Services Warnings” asserting the dangers of pornography to adults and children.

As we argue in the brief, this law plainly violates the First Amendment. The provisions of the law requiring website operators and other online services to display the State of Texas’s derogatory opinions about lawful speech are a clear case of an unconstitutional effort by the government to compel the speech of private actors.

The age verification requirement of the law would also require online services to demand sensitive information from adult users before allowing them to access lawful, constitutionally protected speech. Concerns about privacy and security, an inability to produce requisite ID credentials, or errors by age verification tools could all chill adults from accessing online content that is fully protected by the First Amendment.

These constitutional infirmities with age verification mandates have been recognized since the Supreme Court's Reno v. ACLU decision in 1997, and have been reaffirmed by multiple courts in the years since. The Fifth Circuit should affirm the preliminary injunction put in place by the district court, and the law should ultimately be struck down.

Read the full brief here.

Interpreting California's Reproductive Health Information Shield Law: The 'In California' Limitation
https://cdt.org/insights/interpreting-californias-reproductive-health-information-shield-law-the-in-california-limitation/ (September 6, 2023)

By Graham Streich, Legal Intern at CDT’s Security & Surveillance Project

Last year, California enacted a unique shield law for reproductive rights, AB 1242, stipulating that California providers of electronic communications services “shall not, in California, provide records, information, facilities, or assistance” (emphasis added) in response to legal process issued by law enforcement in another state in connection with an investigation of an abortion which, if performed in California, would be lawful under California law. 

California’s innovative abortion data privacy protections will almost certainly be the subject of litigation in the future, either in California or in a state with abortion bans from which law enforcement issues such legal process.

One issue that may arise in such litigation is what it means for companies to provide records, information, facilities, or assistance "in California." The "in California" requirement may have been included to make the law more likely to survive constitutional scrutiny by requiring a clear nexus between California and what companies withhold from out-of-state warrants ("records, information, facilities, or assistance"), in addition to the fact that the company is incorporated or headquartered in California.

The prohibition on the provision of "assistance" in California seems relatively clear: if the company personnel who access user data in response to a law enforcement request (along with the legal staff and other personnel who oversee such responses and turn over data to law enforcement) are located in California, then the company would be providing assistance "in California" when responding. Such assistance would be prohibited in the case of an out-of-state request in connection with an investigation of an abortion that would be lawful in California. Companies headquartered in California will often locate the relevant personnel there as a matter of course.

In some cases, however, the relevant personnel may be located outside California (e.g., personnel working remotely in this post-pandemic age). Nonetheless, by its terms, the shield law still applies if the relevant "records, information, [or] facilities" are provided in California. Under basic canons of statutory construction, those words must be given meaning and not be rendered superfluous. That language could mean that AB 1242 protections apply when responsive records and information are stored on servers physically located "in California" and the act of disclosure occurs there. In somewhat analogous circumstances, in Microsoft v. United States—a case involving government demands for data that Microsoft stored in Ireland but could access in the United States—the Second Circuit found the relevant location was where Microsoft stored the data.

In contrast, a federal district court in Wisconsin described the data's location as intangible and found the relevant location to be where the provider disclosed the data to the government. A California district court maintained that the location of electronic data was so prone to fluctuation that it might not be effectively tied to one jurisdiction, and similarly focused on the location of the disclosure to the government. These differences were never resolved, as the Supreme Court vacated the Second Circuit's decision in the Microsoft-Ireland case as moot once Congress passed the CLOUD Act in 2018.

These unresolved issues could limit how broadly AB 1242 protects users if the company personnel providing assistance are located outside of California. Even if user data is stored in California, a state demanding data for an abortion investigation could argue (following the Wisconsin court) that the data's location is intangible, or (following the California court) that data moves so rapidly and unpredictably that its location should not be a factor for courts, and that the focus should instead be on where the data is provided to law enforcement. Such arguments might have the effect of rendering the statutory reference to records or information superfluous, but if they were to prevail, the location of data in California might not be sufficient to trigger AB 1242.

Assistance "in California" may also be found in circumstances other than those in which the act of disclosure to law enforcement occurs in California. For example, if the data is stored in California and the company copies it there and moves the copy to another state where the act of disclosure occurs, the act of copying the data in California would arguably be regarded as "assistance" provided in California. Likewise, other compliance-related activities in California that precede the act of disclosure but are undertaken to prepare for it may constitute "assistance" provided in California. For example, checking in California whether the company holds information responsive to a law enforcement demand may constitute "assistance" in California, regardless of where the data is stored.

California-headquartered technology and telecommunications companies should locate the personnel who respond to law enforcement requests in California. They should also ensure that the activities they undertake to prepare disclosures to law enforcement occur in California. That should assure users that their data receives AB 1242 protections, because "assistance" to law enforcement would be provided in California. Ideally, companies should also store the relevant user data in California, which could provide an alternative avenue for the statutory protections to apply.

The California legislature should also evaluate whether the "in California" requirement is necessary. Washington has a shield law that, while similar in key respects to California's, makes no distinction based on where the aid is provided: its protections apply even if data is stored and assistance is provided out of state, so long as the company is incorporated or based in Washington. Users should advocate that the California legislature amend its law to eliminate the "in California" requirement for assistance and the location of data, and require only that the company in question be headquartered or based in California.

CDT Joins Free Speech Orgs in Urging Montana Legislature to Reject TikTok Ban Bill
https://cdt.org/insights/cdt-joins-free-speech-orgs-in-urging-montana-legislature-to-reject-tiktok-ban-bill/ (April 12, 2023)

The Center for Democracy & Technology (CDT) joined a letter, alongside the ACLU and other free speech and civil liberties organizations, urging the Montana legislature to reject a bill that would ban TikTok in the state. The letter argues that the bill would unjustly cut Montanans off from a platform where they speak out and exchange ideas every day, and that it would set an alarming precedent for excessive government control over how Montanans use the internet.

Although there are legitimate concerns about the Chinese government gaining access to Americans' data through TikTok, banning the app violates the First Amendment: there is no public evidence that TikTok's purported harms rise to the high level the Constitution requires before an app could be banned, or that a ban would be the only way to address such harms if they did exist.

CDT’s Caitlin Vogus, in a press release announcing the letter:

“Banning TikTok in Montana would raise serious First Amendment concerns and is not the appropriate way to protect the privacy of user’s data or respond to content on the app that lawmakers disapprove of,” said Caitlin Vogus, deputy director of the Free Expression Project at the Center for Democracy & Technology. “We urge the Montana legislature not to take this dangerous step toward limiting Montanans’ ability to speak freely and receive information online.”

Read the full letter + list of signatories here.

CDT Comments to California Privacy Protection Agency on Automated Decision-making and Risk Assessments
https://cdt.org/insights/cdt-comments-to-california-privacy-protection-agency-on-automated-decision-making-and-risk-assessments/ (March 28, 2023)

The California Privacy Protection Agency initiated a rulemaking process in 2021 to implement the California Consumer Privacy Act (CCPA), as amended by the California Privacy Rights Act. During this process, the Center for Democracy & Technology (CDT) has urged the Agency to craft its regulations to effectively protect sensitive data, establish data minimization and use/purpose limitations, and provide transparency and mitigate discriminatory outcomes in automated decision-making.

Most recently, the Agency invited comments to inform how it will implement provisions of the CCPA that require covered entities that process personal information to perform annual cybersecurity audits, submit risk assessments to the Agency on a regular basis, and provide consumers with access to information about and the ability to opt out of automated decision-making. CDT submitted comments focusing on automated decision-making and risk assessments, describing:

  • Businesses and organizations’ automated decision-making practices throughout multiple sectors and resulting harms to consumers,
  • Gaps in existing civil rights and consumer protection laws that leave consumers unable to access information about, or opt out of, automated decision-making,
  • Access and opt-out rights and risk assessment requirements under U.S. law compared to the EU’s General Data Protection Regulation, and
  • Recommendations for the scope and content of risk assessments and of responses to access requests for automated decision-making.

CDT’s comments also explain access and opt-out rights and risk assessments as they pertain to employment in particular. As of January 1, 2023, data that employers collect about workers is no longer exempt from the law – as a result, data used for employment decisions can be subject to the same privacy protections afforded to all California consumers. Therefore, properly scoped implementation of the CCPA is vital for all Californians.

Read the full comments here.

CDT Comments Scrutinize NYC's Revised Rules That Leave Even More Workers Unprotected From Algorithmic Bias
https://cdt.org/insights/cdt-comments-scrutinize-nycs-revised-rules-that-leave-even-more-workers-unprotected-from-algorithmic-bias/ (January 26, 2023)

Last year, the New York City Department of Consumer and Worker Protection began a rulemaking process to implement NYC’s Local Law 144 that requires bias audits for the use of automated employment decision tools (AEDTs).

The Center for Democracy & Technology’s (CDT) earlier comments explained that the Department’s initial proposed rules narrowed the law’s protections – which were already rather limited to begin with – and recommended changes to encompass the full range of automated decision-making that can subject workers to employment discrimination.

The Department’s revisions to the proposed rules incorporate some of our recommendations, but these revisions are outweighed by unaddressed concerns and problematic changes that would restrict the law’s protections further still. CDT submitted new comments explaining that the revised proposed rules would:

  • Exempt even more automated decision-making tools from the law’s notice and auditing requirements;
  • Create new ambiguity about how bias audits should calculate impacts on race, ethnicity, and sex categories, and on intersections of those categories (the basic impact-ratio arithmetic those audits turn on is sketched after this list);
  • Enable employers to choose the type and extent of historical or test data used for bias audits, in ways that could misrepresent their AEDTs' impacts; and
  • Allow employers to avoid informing workers, before an AEDT is used, about how it will assess them, leaving workers unable to determine whether to seek accommodations.
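
For context, bias audits of this kind generally turn on impact ratios: each category's selection rate divided by the highest category's selection rate. The sketch below shows only that arithmetic, with hypothetical data; it does not capture everything NYC's rules require (such as the intersectional categories noted above).

    // Illustrative impact-ratio arithmetic for a bias audit. Data and
    // category names are hypothetical; NYC's rules impose further
    // requirements (e.g., intersectional categories) not captured here.
    type CategoryCounts = { selected: number; total: number };

    function impactRatios(
      counts: Record<string, CategoryCounts>
    ): Record<string, number> {
      // Selection rate per category: selected / total.
      const rates = Object.fromEntries(
        Object.entries(counts).map(([cat, c]) => [cat, c.selected / c.total])
      );
      // Impact ratio: each category's rate divided by the highest rate.
      const highest = Math.max(...Object.values(rates));
      return Object.fromEntries(
        Object.entries(rates).map(([cat, r]) => [cat, r / highest])
      );
    }

    // If 40% of group A and 20% of group B are selected, B's ratio is 0.5.
    console.log(impactRatios({
      A: { selected: 40, total: 100 },
      B: { selected: 20, total: 100 },
    }));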

Read the full comments here.
