U.S. Privacy Legislation Archives - Center for Democracy and Technology
https://cdt.org/area-of-focus/privacy-data/u-s-privacy-legislation/

CDT Submits Comments to Representative Lori Trahan on Updating the Privacy Act of 1974
https://cdt.org/insights/cdt-submits-comments-to-representative-lori-trahan-on-updating-the-privacy-act-of-1974/
Wed, 30 Apr 2025 04:01:00 +0000

On April 30, the Center for Democracy & Technology (CDT) submitted comments to Representative Lori Trahan about reforming the Privacy Act of 1974 to address advances in technology and emerging threats to federal government data privacy. Our comments highlight potential privacy harms related to federal government data practices and provide an overview of CDT’s nearly two decades of advocacy on the Privacy Act.

We urge Congress to address gaps in the Privacy Act, including by:

  • Updating the definition of “system of records,” 
  • Limiting the “routine use” exemption, 
  • Expanding the Privacy Act to cover non-U.S. persons, and 
  • Strengthening privacy notices.

Read the full comments.

CDT Comment to House E&C Argues Federal Comprehensive Privacy Framework Should Protect Against Harms and Bolster Consumer Trust
https://cdt.org/insights/cdt-comment-to-house-ec-argues-federal-comprehensive-privacy-framework-should-protect-against-harms-and-bolster-consumer-trust/
Mon, 07 Apr 2025 18:14:13 +0000

CDT submitted these comments in response to the House Energy & Commerce Committee Privacy Working Group’s request for information (RFI) regarding a federal comprehensive privacy and security framework. We appreciate the opportunity to comment and the committee’s desire to gather more information from stakeholders about their privacy viewpoints and evidence of data practices and their harms. 

In our comments, we urge the Working Group to build upon the significant amount of work done in the past several years to achieve bipartisan consensus on key elements of a federal privacy framework. We highlight that any effective federal privacy law must place limits on companies’ collection, use, processing, and sharing of data to protect individuals from harm. These harms include fraud and other economic injury, discrimination, reputational harm, harassment, and government surveillance that circumvents the Fourth Amendment and other legal protections. Because many of these risks increasingly arise from the use of artificial intelligence (AI) such as in automated decision-making systems, an effective 21st century privacy framework should account for AI. It should also provide meaningful transparency and require companies to regularly examine their data practices to mitigate risks to consumers. Protecting privacy will bolster consumer trust in our increasingly data-centric economy and thereby enable greater innovation.

Read the full comments.

Press Release: CDT Urges House Energy & Commerce to Vote No on American Privacy Rights Act
https://cdt.org/insights/press-release-cdt-urges-house-energy-commerce-to-vote-no-on-american-privacy-rights-act/
Thu, 27 Jun 2024 22:07:00 +0000

UPDATE 6/27/24, in response to the news that the markup was cancelled, Eric Null— Co-Director of CDT’s Privacy & Data Project — issued the following statement:

“Today, the House Energy & Commerce Committee made the right call to cancel its markup of the updated American Privacy Rights Act, which lacked vital and necessary protections for civil rights and algorithmic transparency that were included in prior bill drafts and in the American Data Privacy and Protection Act from 2022. We are encouraged to see the Committee avoid establishing bad precedent by voting on a privacy bill without civil rights. We know that Chair McMorris Rodgers and Ranking Member Pallone remain committed, as do we, to passing strong privacy legislation, and we will continue to work with those offices to ensure we do just that.”

—————————————————————————————————————————

(WASHINGTON) — Tomorrow, the House Energy & Commerce Committee will host a markup of the American Privacy Rights Act (APRA). CDT opposes the new agreement and urges members to vote no. While the new draft of APRA contains moderate improvements in some areas, and parts of the bill would increase privacy protections, it removes essential provisions protecting civil rights and requiring baseline algorithmic assessments.

Alexandra Reeve Givens, president and CEO of CDT, issued the following statement:

“Civil rights protections are a vital and necessary aspect of any comprehensive privacy bill. The issue has been a top priority for civil society. CDT has urged that every iteration of APRA, and privacy bills before it, include and retain strong civil rights language, and similar civil rights language was voted out of this committee two years ago in a bipartisan vote of 53-2. This bill is a significant step backwards.

“It is disappointing and beyond frustrating that the most recent draft leaves critical civil rights protections on the cutting room floor and that civil society was not consulted in advance of that decision. Discriminating through data practices is one of the worst things companies can do with data, and one of the worst privacy-related harms experienced by society. Privacy legislation needs to clearly prevent discriminatory data practices, and APRA no longer does that. Members should oppose this version of the bill.”

CDT VP of Policy Samir Jain Testimony Before U.S. House Energy & Commerce Committee on “Legislative Solutions to Protect Kids Online and Ensure Americans’ Data Privacy Rights”
https://cdt.org/insights/cdt-vp-of-policy-samir-jain-testimony-before-u-s-house-energy-commerce-committee-on-legislative-solutions-to-protect-kids-online-and-ensure-americans-data-privacy-rights/
Tue, 16 Apr 2024 16:12:57 +0000

On Wednesday, April 17, 2024, CDT VP of Policy Samir Jain is testifying before the U.S. House of Representatives Energy & Commerce Committee in a hearing titled “Legislative Solutions to Protect Kids Online and Ensure Americans’ Data Privacy Rights.”

A summary of Jain’s testimony is pasted below, and you can read the full testimony here.

***

In his testimony, Jain explains that the bipartisan, bicameral American Privacy Rights Act (APRA) presents a renewed opportunity to finish the long-overdue job of passing a federal privacy law, offers a vehicle for providing further protections for kids, and lays a necessary foundation for addressing policy challenges around AI.

Jain illustrates how today’s data ecosystem victimizes Americans, why AI has accelerated the need for a federal privacy law, and why such a law is also a national security imperative. He also discusses elements of APRA that are essential to any effective privacy law, including data minimization requirements, civil rights protections, restrictions on data brokers, and meaningful provisions for enforcement at scale.

Jain further notes that APRA is not flawless. He calls for the law to, among other things: include additional privacy protections for children (while not passing laws that restrict access to content or require age verification, which may raise constitutional concerns); extend its scope to government service providers; be clearer in its language around advertising; and provide stricter requirements for data brokers.

Brief – Unintended Consequences: Consumer Privacy Legislation and Schools
https://cdt.org/insights/brief-unintended-consequences-consumer-privacy-legislation-and-schools/
Thu, 04 Apr 2024 21:54:11 +0000

[ PDF Version ]

The United States needs to enact comprehensive privacy legislation that limits the collection, use, and sharing of personal information to protect everyone, including children. Although such a bill has yet to be enacted at the federal level, state and federal legislators have proposed, and in some states enacted, legislation that limits the ways that companies can collect and use individuals’ data. Such legislation also often expands individuals’ rights to access and manage data about them held by companies. If not carefully crafted, however, privacy and child safety laws can inadvertently undermine the ability of schools and their vendors to carry out important educational functions.

Schools, and in turn the vendors they use (for services like managing student records and hosting educational content), have different data needs and uses than non-education private sector companies or non-profits. Quality data is required to support the core functions of schools, including class assignments, transportation, nutrition, and even school funding. School operations can be actively hamstrung by an ill-suited law. Policymakers can, however, create a coherent legal regime that protects everyone’s privacy and safety while ensuring seamless education operations.

Existing Data Laws for Children and Education

A complex legal regime already governs data in an education context, making it important to consider how new laws will interact with these existing frameworks. These authorities include the Family Educational Rights and Privacy Act (FERPA), the Children’s Online Privacy Protection Act (COPPA), the Individuals with Disabilities Education Act (IDEA), and a host of state student privacy laws. 

These laws provide specific protections for a wide range of student data and how schools and companies must handle that data. For instance, FERPA addresses schools’ handling of education records and personally identifiable information (PII) of students, affording specific rights to parents to inspect and correct student records, including information maintained by vendors and third parties acting on behalf of the school. IDEA addresses, among other things, special confidentiality concerns for students with disabilities and their families.

Federal education privacy laws like FERPA and IDEA create a floor for student privacy that can then be supplemented by additional state laws. Many states have enacted laws that impose additional obligations on education agencies, such as creating breach notification procedures and limiting the types of information that can be collected about a student. At least 128 state student privacy laws in effect today govern educational agencies and their vendors, providing an ever-widening range of additional protections to supplement federal student privacy laws.

Additionally, COPPA requires certain operators of websites and online services to obtain parental consent before collecting data about children under the age of 13. While not technically a student privacy law, COPPA can impact edtech companies. Although the Federal Trade Commission (FTC) has long been clear that COPPA does not impose obligations on schools, the law limits when a school can consent on behalf of a parent, requiring companies to obtain parents’ verifiable consent for any data collection that is not exclusively for educational purposes.

While these frameworks are incomplete and should be improved, those improvements should be made intentionally with an eye to supporting students and school communities. These benefits are unlikely to result from bills that are targeted to other sectors but inadvertently impact education.

Inadvertent Detrimental Effects of General Privacy and Child Safety Laws on Education

Although drafters of privacy and child safety laws that are targeted at the private sector or non-education nonprofits often seek to exempt the education sector, educational institutions may end up being inadvertently covered. This oversight can impact schools’ ability to provide education to their communities, whether by limiting their ability to support students, limiting their ability to obtain core data required to provide critical services, or forcing schools to spend resources complying with additional conflicting or confusing frameworks. This inadvertent coverage can happen in a number of ways:

  • Bills that do not account for vendors providing services to schools, such as a February 2022 version of the Kids Online Safety Act (KOSA 2022), can require vendors to adhere to different standards for data than the school itself (for example, a right to deletion that might obligate a company that holds an education agency’s data to comply with a deletion request that the education agency itself would have the discretion to decline). Such different standards can create inconsistencies in how student data is handled and limit a school’s ability to rely on its vendors to handle data as expected in an educational context. Additionally, bills without clear treatment of vendors may also create legal complexity and inconsistency for schools, as they are ultimately responsible for student data, even if it is held by vendors, which is untenable if vendors are expected to follow different regulations than the school.
  • Bills that do not account for private schools can leave those schools with a legal framework not designed for the broader educational context. As an example, private schools may still be impacted by a bill that tries to account for education contexts by exempting any data covered by or entities subject to FERPA, because FERPA’s scope is limited to schools that accept federal funding, leaving out most private K-12 schools. 
  • Occasionally, bills such as the Online Privacy Act do not differentiate between private sector actors like companies and public sector actors like schools, thereby requiring schools to abide by the same consumer frameworks as private companies, which can limit their ability to provide an effective education.

Legal frameworks that inadvertently cover schools or their vendors can negatively impact how schools deliver educational services. Some requirements can create legal challenges for schools, while some can more directly affect students’ educational experiences.

  • Data deletion: Many consumer data privacy laws, such as the proposed American Data Privacy and Protection Act (ADPPA), give consumers the right to request or require that a “covered entity” delete any data it holds about the consumer. That requirement makes sense when a consumer wants to delete, for instance, an advertising profile about themselves. It makes much less sense when a parent wants to delete their child’s disciplinary history from their education record (FERPA already provides the parent the right to correct the record if they feel it is wrong).

    Consequently, these laws must be carefully drafted to ensure that schools are able to maintain their records as necessary to perform their role of educating students. ADPPA protects consumers by outlining the data rights they have when data about them is held by “covered entities.” ADPPA, as introduced in Congress, takes care to exempt “governmental entities,” which would include schools, allowing them to maintain control of their records. However, an earlier discussion draft, which did not include this exemption, would have interfered with schools’ recordkeeping requirements. The updated version goes further than exempting schools themselves; it also exempts people and entities that manage data on behalf of governmental entities like schools. This is crucial in an education context, where schools rely heavily on edtech vendors in their technology ecosystems. Without this further exemption, a vendor could be required to comply with, for instance, a parent’s request to delete their child’s transcripts, thus undercutting the reliability of educational records.
  • Correction: Consumer laws sometimes give consumers the right to correct data about them. As mentioned above, FERPA protects this right as well, giving parents and students the ability to contest inaccuracies in students’ educational records. However, under FERPA, a correction request typically goes through the school, and schools are able to determine whether a correction is warranted. If a consumer law is not drafted to ensure such requests go to the school, but rather enables parents and students to go directly to vendors employed by the school, it could prevent the school from determining whether the correction is valid and, if so, ensuring that the correction is done appropriately and accurately. Although many bills require the requesting consumer to prove the record is incorrect, allowing parents to request a change directly with a vendor rather than through the school could create significant confusion, or potentially allow for students to change grades or otherwise alter their academic record without the school’s awareness or involvement.
  • Profiling: Some laws place restrictions on profiling users under a certain age (generally under 13), where profiling means using a user’s past actions or other information about the user to decide how to interact with or present information to that user in the future. Without appropriate carve-outs for schools, both public and private, these restrictions could apply to many students in K-12 schools. However, some systems used by schools generate profiles of students that schools use to inform their instructional and educational practices. For example, schools may analyze data to personalize student learning in a number of ways, including allowing for individualized project-based learning or personalizing student goals. Disallowing profiling would render these systems ineffective, essentially removing a tool from the toolbox of schools that are aiming to support their most at-risk students.

Students, Especially LGBTQ+, Disapprove of Increased Parental Access To Online Activity

Many recent state and federal online child safety laws propose varying levels of parental access to children’s online activities, assuming that more parental control will keep kids safer. However, our research indicates that although parents are already implementing measures to supervise what their children do online and would like additional controls, students do not share this perspective. This discomfort is even more pronounced among LGBTQ+ students, who are more likely to experience abuse, neglect, and homelessness if their parents are unsupportive.

Approximately half of students overall report that they would be comfortable with their parents being able to see a report of all of their online activity at school – similar to what their school’s student activity monitoring system captures. This drops to just 35 percent for LGBTQ+ students, compared to 55 percent among their non-LGBTQ+ peers. 

Students express even less support for their parents being able to see a report of their online activity wherever they are – only 42 percent of students said they would be comfortable with this. Again, LGBTQ+ students report being less comfortable than their non-LGBTQ+ peers with their parents having this ability (24 percent vs. 49 percent who would be comfortable). In line with these views, 67 percent of students said they would be likely to turn off their parents’ ability to see their online activity if they could, and LGBTQ+ students would be even more likely at 74 percent.

As previously stated, parents play an active role in supervising their children’s online activity, but they agree that older students deserve more privacy and less oversight than younger children. Just over 90 percent of parents agree that it is important for them, as a parent, to see everything their child is looking at and doing online in 3rd through 8th grade, but that share drops to 83 percent for students in 9th through 12th grade.

Given these findings, it is imperative to think about whether state and federal online child safety laws would actually keep students “safe.” The majority of students express not feeling comfortable with increased parental access to their online activity and data, and this sentiment is even more pronounced among LGBTQ+ students. This raises questions about whether parental access would cause a chilling effect and hamper kids’ freedom of speech and expression.

Drafting Legislation that Minimizes Unintended Consequences to the Education Sector

Policymakers should think carefully about whether and how educational institutions are implicated by the privacy and safety bills they draft. If policymakers do not intend to include the education sector, they can take a number of different approaches. 

  • Exempt organizations by class or statutory framework: This approach would entail exempting organizations by class, such as schools and vendors providing services to them (which would then be governed by existing legal frameworks like FERPA and IDEA, as described above). Legislators would have to create a robust definition of schools and vendors to avoid some of the unintended consequences detailed previously. 
  • Exempt by activity: Another approach that could be used to exempt the education sector would be to exempt data by purpose or activity. This would mean exempting data that is acquired and used for a legitimate educational purpose from provisions such as the right to delete (this language might mirror the “school official exception” language in FERPA that allows schools to outsource certain functions to vendors when there is a “legitimate educational interest in the education records”). This approach could allow for schools and their vendors to engage in activities like profiling if they have a legitimate educational reason to do so. 
  • Exempt by existing legal framework: Another approach to exempting schools is to exempt any data already covered by FERPA, as in the North Carolina Consumer Privacy Act. This approach has the advantage of covering both schools themselves and any vendors when they are handling FERPA-protected data. However, as noted previously, most private schools do not receive federal funding and are therefore not governed by FERPA. In this case, private schools and their vendors would not be exempted, and legislators would have to address them specifically, likely through a direct definitional carve-out, as there is not a legislative framework similar to FERPA that addresses private school data.

Conclusion

Regardless of how legislators and policymakers choose to approach and account for schools, it is critical to the functioning of our education system that they do so. Student data can be a great tool for improving education delivery and supporting students, but also contains highly sensitive personal information about young people that is worthy of well-designed protections. Policymakers need to ensure that schools can leverage that data effectively even as they take strides to provide much needed protections to consumers and their data.

[ PDF Version ]

CDT Joins EPIC in Opposing New York Senate’s Budget Bill
https://cdt.org/insights/cdt-joins-epic-in-opposing-new-york-senates-budget-bill/
Fri, 29 Mar 2024 16:56:35 +0000

The Center for Democracy & Technology (CDT) and EPIC sent a letter opposing the New York Privacy Act’s inclusion in the New York Senate’s budget bill because the Act does not go far enough in protecting New Yorkers’ privacy. We hope the Senate removes it from the budget bill and continues improving it.

Read the full letter.

CDT Files Comments with FTC in Response to COPPA Updates
https://cdt.org/insights/cdt-files-comments-with-ftc-in-response-to-coppa-updates/
Wed, 13 Mar 2024 20:28:52 +0000

On March 11, CDT filed comments with the Federal Trade Commission in response to its proposed updates to the Children’s Online Privacy Protection Act (COPPA) Rule. In the comments, we make several arguments:

  1. The COPPA Rule should include strong data minimization requirements;
  2. The FTC should strengthen and adopt its proposals around direct notice, verifiable parental consent mechanisms, and retention and deletion of children’s data;
  3. The definitions of biometric data and inferred data should be clarified, as should requirements around content personalization;
  4. The “child directed” determination should include, as proposed, a totality of the circumstances analysis, and the FTC should not require “constructive knowledge”; and
  5. The proposed educational exception to parental consent should be adopted with several important changes.

See the full comments here.

CDT Comments in response to FTC’s Proposed Consent Order with X-Mode Social, Inc., and Outlogic, LLC
https://cdt.org/insights/cdt-commets-in-response-to-ftc-proposed-consent-order/
Tue, 20 Feb 2024 22:34:00 +0000

On February 20, 2024, CDT filed comments with the Federal Trade Commission drawing attention to two issues in its proposed consent order with X-Mode. 

First, the exception to the ban on the use and sale of Sensitive Location Data that permits converting such data into non-location data is harmful. This language would allow X-Mode to continue collecting, using, and selling data about people’s general visits to Sensitive Locations without specific GPS coordinates or other location data attached (e.g., the fact that they visited Planned Parenthood). That exception should be removed.

Second, the exception to the definition of Location Data that would allow X-Mode to continue to collect Sensitive Location Data if that data is collected abroad and used for a security, or a national security, purpose is problematic and unclear. The harm and deceptive behavior from X-Mode do not depend on where the data is collected. The exception will also be difficult to comply with and enforce. It should similarly be removed, or at least clarified and narrowed.

Read the full comments here.

States are Letting Us Down on Privacy
https://cdt.org/insights/states-are-letting-us-down-on-privacy/
Sun, 28 Jan 2024 05:00:00 +0000

On This International Privacy Day, It’s Time to Acknowledge That States Are Enacting Industry-Friendly Privacy Legislation

States have been on a tear passing privacy legislation over the past several years, motivated at least in part by the lack of federal privacy protections. At least 13 states now have laws in place that provide some protections for online data. That’s good — states should be active in protecting privacy. States are the laboratories of democracy, and when we can, we should let them experiment with different policy approaches. States are adopting some beneficial provisions in their privacy bills, like data broker registries and user rights to access, correct, and delete data.

Generally, however, each state is simply adopting a minor variation of prior state laws, and those state laws so far lack one of the most meaningful privacy protections: data minimization. Data minimization ensures companies collect only data that is necessary to provide the product or service an individual requested. The strictest definition would disallow collecting data beyond the absolutely critical. More appropriate definitions allow data collection for other specified, allowable purposes, including to authenticate users, to protect against spam, to perform system maintenance, or to comply with other legal obligations.

Data minimization requirements place the privacy-protecting burden primarily on the companies that collect and exploit data, rather than on the already overburdened consumer. U.S. privacy law has developed largely through the Federal Trade Commission’s authority to prevent “deceptive” practices, which has mostly produced protections against misleading people. For years, however, most people have agreed that notice-and-consent has failed, in large part because we know that people do not read or understand laborious, labyrinthine privacy policies.

Narrowing the categories of data that companies can collect is important because a variety of privacy harms flow simply from companies collecting and hoarding massive amounts of data: becoming a larger target for hackers and unauthorized access; breaches that result in further downstream harms like identity theft; and unknown or secretive subsequent uses of data, such as selling it to third parties that compile detailed individual profiles and use that data (particularly sensitive data) for targeted advertising, among a variety of other harms.

Reducing the data collected also protects against another significant harm: law enforcement access to data. Any data a company can access, law enforcement can also access. The Supreme Court’s decision in Dobbs v. Jackson Women’s Health Organization raised the salience of this concern, as people realized that any data that could be used to identify whether a person sought or received an abortion (location data and communications data, among many others) could be accessed by law enforcement.

States have been letting us down on data minimization. The concept has been co-opted at the state level to mean little more than that companies cannot collect data for purposes they have not disclosed to the consumer. For instance, Section 6(a)(1)-(2) of Connecticut’s privacy law states that a company shall “limit the collection of personal data to what is adequate, relevant[,] and reasonably necessary in relation to the purposes for which such data is processed, as disclosed to the consumer,” and “not process personal data for purposes that are neither reasonably necessary to, or compatible with, the disclosed purposes for which such personal data is processed, as disclosed to the consumer, unless the [company] obtains the consumer’s consent.” Connecticut’s minimization requirement is not effective because it allows companies to continue collecting data for essentially any purpose stated in a privacy policy — which is already the law under the FTC Act’s deception standard and, as already mentioned, most consumers do not read privacy policies. Virginia (Section 59.1-578(A)(1)) and Texas (Section 541.101(a)(1), (b)(1)) privacy laws use nearly identical language, as does New Jersey’s law (Section 9(a)(1)-(2)), which just passed earlier this month. While some states require opt-in consent for processing sensitive data, those provisions are also insufficient because states often define sensitive data very narrowly (see Virginia’s definition, which is limited to children’s data, demographic data, location, and biometrics).

The California Consumer Privacy Act has seemingly similar language in Section 1798.100(c), requiring that a company’s collection be “reasonably necessary and proportionate to achieve the purposes for which the personal information was collected or processed, or for another disclosed purpose that is compatible with the context in which the personal information was collected, and not further processed in a manner that is incompatible with those purposes.” However, the California Privacy Protection Agency subsequently adopted rules stating (Section 7002(d)(1)) that companies should seek to collect the minimum information necessary to achieve the purpose identified. For instance, to mail a product and send an email confirmation, the only information needed is a physical address and an email address. Companies must also take into account (Section 7002(d)(2)) the potential negative impacts of collecting data, including that precise geolocation data may reveal sensitive information, such as visits to sensitive locations like health care providers. Colorado’s privacy regulations, in rule 6.07(A), include a similar requirement that companies “determine the minimum [p]ersonal [d]ata that is necessary, adequate, and relevant for the express purpose.”

Despite California’s more detailed rules, most states have enacted language similar to the Connecticut law, which ultimately has little impact on company data practices—it is merely a continuation of the failed notice-and-consent regime. The language not only allows, but bakes into state law and policy, the privacy status quo that so many people disfavor. It also places essentially no burden on companies to curtail their data practices. In most of these states with “comprehensive privacy” laws, if a company wants to build profiles of all its customers, or sell all the data it collects to third parties to increase its revenues, or hoover up every data point it can to train its internal artificial intelligence systems, the only thing stopping it is stating that purpose in a privacy policy.

People should not be satisfied with these laws. At the federal level, significant resources and discussion have gone into finding a reasonable approach to data minimization with the American Data Privacy and Protection Act (ADPPA). The bipartisan legislation included a strong minimization requirement, which required companies to collect sensitive data (broadly defined, unlike in most state laws, to include health, communications, contacts, financial information, biometric data, and other types of data) only to the extent it was strictly necessary to provide the product or service, or was strictly necessary for one of several other specified allowable purposes. This requirement would have placed significant privacy obligations on companies themselves, forcing them to justify their data collection rather than continuing to place the burden on the shoulders of individuals.

Some states have or have had pending legislation that would provide similarly strong minimization requirements. Massachusetts is considering its own state-level ADPPA called the Massachusetts Data Privacy Protection Act, which CDT supports. Maine legislators have proposed a similar bill. 

Luckily, existing state laws can be fixed. Laws can be amended and updated to reflect current practices and technologies. States that have already adopted privacy legislation should update those laws to provide stronger privacy protections by changing their minimization language to more closely reflect ADPPA, or the Massachusetts and Maine bills. States that have not yet passed legislation should look toward ensuring that any future potential privacy law includes similar language. If they don’t, states will only continue letting us down on privacy.

CDT Files Amicus Brief in Stark v. Patreon Supporting the Free Speech and Privacy Protections Provided by the Video Privacy Protection Act
https://cdt.org/insights/cdt-files-amicus-brief-in-stark-v-patreon-supporting-the-free-speech-and-privacy-protections-provided-by-the-video-privacy-protection-act/
Thu, 21 Dec 2023 22:39:53 +0000

On December 20, CDT filed an amicus brief along with the Electronic Frontier Foundation (EFF) in Stark v. Patreon in the Northern District of California, arguing that the court should reject a challenge to the facial constitutionality of the Video Privacy Protection Act (VPPA).

As the brief explains, the VPPA furthers the government’s substantial interests in protecting people’s freedom to access and receive information, and in protecting people’s right to privacy in viewing that information. As a result, the vast majority of the applications of the VPPA are constitutional, and the court should not allow a facial challenge to the law.

As-applied challenges can address the very limited circumstances where the law would impact First Amendment-protected speech outside the commercial context.

Read the full brief.
