The Race to Regulate the Internet: Should States or the Federal Government Set the Rules for Website Content, Child Protection and Personal Data Control?
August 2024
We are writing to provide an update on some significant developments since our Race to Regulate the Internet program held on May 8 – video available here: https://youtu.be/5YBTmqFN7To
Headlines
- Supreme Court returns challenge to State content moderation laws to lower courts for further development
- Supreme Court dismisses injunction against certain federal government contacts with social media firms based on lack of standing by plaintiffs
- Department of Justice Office of Inspector General reviews FBI contacts with social media firms regarding foreign threats to U.S. elections in relation to First Amendment considerations
- Congressman calls for investigation of alleged censorship by X of Vice President Harris’ campaign
- New York enacts legislation to protect children online
- White House Task Force on Kids Online Safety makes recommendations for parents and social media companies
- Department of Justice lawsuit alleges TikTok collected data on children under 13 in violation of the Children’s Online Privacy Protection Act
- Meta to pay $1.4 billion to settle claims regarding unauthorized capture of personal biometric data
State or Federal Government Regulation of, or Influence on, Website Content
For most of the life of the internet, States, for a variety of reasons including early unfavorable court decisions, have generally not been inclined to attempt to impose requirements or restrictions on private party website content. In recent years, two large States, Florida and Texas, motivated by a perception that some large websites were discriminating against the presentation of conservative viewpoints, enacted laws designed to place requirements or restrictions on website content moderation intended to balance the range of views presented.
The Eleventh Circuit Court of Appeals affirmed a district court injunction against the Florida law, finding that the law was not likely to survive a review under the First Amendment. In contrast, the Fifth Circuit Court of Appeals reversed a district court injunction against the Texas law, based upon its determination that the law did not regulate any speech and thus did not implicate the First Amendment.
The Supreme Court’s review of these two laws on appeal was much anticipated. In a July 1 ruling, Justice Kagan, joined by Chief Justice Roberts and Justices Sotomayor, Kavanaugh, Barrett and Jackson, ruled that the trade association plaintiffs (NetChoice and the Computer & Communications Industry Association (CCIA)) had challenged the laws on their face (as opposed to as applied) and thus had the very high burden of showing that a substantial number of a law’s applications are unconstitutional, judged in relation to the statute’s plainly legitimate sweep. The Court stated that the parties and the courts below had limited their analysis to a relatively narrow scope of website activities rather than address the full range of activities the laws covered and measure the constitutional versus unconstitutional applications of the laws. Thus, the Court held that the parties had not briefed the critical issues, and the record was underdeveloped. As a result, the Court vacated the lower court decisions and remanded the cases for further consideration.
The Court went on to provide guidance for this further consideration. It indicated that it wanted to avoid having the Fifth Circuit conduct further proceedings on the assumption that its First Amendment analysis was correct, because the Court found that the Fifth Circuit’s conclusions involved a serious misunderstanding of the Court’s First Amendment precedent. The Court stated that the Fifth Circuit was wrong in concluding that the restrictions placed on a website’s selection and presentation of content do not interfere with expression. Furthermore, it observed that the Fifth Circuit was wrong to treat as valid Texas’ interest in changing the content on websites’ feeds in order to better balance the marketplace of ideas. The Court further observed that the Eleventh Circuit saw the First Amendment issues in the case much as the Court does.
Justice Alito, joined by Justices Thomas and Gorsuch, wrote that the only binding holding in the decisions was that the plaintiffs have yet to prove that the laws they challenged are facially unconstitutional, with which they agreed. He went on to take issue with much of the majority’s analysis.
We were fortunate to have representatives of CCIA (Stephanie Joyce) and NetChoice (Carl Szabo) join us as panelists at our May 8 program.
Moody v. NetChoice, 22-277_d18f.pdf (supremecourt.gov)
On June 26, the Supreme Court reversed a Fifth Circuit ruling granting a preliminary injunction to two States (Missouri and Louisiana) and five individuals based on allegations that Executive Branch agencies and officials had pressured certain websites to suppress protected speech in regard to COVID-19 (White House, CDC, Surgeon General) and the 2020 Election (FBI and CISA) in violation of the First Amendment. The injunction provided that the defendants and their employees shall not coerce or significantly encourage social media companies to remove, suppress, or reduce posted content containing protected free speech.
Justice Barrett wrote the Court’s opinion. She reviewed the allegations of the individual plaintiffs that the actions of the government had caused platforms to censor their content at the behest of the government and that this would continue to occur. The individual plaintiffs also argued that they had suffered injury to their right to listen to others. The States asserted a sovereign interest in hearing from their citizens on social media. In all instances the Court found that the plaintiffs had failed to demonstrate that they had standing to pursue their claims and reversed the ruling of the Fifth Circuit.
Justice Alito, joined by Justices Thomas and Gorsuch, dissented. He wrote that one of the individual plaintiffs had shown that Facebook’s censorship of her content resulted at least in part from the White House’s prompting of Facebook to amend its censorship policies, and therefore had standing. Justice Alito asserted that the government officials with potent authority had communications with Facebook that were virtual demands and that Facebook’s response showed that it felt a strong need to yield. As a result he concluded that the individual plaintiff was likely to prevail on her claim that the White House coerced Facebook into censoring her speech.
Murthy v. Missouri, 23-411_3dq3.pdf (supremecourt.gov)
As part of a report, issued on July 23, on the DOJ’s efforts to address foreign attempts to interfere with U.S. elections through information sharing inside and outside of the government, the OIG touched on some of the issues presented in the Murthy case.
The OIG found that the DOJ did not have a comprehensive strategy guiding its approach to engagement with social media companies in regard to foreign malign influence threats (FMITs) to U.S. elections, which the OIG believes creates a risk to the DOJ. The OIG observed that the FBI must be mindful that its interactions with social media companies could be perceived as coercion or significant encouragement aimed at convincing social media companies to limit or exclude speech posted by their users, which may implicate First Amendment protections, noting the Fifth Circuit’s opinion in Murthy.
During the course of its review the OIG recommended that the DOJ develop an approach to inform the public of its procedures for transmitting notices of FMITs to social media companies that is protective of First Amendment rights. The DOJ indicated that during the course of the lower court proceedings in Murthy in October 2023 it began developing a standardized approach for sharing FMITs with social media companies that appropriately accounts for First Amendment concerns.
This process led to the issuance of a standard operating procedure (SOP) that went into effect in February 2024. The DOJ stated that the SOP reflects the principle that it is permissible to share FMITs with social media companies, as long as it is clear that ultimately it is up to the company whether to take any action, including removing content or barring users based on such information.
OIG Press release DOJ OIG Releases Report of Investigation Regarding Alleged Unauthorized Contacts by FBI Employees with the Media and Other Persons in Advance of the 2016 Election (justice.gov), OIG Report 24-080.pdf (justice.gov)
To date, complaints about website censorship have come largely from conservatives, inspiring, among other things, the Texas and Florida laws at issue in Moody v. NetChoice. In a change of pace, on July 23, Representative Jerrold Nadler, a Democrat from New York who is the Ranking Member on the House Judiciary Committee, wrote to Committee Chairman Jim Jordan regarding alleged censorship of Vice President Kamala Harris’ campaign handle on X.
Representative Nadler stated that numerous users reported that over the past two days when they tried to follow @KamalaHQ they received a “Limit Reached” message stating that the user is “unable to follow more people at this time.” He said that the messages do not make any sense as these users are otherwise free to follow other accounts. He stated that “[t]his suggests that X may be intentionally throttling or blocking Vice President Harris’ ability to communicate with potential voters. If true, such action would amount to egregious censorship based on political and viewpoint discrimination – issues this Committee clearly has taken very seriously.” Representative Nadler requested that the Committee immediately launch an investigation and request specified information from X.
Representative Nadler’s Letter 2024-07-23_jn_to_jdj.pdf (house.gov)
Protection of Children on the Internet
Recently, many states have enacted laws requiring either age verification for access to adult content websites or parental consent for children’s access to social media sites. New York now joins California in taking a different approach to protecting children online.
In 2022 California enacted the California Age-Appropriate Design Code Act, which requires businesses whose online products or services are likely to be accessed by children to include or exclude certain design features in order to protect those children. On September 18, 2023, a Federal District Court in California issued a preliminary injunction against the law, finding that it was likely that it violated the First Amendment. The Ninth Circuit heard oral argument on California’s appeal on July 17.
California Age-Appropriate Design Code Act Bill Text – AB-2273 The California Age-Appropriate Design Code Act., Preliminary injunction against The California Age-Appropriate Design Code Act NETCHOICE-v-BONTA-PRELIMINARY-INJUNCTION-GRANTED.pdf
On June 20, 2024, New York Governor Kathy Hochul signed two pieces of legislation intended to protect children online. She said “[y]oung people across the nation are facing a mental health crisis fueled by addictive social media feeds – and New York is leading the way with a new model for addressing the crisis and protecting our kids.” She further observed that “[b]y reining in addictive feeds and shielding kids’ personal data, we’ll provide a safer digital environment, give parents more peace of mind, and create a brighter future for young people across New York.”
The Stop Addictive Feeds Exploitation (SAFE) for Kids Act (SAFE Act) targets “addictive feeds.” It prohibits users under 18 from viewing addictive feeds on social media platforms without parental consent. It also prohibits platforms from sending notifications to minors from 12:00 a.m. to 6:00 a.m.
The New York Child Data Protection Act prohibits online sites from collecting, using, sharing or selling personal data of anyone under the age of 18, unless the site receives informed consent or unless doing so is strictly necessary for the purpose of the website. For users under 13, informed consent must be provided by a parent.
SAFE Act S7694A (nysenate.gov), Child Data Protection Act s7695b (nysenate.gov)
We were fortunate to have Utah State Senator Michael McKell, the sponsor of Utah’s social media parental consent law, join us as a panelist at our May 8 program.
On July 22, the Kids Online Health and Safety Task Force, composed of representatives of the White House, the Department of Health and Human Services, the Department of Commerce, the Department of Education, the Federal Trade Commission, the Department of Homeland Security and the DOJ, issued a document titled Best Practices for Families and Guidance for Industry.
The report identifies key risks and benefits of online platforms and digital technologies to young people’s health, safety and privacy. It provides best practices for parents and recommended practices for companies.
The report describes a series of initiatives that various Federal agencies are undertaking in support of kids online safety. The report also calls on Congress to enact legislation to protect youth online. It states that such legislation should include prohibiting platforms from collecting personal data from youth, banning targeted advertising, and implementing measures to keep children safe from those who would use online platforms to harm, harass, and exploit them.
White House Kids Safety Task Force press release Kids Online Health and Safety Task Force Announces Recommendations and Best Practices for Safe Internet Use | HHS.gov
On August 2, the DOJ filed suit against TikTok and affiliated entities alleging that over the past five years TikTok knowingly permitted children under 13 to create regular TikTok accounts while collecting and retaining personal information from the children without notifying or obtaining consent from their parents. The DOJ further alleged that even as to accounts intended for children under 13, TikTok unlawfully collected and retained certain personal information. Moreover, the DOJ alleged that when parents learned of their children’s accounts and requested that TikTok delete the accounts and related information, TikTok frequently failed to do so. The DOJ’s suit notes that the alleged actions occurred despite the companies being subject to a court order prohibiting them from violating COPPA and imposing measures to ensure compliance with the law. The suit seeks civil penalties and injunctive relief.
The DOJ’s press release asserted that TikTok’s COPPA violations have resulted in millions of children under 13 using the regular TikTok app, thereby subjecting them to extensive data collection and allowing them to interact with adult users and to access adult content. It stated that the Department is deeply concerned that TikTok has continued to collect and retain children’s personal information despite a court order barring such conduct.
DOJ press release Office of Public Affairs | Justice Department Sues TikTok and Parent Company ByteDance for Widespread Violations of Children’s Privacy Laws | United States Department of Justice, DOJ complaint dl (justice.gov)
Personal Data Control
On July 31, the Texas Attorney General’s Office announced that it had entered into a settlement agreement with Meta to stop the company’s capture and use of the personal biometric data of millions of Texans without the authorization required by Texas law. The Texas AG stated that this was the first lawsuit brought, and first settlement obtained, under the Texas Capture or Use of Biometric Identifier Act (CUBI).
According to the AG, the suit involved a feature introduced in 2011 that made it easier for users to tag photos with the names of people in the photos. The AG said that Meta automatically turned this feature on without explaining how it worked, and that for more than a decade it ran facial recognition software on faces uploaded to Facebook, capturing the facial geometry of those faces. The AG alleged that Meta did this despite knowing that CUBI forbids companies from capturing such biometric identifiers of Texans unless the company first informs the person and receives their consent to do so.
Under the settlement, Meta will pay Texas $1.4 billion over five years, which the AG described as the largest privacy settlement an Attorney General has ever obtained.
Texas AG press release Attorney General Ken Paxton Secures $1.4 Billion Settlement with Meta Over Its Unauthorized Capture of Personal Biometric Data In Largest Settlement Ever Obtained From An Action Brought By A Single State | Office of the Attorney General (texasattorneygeneral.gov), Agreed Final Judgment Final State of Texas v Meta Order 2024.pdf (texasattorneygeneral.gov), Texas AG’s February 2022 suit against Meta State of Texas v. Meta Platforms Inc..pdf (texasattorneygeneral.gov)