Social Media Companies Face Concurrent Trials Over Alleged Platform Addiction and Child Exploitation

Hundreds of families, school districts, and state attorneys general have initiated legal proceedings against major social media companies, including Meta, TikTok, YouTube, and Snap. Two distinct but concurrent high-profile jury trials have recently commenced, testing existing legal precedents, particularly the application of Section 230 of the Communications Decency Act. This act typically shields online platforms from liability for third-party content.

These trials represent a significant moment for the US legal system's approach to technology companies, potentially leading to fundamental changes in platform design and industry standards.

One trial, in Los Angeles, California, addresses allegations of platform addiction and its impact on youth mental health. The other, in Santa Fe, New Mexico, focuses on claims that Meta platforms facilitated child sexual exploitation.

California Trial: Allegations of Addictive Design and Mental Health Harm

The trial in Los Angeles Superior Court is a "bellwether" case involving a 19-year-old plaintiff identified as KGM and her mother, Karen Glenn. They allege that social media companies intentionally designed their platforms with addictive features, contributing to KGM's mental health issues, including depression, self-harm, and suicidal ideation, which began at age 10. It is the first time social media platforms have had to defend against mental health harm claims before a jury.

Key Allegations from Plaintiffs:
  • Platforms were intentionally designed to be addictive, leading to compulsive use and mental health decline.
  • Features cited include infinite scroll, video autoplay, recommendation algorithms, and frequent notifications.
  • KGM alleges that Instagram and TikTok presented her with "depressive" and "harmful social comparison and body image" content.
  • The lawsuit also claims that features recommending connections facilitated interactions between KGM and strangers, including predatory adults. KGM experienced bullying and sextortion on Instagram, and Meta's response was allegedly slow.
  • Plaintiffs seek financial damages and injunctive relief to mandate changes in platform design and establish industry-wide safety standards.

Company Responses and Settlements:

Shortly before or as the trial began, Snap Inc. and TikTok reached undisclosed settlement agreements with KGM's lawyers. Both companies denied wrongdoing: Snap did so in a public statement, while TikTok confirmed an "in principle" settlement. Meta and YouTube remain defendants in this specific trial.

YouTube spokesperson José Castañeda disputed the allegations, asserting the company's commitment to providing a safer experience for young users and citing parental controls, including a new option to limit short-form video scrolling. Meta has pointed to a website stating the lawsuits "misportray our company" and highlighting its efforts to protect teens, such as default privacy protections, content limits, parental supervision tools, and AI for identifying minor users.

Companies have historically disputed claims of negative effects on youth mental health, citing a lack of conclusive research, emphasizing benefits like entertainment and social connection, and invoking Section 230.

Legal Context:

Los Angeles Superior Court Judge Carolyn Kuhl ruled that jurors must consider the companies' design choices, not just user-generated content, potentially bypassing Section 230 immunity. The legal strategy draws parallels to 1990s lawsuits against tobacco companies, focusing on product addictiveness. Legal teams anticipate that internal company documents and employee statements acknowledging platform addictiveness will be unsealed.

This trial is considered the first of many similar personal injury lawsuits, with over 1,000 cases pending. A separate federal multi-district litigation (MDL) involving over 235 plaintiffs, including attorneys general from nearly three dozen states, is scheduled for June in San Francisco. Executives from Meta, TikTok, and YouTube are expected to testify, including Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri.

New Mexico Trial: Allegations of Facilitating Child Exploitation

A separate jury trial commenced in Santa Fe, New Mexico, involving the state of New Mexico against Meta, the parent company of Facebook, Instagram, and WhatsApp. The lawsuit, initiated by Attorney General Raúl Torrez in 2023, alleges that Meta fostered an environment conducive to predators targeting children for sexual exploitation and withheld information regarding these detrimental effects. Jury selection concluded, and opening statements were scheduled to begin on February 9. The trial is expected to last approximately seven weeks.

Core Allegations and Evidence:

The lawsuit originated from a state-conducted undercover investigation that documented sexual solicitations through proxy social media accounts designed to appear as minors. The state alleges Meta's design choices and profit motives prioritized engagement over child safety, leading to hazardous environments and exposure to sexual exploitation, solicitation, sextortion, human trafficking, and child sexual abuse material (CSAM).

Citing internal documents, the state also alleges that Meta allowed unmoderated groups related to commercial sex and may have benefited financially from advertisements placed alongside content that sexualized children. The same documents reportedly contain Meta's estimate that approximately 100,000 children on Facebook and Instagram experience online sexual harassment daily.

Evidence is expected to include details from "Operation MetaPhile," an investigation that led to arrests in 2024 of individuals accused of preying on children via Meta platforms.

Undercover agents posing as children were allegedly solicited for sex, with platform design features facilitating the contact; the state attorney general asserts the agents did not initiate conversations about sexual activity. A recent filing alleges that CEO Mark Zuckerberg approved minors' access to AI chatbot companions despite internal safety warnings that the bots could engage in sexual interactions. In a March 2024 chat discussing parental controls, internal communications reportedly described a "Mark-level decision that parents cannot turn it off."

Legal Strategy and Meta's Response:

New Mexico's case relies on consumer protection and public nuisance laws, potentially establishing a new legal avenue for states to hold social media companies accountable and to circumvent defenses based on the First Amendment and Section 230. Meta's attempts to dismiss the case on those grounds were denied in June 2024, with the court ruling that the lawsuit centers on platform design and non-speech issues.

Meta disputes the civil accusations, describing the prosecutors' methodology as "sensationalist." The company contends that the ongoing lawsuits oversimplify the factors contributing to teen mental health challenges. A Meta spokesperson stated the company is committed to supporting young people, citing over a decade of efforts including collaboration with experts and law enforcement, in-depth research, the introduction of Teen Accounts with built-in protections, and tools for parental management. Meta expressed satisfaction with its progress and an ongoing commitment to improvement.

While CEO Mark Zuckerberg was removed as a defendant in this particular case, he has provided a deposition, and portions may be presented in court. Key witnesses for the plaintiffs are anticipated to include educators, law enforcement officials, and whistleblowers. A separate lawsuit by New Mexico prosecutors against Snap Inc. alleging similar child sexual exploitation is also pending.

Broader Legal Implications

These concurrent trials represent a shift in the US legal system's approach to technology companies. Legal experts suggest that unfavorable outcomes for the platforms could have significant financial implications and lead to fundamental changes in platform design and industry standards. The outcomes are being closely watched for their potential impact on future litigation and regulatory actions against social media companies.