Juries Find Meta and Google Liable in Separate Cases Over Social Media Addiction and Child Safety

Juries in the United States have delivered two significant verdicts against major technology companies Meta (owner of Instagram, Facebook, and WhatsApp) and Google (owner of YouTube). In Los Angeles, a jury found Meta and Google negligent in the design of their social media platforms, contributing to a young woman's addiction and mental health challenges. Separately, a New Mexico jury found Meta liable for knowingly harming children's mental health and concealing information related to child sexual exploitation on its platforms. Both companies have expressed disagreement with the verdicts and intend to appeal. These outcomes represent some of the first instances where juries have assigned liability to tech companies for harms associated with platform design and operation, potentially influencing thousands of similar lawsuits nationwide.

Los Angeles Verdict: Social Media Addiction

In a Los Angeles civil trial, a jury determined that Meta and Google were negligent in the design and operation of their social media platforms. The plaintiff, identified as Kaley GM (KGM), a 20-year-old woman, alleged that her extensive use of Instagram and YouTube from an early age contributed to her social media addiction and exacerbated her mental health issues, including depression, anxiety, body dysmorphia, and suicidal thoughts. KGM reported that she began using YouTube at age six and Instagram at age nine.

Jury Findings and Damages

After over 40 hours of deliberation across nine days, the jury reached its conclusions.

The jury concluded that Meta and Google knew, or should have known, their services posed a danger to minors and failed to provide adequate warnings. They found the companies' negligence was a "substantial factor" in causing harm to KGM.

The jury awarded KGM $US3 million in compensatory damages. An additional $US3 million in punitive damages was recommended, with the jury concluding the companies acted with malice, oppression, or fraud. The judge retains final authority on the awarded damages. Meta was assigned 70% of the liability and Google (for YouTube) 30%, an allocation reflected in the recommended punitive damages: $US2.1 million from Meta and $US900,000 from YouTube.

Plaintiff's Arguments

KGM's legal team contended that specific design features were intentionally engineered to foster addiction in young users and maximize engagement and advertising revenue. These features included infinite feeds that encourage doomscrolling, auto-play functions, persistent notifications, targeted recommendation algorithms, and "toxic popularity meters."

Lawyers cited internal company documents, including "Project Myst," which reportedly found children experiencing "adverse effects" were most prone to Instagram addiction, and communications allegedly comparing platform effects to pushing drugs and gambling.

Company Defenses

Meta argued that KGM's mental health challenges were complex and either predated or were unrelated to her social media use, pointing to her home life and noting that her therapists had not identified social media as the sole cause. Meta CEO Mark Zuckerberg testified that user safety is a company priority and that he does not aim to maximize user time.

YouTube's defense emphasized its classification as a video streaming platform rather than a social media site, comparing it to television. The company also presented records that it claimed showed KGM's YouTube use averaged just over a minute per day as she grew older, disputing her account of how much time she spent on the platform. Both companies highlighted existing safety features and parental controls.

Defendants also invoked Section 230 of the 1996 Communications Decency Act, which generally protects internet companies from liability for user-generated content. However, the judge instructed jurors to distinguish between content and the design features delivering it.

Prior Settlements and Appeals

Before the trial commenced, Snap and TikTok had reached confidential settlements with KGM. Meta and Google have both announced their intention to appeal the Los Angeles verdict.

New Mexico Verdict: Child Safety and Mental Health Harms

In New Mexico, a jury found Meta liable in a case alleging the company failed to protect children from sexual predators and misled users about the safety of its platforms.

Jury Findings and Damages

Following a nearly seven-week trial, the jury delivered its findings.

The jury concluded that Meta knowingly harmed children's mental health and concealed information regarding child sexual exploitation on its platforms.

The jury determined that Meta violated New Mexico's Unfair Practices Act by prioritizing profits over user safety, making false or misleading statements, and engaging in "unconscionable" trade practices that exploited the vulnerabilities of children. The verdict identified thousands of violations, leading to an order for Meta to pay $US375 million in civil penalties. This amount was less than one-fifth of what prosecutors had sought. Meta's stock reportedly rose 5% in early after-hours trading following the verdict.

State's Allegations and Evidence

New Mexico Attorney General Raúl Torrez filed the lawsuit in 2023, alleging that Facebook and Instagram created a "breeding ground" for child predators. The case included evidence from an undercover state investigation where agents created social media accounts posing as children, which reportedly received sexually explicit content and solicitations from adults.

Prosecutors argued that Meta publicly claimed its platforms were safe while internal documents indicated awareness of issues with sexual exploitation and mental health harms, and a failure to implement basic safety measures like effective age verification. Testimony included former Meta engineering director Arturo Bejar, who raised concerns after his daughter received solicitations, and former Meta Vice President of Partnerships Brian Boland, who stated safety was not a priority for Meta's CEO and COO when he left in 2020. Internal communications reportedly discussed how end-to-end encryption could impact the ability to report child sexual abuse material.

Meta's Defenses and Appeals

Meta stated its disagreement with the verdict and announced plans to appeal the decision. The company maintained that it invests heavily in safety, with 40,000 personnel dedicated to platform safety, and works to identify and remove harmful content and actors.

Meta argued that its platforms are protected from liability under Section 230 and First Amendment free-speech protections, contending that allegations of harm are inseparable from user-generated content. Meta executives have acknowledged "problematic use" on their platforms but dispute the concept of social media addiction.

Future Proceedings in New Mexico

The New Mexico verdict does not immediately compel changes to Meta's platforms. A second phase of the trial is scheduled for May, where a judge, without a jury, will determine if Meta's platforms constitute a public nuisance. The state is expected to request court orders for Meta to fund public programs addressing the identified harms and mandate platform changes, such as improved age verification, removal of predators, and protection for minors from encrypted communications.

Broader Legal and Regulatory Implications

These verdicts are seen as potential "bellwether" cases, expected to guide the resolution of thousands of similar lawsuits pending against social media companies across the U.S. These include cases from individuals, school districts, and attorneys general from more than 40 states, all asserting that social media platforms contribute to a youth mental health crisis through addictive design features and inadequate safety measures.

Legal Strategies and Section 230

The legal strategies in both cases focused on deliberate design choices and product liability, enabling them to navigate Section 230 of the Communications Decency Act, which typically shields internet companies from liability for content posted by users. This approach marks new legal territory.

Calls for Regulation

Experts and advocates suggest that while jury trials hold companies accountable, significant behavioral change within the industry will likely require new regulations. Calls for a "digital duty of care" that mandates platforms take reasonable steps to prevent harm are gaining momentum.

International Outlook

The U.S. verdicts do not immediately compel changes to platforms in other countries, but they support the position of governments, such as Australia's, that social media can be addictive and detrimental to young individuals. Australian law firms are evaluating the potential for similar legal cases, and the Australian government is advancing regulatory efforts, including expanding the definition of age-restricted social media platforms to encompass services utilizing addictive algorithms and features.

The final resolution of these cases, including appeals and potential settlements, is expected to take several years. The outcomes are anticipated to influence the ongoing debate about social media's impact on children's mental health and the future of tech regulation.