Australia Implements Social Media Ban for Under-16s; Companies Report Account Deactivations

Australia's Under-16 Social Media Ban: Implementation, Challenges, and Global Reactions

Australia has implemented a new law restricting individuals under 16 from holding social media accounts, effective December 10. The legislation requires platforms to take "reasonable steps" to prevent access by underage users, with non-compliance punishable by fines of up to A$49.5 million. Following the ban's commencement, several social media companies, including Meta and Snapchat, reported deactivating hundreds of thousands of accounts identified as belonging to under-16 users.

The law, described by the Australian government as a measure to protect children from online harms, has drawn varied responses from tech companies regarding its effectiveness and implementation challenges, while other nations observe or consider similar policies.

Legislative Framework and Rollout

The Australian government's Online Safety Amendment Act mandates social media platforms to implement age verification methods to prevent access for individuals under 16 years old. Prime Minister Anthony Albanese characterized the initiative as "world-leading," aimed at allowing children to experience childhood without certain online pressures.

The ban covers platforms such as Meta's Instagram, Facebook, and Threads, as well as TikTok, YouTube, X, Reddit, Snapchat, Kick, and Twitch. Fines for non-compliance can reach A$49.5 million (US$33 million, £25 million).

Platform Compliance and Account Deactivations

Meta's Response

Meta initiated account deactivations for users believed to be between 13 and 15 years old in Australia from December 4, ahead of the official ban. Between December 4 and December 11, Meta reported deactivating 544,052 accounts across its platforms: 330,639 on Instagram, 173,497 on Facebook, and 39,916 on Threads. The company estimates that approximately 150,000 Facebook users and 350,000 Instagram users fell within the 13-15 age bracket.

Meta described its compliance process as "ongoing and multi-layered."

Snapchat's Actions

Snapchat reported locking or disabling over 415,000 accounts belonging to under-16 users in Australia by the end of January, with continued daily deactivations.

Broader Deactivation Data

Federal government data indicated that over 4.7 million accounts across 10 platforms were deactivated within the initial two days of the ban. The eSafety Commissioner, Julie Inman Grant, clarified that this total includes historical, inactive, and duplicate accounts, in addition to those identified as being under 16.

Age Verification: Methods and Technical Hurdles

Social media platforms are required to implement age verification methods, which may include submitting a "video selfie" for a facial age scan or providing government-issued identification. The UK-based Age Check Certification Scheme (ACCS) acknowledged the merits of these methods but noted that no single solution works effectively across all deployment scenarios.

Companies have raised concerns regarding the practicalities of age verification.

Snapchat cited "significant gaps" and technical limitations, noting that facial age estimation technology is typically accurate only to within two to three years of a person's actual age.

Snapchat also noted that some facial age estimation technologies lack a "liveness test" to confirm that the image being analysed is authentic. Both Snap and Meta have suggested that age verification should instead be implemented at the app store level, to reduce compliance burdens and ensure consistent, industry-wide protections.

Divergent Responses and Criticisms

Government Perspective

Communications Minister Anika Wells stated that the ban aims to safeguard teens from "pressures and risks they can be exposed to while logged in to social media accounts." Prime Minister Albanese and Minister Wells described initial compliance data as "encouraging," acknowledging that full perfection was not expected immediately.

Industry Concerns

  • YouTube expressed concern that the legislation could reduce child safety by removing established parental controls, arguing that under the ban parents lose the ability to supervise their children's accounts. Rachel Lord, Public Policy Senior Manager at Google and YouTube Australia, called the law "rushed regulation" that "misunderstand[s] the platform."
  • Meta suggested that a law requiring parental approval for under-16s to download social media applications would be beneficial. It also reiterated concerns that the ban could isolate vulnerable teenagers and encourage them to use less regulated applications.
  • Reddit initiated a legal challenge against the Australian government, contending that the ban is inefficient and restricts young people's freedom of speech, potentially isolating them from age-appropriate community experiences, including political discussions.
  • Snapchat expressed concern that the ban's limited scope, excluding certain communication apps, might direct teenagers towards less regulated alternative messaging platforms.

Broader Societal Criticisms

Critics argue the measure could socially isolate groups that rely on platforms for connection and redirect young users to less regulated online environments. Some young people, along with mental health advocates, have said the ban may cut off connections for those in LGBTQ+, neurodivergent, or rural communities, and could leave them less prepared for online interactions when they do gain access.

Reports of Circumvention

Some under-16 users have reported circumventing the ban. Instances include individuals finding ways to regain access to platforms or migrating to other online services not initially covered by the ban. The eSafety Commissioner acknowledges reports of under-16 accounts remaining active and encourages public reporting to platforms for improved age-assurance accuracy.

Expanded Scope and Ongoing Monitoring

The eSafety Commissioner's office has requested self-assessments from other applications, such as Lemon8 (developed by ByteDance, TikTok's parent company) and Yope, regarding their compliance with the ban, particularly in response to the potential migration of young users. Yope's CEO, Bahram Ismailau, stated the company had self-assessed and determined it is not a social media platform but a private messenger. Lemon8 has reportedly committed to excluding users under 16 from its platform.

A Global Precedent: International Observation

Australia's social media ban for under-16s is being watched closely abroad. The law, notable for setting the age limit at 16 and for allowing no parental-consent exemption, is considered one of the world's strictest policies of its kind.

Spain has announced plans for a similar social media ban for under-16s, citing concerns about addiction, abuse, and misinformation. Other governments, including the US state of Florida, the European Union, the UK, and France, are exploring or have proposed measures to limit children's social media use.