Australia has implemented a new federal law prohibiting individuals under 16 years of age from accessing specified social media platforms. The legislation, which became effective on Wednesday, has drawn varied responses from the public, tech companies, and experts, and has sparked international interest, including a similar consultation in the UK.
Policy Implementation and Scope
The ban applies to ten specified social media platforms, including Instagram, Snapchat, and X. YouTube and TikTok accounts are also covered, though both platforms remain accessible without an account. Online gaming platforms are excluded from the legislation. Social media companies are required to take "reasonable steps" to enforce the age restriction, and penalties for serious breaches could reach A$49.5 million. The legislation passed into federal law in late November 2024, following a year of development.
Rationale for the Law
Proponents of the ban, including South Australian Premier Peter Malinauskas and Prime Minister Anthony Albanese, say the legislation aims to protect children from online harms such as excessive screen time, exposure to pressures, cyberbullying, predators, and "predatory algorithms." Malinauskas initiated state-level legislation after reading Jonathan Haidt's book "The Anxious Generation," published in March 2024, which examines the effects of smartphones on childhood. He said child protection was the primary consideration behind the policy. The Australian Federal Police have also noted that chatrooms on various platforms can be environments for radicalization and child exploitation.
Emma Mason, a campaigner whose 15-year-old daughter died by suicide after experiencing online bullying, has publicly supported the legislation, saying it is the government's responsibility to protect vulnerable people from "unregulated agents of harm" online. She said she hoped the ban would stop younger generations from growing up in an unregulated online environment.
Legislative Criteria and Gaming Exclusion
The eSafety Commissioner clarified that platforms were not selected for the ban on the basis of a safety, harms, or risk assessment. Instead, a platform was included if its "sole or significant purpose" was to enable online social interaction between two or more users, if it allowed users to interact with some or all other users, and if it allowed users to post content. Gaming platforms were excluded on the grounds that, under these criteria, their primary purpose is not social media-style interaction.
This exclusion has raised concerns among critics and health professionals. Dr. Daniela Vecchio, a psychiatrist who established Australia's only publicly funded gaming disorder clinic in Perth, noted that gaming and social media are often interconnected and can present similar risks for children, such as excessive online engagement, exposure to harmful content, and cyberbullying. Gaming disorder is recognized by the World Health Organization, and a 2022 Macquarie University study indicated that approximately 2.8% of Australian children are affected. Dr. Vecchio highlighted Discord and Roblox as platforms of particular concern, citing reports of potential exposure to explicit or harmful content; both platforms have faced child safety lawsuits in the United States.
Industry Responses and Age Verification Efforts
Social media firms have opposed the ban, arguing that it could make children less safe, that it infringes on user rights, and that age verification technologies may not work reliably. Paul Taske of NetChoice, a trade group representing several large tech companies, said Australia's ban amounts to "blanket censorship."
Tech company representatives have engaged with government officials, with Snap CEO Evan Spiegel meeting Australia's Communications Minister Anika Wells. Public statements from companies such as Meta and Snap have suggested that major app store operators, such as Apple and Google, should be responsible for age verification. Meta further stated that "legislation that empowers parents to approve app downloads and verify age allows families - not the government - to decide which apps teens can access." Minister Wells noted that tech companies have had 15 to 20 years to improve their practices voluntarily and that their efforts so far have been insufficient.
Amid mounting pressure, some social media companies have introduced features marketed as safer for younger users. Snapchat reported approximately 440,000 users aged 13-15 in Australia, TikTok around 200,000 under-16 accounts, and Meta about 450,000 users in this age group across Facebook and Instagram. Initiatives include:
- YouTube has rolled out AI technology to estimate user age and restrict harmful content for those under 18.
- Snapchat has introduced special accounts with default safety settings for 13-17 year olds.
- Meta's Instagram Teen accounts apply stricter privacy and content settings for users under 18. A study co-led by former Meta whistleblower Arturo Béjar in September indicated that nearly two-thirds of the new safety tools on these accounts were ineffective.
- Roblox introduced new age assurance features in Australia weeks before the ban took effect, with a global rollout planned for January.
- Discord implemented age checks for certain features earlier this year and introduced a new "teen-by-default" setting for Australian users alongside the ban.
Concerns and Criticisms from Diverse Groups
The ban has elicited various concerns and criticisms:
- Youth Perspectives: Breanna Easton, 15, who lives 1,600km north-east of Brisbane, said the restriction affects her ability to communicate with friends living more than 100km away. Conversely, Lola Farrugia, 12, who does not currently use social media, expressed support for the ban, while Jacinta Hickey, 14, said she believes she has the maturity to navigate online content responsibly.
- Parental Concerns: Megan Easton, Breanna's mother, acknowledged the need to protect children but expressed reservations about government overreach and the removal of parents' discretion to guide their children's digital experiences, particularly for families in remote areas.
- Minority Groups: A survey by Minus18, an LGBTQ+ youth support organization, indicated that 96% of nearly 1,000 respondents considered social media vital for accessing support and friends, with 82% believing the ban would lead to disengagement. Sadie Angus, 13, who identifies as LGBTQ+, said she felt cut off after her Instagram account was closed, describing it as an anonymous space for sharing experiences and connecting with role models. Sharon Fraser, CEO of Reframing Autism, raised concerns for autistic young people, highlighting that online platforms can offer ways to communicate and socialize that are not always readily available offline.
- Academic and Expert Critique: Professor Marcus Carter, a human-computer interaction expert at the University of Sydney, described the legislative approach as "reactionary" and questioned its effectiveness. Professor Tama Leaver, an internet studies expert at Curtin University, suggested the ban is "too blunt a tool" and advocated for a more nuanced approach, including for gaming platforms. He noted the wide spectrum of gaming experiences. Experts emphasize the need for age-appropriate regulation across all online environments and suggest that if harms are socially rooted, technical restrictions alone may not resolve them.
- Practical Challenges: Critics raise concerns that restricting access may push young users towards less regulated online spaces or encourage circumvention of age verification measures, potentially making monitoring more difficult for parents and support services.
Educational Perspectives
Iris Nastasi, principal of Rosebank College in Sydney, expressed support for the legislation, citing the detrimental effects of extensive social media use, including damaged peer relationships. She said her perspective had shifted and that she now places greater emphasis on preserving childhood innocence.
Legal Challenges and International Interest
The law currently faces a High Court challenge brought by two teenagers. Potential disputes with technology companies also remain, as does a warning from US President Donald Trump concerning American businesses.
Minister Wells reported that leaders from the EU and from countries including Fiji, Greece, Malta, Denmark, Norway, Singapore, and Brazil have expressed interest in Australia's approach or are developing similar legislation. The UK government has opened a consultation on a ban on social media use for under-16s, similar to the Australian model, to address concerns about young people's mental health, online abuse, and exposure to harmful content. However, UK researchers have cautioned that a blanket ban might not address underlying issues, noting that online harms are often connected to offline harms, and advocate instead for improved education, open discussions, and greater adult understanding to help young people navigate digital realities.