Australia's Social Media Ban Excludes Gaming Platforms Amid Expert Concerns
Australia has implemented a ban on social media use by individuals under 16 years old. The legislation, which took effect on Wednesday, restricts access to ten specified social media platforms, including Instagram, Snapchat, and X. YouTube and TikTok content remains viewable without an account, while gaming platforms have been excluded from the ban entirely.
Concerns from Health Professionals
Dr. Daniela Vecchio, a psychiatrist and founder of Australia's only publicly funded gaming disorder clinic, at Fiona Stanley Hospital in Perth, has expressed concern about this exclusion. The clinic treats approximately 300 patients for excessive online gaming, including 15-year-old Sadmir Perviz. Dr. Vecchio says that while gaming is not inherently harmful, it can develop into an addictive behavior. She identifies similar risks for children on gaming platforms and social media alike: excessive online engagement, potential exposure to predators, harmful content, and cyberbullying. Given how intertwined gaming and social media use have become, she questions why gaming platforms were left out of the ban.
Sadmir Perviz, who once spent up to 10 hours a day on online games, now participates in board game sessions at the clinic. Former patient Kevin Koo, 35, described a period in which he became deeply absorbed in online gaming after losing his job, comparing the experience to substance abuse. Dr. Vecchio suggests expanding the social media ban to include gaming platforms and raising the age limit to 18.
Gaming disorder is recognized as a diagnosis by the World Health Organization. A 2022 Macquarie University study indicated that approximately 2.8% of Australian children are affected by gaming disorder.
Platform-Specific Issues and Industry Responses
Dr. Vecchio highlighted Discord and Roblox as platforms of particular concern, citing reports from experts and parents about potential exposure to explicit or harmful content. Both platforms have faced child safety lawsuits in the United States.
In response, Roblox introduced new age assurance features in Australia and two other countries weeks prior to the social media ban, with a global rollout planned for January. Discord also implemented age checks on certain features earlier this year and, on Wednesday, announced a new "teen-by-default" setting for all Australian users.
Legislative Criteria and Expert Criticism
The Australian government has stated the ban's purpose is to protect children from harmful content, cyberbullying, online grooming, and "predatory algorithms." The Australian Federal Police have warned that chatrooms on various platforms can be environments for radicalization and child exploitation.
However, the eSafety Commissioner clarified that platforms were not selected for the ban based on a safety, harms, or risk-based assessment. Instead, the legislation's criteria focused on platforms where the sole or significant purpose is:
- Enabling online social interaction between two or more users.
- Allowing users to interact with some or all other users.
- Permitting users to post content.
Gaming platforms were exempted because their primary purpose is not categorized as social-media-style interaction under these criteria.
Professor Marcus Carter of the University of Sydney described the legislation as "incompetence" and "reactionary," arguing that the focus should be on practical assistance rather than broad bans. Professor Tama Leaver of Curtin University and the ARC Centre of Excellence for the Digital Child suggested that a more nuanced approach is needed. He noted the wide spectrum of gaming, from beneficial platforms like Minecraft to platforms like Roblox, which he describes as an enabling tool for user-generated games, some of them adult-oriented yet still accessible to young users. Both academics emphasized the need for age-appropriate regulation in the digital sphere.