Australian Government Raises Child Safety Concerns on Roblox, Seeks Meeting and Rating Review

The Australian government has formally expressed significant concerns regarding child safety on the online gaming platform Roblox, prompting Communications Minister Anika Wells to request an urgent meeting with the company. These actions follow ongoing reports of children's exposure to inappropriate content and potential grooming attempts.

The eSafety Commissioner, Julie Inman Grant, has also announced plans to verify Roblox's adherence to its safety commitments, while the Australian Classification Board has been asked to review the platform's current PG rating.

Government Actions and Concerns

Communications Minister Anika Wells has requested an urgent meeting with Roblox, citing alarming reports of children being exposed to graphic user-generated content, including sexually explicit and self-harm material. The Minister also highlighted significant concerns about predators using the platform for child grooming.

In a letter to Roblox, Minister Wells referenced recent media reports and legal charges against an individual accused of grooming hundreds of children on various platforms, including Roblox. She has consulted with the eSafety Commissioner regarding potential short-term measures and has asked the Australian Classification Board to assess the continued appropriateness of Roblox's current PG rating.

Prime Minister Anthony Albanese described the reports as "horrendous" and stated the government's commitment to taking necessary actions based on the eSafety Commissioner's advice, emphasizing that children's online safety is non-negotiable.

The government is actively exploring additional regulatory options for online services such as Roblox to ensure children's protection.

eSafety Commissioner's Involvement

The eSafety Commissioner, Julie Inman Grant, has formally informed Roblox of her office's intent to verify the platform's compliance with safety commitments made last year. These commitments reportedly included setting accounts for users under 16 to private by default and introducing tools to prevent adult users from contacting minors without parental consent. The eSafety office plans to rigorously test Roblox's implementation of these and other measures.

Commissioner Inman Grant said her office would step up monitoring of Roblox's safety protocols because of sustained concerns about child exploitation and exposure to harmful material. Following these compliance tests, the eSafety Commissioner may consider further action against Roblox under the Online Safety Act, which provides for penalties of up to $49.5 million for non-compliance.

New codes under this legislation, specifically addressing age-restricted material, grooming, and sexual extortion, will apply to Roblox starting March 9.

Roblox's Context and Stated Safety Measures

Roblox, an online gaming platform where users create their own mini-games, is the most popular gaming application among Australian children aged four to 18. It has approximately 111 million daily users globally, with Australia being its second-largest market. The platform was previously exempted from Australia's under-16s social media ban, an exemption based on prior safety commitments made in cooperation with the eSafety Commissioner.

Roblox has stated it has robust safety policies and advanced safeguards to monitor for harmful content and communications. The company claims to be the first large online gaming platform to require facial recognition age checks for all users to access certain 17+ content and features.

In September, Roblox outlined commitments to align with Australia's Online Safety Act, which included:

  • Default private accounts for users under 16.
  • Tools to prevent adults from contacting under-16s without parental consent.
  • Default deactivation of direct and in-game chat features for children in Australia until age estimation is completed.
  • Prohibition of voice chat between adults and 13- to 15-year-olds, and a complete prohibition for under-13s.
  • Parental controls to disable chat for 13- to 15-year-old users.
  • Verification that users are 17 or older to access certain games.

The company also previously informed the regulator that it had fulfilled commitments regarding non-consensual sharing of intimate images, grooming, and sexual extortion, and that it planned to expand age verification for communication features by the end of last year. However, reports suggest these issues persist despite the measures, with some teenagers reportedly circumventing verification tools.

Background and Prior Reports

The increased scrutiny on Roblox follows various reports, including a Guardian Australia report from November. This report documented instances of virtual sexual harassment and violence experienced by a reporter playing as an eight-year-old character, even with parental controls activated.

Previous reports also indicated that Roblox had been used by groups to encourage young girls to self-harm, and that it had allegedly been infiltrated by right-wing extremists and used to spread Islamic State propaganda.

Minister Wells has highlighted the need for a "digital duty of care," proposing that digital platforms should be proactively responsible for keeping their users, particularly children, safe.