SYDNEY — Australia’s internet regulator on Wednesday asked online gaming platforms including Roblox and Microsoft’s Minecraft to spell out how they protect children from grooming by sexual predators and shield young users from radicalization.

The eSafety Commissioner said it had issued legally enforceable transparency notices to Roblox, Minecraft, Epic Games’ Fortnite and Valve’s Steam, seeking details on their safety systems, staffing and measures aligned with cybersecurity protocols.

Companies must respond to the notices, with failure to comply exposing them to penalties of up to A$825,000 ($590,783) a day. They usually have 30 days to respond to compliance notices from Australian regulators.

eSafety Commissioner Julie Inman Grant said gaming-related services, including encrypted messaging, can become the first point of contact between children and offenders involved in grooming, sexual extortion and radicalization.

“What we often see after these offenders make contact with children in online game environments, they then move children to private messaging services,” Ms. Inman Grant said in a statement.

She said gaming platforms also function as major social spaces for children, noting nine in 10 Australians aged 8 to 17 have played online games.

“Predatory adults know this and target children through grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalization and other off-platform harms,” she said.

Microsoft said it was reviewing the regulator’s notice and took children’s online safety seriously.

“We continue to evolve our approach to meet the evolving threat and regulatory landscape,” a spokesperson said by email.

Roblox did not immediately respond to requests for comment.

The move comes amid rising scrutiny of how gaming platforms detect and prevent online threats to minors, particularly as real-time chats with unknown users on some platforms can be harder for automated systems to police than traditional social media.

On Tuesday, Roblox reached settlements with the US states of Alabama and West Virginia over allegations it failed to protect young users, agreeing to pay more than $23 million and make changes to how children access its chat and gaming features.

Roblox is facing more than 140 lawsuits in US federal courts accusing the company of knowingly facilitating child sexual exploitation.

As it grapples with the legal issues, Roblox last week said it would introduce tailored accounts for younger users from June, assigning children aged 5 to 8 to “Roblox Kids” and users aged 9 to 15 to “Roblox Select.” ($1 = 1.3965 Australian dollars) — Reuters