Australia asks Roblox, Minecraft to detail child safety measures

By The Star


SYDNEY, April 22 (Reuters) - Australia's internet regulator on Wednesday asked online gaming platforms including Roblox and Microsoft's Minecraft to spell out how they were protecting children from grooming by sexual predators and youth from radicalisation.

The eSafety regulator said it had issued legally enforceable transparency notices to Roblox, Minecraft, Epic Games' Fortnite and Valve's Steam covering systems, staffing and safety aligned with cyber security protocols.

Companies must respond to the notices, with non-compliance exposing them to penalties and potential civil action.

eSafety Commissioner Julie Inman Grant said gaming-adjacent services, including encrypted messaging, can be the first point of contact between children and offenders in cases of grooming, sexual extortion and radicalisation.

"What we often see after these offenders make contact with children in online game environments, they then move children to private messaging services," Inman Grant said in a statement.

She said gaming platforms also function as social spaces, noting nine in 10 Australians aged eight to 17 have played online games.

"Predatory adults know this and target children through grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalisation and other off-platform harms," she said.

Roblox and Microsoft did not immediately respond to requests for comment.

The move comes amid heightened scrutiny of how gaming platforms detect online threats to minors; real-time chats with unknown users on some platforms are harder for automated tools to police than traditional social media.

On Tuesday, Roblox reached settlements with Alabama and West Virginia over claims it failed to protect young users, agreeing to pay more than $23 million and make changes to how it allows children to use its chat and gaming functions.

Roblox is facing more than 140 lawsuits in U.S. federal courts accusing the company of knowingly facilitating child sexual exploitation.

As it grapples with the legal issues, Roblox last week said it would create tailored accounts for young users from June, assigning users aged five to eight to "Roblox Kids" and those aged nine to 15 to "Roblox Select."

(Reporting by Renju Jose in Sydney; Editing by Chris Reese)


