Roblox is rolling out mandatory age verification features in Australia, starting this week, to enhance child safety on its popular gaming platform. The move comes amid growing concerns and lawsuits alleging the platform has failed to adequately protect minors from predators and inappropriate content. The new system uses facial age estimation technology to sort users into age groups, restricting chat between different age brackets.
Key Takeaways
- Roblox is implementing mandatory age verification using facial scanning technology.
- The feature aims to prevent minors from interacting with adults they don’t know.
- Australia, New Zealand, and the Netherlands will see the feature implemented in early December, with a global rollout in January.
- Roblox argues its platform is a gaming environment and should not be subject to Australia’s upcoming social media ban for under-16s.
- The company faces multiple lawsuits in the US related to child safety concerns.
Age Verification Rollout and Functionality
Starting this Wednesday, users in Australia will be able to voluntarily verify their age using Persona’s age estimation technology, which accesses a device’s camera to estimate age based on facial features. This feature will become mandatory in Australia, the Netherlands, and New Zealand from the first week of December, expanding to other markets in early January. Once verified, users will be assigned to one of six age groups, limiting their ability to chat with peers outside their own or similar groups. Users who opt out of age verification will still be able to use Roblox but will lose access to chat features.
Stance on Social Media Ban and Safety Concerns
Roblox insists that Australia’s upcoming under-16s social media ban should not apply to its services. The company’s chief safety officer, Matt Kaufman, described Roblox as an "immersive gaming platform" where games serve as "scaffolding for social interaction," differentiating it from social media platforms focused on content feeds. This stance comes as regulators have been in discussions with Roblox regarding safety, especially after an investigation highlighted instances of virtual sexual harassment on the platform. Despite the new measures, child safety advocates like the NSPCC have called for Roblox to ensure practical changes and prevent adult perpetrators from targeting young users.
Technology and Privacy Considerations
The age estimation technology, provided by Persona, has shown varying accuracy rates in trials. Roblox states that if users disagree with a ruling, they can correct it using government ID or parental controls. The company assures that data from scans will be deleted after verification, with ID images kept for 30 days for fraud detection before deletion. While age verification is optional for using the platform, it is required for chat features. This initiative is Roblox’s response to ongoing lawsuits in the US alleging the platform has not done enough to safeguard children from predators and inappropriate content, with plaintiffs claiming profits were prioritized over safety.
Sources
- Roblox rolls out age-verification features in Australia as gaming platform insists child social media ban should not apply, The Guardian.
- Roblox CEO announces new age verification rules after lawsuits allege popular gaming platform harms children, CNN.
- Roblox blocks children from chatting to adult strangers, BBC.
- Roblox to scan users’ faces to verify age amid child safety concerns, Los Angeles Times.