Roblox has defended a major expansion of its child safety system, despite complaints from parents that the game’s user-age estimation can misclassify children and put them into less-protected versions of the service.
The gaming company, which says it has 144 million daily users worldwide, is extending its tech to introduce age‑specific accounts called Roblox Kids and Roblox Select.
A user’s estimated age will now determine which version of Roblox they can access, which individual features are available to them, what content they see and who they can communicate with.
Parents told the BBC some children have been incorrectly identified as adults during the age‑check process, which they say can reduce parental controls.
A simple question
In an interview with BBC News, Matt Kaufman, Roblox’s chief safety officer, said the company’s age‑estimation system, which includes facial analysis, is now used by more than half of its daily users, amounting to tens of millions of people worldwide.
He said it typically estimates age “within about 1.4 years, plus or minus” for those under 18.
Roblox has not published data showing how often children are incorrectly classified as older users, but Kaufman argued the technology is more reliable than asking users to state their age themselves.
“When you ask them that simple question,” he said, “users are going to tell you whatever they want to tell you in order to get access.”
Age-rated accounts
Roblox already requires users to pass an age check before they can use chat features, placing people into age bands intended to limit communication between children, teenagers and adults.
The company says this is designed to reduce the risk of grooming and unwanted contact on the platform.
The new system extends that approach into account types:
- Roblox Kids, aimed at younger children, features a simplified interface, no communication tools and access only to a curated set of games.
- Roblox Select, for users aged nine to 15, allows limited communication and a broader but still restricted content library.
Under the changes, users who do not complete an age check will be restricted to children’s content and barred from communication on the platform.
The move comes the week after the mother of a 14-year-old girl who was groomed into sending sexually explicit images of herself to an 18-year-old man said the platform was not doing enough to protect children.
Two million developers
To decide which experiences are made available to under-16s, Kaufman said Roblox uses a range of signals, including how long a game has been on the platform and the history and usage patterns of the people who made it.
Games will also have to meet suitability criteria, and those which include social or free-form elements will not be available by default on Kids and Select accounts.
The people who make this content are known as ‘developers’, and Kaufman says Roblox has more than two million of them.
However, Kaufman said it was “irresponsible to choose one of those two million and have their opinion dictate how everybody feels about the platform.”
Yet some parents say when age checks go wrong, correcting the error can be difficult and stressful.
As scrutiny intensifies around children’s safety online, Roblox finds itself walking a familiar tightrope: balancing rapid growth with rising responsibility. Its latest updates to age verification and parental controls signal progress, but they also highlight a deeper question: can platforms truly police themselves when the stakes involve children?
Kaufman has acknowledged that errors in age verification can occur, often attributing complaints to parents completing checks on behalf of their children or misunderstanding the system. Roblox, for its part, offers remedies: appeals, ID verification and the ability to reset age checks. It has also introduced stricter parental controls, allowing guardians to block games and manage direct messages until a child turns 16.
These measures are not insignificant. They reflect a growing awareness within the tech industry that safeguarding younger users is no longer optional but expected. Yet they also underscore the complexity of enforcing digital age boundaries in an environment where identity can be fluid and easily misrepresented.
Experts remain cautiously optimistic. Sonia Livingstone has described Roblox’s efforts as “encouraging,” but her assessment comes with a warning. There is, she argues, mounting evidence that risks persist—ranging from exposure to inappropriate content to the possibility of unwanted contact between adults and children.
Her concern cuts to the core of the debate: transparency and accountability. Parents are not just asking for tools; they are asking for assurance. Assurance that moderation systems work, that reporting mechanisms are effective, and that age verification is not quietly repurposed for commercial gain.
This conversation is unfolding against a backdrop of tightening regulation. Laws such as the UK’s Online Safety Act are beginning to force platforms to take greater responsibility for user safety, particularly for minors. Globally, governments are exploring stricter rules, including limits on social media access for younger users.
For Roblox CEO Dave Baszucki, the stance remains clear: parents should ultimately decide whether their children use the platform. But that position, while reasonable in principle, depends on a practical condition: parents can only make informed decisions if the systems they rely on are trustworthy and transparent.
Roblox’s scale complicates everything. As one of the largest gaming platforms in the world, its influence and its exposure to criticism are both inevitable. With that scale comes a higher standard. Incremental improvements may no longer be enough.
The challenge ahead is not just about adding features or refining algorithms. It is about rebuilding trust. And in the digital world, trust is not earned through claims of “going above and beyond,” but through consistent, verifiable protection of those most vulnerable.
For now, Roblox has taken steps forward. Whether those steps are sufficient remains an open, and increasingly urgent, question.

