LONDON — Online gaming platform Roblox is rolling out new safety measures for players under the age of 13 in response to concerns about harmful content and user interactions. The platform, which has around 70 million daily users worldwide, is particularly popular among children, but it has faced growing criticism over whether it does enough to protect younger users from inappropriate content.
Starting December 3, Roblox will require game creators to indicate whether their games are suitable for children under 13. Games that fail to meet the criteria will be blocked for players in this age group. This move is aimed at enhancing content moderation and ensuring that younger users are not exposed to inappropriate or upsetting material.
In addition to these game-specific updates, starting November 18, Roblox will restrict children under 13 from accessing “social hangouts,” which are spaces designed for player interaction via text and voice. These experiences, which focus on communication rather than gameplay, have been flagged as potentially risky, given their emphasis on real-time, unmoderated socializing.
Younger players will also be barred from using “free-form 2D user creation” tools, which allow users to create and share drawings or written content. Because these creations bypass Roblox’s moderation system, they have been seen as a potential avenue for the spread of harmful material.
“We recognize the deadline is soon, but we greatly appreciate your cooperation in helping us ensure Roblox is a safe and civil place for users of all ages to come together,” the company said in a statement on its developer website.
The new restrictions come amid growing concerns about online safety. According to media regulator Ofcom, Roblox is the most popular gaming platform in the UK among children aged 8 to 12. However, the platform has come under fire after reports of inappropriate interactions. In May, a young user told the BBC they were asked for sexual images while playing on the platform. In response, Ofcom urged tech companies to enhance protections for children, including measures to block “toxic” content.
The platform has also faced challenges abroad: Turkey blocked access to Roblox in August over similar safety concerns.
In a statement to the BBC, Roblox acknowledged the criticism and emphasized its ongoing commitment to improving safety. “We’re constantly strengthening our safety systems and policies — we shipped over 30 improvements this year and we have more to come,” the company said. While the new measures take effect over the coming weeks, Roblox confirmed that full enforcement will not begin until 2025.