Canada warns Roblox may help predators and extremists reach children
Canada’s safety brief said Roblox’s chat-heavy, child-dominated platform could funnel children from play into contact with predators and extremists, who then move conversations to Discord and Snapchat.

Canada’s federal government has flagged Roblox as more than a moderation headache. A Public Safety Canada brief said the platform’s youth-heavy audience, social tools and huge stream of user-generated content created unusual openings for predators and extremists to reach children, groom them and pull them into abuse that could continue on other apps.
The warning went beyond one kind of harm. The brief linked Roblox to child sexual abuse, self-harm content and extremist communities, and said violent extremists, white nationalists and pedophiles could use the game to target minors before moving conversations to services such as Discord and Snapchat, where abuse can disappear from view. That makes Roblox look less like a single game and more like a hybrid of a chat app, a social network and a media feed, which is exactly the kind of setup regulators worry about when kids are the main audience.
The concern lands inside Canada’s broader counter-radicalization push. The Canada Centre for Community Engagement and Prevention of Violence leads federal efforts to counter radicalization to violence, and on December 10, 2025, Ottawa added 764, Maniac Murder Cult, Terrorgram Collective and Islamic State-Mozambique to the Criminal Code list of terrorist entities. Canada said it was the first country to list 764, which Public Safety Canada describes as a decentralized transnational network of online nihilistic violent extremists. The government said the move was meant in part to counter the radicalization of young people online.
The scale of Roblox is what makes the warning sting. Coverage of the brief cited close to 100 million daily users, with about half under 13. That means any safety lapse can touch an enormous number of children in the exact age range that parents are trying to keep fenced in.
Roblox has moved to show it is tightening up. In September 2025, the company said its open-source Sentinel AI helped submit about 1,200 reports of potential child exploitation to the National Center for Missing and Exploited Children in the first half of 2025. Roblox also said it had shipped more than 100 safety initiatives since January 2025. On April 13, 2026, it announced Roblox Kids for ages 5 to 8 and Roblox Select for ages 9 to 15, with rollout planned for early June and stricter defaults for content, communication and parental controls.
For families, the likely result is a more locked-down Roblox. If Ottawa keeps pressing, the platform could face heavier age checks, tougher identity verification, stricter content filters and more coordination across platforms, all of which would change the loose, freewheeling feel that made Roblox such a massive hit in the first place.