UK regulator opens Telegram probe over child abuse material safeguards

Ofcom opened a formal probe into Telegram over CSAM safeguards, with fines of up to £18 million or 10% of global revenue on the line.

Lisa Park · 2 min read

Britain’s communications regulator opened a formal investigation into Telegram on 21 April 2026, saying it is examining whether Telegram Messenger Inc. has failed, or is failing, to meet its illegal content safety duties under the Online Safety Act 2023 in relation to child sexual abuse material. Ofcom said the case was launched after evidence from its own assessment of the platform and from the Canadian Centre for Child Protection pointed to alleged sharing of CSAM on the service.

The probe matters because it puts one of the UK’s newest online safety powers to the test against a major messaging platform that has long sat between public social media and private communication. Ofcom said the law requires regulated user-to-user services to use proportionate systems and processes to prevent people from encountering priority illegal content, to reduce the risk that a service is used to facilitate priority offences, and to minimize how long illegal material remains online once it is identified. Those duties came into force on 17 March 2025, and providers can comply either by following Ofcom’s codes of practice or by using equivalent measures.

Ofcom’s enforcement tools are substantial. The regulator said compliance failures can lead to fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater. In the most serious cases, it can also seek a court order requiring third parties to take action that disrupts a provider’s business. That makes the Telegram investigation more than a single-platform case: it is an early test of how far regulators can push encrypted or semi-private services to build safety systems without claiming the power to read every message. Telegram is likely to argue that broad moderation duties on a global messaging service are hard to square with privacy, scale and the basic utility of the platform, even as Ofcom’s legal theory centres on whether the company’s internal safeguards meet the statutory standard.

Suzanne Cater, Ofcom’s director of enforcement, said the regulator had seen progress on file-sharing services but warned that the risk extends beyond them. “These firms must do more to protect children, or face serious consequences under the Online Safety Act,” she said. Ofcom also said that, alongside Telegram, it had opened investigations into Teen Chat and Chat Avenue over whether they were doing enough to prevent grooming risks. For child protection advocates, the Telegram case will be watched as a measure of whether the Online Safety Act can do more than signal concern and actually force product-level changes on platforms where harmful material is harder to detect and easier to move.

