Telegram app investigated over concerns it hosts child sexual abuse material
Ofcom has launched an investigation into Telegram, one of the world’s most popular messaging apps.
The communications regulator said it had received evidence from the Canadian Centre for Child Protection of users sharing child sexual abuse material.
The Online Safety Act requires user-to-user services to take proactive steps to screen for harmful material and, if it is published, to remove it, or risk a fine of up to £18 million or 10% of worldwide revenue, whichever is greater.
Ofcom said this morning: ‘In light of this, we have decided to open an investigation to examine whether Telegram has failed, or is failing, to comply with its duties in relation to illegal content.’
Telegram told Metro that it ‘categorically denies Ofcom’s accusations’.
The Dubai-based company added:
‘Since 2018, Telegram has virtually eliminated the public spread of CSAM on its platform through world-class detection algorithms and cooperation with NGOs.
‘We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy.’
‘Child sexual abuse destroys lives’
Rani Govender, associate head of policy at the children’s charity the NSPCC, told Metro that 100 child sexual abuse image offences are recorded by police every day.
‘The scale of this abuse is stark and we strongly welcome Ofcom ramping up action to tackle it, including opening this investigation into Telegram,’ Govender added.
‘Children continue to face unacceptable risks online, especially on private messaging services where abuse can develop undetected.
‘It’s right that Ofcom focuses their attention on investigating these platforms.’
The Internet Watch Foundation, a nonprofit that investigates and collects reports of child sexual abuse imagery, also welcomed the announcement.
IWF communications director Emma Hardy told Metro: ‘Child sexual abuse destroys lives, and the circulation of the images and videos makes those children victims all over again each time they are shared.
‘It is clear there must be nowhere criminals can be allowed to distribute this content.’
Telegram, Hardy said, needs to toughen its safeguarding measures and embrace a ‘zero-tolerance approach’.
The app acts almost as a cross between WhatsApp and X, allowing people to talk, organise and share news privately and in group chats.
Campaigners say that while the app’s light oversight has helped it thrive, it has also made it a tool for drug dealers and far-right groups.
Metro uncovered earlier this year a Telegram group where users ask others to use AI tools, such as X’s Grok, to create non-consensual sexual content.
Telegram said at the time that its terms of service forbid the creation of non-consensual pornography.
The group remains active as of today, with recent posts including users asking for explicit, AI-edited images of teachers and their ‘crushes’.