The UK media regulator has launched an investigation into Telegram over concerns it may be failing to prevent child sexual abuse material (CSAM) from being shared. Ofcom said on Tuesday it was probing the popular messaging service after gathering evidence suggesting CSAM was present and being shared on the platform. Under the Online Safety Act, user-to-user services operating in the UK must have systems in place to prevent people from encountering CSAM and other illegal content, or risk substantial fines for breaches.
Telegram said in a statement that it 'categorically denies Ofcom's accusations', asserting that it has virtually eliminated the public spread of CSAM since 2018 through advanced detection algorithms and partnerships with NGOs. The company expressed concern that the investigation may be part of a wider effort against online platforms that uphold freedom of speech and privacy.
This action is part of Ofcom's broader crackdown on services suspected of flouting the UK's stringent online safety regulations. According to Suzanne Cater, Ofcom's director of enforcement, tackling CSAM is a top priority because of the severe damage it causes to victims. The NSPCC has praised Ofcom's investigation, highlighting the alarming number of child sexual abuse image offences recorded daily. The probe also extends to other platforms, including Teen Chat and Chat Avenue, over concerns they expose children to grooming risks.
With serious consequences on the horizon for platforms that fail to comply with the Online Safety Act, Ofcom is determined to ensure that tech firms take the necessary steps to protect children from exploitation online.