Telegram consistently declines to participate in child protection programs

Telegram’s Controversial Stance on Child Safety Measures

The BBC has uncovered concerning details about Telegram, the popular messaging app founded by billionaire Pavel Durov. Despite a user base of more than 950 million people, Telegram has declined to join international initiatives aimed at detecting and removing child abuse material online. The revelation comes amid Durov’s recent arrest in France, where he faces allegations of inadequate moderation on the platform, including in relation to drug trafficking, child sexual abuse material, and fraud.

Telegram is notably absent from two significant organizations: the US-based National Center for Missing & Exploited Children (NCMEC) and the UK’s Internet Watch Foundation (IWF). Both work closely with numerous online platforms to combat child exploitation by identifying, reporting, and removing abusive material. While many social networks actively participate in these efforts, Telegram has ignored repeated requests from NCMEC to join its CyberTipline program, which counts more than 1,600 registered internet companies among its participants.

Durov’s arrest has raised questions about Telegram’s commitment to user safety. Officials have indicated that his detention stems from a failure to cooperate with law enforcement regarding the platform’s moderation practices. Telegram has previously claimed that its moderation efforts align with industry standards and are continuously improving. However, unlike its competitors, Telegram does not participate in the established frameworks that help detect and remove child sexual abuse material (CSAM).

Telegram was founded in Russia but is now headquartered in Dubai, where Durov resides. The app is popular across Russia, Ukraine, and other former Soviet states, as well as in Iran. Despite its global reach, Telegram’s failure to engage with NCMEC and the IWF has raised significant red flags: the vast majority of CSAM reports originate from major tech companies and social networks such as Facebook, Google, Instagram, TikTok, X (formerly Twitter), Snapchat, and WhatsApp.

The IWF has expressed frustration over Telegram’s lack of cooperation, stating that despite attempts to engage with the app over the past year, it remains unresponsive to the foundation’s initiatives. As a result, Telegram does not proactively search for, remove, or block confirmed CSAM, which organizations such as the IWF and NCMEC categorize and compile into shared lists for member platforms. While Telegram does remove confirmed CSAM when alerted, the process is reportedly slower and less responsive than on other platforms.
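
To make that mechanism concrete: cooperating platforms receive compiled lists of digital fingerprints (hashes) of confirmed material and automatically compare uploads against them. The sketch below is a minimal, hypothetical illustration of that matching step, not Telegram’s, the IWF’s, or NCMEC’s actual tooling. It assumes a simple exact-match SHA-256 list, whereas production systems use perceptual hashes (such as Microsoft’s PhotoDNA) so that slightly altered copies still match; the function names are invented for illustration.

```python
import hashlib

# Hypothetical hash list of confirmed material, as compiled and
# distributed by a clearinghouse such as the IWF or NCMEC. Real
# deployments use perceptual hashes (e.g., PhotoDNA) rather than
# SHA-256, so that slightly altered copies of an image still match.
KNOWN_BAD_HASHES: set[str] = set()  # would be populated from the clearinghouse feed


def is_known_csam(file_bytes: bytes) -> bool:
    """Check an upload against the compiled hash list (exact match only)."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_HASHES


def handle_upload(file_bytes: bytes) -> str:
    # A cooperating platform blocks a match before the file is stored
    # or distributed, and files a report with the clearinghouse.
    if is_known_csam(file_bytes):
        return "blocked-and-reported"
    return "accepted"
```

The value of the compiled list is that a platform never needs to host or inspect the original material: matching a fingerprint is enough to block and report an upload.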

In addition to its refusal to collaborate with child protection organizations, Telegram is not part of the Take It Down program, which focuses on removing non-consensual intimate images, often referred to as “revenge porn.” In stark contrast, major platforms such as Snap, Facebook, Instagram, Threads, TikTok, Pornhub, and OnlyFans are active participants in that initiative.

Transparency is another area where Telegram diverges from industry norms. Most social networks publish regular reports detailing content removed in response to law enforcement requests, and companies like Meta, Snapchat, and TikTok maintain publicly accessible online libraries of these reports. Telegram, by contrast, offers only limited transparency reporting: updates are posted to an in-app channel, with no comprehensive public archive. When asked for previous reports, Telegram responded vaguely, indicating that none were available for certain regions.

Telegram’s communication practices have also drawn criticism. Media inquiries are typically handled through an automated bot within the app, which has proven to be unresponsive. Although there is an unadvertised email address for press inquiries, responses have been slow or nonexistent.

In a recent interview, Durov revealed that he employs only about 30 engineers to run the platform, raising questions about the company’s capacity to handle safety and moderation effectively. His dual French and Emirati citizenship, along with his Russian origins, adds a further layer of complexity to the scrutiny surrounding Telegram.

As concerns about child safety online continue to escalate, Telegram’s refusal to join established protective measures has sparked significant debate. The app’s lack of engagement with organizations dedicated to combating child exploitation raises important questions about its commitment to user safety and responsible platform management. For a service with such a vast user base, these issues are critical to address in the ongoing conversation about online safety and accountability.
