Australia Penalizes Telegram Over Delays in Investigating Violent Content

eSafety Regulator Calls Out Telegram Over Handling of Extremist and Child Sexual Abuse Content


In a significant enforcement action, Australia’s eSafety regulator has fined messaging platform Telegram nearly AU$1 million for its delayed response to inquiries about how it manages violent, extremist and child sexual abuse content on its platform. The action underscores growing scrutiny of social media companies’ accountability for handling harmful content.



Australia’s eSafety commissioner said Telegram took more than 160 days to respond to a notice requesting information on how it addresses terrorism-related content and child exploitation material on its platform, well past the mandated deadline of May 6. The delay prompted a penalty of AU$957,780.


Julie Inman Grant, the eSafety commissioner, emphasized the necessity for enhanced transparency within the tech industry. She remarked, “If we want accountability from the tech industry, we need much greater transparency. These powers give us insight into how these platforms are managing serious online harms affecting Australians.”


This regulatory action follows a series of transparency notices issued by the eSafety commissioner’s office in March 2024 to other major social networking platforms, including Meta, WhatsApp, Google, Reddit, and X (formerly Twitter). Telegram was the sole platform to fail to meet the imposed deadline.


The fine could have been larger, Grant said, but the amount reflects the seriousness of Telegram’s delayed response while acknowledging the company’s subsequent improvement in its correspondence with the commissioner’s office. Telegram now has 28 days to pay the fine, request an extension or ask for the infringement notice to be withdrawn.

Since the adoption of Australia’s Online Safety Act in 2021, the eSafety commissioner has gained the authority to hold social media platforms, internet service providers, and app distribution services accountable for the existence of abusive content on their networks.

Serving over 900 million users globally, Telegram has faced challenges with regulators, often accused of not enforcing local laws regarding harmful content. This ongoing scrutiny reached a new level when French authorities arrested Telegram’s CEO, Pavel Durov, on charges linked to facilitating hacking and disseminating abusive material.

In the wake of Durov’s arrest, Telegram indicated it would begin sharing IP addresses and contact information of offenders in response to legitimate legal requests. Furthermore, the company announced the introduction of AI technologies and human moderators to better manage problematic content on its platform.

Telegram has also faced problems in South Korea, where a surge in deepfake videos victimizing minors prompted authorities to seek Durov’s cooperation through French officials after his arrest, highlighting the global challenge of regulating digital content.

Australia’s regulator is pressing Telegram and similar platforms for greater accountability, emphasizing the harm online radicalization can inflict on society. Grant said exposure to harmful online material can have lasting consequences, particularly for young users, who are especially vulnerable, and eSafety is prepared to pursue civil penalties should Telegram fail to meet its compliance obligations.
