Telegram's policies permit fraud on its platform — and worse

Pavel Durov, chief executive officer of Telegram. Chris Ratcliffe/Bloomberg

Following the arrest of Telegram CEO Pavel Durov in France over the weekend, the platform's role in criminal networks for cybercrime, fraud, drug trafficking, child exploitation and more has come under scrutiny.

According to a statement from France's top public prosecutor Laure Beccuau, Durov's arrest comes in the context of an investigation against "person unnamed" on multiple charges, including complicity in the possession and distribution of pornographic images of minors, also known as child sexual abuse material, or CSAM.

On Wednesday, the Associated Press reported that French prosecutors freed Durov from police custody after four days of questioning.

"An investigating judge has ended Pavel Durov's police custody and will have him brought to court for a first appearance and a possible indictment," a statement from the Paris prosecutor's office said.

The charges Beccuau's office brought against Durov on Wednesday include complicity in organized fraud, web-mastering an online platform to enable an illegal transaction and refusal to communicate information or documents necessary for carrying out lawful interceptions (also known as data warrants).

Telegram is part of a family of platforms and applications that market heavily on privacy and free speech. It offers end-to-end encryption of messages, which prevents third parties (including Telegram itself) from reading them, a feature also found in platforms such as Signal and WhatsApp. Unlike those services, however, Telegram does not encrypt messages end-to-end by default; users must opt in through its "secret chats" feature.

But Telegram stands apart from other privacy-focused messaging platforms in one key manner: It is highly permissive about the kind of activities that are implicitly allowed on the platform.

In its terms of service, Signal requires users to only use the service for "legal, authorized, and acceptable purposes." Meta's WhatsApp bans users from using the service in ways that, among other things, are illegal. In contrast, Telegram's terms of service do not put a blanket ban on illegal activity.

If authorities manage to curb such activity on Telegram, that could have a dampening effect on financial crime, including check fraud.

Telegram's role in fraud and cybercrime

Telegram plays a "significant role" in enabling fraud and scams, according to Greg Williamson, senior vice president of fraud reduction for BITS, the technology policy division of the Bank Policy Institute. Specifically, the platform serves as a "how-to" channel for illicit actors to share information and recruit criminal partners.

"Check fraud is one type of fraud where the industry has seen significant activity on Telegram, with large-scale criminal actors sharing tactics they've successfully used to counterfeit and steal checks," Williamson said.

American Banker has previously reported that fraudsters overwhelmingly turn to Telegram to recruit criminal partners in these schemes.


"A fraudster who has a counterfeit or stolen check ready to be deposited uses Telegram to advertise their previous successful check deposits to entice other criminals who have open accounts to partner and provide them with access to those deposit accounts," Williamson said.

Because Telegram serves as a forum for illicit actors to communicate and network with one another, Williamson said "shutting down a platform like Telegram wouldn't eliminate fraud, but it would meaningfully disrupt existing networks and force fraudsters to change their approach."

With respect to Telegram's responsiveness to judicial warrants for data, Williamson said companies like Telegram "have a responsibility to American consumers to implement and enforce measures to detect and prevent illicit activities on their platforms."

CSAM on Telegram

Telegram's permissiveness has had far-reaching negative effects, particularly with respect to CSAM. In a report published last year by the Stanford Internet Observatory, researchers found that, while many social media and messaging platforms struggle to identify and shut down CSAM, Telegram uniquely fails to mitigate or even disallow the material on its platform.

The report highlights a section of Telegram's terms of service, which states that by signing up for the platform, users agree not to "post illegal pornographic content on publicly viewable Telegram channels." According to the report, this means Telegram is "implicitly allowing CSAM on its platform, provided it is shared in private groups or direct messages."

In contrast, every other platform analyzed by the report at the time — TikTok, Snapchat, Discord, Twitter and Instagram — prohibits dissemination of CSAM on the platform and has publicly claimed to try to mitigate it.

Since the report was published in June 2023, Telegram has not changed its terms of service to address this or similar omissions identified by the Stanford report.

Lack of mitigation measures

Furthermore, Telegram says in an FAQ on its website that, "to this day, we have disclosed 0 bytes of user data to third parties, including governments."

This is because the platform has structured its messaging infrastructure such that "several court orders from different jurisdictions are required to force us to give up any data," according to Telegram. As such, "Telegram can be forced to give up data only if an issue is grave and universal enough to pass the scrutiny of several different legal systems around the world," the company FAQ reads.

Collectively, these policies and principles indicate that Telegram implements no mitigations for illegal communications in nonpublic channels on its platform.

Telegram did not respond to a request for comment for this story.

The company said in a broadcast message on one of its public channels that it "abides by EU laws, including the Digital Services Act," and that "its moderation is within industry standards and constantly improving." The statement also said, "It is absurd to claim that a platform or its owner are responsible for abuse of that platform."

A company spokesperson told Newsweek on Tuesday that the company "actively moderates harmful content on its platform." The spokesperson also said moderators "use a combination of proactive monitoring of public parts of the platform, AI tools and user reports in order to remove millions of pieces of harmful content each day before they can do harm."

While the statement addresses moderation on public parts of the platform, Telegram touts the lack of moderation on private parts of the platform in the FAQ on its website.

"All Telegram chats and group chats are private amongst their participants," reads the FAQ under a section about illegal activity on the platform. "We do not process any requests related to them."

These so-called "private" channels can grow to include as many as 200,000 users, according to Telegram.
