French police have arrested the founder of Telegram. What happens next could change the course of big tech

When Pavel Durov arrived in France on his private jet last Saturday, he was greeted by police who promptly arrested him. As the founder of the direct messaging platform Telegram, he was accused of facilitating the widespread crimes committed on it.

The following day, a French judge extended Durov’s initial period of detention, allowing police to detain him for up to 96 hours.

Telegram has rejected the allegations against Durov. In a statement, the company said:

It is absurd to claim that a platform or its owner are responsible for abuse of that platform.

The case may have far-reaching international implications, not just for Telegram but for other global technology giants as well.

Who is Pavel Durov?

Born in Russia in 1984, Pavel Durov also has French citizenship. This might explain why he felt free to travel despite his app’s role in the Russia-Ukraine War and its widespread use by extremist groups and criminals more generally.

Durov started an earlier social media site, VKontakte, in 2006, which remains very popular in Russia. However, a dispute over how the site's new owners were operating it led to his departure from the company in 2014.

It was shortly before this that Durov created Telegram. The platform provides both the means for communication and exchange and the protection of encryption, making crimes harder to track and tackle than ever before. But that same protection also enables people to resist authoritarian governments that seek to prevent dissent or protest.

Durov also has connections with famed tech figures Elon Musk and Mark Zuckerberg, and enjoys broad support in the vocally libertarian tech community. But his platform is no stranger to legal challenges – even in his birth country.

An odd target

Pavel Durov is in some ways an odd target for French authorities.

Meta’s WhatsApp messenger app is also encrypted and boasts three times as many users, while the hate speech and other problematic content X provokes are unrepentantly public and increasingly widespread.

There is also no suggestion that Durov himself was engaged in making any illegal content. Instead, he is accused of indirectly facilitating illegal content by maintaining the app in the first place.

However, Durov’s unique background might go some way towards explaining why he was taken in.

Unlike other major tech players, he lacks US citizenship. He hails from a country with a chequered past of internet activity – and a diminished diplomatic standing globally thanks to its war against Ukraine.

His app is large enough to be a global presence. But simultaneously it is not large enough to have the limitless legal resources of major players such as Meta.

Combined, these factors make him a more accessible target to test the enforcement of expanding regulatory frameworks.

A question of moderation

Durov’s arrest marks another act in the often confusing and contradictory negotiation of how much responsibility platforms shoulder for the content on their sites.

These platforms, which include direct messaging platforms such as Telegram and WhatsApp but also broader services such as those offered by Meta’s Facebook and Musk’s X, operate across the globe.

As such, they contend with a wide variety of legal environments.

This means any restriction put on a platform ultimately affects its services everywhere in the world – complicating and frequently preventing regulation.

On one side, there is a push to either hold the platforms responsible for illegal content or to provide details on the users that post it.

In Russia, Telegram itself came under pressure to hand over the names of protesters who were using the app to organise against the war on Ukraine.

Conversely, freedom of speech advocates have fought against users being banned from platforms. Meanwhile, political commentators cry foul at being “censored” for their political views.

These contradictions make regulation difficult to craft, while the platforms’ global nature makes enforcement a daunting challenge. This challenge tends to play in platforms’ favour, as they can exercise a relatively strong sense of platform sovereignty in how they decide to operate and develop.

But these complications can obscure the ways platforms can operate directly as deliberate influencers of public opinion and even publishers of their own content.

To take one example, both Google and Facebook took advantage of their central place in the information economy to advertise politically orientated content to resist the development and implementation of Australia’s News Media Bargaining Code.

The platforms’ construction also directly influences what content can appear and what content is recommended – and hate speech can mark an opportunity for clicks and screen time.

Now, pressure is increasing to hold platforms responsible for how they moderate their users and content. In Europe, recent regulation such as the Media Freedom Act aims to prevent platforms from arbitrarily deleting or banning news producers and their content, while the Digital Services Act requires that these platforms provide mechanisms for removing illegal material.

Australia has its own Online Safety Act to prevent harms through platforms, though the recent case involving X reveals that its capacity may be quite limited.

The European Union is making content moderation the responsibility of tech platforms. Olivier Hoslet/EPA

Future implications

Durov is currently only being detained, and it remains to be seen what, if anything, will happen to him in coming days.

But if he is charged and successfully prosecuted, it could lay the groundwork for France to take wider actions against not only tech platforms, but also their owners. It could also embolden nations around the world – in the West and beyond – to undertake their own investigations.

In turn, it may also make tech platforms think far more seriously about the criminal content they host.