Telegram Founder Pavel Durov Is Not a Free-Speech Martyr
Telegram’s legal battle in France has been a long time coming. Pavel Durov, Telegram’s founder, has long thumbed his nose at authorities and is famously uncooperative when governments ask the app to take down posts or share information about its users.

French authorities indicted Durov on six charges, including complicity in the dissemination of child sexual abuse material, drug trafficking, and organized crime. Paris is, essentially, attempting to punish Durov for the nasty things that take place on Telegram.

Bad stuff happens on all social media networks. People buy and sell drugs on Facebook. Terrorists recruit on Instagram. Thieves plan crimes on TikTok. The difference is that the people running those companies often, but not always, cooperate with authorities when they ask for help tracking down a terrorist, drug dealer, or child abuser. Telegram, famously, does not… most of the time.

Telegram has defied global authorities for years, and it brags about it in its on-site FAQ. “To this day, we have disclosed 0 bytes of user data to third parties, including governments,” it says.

Durov’s arrest kicked off a wave of calls for his release. #FreeDurov trended on X. Elon Musk decried the arrest as censorship. Chris Pavlovski, the CEO of Rumble, left Europe before issuing his own statement. “France has threatened Rumble, and now they have crossed a red line by arresting Telegram’s CEO, Pavel Durov, reportedly for not censoring speech,” Pavlovski said on X. “Rumble will not stand for this behavior and will use every legal means available to fight for freedom of expression, a universal human right.”

The central question of the case is one that’s long dominated the internet: are platforms responsible for what their users post?

In the early days of the internet, fueled by a utopian libertarian optimism, the answer was a loud and staunch “no.” In 2024 people’s feelings are more complicated. Governments, especially outside of the United States, constantly fight a flood of bad shit on all media platforms.

Most people want the freedom of expression the internet provides but can agree that sharing child sexual abuse material is bad. No one wants protesters’ dissent quashed online, but they also don’t want the Islamic State recruiting militants on Instagram.

People can still download and use Telegram in France. Germany threatened to ban the app in 2021 over COVID disinformation but it didn’t follow through. When faced with a real authoritarian regime, Telegram crumbles. You can’t download it in China. Cuba blocked the app completely in 2021 as a means to stifle dissent. Bahrain and Azerbaijan have blocked the service for years.

Telegram has also selectively responded to government requests to censor content in the past. A year after Germany threatened it, Telegram deleted 64 groups Germany said violated its hate speech laws. In 2020 it removed several Belarusian groups that doxxed riot police, at the behest of Apple. It’s also complied with requests from Indonesia and Iraq.

Telegram has very much been here before. The only thing new is Durov’s arrest, which is part of a wider trend of governments losing patience with tech CEOs who aren’t willing to work with them. The same thing is playing out in Brazil (which has its own long and complicated history with Telegram), where a judge has ordered Musk to name a legal representative for X in the country or face a shutdown.

Durov is not a free-speech martyr. His legal problems in France aren’t about political posts. They’re about his failure to work with French authorities on the basics of moderating his platform. If he actually cared about privacy or censorship, Telegram’s messages would be end-to-end encrypted by default. They aren’t.

“Telegram abides by EU laws, including the Digital Services Act—its moderation is within industry standards and constantly improving. It is absurd to claim that a platform or its owner are responsible for abuse of that platform,” Telegram said in a statement about the arrest posted to X. “Almost a billion users globally use Telegram as means of communication and as a source of vital information.”

It is technically true that Telegram abides by EU laws, including the Digital Services Act (DSA). That’s why it was French authorities, not Brussels, that arrested him. The DSA is a new piece of regulatory legislation aimed at reining in social media companies and making them more transparent. Its strictest obligations, however, apply only to platforms with more than 45 million monthly EU users.

Telegram is demonstrably a vector for child sex abuse imagery and crimes of all sorts. It showed up in the charging documents of a U.S. Army soldier who was arrested earlier this week. He allegedly used the service to store and disseminate child sexual abuse material. Advocacy groups, experts, and governments have warned Telegram for years that it’s a popular hangout for sex criminals. Telegram largely ignored them.

So, are tech CEOs responsible for the stuff people post on their platforms? According to governments across the world, the answer is yes.