The difference between Mark Zuckerberg and Pavel Durov’s free speech problems

Photo illustration: The arrest of Telegram CEO Pavel Durov and a letter from Mark Zuckerberg raise the issue of censorship in social media. Jaque Silva—SOPA Images/LightRocket/Getty Images

Freedom of speech issues are unavoidable when you’re running a social network—and they’re often less than clear-cut.

Just ask Mark Zuckerberg, who says he regrets giving in to alleged pressure from the Biden administration to censor Facebook and Instagram content during the COVID pandemic. Writing to U.S. House Judiciary Chair Jim Jordan (R-Ohio), the Meta CEO said the White House had in 2021 “repeatedly pressured our teams for months to censor certain COVID-19 content, including humor and satire” and “we made some choices that, with the benefit of hindsight and new information, we wouldn’t make today.”

“I believe the government pressure was wrong, and I regret that we were not more outspoken about it,” Zuckerberg wrote.

Is it right to err on the side of heavy-handedness when moderating content during a raging pandemic rife with life-threatening disinformation? There’s no easy answer to that ethical dilemma, but different legal systems would lean different ways. Is a U.S. administration pushing its luck by urging such heavy-handedness, given the constraints of the First Amendment? Maybe; it depends on whether the government tried to legally compel Meta or merely to influence it, which appears to have been the case here.

Either way, Jordan and other House Republicans are calling the letter a “big win for free speech.”

Which brings us back to the subject of Telegram and its CEO Pavel Durov, who has been detained for questioning in France, in relation to the vast amount of criminal activity that takes place on the platform. Elon Musk and many others are characterizing Durov as a free speech martyr, but, as I wrote yesterday, it’s hard to judge those claims without looking at the specific allegations against the Russian-French-Emirati-Saint-Kitts-and-Nevis billionaire.

We have those now, as Paris prosecutors published the potential charges late yesterday (by the end of Wednesday, Durov must either be indicted or released).

There are a dozen potential charges, several of which relate to Durov’s alleged “complicity” in crimes that take place on Telegram: possessing and distributing child sexual abuse material, drug dealing, organized fraud, and what appears to be the distribution of hacking tools.

Much like Section 230 of the U.S. Communications Decency Act, EU law shields service providers like Telegram from liability for illegal content on their platforms, as long as they don’t know about it and act quickly to remove it once they are made aware.

Unfortunately for Durov, he is also suspected of complicity in “web-mastering an online platform in order to enable an illegal transaction in [an] organized group.” If true, that would mean he knew about and indeed encouraged the criminal activity on Telegram, making it a feature rather than a bug. He is also suspected of “criminal association with a view to committing a crime” and of laundering organized-crime proceeds.

These very serious potential charges take the case in quite a different direction from what most people would recognize as a free-speech debate. However, the same can’t be said for three items on the prosecutors’ list that say Durov is suspected of breaking French laws around providing and importing cryptographic tools.

A quick reminder: Telegram may pitch itself as a secure messaging service, but its end-to-end encryption system is homegrown; it isn’t on by default; it can’t be activated for Telegram’s popular group chats; and Telegram has never published its server-side code. In the words of cryptography guru Matthew Green, Telegram’s encryption is only “quite possibly” secure.

But the French authorities don’t like that encryption, because Telegram apparently never notified them about it. French law allows the free use of encryption but demands that importers, exporters, and suppliers of cryptographic systems first notify the French Cybersecurity Agency, at which point they are bound to secrecy about their dealings with the agency. Some argue this requirement exists so the authorities can demand a backdoor that lets them read protected messages, though it’s worth noting that the government has repeatedly failed to win the legal right to do so.

The prosecutors’ list also says Durov is suspected of “refusal to communicate, at the request of competent authorities, information or documents necessary for carrying out and operating interceptions allowed by law.” This could be a reference to undermining encryption, though the French legal expert Florence G’sell reads it as being about “refusing to disclose user information that was requested in the course of criminal investigations.”

Encryption is a privacy issue and, by extension, a free-speech issue—if people can’t communicate in private, they don’t feel free to fully express themselves. So this case does touch on free speech. However, as G’sell noted in an X thread, the potential charges around encryption are “relatively minor” and, overall, the case is really about “providing technical means for criminal activities, rather than social media regulations.”

One point in closing: It’s also worth remembering that Telegram was once the target of a ban in Durov’s native Russia, until it wasn’t, and the reasons for the rapprochement between Durov and the Kremlin remain opaque. Many opposition activists in Russia don’t trust Telegram, saying their conversations are monitored by state security.  

Is this a free-speech case? Partly—but certainly far less clear-cut than the pressure Zuckerberg has had to deal with. Is it wise to make Durov the poster boy for the defense of free speech? I wouldn’t bet on it.

More news below.

David Meyer

Want to send thoughts or suggestions to Data Sheet? Drop a line here.


A special digital issue of Fortune

The best stories of July and August from Fortune, including a radical overhaul at a private equity titan, a crisis for the First Family of poultry, and more.

— KKR’s co-CEOs want to reach $1 trillion in assets by 2030. To do so, they’re willing to make big bets and leave the PE firm’s old ways behind. Read more.

— John Randal Tyson was set up to run his family’s $21 billion chicken empire. His erratic behavior could change that. Read more.

— Jeff Bezos’s famed management rules are slowly unraveling inside Amazon. Read more.

— A 25-year-old crypto whiz kid went from intern to president of Jump Trading’s crypto arm. Then he became the fall guy. Read more.

— An inside look at a secretive investment firm that counts some of the wealthiest Americans as clients and some of Silicon Valley’s most powerful figures as advisors. Read more.

— Can you quit Ozempic and stay thin? These startups say you can—but doctors say that’s an unproven claim. Read more.

NEWSWORTHY

HP’s Chips Act award. The latest recipient of cash from the U.S.’s Chips Act stash is HP, which will get $50 million in direct funding to expand and modernize an Oregon plant that develops and makes silicon components for life sciences lab equipment. Reuters reports that the supported technologies also have AI applications.

Chinese export curbs. The Chips Acts in the U.S. and EU are designed to secure supply chains by onshoring semiconductor manufacturing and getting away from geopolitical uncertainty in and around China. But China’s export controls on essential chipmaking materials are starting to hurt. According to the Financial Times, prices of germanium and gallium have nearly doubled in Europe over the last year. China produces 98% of the world’s gallium and 60% of its germanium, and this is its way of hitting back against western export controls on advanced chips and chipmaking equipment.

Google’s data center rejection. Dublin’s local authorities have rejected Google’s application to build a major new data center in the Irish capital. Per Bloomberg, the decision noted that Google hadn’t sufficiently explained the likely impact on the power grid. The sustainability aspect of the data center boom is becoming increasingly prominent as countries fret about the impact on electricity networks, water supplies, and more.

SIGNIFICANT FIGURES

-29%

—The drop yesterday in the share price of Temu parent PDD Holdings, after the company’s revenues fell short of analyst expectations. The e-commerce operation also warned of “intensified competition and external challenges” that could dampen future growth.

IN CASE YOU MISSED IT

Luca Maestri to step down as Apple CFO after a decade of huge growth at the iPhone giant, by Bloomberg

HP stuck ‘on the horns of a dilemma’ as it mulls pursuing Mike Lynch’s family for $4 billion Autonomy claim, by Ryan Hogg

Elon Musk’s X changes Grok chatbot after states say it spread election misinformation, by the Associated Press

Donald Trump says Elon Musk can consult for the federal government if he wins reelection, by Paolo Confino

Elon Musk backs California’s AI safety bill, citing risk to the public, by Christiaan Hetzner

Exodus at OpenAI: Nearly half of AGI safety staffers have left, says former researcher, by Sharon Goldman

BEFORE YOU GO

An AI’s instructions. Anthropic has taken the unusual step of publishing its system prompts—the instructions it gives its Claude Opus AI model to shape its “personality” and limit what it can do. As TechCrunch reports, some of the prompts are rules, like forbidding facial recognition, but the prompts also include things like telling Claude to seem “very smart and intellectually curious” and to appear like it “enjoys hearing what humans think on an issue and engaging in discussion on a wide variety of topics.” Let’s see if Anthropic’s competitors, like OpenAI, adopt a similarly open stance.

This is the web version of Fortune Tech, a daily newsletter breaking down the biggest players and stories shaping the future. Sign up to get it delivered free to your inbox.