It’s too soon to untangle the arrest of Pavel Durov, CEO of the encrypted-messaging service Telegram, which he co-founded. The French government has indicted him on charges of complicity in the distribution of child sex abuse images, aiding organized crime and refusing lawful orders to give information to law enforcement. There are still many questions about the extent of Durov’s role beyond operating what The Atlantic’s Charlie Warzel called the “platform of choice for many activists, crypto scammers, drug dealers, terrorists, extremists, banned influencers, and conspiracy theorists.”
But it’s not too soon to talk about the implications for free speech, because we’ve already been wrestling with the problems posed by services like Telegram for many years — and will do so for many to come.
When I started writing on the internet, more than 20 years ago, my fellow bloggers and I assumed it was a free and open place where anything could happen. “The internet interprets censorship as damage and routes around it,” we used to tell each other, a little giddy. Well into the social media era, Twitter executives proudly proclaimed that their company belonged to “the free speech wing of the free speech party.”
But as the World Wide Web entered its third decade, the internet’s scale and reach empowered some very bad actors, from trolls to white nationalists to child pornographers and drug cartels. A clamor arose to crack down on all this dangerous chatter — which brings us to Pavel Durov.
The Associated Press reports that, according to French authorities, his company has “refused to share information or documents with investigators when required by law.” The possibility that loose moderation and encrypted messaging are empowering heinous crimes is a real challenge for the free-speech wing of the free-speech party: Platforms where speech is unfettered are also platforms that make it easier to say, and do, antisocial things. This has always been a problem with free speech, of course, but the internet has given the bad guys opportunities we could never before have imagined.
And so there has been a concerted push for institutions to censor, to hand over user data, to fiddle with algorithms to tilt conversations in a more prosocial direction. Defending the freedom to say dark things — in private or public — inevitably raises the question “Why would you want to help such people?” Services such as Telegram, where conversations can tip from bad speech to bad deeds, make this particularly awkward to answer.
But there is an answer, which is that this is the wrong question. We should not be asking whether anyone wants to help criminals (no!) but whether it’s worth sacrificing our own liberties to make it easier for the government to stop them. The Bill of Rights answered this with a resounding no, and that’s still the correct answer after more than 200 years.
If you allow people to say anything, you’ll see a lot of hateful filth, but you will also see robust discussions that make our democracy stronger. If you allow bloggers to speculate about anything that crosses their minds, you will find they generate a lot of nonsense — and also provide a useful check on institutions that aren’t doing their jobs properly. If you maintain spaces where people can talk away from the prying eyes of the authorities, you will make it harder for democratic governments to catch criminals and also make it harder for despotic governments to crack down on political activists.
It’s tempting to say that we’ll let only the good governments have those powers, for good purposes. That we aren’t really sacrificing an important freedom, only the kinds of freedom that no one should have. That we’re simply sanding off the wilder edges of the internet, while leaving plenty of spaces for all the right kinds of speech to flourish.
But while it might not be one short step from a Telegram crackdown to a full-blown Chinese-style surveillance state, there is an inevitable trade-off: When such powers are used, they can be abused, as even democratic governments have done when they’ve decided that some emergency — communism, terrorism, the pandemic — required us to give up some of our liberties in the name of hunting the bad people.
Inevitably, we regret those concessions. Coincidentally, shortly after Durov was arrested, Meta CEO Mark Zuckerberg published a letter responding to a U.S. House committee inquiry, admitting with regret that the Biden White House had pressured Meta to censor disinformation during the pandemic and that the company had in some cases complied, though he said he took full responsibility for those decisions. Those officials undoubtedly thought they were helping people, but Facebook, which Meta owns, also ended up throttling reasonable speculation about the origins of the virus, along with an absolutely true story about Hunter Biden’s laptop, right ahead of an election.
Small cost, I’m sure many of my readers will say, especially if they voted for Joe Biden. But then consider how Donald Trump might use such powers — and then consider what even worse governments might do with expansive powers over Telegram’s user base. Which is why we keep deciding anew to tie officials’ hands: not because we’re afraid of what they’ll do to the criminals but because we’re afraid of what might eventually be done to us.