Patreon and Twitter are right about freedom of speech

At a blockchain event I attended on Friday, someone made an offhand comment that Patreon's editorial decisions meant people no longer felt safe using the platform and that a decentralized alternative was needed. I've heard this a few times now, and I think it's very far from the case.

So what happened to inspire this kind of comment? It turns out that just before Christmas, the platform kicked off a self-proclaimed "anti-feminist" for racist speech, and a dozen or so denizens of the intellectual dark web, including infamous misogynist academic Jordan Peterson, followed him out. This follows other platforms kicking off Sandy Hook denier Alex Jones earlier in the year, and the deplatforming of instigator-for-profit Milo Yiannopoulos.

Particularly for followers of this kind of rhetoric, but also for many civil libertarians, this represented an unacceptable breach of freedom of speech. In the same way that "free speech" alternatives to Twitter and Facebook have sprung up over the last few years, there was suddenly a lot of talk about building a free speech Patreon.

Of course, definitions of free speech vary, and the one used here is in the free market libertarian sense: complete structurelessness where, in effect, the loudest communities are the ones that can be heard. In fact, given that all of these people were kicked off existing platforms for some kind of bigotry, one might and should question whether the subjects of their hate would be able to have an effective voice on these new platforms at all. But nonetheless, it stands to reason that when they lost one platform, they felt they should build one where it was structurally impossible to kick them off.

Among the alt right, there's an ongoing meme that Silicon Valley is against their perfectly fair speech and that everyone who works at these platforms is biased towards liberal values. Perhaps this is true - after all, people with more formal education tend to be more liberal. But the alt right misses a few things, whether deliberately or inadvertently. The first is that sites like Patreon and Twitter are private spaces, and legal free speech protections only apply to the government - and the kind of racist, regressive discussion that is so beloved of the alt right is rightly anathema to most of the advertisers that keep these platforms afloat. But their speech is also often outside the bounds of what is permissible by law: violence is a part of their rhetoric. And whereas a platform can comply with Section 230 of the Communications Decency Act to be absolved of responsibility for "obscene" content hosted on their servers, there is no such exemption from criminal responsibility. All platforms are required to remove content that breaks the law, including threats of violence.

There is a distinction here between open standards like protocols, and proprietary services like platforms. It is impossible to kick someone off the web, for example, or to censor their speech on a free and open internet. Where removal can happen is at the hosting layer: a web hosting company might be liable for hosting criminal content, or have policies against hosting it, as is its right. And of course, someone can be kicked off a third-party hosting platform for the same reasons. But the web itself does not have a built-in censorship method, and nor should it: providing one would give any authoritarian government, or authoritarian corporation for that matter, carte blanche to decide what we can all read, see, and hear.

Whether criminal content could be removed from a decentralized system like IPFS, I'm not sure. Because IPFS data isn't private, people who end up hosting criminal content might find themselves liable. I'm not a lawyer, but this is an interesting issue that pertains to decentralization in general. Whereas in earlier peer-to-peer networks like Gnutella or BitTorrent, every decentralized node associated with an illegal file was actively interested in that file, I don't know what the legal precedents are for dumb nodes on a decentralized network, where public content is stored on your property without your direct involvement.

Regardless, there is again a distinction between a network like IPFS, and platforms that might be built over the top. Whereas the protocol can be data-agnostic, hosted platforms can't be. So if I'm building a service that uses decentralized software as a back-end, I might find that I'm legally required to provide a mechanism to prevent people from posting calls for harm, for example by allowing people to report that content and kicking those people off the platform in response.
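To make that distinction concrete, here's a minimal sketch of what such a reporting mechanism might look like at the platform layer. All of the names here (ContentStore, Report, ModerationService, unpin) are hypothetical, not a real IPFS or Patreon API; the point is only that moderation lives in the hosted service, while the protocol underneath stays data-agnostic.

```typescript
// Hypothetical moderation layer sitting on top of a content-addressed back-end.
// The protocol itself knows nothing about any of this.

interface ContentStore {
  // Stop serving/pinning a piece of content identified by its hash.
  unpin(contentId: string): Promise<void>;
}

interface Report {
  contentId: string;   // hash/CID of the reported item
  authorId: string;    // account that posted it on this platform
  reporterId: string;
  reason: string;
}

class ModerationService {
  private reports = new Map<string, Report[]>();

  constructor(
    private store: ContentStore,
    private banAccount: (authorId: string) => Promise<void>,
    private threshold = 3, // reports before the platform acts
  ) {}

  async report(r: Report): Promise<void> {
    const existing = this.reports.get(r.contentId) ?? [];
    existing.push(r);
    this.reports.set(r.contentId, existing);

    if (existing.length >= this.threshold) {
      await this.takeDown(r.contentId, r.authorId);
    }
  }

  // The platform stops serving the content and removes the account.
  // Nothing here touches the underlying protocol or other nodes.
  private async takeDown(contentId: string, authorId: string): Promise<void> {
    await this.store.unpin(contentId);
    await this.banAccount(authorId);
  }
}
```

In practice a real platform would put a human review step between the report threshold and the takedown, but the architectural point stands: the enforcement hooks belong to the service built on top, not to the open protocol beneath it.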

The irony is that Twitter in particular is very widely criticized for not doing enough to remove bigots and fascists from its network. I agree with those voices, leaning heavily on Karl Popper's widely quoted philosophy:

Unlimited tolerance must lead to the disappearance of tolerance. If we extend unlimited tolerance even to those who are intolerant, if we are not prepared to defend a tolerant society against the onslaught of the intolerant, then the tolerant will be destroyed, and tolerance with them.

[...] We should therefore claim, in the name of tolerance, the right not to tolerate the intolerant. We should claim that any movement preaching intolerance places itself outside the law, and we should consider incitement to intolerance and persecution as criminal, in the same way as we should consider incitement to murder, or to kidnapping, or to the revival of the slave trade, as criminal.

The mechanisms and legal machinations of freedom of speech aside, any community that allows intolerance to flourish will in itself become intolerant. For community managers, and anyone who wants to create a thriving space for discussion, establishing a safe space for thought is paramount. In particular, establishing a space where people from vulnerable and underrepresented communities can be truly heard is important for any kind of free and democratic society.

For these reasons, and simply because they're a relatively small community that sits firmly on the wrong side of history, the alt right's attempts to create new, decentralized alternatives to existing platforms will fail. That's not to say that decentralized platforms and protocols will fail; the pendulum is swinging in that direction, and there are lots of other reasons to build spaces that are free from centralized control. But a movement fueled by hate ultimately can never succeed, and we have plenty of societal protections to ensure that the people who peddle bigotry get what they deserve. Patreon and Twitter are right to kick them to the curb.


Photo by Hasan Almasi on Unsplash
