OPINION

BRET STEPHENS: Facebook's unintended consequences

Over the past several years we've learned a lot about the consequences of social media. Platforms intended to bring us closer together make us angrier and more isolated. Platforms aimed at democratizing speech empower demagogues. Platforms celebrating community violate our privacy in ways we scarcely realize and serve as conduits for deceptions hiding in plain sight.

Now Facebook has announced that it has permanently banned Louis Farrakhan, Alex Jones, Milo Yiannopoulos and a few other despicable people from its social platforms. What could possibly go wrong?

The issue isn't whether the people in question deserve censure. They do. Or whether the forms of speech in which they traffic have redeeming qualities. They don't. Nor is the issue that Facebook has a moral duty to protect the free-speech rights of Farrakhan, Jones and their cohorts. It doesn't.

With respect to freedom of speech, the First Amendment says nothing more than that Congress shall make no law abridging it. A public company such as Facebook--like a private university or a family-owned newspaper--has broad latitude to feature or censor, platform or de-platform, whatever and whomever it wants.

Facebook's house, Facebook's rules.

The issue is much simpler: Do you trust Mark Zuckerberg and the other young lords of Silicon Valley to be good stewards of the world's digital speech?

I don't.

The deeper problem is the overwhelming concentration of technical, financial and moral power in the hands of people who lack the training, experience, wisdom, trustworthiness, humility and incentives to exercise that power responsibly.

That much should have been clear from the way Facebook's leaders attempted to handle their serial scandals over the past two years. Ordering opposition research on their more prominent critics. Consistently downplaying the extent of Russian meddling on their platform. Berating company employees who tried to do something about that meddling. Letting an unscrupulous broker harvest the personal information of millions of users so the data could be exploited for political purposes.

Now Facebook wants to refurbish its damaged reputation by promising its users much more privacy via encrypted services as well as more aggressively policing hate speech on the site. Come again? This is what Alex Stamos, Facebook's former chief security officer, called "the judo move: In a world where everything is encrypted and doesn't last long, entire classes of scandal are invisible to the media." It's a cynical exercise in abdication dressed as an act of responsibility. Knock a few high-profile bigots down. Throw a thick carpet over much of the rest. Then figure out how to extract a profit from your new model.

Assuming that's Facebook's deeper calculation--it's hard to think of another--it may wind up solving the company's short-term problems. But it might also produce two equally dismal results.

On the one hand, Facebook will be hosting the worst kinds of online behavior. In a public note in March, Zuckerberg admitted that encryption will help facilitate "truly terrible things like child exploitation, terrorism, and extortion." (For that, he promised to "work with law enforcement." Great.)

On the other hand, Facebook is completing its transition from being a simple platform, broadly indifferent to the content it hosts, to being a publisher that curates content and is responsible for it. Getting rid of Farrakhan, Jones and the others is an easy call for now, because they are such manifestly odious figures and have no real political power.

But what happens with the harder calls, the ones who insist on being heard publicly and can't be swept under the rug: alleged Islamophobes, militant anti-immigration types, the people who call for the elimination of Israel? Facebook has training documents governing hate speech and is now set to deploy the latest generation of artificial intelligence to detect it.

But the decision to ban certain individuals outright will always be a human one. It will inevitably be subjective. And, as these things generally go, it will wind up producing bans on people whose views are hateful mainly in the eyes of those doing the banning. Recall how the Southern Poverty Law Center, until recently an arbiter of moral hygiene in matters of hate speech, wound up smearing Ayaan Hirsi Ali and Maajid Nawaz, both champions of political moderation, as "anti-Muslim extremists."

Facebook probably can't imagine that its elaborate systems and processes would lead to perverse results. And not everything needs to be a slippery slope.

Then again, a company that once wanted to make the world more open and connected now wants to make it more private. In time it might also become a place where only nice thoughts are allowed. The law of unintended consequences says we can't rule it out.


Bret Stephens is a New York Times columnist.

Editorial on 05/13/2019
