
Over the past several years we've learned a lot about the consequences of social media. Platforms intended to bring us closer together make us angrier and more isolated. Platforms aimed at democratizing speech empower demagogues. Platforms celebrating community violate our privacy in ways we scarcely realize and serve as conduits for deceptions hiding in plain sight.

Now Facebook has announced that it has permanently banned Louis Farrakhan, Alex Jones, Milo Yiannopoulos and a few other despicable people from its social platforms. What could possibly go wrong?

The issue isn't whether the people in question deserve censure. They do. Or that the forms of speech in which they traffic have redeeming qualities. They don't. Nor is the issue that Facebook has a moral duty to protect the free-speech rights of Farrakhan, Jones and their cohorts. It doesn't.

With respect to freedom of speech, the First Amendment says nothing more than that Congress shall make no law abridging it. A public company such as Facebook, like a private university or a family-owned newspaper, has broad latitude to feature or censor, platform or de-platform, whatever and whomever it wants.

Facebook's house, Facebook's rules.

The issue is much simpler: Do you trust Mark Zuckerberg and the other young lords of Silicon Valley to be good stewards of the world's digital speech?

I don't.

The deeper problem is the overwhelming concentration of technical, financial and moral power in the hands of people who lack the training, experience, wisdom, trustworthiness, humility and incentives to exercise that power responsibly.

That much should have been clear by the way in which Facebook's leaders attempted to handle their serial scandals over the past two years. Ordering opposition research on their more prominent critics. Consistently downplaying the extent of Russian meddling on their platform. Berating company employees who tried to do something about that meddling. Selling the personal information of millions of its users to an unscrupulous broker so the data could be used for political purposes.

Now Facebook wants to refurbish its damaged reputation by promising its users much more privacy via encrypted services as well as more aggressively policing hate speech on the site. Come again? This is what Alex Stamos, Facebook's former chief security officer, called "the judo move: In a world where everything is encrypted and doesn't last long, entire classes of scandal are invisible to the media." It's a cynical exercise in abdication dressed as an act of responsibility. Knock a few high-profile bigots down. Throw a thick carpet over much of the rest. Then figure out how to extract a profit from your new model.

Assuming that's Facebook's deeper calculation (it's hard to think of another), it may wind up solving the company's short-term problems. But it might also produce two equally dismal results.

On the one hand, Facebook will be hosting the worst kinds of online behavior. In a public note in March, Zuckerberg admitted that encryption will help facilitate "truly terrible things like child exploitation, terrorism, and extortion." (For that, he promised to "work with law enforcement." Great.)

On the other hand, Facebook is completing its transition from being a simple platform, broadly indifferent to the content it hosts, to being a publisher that curates and is responsible for content. Getting rid of Farrakhan, Jones and the others is the easy call for now, because they are such manifestly odious figures and have no real political power.

But what happens with the harder calls, the ones who want to be seen publicly and can't be swept under: alleged Islamophobes, militant anti-immigration types, the people who call for the elimination of Israel? Facebook has training documents governing hate speech and is now set to deploy the latest generation of artificial intelligence to detect it.

But the decision to absolutely ban certain individuals will always be a human one. It will inevitably be subjective. And as these things generally go, it will wind up leading to bans on people whose views are hateful mainly in the eyes of those doing the banning. Recall how the Southern Poverty Law Center, until recently an arbiter of moral hygiene in matters of hate speech, wound up smearing Ayaan Hirsi Ali and Maajid Nawaz, both champions of political moderation, as "anti-Muslim extremists."

Facebook probably can't imagine that its elaborate systems and processes would lead to perverse results. And not everything needs to be a slippery slope.

Then again, a company that once wanted to make the world more open and connected now wants to make it more private. In time it might also become a place where only nice thoughts are allowed. The law of unintended consequences suggests we can't rule it out.


Bret Stephens is a New York Times columnist.

Editorial on 05/13/2019

Print Headline: Unintended consequences


Comments

  • RBear
    May 13, 2019 at 8:02 a.m.

    Facebook was an idea that filled a gap created by the reduction in human interaction by fostering an online community of people who originally just wanted to know what their friends and family were doing. The platform is ingenious in that sense, but with that ingenuity come the problems of society as the reach expanded to friends of friends and eventually to the public. Just as this paper must moderate the comments of those who seek to spew extreme hate and other unacceptable thoughts, so too must Facebook if it seeks to remain open.
    ...
    The problem is very few know what to do about Facebook other than shut it down, which has been the cry of several on both sides of the political aisle. Remember, some of these folks are people who barely know how to turn their computers on, much less read an e-mail. Facebook shouldn't be "broken up" as some like Warren are screaming for. Sure, that makes her a darling with the Rage Against the Machine movement, but it's very impractical. In the long run, Warren will need that evil empire if she ever hopes to make a dent in the political scheme of things.
    ...
    On the right, there are those who feel they are being censored by the evil empire, when in reality the speech that's being censored is so far to the right that I'm not sure any of those claiming censorship even know what the material is. Most are just echoing the thoughts of a base as uninformed as those on the left who call for breaking up the "monopoly."
    ...
    In the end, it comes down to personal responsibility. Here's an easy way one can do this. Facebook offers a feature to allow you to train your news feed and cut out the crap. It takes some time and you have to come back to do it several times to keep it trained, but you can actually control what's in your feed. The real problem is that when your friends share crap, it ends up on your feed and that's why it has to be constantly trained.
    ...
    So, the simplest solution is to just stop sharing crap. Resist the urge to post the latest false flag from some extreme right or left wing site. Stick to the simple stuff. In the end, it will lessen the vile nature of what flows across the feed.

  • Illinoisroy
    May 13, 2019 at 11:57 a.m.

    Facebook needs to return to its roots; it isn't a credible news source.

  • LR1955
    May 13, 2019 at 2:23 p.m.

    I thought Facebook started as a students only place to meet your new college roommate, then profs wanted in to hang with their students, then everybody & their brother got approved to join.

    “Just as this paper must moderate the comments of those...” but the ArDemGaz doesn’t really moderate and they provide no tools for their readers to block the spewing so common in these comments. But it probably wouldn’t do any good to block one commenter and see them mentioned by other commenters. Freedom of Speech? More like Freedom of BlahBlahBlah etc

  • RBear
    May 13, 2019 at 3:51 p.m.

    LR1955 ever used the Flag function? I can't delete a comment, but I can suggest it be flagged by a moderator who then takes action.
    ...
    Regarding how it started, yes that's the genesis of FB. But the social media platform filled a gap with students that translated into society. Not sure what you're trying to get at other than being snarky.
