OPINION | EDITORIAL: Going too far

A modest proposal for social media

Social media companies may have finally crossed the Rubicon.

In the past few weeks these companies took the ultimate step of banning the president of the United States from their platforms. In doing so, they signaled that they are more powerful than any president, now or in the future.

But aside from power, and whether their decision is right or wrong, they are now wading deeper into political waters. That is where sharks swim, and they may rue the day they did it.

Social media companies were originally given immunity not just from libel but from any liability for what others post on their websites, on the theory that they were merely community bulletin boards where anybody could post anything and everything in the spirit of free speech.

Some people post controversial and revolutionary things on bulletin boards. It was done 500 years ago when Martin Luther challenged the Catholic Church by posting his 95 Theses on the Castle Church door in Wittenberg, Germany.

In those days, church doors were used as community bulletin boards. Martin Luther's 95 Theses launched a religious revolution that became the Protestant Reformation. It's hard to think of anything more revolutionary than that.

Back in the early days of America, even before we were a country, the content of newspapers could be ugly, vile, and defamatory. Some of it was just as awful as what you can see today on Facebook, Twitter and YouTube.

But over time, Americans came up with a solution: libel laws. The courts made newspapers and their owners responsible for everything they printed. If a paper published defamatory and malicious information, or showed a reckless disregard for the truth, the newspaper was responsible, could be sued, and had to pay those it damaged.

Today we have that same defamatory, often vile, content on social media. But the major difference is that these new companies are not responsible for it. In fact, they hold a virtual "get out of jail free" card: the now-famous Section 230, a provision of the federal Communications Decency Act of 1996, which means they can't be sued for libel over what others say on their platforms.

When the Internet first got started, giving these companies immunity from libel was a way to protect and promote free speech. These social media sites were nothing more than community bulletin boards on which anyone could post anything or anyone could remove anything, just like they could 500 years ago. You can't hold a voluntary community board responsible for what is posted there.

Another newspaper analogy might be letters to the editor. But with a letter to the editor, the newspaper is responsible for what you write: the publisher can be sued for libel over what you say in your letter, so the paper has to review and, if necessary, edit it.

But it's different with postings on Twitter, Facebook or YouTube. You can be sued for your comment, but the social media companies cannot.

Libel suits can be expensive, so those who write letters to the editor and post on social media rarely get sued. It is the companies with money that are usually the defendants. And while social media companies have lots of money, they largely have immunity because of Section 230.

Suppose Section 230 were abolished, making these social media companies responsible for what others post. It is unlikely compliance would even be possible. Facebook has more than two billion users worldwide, and there is simply no way to monitor billions of postings.

What is the solution? The way the social media companies are making decisions, banning American citizens and politicians while still allowing dictators around the world to keep their accounts, suggests that self-governance is not a workable one.

Another solution would be government regulation. That's probably the worst option, and could easily result in authoritarian control by whatever party is in power. That sounds more like something you'd find in China or North Korea. It would be a complete abdication of freedom of speech and freedom of the press.

So who should regulate them? What about the users themselves? If these companies are going to truly be community bulletin boards, even for a world community, why not allow users to both post and remove comments? That was the way the bulletin board on the church door in Wittenberg operated 500 years ago. And the way most bulletin boards have operated over the centuries since then. Anyone can post anything, and anyone can take down any posting.

How would this work in the real world? If QAnon or Antifa each made some outrageous post, they could be taken down immediately. If someone across town posts some malicious lie and damages your reputation, you can take it down rather than going through the expense of a libel lawsuit. It seems highly unlikely your kids' birthday party would be removed.

In such a scenario, these companies should keep all the protections of Section 230, since they would not be responsible for the content. They would lose much of their control, and maybe not make as much money.

But it would solve a lot of their headaches, since they could no longer be accused of banning anyone. It would get them out of politics. And like anyone else, they could remove content, and should if it is akin to yelling fire in a crowded theater, or otherwise dangerous.

That is probably not too much to ask of companies with the largest platforms for communication in world history.