OPINION

EDITORIAL: Letter from an editor

Bulletin boards don’t edit correspondence

Thank you, friend, for your question. It's not often that people ask for our opinion. We just give it freely and often and even without being asked. We consider that more efficient. To answer your question:

When Facebook, Google, Twitter and other social media sites were started, they were designed to be like community bulletin boards. Anyone could post and say anything they wanted, just like a community bulletin board at your church, school or hangout.

But the people running those sites were concerned that if something posted there was libelous or slanderous, they could be held responsible and have to pay damages, just like newspapers. So the feds passed a law to shield them: Section 230 of the Communications Decency Act of 1996, which said that because those sites merely host what other people post, rather than publishing it themselves the way a newspaper does, they could not be held responsible for libel or slander in their users' posts. (The law also protects them when they take content down in good faith.)

That was the theory. Remember this was the early days of the Internet, when the argument was we needed to let it be free to develop, not tax it, etc.

As it turns out, there is a major difference between a community bulletin board on which people post things by hand and an electronic bulletin board, which, thanks to the Internet and the reach of the World Wide Web, lets millions of people both post comments and read them, anywhere and everywhere.

At first Mark Zuckerberg wanted people simply to have a way to connect, first in colleges, then elsewhere. He had no interest in advertising. But because so many people started using Facebook and "liking" posts, the site became a new way to advertise, by targeting people based on what they liked. Facebook became an advertising behemoth. It is not in the company's economic interest to discourage or restrict users from using it. Facebook now has more than 1.7 billion users worldwide.

So now to your question, friend: Should social media sites edit content like newspapers and be responsible for content?

Even if they should, or they wanted to, how could Facebook's people edit the posts of 1.7 billion users every day? It seems impossible. The computer guys are trying to do it with algorithms, since hiring enough humans would be cost-prohibitive. But algorithms can't do the job, at least not now.

And if they do find a way to start hands-on editing, to what purpose? To provide accurate information? Can they hire enough humans to do that much fact-checking? And as we have seen, fact-checking can be harder than anticipated. Who is going to fact-check the fact-checkers? This is one of the age-old questions: if you restrict free speech, how do you do it?

What other purpose for editing? Will you try to make all posts objective? Or will you allow opinions? If you do, will you allow some opinions but not others? (Remember the staff revolt at The New York Times because they didn't like Tom Cotton's opinion piece.) Or will it all parade under the banner of objectivity while management makes its own value judgments, and perhaps political judgments, on what to include, what to exclude, what to flag as dubious and what not to flag? We see a lot of this in the media today, and we certainly have seen it on Twitter. The more you think about this, the more you understand the basis for the argument for more free speech, not less.

This may be why Mark Zuckerberg has steadfastly defended following a free speech policy in determining what to allow on Facebook. It may not be the perfect answer, but it may be better than all the other alternatives. It is especially dicey now when you add politics in an election year. Joe Biden supporters may want Facebook to take down Donald Trump ads, and Donald Trump supporters may want Facebook to take down Joe Biden ads. They both may claim that the other's ads have false or misleading claims. Our democracy is almost 250 years old, and false claims by politicians have been with us since the beginning. Somehow we have survived and thrived, and the voters have been able to sort it out and make their own decisions.

Social media facilitates mobs, or at least mob mentality. Before, to create a mob you would have to do a lot of shoe-leather organizing. Today it is much easier on social media. And we have seen these mobs organize on social media on both the right and the left. They have both caused a lot of damage. People have had their reputations tarnished if not destroyed, good people have lost their jobs, and some have been put in harm's way physically.

Is there a better way for social media? One idea would be to let it function like a true bulletin board, where any user can take something down if he considers it dangerous. How that would work is anybody's guess.

Another idea: Let the social media platforms have immunity from libel or slander, but hold those who make the postings responsible for what they put on social media. The catch is that these people already are responsible; the trouble is that there is often no economic incentive to sue them, since they may have little money. Poor as well as rich people can deal in malicious libel and slander.

One change that would probably help is to do away with anonymity. If people have to list their names and physical addresses, that might make them think twice before posting something hateful or malicious. We do not allow anonymity in letters to the editor. If you want to say something in our newspaper, you need to have enough courage to say who you are.

You asked for our opinion, and that is exactly what this is, which means it may or may not be accurate, or fair, or what most people think. And in a country with a First Amendment, we can give our opinion, and so can you. As long as we are all willing to take the consequences of giving it.
