Controlling the dark side of Facebook

"I found somebody I'm about to kill. I'm going to kill this guy right here. He's an old dude, too."

I wish I had never heard the voice of Steve Stephens, cold and cocky as he trolled the streets of Cleveland on Easter Sunday.

Recording a shaky video with his smartphone, the 37-year-old stopped his white Ford Fusion and walked over to Robert Godwin Sr., a 74-year-old father of nine and grandfather of 14 who was out for a walk. Still recording, Stephens asked Godwin a question and then shot him in the head, zooming in on the bloody scene before calmly walking back to his car.

This gruesome footage--and the pain of my hometown--was posted on Facebook, alongside photos of cute kids searching for eggs and families smiling on the steps of churches after services.

Police initially thought that Stephens had broadcast the shooting on Facebook Live, the service that lets users share their experiences in real time. It turns out he didn't; he recorded it on his phone and uploaded it.

That's horrific enough.

But the day is almost certainly coming when someone will commit murder live on Facebook, a social network with 1.86 billion active users. When that happens, I'm not sure the Silicon Valley giant or its peers will be ready for it.

Time and time again, enterprising geeks in the Golden State have failed to account for--or straight-up downplayed--the dark parts of human nature. As they idealistically roll out apps and online services designed to make the world more utopian, they conveniently forget that some users will eagerly find ways to use their inventions to make the world more dystopian.

Ride-sharing services, such as Uber and Lyft, are seen solely through the eyes of the riders who hail them, not the drivers who get the short end of the stick with low wages and no benefits.

The magic of driverless vehicles, coming to a road near you, is talked about merely in terms of alleviating traffic jams and accidents, not of the millions of long-haul truck drivers, cabbies and delivery drivers who will soon be out of a job.

Airbnb, the online house-sharing service, didn't expect hosts to decline customers based on foreign- or black-sounding names. Faced with evidence that it was happening, the company had to issue new guidelines to curb the practice.

Technology is technology, but people are people. Greed and hate don't just go away because the interaction is digital.

Facebook, in particular, has had to learn this lesson again and again.

CEO Mark Zuckerberg has always had an overly optimistic view of his baby. More than a social network or media behemoth, he sees Facebook as a platform with the power to "make the world more open and connected" and "give people the power to build a global community that works for all of us."

A year ago he told BuzzFeed of the then-new Facebook Live feature: "We built this big technology platform so we can go and support whatever the most personal and emotional and raw and visceral ways people want to communicate are as time goes on."

It turns out that "raw" and "visceral" are a lot darker than Zuckerberg ever imagined.

Facebook users have streamed rapes, assaults and torture live. Still more disturbing, so many people have livestreamed their suicides that Facebook recently released a suite of suicide prevention tools.

In January an aspiring actor in Los Angeles started a Facebook Live session and shot himself in the head while sitting in a parked car. The man's family in Texas spotted the livestream and alerted police, but officers couldn't find him in time.

A few days earlier a 14-year-old girl in Miami hanged herself live on Facebook, first making a noose out of a scarf and attaching it to a door frame. Police, once again, arrived too late.

Perhaps most unsettling, a Turkish man, distraught over a breakup, told viewers in October: "No one believed when I said I will kill myself. So watch this." Then he tried to shoot himself, but the gun jammed. He pulled the trigger again and the screen went black.

It was inexcusable for Facebook to leave the video of Godwin's last moments on its site for three hours. In the world of social media, that's an eternity. It was long enough for thousands if not millions of people to watch it, record it and repost it to YouTube.

The video went viral in less than an hour, even as Stephens, on the run from police, went on Facebook Live himself. He killed himself Tuesday as officers closed in.

Facebook eventually deactivated his profile page. But my Facebook and Twitter feeds were still filled with links to the video, alongside pleas for people to stop watching and spreading it.

"This is a horrific crime and we do not allow this kind of content on Facebook," the company said in a statement. "We work hard to keep a safe environment on Facebook, and are in touch with law enforcement in emergencies when there are direct threats to physical safety."

No company can prepare for every horrific eventuality. But imagining it can happen in the first place is a good place to start.

Editorial on 04/23/2017
