WASHINGTON -- In March, as claims about the dangers and ineffectiveness of coronavirus vaccines spun across social media and undermined attempts to stop the spread of the virus, some Facebook employees thought they had found a way to help.
By subtly altering how posts about vaccines are ranked in people's newsfeeds, researchers at the company realized they could curtail the misleading information individuals saw about covid-19 vaccines and offer users posts from legitimate sources such as the World Health Organization.
"Given these results, I'm assuming we're hoping to launch ASAP," one Facebook employee wrote in March, responding to the internal memo about the study.
Instead, Facebook shelved some suggestions from the study. Other changes weren't made until April.
When another Facebook researcher suggested in March disabling comments on vaccine posts until the platform could do a better job of tackling the anti-vaccine messages lurking in them, that proposal was ignored at the time.
Critics say Facebook was slow to act because it worried such changes might hurt the company's profits.
"Why would you not remove comments? Because engagement is the only thing that matters," said Imran Ahmed, CEO of the Center for Countering Digital Hate, an internet watchdog group. "It drives attention, and attention equal eyeballs, and eyeballs equal ad revenue."
In an emailed statement, Facebook said it has made "considerable progress" this year with downgrading vaccine misinformation in users' feeds.
Facebook's internal discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by former Facebook employee-turned-whistleblower Frances Haugen's legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.
The trove of documents shows that in the midst of the pandemic, Facebook carefully investigated how its platforms spread misinformation about lifesaving vaccines. They also reveal that rank-and-file employees regularly suggested solutions for countering anti-vaccine misinformation on the site, to no avail.
Typically, Facebook ranks posts by engagement -- the total number of likes, dislikes, comments and reshares. That ranking scheme may work well for innocuous subjects like recipes, dog photos or the latest viral singalong, but Facebook's own documents show that when it comes to divisive, contentious issues like vaccines, engagement-based ranking emphasizes only polarization, disagreement and doubt.
To study ways to reduce vaccine misinformation, Facebook researchers changed how posts are ranked for more than 6,000 users in the U.S., Mexico, Brazil and the Philippines. Instead of seeing posts about vaccines that were chosen based on their engagement, these users saw posts selected for their trustworthiness.
The results were striking: a nearly 12% decrease in content that made claims debunked by fact-checkers and an 8% increase in content from authoritative public health organizations such as the WHO or the U.S. Centers for Disease Control and Prevention.
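The difference between the two approaches is easy to see in code. The sketch below is a hypothetical illustration rather than Facebook's actual ranking system: the scoring functions, field names and the short list of authoritative sources are all assumptions made for the example.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical illustration only; not Facebook's actual ranking code.
# The scoring functions, field names and source list are assumptions.

AUTHORITATIVE_SOURCES = {"who.int", "cdc.gov"}


@dataclass
class Post:
    source: str                          # domain the post links to or comes from
    likes: int = 0
    comments: int = 0
    reshares: int = 0
    debunked_by_fact_checkers: bool = False


def engagement_score(post: Post) -> float:
    """Engagement-based ranking: reward whatever draws reactions, good or bad."""
    return post.likes + post.comments + post.reshares


def trust_score(post: Post) -> float:
    """Trust-based ranking: boost authoritative sources, demote debunked claims."""
    score = 1.0 if post.source in AUTHORITATIVE_SOURCES else 0.0
    if post.debunked_by_fact_checkers:
        score -= 1.0
    return score


def rank_feed(posts: List[Post], by_trust: bool = False) -> List[Post]:
    """Order a feed by the chosen signal, highest score first."""
    key = trust_score if by_trust else engagement_score
    return sorted(posts, key=key, reverse=True)
```

Under the engagement ordering, a debunked post that attracts thousands of angry comments rises to the top of the feed; under the trust ordering, it sinks below a WHO or CDC link no matter how many reactions it draws, which is the trade-off the researchers were measuring.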
Facebook said it did implement many of the study's findings -- but not for another month, a delay that came at a pivotal stage of the global vaccine rollout.
In a statement, company spokeswoman Dani Lever said the internal documents "don't represent the considerable progress we have made since that time in promoting reliable information about COVID-19 and expanding our policies to remove more harmful COVID and vaccine misinformation."
The company also said it took time to consider and implement the changes.
Yet the need to act urgently couldn't have been clearer: At that time, states across the U.S. were rolling out vaccines to their most vulnerable -- the elderly and sick. And public health officials were worried. Only 10% of the population had received their first dose of a vaccine. And a third of Americans were thinking about skipping the shot entirely, according to a poll from The Associated Press-NORC Center for Public Affairs Research.
Despite this, Facebook employees acknowledged they had "no idea" just how bad anti-vaccine sentiment was in the comments sections on Facebook posts. But company research in February found that as much as 60% of the comments on vaccine posts were anti-vaccine or vaccine-reluctant.
Even worse, company employees admitted they didn't have a handle on catching those comments, or a policy in place to take them down.
Los Angeles resident Derek Beres, an author and fitness instructor, sees anti-vaccine content thrive in the comments every time he promotes immunizations on his account on Instagram, which is owned by Facebook. Last year, Beres began hosting a podcast after noticing conspiracy theories about covid-19 and vaccines were swirling on the social media feeds of health and wellness influencers.
Some Facebook employees suggested disabling all commenting on vaccine posts while the company worked on a solution.
The suggestion went nowhere until mid-April, when Lever said the company stopped showing previews of popular comments on vaccine posts.
Instead, Facebook CEO Mark Zuckerberg announced March 15 that the company would start attaching labels to posts about vaccines that note the shots are safe.
The move allowed Facebook to continue to get high engagement -- and ultimately profit -- off anti-vaccine comments, said Ahmed of the Center for Countering Digital Hate.
"Facebook has taken decisions which have led to people receiving misinformation which caused them to die," Ahmed said. "At this point, there should be a murder investigation."