Former President Donald Trump will find out this week whether he gets to return to Facebook in a decision likely to stir up strong feelings no matter which way it goes.
The social network's quasi-independent Oversight Board says it will announce its ruling Wednesday.
Trump's account was suspended for inciting violence that led to the deadly Jan. 6 Capitol riot. After years of treating Trump's inflammatory rhetoric with a light touch, Facebook and Instagram silenced his accounts Jan. 7, saying at the time that he'd be suspended "at least" through the end of his presidency.
Though Trump posted to Facebook often -- and his campaign was especially skillful at using the social network's advertising tools to reach potential voters -- his platform of choice was always Twitter. But Twitter, without an oversight board to kick the final decision to, banned him permanently.
While his Facebook posts were not always as high-profile as his tweets, they were widely shared, as were those of his conservative supporters such as Ben Shapiro and Dan Bongino, who continue to amass millions of views and comments.
Facebook created its oversight panel to rule on thorny content on its platforms in response to widespread criticism about its inability to respond swiftly and effectively to misinformation, hate speech and nefarious influence campaigns. Its decisions so far have tended to favor free expression over restricting content.
In its first rulings, the panel overturned four out of five decisions by the social network to take down questionable content. It ordered Facebook to restore posts by users who the company said broke standards on adult nudity or hate speech or were dangerous individuals.
These included a post about Muslims by a user in Myanmar that contained two widely shared photos of a dead Syrian toddler; the board ruled that the post was offensive but did not rise to the level of hate speech.
But none of those rulings carries the same gravity as this week's decision on Trump. The board had planned to announce the decision last month but delayed it, saying it needed more time to process more than 9,000 public comments.
The board's 20 members, who eventually will grow to 40, include a former prime minister of Denmark, the former editor of the Guardian newspaper, legal scholars, human rights experts and journalists.
The first four board members were chosen directly by Facebook, and they then worked with the company to select the additional members. Board members are paid a salary.
The board's independence has been questioned by critics who say it's a Facebook public relations campaign intended to draw attention away from deeper problems of hate and misinformation that still flourish on its platforms.
"The Oversight Board is designed to distract journalists and policy makers from the massive harm being done every day by Facebook," said Roger McNamee, an early investor in Facebook. "To view the board as legitimate, one must accept that a group structured to review a handful of cases a year is enough to supervise a platform that is undermining democracy around the world, amplifies denial in a pandemic, allegedly engages in price fixing in digital advertising, amplifies hate speech, and shares tens of millions of harmful messages every day."
Facebook regularly takes down thousands of posts and accounts, and about 150,000 of those cases have been appealed to the oversight board since it launched in October 2020. The board has said it is prioritizing the review of cases that have the potential to affect users around the world.
Information for this article was contributed by Matt O'Brien of The Associated Press.