Platforms say U.K. falsehoods monitored

FILE - In this Sept. 3, 2019, file photo, leave and remain supporters try to block each others' banners as they protest opposite Parliament Square in London. Internet companies say they’re working to fight misinformation ahead of next month’s general election in the United Kingdom, but bogus online claims and misleading political ads remain a threat due to government inaction. (AP Photo/Matt Dunham, File)

LONDON -- Social media platforms say they are mounting a vigorous campaign against misinformation in the lead-up to next month's general election in the United Kingdom.

But digital misinformation experts believe British voters remain vulnerable to the same type of misleading ads and phony claims that played a role in the vote to leave the European Union three years ago.

Government inaction on online misinformation and digital ad regulation has added to the pressure on internet companies, which face growing criticism for amplifying false claims during the run-up to the 2016 Brexit referendum and the 2016 election in the U.S.

Prime Minister Boris Johnson pushed for the snap Dec. 12 election, in which voters will choose 650 representatives to the House of Commons, hoping his Conservative Party will gain enough seats to break a stalemate over his plan to take Britain out of the EU.

And with campaigns barely underway, falsehoods are already spreading online.

A video posted last week on Twitter and Facebook by the Conservative Party contains a misleading edit of a television interview with a senior Labour Party figure. The video had been altered to show the official failing to answer a question about Brexit, when, in fact, he responded quickly.

The chairman of the Conservative Party called the doctored video lighthearted satire, but it's part of a serious problem confronting British voters, according to Will Moy, chief executive at Full Fact, an independent, London-based fact-checking organization.

"The biggest risk to people in the U.K. right now is being lied to by their own politicians," said Moy, whose organization works with Facebook and others as a third-party fact checker, as does The Associated Press. He said laws written decades ago to cover political advertising for print, radio and television can't be applied to the reach and speed of the internet.

Public debate surrounding the 2016 Brexit vote was driven in part by a number of false claims. They included promises that Britain could recoup 350 million pounds per week by leaving the EU -- an unfounded claim that a survey later found was believed by nearly half of all Britons.

The threat has grown alongside the influence of social media and the proliferation of online political ads. The proportion of campaign spending on digital advertising has increased from 0.3% in 2011 to 42.8% in 2017, according to the U.K.'s Electoral Commission.

Despite reports urging new rules to combat misinformation and to regulate how digital ads are targeted at voters, officials in Britain have made no significant changes to laws governing online ads, social media and election disinformation.

That has left giant private tech firms such as Facebook, Twitter and Google to decide how best to police such content through a patchwork of policies.

The U.K. election will be among the first since the start of Twitter's new policy prohibiting paid political advertisements, which takes effect Nov. 22.

Twitter's ban stands in stark contrast to Facebook's policy of not fact checking ads from politicians and allowing demonstrably false ads to remain up.

Last week a group of 10 U.K.-based technology researchers, transparency advocates and nonprofit tech organizations called on Facebook and Google, which operates YouTube, to follow Twitter's lead.

Despite the criticism, Facebook's leaders insist they understand the stakes and take the threat of misinformation seriously.

"We have learned the lessons of 2016, when Russia used Facebook to spread division and misinformation in the U.S. presidential election," Richard Allan, Facebook's vice president of policy solutions, wrote in a piece published last month in The Telegraph.

Information for this article was contributed by Amanda Seitz of The Associated Press.
