Facebook's fake news offensive is better late than never
Following the fallout from the 2016 US election, the social network can no longer claim neutrality. It's time Mark Zuckerberg acted, writes Charlie Mitchell
In the prelude to the 2016 US presidential election, American voters were informed that Pope Francis had endorsed Donald Trump and that Hillary Clinton had ordered the murder of a rogue Democratic operative. The news that then president Barack Obama had banned the pledge of allegiance in schools was shared more than two million times. All were lies, concocted by internet trolls from Macedonia to Mississippi.
As the dust began to settle last year, debate raged about whether this “fake news” had swung the electoral pendulum towards Mr Trump, piling pressure on internet giants – whose senior managers were hauled before Congress – to stem its flow. Last week, Facebook took steps to do just that.
In a formal announcement, followed up on Mark Zuckerberg’s personal page, Facebook said it would now prioritise news that is trustworthy, informative and local on the newsfeeds of its two billion users. However, Facebook will shift the task of establishing credibility to those users, who will be asked whether they recognise a news source and, if they do, whether they trust it. The site is also examining ways to incorporate ratings of informativeness into its rankings.
The new strategy will first be tested on Facebook users in the US, where 45 per cent of adults get their news from the social network, according to the Pew Research Center. “Social media enables people to spread information faster than ever before, and if we don’t specifically tackle these problems, then we end up amplifying them,” wrote Mr Zuckerberg. From this week onwards in the US, publishers deemed trustworthy by users may see increased dissemination. Those deemed untrustworthy will experience the reverse.
Facebook is a big part of the problem and so its attempt at a resolution is welcome, if a little late.
Social media has bulldozed the barriers to entry that have for years governed the media industry, allowing obscure fake news organisations to reach a very wide audience with ease. Because Facebook's algorithm prioritises “engagement”, startling fake news spreads quickly, while homogeneous formatting lends legitimacy to fraudulent articles. And since six in 10 people share news stories without reading them, most overlook the shoddy reporting behind punchy headlines.
Facebook announced last September that 470 “inauthentic” Russian actors had spent $100,000 on adverts over two years. It estimates that political content published by Russian outfits was shown to 126 million Americans in the two years before the November 2016 election. In a startling recent report, it emerged that Facebook had disabled 5.8 million accounts in October 2016 alone.
Twitter said recently it would contact more than 650,000 US-based users who had retweeted, liked or followed Russian bot accounts on the social network during the 2016 election. Some 2,752 accounts have been traced back to the Internet Research Agency, a Russian outfit also active on Facebook and YouTube. Ultimately, how much the Russian fake news campaign contributed to Mr Trump’s victory may never be known.
This is not simply a US problem. Fears of Russian bots emerged in French and German elections last year, where a resurgent far-right was looking to ride the same wave that ushered in Brexit and Mr Trump. The issue has arisen again this month in the Czech presidential election, where incumbent Milos Zeman – a Putin ally – faces a strong western-oriented challenger. Concerns among shareholders, politicians and media bosses put fake news on the agenda at Davos this week, as The National's editor-in-chief Mina Al-Oraibi reported.
In this region, a sustained pro-Qatar fake news campaign has seen the publication of fabricated UN statements criticising the coalition involved in boycotting Qatar. As reported in The National, almost 50 bogus websites were last month traced to a Qatar-linked fake PR agency in London.
The growth of fake news has had profound consequences, most notably the erosion of trust in media. Michael Gove, a high-profile pro-Brexit MP who now heads Britain’s department for environment, food and rural affairs, famously claimed in the lead-up to the referendum that “people in this country have had enough of experts”.
With the sanctity of the truth in tatters, Mr Trump consistently gets away with blatant lies. His claim that his inauguration crowd eclipsed Mr Obama's, for instance, flies in the face of video footage to the contrary. According to the Washington Post’s ongoing fact-checker database, the president made 2,140 false or misleading claims in his first year in office, an average of 5.9 a day.
Mr Trump himself uses the term “fake news” to dismiss reports that are critical rather than untrue. Authoritarian leaders in Turkey, China and Egypt have all borrowed the epithet. Famous newspapers with long histories of accuracy and integrity, like the New York Times and Washington Post, have been slapped with the “fake news” label, causing large swathes of the US population to distrust them.
Facebook’s trust surveys will favour established names like these.
Mr Zuckerberg’s strategy is not perfect. Ultimately, some of the very Facebook users who shared viral Russian-sponsored fake news in the lead-up to the 2016 election will now be rating the credibility of news sources.
In addition, given how reliant news providers are on Facebook for traffic and revenue, the changes could wound small specialist news organisations and those just starting up, thereby limiting the number of news sources at people’s disposal. One imagines it has not escaped the attention of BuzzFeed, now a serious news organisation, that Facebook’s latest move would have killed it in its infancy.
Controversy is inevitable in discussions of trust in news. But after the chaos of 2016, Facebook can no longer claim neutrality and take a back seat. At last Mr Zuckerberg has realised his obligation to stem the flow of harmful fake news on his platform. Time will tell if this solution does the job.