Abu Dhabi, UAE, Friday 24 May 2019

Facebook gives users key role in stopping fake news

Starting this week, members will be able to rank news sources for trustworthiness

Facebook has announced that it will ban irresponsible cryptocurrency advertising. Loic Venance / AFP

Facebook is introducing a new feature this week to allow its two billion users to rank their trust in news sources, its latest attempt to combat the spread of misinformation on the social network.

Along with Google and Twitter, Facebook has been criticised for allowing the spread of bogus news - some of which was directed by Russia - ahead of the 2016 US presidential election and in other countries.

Twitter also addressed the issue on Friday, announcing that it would notify almost 670,000 users in the US who followed, retweeted, or "liked" posts from suspect accounts during the election period.

Facebook co-founder and chief executive Mark Zuckerberg said the network would seek to "prioritise news that is trustworthy, informative, and local".

"There's too much sensationalism, misinformation and polarisation in the world today," Mr Zuckerberg said in post. "Social media enables people to spread information faster than ever before, and if we don't specifically tackle these problems, then we end up amplifying them."

The new "trusted sources" ranking, aims to "make sure the news you see is high quality" and "helps build a sense of common ground" rather than sow division, Mr Zuckerberg said.

He said Facebook decided to rely on member surveys as the most "objective" way to rank trust in news sources.

"We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem."

Mr Zuckerberg said the new ranking system would hopefully separate news organisations that are trusted only by their own readers or viewers from ones that are broadly trusted across society.

"This update will not change the amount of news you see on Facebook," he said. "It will only shift the balance of news you see towards sources that are determined to be trusted by the community."



Twitter said on Friday that Russia-linked accounts tweeting about the 2016 US election were more numerous than initially determined.

The company found and closed an additional 1,062 accounts associated with the Internet Research Agency, considered a "troll farm" connected to the Russian government, Twitter said in an online update.

"Any such activity represents a challenge to democratic societies everywhere, and we're committed to continuing to work on this important issue," Twitter said.

For the 10-week time span investigated prior to the November 2016 election, Twitter identified a total of 3,814 Internet Research Agency-linked accounts, according to the San Francisco-based company.

Those accounts posted 175,993 tweets, approximately 8.4 per cent of which were election related, Twitter said.

The analysis at Twitter also found an additional 13,512 automated accounts identified as "Russia-linked" and tweeting election-related content during that period, bringing the total to 50,258 accounts.

Twitter said it was sharing information about the accounts with Congress, where testimony late last year by Twitter, Facebook and Google showed that many more millions of Americans were exposed to fake news than previously thought.

US legislators have been investigating whether President Donald Trump's campaign colluded with Russia in its bid to influence the election.

Meanwhile, British MPs investigating possible Russian interference in the Brexit referendum revealed this week that Facebook had agreed to broaden its own investigation into fake news around the vote, after the social media platform's initial efforts drew criticism.

The House of Commons' digital, culture, media and sport committee said Simon Milner, Facebook's head of policy in Britain, had promised the company would now search for "clusters engaged in coordinated activity around the Brexit referendum" that appeared to have originated in Russia.

Mr Milner said it may take several weeks to produce results.

The commitment follows demands from MPs for Facebook and Twitter to provide further information on alleged Russian social media meddling in the run-up to the June 2016 referendum.

Executives from the platforms, as well as from Google, are due to give evidence to the parliamentary inquiry in February, when MPs will travel to the US for the hearing.

Updated: January 20, 2018 06:38 PM