Facebook announces changes to content moderation

Social media giant promises greater transparency after oversight board’s recommendations


Facebook has revealed it will implement content moderation changes recommended by its own oversight board.

Nick Clegg, the company’s vice president of global affairs, said 11 areas would be changed as a result of the board’s report, which was released in January.

These include greater transparency around policies on health misinformation and nudity, and improvements to the company's automated detection capabilities.

Facebook last year set up a panel of academics, lawyers, journalists and human rights advocates to review its content decisions amid concerns about its influence.


Of the 17 recommendations, Facebook committed to act on 11. It is still assessing five others and has declined to take action on one related to Covid-19 misinformation.

"There is one remaining recommendation that we disagree with and will not be taking action on since it relates to softening our enforcement of Covid-19 misinformation," Mr Clegg said in a release on the Facebook website.

"In consultation with global health authorities, we continue to believe our approach of removing Covid-19 misinformation that might lead to imminent harm is the correct one during a global pandemic."

In January, the board issued rulings on six cases, which Facebook is obliged to follow, including one involving hate speech related to Uighur Muslims and another concerning a post quoting the Nazi propagandist Joseph Goebbels. Facebook was given 30 days to respond to the board's other recommendations.

Facebook agreed to clarify its stance on issues such as health-related nudity on Instagram, to ensure that posts raising awareness of breast cancer symptoms, for example, were not wrongly flagged for review by moderators.

It also pledged to improve its automated detection systems by sampling more data and referring decisions to human reviewers when the systems are not confident in their accuracy.

Facebook said it would continue to evaluate which kind of reviews or appeals should be done by people and which could be safely handled by automated systems.

Other recommendations related to making the company's decision-making more transparent.

The company's decisions came shortly after The National revealed that more than 30 former Facebook moderators in Ireland, Spain and Germany are suing the social media company and four of its third-party outsourcing agents after suffering psychological damage from viewing graphic content.

From terrorist beheadings to mass shootings, Facebook’s content moderators view up to 1,000 extreme images on every shift.

Employed by recruitment agencies on behalf of Facebook, the former employees claim they were given inadequate training to deal with the disturbing content and “no support” to deal with the mental trauma.

In the claims being lodged with Ireland’s High Court, the plaintiffs said they suffered post-traumatic stress disorder as a result of the job.

The people involved in the claims were employed in Dublin by CPL and Accenture, in Spain by CCC, and in Germany by Majorel.

The first claim was lodged against Facebook and CPL a year ago by former Dublin employee Chris Gray.

Mr Gray told The National he had to watch child deaths and terrorist incidents, and the psychological scars from the content he viewed remain with him to this day.

“I never realised at the time how much the horrific things I had seen had affected me,” he said.

A Facebook representative said the company offers extensive support to moderators in their difficult role.

“We are committed to providing support for those that review content for Facebook as we recognise that reviewing certain types of content can sometimes be difficult,” the representative said.

“Everyone who reviews content for Facebook goes through an in-depth, multi-week training programme on our community standards and has access to extensive psychological support to ensure their well-being.

"This includes 24/7 on-site support with trained practitioners, an on-call service, and access to private health care from the first day of employment."

The board began accepting cases in October 2020 and has already received tens of thousands of appeals.

But it will rule on only a small number of them, and delivered its verdicts on the first batch of six in January.

Each case is reviewed by a five-member panel, which then presents a decision to the board for majority approval.

Decisions on specific posts are binding, but on wider issues Facebook can choose whether or not to adopt the board's recommendations.

The board has been criticised for having a remit that is too limited, leaving it unable to rule on matters such as user data, advertising and algorithms.