Facebook is traumatising its workers

Content moderators at the website are leaving with post-traumatic stress disorder

The thumbs-up "Like" logo on a sign at Facebook headquarters in Menlo Park, California. AP

This week, The National reported on the working conditions endured by the 15,000 people worldwide hired by Facebook to monitor and censor potentially inappropriate content on the social networking site.

In 2019, Chris Gray, a content moderator previously employed by one of the companies Facebook contracts for monitoring work, spoke out about the post-traumatic stress disorder he developed as a result of his role. It is a tragic, but perhaps unsurprising, outcome, given that his job involved witnessing executions, rapes and child abuse on a routine basis.

Last May, Facebook was ordered to pay $52 million in compensation to over 11,000 current and former content moderators in the US for job-related damage to their mental health. In Europe, a further 30 workers from Ireland, Spain and Germany have launched legal action against the social media giant and four of its third-party outsourcing agents, seeking compensation for psychological injuries. The fact that not only Facebook but also its sub-contractors are being held to account points to the arm's-length manner in which the platform allegedly distances itself from the uncomfortable reality of moderators' jobs.

Mr Gray says he was given targets of checking 1,000 flagged posts a day, each of which he had to categorise with 98 per cent accuracy. He received only eight days of training before being handed a copy of Facebook's guidelines and beginning the job. According to Mr Gray, the office was staffed in large part by young language students, paid in the region of $32,000 a year.

Facebook CEO Mark Zuckerberg uses outsourcing firms to employ content moderators. PA

Politicians are increasingly critical of Facebook's slow response to the issue. On Wednesday, James Lawless, chairman of Ireland's justice committee, suggested that Facebook is failing to address these problems altogether. Alan Rusbridger, a member of Facebook's semi-independent Oversight Board, said the group would investigate the company's core algorithm, delving deeper into not only how the website censors content, but also how it decides what prominence content is given on the site. Whether or not the board succeeds, its intentions are the right ones.

All of this points to a company that has lost control of a technological revolution it created. And with inadequate training and occupational support, it is no surprise that moderators are struggling.

Employees will, of course, have had some indication of the difficult nature of the job before they applied. But this is no excuse for denying them the safety measures and access to counselling that workers in other professions dealing with traumatic circumstances, such as medics and police officers, are entitled to.

Identifying and adjudicating on evidence of abuses committed by human beings will always require other human beings to make the necessary moral judgements. Algorithms, especially in their current form, can only go so far in that process. Those who profit from social media platforms should support moderators when they demand better conditions. And the public, shielded daily from the horrors that moderators must witness, ought to second those demands.