
The horrors experienced by Meta moderators: ‘I didn’t know what humans are capable of’

A court ruling in Spain attributing a content moderator’s mental health problems to his job paves the way for at least 25 other employees to have their cases recognized as well

The Glòries tower in Barcelona, Spain, where Meta's subcontractor for content filtering is located. (Reuters)
Josep Catà Figuls

After having to watch video after video of suicides, murders, dismemberment and rapes, he had a panic attack and asked for help. This employee worked as a content moderator at a company that serves Meta, which owns Facebook, Instagram and WhatsApp. He was told to go to the fun floor: a large game room in Barcelona’s Glòries tower, where the content moderation offices for the Californian tech giant are located. He sat staring blankly at a ping pong table. The fun floor didn’t help him at all. On another occasion, two hours after another panic attack, his boss gave him permission to go see a company psychologist. She was on another floor, the psychology floor. He spent over half an hour talking to her, getting it all out. When he finished, she told him that his work was very important to society, that they were all heroes, that he should be stronger. And that their time was up.

Content moderators are in charge of keeping walls and feeds clean and peaceful on Facebook and Instagram. Millions of people use these platforms every day, unaware that this dark side exists. These workers are the ones who decide whether fake news or photos that violate Meta’s policy stay up or come down. But they are also the ones who have to deal with the most brutal content: viewing it, evaluating it, censoring it and, if necessary, reporting it to the police.

In 2018, the company CCC Barcelona Digital Services moved into the Glòries tower. The announcement was very well received by the Catalan authorities: the tech giant’s subcontractor would swell the ranks of innovative companies based in Barcelona and occupy part of a building that had just lost out on housing the headquarters of the European Medicines Agency.

The company began to hire people — primarily young foreigners who spoke several languages — to moderate content from different markets. Last October, an investigation by La Vanguardia exposed the conditions under which these moderators work. Even before that, Catalonia’s Labor Inspectorate had launched an investigation in 2021, and the following year imposed a fine of over €40,000 ($43,314) on the company for deficiencies in the evaluation and prevention of psychosocial risks in the workplace. In 2020, the company was acquired by Canada’s Telus International, which claims that the accusations are false and that it has sufficient safety measures in place.

This worker started there in 2018 and stayed until 2020, when he obtained medical leave for his mental health problems. The company and the mutual insurance company classified it as a common illness. “We then requested a change of contingencies, because his case fit perfectly with an occupational accident. The National Institute of Social Security agreed with us, and the company appealed, which triggered the legal process,” explains Francesc Feliu, a partner at the law firm Espacio Jurídico Feliu Fins, which specializes in healthcare issues.

On January 12, the 28th Labor Court of Barcelona rejected the company’s claim and ruled that the worker’s sick leave should be classified as an accident at work. This is the first judgment to recognize that the mental illness suffered by a content moderator was caused by his work. “Work-related stress is the sole, exclusive and unquestionable trigger” of the disorders, says the ruling, which can still be appealed. Feliu represents some 25 other workers who are waiting for their illness to be recognized as an occupational accident, and in October he also filed a criminal complaint against the company, alleging a lack of safety measures.

The worker has requested anonymity, because he is subject to strict confidentiality agreements, and he prefers not to talk about how he feels or about very personal issues, because the wounds left by this work are still fresh. He is having a hard time with the news coverage of the ruling, which makes him relive what he saw. “But at least this is encouraging more people to seek justice,” he notes.

When he started working at the company, he had no idea of the violence of the videos he would have to watch. “They told me, but superficially, and then when you start you see that things are much, much worse...,” he says. The lawyer explains that the work is well paid (about €2,400 [$2,600] gross per month, although there are salary differences between workers who handle different markets, a dispute that has also gone to court), no experience or training is required, and young foreigners are drawn to it: “They say, ‘Look, how cool, I’ll work for Meta,’” explains Feliu. The affected worker points out that the illusions don’t last long: “People are not at all aware of what is going on. Before I worked there, I assure you that I didn’t know what humans were capable of.”

The workers suspect that they were training AI

Feliu explains that at that time — “the conditions may have changed now,” he says — the content moderators with the best efficiency scores (as determined by a monthly worker evaluation) were placed in a high-priority section. That is, they continued to receive videos, photos and posts of all kinds in which suicides and terrorist acts appeared.

That was the section where the worker in this case was assigned: “Constantly seeing this makes you more sensitive to everything. After a while I couldn’t even look at a suicide note,” he explains. Moderators had to strictly follow Meta’s policy, and often watch the videos to the end, several times, with different moderators reviewing the same footage. “For example, you had to keep watching a live video of someone explaining that they wanted to commit suicide, and you couldn’t delete it or alert the police unless you saw something in the scene that suggested suicide: a gun, an open window... Sometimes they would suddenly pull out the gun and shoot themselves, without you being able to do anything,” he laments.

To remove a video, they had to spell out the decision: “You had to rate the video by the worst thing that happened [in it], according to a scale. If the video started with some kind of violence, you had to wait for something more serious, like murder, dismemberment or sexual abuse, to rate it as the most serious. If the most serious violence came up at the start, the system would let you delete it.”

This procedure made the workers suspicious. “If you can already see that something is violent at second 10, why do you have to wait? You come to the conclusion that what they are doing is training artificial intelligence (AI), that [workers] are cannon fodder,” says Feliu. When asked about this, a spokesman for the subcontractor did not clarify whether such a project exists and referred all questions to Meta.

The company employs some 2,000 people, after cutbacks at Meta led to the subcontractor’s workforce being reduced through layoffs last year. The works council has not responded to questions from this newspaper, and the company has appealed the ruling. In a statement, Telus explains that, “thanks to the comprehensive wellness program,” by December of last year it had reduced sick leave to 14% of the workforce, and that only “between 1% and 2%” were work-related mental health leaves.

The company claims that it has engaged outside medical support, and that workers have a range of counselors available to them 24 hours a day, can request breaks and emergency sessions whenever they see disturbing content, and have technology to blur videos or turn off the sound if necessary. “Any suggestion that employees are constantly exposed to disturbing content for eight hours a day is false,” the statement says, adding that its workers’ well-being is a priority. At the trial, the company denied any link between the employee’s mental illness and his work, arguing that he had seen a psychologist when he was 16 years old.

The worker explains that when he was with the company, there were timed five-minute breaks every hour, during which he could not go outside for fresh air because just going down in the elevator used up the allotted time. The lunch break was 20 minutes, and workers were offered activities such as yoga sessions and games, “but [there was] no specific follow-up” for employees who evaluated some 400 pieces of content every day.

In addition, the rotating schedules — one week in the morning, one week in the afternoon, one week in the evening, one week at night — disturbed their rest, “which was already difficult because of the nightmares.” Feliu says that “25% of people were systematically on sick leave, plus all those who left the job before taking sick leave.” The attorney believes that the court ruling, and those to come, will push the company to change things: “Content moderators are essential for social media, but so are their [working] conditions.”
