Facebook seeks multilingual staff for a new center monitoring harmful content
Barcelona rumored to be the host city for the external company, as content control becomes a priority for the social media giant
Facebook is looking to hire people with multicultural and language skills to review reports of offensive content. A few weeks ago, it emerged that the California-based social media giant is planning to open an operations center in Barcelona, from which it would seek to combat and delete harmful content on its network.
We recognize that they work with potentially disturbing content and we assure them that they will receive as much psychological assistance as necessary
Richard Allan, Vice President of Public Policy, EMEA
Some sources named the Glóries Tower, popularly known as the Agbar Tower, as the chosen location for this new center. Facebook has confirmed its aim to team up with “prestigious” foreign companies in order to fight harmful content, but for now the location of the new center remains unconfirmed.
“We are looking for people with the ability to speak multiple languages, the capacity to learn fast and a background showing that they can adopt new approaches efficiently, as well as feel comfortable working in this type of environment,” explained Richard Allan, Facebook’s vice-president of public policy for EMEA, in a phone interview.
His statements came on the same day that Facebook published enforcement numbers for the first time, including statistics on all deleted content on the network from October 2017 until March of this year.
“When they come to work at Facebook’s inspection centers, we don’t tell them to look at the content and make decisions based on their personal opinions,” Allan adds. “We require them to complete an intensive course that teaches them all of our community standards, the ones that we put up online.”
Content control became a priority for Facebook following the Cambridge Analytica scandal, which saw founder Mark Zuckerberg appear before the US Congress to answer questions about a massive leak of data from at least 87 million users of the social network. Last month, in front of the US House of Representatives, Zuckerberg said the company currently has 15,000 people working on security and content review, and that the goal is to increase the team to 20,000 by the end of this year.
“We have a significant number of people working in Facebook centers around the world, who are experts in the area of security, including terrorism specialists, but we need to scale up fast, which is why we are working with credible companies that can set up these content-review centers for us,” Richard Allan explains. “Our philosophy is that we need to have a small number of large-scale centers, because we need to maintain quality.”
Even though these employees will work at external centers, Facebook determines some of the terms of their contracts, such as the mandatory psychological support that all employees have the right to access when dealing with violent content. “We recognize that they work with potentially disturbing content and we assure them that they will receive as much psychological assistance as necessary,” he adds.
In Germany, there are already centers dedicated to filtering and deleting comments, pictures or videos that violate the network’s policies. International media have reported that the German Facebook branch is located in Essen, where around 500 people analyze offensive content, messages that incite hatred, or content that could be interpreted as supportive of terrorism.
English version by Laura Rodríguez.