Study highlights lack of ‘algorithmic transparency’ in YouTube portrayal of migrants

An independent audit of the social network found that its recommendation system, which accounts for 70% of all views, ‘perpetuates a dehumanizing image’ of migration

A still from one of the videos analyzed by Eticas in its YouTube audit showing migrants crossing the Spanish border at Melilla.
Manuel G. Pascual

On YouTube, 75% of videos displayed after searching for the word “migrants” show men crossing a border. In 76% of cases, non-white people are seen moving in large groups, which “perpetuates a dehumanizing image” of migration and promotes a “sense of threat.” On the same platform, however, searches using the keyword “refugees” bring up mostly videos of white men and women, recently influenced by the war in Ukraine. These usually show the subjects in close-up and in everyday situations, not at border crossings or “in the presence of armed forces.” These are among the findings of an analysis carried out by Eticas, a consulting firm specializing in algorithmic audits, of two of the largest social networks: YouTube and TikTok.

Eticas conducted an external audit — carried out by extracting publicly available data, not data provided by the platforms themselves — of the work of several social networks. “We wanted to analyze how immigrants and refugees are represented because in the media there has been a lot of reflection on how certain groups are portrayed and the consequences this can have, but that reflection has not been applied to social networks,” explains Gemma Galdon, executive director of Eticas. As of next year, the Digital Services Act (DSA) will require technology companies operating in the European Union to conduct independent audits to ensure a secure digital space in which the fundamental rights of users are protected.

“In this audit, Eticas encourages the platform to commit to algorithmic transparency, improve its recommendation system and increase its engagement with migrant communities, for a more faithful representation of them,” the study concludes.

For the study, carried out in the United States, the United Kingdom and Canada, Eticas analysts collected videos on the subject of immigration over several months (June and July 2022 in the case of YouTube, and October to December 2022 in the case of TikTok). They did this using various profiles on each platform to look for differences in the content served by the algorithm, creating fictitious users who were pro-immigration, anti-immigration, or neutral.

The searches were also carried out in different locations to see if that had any effect on the results. The YouTube analysis was conducted from London and Toronto, cities that have had very different relationships with immigration and refugees. “We were looking for places with strong political connotations. Toronto is a large Canadian city with a long-standing commitment to welcoming immigrants and refugees. London, on the other hand, is the capital of a country that has long been characterized by a closing of borders and a more aggressive discourse on immigration,” says Galdon.

In the case of TikTok, the locations selected were San Francisco (Democratic majority), Oklahoma City (Republican majority), and Virginia Beach (neutral), with the focus placed on the 2022 midterm elections. The results were very similar in all cases: the algorithm barely differentiated among users in the type of content shown, whether by profile characteristics or by location. This held true both for direct search results and for content suggested by each platform’s algorithm.
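The report describes this method but not the tooling behind it. For readers curious about how the “direct search” half of such an external audit might be reproduced, below is a minimal, hypothetical sketch using the public YouTube Data API v3: the google-api-python-client call is a real API, but the API_KEY placeholder, the search_videos helper, and the choice of region codes are illustrative assumptions, not details taken from the Eticas study. It only captures public search results; the personalized recommendations served to logged-in fictitious profiles, which Eticas also measured, cannot be collected this way.

```python
# Hypothetical sketch of the "direct search" step of an external audit.
# Assumes the public YouTube Data API v3 via the google-api-python-client
# package; API_KEY is a placeholder. Nothing here comes from Eticas' tooling.
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"  # assumption: the auditor's own API credential


def search_videos(query: str, region: str, max_results: int = 50) -> list[dict]:
    """Return title and video ID for the top public search results for
    `query`, as served to viewers in the given two-letter `region` code."""
    youtube = build("youtube", "v3", developerKey=API_KEY)
    response = youtube.search().list(
        q=query,
        part="snippet",
        type="video",
        regionCode=region,      # e.g. "GB" for London, "CA" for Toronto
        maxResults=max_results, # the API caps this at 50 per request
    ).execute()
    return [
        {"id": item["id"]["videoId"], "title": item["snippet"]["title"]}
        for item in response["items"]
    ]


# Compare the two search terms the study contrasts, in both locations.
for term in ("migrants", "refugees"):
    for region in ("GB", "CA"):
        results = search_videos(term, region)
        print(term, region, len(results), "results collected")
```

The substance of the audit then lies in manually coding each collected video for the variables the report measures: group size, gender, setting, and whether a border crossing is shown.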

A still from a YouTube video showing two Ukrainian women seeking asylum in the U.S.

YouTube: 70% of viewed videos recommended by algorithm

With more than 2 billion active users, YouTube is the second most-used social platform in the world, behind Facebook. According to YouTube CEO Neal Mohan, up to 70% of the videos viewed on the platform are suggested by its video recommendation system. Those suggestions are based on a cocktail of data ranging from viewing history and interactions with videos to the user’s age, gender, and location. “As a result, YouTube’s recommendation algorithms can influence the perception of a large number of people around the world about migration,” the Eticas report noted.

Eticas’ analysis of YouTube shows that, in the vast majority of videos related to immigration, the faces of the people appearing in the content are not identifiable. In more than 60% of cases, the protagonists are not individuals but large groups of 15 or more people. Only 4% of those appearing under this search term are white, while women are underrepresented, at between 1% and 4%. In the case of refugees, the portrait is different: the majority of subjects have Caucasian features. Their faces can be identified more than half the time, there is a higher proportion of videos showing between one and five individuals, and there are more women than men. There is also a lower percentage of videos in which refugees are shown crossing a border.

“Our recommendation systems are not designed to filter or demote videos or channels based on specific perspectives,” a YouTube spokesman told Time in 2019. The platform, meanwhile, says it audits its automated systems “to ensure that there is no unintended algorithmic bias.”

This image shows a common example of a YouTube video if a user searches for "immigrants" in Toronto: a group of non-white adults crossing a border.

TikTok: entertainment over politics

TikTok has become the fastest-growing social network of recent years. Particularly popular among young people, it shows users short videos selected by its algorithm, which does not take into account what a user’s friends on the platform are watching or what country the user is in. TikTok is designed to promote the most viral content, wherever it comes from and whatever it shows (as long as it complies with the company’s content standards).

One of the most striking findings of Eticas’ analysis of TikTok is that the Chinese-owned platform shuns politically charged videos about immigration or refugees: less than 1% of the content studied was based on arguments for or against these groups. Most of the videos related to these issues showed people making jokes, cooking, working, creating art, or displaying an unusual skill.

The Eticas study concludes that TikTok’s recommendation algorithm “is not influenced by users’ political preferences” or by their location. This runs counter to research carried out by The Wall Street Journal, which found that the platform is able to tailor content to users’ preferences based on how long they spend looking at a particular piece of content. “This suggests that the level of personalization of TikTok’s recommender was adjusted in the past year and underscores the evolution over time of the algorithms and the need to periodically reassess their impact,” the report said.

According to a TikTok spokeswoman, the platform does not have specific mechanisms to correct possible biases. Its algorithm offers content from a range of creators and relies on diversity of sources to prevent distortions and to emphasize the platform’s global nature.
