Wikipedia is an attractive target for governments seeking to influence large numbers of people. Social networks are crowded public squares where nations can openly or surreptitiously promote their own agendas. But Wikipedia's system of checks and balances, arbitrated by dedicated volunteer editors who monitor every change to its entries, makes coordinated state interference a complex endeavor.
The actors who vie for position on Wikipedia do so on one of the most popular sites on the internet, not in some obscure corner of the dark web. This year, more than 10,000 edits were made on the English-language page for the “2022 Russian Invasion of Ukraine,” the highest number of edits for a single article this year. Pages for Vladimir Putin, Ukraine and Volodymyr Zelenskiy also had many war-related edits. Wikipedia’s English site was visited 7.5 billion times in September alone, and its Spanish site was visited almost a billion times.
“Information Warfare and Wikipedia,” a recent report from the Institute for Strategic Dialogue (ISD) and the Center for the Analysis of Social Media (CASM), analyzes how state-sponsored organizations can infiltrate and modify high-profile Wikipedia pages. “Our work doesn’t attempt to empirically prove that Wikipedia is vulnerable to any kind of measurable degree,” said Carl Miller, a co-author of the report and CASM’s director of research. “It tries to convey what we know about the threat. The overall objective of the paper is less ambitious – Wikipedia is overlooked by researchers and journalists as a potential site for disinformation campaigns.”
Wikipedia’s impact ranges far beyond those who visit the site, since it also feeds information to Siri and Google Assistant. In September 2019, a tug-of-war between Chinese and Taiwanese editors resulted in different answers when the Google and Siri digital assistants were asked, “What is Taiwan?” At one point, both replied, “A state in East Asia.” But earlier in the month, the answer was, “A province of the People’s Republic of China.” A BBC investigation found 1,600 biased edits to 22 politically sensitive Wikipedia articles on China. “We cannot verify who made each of these edits, why, or whether they reflect a more widespread practice. However, there are indications that they are not all necessarily organic, nor random,” said the report, which also found that legitimate Wikipedia editors have been harassed or pressured to quit.
The principal danger posed by disinformation is not online vandalism, nor the typical differences of opinion between dedicated editors. Àlex Hinojo, an editor for the Catalan-language Wikipedia site, described a long-running argument over the place names used in the Spanish-language Wikipedia site for some Spanish municipalities in Catalonia, Valencia and the Balearic Islands. “They [Wikipedia] have chosen to keep some Franco-era names such as ‘San Quirico’ instead of ‘Sant Quirze.’ But that should be decided by each region. The Spanish-language Wikipedia site uses the place names established by the Royal Spanish Academy and the National Statistics Institute. It’s an ongoing quarrel, but I don’t think there is any sinister force behind it.”
But this type of editorial debate is not what concerns Carl Miller and his team of social media researchers. “We think the biggest threat is ‘entryism,’ which is the long-term infiltration by state-sponsored actors who build Wikipedia reputations over time so that they can later exploit its underlying policies and governance processes,” said Miller.
The most sophisticated attacks have been launched on the English-language Wikipedia site, but other sites have been targeted as well, said Santiago de Viana, an editor for the Spanish-language site. “I’m aware of suspicions and accusations about coordinated, state-sponsored efforts to modify Spanish-language content. But it’s very difficult to find hard evidence of this, or to impose sanctions for such activity,” said de Viana. “For example, we frequently see an increase in vandalism and self-promoting edits by politicians during election campaigns, but the people behind this activity are rarely identified.”
Entryism is more concerning for Wikipedia pages that present Russia’s perspective on the invasion of Ukraine. According to Miller, there are four signs to look out for. “One – the edit makes subtle language changes that don’t break any rules, such as adding the Kremlin’s version of events; two – it manipulates the election of administrators so that like-minded Wikipedians become content stewards; three – it leverages administrator privileges to resolve conflicts; and four – it changes the underlying rules governing source citation, for example.”
This is not a theoretical concern. The Wikimedia Foundation, the organization that funds and regulates Wikipedia and other crowdsourced wiki projects, banned seven Chinese editors and removed administrator privileges from 12 users affiliated with the Wikimedians of Mainland China for “infiltration of Wikimedia systems, including positions with access to personal information, identifiable information, and elected bodies of influence.” In other words, Wikimedia alleged that Chinese activists and officials had gained access to privileged positions within the Wikimedia community.
Wikipedia and the invasion of Ukraine
The “Information Warfare and Wikipedia” report used the English-language Wikipedia page for the invasion of Ukraine as a case study, and examined 86 accounts that edited the page and were subsequently blocked from editing. The difficulty of detecting any coordinated activity by these accounts is seen in the sheer number of Wikipedia revisions they made over the years: 794,771 revisions to 332,990 pages on topics ranging from Judaism and Poland to aviation, airports, Iraq, Libya and Syria.
The challenge then becomes identifying the types of edits that are likely to be biased. “An edit is more complex to study than a tweet or a Facebook post, because each act can involve not only the addition of content but also its relocation or deletion, often in combination,” says the report, which focuses on edits that used overtly biased media sources. “The team manually assessed the edits containing these links, and found that 16 of these 22 edits were contentious, exhibiting narratives consistent with Kremlin-sponsored information warfare.”
But when the research team looked beyond the Wikipedia page on the Russian invasion of Ukraine, they found a common pattern of adding biased sources – 2,421 edits were found to introduce links to state-affiliated domains on 667 pages about every conceivable Russian conflict, and also pages for Formula 1 races, floods in Pakistan, and more. “This does not necessarily identify coordination or strategic intent,” says the report, “… but it can spotlight various regions of Wikipedia that might be investigated more closely.” In other words, disinformation campaigns on Wikipedia have been overlooked. “In a world where information warfare is more pervasive and sophisticated, this concerns me, precisely because Wikipedia is so valuable,” said Miller.
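The report's core detection step — scanning the text an edit adds for links to state-affiliated media domains and flagging those edits for manual review — can be sketched as follows. This is a minimal illustration, not the researchers' actual tooling: the domain list, sample edit text and function name are all hypothetical, since the report does not publish its data or code.

```python
import re
from urllib.parse import urlparse

# Hypothetical watchlist of state-affiliated domains; the report's
# actual list is not public, so these entries are illustrative only.
STATE_AFFILIATED_DOMAINS = {"rt.com", "sputniknews.com", "tass.com"}

# Matches bare URLs in wikitext, stopping at whitespace or wiki markup.
URL_PATTERN = re.compile(r'https?://[^\s\]|}<>"]+')

def flag_edit(added_text):
    """Return the state-affiliated domains linked in the text an edit
    adds, so the edit can be queued for manual assessment."""
    flagged = set()
    for url in URL_PATTERN.findall(added_text):
        host = urlparse(url).netloc.lower()
        # Match the listed domain itself or any subdomain of it.
        for domain in STATE_AFFILIATED_DOMAINS:
            if host == domain or host.endswith("." + domain):
                flagged.add(domain)
    return flagged

# Example: an edit citing a state-affiliated outlet gets flagged.
edit = "According to [https://www.rt.com/news/example the report] ..."
print(flag_edit(edit))  # {'rt.com'}
```

As the report itself stresses, a flagged edit is only a candidate for closer inspection: linking such a source is not proof of coordination, which is why the team manually assessed each flagged edit rather than treating the domain match as a verdict.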
Wikipedia’s preventative measures include content warnings, restrictions on which types of users can edit certain articles, and blocks on IP addresses and registered accounts. “The various language-specific Wikipedia sites have boards that receive community-sourced reports of disruptive or suspicious behavior,” said de Viana. “All you have to do is provide the administrators with a link to a specific edit and state why you think it violates a rule.”
This is why state-sponsored actors want to control Wikipedia administrator positions. But even that’s not so easy. “It’s very difficult for an administrator to do something controversial without anyone noticing,” said Francesc Fort, an editor for the Catalan-language Wikipedia. “If I block a random account, someone will complain. It’s complicated.”