Julia Ebner: ‘We’re at the beginning of the digital Middle Ages. It’s a very dangerous road’
The exploitation of a technology like the internet and successive global crises have led to a chaotic situation, says this Austrian researcher, who studies the proliferation of extremist ideas
Julia Ebner, 32, researches the growth of radical ideas in our societies. Three years ago, she published Going Dark: The Secret Social Lives of Extremists. Now, she has just released a kind of second part – Going Mainstream: How Extremists are Taking Over – which discusses how radical ideas have moved to the center of society, with extremist parties achieving solid electoral results.
Technology is a fundamental tool in this process: dark forums, hidden networks, memes and algorithms that are all about making money. Ebner – who has worked with the London-based Institute for Strategic Dialogue – sat down with EL PAÍS to discuss how technology fuels extremist movements.
Question. How have extreme ideas been sneaking into the center of society?
Answer. What I’ve seen is that – since COVID – [much of] the population in liberal democracies has become more susceptible to extremist ideas and conspiracies. It seems that some of them feel abandoned – they feel that there’s too much going on. They see themselves in rebellion against what they would call a “woke” culture, [one that’s] too politically correct. They’re against what they call “globalist policies.” But, for others, too little has changed. They rage at political inaction to address economic inequality, now exacerbated by the inflation and cost-of-living crises. Those two very deep frustrations about the status quo drive ideas that were previously marginal. It’s interesting to look at different European countries, because we see an increase in far-right populist parties, [such as] Vox in Spain, [the Brothers of Italy], or the Sweden Democrats in Sweden.
[With the lockdowns], there was a resurgence of conspiracies and myths against politicians, established media and scientific institutions. We already saw this crisis of mistrust earlier, with events such as Brexit, or the election of Trump in 2016.
Q. What are the gateways to the world of conspiracies?
A. Anti-feminism has been one; it’s the growth of toxic masculinity, which has really been an entry point into larger extremist narratives. Vaccines and COVID policies have [been gateways], but so has the war in Ukraine and, of course, the economic and inflationary crisis. Additionally, there’s been criticism of the trans movement or other [minority groups].
Q. In the book, you write that we’re living in the “digital Middle Ages.”
A. If we continue on the path we’re on, the history books of the future – if, hopefully, there are any – could speak of the 2020s as the beginning of the digital Middle Ages, or the Dark Ages. We’re seeing a return to myths – a reversal of what the Enlightenment achieved. It’s a very dangerous path.
Q. Memes are a basic tool in this cultural battle. And humor is essential. It’s said that it’s more difficult for the left to use this as a resource. Is this true?
A. Yes, it’s easier for the extreme right to make funny memes, because they can go for the low-hanging fruit. Humor is an art… but when you can go for the easy stuff, like making politically incorrect jokes, it’s easier and more superficial than imagining more sophisticated jokes, which would be in line with human rights, or that aren’t based on dehumanizing or demeaning other people. It’s much easier to make fun of others than, for example, oneself.
Q. Telegram – the messaging app – appears in each chapter of the book. Is Telegram the “front page” of all the conspiracy information available on the internet?
A. Telegram is becoming an incredible vehicle for spreading misinformation, conspiracies, and extremist content. When I started researching extremism and radicalization in 2015, [the app] was [mostly] used by jihadists and Islamist extremists – it was hardly used by right-wing extremist groups, or even by the wider population. It wasn’t as conventional as it is now. It’s almost established itself as a free speech haven for people who follow far-right influencers and have had their accounts removed from major platforms [such as Twitter].
Telegram is an information bubble. At the same time, due to the way the app works, you can now also become a personal content curator. This is what’s called “salad bar ideologies” – people just mix what they feel best fits their world view. They may be part of an anti-immigration group, or an anti-vaccine group. You have this sort of “self-selection” of reality.
Q. Is Telegram the main app used by extremist groups?
A. I would say that it’s definitely the main app for most of the current far-right movements and conspiracies.
Q. Is Telegram the end of the technological journey for the people who consume this information?
A. It’s often where the journey ends, yes. There are, of course, other very extreme fringe platforms where you might watch a video… but Telegram [has become] the ultimate sounding board, where you stay within your own community. There, you can have both [small] groups and larger channels, where the coordination of protests against immigration – or against vaccinations – also takes place.
Q. But to reach Telegram, people first have to use mainstream social media.
A. Yes. Sometimes, [you may get] a link to a Telegram group from a YouTube video, which might not be as radical as the Telegram group itself. Or in a forum, or under a tweet, or in a Facebook post. During my research on protests (in the real world), I was often invited to Telegram groups. [But once you’re on Telegram], it’s hard to expand your audience – [extremist groups] won’t extend their reach if they don’t campaign on big social media platforms.
These big platforms have a major responsibility: to ensure that these campaigns aren’t amplified by their algorithms, and are instead countered with [alternative] voices, or more moderate content. This might be less interesting in terms of getting our attention, but it’s much less damaging to democracy or to minorities.
Q. You talk about how we don’t really need to debunk every specific myth when we teach students how to use the internet. Rather, it’s better to give them a certain set of skills.
A. Yes. I think there’s a really big gap in the educational curriculum. I think we need to look at historical patterns, at the different types of conspiracy myths that always resurface when we’re in times of crisis. And, very often, we even have the same scapegoats, such as the Jews, or doctors. We also really need to look at what a conspiracy myth is composed of – the common elements, without going into the specific details [of each one].
The historical lens is part of this. And so is the psychological lens: we need to look at psychological patterns.
There are also conspiracy theories that aren’t harmful… some even turn out to be true! It’s about recognizing when there’s political exploitation involved, or when something becomes dangerous for minority communities or for democracy.
Q. We live in a time of constant crises, from the economy, to healthcare, to war. The internet has also resulted in a crisis regarding how we inform ourselves about the world. Which of these crises is most important in favoring the growth of extremist ideas?
A. I would say it’s a combination. We’ve never had this combination before… we’ve had new technologies that were disruptive, but there was a lag, either in how we responded to them, or in how some of them caused chaos. This even happened with the invention of the printing press, or radio – the radio was exploited by the Nazis, for example. New technologies have that potential. Today, we’re seeing this high-tech disruption, in addition to wars and diseases. That combination of factors is something I don’t think we’ve seen before.