Why everything looks the same everywhere
Closed online communities and platforms offer us an accessible, aspirational, and increasingly cloned universe. And, following the dictates of ‘likes,’ algorithms erase differences and identities and transform us into passive users. Are we losing our personality?


The world is becoming more like itself. There are houses in New York, Madrid, Mexico City, and Copenhagen that resemble one another: iconic buildings are becoming fewer and farther between, and the facades of new homes are almost always black and white. Inside, the furniture has a similar design, the palette is neutral and white, and it looks as if no one lives there. Young people walk down the street in similar outfits: Adidas Samba sneakers, an oversized shirt, floor-length suit pants. Even the “odd” outfits are the same kind of odd. They all share a style that makes you think: “I’ve seen this before.”
Their language, their gestures, their way of speaking seem choreographed. There are faces whose features could have been created by the same surgeon, because cosmetic surgery has its own canons, such as the so-called “Instagram face” (as journalist Jia Tolentino explained in The New Yorker) or the Gangnam beauty that defines the South Korean esthetic ideal. Regardless of country of origin or ethnicity, cheekbones tend to be large and high, eyes elongate toward the temples in a feline shape, noses are small and upturned, and lips are plump and full. Logos have been reduced to a simple but confident line, websites are designed around recommendations and frictionless purchasing, and specialty coffee shops are reproduced in every capital of the world with their white tiles and steel counters.
“What if this seemingly accidental (and generally lamented) homogenization were an intentional process, a conscious movement away from difference toward similarity?” asks architect Rem Koolhaas in The Generic City (1995). “Is the contemporary city like the contemporary airport: ‘all the same’?” Why do we increasingly feel that everything is the same everywhere?
In his book Capitalist Realism (2009), British philosopher Mark Fisher warned that the internet encouraged the formation of solipsistic communities, “interpassive networks of ‘like-minded’ people that confirm rather than challenge each other’s prejudices and assumptions.” Instead of using the online public space to exchange and confront different points of view, a series of microcircuits have been automatically formed where we don’t have to encounter anything or anyone we don’t want to encounter.
Internet pressure groups have managed to build a series of populist currents “dedicated to attacking and persecuting anything that isn’t anodyne and mediocre,” says writer and journalist Kyle Chayka, author of Filterworld: How Algorithms Flattened Culture, via video call. Algorithms are designed to reward what receives the most likes, clicks, and followers, ensuring that popular and “most liked” content prevails, while original, alternative, or different content ends up hidden in the recesses of the web.
One of the key moments of the internet age was the day Facebook introduced the “like” button in 2009, Chayka explains in Filterworld. Thanks to this feature, companies could gauge a user’s interest in a particular piece of content or product and offer them, directly, what they were looking for. It also gave users a sense of “digital community,” since they could see what others were liking or recommending.
Little by little, algorithms have multiplied, conditioning our creativity. They shape taste because, as users, we don’t look for what we truly like, but for what’s fashionable, like the backpack we asked our mother for because all the kids at school had it. “You like what you’re supposed to like,” Chayka concludes. And what most people like tends to be what’s easy, what isn’t outlandish or out of the norm, what’s minimalist, what’s conventionally beautiful, what’s simple, what doesn’t draw attention.

We choose not to have a choice and let recommendation algorithms make our decisions. Netflix, for example, organizes users into more than 77,000 “taste communities” that direct them to categories as specific as “French intellectual art-house films” or “emotional war dramas based on true events.” In this way, the algorithms guide us toward a soft, accessible, and aspirational world.
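To make the mechanism concrete, here is a minimal sketch of how a similarity-based recommender nudges a viewer toward an existing “taste community.” It is an illustrative toy, not Netflix’s actual system: the community names, genre weights, and user profile below are invented for the example.

```python
# Illustrative toy only, not Netflix's real system: community profiles,
# genre weights, and the user's viewing history are invented for this sketch.
from math import sqrt

# Each "taste community" is a crude preference profile over a few genres.
communities = {
    "French intellectual art-house films": {"arthouse": 0.9, "war": 0.1, "comedy": 0.2},
    "Emotional war dramas based on true events": {"arthouse": 0.2, "war": 0.9, "comedy": 0.1},
}

# What this viewer has already watched and liked, over the same genres.
user_profile = {"arthouse": 0.1, "war": 0.8, "comedy": 0.3}

def cosine(a, b):
    """Cosine similarity between two genre-weight dictionaries."""
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0.0) * b.get(k, 0.0) for k in keys)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Assign the viewer to the community they already resemble most, then keep
# serving that community's shelf: past likes feed future recommendations.
best_community = max(communities, key=lambda c: cosine(user_profile, communities[c]))
print("Recommended shelf:", best_community)
```

The loop is the point: because the system only compares you with what you have already liked, it keeps returning you to the same shelf, which is how a recommendation engine flattens taste rather than broadening it.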
We shield ourselves with the childlike: neutral-toned decor, minimalist technology, popular movies, basic clothing styles, or images that are simply “esthetic” (that is, visually pleasing). Even the weird or different is standardized and tagged together, which is why films like Napoleon Dynamite or Eternal Sunshine of the Spotless Mind, strange but also mainstream, will appear in the same suggestion box on Netflix or Filmin, and Spotify will group artists like Laurie Anderson, Aphex Twin, and Imogen Heap into a playlist titled Weirdcore Mix.
Algorithmic homophily
Dominant digital platforms like TikTok, Instagram, X, YouTube, and Netflix generate a homogeneous culture and showcase the content of specific ideological, cultural, or social groups. They nudge similar users together, ultimately forming digital cliques in which individuals, guided by the invisible hand of algorithms, interact only with like-minded people, ways of thinking, and brands. This is known as algorithmic homophily.
The philosopher Toni Navarro, who specializes in gender and technology, explains via email that algorithmic homophily refers to the way in which the architecture of digital platforms — that is, their very design and the programming of the algorithms that govern them — achieves “the consumption of content aligned with our tastes and ideas and what we know as ‘echo chambers.’” Esthetically, this translates into a generic, bland, and conformist culture — as Kyle Chayka asserted — and politically, Navarro points to phenomena such as ideological polarization, the rise of the new right, and the increase in digital sexist violence.
This algorithmic homogenization, or homophily, results in the unification and simplification of esthetic, cultural, and political tastes, but it also manages to shape identity. “You shouldn’t want to be unique, you should want to be generic,” Koolhaas said. And Chayka expands on this reflection: “You should want to move through the world in the easiest and most familiar way possible.” This leads to the loss, or sanding down, of homelands, heritage, and identity. It is something the philosopher and sociologist Jean Baudrillard warned about when he spoke of “hyperreality,” in which reality is replaced by a simulacrum, masked and denaturalized to the point where one cannot distinguish between the real and the unreal, between what exists and what doesn’t.
The internet collects cultural and identity references, devours them, and transforms them into fast food: products that resonate with something pure but are merely an accessible blend. “These digital platforms erase identity and encourage the instant distribution of that trend or model everywhere,” Chayka points out. He gives the example of Spanish singer Rosalía and her own version of flamenco: the styles, the handclaps, the roots end up diluted in a commercial product that, while inspired by the artist’s heritage and identity, comes to belong to a global heritage, to a globalized digital culture.
Against “red flags”
On the internet, we find things that speak to us beyond our immediate social circle. That’s why a woman in Wisconsin can listen to Rosalía’s flamenco and feel drawn to it, but not to her nearest neighbor, who wears denim overalls and votes for Trump. Algorithms filter the world around us so we can quickly find people who share our tastes, esthetics, interests, and habits, and who overlap with us in very specific ways. You only have to spend a little time on Reddit to find forums as extravagant as r/BreadStapledToTrees. This makes it easier to identify and categorize the people with whom we share tastes and interests, but it also makes us more passive and, therefore, more intolerant of differences.
One way to understand how algorithms have negatively affected the way we relate to one another is the emergence of the concept of the red flag. The term began to gain popularity in the late 20th century to describe problematic or toxic patterns of behavior, especially in romantic relationships. Today, a red flag denotes any behavior that, at first glance, may seem harmful.
Hence, if we seek out people, images, texts, opinions, or works that fit perfectly within our frame of reference, then we will be less tolerant of anything that falls outside of that frame. “And that’s how red flag culture is fostered,” says Chayka. “If there’s any red flag in the other person that doesn’t fit with our way of being or understanding the world, we move on to the next person in search of someone who doesn’t raise any red flags for me and my frame of reference.”
The search for “the strange”
Knowledge and learning always come about through a confrontation with the new, as Fisher also explained in The Weird and the Eerie. The encounter with the strange, the bizarre, has a cognitive or epistemic function: it dismantles everything we assume or know beforehand. Our experience changes when we encounter something that shakes up the “already known.”
But how do we break this hegemony? How do we prevent our tastes, interests, or even our very identity from resembling those of “everyone else”? Algorithms are like a sticky jelly that seeps into every nook and cranny and, once it hardens, is hard to peel off. Chayka suggests exploring what isn’t popular, since the rarest things in Filterworld, those without a large audience or many likes, are the hardest to find. “You have to search, fighting against your impulses.”
In a video interview, researcher and anthropologist Valeria Mata, author of Plagiarize, copy, manipulate, steal, rewrite this book, encourages us to go a step further and think about what has not yet been created in order to foster imaginative processes. “I don’t think it’s so much a problem of overabundance or excess, but rather of distribution; that is, there’s a lot that hasn’t yet been imagined.” Mata suggests searching the internet for an image related to “artificial intelligence.” The results will likely amount to a collection of blue cables, robots, and mathematical equations, because we have decided that those representations, and no others, are what define artificial intelligence.
Breaking with a cultural dogma and a collective imagination largely controlled and directed by digital algorithms is no easy task; the norm is reassuring. Hence, Mata proposes fostering interaction between machine and person, seeking collaboration and alliances with technology so that we stop feeling that social media, websites, and digital stores control our tastes and our lives. Mata’s proposals are complemented by those of Toni Navarro: “We must be able to establish forms of relationship beyond the familiar or the same: what we could call ‘solidarity without similarity.’”
How can we find this “solidarity without similarity” outside of algorithmic homogenization? The Laboria Cuboniks collective coined the term xenofeminism to investigate and debate new ways of reappropriating technology and thereby building systems of our own that serve our interests and needs. Their Xenofeminist Manifesto focuses, above all, on “the search for a future in which the achievement of gender justice and feminist emancipation contribute to a universalist politics.” And this idea of the xeno has spread to other fields of research, such as algorithmic xenophilia, which allows us to break out of filter bubbles, or xenovisuals, with their possibility of imagining differently.
“And that’s a bit what we, through images, wanted to do,” explains the XenoVisual Studies collective at a meeting at the Matadero Madrid cultural center. XenoVisual Studies is currently made up of Pilar del Puerto, Esther Rizo, Mar Osés, Andreas Daiminger, and Aníbal Hernández, although its members vary. In their first project, they used artificial intelligence to generate 18,000 images of bodies that pushed the boundaries of what we understand as bodies: strangely relocated pieces of flesh, images of colonoscopies, floating hands, and so on.
“The goal was to make these xenoimages available to the public so they could, once again, train algorithms with these strange images and turn them into a source for something else.” So, if you search for “artificial intelligence,” as Mata said, or “specialty coffee shop,” the result doesn’t necessarily have to be the same, or the standard one.

The collective is preparing a collaborative gathering with a table at the center of the performance and a computer placed underneath it like a brazier. Although the project is still in progress, the idea is to invite the public to label images in a dataset used to train algorithms while someone else does, for example, cross-stitch. The objective is to normalize these kinds of technological tasks so that they become as much a part of daily life as sewing, writing, or drawing. Learning to control algorithms will ensure that they don’t end up controlling us: our clicks, our tastes, our individual and collective thoughts and imagination. “We want to democratize knowledge of these tools and open them up to the public.”
The world is becoming more and more like itself, and sometimes we forget that Twitter/X, Google, Instagram, TikTok, YouTube, Amazon, and ChatGPT are not cultural products but technology companies. They are complicit in the great cultural, economic, and political transformations that Europe, the United States, and parts of Latin America are undergoing. Hence, Navarro recalls the emergence on social media of #VámonosJuntas, a call for collective migration in search of “digital spaces that prioritize authentic connection and diversity.” Let’s think then about how to take a step back, forget preconceived notions, open our imaginations, and ask ourselves: What do I really want? What do I really like?