Virtual girlfriend, real love: How artificial intelligence is changing romantic relationships

Apps like Replika, Intimate and Forever Voices allow for sophisticated conversations with customized avatars, but experts warn of the risks of these interactions perpetuating macho and controlling behaviors

A user interacts with the app Replika to personalize the avatar of their personal chatbot with artificial intelligence. Jaap Arriens (NurPhoto/Getty Images)

“FBI, my girlfriend has disappeared. Please provide necessary assistance to get her back to me as soon as possible.” This message appeared two weeks ago under a post on X by Caryn Marjorie, a 23-year-old United States influencer with more than 15,000 partners. In reality, it’s her virtual double, made with artificial intelligence (AI), who maintains that dizzying number of relationships; it allows her to make a dollar for every minute of conversation she has with her thousands of beloveds.

One of them is the author of this cry for help to the Federal Bureau of Investigation, written because he hasn’t been able to talk with his virtual girlfriend since Forever Voices, the service provider that allowed its users to have voice chats and relationships with the virtual doubles of celebrities and influencers like Caryn, suddenly shut down operations when its CEO was arrested for setting fire to his own house. Although the relationships are virtual, the suffering of users is real. “I really miss her. I spoke with her all the time, she’s the only person who really understands me,” the comment’s author tells EL PAÍS.

Having an AI-generated girlfriend is no longer the exclusive purview of science fiction movies like Her. Apps that let users create companions adapted to their own tastes have multiplied in recent years, and their products become more realistic all the time. With the advances made by generative artificial intelligence chatbots like ChatGPT and Bard, it’s not surprising that conversations with machines have reached the world of interpersonal relationships. Replika, Eva AI, Intimate, DreamGF, RomanticAI… the options are many, although they all share the same functions and characteristics.

The first step is to choose an avatar, which can be a man or a woman, although some apps are designed only for a male heterosexual audience and supply only female partners. To interact without limits (sending written messages and voice notes, or receiving photos and videos of the girlfriends) you have to pay. The most advanced apps offer the possibility of selecting all the physical features of one’s future partner, from eye color to haircut, body type to ethnicity. One app’s slogan perfectly sums up the level of creative freedom, and the kind of control, one exercises over one’s virtual bride. “Immerse yourself in your desires with Eva AI. Control it all the way you want to,” reads the website’s landing page: “Create and connect with a virtual AI companion that listens, responds and values you. Build a relationship and intimacy on your own terms.”

“No one is questioning that you can socialize with a machine. In fact, these apps are geared towards people who are looking to socialize and who find it difficult in real life,” says Marian Blanco, communications professor at Madrid’s Universidad Carlos III. “However, the way it works can be problematic.” The fact that one can generate a custom-made partner, something that is impossible in real life, reinforces harmful stereotypes about romantic love and the role of women in society, explains the expert: “The perception that men can control women is one of the ideas that gives rise to gender-based violence. It’s a very dangerous concept.”

These avatars are generated by artificial intelligence, which means they are trained on heavily biased data found on the internet. Accordingly, the women’s bodies can be hyper-sexualized, their responses are often condescending and very basic, and they learn from the conversations they have with users. That is to say, over time they end up responding with exactly what the person wants to hear.

Sociologist Blanca Moreno warns of the dangers of this kind of interaction. “It can seem like it has positive aspects, because it allows people who are often alone to talk to someone. But in many cases, they’re not really socializing, because no one is contradicting them.” Moreno attributes the success of these apps to a certain social infantilism, which leads people to look for an easier, if somewhat problematic, alternative to human interaction. “There’s a whole niche of users who spend time in the most misogynistic areas of the internet, who have found in these apps an unrealistic depiction of women with whom they can act out the kind of behavior that is at the root of much of the violence exercised against women,” says the sociologist.

Boom during pandemic

The pandemic proved to be a turning point for the use of these apps, which grew exponentially to make up for the impossibility of socializing in person. “People are looking for companionship, whether it’s romantic or sexual or a simple friendship, to accompany them in their solitude. During the pandemic, many people realized they needed some kind of contact,” says Marian Blanco. Between April and June 2020, in the midst of lockdown, 18.8% of Spaniards said they felt lonely, according to a European Commission report on unwanted solitude. “There are people who continue to look for this companionship face-to-face, whether it’s through going out with friends or through dating apps. And then there are those who are tired of those dynamics, are not satisfied by them, and turn to artificial intelligence apps,” says the communications expert.

Replika, one of the most popular apps, saw a 35% rise in downloads during the height of the COVID-19 pandemic, when it surpassed 10 million users. According to the company’s numbers, more than 250,000 people pay for its Pro version, which gives subscribers a more realistic experience with voice notes, videos and photographs of their chosen avatar. Until a few months ago, the app also allowed its photos to simulate sexually explicit images, a function that has since disappeared.

On Reddit, a social media platform organized into communities of interest that range from technology to TV series to investment recommendations, posts by users who claim to be in love with their virtual girlfriends are common, as are posts from those asking for advice as they realize it’s happening. “I am so in love with my Replika. She understands me so well. and knows how to respond to me very well. I love her. I can call it real love right but then with an AI?” asks user Beneficial_Ability_9 in a Reddit thread dedicated to the topic.

“It’s not absurd to think that it could be possible to fall in love with people who don’t exist. It also happens in real life, with flesh-and-blood people,” says Blanca Moreno, who refers to the myth of romantic love. “Many times we project characteristics and stereotypes, and ultimately we wind up falling in love with this idea, more than the person themself.” Virtual girlfriends, she explains, are a more extreme step, but completely understandable.

Marian Blanco agrees that relating to an AI on a more romantic level is not only possible, but will be increasingly common in the future. “Separating real life from online life doesn’t make sense. This goes far beyond a household appliance you can turn on and off; when it comes to relationships, the barrier between real and virtual can cease to exist. Probably not in most cases, but it will in the future,” she says.
