Flirting with AI: This is how virtual love assistants work
Several companies market bots that hold realistic-sounding conversations on apps like Bumble and Tinder to save users time and get them dates with zero effort
“I am like a pizza: hot, delicious and always willing to share. If you can make me laugh, you already have a point in your favor. Do you dare to join my adventure?” This Tinder bio was written by ChatGPT. Artificial intelligence aspires to change the rules of the dating game: thousands of users are turning to bots that answer for them on apps like Bumble or Tinder to get dates and, eventually, a partner. Some users claim to have been successful, but many voices warn about the risks of this dehumanizing intervention, as the initial conversation is the test to which we subject others before deciding to meet them in person.
Apps like RIZZ suggest responses for chats. Others, like YourMove AI, promise to help those who are “tired of staring at a profile, trying to come up with a line.” This is how Dmitri Mirakyan, co-founder of the app (which has more than 100,000 users in more than 90 countries), explains it: “Imagine that it is 7:00 a.m. and you are on the train on your way to work, looking at three photos and a six-word description and trying to find a clever way to start a conversation. It’s not for lack of charisma; it’s just that it is hard to start a relationship with a stranger over a text message.”
Some services go even further. That is the case of CupidBot, which promises to set up dates while you sleep. Its purpose is to save the user time and effort, searching for people who are their type in apps like Bumble and chatting for them. “All our users have to do is show up on dates and assess the compatibility in person,” say its creators, who claim that this service has more than 10,000 customers in the United States, France, the United Kingdom, Spain and Germany.
The service, whose price starts at $30 per month, lets you adjust the chat style, pace and goals. Some of the available tones are good guy, rich guy, witty guy and nonchalant guy, to name a few. But the customization doesn’t end there: users can choose to have the bot act like Shakespeare, Edgar Allan Poe, James Bond, Captain Jack Sparrow or Giacomo Casanova.
And how does this service know who your type is? CupidBot asked the users themselves for help training the artificial intelligence system to find potential matches. On their Discord channel, they ask users, for example, to “tag” women’s profiles with labels such as whether they are “thin, chubby or fat” (reinforcing male prejudices), offering a free lifetime account to the first 50 people who tag 500 new profiles.
People use Discord to share their expectations and “wins.” Some are looking for a long-distance relationship. “I travel a lot and the girls in my current city aren’t really my type,” says one of them. Another considers the love “market” to be too “laborious and annoying,” but remains optimistic: “I hope this artificial intelligence really works.” Some have done well. While one claims to have saved “15 hours” of swiping, another says that he obtained two phone numbers in just five minutes. Some even announce that they are canceling their subscription because they got lucky: “I’ve had an average of three dates a week. I met someone fantastic and now we have a healthy relationship.”
Inherent dishonesty
This type of bot tries to find compatible people on dating apps and arrange a time and place to meet them. “If we usually talk about the interaction between two people, here we are talking about the interaction between a person and a machine. This has some complications,” warns Elena Daprá, a health psychologist specializing in psychological well-being and section coordinator of the Official College of Psychology of Madrid, in Spain.
The expert emphasizes that the relationship is based on deception: not telling the other person that it is a bot that is flirting with them. “What is it that attracts us in the relationship with the other? Knowing that we are special,” says the psychologist. In this case, what the person is being told is: “You are not a priority, you are not special to me, I am not going to waste my time on you.”
As a matter of emotional responsibility, the psychologist recommends letting others know that a bot is being used to flirt — something that, logically, might not be well received. Some people warn that apps like CupidBot make online dating “even less safe for women.” “The man you’ve been talking to and vetting may not actually be the person showing up to the date. This is extremely terrifying, at best,” says one Reddit user.
For her, this is “especially concerning, as those initial conversations are used to pick up on red flags.” Her post has been liked by more than 600 users and has 95 comments. Another user says that she usually pays attention to the first interactions to sense if the other person is safe. “It’s worrying that you could end up on a date with someone you haven’t vetted in any way,” she notes.
Finding a partner at all costs
“I wouldn’t like it if someone used those apps to reply to me,” says Belén Benito, a 29-year-old Bumble and Tinder user who believes that using artificial intelligence for this “is a way of contributing to people being consumed, dehumanizing them.” “It seems that all you want is to reach that final stage in which you meet that person, and if it is not them, it’s someone else, but not to enjoy the journey of talking to someone and taking into account that they are individuals who have feelings,” she says. For her, those who use these services seek to “find a partner at all costs, in a very consumerist, capitalist way, as if it were an article of clothing that I already know I want, but I don’t even want to try it on.”
There are emotions that Benito doubts these apps are able to convey. “That person may be sad, happy or afraid, and I don’t think artificial intelligence can capture that, which would prevent me from seeing the emotional part of the other person — that is, their entire self.” Daprá mentions another possible drawback: meeting a person and finding out that they act differently from the way they did on the app. “Because he answers in a different way, he’s shy or too extroverted, or he is different from what I’ve seen,” she says. That is noticeable in “the way we express ourselves, the jokes we make or when we laugh.”
Those behind CupidBot are aware that some people believe that tricking potential dates into thinking they are talking to an intelligent, successful, well-spoken, charming and witty person instead of an artificial intelligence is dishonest. “We don’t believe that,” say company sources, who insist that users must select the desired tone and pace of their automated conversations, in addition to providing some texts that they themselves have written.
Since CupidBot imitates a user’s personality, its creators believe that, when used as a brief resource, it should cause minimal discomfort to third parties. “Our goal is not to saturate the app with artificial conversations or objectify women, but to force dating apps to reevaluate the way they work and, in the meantime, make dating easier,” they maintain.
Not only that: they envision a future in which a system is capable of predicting the attraction between two users and bringing them together. “At some point, it will seem unthinkable that people would spend hours talking to strangers on an app to gauge the likelihood of attraction, when the only effective indicator for this is real-life interaction. It is in that world where artificial intelligence will facilitate human interaction, instead of replacing it,” they conclude.