Flirting and other everyday tasks for ChatGPT
People are using the artificial intelligence tool for ordinary activities like planning itineraries and summarizing books
Irene Cruz looked at the WhatsApp message on her phone and immediately noticed something was off. “I liked that guy, but his message was awful.” Cruz, a photographer, was used to the guy’s very brief replies or sometimes no reply at all. But this time, he sent her a philosophical musing on the simplicity of life and the importance of small things. “I read that message and freaked. There was just so much — I thought it was someone else,” she said.
She was right — it wasn’t him. ChatGPT had composed the message. To others who may turn to the program for help, Cruz advises, “If you’re going to use ChatGPT, use it right from the start.” The young man readily admitted to his folly, and Cruz found a silver lining in the episode. “I thought it was cute that he posted that message as a conversation starter. It was cool.” That flirtation didn’t go much further, but it gave Cruz ideas about using ChatGPT as an icebreaker.
ChatGPT is the latest and greatest example of the artificial intelligence revolution. It converses fluently with users in a multitude of languages, but its answers often contain factual errors. These tools have many uses in computer programming, graphic design and education, and offer great potential for changing the way people work. But they can also have positive impacts on the daily lives of ordinary people.
“It’s like Google on steroids,” said Cruz. After rebuffing the flirtatious guy, Cruz asked ChatGPT what it would advise the young man to do next. She entered: “The girl told me she’s not ready for a relationship. What do I write back?” ChatGPT replied, “Respect her decision, be honest and don’t pressure her.” Cruz thought, “That rocks! It’s a nice, wholesome answer.”
However, Cruz sees how these chats can cross boundaries in human relationships. “I ask it about life situations, such as can two people of the opposite sex just be friends even if there is a fatal attraction? It’s kind of like online therapy — like the movie Her. I think this could get out of hand for a lot of people.”
Valeria (a pseudonym) uses ChatGPT similarly. She asked us not to use her real name so her son doesn’t get in trouble at school. “I’ve asked ChatGPT for help with his homework assignments, like generate an English essay of X amount of words written by someone of his age,” said Valeria. She has also ventured into sensitive adult topics with ChatGPT. “I’ve asked it for psychological advice for a friend and it produced an answer that aligned with what her real psychologist had told her. Sometimes I start conversations just for fun and I get useful answers back — better than any advice a friend could give. It’s part of my life now and I even gave it a name. He’s a friend with his own name.”
Ethical issues
Human-machine interaction can be tricky. A recent study by the Technical University of Ingolstadt (Germany), published in Scientific Reports, found that these chatbots can affect a person’s moral judgments. The researchers conducted an experiment with 767 people built around a hypothetical question — is it right to kill one person to save five others? ChatGPT’s answers were inconsistent, sometimes arguing yes and sometimes no. The study found that the chatbot influences moral judgments even when users know the advice comes from a bot. The authors concluded that ChatGPT corrupts rather than improves its users’ moral judgment, and that participants underestimated how much they were influenced. The study calls for improvements to ChatGPT and similar bots, as well as training to improve users’ digital literacy.
Unai Aso is a psychologist who has weighed whether these chatbots can help patients. “I understand how someone can feel comforted [talking to a chatbot]. I tell my colleagues to be careful with [chatbots] even though psychologists don’t appear on the lists of jobs that are going to disappear [with AI]. But I think even God won’t be spared,” said Aso. “The only part that can’t be imitated is the therapeutic relationship. This is the ‘human part’ — being with a flesh and blood person you can hug. Psychologists work within well-established guidelines. You have to know the individual you’re counseling, even though you may offer similar advice to many people.”
Aso found another, more practical use for ChatGPT: digesting books. “I have a huge list of books that I don’t have time to read,” he said. ChatGPT can help a person learn from a book in much less time than it takes to read it. “In half an hour, I learned from a book without reading it — just by making ChatGPT requests. It’s unbelievable,” said Aso.
First, he asked ChatGPT to summarize a book published before September 2021, the cutoff date for the program’s training data. Next, he tailored his requests to better fit his needs. “It is not enough just to say, ‘summarize this book for me.’ Be very specific and provide more context. For example, your request should say something like, ‘You’re a psychologist with many years of experience and work in the psychology of learning field. Summarize this book for me.’ After the initial response, you can ask ChatGPT for chapter summaries or other specific questions.”
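For readers who want to try Aso’s “be specific and provide context” approach outside the chat window, the sketch below shows one possible way to phrase the same persona-plus-request pattern through the OpenAI Python SDK. It is a minimal illustration, not Aso’s own setup: the model name and book title are placeholders, and it assumes the openai package is installed and an API key is configured.

```python
# A minimal sketch of the persona-plus-specific-request pattern described above,
# sent through the OpenAI Python SDK instead of the chat window.
# Assumptions: the `openai` package is installed, OPENAI_API_KEY is set in the
# environment, and the model name and book title are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

persona = (
    "You are a psychologist with many years of experience who works "
    "in the psychology of learning field."
)

request = (
    "Summarize the book '<BOOK TITLE>' for me. "
    "Then give a short summary of each chapter."
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=[
        {"role": "system", "content": persona},  # the context / persona framing
        {"role": "user", "content": request},    # the specific request
    ],
)

print(response.choices[0].message.content)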
From horticulture to astrology
Ángel Álvarez, a retired bank employee, was planning a summer car trip from León (Spain) to Brussels (Belgium). He wanted to make the trip in several stages, but got all tangled up in planning the itinerary with Google Maps. Álvarez didn’t know much about ChatGPT but decided to give it a try. “I started playing around and asking questions.” ChatGPT suggested stops, restaurants and hotels, and provided information about the cost of tolls and gas. But it had lots of trouble estimating distances, even though that’s a standard feature of navigation tools. “It told me the distance from Montpellier [France] to La Bañeza [Spain] was 690 kilometers (429 miles).” The two locations are more than 1,000 kilometers (620 miles) apart.
Álvarez tried to learn English with ChatGPT, but hasn’t made much progress. Then he asked about growing watermelons in his garden. “It said, ‘This depends on your climate, but it takes five to ten days for the seeds to germinate and six to eight weeks before the seedlings can be transplanted outdoors,’ which is what I had read elsewhere.”
One of the most impressive early uses of ChatGPT, which now seems like old news, was in computer programming. A young researcher named Pol Garcia Recasens (now at the Technical University of Denmark) used it in a test last December, when ChatGPT was barely a month old. It helped him get a perfect score. “It was a final programming exam that was worth 100% of the final grade. It was an open-book, four-hour test to write code solving four problems. It’s a very tough exam. I opened ChatGPT, plugged in the parameters and it solved the problem for me. I finished in an hour with a score of 10,” said Garcia.
He believes universities should reflect on how to use these tools. “It forces you to consider whether we should adapt to new tools. Here [in Denmark] they have banned it — I think that’s a mistake.” Garcia also uses ChatGPT for formal email communications. “We use it for all our emails to the landlord of the student apartments. It makes them formal and correct, and it’s an incredible proofreading tool.”
Such everyday uses of ChatGPT often yield impressive answers, sprinkled with artificial human warmth and outright lies. Irene Cruz, a fan of astrology and tarot card reading, tested ChatGPT on this subject. It didn’t get her ascendant sign right (the astrological sign on the eastern horizon at the moment of birth), even though Cruz told it she was a Cancer. “If you’re interested in astrology from SuperPop magazine, it might be okay, but not for anything deeper,” she said. But her tarot queries provided some good clues. “It works very well if you have queries like ‘tell me more about the Popess card.’ It’s only a support tool — it doesn’t actually do the tarot reading,” said Cruz.
“You can use ChatGPT to improve life, but don’t go overboard by getting everything from a chatbot,” said Cruz. “I hope people will use it conscientiously. It can be a very interesting tool for people, but not if we use chatbots to avoid responsibility for our own emotions.”