In Spain, dozens of girls are reporting AI-generated nude photos of themselves circulating at school: ‘My heart skipped a beat’
While the police investigate, the mothers of the affected girls have organized to take action and stop those responsible
Back to school. First day of class. Isabel, 14 years old, went last Tuesday to her high school in Almendralejo (Extremadura, Spain), a municipality of almost 30,000 residents where practically everyone knows each other. That morning, she walked into the schoolyard to find a rumor spreading from group to group. It was all anyone was talking about: photos of naked female classmates were being passed from phone to phone. Isabel (her name has been changed at her mother’s request) went out to recess with her friends. They were in shock. Suddenly, a boy approached her and said: “I saw a naked photo of you.”
The young girl was afraid. After school, she returned home, and the first thing she did was tell her mother. “Mom, they say there’s a naked photo of me going around. They say it was made with an artificial intelligence app. I’m scared. Some girls have also received it.” Sara, her 44-year-old mother, immediately contacted the mother of her daughter’s best friend, who had also just told her family about the situation. After talking, the mothers started making calls; by then, more than 20 girls were affected. That is when one of the mothers decided to create a WhatsApp group to coordinate with everyone. That Monday, there were already 27 people in the group.
Almendralejo has five middle schools and, in at least four of them, AI-generated images of naked students have been circulating. Police sources in Extremadura report that they are aware of seven complaints so far. The case is being investigated by the Almendralejo judicial police, who have already identified “several” of the alleged authors of the photographic montages, according to officials. The case has been placed in the hands of the Juvenile Prosecutor’s Office.
Sara filed her complaint last Friday. When she arrived at the police station, she ran into another mother who was just coming out the door. Fátima Gómez, 30 years old, has a 12-year-old daughter. She found out about the case last Wednesday night around 10:00 p.m., when the mother of one of her daughter’s friends called to tell her: “I saw a naked photo of your daughter. It’s a montage.”
Gómez suffered an anxiety attack. Later, she had a conversation with her daughter: “Do you know anything about a naked photo?” The girl did not hesitate. She said yes, and showed her mother a recent Instagram conversation she had had with a boy. In it, he asked her to give him “some money.” When she refused, the boy immediately sent her a naked photo of herself. All she could do was block the contact. The police believe that a fake profile is behind the account.
As the number of affected girls increased, the group of mothers kept growing. One of them is Miriam Al Adib, a 46-year-old gynecologist with an Instagram profile of more than 120,000 followers. There, last Sunday, she streamed live to talk about what had just happened in her home. The video already has more than 70,000 views. “I just got back from a trip; this is very serious, and I have to share it,” she says.
Al Adib, who has four daughters between 12 and 17 years old, tells EL PAÍS that she had just returned from a trip to Barcelona, where she had given some talks on female sexual health. After the meal, her 14-year-old daughter approached her and said: “Mom, look at what happened. They have done this to many girls.” Then the girl showed her the photo of herself naked. “My heart skipped a beat,” Al Adib says. “If I didn’t know my daughter’s body, this photo would look real.” After that, the girl told her that a friend’s mother was going to call her, because the mothers were apparently organizing in a WhatsApp group.
The other girl’s mother told her on the phone that many girls were affected. “Some know that there are naked photos of their daughters, but they don’t have them,” she explained. Al Adib then told the others that she had a platform she could use to make a video explaining the situation, to make some noise and try to reach the kids who were sending the photos. “This is a village, and we know, we know what’s going on.” The 10-minute video, in which she recounts what happened to her daughter, is accompanied by a text: “This, girls, won’t be tolerated. STOP THIS NOW. Girls, don’t be afraid to report such acts. Tell your mothers. Affected mothers, tell me, so that you can be in the group that we created.”
The reaction was an outpouring of support for the affected mothers, with private and public messages urging them to keep going and to report. “All is not lost in society,” says Al Adib. “That women no longer remain silent is a fact. We are no longer ashamed. We are victims, and now we can speak because society supports us. That’s the message that I have given my daughters, and they should never forget it.”
The investigation, according to police sources, remains open. The photo of one of the minors bore the image of a fly: the logo of Clothoff, the application presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”