
Made by men to serve: Why virtual assistants have a woman’s name and voice

Various studies show that attributing feminine traits to robots makes them seem more human

Virtual assistants generally have a female voice. Westend61 (Getty Images/Westend61)
Lucía Franco

Technology doesn’t have emotions, but it tries to find ways to imitate them. To this end, virtual assistants such as Siri (Apple), Alexa (Amazon) and Cortana (Microsoft) have a woman’s name and voice. Although, after much controversy, these assistants have been available with neutral and masculine voices for the past year, studies show that attributing female gender traits to robots makes them seem more human, as they are perceived as warmer and more helpful. Many companies have based their decisions on these studies, arguing that giving their programs female traits increases their use and, with it, sales.

“I’m Siri, but I don’t like to talk about myself. How can I help you?” says the virtual assistant when asked for its name. Sylvie Borau is a professor of ethical marketing at the Toulouse Business School (France) and has spent years researching why female robots are seen as more human. In the article The most human bot: Female gendering increases humanness perceptions of bots and acceptance of AI, Borau explains her findings: positive human qualities, such as the ability to perceive emotions and to be friendly and helpful, are more associated with women than with men.

“Users feel more comfortable interacting with female voices due to the traditional connotations associated with women’s caregiving and assistance roles in our society. That is why AI uses these feminine characteristics to make its products more human,” Borau says by phone. However, the expert warns that this can create an ethical dilemma: “Giving a virtual assistant feminine characteristics can lead to the objectification of women.”

According to the researcher, over time, many users began to call their virtual assistants derogatory names such as “bitch.” In the case of Siri, the programmers decided that the device would respond: “I’d blush if I could.” This gave rise to a 2019 research paper on gender and technology by UNESCO’s Division for Gender Equality that made direct reference to Siri: the study was called I’d Blush If I Could. That forced Apple to change Siri’s response in its next update. Now, she responds: “I’m not sure what you’re expecting with those words.”

However, the stereotype persists, as these virtual assistants continue to be given a female name and voice. When Siri is asked if she is a woman, she replies that, like cacti, she does not have a sex. For Cristina Aranda, co-founder of the Women’s Tech association, which seeks to promote women in the field of technology, it is clear that the problem lies with the people who manufacture and make decisions about this type of technology: “The vast majority of people who have shaped these products are men with great gender and cultural biases,” she says. Although Aranda thinks that it will be very difficult to change these stereotypes, she sees some hope: “The only way to hack the system is for there to be more women in the sector. In the end, the decisions that programmers make when creating these devices are based on their belief system, and if they have always seen women as assistants, they will reflect this in their AI.”

Martín Piqueras, a professor at the OBS Business School (Spain) and digital strategy expert at the firm Gartner, agrees. He uses the example of the first switchboard operators to explain why virtual assistants have a woman’s voice and name: “Telephone companies quickly found that when a woman connected the call, customers felt more satisfied. Women generated confidence, and men felt assisted. The rest of the companies soon sought to imitate that female voice in their customer service.”

Piqueras says that since those first “Hello Girls,” as female switchboard operators were known, women’s voices have always been studied more closely: “A female voice gives people confidence and makes them feel satisfied. And this has been proven by multinational companies, which know that their products are more likely to succeed using feminine traits.”

Scientist Karl Fredric MacDorman, an expert in human-computer interaction, published a report in 2010 in which he concluded that both men and women preferred female voices in their virtual assistants. Since then, as Piqueras explains, technology companies have relied on these studies, confident that giving their robots feminine traits increases device sales.

Nancy Salazar, an expert in information technology, has studied this phenomenon for years: “The female gender has always been linked to servitude. This was confirmed by researcher Clifford Nass, a professor at Stanford University, in his study Are Machines Gender Neutral?, which concludes that people tend to perceive female voices as helpers and male voices as dictatorial figures.”

