Feminization of AI voice assistants

Essays

June 20
This essay asks why the first generation of voice assistants is feminized and domestic, and why we feel the need to gender a robot at all. I wrote it for the module «Interaction Design Methods», taught by Dr Joëlle Bitton.

The feminization of AI voice assistants


In 2011 Apple introduced Siri, followed in 2014 by Microsoft's Cortana and Amazon's Alexa. It is striking that all of these assistants have been assigned a female gender role. The naming, as well as the voice, perpetuates undesirable gender stereotypes from the past century that we should have discarded long ago.
This essay asks why the first generation of voice assistants is feminized and domestic, and why we feel the need to gender a robot at all. Furthermore, it asks how a voice assistant would sound if it were not gendered.

Women in the tech industry


We live in a world designed mainly by men, for men, as the book «Invisible Women» and many other articles and books expose. It reveals the gender data gap – a gap in knowledge that is at the root of perpetual, systemic discrimination against women and that has created an omnipresent but invisible bias with a profound impact on women's lives. (Criado-Perez, 2019)

A look at the tech industry, where women are underrepresented, provides an excellent example. Up to 50% of women working in science, engineering, and technology leave the industry over the course of their careers, according to a 2008 study published in the Harvard Business Review. (Lien, 2015) Reasons for leaving include the work environment, scarce career opportunities for women, sexism, discrimination, significantly lower pay, and, in general, an antagonistic male culture.

A newer study from 2014 shows that this has not changed. There may now be many support programs that encourage and promote young girls in science, engineering, and tech, but as long as the industry itself does not change, this potential will not be used to the fullest in the future either. It is like trying to fill a bucket with a hole in the bottom. The underrepresentation of women in the tech industry leads to several issues, the main one being that women's perspectives get lost.

Even though voice assistants (VAs) are mostly feminized – we will come back to that point later – the speech recognition software also works worse for women. It should be mentioned that it performs poorly with accented speech as well – just another bias that occurs in this field.

The unequal distribution of the genders in the tech industry leads to unbalanced training for AI technologies. In the «Association for Computational Linguistics» Anthology Network, men account for over two-thirds of the authors. (Tatman, 2016) Although there are many talented and qualified women in this field, they are still outnumbered. Prejudices surface in AI systems when they inherit the human biases held by those involved in encoding, collecting, selecting, or using the data to train the algorithms that control the AI.

Although these systems are partly scripted, they also learn from already existing texts, which brings us to another point where biases arise. The study «Semantics derived automatically from language corpora contain human-like biases» (Caliskan, Bryson, Narayanan, 2017) reveals that machines can learn word associations from written texts and that these associations mirror those held by humans. AI does not learn utterly new things; it reproduces knowledge we already assume. That means historical biases get embedded in AI. This might not seem problematic when it comes to associating flowers with pleasantness or insects with unpleasantness. Still, it becomes problematic when we look at associations around race, gender, the distribution of the sexes across careers, or first names. For example, such a model connects «father» with «doctor» and «mother» with «nurse», or completes the statement «Man is to computer programmer as woman is to ...» with «homemaker». To fix these issues, humans have to intervene in the process. (Simonite, 2017)
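To make the analogy finding more concrete, here is a minimal sketch of how such associations can be queried in a pretrained word embedding. It assumes the «word2vec-google-news-300» vectors are available through gensim's downloader; the exact completions depend on the model used and are illustrative, not taken from the cited study.

```python
# Minimal sketch: querying analogy-style associations in a pretrained
# word embedding (assumes gensim; the ~1.6 GB word2vec-google-news-300
# vectors are downloaded on first use).
import gensim.downloader as api

model = api.load("word2vec-google-news-300")  # pretrained KeyedVectors

# "man is to computer_programmer as woman is to ...?"
print(model.most_similar(positive=["woman", "computer_programmer"],
                         negative=["man"], topn=3))

# "father is to doctor as mother is to ...?"
print(model.most_similar(positive=["mother", "doctor"],
                         negative=["father"], topn=3))
```

The returned neighbours simply reflect co-occurrence patterns in the news corpus the vectors were trained on, which is exactly how the historical biases described above end up inside the model.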

Language in general is a major transmitter and mediator of biases, whether racist, gendered, or otherwise. A study showed that even «genderless» words like «users» or «participants» unconsciously lead us to picture a male person rather than a female one. (Bradley, MacArthur, Hancock, Carpendale, 2015)


So far, we have only talked about the presence of women in the tech industry and why voice assistants respond worse to women. However, the absence of women may also be one of the main reasons for the feminization of voice assistants.

Why does the tech industry gender voice assistants, and why is that bad?

At first glance, this may not sound so bad. A short excursion into the world of sci-fi movies, for example «Metropolis» (1927) or «Ex Machina» (2015), makes apparent how long the dream of a female robot has existed; it turns out to be almost as old as cinema itself. (Penny, 2016) That we tend to assign a gender to robots is not only due to the high number of male producers but also because we live in a society where gender is a primary social category.

We are permanently surrounded by gender norms, gender identities, and gender relations. The projection of human qualities onto robots and artificial intelligence helps us relate to them. Some people give their plants a name to build a stronger bond with them. So it is understandable why we tend to name and gender objects, even if it says a lot about our prejudices and culturally determined preferences. Nevertheless, as soon as we assign a gender to a robot, stereotypes arise. There is almost no way around that.

The problem is that by reinforcing current stereotypes, it becomes even harder to move away from that way of thinking.

Let us take a closer look at which stigmatized gender roles arise. Comparing different VAs and AI assistants, many are feminized by default. At the same time, applications for banks or insurance companies more often choose male voices. Research also shows that users tend to prefer a male voice for tasks where an authoritative presence is required.

According to Amazon (Alexa) and Apple (Siri), people are more likely to buy from a human-sounding device, and at the same time, they feel more comfortable when they get help from a female voice. (Morley, 2020) So the companies exploit long-established stereotypes that we should have left behind long ago in order to increase their turnover.

This points to many stigmatized roles in our society. It is not only problematic that devices such as Siri and Alexa support us with work that used to be assigned to women by default and that corresponds to traditional ideas of housework and care work as a female domain. The carefully constructed personalities behind these characters are questionable as well.

The voice behind the AI represents an enthusiastic, courteous, passive, and docile personality. This raises the question of how devices like Siri are taught to respond to abuse. Siri responds to «You are a slut» with «I would blush if I could», and Alexa answers «You are hot» with «That is nice of you to say». Those responses are woefully inadequate. The harassment of bots that cannot defend themselves mirrors socially typical impulses for harassing women and other human beings perceived as inferior.

As the journalist Jacqueline Feldman says:

«By creating interactions that encourage consumers to understand the objects that serve them as women, technologists abet the prejudice by which women are considered objects» (Feldman, F‘xa)

At this point, it would be the companies' responsibility to instruct the user and make a statement that offensive language, no matter toward whom, is not tolerable.

There are multiple challenges in creating a robot that is as genderless as possible. How should the voice sound? How should it speak grammatically? F'xa is a feminist chatbot designed to educate people about AI biases. Its makers refrained from using «I» in its conversations, to make clear that it is still a chatbot and not a real person.
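As a small illustration of that design choice – not F'xa's actual implementation, just a sketch with made-up responses – a reply layer can simply phrase everything impersonally, so the bot never presents itself as a person:

```python
# Illustrative sketch (not F'xa's code): chatbot replies written without a
# first-person "I", so the system never presents itself as a person.
def reply(intent: str) -> str:
    impersonal_responses = {
        "greeting": "Hello! This chatbot can explain how bias gets into AI.",
        "ask_source": "That answer is based on publicly available research.",
        "unknown": "Sorry, that question is outside this bot's scope.",
    }
    return impersonal_responses.get(intent, impersonal_responses["unknown"])

print(reply("greeting"))
```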

In some projects, designers try to develop new genderless voices for VAs. One example is the voice assistant «Q», launched in 2019, which introduces itself as the first genderless AI voice, neither male nor female. (Q, VAs) Its creators recorded a large, diverse set of voices from across the gender spectrum and from many backgrounds. Afterward, they evaluated which average pitch we assign to female or male speakers, which resulted in a range between 145 and 175 Hertz. Pitched higher, a voice is perceived as female; pitched lower, it is assigned to a male person.
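As a rough illustration of that 145–175 Hertz band, the following sketch estimates the median fundamental frequency of a recording and checks which side of the range it falls on. It assumes librosa and numpy are installed, «voice_sample.wav» is a placeholder file name, and the Hertz boundaries are the ones reported for «Q», not a general standard.

```python
# Rough sketch: estimate a recording's median pitch and compare it with the
# 145-175 Hz band described for the genderless voice "Q".
# Assumes librosa/numpy are installed; "voice_sample.wav" is a placeholder.
import librosa
import numpy as np

y, sr = librosa.load("voice_sample.wav", sr=None)               # load audio
f0, voiced_flag, _ = librosa.pyin(y, fmin=65, fmax=400, sr=sr)  # track pitch
median_f0 = np.nanmedian(f0)                                    # unvoiced frames are NaN

if 145 <= median_f0 <= 175:
    print(f"{median_f0:.0f} Hz: within the range often perceived as gender-neutral")
elif median_f0 > 175:
    print(f"{median_f0:.0f} Hz: more likely to be perceived as a female voice")
else:
    print(f"{median_f0:.0f} Hz: more likely to be perceived as a male voice")
```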

Conclusion

In summary, the topic of bias in AI illustrates how many prejudices are still firmly embedded in our minds. It makes particularly apparent the importance of our language and the way we articulate ourselves. This is exactly where we, as designers, have to take social responsibility and try to change and improve the user's mindset. It does not have to be obvious to the person using the design all the time. Even with small steps, we can dissolve long-established stereotypes.

Nevertheless, it is also the tech giants' responsibility to counteract these stereotypical images of society that still haunt some minds, rather than exploiting them to sell more of their products.