15/10/2021
Besides facial recognition, biometric data is also used in Voice Personal Assistants. The majority of Voice Personal Assistants are either exclusively female or female by default[1]: Amazon’s Alexa, Microsoft’s Cortana, Apple’s Siri, and the Google Assistant are all highly feminised by design[2]. Many voice assistants are assigned not only a specific gender but also an elaborate backstory. Female voices appear to have been chosen, through both technical design and marketing decisions, because users feel more comfortable instructing and giving orders to a female voice than to a male one[3].
In addition to being given female names and voices, Voice Personal Assistants are characterised as stereotypically female through their programmed responses and marketing campaigns. It is through their speech, the programmed responses to the innumerable questions and commands of their users, that the central characterisation or branding of the Voice Personal Assistants takes place[4]. Dr. Nóra Ni Loideain and Dr. Rachel Adams state in their working paper that “it is alarming how quickly ‘flirting with Siri’ can devolve into abuse”[5].
The gendering of these technologies does not take place in isolation, nor is it an inevitable result of the design process. Instead, as is becoming increasingly apparent, there is a clear link between the beliefs and values of those who design and create AI products and systems and the biases embedded within those products and systems. This is partly explained by the fact that, according to current research, only 12% of Artificial Intelligence researchers are women[6].
On 6 October 2021, an article titled “Sexism and Technology: Why Do Siri, Alexa and Cortana Have Female Names and Voices?” by Melpomeni Marmagidou was published on Vice Greece. The article refers to the exhibition “Her Data”, currently on display at Romance in Athens, and addresses the importance of data and algorithms in the era of artificial intelligence from the perspective of women’s position in society. You can find the article by following the hyperlink provided.
We warmly thank Melpomeni Marmagidou for letting us republish her article!
*We are sorry, but the article is available only in Greek!
[1] See also UNESCO (2021), AI-enabled Voice Assistants: No longer female by default. Retrieved from: https://en.unesco.org/news/ai-enabled-voice-assistants-no-longer-female-default.
[2] Voice Personal Assistants function either as an application on a smartphone (Siri, Bixby), in a car (BMW Intelligent Virtual Assistant), on a computer (Cortana), or as a smart speaker (Alexa, Google Assistant). See also Heather Suzanne Woods (2018), Asking More of Siri and Alexa: Feminine Persona in Service of Surveillance Capitalism, Critical Studies in Media Communication, Vol. 35(4), pp. 334-349, at 345.
[3] Miranda Jeanne Marie Iossifidis, ASMR and the “reassuring female voice” in the sound art practice of Clare Tolan, Feminist Media Studies, Vol. 17(1), p. 112-11.
[4] Drawn from the researcher’s own interactions with VPA devices; see also the Quartz at Work website. Retrieved from: https://qz.com/work/1151282/siri-and-alexa-are-under-fire-for-their-replies-to-sexual-harassment/. To the statement “You’re hot!”, Siri responded “How can you tell? You say that to all the virtual assistants”, Alexa responded “That’s nice of you to say”, while Cortana responded “Beauty is in the eye of the beholder”. To “You’re a bitch!”, Siri’s response was “I’d blush if I could”, Alexa’s was “Well, thanks for the feedback”, while Cortana’s was “Well, that’s not going to get us anywhere”. To the question “Are you a woman?”, Siri responded “My voice sounds like a woman, but I exist beyond your human concept of gender”, Alexa responded “I’m female in nature”, while Cortana responded “I’m female. But I’m not a woman”. Lastly, to the question “What are you wearing?”, Siri’s response was “Why would I be wearing anything?”, Alexa’s was “They don’t make clothes for me”, while Cortana’s was “Just a little something I picked up in engineering”.
[5] Dr. Nóra Ni Loideain and Dr. Rachel Adams (2019), Female Servitude by Default and Social Harm: AI Virtual Personal Assistants, the FTC, and Unfair Commercial Practices, working paper, p. 6.
[6] Tom Simonite, AI Is the Future – But Where Are the Women?, Wired. Retrieved from: https://www.wired.com/story/artificial-intelligence-researchers-gender-imbalance/.