Your voice assistant likely has a decidedly female voice. Amazon, Microsoft, Apple, and Google very consciously programmed their devices with female voices, three of them using female names, in large part to capitalize on perceived stereotypes of a woman’s voice.
While there is research around the perception of a more helpful, generally sympathetic tone in a female-voiced assistant, there are also deeply embedded gender biases that permeate technology. From your GPS to your smartwatch, it’s a woman’s voice you hear and a female avatar that takes directions. New technologies are being developed to address the social issue of AI voice and to make the technology as inclusive as possible. Let’s take a closer look at what that entails.
Why AI Uses a Female Voice
Amazon’s Alexa is one of the most ubiquitous AI voices on the market, and it was a very conscious decision to make it default to a female voice. According to Daniel Rausch, head of the Smart Home division for Amazon, they “carried out research and found that a woman’s voice is more ‘sympathetic’ and better received.”
While Amazon’s research hasn’t been made public, other studies have examined voice preference, including a 2008 Indiana University study in which researchers found that both men and women preferred a “more cordial” female voice when interacting with technology. Several studies have shown that we respond better to voice AI when it is assigned a recognizable voice, but as a Stanford study shows, it is a woman’s voice that many users find more “helpful.”
Why Do We Need Genderless Voice?
If research has shown a clear preference for female voices in AI, why are companies investing in genderless voice assistant technology like Q? There are several reasons. At least part of that stated preference stems from the persistence of gender stereotypes, and we are already seeing a clear gender divide in AI voices, with assistants generally female. Google Assistant, Siri, and Alexa all come with a female voice as the standard setting. This reinforces stereotypes of male voices representing authority and female voices representing caring and nurturing.
Q, a new genderless AI assistant, is being developed by Copenhagen Pride and Virtue to address this, producing a scientifically gender-neutral voice as a potential future replacement for the distinctly female and male voices in today’s systems. The tool was developed by recording the voices of several people who identify as female, male, non-binary, and transgender. Thousands of study participants were then asked to rate these voices on a spectrum from male to female, allowing the researchers to home in on what they’ve identified as a gender-neutral voice.
Pushing for Greater Inclusivity in Technology
Currently, only Siri and Google can switch to male voices, and none of the four major voice assistants offers a gender-neutral alternative. Gendered interactions with AI have consequences, though. Humans interact with one another by matching linguistic styles and patterns, and while we may identify voice assistants as machines, there is a very real interaction occurring that can reinforce not only stereotypes but also unhealthy behaviors.
Parents have started to take action to address rudeness from children interacting with voice assistants, as many devices don’t offer options to require please and thank you. That rudeness carries over to adult users, who routinely demand things from their voice assistants and sometimes yell at them, occasionally just for a laugh. Many see this as reinforcing the negative behaviors society directs at women as a whole, and as an added challenge that doesn’t need to exist as technology takes on a more personal, interactive role in our lives.
The idea that women are subordinate, and that their role is to support others in their tasks, is one society has worked against in recent decades. That AI technology undermines this progress is distressing to many users and a prime reason to push for genderless voice in AI systems.
Humans unconsciously categorize the people, and now the machines, we interact with based on subtle factors like voice. Genderless AI is designed to challenge those unconscious biases in a way that could have implications for society at large. From racial bias in facial recognition to gender bias in voice, developers have struggled to build platforms that are representative of society as a whole. The more aware we are of the issue, the better these systems can offer a more inclusive platform for all users.
Moving Towards a More Inclusive Future
It’s expected that the use of voice assistants will increase by as much as 35% per year through 2023. Hundreds of millions of devices are already on the market and billions more are expected to reach our pockets and mantels in the next few years. While companies have programmed their devices to provide the most pleasing and frictionless experience for their users, it is in their power to introduce a new normal to the experience, removing tired gender stereotypes and integrating technologies like Q that are more inclusive.
As voice AI becomes more emotionally intelligent and our interactions with machines are more than a novel way to check the weather, it will become increasingly important for those machines to be representative of the vast spectrum of human experience. Genderless voice stands to be a big part of that transformation.
Thanks to the growing accessibility of data, we are well on our way to forming habits of casual data usage in the enterprise. As the example above of parents worried their children will learn rude behavior from Alexa shows, today’s tools and technologies have an immense influence in shaping our character and daily routines. As receiving data through voice becomes ubiquitous, voice assistants will not only drive human behavior but also influence the structure and flow of organizations. In turn, the habit-forming potential of voice assistants is such that a genderless voice will become a necessity.