Artificial intelligence is no longer limited to science fiction. Every time you surf the web, check your email, or use a voice assistant, AI is helping you. Without AI, we wouldn’t have predictive search engines, accurate spam folders, or a voice on our phone that can make appointments and crack jokes with us.
All of these functions are integral parts of modern life, and the future of AI is bright, but as with any technological innovation, there are unanticipated side effects. We tend to think of technology such as AI as neutral, but in reality it’s no more immune to bias than the well-intentioned people who create it. Women often face the same misrepresentation and stereotyping in AI as they do in society. Just as binary code is the language of computers, the gender binary is instrumental in some technology.
A recent report from Unesco (aptly named “I’d Blush if I Could” after one of Siri’s meek responses) explores the pervasiveness of female voice assistants. Think Piece 2 of the report claims that “because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK.’ The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”
The report cites a few explanations that tech companies give for this imbalance, such as studies that show people simply prefer a female voice, despite other studies that suggest the opposite. Unesco also reported that “widely used voice assistants often claim to be neither male nor female” but that, despite this, “nearly all of these assistants have been feminized.” This raises the question: Why would technology have a gender in the first place?
We live in a society that is obsessed with binaries, particularly the gender one, writes psychology professor Karen L. Blair in Psychology Today. Our tendency to gender everything, including babies, cars, and pets, is based on the belief that there are two genders, based on two biological sexes. As technology becomes more ubiquitous, the binary has extended to divide it as well. Humanoid robots are given genders even though the categorization is irrelevant to their functions. Helpful voice assistants are female, while protective security robots are named Marty. Non-humanoid robots like Roombas, or robots like Pepper that explicitly deny having a gender, are often gendered anyway by the people interacting with them. Even when we’re told that a robot is genderless, we still want to conceptualize it as male or female. When CNN went on a date with Pepper, they immediately asked if the robot was a boy or a girl.
But the binary works both ways: We also develop technology that reinforces the binary by gendering us, such as TSA imaging portals. According to TSA guidelines regarding transgender passengers, “when you enter the imaging portal, the TSA officer presses a button designating a gender (male/female) based on how you present yourself.” The machine’s software will then scan the passenger’s body and signal an alarm if anything (such as genitalia or undergarments) does not match the entered gender. This technology has caused horrific experiences for transgender and intersex passengers.
This isn’t to say gender has no place in technology. Sometimes, gender is an important data point, which is the case with MIT CSAIL’s neural network Speech2Face. The software uses audio segments from YouTube to reconstruct an image of the speaker’s face. It claims to be accurate 94% of the time for age, race, and gender. According to the study, however, “the training data we use … does not represent equally the entire world population. Therefore, the model – as is the case with any machine learning model – is affected by this uneven distribution of data.” Speech2Face was also less accurate when listening to speakers who did not conform to common stereotypes. For example, a young boy might be misgendered because he had a higher-pitched voice the system interpreted as feminine. Reviews.com reached out to the researchers, but they were not available for interview.
Unesco’s report suggests one reason for the “predominance of female voice assistants may lie in the fact that they are designed by workforces that are overwhelmingly male.” It’s a common sentiment that artificial intelligence is only as good as the data we train it on. Facial recognition software trained on datasets skewed to overrepresent light-skinned males will be considerably less accurate for darker-skinned women, according to a study from MIT. So when overwhelmingly male teams train AI on overwhelmingly male datasets, the discrimination against women stands to compound.
Some companies are trying to change this. Pegg is a gender-neutral chatbot designed by Sage in 2016. Its creator, Kriti Sharma, made Pegg gender neutral in the hope that it would help combat prejudice and provide transparency around AI.
“As we are striving to evolve as a society and foster diversity and inclusion across the workforce, the integration and application of AI and emerging technologies should be a critical part of that process,” says Ron McMurtrie, CMO at Sage.
A more recent AI created to combat the gender binary is Q, which claims to be the first genderless voice. Instead of Pegg’s nonhuman voice, Q sounds like a real person, just without a distinct gender. The developers created it because “as society continues to break down the gender binary, recognising those who neither identify as male nor female, the technology we create should follow.” Their hope, according to the website, is to get the attention of tech giants like Apple, Amazon, Google, and Microsoft.
Are We Ready for Genderless AI?
The easy answer is no. But the more accurate answer is we have to be.
From bathrooms in North Carolina to the #MeToo movement, it’s clear that gender bias and different perspectives on the subject are pervasive in society today. We have an innate tendency to gender everyone and everything, but it doesn’t have to be like that with our technology. According to the Harvard Business Review, “one of the big advantages of AI is that, aside from being better at spotting things (i.e., millions of data points), it is also superior at ignoring things … AI can be trained to ignore people’s gender.”
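At its simplest, "training AI to ignore people's gender," as the Harvard Business Review puts it, can mean excluding that field from a model's inputs entirely. The sketch below illustrates the idea with a hypothetical tabular dataset; the field names and records are invented for illustration and do not come from any system discussed in this article.

```python
def strip_field(records, field):
    """Return copies of the records with the given field removed."""
    return [{k: v for k, v in r.items() if k != field} for r in records]

# Hypothetical applicant records; "gender" is the field to withhold.
applicants = [
    {"experience_years": 5, "test_score": 88, "gender": "female"},
    {"experience_years": 3, "test_score": 91, "gender": "male"},
]

# A model trained on training_data never sees the gender field,
# so it cannot weight that field directly.
training_data = strip_field(applicants, "gender")
```

One caveat worth noting: dropping the field only removes the *direct* signal. Other features correlated with gender can still act as proxies, which is why fairness work usually goes beyond simple field removal.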
While many of our laws, beliefs, and systems are built around the gender binary, we have the capability to create technology that isn’t. We can make technology, like Q, that is better than ourselves.
“In the process of learning how to mitigate these gender biases in AI, we might just be able to improve our own selves and businesses as well,” McMurtrie said.
These recent developments in more representative AI suggest a bright future for the field. “I truly believe that we can pave the way to a better future that fosters a keen awareness and commitment to creating technology void of gender bias. One will not work without the other – people and technology must evolve together, including our growing acceptance of genderless AI,” McMurtrie said.