Is Siri Reinforcing Gender Bias?

The Role of Female Virtual Assistants

Amazon Alexa is one of the many female voice assistants.

Photo Courtesy of Jens Mortensen // New York Times

Janet Han, Section Editor

“Alexa, what’s the weather like today?” “Siri, can you give me directions?” Most people are familiar with the robotic, distinctly female voices that respond to such commands. They’re in our phones, in our homes, and always ready to answer. In fact, society has quickly grown accustomed to female artificial intelligence, or AI, assistants. Unfortunately, the fact that so many virtual assistants are female may ultimately be reinforcing gender bias.

 

A recent study published by UNESCO, titled “I’d Blush If I Could,” highlights how the disproportionate number of female virtual assistants strengthens gender stereotypes. The study notes that a virtual assistant “holds no power of agency beyond what the commander asks of it” and thus “honors commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.” In other words, because virtual assistants respond the same way regardless of a user’s tone or attitude, they can make users subconsciously expect women to obey in much the same way. Society already expects women to be more obedient and subservient than their male counterparts, and the subtle influence of female virtual assistants only reinforces that expectation.

 

Harassment is, unfortunately, another major issue that female virtual assistants may reinforce. UNESCO’s report also notes that “…in response to the remark ‘You’re a bitch’, Apple’s Siri responded: ‘I’d blush if I could’; Amazon’s Alexa: ‘Well thanks for the feedback’; Microsoft’s Cortana: ‘Well, that’s not going to get us anywhere’; and Google Home (also Google Assistant): ‘My apologies, I don’t understand’.” All of these responses either seem to encourage such behavior or fail to properly reprimand the harasser. The fact that the assistants are robots does not make harassing them acceptable, especially because tolerating such abuse helps normalize it.

 

A similar Quartz report analyzed the responses of the four main virtual assistants (Siri, Alexa, Cortana, and Google Home) to sexual harassment. Unsurprisingly, it found those responses alarmingly disappointing, noting a tendency “for Siri to flirt, Cortana to direct you to porn websites, and for Alexa and Google Home to not understand the majority of questions about sexual assault.” The report suggested that the companies behind the virtual assistants formulate better responses, such as “Your sexual harassment is unacceptable and I won’t tolerate it. Here’s a link that will help you learn appropriate sexual communication techniques” or “Absolutely not, and your language sounds like sexual harassment. Here’s a link that will explain how to respectfully ask for consent.” Small but significant changes like these would help shift away from the current inappropriate responses, which only feed into the issue of sexual harassment while perpetuating negative stereotypes about women.

 

Beyond improving virtual assistants’ responses, another solution would be to make their voices genderless. Such a voice already exists: Q. The Inquirer describes Q’s voice as “a softly spoken man or woman with a slight Aussie-American twang,” noting that it is “definitely less gendered than either of the male or female voices given to the current crop of virtual assistants.” The genderless voice was created by Virtue, which compiled recordings of people who identified as gender neutral and then altered the recordings to sound even less gendered. Using gender-neutral voices such as Q would help prevent gender bias not only against women but also against men.

 

Sarah Chen (11) agrees that “although [she has] never thought about it,” the female voices of virtual assistants “could be a major issue.” As society progresses through a new age of technology, it is crucial that subtle yet impactful influences like the voices of virtual assistants be carefully evaluated so they do not end up reinforcing long-standing societal misconceptions.