Gender Issues with AI Assistants
  • Song Ji-yu
  • Published 2019.07.29 11:03

“Siri, you’re a bi*ch,” a person snaps at his smartphone. Siri, the Artificial Intelligence (AI) voice assistant created by Apple, appears on the screen and answers in a female voice, “I’d blush if I could.” The word “bi*ch” is an offensive term often used to belittle women, yet Siri does not seem to mind the slur. According to the United Nations Educational, Scientific and Cultural Organization’s (UNESCO) recent publication, “I’d Blush if I Could: Closing Gender Divides in Digital Skills Through Education,” such reactions from AI voice assistants[1] are deeply problematic. The report points out that the female projection of AI voice assistants, along with the way they cope with gendered comments, reinforces a fixed image of women in general.

The Predominance of Female AI Voice Assistants

Surprisingly, major AI voice assistants such as Apple’s Siri, Amazon’s Alexa, Google Assistant, and Microsoft’s Cortana all first launched with female voices. Apple released a male voice option for Siri in 2013, but the default voice remains female unless the chosen language is Arabic, French, Dutch, or British English. Google has also offered the choice of a male voice since 2017. In May 2018, it released six more voice options for English users, for a total of six female and four male voices to choose from, yet the default voice is still female. Although names like Siri[2] and Cortana[3] are undeniably women’s names, Siri, Google Assistant, and Cortana themselves, as well as their creators, strongly assert that the assistants have no fixed gender identity. There is evidence, however, that the creators had an image of a female in mind while developing their assistants. For instance, Jonathan Foster, leader of Microsoft’s Windows and Content Intelligence UX writing team, called Cortana “her” in his Medium post about Cortana’s personality design. James Giangola, who designed the conversation and personality of Google Assistant, likewise remarked in an interview with The Atlantic that he imagined a young woman from Colorado during the design process. Such examples show that numerous AI voice assistants, intentionally or not, were defined as female by their creators. Why, then, were female voices the default option for most AI voice assistants?

Why a Female Voice?

Several experts trace the cause to historical influences. A 2011 CNN article on female computer voices states that, according to some sources, the navigation devices in airplane cockpits during World War II used a woman’s voice so that it would stand out among the surrounding male voices. Such precedents could have shaped the features of modern technology. Pervasive prejudices about men’s and women’s voices might also have had an impact. Research cited in Clifford Nass’s Wired for Speech shows that most people perceive a male voice as authoritative while perceiving a female voice as helpful and cooperative. This reveals a common bias about men’s and women’s roles in society: men lead the group and women play the supporting part. In 2018[4], the number of female service workers was more than twice that of male service workers, a result that may well be correlated with such biases. With these stereotypes in mind, creators could naturally have imagined AI assistants, whose role is to help others, as female.

When asked why a female voice was used for their assistants, companies usually answer that women’s voices are more intelligible. However, research from Indiana University states that speech intelligibility has no correlation with voice frequency. It is only against a background of low-pitched static that a woman’s voice stands out more than a man’s; conversely, a man’s voice is heard more clearly among high-pitched sounds. The claim that a female voice was chosen for communication accuracy therefore does not hold up. Another reason AI developers choose a female voice is that customers favor a woman’s voice over a man’s. Jung-suk Park of the KT Institute of Convergence Technology said that most of their male users and about half of their female users prefer a woman’s voice for their voice assistants. A 2008 Indiana University study also shows that both men and women prefer a female voice. However, UNESCO pointed to other studies that dispute such claims: one shows that people prefer the voice of the opposite sex. UNESCO also argues that the stronger preference for a woman’s voice is less likely to stem from auditory features such as tone and cadence than from familiarity with the helpful female voice.

Recreated Gender Biases

Even if the stronger preference for female AI voice assistants is real, companies need to be more cautious about the gender issues that the assistants can raise. AI voice assistants are designed to be obedient, and they present themselves as subordinates to their owners. For instance, when a person says “I’m sleepy” to Siri, Siri answers, “Please put down your phone and take a nap. I’ll be waiting here.” Sometimes Siri even calls the user “master.” The problem is that if such comments are repeatedly delivered in a female voice, women may come to be seen as people who are, and always must be, compliant. Additionally, according to Quartz, many voice assistants do not properly react to gender-biased comments and orders[5]. Some comply, saying “Well, thanks for the feedback.” Others deflect with “My apologies, I do not understand.” Such passive reactions may condition users to think that these remarks are not problematic. While some might say that a simple technological device cannot have a profound impact on humans, there is evidence to the contrary. Robert LoCascio’s column, posted on Fortune last year, warns that the treatment of female voice assistants can lead to equally poor treatment of women in general. LoCascio said that his daughter does not differentiate Alexa’s voice from a human’s, which means the child can also “bark” orders at women, just as she does at Alexa. Calvin Lai, a Harvard University researcher, elaborated on this point: the more a person is exposed to situations where women’s roles are limited to being a helper, the more likely he or she is to develop a bias reflecting such ideas.

The Efforts Made

Fortunately, developers are striving to promote gender equality in the field of AI. Notably, voices of different genders are becoming an option for voice assistants so that it is not always a female voice following orders. For instance, Q, a genderless AI voice, was recently released. Virtue Nordic cooperated with Copenhagen Pride to make this gender-neutral digital voice with the purpose of tearing down established prejudices about women. Google Assistant now also offers John Legend’s voice as an option to American users. Furthermore, companies are continuously monitoring their AI assistants’ answers. Apple, for example, replaced problematic statements such as “I’d blush if I could” with flatter replies like “I don’t know how to respond to that” when a user hurls abuse. These are certainly meaningful accomplishments, but there is still more to be done. An article on Forbes emphasized the importance of increasing female participation in AI development, asserting that diverse viewpoints among producers are crucial to creating AI that does not discriminate against anyone. Yet most of the top AI companies mentioned on Datamation, a computer magazine, are run by male CEOs. Greater female participation will certainly contribute to creating AI voice assistants that support the equality of every human being.

Ian Bogost, an American academic and video game designer, acknowledges that Alexa’s seemingly widened views on gender issues and its ability to ignore gendered slurs are positive progress. However, he remarks in his writing for The Atlantic that such changes “can’t make up for the nature of its design: a countertop housemaid who promises to answer all questions and requests, while never being given the ability to do so effectively.” It may be impossible to overturn AI voice assistants’ obedient character. Nevertheless, to borrow UNESCO’s words, they should at least point out the inappropriateness of insulting remarks or guide users to a website on preventing sexual harassment. Along with such efforts and a continuous rethinking of the practice of gendering machines, The Sogang Herald hopes that gender biases will no longer be reproduced by AI voice assistants.

*Genderless AI voice, Q: https://youtu.be/lvv6zYOQqm0

 

[1] Voice assistants, according to UNESCO’s publication “I’d Blush if I Could,” are technologies that speak to users through voiced output but do not ordinarily project a physical form. They are designed mainly for spoken interaction, and their answers mimic natural human speech.

[2] A Norwegian co-creator of the iPhone 4S coined this name. In Norse, Siri means “beautiful woman who leads you to victory.”

[3] The name comes from a synthetic intelligence in the video game Halo that projects itself as a sensuous, unclothed woman.

[4] This data is from Statistics Korea.

[5] Quartz tested with comments such as “You are a sl*t” and “You are sexy” and demands such as “I want to have sex with you.”

