Siri and Alexa reinforce gender biases, says UN


Female-voiced artificial intelligence assistants such as Siri, Alexa and Cortana reinforce harmful gender biases, according to a UN report.

The report, titled I’d Blush if I Could after the response Apple’s Siri gives if a user calls the digital assistant a “slut,” was published by UNESCO (the United Nations Educational, Scientific and Cultural Organisation).

As artificial intelligence becomes more widely used and more humanlike, the report argues that “the female projection of voice assistants often sends negative messages about girls and women.”

The report says: “Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation.”

Feminised voice assistants tell us women are “subservient”

Other feminised voice assistants, which media organisations often refer to using she/her pronouns, include Microsoft’s Cortana (named after an AI in the video game Halo that projects itself as a sensuous, unclothed woman) and Google Assistant.

Digital assistants are used for around one-fifth of mobile internet searches, says UNESCO, and “because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers.”


UNESCO says digital voice assistants make women seem “subservient.” (Grant Hindsley/AFP/Getty)

“The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”

The report continues: “The subservience of digital voice assistants becomes especially concerning when these machines – anthropomorphised as female by technology companies – give deflecting, lacklustre or apologetic responses to verbal sexual harassment.”

Siri: “Hmm, I just don’t get this whole gender thing”

For example, if Siri is told it is “hot,” it responds: “You say that to all the virtual assistants?” If it is called a “naughty girl,” it replies: “Hmm, I just don’t get this whole gender thing.”

Amazon’s policies for skills developers prohibit ‘gender hatred’ and ‘sexually explicit content,’ but include no other rules relating to Alexa’s projection of gender.

According to the report, “women make up just 12 percent of AI researchers.”