Abu Dhabi, UAE, Thursday 21 November 2019

Why Alexa, and not Alex? Digital assistants enforce gender bias, says UN

New study finds that AI-powered assistants are usually female, normalising the idea that women should be 'subservient'

Voice-activated virtual assistants are perpetuating gender biases, a Unesco study has found. AFP

It's always "hi Alexa", not "hello Alexander", and "hey Siri", not "hi Stuart".

This is the issue a UN agency homed in on in a new study, which found that assigning female genders to virtual assistants such as Amazon's Alexa and Apple's Siri reinforces negative gender biases.

The report, conducted by the United Nations Educational, Scientific and Cultural Organisation (Unesco), found that by using a female voice for AI-powered assistants, gender stereotypes were being perpetuated.

“It sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’,” the study says.

"The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment."

The report also noted the coquettish intonation of some programmed responses, such as Siri's "I'd blush if I could", uttered in reply to certain explicit statements.

"Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation,” the report says.

While Apple and Amazon allow users to opt for a male voice, the default voice is that of a woman.

The Amazon Echo range of devices provides the platform for the female-voiced assistant Alexa. AP / Mike Stewart


The report calls on technology companies to stop making voice assistants female by default, and urges them to develop a gender-neutral option.

Unesco also encouraged companies to dissuade users from gender-based insults and abuse by using appropriate, rather than flirtatious, responses.

Earlier this year, a group of linguists, technologists and sound designers revealed Q, a genderless digital voice.

Updated: May 22, 2019 12:27 PM
