Hey Siri, a UN report finds digital assistants with female voices reinforce harmful gender biases

Male engineers have built voice assistants that “cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation.”

A report from the UN agency Unesco has said that assigning female genders to popular digital assistants, including Amazon’s Alexa and Apple’s Siri, entrenches damaging gender biases, reports the Guardian. The report found that female-sounding voice assistants often returned submissive and flirty responses to user queries, which...
