Why do our digital assistants such as Alexa, Google Home, Siri and Cortana have “feminized” voices, and what are the effects of this trend? That’s what I explore in this episode. Are there negative effects of giving female voices to the devices we talk to, and that talk to us? Are there alternatives? It turns out there is one: a “genderless” voice. What does that sound like? Tune in to find out as we explore gender roles, expectations and equality.
- I’d Blush If I Could
- We tested bots like Siri and Alexa to see who would stand up to sexual harassment
- Why Siri and Alexa Weren’t Built to Smack Down Harassment
- Hey Siri, stop perpetuating sexist stereotypes, UN says
- Is it time for Alexa and Siri to have a “MeToo moment”?
- Female voice assistants fuel damaging gender stereotypes, says a UN study
The reason digital assistants acquiesce to harassment isn’t just sexism or gender inequality in the tech world, as disturbing and prevalent as those may be. No, the explanation lies elsewhere, I believe. These machines are meant to manipulate their users into staying connected to their devices, and that focus on manipulation must be laser-like. To clearly state that harassment toward digital assistants is unacceptable would mean having some standard, some line that can’t be crossed. And one line leads to another, and soon the user is distracted from selling/buying merchandise, collecting/sharing data, and allowing a device to become ensconced in their life.
The moral standard most compatible with engagement is absolute freedom of expression, the standard of having no standards.
– Noam Cohen, “Why Siri and Alexa Weren’t Built to Smack Down Harassment”