The article on Digital Trends titled "Alexa, Why Aren’t You a Dude? How Female Digital Assistants Reinforce Stereotypes" investigates the overwhelming tendency toward female personas for digital assistants, from Alexa to Siri to Cortana to Ok, Google. Mansplainers beware: the author notes that Siri allows for different genders and accents, but the more important point is the default equation of women with subservience. The article states,
Both Apple and Google have stated a desire to make their digital assistants more sophisticated, giving users a sense of a relationship rather than a device. It’s a potentially troublesome phenomenon, as the makers of anthropomorphic assistants accent non-threatening and subservient qualities to achieve social acceptance. Scarier still is the idea that digital assistants are not only reflecting gender bias, but causing it. Kids are already anthropomorphizing their robot friends, and also bossing them around…
The article is chock-full of quotes from smart people calling for an end to defaulting to female voices, or for improving the design with social equality in mind. But who is really designing these digital assistants? We know that women are vastly underrepresented in Silicon Valley, and it is an unfortunate reality that the people driving these huge cultural influences may have no concept of their own bias. They have created a dream woman for men and a waking nightmare for women.
Chelsea Kerwin, May 22, 2017