
UN report says female-voiced virtual assistants hurt women

A UNESCO report released on Friday asserts that digital voice assistants with female voices by default (Google Assistant, Alexa and Siri) reinforce damaging stereotypes about women. The report alleges that the default female voice supports a stereotype of women as “obliging, docile and eager to please helpers” who make themselves “available at the touch of a button or with a blunt voice command”.

Virtual assistants like Alexa and Siri are often personified in marketing material as distinctly female, wise-cracking AI helpers clearly inspired by science fiction. Even Microsoft’s virtual assistant Cortana is named after a spunky, sexualised military AI assistant from the company’s flagship video game series Halo.

The UN report asserts that having these virtual assistants represented as women by default is also problematic when people begin to ‘abuse’ their smart-home devices. Because the AI assistant cannot defend itself, the report says, it reinforces the notion that women are “subservient and tolerant of poor treatment”.

Both Cortana and Siri have female voices by default but offer male voices that users can switch to. Alexa has many accents to choose from, but all of them are female. Google Assistant offers a variety of voices, and Google says new users have a 50% chance of being assigned a male or female voice the first time they use it.

The report goes on to say that this trend is indicative of a sexist culture within the digital technology field that limits opportunities for women. It cites Siri’s former default response to being called a “bitch”, which was “I’d blush if I could”, as demonstrative of attitudes towards women within technology companies.