A United Nations report indirectly accuses smart assistant providers like Apple, Google and Microsoft of reinforcing gender bias by using female assistant voices by default.

Apple’s Siri, Microsoft’s Cortana, Google’s Assistant on Home speakers and Amazon’s Alexa are by far the most popular digital assistants out there, and all of them default to a female voice. Some assistants use a female voice exclusively, like Alexa, while others, like Siri, let the user change the voice gender in Settings.

In some cases, an assistant’s default voice gender depends on the user’s market, and Apple is a good example of that—Siri uses a female voice in most countries, but it defaults to a male voice when the system language is set to Arabic, French, Dutch or British English.

From the report, titled “I’d blush if I could”:

Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’. The assistant holds no power of agency beyond what the commander asks of it.

It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.

The title of the report (“I’d blush if I could”) used to be one of Siri’s responses to being addressed as a slut (another one: “Well, I never!”), as noted by 9to5Mac, but Apple has since changed those responses to “I don’t know how to respond to that.”

A female AI helper can also give kids the wrong idea about the role of women in our society, potentially suggesting that it’s normal for women, girls and female-gendered individuals to respond on demand.

According to Calvin Lai, a Harvard University researcher who studies unconscious bias, the gender associations people adopt are contingent on the number of times people are exposed to them. As female digital assistants spread, the frequency and volume of associations between ‘woman’ and ‘assistant’ increase dramatically.

According to Lai, the more that culture teaches people to equate women with assistants, the more real women will be seen as assistants – and penalized for not being assistant-like. This demonstrates that powerful technology can not only replicate gender inequalities, but also widen them.

I’m not sure what to make of this report beyond the fact that Apple, Google, Microsoft and Amazon are well aware of the cultural subtext of all this—otherwise, Siri’s default voice gender wouldn’t depend on your region. I’m less convinced, though, that they appreciate how all-female assistant voices can, and probably do, reinforce gender bias, especially among kids who may take them as proof of a connection between a woman’s voice and subservience.

Do female assistant voices really reinforce Western gender stereotypes? What’s your take on this report? Be sure to chime in with your thoughts in the comments section down below.