Soon Siri will no longer have a female voice by default – here’s why that’s so important


Hey Siri, why did you and your fellow AI assistants have to be female for so long? Virtual assistants built into the smartphones and smart-home devices that millions of people around the world now depend on have used female voices by default. Close your eyes and imagine the sound of Amazon’s AMZN, +1.30% Alexa, Alphabet’s GOOG, +2.87% Google Assistant and Microsoft’s MSFT, +2.24% Cortana in your head; if you are an American, you are probably hearing the female voice that was set as the default when you first activated the device.

But some tech companies are finally pushing back on the old-fashioned stereotype of a subservient woman who caters to a person’s every need. Most recently, Apple AAPL, +0.99% announced Wednesday that Siri will no longer default to a female-sounding voice in the latest beta version of iOS, according to a TechCrunch report. Instead, anyone setting up Siri on an iOS device will choose from the start among a variety of voices that vary by tone and regional accent. Apple is also adding two new Siri voices for English speakers.

Apple currently lets U.S. device users choose between a male and a female voice, along with six accents including American, British and Indian. But the default in the U.S. is a female voice (while in some countries, such as the U.K., the default is male), and you have to go into your settings to change it. Apple representatives were not immediately available for comment. But in a statement shared with TechCrunch, the company said: “We are excited to introduce two new Siri voices for English speakers and the option for Siri users to select the voice they want when setting up their device. This is a continuation of Apple’s commitment to diversity and inclusion, and products and services that are designed to better reflect the diversity of the world we live in.”

It is a small but important step toward gender equality. A 2019 United Nations report warned that female-voiced artificial-intelligence assistants perpetuate the idea that “women are helpful, docile and eager-to-please helpers, available at the touch of a button or a forceful voice command.” Female AI assistants are even sexually harassed, as Wired and Quartz have reported, and the UN noted that these assistants have often been programmed to give passive or polite responses to sexually suggestive or abusive comments.
For example, before the #MeToo movement, if you called Siri a “bitch,” she would respond with “I’d blush if I could” or “There’s no need for that.” Quartz noted that Siri has since been updated to say “Language!” in response. “What emerges is an illusion that Siri, an unfeeling, unknowing and non-human string of computer code, is a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment. It projects a digitally encrypted ‘boys will be boys’ attitude,” the UN wrote. Additionally, a 2017 industry report from Mindshare found that more than a quarter of users fantasize about having sex with their voice assistant.

“Our technologies reflect our culture,” Dr. Miriam E. Sweeney, an assistant professor at the University of Alabama who specializes in digital media, previously told MarketWatch. “And the fact that we end up with female voices, or women portrayed in these various types of service roles, actually reinforces the feminization of a certain kind of service labor [like being a personal assistant or working in a call center] that is often seen as less skilled, less valuable and paid less.”

The UN has recommended that companies and governments stop making digital assistants female by default, and it suggested making these voices “neither male nor female,” as well as programming assistants to discourage abusive or sexist language. Some of the early changes: Alexa now shuts down sexually harassing or explicit questions with “I won’t answer that” or “I’m not sure of the outcome you expected.” And many devices have been expanding the voices they offer or adding more male voices, with celebrities such as Samuel L. Jackson speaking on Alexa and John Legend lending his voice to the Google Assistant in the past. Amazon’s recent Super Bowl commercial also flipped the sexualization of AI assistants by casting Michael B. Jordan as the “body” of Alexa.