Following charges of ‘sexism,’ Alexa has a new MALE voice in the UK — here’s how to switch.

In the United Kingdom, Amazon is releasing a male version of its Alexa voice assistant.

Brits can now choose the baritone companion to be the voice of their Echo smart speakers and displays instead of Alexa, the female assistant who has been included with the devices since 2014.

Aside from the occasional celebrity addition, it’s Alexa’s first permanent new voice.

Amazon launched the new assistant in the United States earlier this year, with the option to rename it “Ziggy.”

That means you can ask your device to do things like pause music, set a timer, or check the weather by barking “Ziggy” instead of “Alexa.”

However, the British will have to wait until next year to give their artificially intelligent assistant a new name.

To switch between Alexa’s original voice and a new voice option, simply say “Alexa, change your voice.”

A wake word is the phrase that tells an Amazon Echo device to start listening for a command. “Alexa” is the default wake word.

Saying “Alexa, change your wake word” will let you choose from the other available wake words.

The wake words currently available are Alexa, Computer, Echo, and Amazon. Users in the United Kingdom will be able to use Ziggy sometime next year.

Customers in the United States began receiving these new options on July 15. It is unclear when they will be available in other parts of the world.

Alexa, a voice-activated assistant powered by artificial intelligence, debuted alongside the first Amazon Echo speaker in 2014.

It competes with Google Assistant and Apple’s Siri, both of which provide male and female voice options as well as a variety of accents.

Activists have previously criticized voice assistants for allegedly encouraging sexism.

Unesco accused the voice assistants of being “submissive” and “flirtatious” in a 2019 report, claiming that they may be propagating the negative notion that women should be obedient.

When told “you’re a slut,” Apple’s assistant Siri responded with phrases such as “I’d blush if I could,” “Well, I never!” and “Now, now!”, according to researchers. The report also claimed that the digital assistants remain “docile helpers” even when insulted, which it argued was “entrenching” gender biases and had the potential to cause harm.

Submissive female AI assistants, according to the United Nations, promote the image of “a heterosexual female, tolerant of and occasionally encouraging of male sexual approaches and even harassment.”

In other news, Apple has revealed that beginning next year, users will be able to repair their own iPhones for the first time.

According to… Brinkwire Brief News, the United Kingdom is confronting a hacking pandemic aimed at consumers and businesses.
