Amazon will allow the voices of deceased friends and family to be used in its Alexa assistant

2022-06-23T14:06:23.884Z


The new feature, which will mimic voices from short audio, could "make memories [of deceased loved ones] last" over time, the company said.


Alexa, Amazon's virtual assistant, will offer the ability to change its voice to that of anyone the user chooses, including a deceased friend or relative, The Washington Post reported on Thursday, citing an announcement by Rohit Prasad, head scientist for Alexa artificial intelligence, during a presentation held in Las Vegas, Nevada.

"Alexa, can Grandma finish reading The Wizard of Oz to me?" asks a child in a promotional video about the new feature.

"Okay!" the Amazon device replies, before shifting to an intonation resembling that of an older woman.

Alexa will imitate voices after learning from audio of the chosen people, to help "memories [of loved ones] endure" over time, according to Prasad.

The feature is still under development, and it is unknown when it will be available, but the prospect of hearing the voices of the dead has already raised ethical and cybersecurity concerns, experts told the newspaper.


"I don't think our world is ready for easy-to-use voice cloning technology," said Rachel Tobac, director of SocialProof Security.

In her opinion, this technology could be used to manipulate the public through false audio or video clips.

"If a criminal can easily and credibly replicate another person's voice with a small voice sample, they can use that to impersonate other individuals," Tobac said.


"You can trick others into thinking you're the person you're impersonating, which can lead to fraud, data loss, account takeover and more," the security expert concluded.

There is also the danger of confusing human voices with robotic ones.

"You're not going to remember that you're talking to the depths of Amazon and its data collection services if you're talking to the voice of your grandmother or grandfather or a loved one who's gone," said Tama Leaver, a professor of Internet studies at Curtin University in Australia.


In addition, Amazon will need to resolve the issue of consent when using the voices of deceased people, Leaver added.

Prasad did not address these concerns during his presentation, but stressed that the ability to imitate human voices reflects that "artificial intelligence is living its golden age," in which "dreams and science fiction are becoming reality."

Source: telemundo
