Amazon’s Alexa could soon imitate the voices of deceased relatives – Port Alberni Valley News
Amazon’s Alexa could soon reproduce the voices of family members, even if they are dead.
The capability, unveiled at Amazon’s Re:Mars conference in Las Vegas, is still in development; it would allow the virtual assistant to mimic a specific person’s voice from a provided recording of at least one minute.
Rohit Prasad, Alexa’s senior vice president and chief scientist, said at the event on Wednesday that the desire behind the feature was to build trust in user interactions with Alexa by adding more “human attributes of empathy and affect.”
“These attributes have become even more important during the ongoing pandemic as so many of us have lost those we love,” Prasad said. “While AI can’t take away that pain of loss, it can certainly make their memories last.”
In a video released by Amazon at the event, a young child asks “Alexa, can Grandma finish reading The Wizard of Oz to me?” Alexa then recognizes the request and switches to another voice imitating the child’s grandmother. The voice assistant then continues to read the book with that same voice.
To create the feature, Prasad said the company needed to learn how to create “high quality voice” with a shorter recording time, as opposed to hours of studio recording. Amazon didn’t provide more details about the feature, which is sure to spark more privacy concerns and ethical questions about consent.
Amazon’s push comes as competitor Microsoft earlier this week said it was reducing its synthetic voice offerings and establishing stricter guidelines to “ensure the active participation of the speaker” whose voice is recreated. Microsoft said Tuesday it was limiting the number of customers who can use the service — while continuing to highlight acceptable uses such as an interactive Bugs Bunny character in AT&T stores.
“This technology has exciting potential in education, accessibility and entertainment, yet it is also easy to imagine how it could be used to impersonate speakers inappropriately and deceive listeners,” said a blog post by Natasha Crampton, who leads Microsoft’s AI ethics division.
THE ASSOCIATED PRESS