Scammers use voice-cloning AI to impersonate relatives

You could very well receive a call in the near future from a loved one who is in urgent need of help, asking you to send them some money quickly. And you might be convinced it’s them because, well, you know their voice.

Artificial intelligence is changing that. New generative AI tools can create all kinds of output from simple text prompts, including essays written in the style of a particular author, images worthy of art awards and – with just a snippet of someone’s voice to work with – speech that sounds convincingly like a specific person.

In January, Microsoft researchers demonstrated a text-to-speech AI tool that, when given just a three-second audio sample, can closely simulate a person’s voice. They didn’t share the code for others to play with; instead, they warned that the tool, called VALL-E, “could carry potential risks if misused… such as voice impersonation or impersonation of a specific speaker”.

But similar technology is already in the wild and crooks are taking advantage of it. If they can find 30 seconds of your voice somewhere online, chances are they can clone it and make it say anything.

“Two years ago, even a year ago, it took a lot of audio to clone a person’s voice. Now… if you have a Facebook page… or if you recorded a TikTok and your voice is there for 30 seconds, people can clone your voice,” Hany Farid, professor of digital forensics at the University of California, Berkeley, told the Washington Post.

“The Money’s Gone”

The Washington Post reported this weekend on the peril, describing how a Canadian family fell victim to scammers using AI voice cloning and lost thousands of dollars. A ‘lawyer’ told the elderly parents that their son had killed a US diplomat in a car accident, was in jail and needed money for legal fees.

The supposed lawyer then handed the phone over to the son, who allegedly told the parents he loved and appreciated them and needed the money. The cloned voice was “close enough that my parents actually believed they spoke to me,” the son, Benjamin Perkin, told the Post.

The parents sent over $15,000 through a bitcoin terminal – not to their son, as they thought, but to the crooks.

“The money is gone,” Perkin told the newspaper. “There is no insurance. There’s no way to get it back.”

A company that offers a generative AI voice tool, ElevenLabs, tweeted on January 30 that it was seeing “an increasing number of cases of voice cloning misuse”. The next day it announced that the voice cloning capability would no longer be available to users of the free version of its tool, VoiceLab.

Fortune contacted the company for comment, but did not receive an immediate response.

“Almost all of the malicious content was generated by free, anonymous accounts,” the company wrote. “Additional ID verification is required. For this reason, VoiceLab will only be available on paid tiers.” (Subscriptions start at $5 per month.)

Card verification won’t stop all bad actors, the company acknowledged, but it would make users less anonymous and “force them to think twice”.

