“We were sucked in. We were convinced that we were talking to Brandon.”
Bail Out
Ruthless scammers are always looking for the next big con, and they might’ve found it: using AI to imitate your loved ones over the phone.
When 73-year-old Ruth Card heard what she thought was the voice of her grandson Brandon on the other end of the line saying he needed money for bail, she and her husband rushed to the bank.
“It was definitely this feeling of… fear,” Card told The Washington Post. “That we’ve got to help him right now.”
The couple withdrew the daily maximum of 3,000 Canadian dollars at one bank and went to another for more. Fortunately, a vigilant bank manager flagged them down and warned them that another customer had gotten a similar phone call that sounded like it was from a loved one, only for the voice to turn out to be fake.
“We were sucked in,” Card said. “We were convinced that we were talking to Brandon.”
Legal Trouble
Not everyone was so lucky. Benjamin Perkin, 39, told WaPo how his elderly parents were swindled out of thousands of dollars with the help of an AI impersonator.
Perkin’s parents had received a phone call from a lawyer, who claimed that their son had killed a US diplomat in a car crash and needed money for legal fees. The supposed lawyer then put “Perkin” on the phone, and the voice sounded just like him.
That was enough to convince them. When the lawyer later called back asking for CAD $21,000, Perkin’s parents went to the bank and sent the money via Bitcoin.
“The money’s gone,” Perkin told the paper. “There’s no insurance. There’s no getting it back. It’s gone.”
Easy Pickings
Voice cloning scams have been a threat for several years now. But the growing ubiquity of powerful, easy-to-use AI means that the technology’s potential for abuse is quickly outpacing an unwitting public’s ability to keep up with the tricks of bad actors, who may already be targeting them.
“Two years ago, even a year ago, you needed a lot of audio to clone a person’s voice,” Hany Farid, a professor of digital forensics at UC Berkeley, told WaPo. “Now… if you have a Facebook page… or if you’ve recorded a TikTok and your voice is in there for 30 seconds, people can clone your voice.”
Take ElevenLabs, whose AI voice synthesis service costs as little as $5 per month and can produce results so convincing that a journalist used it to break into his own bank account. It’s even spawned an entire genre of memes impersonating President Joe Biden. ElevenLabs’ voice cloning has only been around since 2022. Imagine the damage it and the competitors looking to ride its coattails could do in just a few more years.
More on voice cloning: Voice Actors Enraged By Companies Stealing Their Voices With AI