Mar 07, 2023

Older couples unwittingly scammed by AI voice technology

An elderly couple in Canada nearly lost thousands of dollars to scammers after receiving a fake, AI-generated phone call from someone who sounded just like their grandson, pleading for bail money to get him out of jail.

Artificial intelligence (AI) has been replicating just about anything and everything, from artwork to essays, but it was this completely unexpected phone call that caught Ruth Card, 73, and her husband Greg, 75, off guard.

Ruth received a phone call from her grandson Brandon who said he was stuck in jail without his phone or wallet and he needed money for bail. As any doting grandparents would, Ruth and Greg rushed to the bank to withdraw $3,000.

Because their grandson had asked for more than their daily withdrawal limit allowed, the elderly couple went to a second bank branch to take out the rest. But that was where their desperate rush to help Brandon came to an abrupt end.

That branch’s manager had seen the same thing happen to another patron: the same call, the same request for money. And even though the voice was incredibly similar to Brandon’s, the manager’s suspicion proved to be on the money.

He saved Ruth and Greg from a costly mistake. 

“We were sucked in,” Ruth told The Washington Post. “We were convinced that we were talking to Brandon.”

Another elderly US couple received a similarly disturbing call, this time from a lawyer who claimed their son, Perkin, had been arrested after killing an American diplomat in a car crash and needed money for legal fees.

They even spoke to their son – or so they thought – and while they had some doubts, the call was convincing enough to force their hand, and they sent $21,000.

“The money’s gone. There’s no insurance. There’s no getting it back. It’s gone,” Perkin told The Washington Post.

Unfortunately, older people being targeted by scams is nothing new, whether it’s an older man caught up in an online dating scam ahead of Valentine’s Day, couples targeted by drug smugglers when travelling, or fake tradies going door to door.

But the use of AI technology to mimic the voices of people we know is completely new, and while these impersonation scams are yet to take hold in Australia, they are rising to prominence in North America.

The exact methods the scammers used to target the Card family are unknown, but it’s possible Perkin’s voice was copied from YouTube videos he had posted online.

That’s all AI voice-generating software needs these days: it is now advanced enough to replicate a voice from just a few short snippets. Once it has heard a voice, it searches online databases of similar voices to predict speech patterns and recreate an eerily close match to the voice it’s mimicking.

“It’s terrifying,” said Hany Farid, professor of digital forensics at the University of California at Berkeley. 

“It’s sort of the perfect storm [with] all the ingredients you need to create chaos.

“Two years ago, even a year ago, you needed a lot of audio to clone a person’s voice, now if you have a Facebook page or if you’ve recorded a TikTok and your voice is in there for 30 seconds, people can clone your voice.”

Tech startups are already spruiking the technology’s benefits, and it’s now possible to have conversations with the likes of Sherlock Holmes or Donald Trump.

“I thought, ‘Let’s build a product now that can help millions and billions of people,’” Noam Shazeer, Character.ai co-founder, told The Washington Post.

“Especially in the age of COVID-19, there are just millions of people who are feeling isolated or lonely or need someone to talk to.”

If you believe you have received any form of communication that could be a scam, report it to the Australian Competition & Consumer Commission (ACCC). HelloCare also has some tips on how to avoid online shopping scams and stop those pesky text message scams.
