
The call is coming from inside the internet: AI voice scams on the rise with cloning tech

These next-gen bandits come armed with sophisticated new tools and techniques to steal hundreds of thousands of dollars from people like you and me. 

“If you get a call in the middle of the night and it sounds exactly like your panicked child or grandchild saying, ‘help, I was in a car accident, the police found drugs in the car, and I need money to post bail (or for a retainer for a lawyer),’ it’s a scam,” fraud expert Talcove explained.

Talcove recommends having a family password that only you and your closest inner circle share. Don’t make it anything easily discovered online either – no names of pets, favorite bands, etc. Better yet, make it two or three words that you discuss and memorize. If you get a call that sounds like a loved one, ask them for the code word or phrase immediately. 

If the caller pretends to be law enforcement, tell them you have a bad connection and will call them back. Ask the name of the facility they’re calling from (campus security, local jail, the FBI), and hang up (even though scammers will say just about anything to get you to stay on the line). If you can’t reach your loved one, look up the phone number of that facility or call your local law enforcement and tell them what’s going on. 

Also, be more careful than ever about what information you put out into the world.

“A criminal only needs three seconds of audio of your voice to ‘clone’ it,” Talcove warns. “Be very careful with social media. Consider making your accounts private. Don’t reveal the names of your family or even your dog. This is all information that a criminal armed with deepfake technology could use to fool you or your loved ones into a scam.”

Talcove shared a half dozen “how-to” video clips he says he pulled from the dark web showing these scams in action. He explained that criminals often sell information on how to create these deepfakes to other fraudsters. 

“I keep my eyes on criminal networks and emerging tactics. We literally monitor social media and the dark web and infiltrate criminal groups,” he added. “It’s getting scary. For example, filters can be applied over Zoom to change somebody’s voice and appearance. A criminal who grabs just a few seconds of audio from your [social media feeds], for example, can clone your voice and tone.” 

I skipped all the organized crime parts and just Googled “AI voice clone.” I won’t say exactly which tool I used, but it took me less than ten minutes to upload 30 seconds of my husband’s voice from a video saved on my smartphone to a free AI audio generator online. I typed in a few funny lines I wanted “him” to say, saved the result on my laptop, and texted it to our family. The most challenging part was converting the original clip from a .mov to a .wav file (and that’s easy too).
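That conversion step is trivial to reproduce. As a rough illustration only, the short Python sketch below hands the job to ffmpeg; it assumes ffmpeg is installed and on your PATH, and the file names are placeholders, not the tool or files the author actually used.

import subprocess

# Extract the audio track from a short .mov clip and save it as a .wav file.
# Assumes ffmpeg is installed and on the PATH; file names are placeholders.
def mov_to_wav(src="clip.mov", dst="clip.wav"):
    subprocess.run(
        [
            "ffmpeg",
            "-i", src,               # input video clip
            "-vn",                   # drop the video stream, keep only the audio
            "-acodec", "pcm_s16le",  # uncompressed 16-bit WAV audio
            "-ar", "44100",          # 44.1 kHz sample rate
            dst,
        ],
        check=True,                  # raise an error if ffmpeg fails
    )

mov_to_wav()

Plenty of free drag-and-drop converters do the same thing in a browser, which is part of what makes this kind of cloning so accessible.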

“We’re all vulnerable, but the most vulnerable among us are our parents and grandparents,” Talcove says. “99 in 100 people couldn’t detect a deepfake video or voice clone. But our parents and grandparents, categorically, are less familiar with this technology. They would never suspect that the voice on the phone, which sounds exactly like their child screaming for help during a kidnapping, might be completely artificial.”

Category: Technology

Source: USA Today Technology
