A scammer calls and asks for a passcode, leaving Malcolm, an older man with a British accent, confused.
“What business are you talking about?” Malcolm asks.
Another scam call comes in.
This time, Ibrahim, cooperative and polite with an Egyptian accent, answered the phone. “To be honest, I can’t really remember if I’ve bought anything recently,” he told the scammer. “Maybe one of my kids did,” Ibrahim continued, “but it’s not your fault, is it?”
Scammers are real, but Malcolm and Ibrahim aren’t. They’re just two of the conversational artificial intelligence bots created by Professor Dali Kaafar and his team, who founded Apate, named after the Greek goddess of deception, through his research at Macquarie University.
Apate’s goal is to use conversational AI to eradicate phone fraud worldwide, leveraging existing systems that allow telecommunications companies to redirect calls once they identify them as coming from scammers.
Kaafar was inspired to strike back at phone scammers after he kept one on the line with a “dad joke” while his two children enjoyed a picnic in the sun. His pointless chatter wasted the scammer’s time. “The kids had a good laugh,” Kaafar says. “And I thought, the goal here is to trick them into wasting their time so they can’t talk to other people.
“In other words, we’re scamming the scammers.”
The next day, he called in his team from the university’s Cybersecurity Hub. He figured there had to be a better way than his dad joke approach — and something smarter than a popular existing technology: Lennybot.
Before Malcolm and Ibrahim, there was Lenny.
Lenny is a rambling, elderly Australian man who loves to chatter away. He’s a chatbot designed to poke fun at telemarketers.
Lenny’s anonymous creator wrote on Reddit that they designed the chatbot as “a telemarketer’s worst nightmare… a lonely old man who wants to chat and is proud of his family, but can’t focus on the telemarketer’s purpose.” The practice of tying up scammers like this is known as scambaiting.
Apate bot to the rescue
Australian telecommunications companies have blocked almost 2 billion scam calls since December 2020.
Thanks to $720,000 in funding from the Office of National Intelligence, the “victim chatbots” could now number in the hundreds of thousands, too many to name individually. The bots are of different “ages”, speak English with different accents, and exhibit a range of emotions, personalities and reactions: sometimes naive, sometimes sceptical, sometimes rude.
Once a carrier detects a fraudster and routes them to a system like Apate, bots go to work to keep them busy. The bots try different strategies and learn what works to keep fraudsters on the phone line longer. Through successes and failures, the machines fine-tune their patterns.
In this way, the bots collect information such as the length of calls, the times of day when scammers are likely to call, what information they are after and the tactics they are using, and that intelligence can be mined to detect new scams.
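The learning loop described above, trying different strategies and reinforcing whichever keeps scammers talking longest, resembles a classic multi-armed bandit problem. Below is a minimal illustrative sketch, not Apate’s actual code: the persona names and the simulated call durations are invented, and a simple epsilon-greedy rule stands in for whatever learning method the real system uses.

```python
import random

class StrategySelector:
    """Epsilon-greedy bandit: usually pick the persona with the best
    average call duration so far, but occasionally explore the others."""

    def __init__(self, strategies, epsilon=0.1, seed=None):
        self.rng = random.Random(seed)
        self.epsilon = epsilon
        # Running call counts and average call durations per strategy.
        self.counts = {s: 0 for s in strategies}
        self.avg_duration = {s: 0.0 for s in strategies}

    def choose(self):
        # Explore with probability epsilon, otherwise exploit the best so far.
        if self.rng.random() < self.epsilon:
            return self.rng.choice(list(self.counts))
        return max(self.avg_duration, key=self.avg_duration.get)

    def record(self, strategy, call_seconds):
        # Incremental mean update after each finished call.
        self.counts[strategy] += 1
        n = self.counts[strategy]
        self.avg_duration[strategy] += (call_seconds - self.avg_duration[strategy]) / n


# Hypothetical personas; the "true" mean durations are simulated, not real data.
selector = StrategySelector(["naive", "sceptical", "rude"], epsilon=0.1, seed=42)
true_mean = {"naive": 300, "sceptical": 180, "rude": 60}
for _ in range(2000):
    s = selector.choose()
    duration = max(0.0, selector.rng.gauss(true_mean[s], 30))
    selector.record(s, duration)

best = max(selector.avg_duration, key=selector.avg_duration.get)
print(best)
```

After a few thousand simulated calls the selector converges on the persona that holds scammers longest, mirroring the trial-and-error tuning the article describes.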
Kaafar hopes Apate will disrupt the call-fraud business model, which is often run by large, multibillion-dollar criminal organisations. The next step will be to use the information the bots collect to warn of scams proactively and take action in real time.
“We’re talking about real criminals who are making our lives miserable,” Kaafar says. “We’re talking about the risks to real people.”
“Sometimes people lose their life savings, struggle to live because of debt, and sometimes suffer mental trauma [from] shame.”
Richard Buckland, a cybercrime professor at the University of New South Wales, says techniques like Apate’s are different from other forms of scambaiting, some of which are amateurish or amount to vigilantism.
“Usually scambaiting is problematic,” he says, “but this is sophisticated.”
He says mistakes can happen when individuals go it alone.
“You can go after the wrong person,” he said. Many scams are perpetrated by people in near-slave-like conditions, “and they’re not bad people,” he said.
“[And] some scambaiters go even further and try to enforce the law themselves, either by hacking back or engaging with the scammers. That’s a problem.”
But the Apate model appears to be using AI for good, as a kind of “honeypot” to lure criminals and learn from them, he says.
Buckland warns that false positives happen everywhere, so telcos need a high level of confidence that only scammers are being diverted to the AI bots, and that criminal organisations could use anti-fraud AI technology to train their own systems.
“The same techniques used to deceive scammers can be used to deceive people,” he says.
Scamwatch is run by the National Anti-Scam Centre (NASC) under the auspices of the Australian Competition and Consumer Commission (ACCC), and an ACCC spokesperson said scammers often impersonate well-known organisations and spoof legitimate phone numbers.
“Criminals create a sense of urgency to encourage their targeted victims to act quickly,” the spokesperson said, “often trying to convince victims to give up personal or bank details or provide remote access to their computers.”
“Criminals may already have detailed information about their targeted victims, such as names and addresses, obtained or purchased illegally through data breaches, phishing or other scams.”
This week Scamwatch had to issue a warning about what appears to be a meta scam.
Scammers claiming to be NASC officials were calling innocent people and saying they were under investigation for allegedly engaging in fraud.
The NASC says people should hang up the phone immediately if they are contacted by a scammer. The spokesperson said the centre is aware of “technology initiatives to productise fraud prevention using AI voice personas”, including Apate, and is interested in evaluating the platform.
Meanwhile, there is a thriving community of scambaiters online, and Lenny remains one of their cult heroes.
One memorable recording shows Lenny asking a caller to wait a moment. Ducks start quacking in the background. “Sorry,” Lenny says. “What were you talking about?”
“Are you near the computer?” the caller asks impatiently. “Do you have a computer? Can you come by the computer right now?”
Lenny carries on until the scammer loses it. “Shut up. Shut up. Shut up.”
“Can we wait a little longer?” Lenny asks, as the ducks begin quacking again.
Source: www.theguardian.com