Arizona mother describes AI phone scam faking daughter's kidnapping: 'It was completely her voice'
Arizona mother Jennifer DeStefano recounted a terrifying experience when phone scammers used artificial intelligence technology to make her think her teenage daughter was kidnapped.
Local news channel KPHO reported the story on Monday, with DeStefano describing a recent incident in which she received a call from an unknown number. Because her daughter was on a ski trip at the time, she answered the call out of concern that there had been an accident.
DeStefano explained, "I pick up the phone and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing. I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying."
"This man gets on the phone and he’s like, ‘Listen here. I’ve got your daughter. This is how it’s going to go down. You call the police, you call anybody, I’m going to pop her so full of drugs. I’m going to have my way with her and I’m going to drop her off in Mexico.’ And at that moment, I just started shaking. In the background she’s going, ‘Help me, Mom. Please help me. Help me,’ and bawling," she continued.
DeStefano happened to be at her other daughter's dance studio at the time, where fellow mothers assisted her by calling 911 as well as DeStefano's husband. After a few minutes, they were able to confirm that DeStefano's daughter was safe. Even so, DeStefano described feeling shaken by the experience.
"It was completely her voice. It was her inflection. It was the way she would have cried. I never doubted for one second it was her. That’s the freaky part that really got me to my core," she said.
The call came amid a rise in "spoofing" schemes in which fraudsters use voice-cloning technology to claim they have kidnapped loved ones and demand ransom money. A TikTok user named Chelsie Gates garnered more than 2.5 million views on a video recounting her own experience in December.
"I was literally shaking during all of this," Gates said. "[I was] imagining my mom being held hostage at gunpoint at a patient’s house."
Computer science professor Subbarao Kambhampati warned that these stories of voice-cloning technology and catfish schemes could become more common as AI technology improves.
"In the beginning, it would require a larger amount of samples. Now there are ways in which you can do this with just three seconds of your voice. Three seconds. And with the three seconds, it can come close to how exactly you sound," Kambhampati said.
Kurt Knutsson, the CyberGuy, contributed an article to Fox News Digital that offered advice for avoiding voice-cloning scams, including not answering calls from unknown numbers, removing personalized voicemail greetings and even avoiding posting videos online.
"Be careful what you post online. I know we all love sharing videos of good times with loved ones on our social accounts, however, you should consider making your account or those specific posts private so that only people you're friends with can see them," Knutsson wrote.