Cyber-kidnapping isn’t new. But advancements in AI—like mainstream access to voice cloning and deepfakes—could fuel even more copycat cases.
In typical cyber-kidnapping incidents, everything happens virtually: attackers use manipulation tactics to make it seem as if someone is being held hostage, coercing family or friends into paying a ransom to secure their release. The schemes have been around for more than 20 years; a Los Angeles FBI investigation covering 2013–2015 found that the majority of cases originated in Mexican prisons. The rise of AI has amplified these and similar schemes, as scammers have begun using voice cloning technology to sound exactly like the supposed victims.
“The more real a bad actor can make a scenario true to life, the better,” Chris Stangl, a former FBI special agent in the cyber division, told IT Brew. “A person’s likeness captured by a bad actor and then manipulated into synthetic content is a game changer in that a call to a victim is more authentic.”
Everything looks and sounds real, so the perpetrators play “into the fear that a loved one is under a threat of violence—and even death, raising the probability the victim will be tricked into keeping the issue secret, fearful of reporting to the police, and subsequently paying a ransom,” he added. In sextortion cases, bad actors take vocal clips and images from social media and manipulate them into seemingly graphic or sensitive material. In other cyber extortion or cyber-kidnapping cases, they may use voice cloning tech to fool parents into thinking their child has actually been kidnapped.
Double manipulation. In recent days, perpetrators executed a cyber-kidnapping attack on a teen studying in the US—forcing him into isolation in the woods while also extorting his parents in China for $80,000.
Stangl, who currently serves as the managing director of cybersecurity and investigations at Berkeley Research Group, said the case was extreme, and pointed to other schemes like romance scams, sextortion, and grandparent telephone scams, in which perpetrators posing as grandchildren in crisis target older relatives.
“The themes are the same: We’re going to identify a target, we’re going to make initial contact and develop that rapport—the baiting of them falling for this because they feel like their significant other has been harmed,” he said.
Secure the premises. One of the easiest ways to prevent cyber-kidnapping, Stangl said, is to practice good cyber hygiene, but he also advises users to think twice before responding to messages from potential fraudsters at all.
“What I tell people is—don’t answer that message. Don’t pick up that phone, don’t respond to that text, because that is what the bad guy wants you to do, is they want you to engage,” he said. From there, he said, the ongoing conversation often hooks the victim in.
The most effective weapon against voice cloning, according to Stangl? Cyber hygiene.
People should remain cautious when posting or sending photos, videos, and other identifying info on social media, dating apps, and other sites.
“Unfortunately, the success rate of falling prey to those types of crimes is high,” he said. “Applying privacy settings on social media accounts—including setting profiles and friends lists as private—to limit the public exposure of photos, videos, and other personal information is one of the most effective tools.”