
How audio deepfakes can trick employees into falling for a phish

They’re a “labor of love” for now, but audio deepfakers have had success.



Audio dupes have been around since Kevin McCallister needed a hotel room in New York City, but voice mimicry is easier and more convincing than what was possible with a Talkboy in 1992.

Today’s accessible AI tools can machine-learn vocal patterns and offer realistic-sounding audio of, say, President Biden reviewing We Bought a Zoo, or a bunch of presidents playing Battlefield 2042.

Many deepfake demos get laughs, but audio impersonation can also help scammers get cash, and some IT pros say companies should prepare for a, shall we say…phoicemail.

“Think of this as an early-entry attack vector in a persistent attack from an advanced attacker,” said Tim Callan, chief experience officer at the cybersecurity provider Sectigo.

Straight to voicemail. “I hate to do this, but I’m on the road, working on a major acquisition. This is all hush-hush, but I need you to wire $300,000…”

That’s an example, according to Callan, of what someone who sounds like your CEO might say in an audio-deepfake scenario. A seemingly urgent verbal message could also convince an employee to send company data or open a malicious email.

For now, such deepfake attacks are a “labor of love,” said Callan, requiring the collection of audio, construction of a message, and available tools that complete the impersonation.

The imposter option arrives just as AI tools like ChatGPT have helped phishers fix up their typos.

“With things like GPT out there, a lot of the telltale signs of bad phishing attacks are completely disappearing. The idea that you can spot things because they were written by people with poor English skills…that’s all gone,” said Steve Wilson, chief product officer at the platform provider Contrast Security, during a panel at April’s RSA conference.

Deepfakes, real threats? Two-thirds of respondents in a 2022 VMware report said they had witnessed malicious deepfakes used in an attack.

In March 2019, criminals employed AI-based software to impersonate the voice of a German parent company’s chief executive, convincing the CEO of its UK-based energy subsidiary to transfer €220,000.

Callan, Tom Etheridge, chief global professional services officer at CrowdStrike, and Phil Quitugua, director at the tech advisory ISG, haven’t seen real-world audio attacks become prevalent…yet.

Quitugua envisions deepfakes deployed alongside other tactics, like a bogus email invoice, to enhance legitimacy.

“The audio is going to be sort of complementary to some of the other types of attack techniques,” Quitugua told IT Brew.

Many companies have standard processes to combat the manufactured urgency of a spearphishing attack, like a callback-verification step for new wire transfers.
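To illustrate the idea, here’s a minimal sketch of what such a callback gate could look like if expressed as code. Every name in it (WireRequest, fetch_number_on_file, place_callback, the $10,000 threshold) is hypothetical; in practice this control lives in a payments workflow and a human places the call.

```python
# Minimal sketch of a callback-verification gate for wire requests.
# All names and the threshold are hypothetical, for illustration only.

from dataclasses import dataclass

@dataclass
class WireRequest:
    requester: str    # who asked for the transfer (e.g., a "CEO" voicemail)
    beneficiary: str  # destination account holder
    amount: float     # requested amount in dollars

CALLBACK_THRESHOLD = 10_000  # hypothetical: verify anything above this

def fetch_number_on_file(name: str) -> str:
    """Look up the requester's phone number in an internal directory,
    never trusting contact details supplied by the request itself."""
    directory = {"CEO": "+1-555-0100"}  # stand-in for an HR system
    return directory[name]

def place_callback(number: str, request: WireRequest) -> bool:
    """Call the requester back on the number on file and confirm the
    transfer verbally. Stubbed here; a human performs this step."""
    print(f"Calling {number} to confirm ${request.amount:,.2f} "
          f"to {request.beneficiary}...")
    return False  # default to 'not confirmed' until a person says yes

def approve_wire(request: WireRequest) -> bool:
    if request.amount < CALLBACK_THRESHOLD:
        return True
    number = fetch_number_on_file(request.requester)
    return place_callback(number, request)

# A deepfaked "CEO" voicemail requesting $300,000 fails this gate unless
# the real CEO confirms on a callback to the number on file.
print(approve_wire(WireRequest("CEO", "Acme Holdings", 300_000)))
```

The key design choice is that the callback number comes from an internal directory, never from the message that made the request, so a deepfaked voicemail can’t supply its own “verification” number.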

The rarity of audio attacks will make them effective, said Callan, even against employees who are well aware of email and text scams that appear to come from the “CEO” or another exec.

“The first thing is getting people to kind of have the same common-sense guard they have for…an email-enabled or text-enabled spear phishing attack,” Callan told IT Brew.

Or a kid trying to get a hotel room with his parents’ credit card.
