Voice generation AI blows through bank’s voice ID security

Using voice as a password might not be wise when anyone can clone a voice with a short sample and a few bucks.
Artemisdiana/Getty Images

Wanna know the hottest new way to pull off a bank heist? Call it “speaking and entering.” A reporter, Joseph Cox, was able to break into his Lloyds Bank account by tricking its automated phone service line with a machine-generated mimicry of his own voice, demonstrating how weak an authentication factor voice is in the era of AI.

According to Cox’s report in Motherboard, the process was simple: he created a voice sample roughly five minutes long and uploaded it to ElevenLabs, a generative AI company that creates ultra-realistic copies of people’s voices. After several attempts and a few tweaks, Lloyds’s voice ID system eventually accepted a fake clip of the reporter saying, “My voice is my password” as genuine.

The only other form of authentication needed to get into the bank account was the target’s birth date, information that’s not exactly difficult to acquire. Although Cox had easy access to his own voice sample, Motherboard reported that replicating a person’s voice takes only a few minutes of audio, which can feasibly be gleaned from an online clip.

Other banks, including TD Bank, Chase, and Wells Fargo, use a similar voice ID service, according to Motherboard. The clip was generated before ElevenLabs introduced features intended to cut down on abuse, such as requiring identity verification to access more powerful voice tools.

Lloyds told Motherboard in a statement: “Voice ID is an optional security measure, however we are confident that it provides higher levels of security than traditional knowledge-based authentication methods, and that our layered approach to security and fraud prevention continues to provide the right level of protection for customers’ accounts, while still making them easy to access when needed.”

Rick McElroy, principal cybersecurity strategist at VMware Carbon Black, told IT Brew that voice authentication systems are primarily in use at businesses like Lloyds that offer customers biometric passwords, but are also sometimes used internally at organizations for purposes like help desk access. He said audio deepfakes serve two primary malicious purposes: tricking those automated systems, and acting as an alternate or supplementary form of deception in wire scams and other types of fraud.

Top insights for IT pros

From cybersecurity and big data to cloud computing, IT Brew covers the latest trends shaping business tech in our 4x weekly newsletter, virtual events with industry experts, and digital guides.

“I would tell you that it is absolutely growing in prevalence, but I don’t know that any one organization would have the exact numbers on that,” McElroy said. “I would say from a technology implementation perspective at scale, [there is] very limited and almost nonexistent detection or prevention over these streams.”

Generative AI has the advantage in multiple areas, according to McElroy. For one, the financial incentives run in the wrong direction.

Even if companies like ElevenLabs invest in offsetting harms, cybercriminals seeking illegitimate revenue can train their own models. McElroy told IT Brew that organizations “need to do a risk assessment in their environment to figure out what the impact would be, so they can figure out the dollars to spend on it.” Vendors also need to know there’s a market large enough to invest in those technologies and develop them, which takes time.

Lisa O’Connor, global leader of cybersecurity research and development for Accenture, told IT Brew that attackers are very good at adapting to advances in detection technology, and the question is whether an arms race is “a good way to focus how we spend defensive dollars.”

Both McElroy and O’Connor advised organizations to treat voice authentication as a dead technology, especially given that it’s not clear whether detecting future audio deepfakes will even be possible. At a minimum, they said, companies should protect customer accounts with multiple authentication factors.

“That’s not to say it can’t serve as one factor,” O’Connor said. “But it really should be phased out because the technology is there to overcome that immediately, and it’s very available.”—TM