Weeks ago, Jodi Maas—governance, risk, and compliance manager at Exabeam—led an interview for the cybersecurity company’s senior security analyst position.
After a few minutes, Maas had a feeling that the interviewee was wrong for the job—mainly because the applicant’s eyes weren’t moving.
The woman on the video conference had other problems suggesting she just wasn't the right fit: audio that didn't sync with her mouth movements, distracting background noise, and a lack of satisfying answers to less technical questions, all signs of a digital imposter.
“There was no personal touch. There was just no personality behind anything,” Maas told IT Brew.
The deepfake interview, a threat the FBI warned about as early as June 2022, is one that some CISOs have noticed lately, and one where the defenses are still being written.
“The good news is, the technology is not that great right now. The bad news is, it’s going to get better,” Kevin Kirkwood, CISO at Exabeam, said.
Maas said the glitchy interviewee had trouble with emotion-based questions like, "What motivates you?" That query, according to Maas, led to an awkward pause, no change in facial expression, and a "process"-style recitation of a dedication to governance, risk, and compliance.
Kirkwood believes the interviewee had canned responses ready—an ineffective strategy for off-the-cuff questions.
“I would ask them questions like…‘What’s that picture on the left-hand side there?’” Kirkwood suggested as he considered ways to detect today’s deepfakes.
After the interview, Maas went to Kirkwood, who quickly instructed the HR, engineering, security, and IT teams to trust their instincts and watch out for suspicious behavior, like interviewees who don't blink or barely move.
About a year after its June 2022 alert about deepfake applicants, the FBI warned of IT workers from the Democratic People's Republic of Korea (DPRK) allegedly aiming to generate revenue for the country's weapons program, sometimes even infiltrating "the computer networks of unwitting employers to steal information and maintain access for future hacking and extortion schemes."
The CISO of security-platform company KnowBe4 recently spoke with IT Brew about an encounter with a “fake IT worker.”
Kirkwood sees tech companies like his “remote-first” cybersecurity firm as valuable “central hubs” for hackers.
“If I get inside of your company and I inject code into your product, and you release that product without finding it, you become kind of the super-spreader,” Kirkwood said.
Looking back, Maas noticed the deepfaker’s résumé had some unverifiable details—degrees without institutions, for example. (The HR team did provide a pre-interview warning that the applicant’s responses during the initial screening process appeared scripted, Maas noted.)
Until Kirkwood finds a technical detection mechanism, like an algorithm that spots the characteristics of a live, blinking person, the most valuable defenses are vetting and communication: asking applicants thoughtful questions and escalating any concerns to the CISO.
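For illustration only, here is a minimal sketch of the kind of liveness check Kirkwood describes, not anything Exabeam has said it uses. It assumes some external face-landmark detector supplies six points per eye per video frame (the layout used by common 68-point landmark models), computes an "eye aspect ratio" that drops when the eye closes, and counts blinks over time; the threshold values and the helper names are hypothetical.

```python
# Sketch of a blink-based liveness check. Assumes eye landmarks come from
# an external face-landmark detector (out of scope here); thresholds are
# illustrative placeholders, not tuned values.
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """Ratio of eye height to width; drops sharply when the eye closes.

    `eye` is a (6, 2) array of landmark coordinates ordered roughly as:
    outer corner, two upper-lid points, inner corner, two lower-lid points.
    """
    vertical = np.linalg.norm(eye[1] - eye[5]) + np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)

def count_blinks(ear_per_frame, closed_thresh=0.2, min_closed_frames=2):
    """Count blinks as runs of consecutive frames with a low eye aspect ratio."""
    blinks, run = 0, 0
    for ear in ear_per_frame:
        if ear < closed_thresh:
            run += 1
        else:
            if run >= min_closed_frames:
                blinks += 1
            run = 0
    return blinks

if __name__ == "__main__":
    # Fake per-frame EAR trace: open eyes (~0.3) with two brief closures (~0.1).
    trace = [0.31, 0.30, 0.12, 0.10, 0.29, 0.30, 0.11, 0.09, 0.32]
    print(count_blinks(trace))  # -> 2; a long interview with zero blinks is suspicious
```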
"It was really a learning experience for me. I knew it was out there, but to actually see it just kind of reminds me to stay on top of my game and actually communicate," Maas said.