Job applicants are using deepfake AI to trick recruiters—Here’s how hiring managers can spot the next imposter
Vijay Balasubramaniyan knew there was a problem.
The CEO of Pindrop, a 300-person information security company, says his hiring team came to him with a strange dilemma: they were hearing weird noises and tonal abnormalities while conducting remote interviews with job candidates.
Balasubramaniyan immediately thought the issue might be interviewees using deepfake AI technology to mask their true identities. But unlike most other companies, Pindrop was in a unique position as a fraud-detecting organization to investigate the mystery itself.
To get to the bottom of it, the company posted a job listing for a senior back-end developer. It then used its own in-house technology to scan candidates for potential red flags. “We started building these detection capabilities, not just for phone calls, but for conferencing systems like Zoom and Teams,” he tells Fortune. “Since we do threat detection, we wanted to eat our own dog food, so to speak. And very quickly we saw the first deepfake candidate.”
Out of 827 total applications for the developer position, the team found that roughly 100, or about 12.5%, were submitted under fake identities. “It blew our mind,” says Balasubramaniyan. “This was never the case before, and tells you how in a remote-first world, this is increasingly becoming a problem.”
Pindrop isn’t the only company getting a deluge of job applications attached to fake identities. Although it’s still a nascent issue, around 17% of hiring managers have already encountered candidates using deepfake technology to alter their video interviews, according to a March survey from career platform Resume Genius. And one startup founder recently told Fortune that about 95% of the résumés he receives are from North Korean engineers pretending to be American. As AI technology continues to progress at a rapid clip, businesses and HR leaders must prepare for this new twist on an already-complicated recruiting landscape, and be ready for the next deepfake AI candidate who shows up for an interview.
“My theory right now is that if we’re getting hit with it, everybody’s getting hit with it,” says Balasubramaniyan.
A Black Mirror reality for hiring managers
Some AI deepfake job applicants are simply attempting to land multiple jobs at once to boost their income. But there is evidence to suggest that there are more nefarious forces at play that can lead to big consequences for unwitting employers.
In 2024, cybersecurity company CrowdStrike responded to more than 300 instances of criminal activity related to Famous Chollima, a major North Korean organized crime group. More than 40% of those incidents were traced to IT workers who had been hired under a false identity.
“Much of the revenue they’re generating from these fake jobs is going directly to a weapons program in North Korea,” says Adam Meyers, a senior vice president of counter adversary operations at CrowdStrike. “They’re targeting login credentials, credit card information, and company data.”
And in December 2024, 14 North Korean nationals were indicted on charges related to a fraudulent IT worker scheme. They stand accused of funneling at least $88 million from businesses into a weapons program over the course of six years. The Department of Justice also alleges that some of those workers threatened to leak sensitive company information unless their employers paid them an extortion fee.
To catch a deepfake
Dawid Moczadło, the co-founder of data security software company Vidoc Security Lab, recently posted a video on LinkedIn of an interview he did with a deepfake AI job candidate, which serves as a masterclass in potential red flags.
The audio and video of the Zoom call didn’t quite sync up, and the video quality also seemed off to him. “When the person was moving and speaking I could see different shading on his skin and it looked very glitchy, very strange,” Moczadło tells Fortune.
Most damning of all, when Moczadło asked the candidate to hold a hand in front of his face, the candidate refused. Moczadło suspects the deepfake filter would have begun to break down when partially covered, much as a Snapchat filter does, exposing the candidate’s real face.
“Before this happened we just gave people the benefit of the doubt, that maybe their camera is broken,” says Moczadło. “But after this, if they don’t have their real camera on, we will just completely stop [the interview].”
It’s a strange new world out there for HR leaders and hiring managers, but there are other tell-tale signs they can watch for earlier in the interview process that can save them major headaches later on.
Deepfake candidates often use AI to create fake LinkedIn profiles that appear real, but are missing critical information in their employment history, or have very little activity or few connections, Meyers notes.
When it comes to the interview stage, these candidates are also often unable to answer basic questions about their life and job experience. For example, Moczadło says he recently interviewed a deepfake candidate who listed multiple well-known organizations on their résumé, but couldn’t share any detailed information about those companies.
Employers should also look out for new hires who ask to have their laptop shipped to a location other than their home address. Some people are operating “laptop farms,” in which they keep multiple computers open and running so that people outside the country can log in remotely.
And finally, employee impersonators are typically not the best workers. They often don’t turn on their cameras during meetings, make excuses to hide their faces, or skip work gatherings altogether.
Moczadło says he’s much more careful about hiring now, and has implemented new procedures into the process. For example, he pays for candidates to come into the company’s office for at least one full day in-person before they’re hired. But he knows not everyone can afford to be so vigilant.
“We’re in this environment where recruiters are getting thousands of applications,” says Moczadło. “And when there’s more pressure on them to hire people they’re more likely to overlook these early warning signs and create this perfect storm of opportunity to take advantage of.”
This story was originally featured on Fortune.com