How deepfake AI job applicants are stealing remote work

Job-seeking impostors, including deepfakes, are exploiting the remote work trend, defrauding U.S. companies and potentially threatening U.S. national security, according to experts.

Approximately 17% of hiring managers said they had encountered candidates using deepfake technology to alter their video interviews, according to a survey of 1,000 U.S. hiring managers by career platform Resume Genius.

By 2028, 1 in 4 job candidates worldwide will be fake, according to research and advisory firm Gartner.

"Deepfake candidates are infiltrating the job market at a crazy, unprecedented rate," said Vijay Balasubramaniyan, CEO of voice authentication startup Pindrop Security, who said he recently caught a deepfake job candidate.

"It's very, very simple right now" to create deepfakes for video interviews, Balasubramaniyan said. "All you need is either a static image" or video of another person and a few seconds of audio of their voice, he said.

"Remote jobs unlocked the possibility of tricking companies into hiring fake candidates," said Dawid Moczadlo, co-founder of data security software company Vidoc Security Lab, who recently posted a viral video interaction with a deepfake job seeker on LinkedIn.

"If this trend continues and if we experience more and more fake candidates, then we definitely will need to develop some kind of tools to verify if the person is a real person, if they are who they claim to be," Moczadlo said.

While fraudulent job seekers can originate from anywhere, fake candidates with ties to North Korea have made headlines in recent months.

In May 2024, the Justice Department alleged that more than 300 U.S. companies had unknowingly hired impostors tied to North Korea for remote IT roles, resulting in at least $6.8 million in overseas revenue. The workers allegedly used stolen American identities to apply for remote jobs and employed virtual networks and other techniques to conceal their true locations.

"When we hire candidates or fake candidates who are from sanctioned nations, it becomes a national security concern," said Aarti Samani, an expert in AI deepfake fraud prevention. "The reason it becomes a national security concern is because, once these candidates or these individuals are in an organization, they are taking that salary and funding activities back in those nations. And those activities can be illicit as well. So inadvertently, we are funding illicit activities in sanctioned nations."

As AI technology rapidly evolves, fake AI-generated job candidate profiles are undermining the credibility of the hiring process.

"The whole reason you need to worry about deepfake job seekers is, at the very least, they're making the real employees, potential employees and candidates not able to get the job or [get the] job as easy," said Roger Grimes, a veteran computer security consultant. "It can create all kinds of disruption, just making the hiring process longer and more expensive."

"Potentially, you could even be applying for a job and someone's not sure whether you're real or not, and you don't even get that call, and you don't know why you didn't get the call," Grimes said. "It was all because perhaps they saw something that made them think that maybe you're a deepfake candidate, even when you weren't."

Watch the video above to learn how fake candidates can harm businesses and what steps can be taken to combat this issue.
