Cybersecurity

Recruiters, beware of deepfake applicants

FBI issues warning to beware of scammers posing as job candidates in virtual interviews.
· 3 min read

Quick-to-read HR news & insights

From recruiting and retention to company culture and the latest in HR tech, HR Brew delivers up-to-date industry news and tips to help HR pros stay nimble in today’s fast-changing business environment.

Much like a heavily filtered Instagram post, the line between reality and illusion is getting increasingly blurred in the world of remote recruiting. In June, the FBI’s Internet Crime Complaint Center (IC3) issued a security warning to employers, citing increased complaints of scammers using deepfake technology to apply for remote IT, software, database, and programming jobs. The PSA warned that scammers are likely trying to obtain these roles under stolen identities in order to access customer data at the hiring companies.

The FBI said the complaints have largely involved “voice spoofing,” in which a candidate’s video and audio don’t line up. With remote jobs now common and better deepfake technology available to criminals, the risk is growing: the IC3 warning follows a similar alert from federal agencies that malicious actors from North Korea may be targeting sensitive information by applying for independent contractor roles or remote positions.

Deep what now? A deepfake is “synthetic audio, image, and video content created with AI or machine-learning technologies,” according to ZDNet. In the instances reported to the IC3, when companies ran background checks on applicants, the personal identifying information (PII) provided “belonged to another individual.” Vice reported that the targeted positions are attractive to scammers because of the access these workers typically have to customer information and company technology.

More like weakfakes? Experts say that most deepfake attempts currently fall short of verisimilitude. Sam Gregory, program director at Witness, a nonprofit that helps people “use video and technology to protect and defend human rights,” told IT Brew: “It is—as yet—hard to do a very convincing ongoing visual deepfake in a real-time context, but the technology is improving rapidly to enable the combination of near real-time voice generation with realistic face and lip movements that match.”

Although the technology still has a long way to go before fake job applicants are convincing, Aviv Ovadya, of Harvard Kennedy School’s Belfer Center, told IT Brew that videoconferencing tools don’t yet have deepfake detection, and even when they do, criminals will likely use the same technology to train their fakes to evade it. “And so that becomes very hard to defend against. This is the deepfake-detection dilemma.”—KP

Do you work in HR or have information about your HR department we should know? Email [email protected] or DM @Kris10Parisi on Twitter. For completely confidential conversations, ask Kristen for her number on Signal.
