Earlier this summer, the FBI Internet Crime Complaint Center (IC3) released a public service announcement in response to an unusual influx of complaints from employers reporting the use of deepfakes to apply for remote jobs at their companies. The FBI reports these complaints have predominantly come from job sectors such as Information Technology, computer programming, database administration, and software-related roles, where access to sensitive information is common.
What are Deepfakes?
Deepfakes are algorithmically generated videos and images intended to exploit a person’s identity, making it appear that they are saying something they have never said. Deepfakes also give fraudsters a way to exploit companies hiring for remote positions. Here are examples of such cases.
ID theft has existed for a long time, ranging from fraudsters opening accounts and filing fraudulent tax returns to making purchases. Employers have traditionally been able to spot fraudsters at the interview stage of the job application, but with the onset of emerging technology, shady job candidates are now using stolen personally identifiable information (PII) and deepfake audio to get through the interview process. Once hired, successful candidates can access sensitive data, such as customer PII, financial data, corporate IT databases, and/or proprietary information.
According to the FBI, some victims of the scam reported that their stolen PII had been used to apply for a remote job and that pre-employment background checks had been run using their information under other applicants’ profiles.
Some companies require their employees to sign agreements to work only at their place of employment. However, fraudsters are able to circumvent this by using stolen PII to hold as many as two or three jobs at once. This is also known as Time Theft, and it results in compromised company data across several businesses.
What You Can Do
As more workplaces become fully remote and deepfake technology continues to evolve, more fraudsters will seek to exploit it to defraud U.S.-based companies. Here is what you can do.
Double down on background checks and reference checks, and make every effort to conduct an in-person interview. Consider having your candidate meet with your closest (remote) worker just to verify humanity. The high-level interview questioning can happen via Zoom, but having at least one employee or trusted confidant meet the candidate in person will help weed out deepfake applicants.
In addition to the interview, it is wise to do a quick, live skills demo. The goal is to quickly test skills and competencies relevant to the job you are hiring for. Create a few simple projects with easy, medium, and hard tasks, and have the candidate complete them live and on camera.
Embrace AI fraud detection to evaluate resumes and job candidates. Fraud detection has been used for years in banking, insurance, and other fields; it is now possible to use it in the hiring process. Tools such as Microsoft’s Video Authenticator can analyze a still photo or video and provide a confidence score indicating whether or not the media has been manipulated.
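Detection tools of this kind typically score media frame by frame. As a rough illustration (not Microsoft’s actual API, which is not public), the sketch below shows one way per-frame manipulation scores might be combined into an overall confidence and a flag; the function name, scores, and thresholds are all illustrative assumptions.

```python
# Hypothetical sketch: combine per-frame manipulation scores
# (0.0 = likely authentic, 1.0 = likely manipulated) into an
# overall confidence and verdict. Thresholds are illustrative.

def assess_media(frame_scores, threshold=0.5):
    """Return an overall confidence and a flagged/not-flagged verdict."""
    if not frame_scores:
        raise ValueError("no frames to assess")
    # Mean score gives overall confidence; a brief spike of very high
    # scores also matters, so track the peak as a secondary signal.
    mean_score = sum(frame_scores) / len(frame_scores)
    peak_score = max(frame_scores)
    flagged = mean_score >= threshold or peak_score >= 0.9
    return {"mean": round(mean_score, 2), "peak": peak_score, "flagged": flagged}

# Example: mostly low scores with one suspicious spike in a single frame
print(assess_media([0.1, 0.15, 0.2, 0.95, 0.1]))
```

The point of the peak check is that a deepfake artifact may appear in only a handful of frames, so averaging alone can wash out the evidence.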
To talk with us about cybersecurity, get in touch with us here. For more insights, follow us on LinkedIn and Twitter, and find job opportunities on our careers page.