North Korean hacker targets KnowBe4 with deepfake


A prominent cybersecurity training company recently fell victim to a sophisticated scam in which a North Korean cybercriminal used AI deepfake tools to fake his identity and infiltrate the organization. The Florida-based firm, which was hiring a remote software engineer for its internal IT team, identified the threat just in time and prevented any data theft, but the incident underscores the growing risk such scams pose to businesses.

After a typical hiring process involving video conference interviews, background checks, and reference verifications, the company hired what appeared to be a qualified candidate. However, within 30 minutes of receiving a computer workstation, the new hire attempted to load malware and execute unauthorized programs. The company’s IT security department quickly detected the suspicious activity.

When confronted, the new hire initially claimed to be troubleshooting a tech issue but soon became unresponsive. The company terminated his access and employment immediately. The scammer had used a stolen identity and AI tools to enhance a stock photo, altering his appearance and voice for video interviews.

The workstation was shipped to a U.S. address later traced to an “IT mule laptop farm,” while the scammer himself was located in North Korea or nearby China, operating the machine remotely during U.S. daytime hours. Scammers infiltrate companies for various reasons: some perform real work while funneling their earnings to fund North Korean state operations, while others, as in this case, aim to cause disruption, steal data, or demand ransoms.

In his short time with access, the cybercriminal managed to manipulate session history files and execute unauthorized software, though his ultimate goal remains unclear.

To protect your organization in the remote work era, consider the following steps:

1. Train your hiring team to be vigilant, similar to how employees are trained to recognize phishing emails.

2. Educate your hiring team about social engineering strategies used by cyber attackers.


3. Conduct all video interviews with the camera on, and watch for deepfake signs such as irregular lighting or unnatural movements.

4. Use threat-detection tools to identify and flag potential deepfakes.

5. Consider in-person interviews, even for remote roles, to deter scammers.

6. Wipe laptops of all data before shipping, and send them only to the employee’s verified address or a trusted third-party office where a valid ID is required.

7. Provide new employees only the access necessary for their role, keeping sensitive systems and data off-limits.

8. Ensure IT systems are up to date and can detect unauthorized access or downloads.

9. Regularly review your hiring processes to ensure they follow best practices for background checks, interviews, and more.

10. Conduct regular cybersecurity training sessions to keep all employees informed about the latest threats and how to report suspicious activity.

If a specialized cybersecurity firm can fall prey to an elaborate AI-driven scam, so can any business. Implementing these proactive steps can vastly reduce the risk of such incidents and keep your organization secure in the evolving landscape of remote work and AI.
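The detection idea behind the incident (unauthorized software executing on a freshly issued workstation) can be illustrated with a minimal sketch: compare running process names against an approved-software allowlist and flag anything unexpected. This is a toy example, not the company's actual tooling; the process names and allowlist are hypothetical, and real endpoint-detection products use far richer signals.

```python
# Toy allowlist check: flag any running process not on the approved list.
# Names below are hypothetical examples, not real detection data.

ALLOWED = {"explorer.exe", "outlook.exe", "teams.exe", "chrome.exe"}

def flag_unauthorized(running_processes):
    """Return an alphabetized list of process names absent from the allowlist."""
    return sorted(p for p in set(running_processes) if p.lower() not in ALLOWED)

suspicious = flag_unauthorized(["chrome.exe", "Outlook.exe", "session_tamper.exe"])
print(suspicious)  # an alert would be raised for anything in this list
```

In practice this kind of check runs continuously on the endpoint and feeds a security operations team, which is how the suspicious activity in this story was caught within half an hour.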

