
AI hallucinations cause new supply chain threat

Cybersecurity researchers are warning of a new type of supply chain attack called “slopsquatting,” which arises when generative AI models recommend non-existent dependencies in the code they generate. A study by researchers from the University of Texas at San Antonio, Virginia Tech, and the University of Oklahoma found that about 20% of the packages recommended by AI models were fake. That matters because threat actors can register those fake package names and use them to distribute malicious code.

The researchers looked at 16 different code-generation models, including GPT-4, GPT-3.5, CodeLlama, DeepSeek, and Mistral. They found that open-source models like DeepSeek and WizardCoder made up fake package names more often (21.7% on average) than commercial models like GPT-4 (5.2%). CodeLlama was the worst, with over a third of its recommendations being fake.

GPT-4 Turbo was the best, with only 3.59% fake recommendations.

New supply chain attack risk

The researchers also found that these fake package names were persistent and believable. When they re-ran prompts that had previously produced fake packages, 43% of the fake names appeared in all 10 runs, and 58% appeared more than once. This means the fake names are not random but repeatable artifacts of how the models respond to certain prompts.

The fake package names were also found to be convincing, with moderate similarity to real packages. This makes them more valuable to attackers. Only 13% of the fake names were simple typos.

While no real-world instances of slopsquatting have been reported yet, both the study and an analysis by cybersecurity firm Socket recommend protective measures. Developers are advised to install dependency scanners to detect and remove malicious packages. The rush to push AI models to market has led to insufficient testing, exposing systems to significant threats.
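One low-cost defense is to vet AI-suggested dependency names before installing them. As a minimal sketch (not from the study or Socket's analysis), the check below compares a suggested name against an allowlist of packages a project already trusts, flagging names that merely resemble an approved package, since the research found hallucinated names often show moderate similarity to real ones. The `approved` set and the 0.8 similarity cutoff are illustrative assumptions.

```python
import difflib

def check_dependency(name: str, approved: set[str]) -> str:
    """Classify an AI-suggested package name against an approved allowlist.

    Returns "ok" for an already-approved name, "suspicious (...)" for a
    name that closely resembles an approved package (possible squatting
    bait), and "unknown" otherwise. Unknown names should be verified
    against the real package registry before installation.
    """
    if name in approved:
        return "ok"
    # Names close to a real package are classic squatting bait;
    # cutoff=0.8 is an illustrative threshold, not a recommendation.
    close = difflib.get_close_matches(name, approved, n=1, cutoff=0.8)
    if close:
        return f"suspicious (resembles {close[0]})"
    return "unknown"

# Hypothetical project allowlist for illustration.
approved = {"requests", "numpy", "flask"}
print(check_dependency("requests", approved))         # ok
print(check_dependency("requestes", approved))        # suspicious
print(check_dependency("flask-gpt-utils", approved))  # unknown
```

A real scanner would also query the registry for package age, download counts, and maintainer history, but even this kind of pre-install gate blocks the simplest slopsquatting registrations.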


This highlights the importance of thorough validation and comprehensive security measures in AI development and deployment.

Image Credits: Photo by Mika Baumeister on Unsplash

April Isaacs is a news contributor for DevX.com. She is a long-term, self-proclaimed nerd. She loves all things tech and computers and still has her first Dreamcast system. It is lovingly named Joni, after Joni Mitchell.

About Our Editorial Process

At DevX, we’re dedicated to tech entrepreneurship. Our team closely follows industry shifts, new products, AI breakthroughs, technology trends, and funding announcements. Articles undergo thorough editing to ensure accuracy and clarity, reflecting DevX’s style and supporting entrepreneurs in the tech sphere.

See our full editorial policy.