Researchers have uncovered a new supply chain attack called #Slopsquatting, in which threat actors exploit hallucinated, non-existent package names generated by #AI coding tools like #GPT4 and #CodeLlama.
In test samples, 19.7% of recommended packages (about 205,000 unique names) turned out to be fakes. These believable but non-existent names can be registered by attackers to distribute malicious code.
Open-source models like #DeepSeek and #WizardCoder hallucinated more frequently (21.7% on average) than commercial ones like #GPT4 (5.2%).
We Have a Package for You! A Comprehensive Analysis of Package Hallucinations
by Code Generating LLMs (PDF) https://arxiv.org/pdf/2406.10279
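One practical mitigation is to verify that an LLM-suggested package name is actually registered before installing it. A minimal sketch, assuming Python/PyPI and using PyPI's public JSON API (the helper names here are hypothetical, not from the paper):

```python
# Hypothetical sketch: check whether an LLM-suggested package name is
# actually registered on PyPI before running `pip install`, to avoid
# installing a slopsquatted lookalike.
import json
import urllib.error
import urllib.request

PYPI_JSON = "https://pypi.org/pypi/{name}/json"  # PyPI's public JSON API


def pypi_url(name: str) -> str:
    """Build the PyPI JSON-API URL for a package name."""
    return PYPI_JSON.format(name=name)


def package_exists(name: str, timeout: float = 5.0) -> bool:
    """Return True if `name` is a registered PyPI project (HTTP 200)."""
    try:
        with urllib.request.urlopen(pypi_url(name), timeout=timeout) as resp:
            # A registered project returns 200 with an "info" section.
            return resp.status == 200 and bool(json.load(resp).get("info"))
    except (urllib.error.URLError, OSError):
        # 404 (unregistered / hallucinated name) or network failure.
        return False


if __name__ == "__main__":
    for pkg in ["requests", "definitely-not-a-real-pkg-xyz"]:
        status = "exists" if package_exists(pkg) else "NOT on PyPI"
        print(f"{pkg} -> {status}")
```

Existence alone is not proof of safety (attackers may have already registered the name), so this check is best combined with pinned dependencies and lockfile review.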