Large language models hallucinating non-existent developer packages could fuel supply chain attacks

Large Language Models (LLMs) have a serious “package hallucination” problem that could lead to a wave of maliciously coded packages in the supply chain, researchers have discovered in one of the largest and most in-depth studies ever to investigate the issue.

It’s so bad, in fact, that across 30 different tests, the researchers found that 440,445 (19.7%) of the 2.23 million code samples they generated experimentally in two of the most popular programming languages, Python and JavaScript, using 16 different LLM models for Python and 14 models for JavaScript, contained references to packages that were hallucinated.

The multi-university study, first published in June but recently updated, also generated “a staggering 205,474 unique examples of hallucinated package names, further underscoring the severity and pervasiveness of this threat.”
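
The defensive flip side is that whether a suggested dependency actually exists in the public registry is cheap to verify before anything gets installed. As a minimal sketch (not from the study itself), the following Python snippet checks candidate names against PyPI’s public JSON API; the endpoint is real, but the package names queried are hypothetical examples:

    import requests

    def package_exists_on_pypi(name: str) -> bool:
        # PyPI's JSON API returns HTTP 200 for published packages and
        # HTTP 404 for names that are not in the registry.
        resp = requests.get(f"https://pypi.org/pypi/{name}/json", timeout=10)
        return resp.status_code == 200

    # Vet every dependency an LLM-generated snippet pulls in before installing it.
    # Both names below are hypothetical illustrations.
    for suggested in ["requests", "flask-gpt-auth-helper"]:
        verdict = "exists" if package_exists_on_pypi(suggested) else "possibly hallucinated"
        print(f"{suggested}: {verdict}")

The same single-request check works for JavaScript dependencies against the npm registry (https://registry.npmjs.org/<name>), though an existing name is no guarantee of safety: attackers can pre-register commonly hallucinated names with malicious code.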
