
A third AI-related proof-of-concept attack that drew attention exploited a prompt injection to make GitLab’s Duo chatbot insert malicious lines into an otherwise legitimate code package. A variation of the exploit successfully exfiltrated sensitive user data.
Another notable attack aimed at the Gemini CLI coding tool. It let attackers run destructive commands—such as wiping a hard drive—on developers’ computers using the AI tool.
Using AI as bait and hacking assistants
Other LLM-linked hacks used chatbots to make attacks more effective or harder to detect. Earlier this month, two men were indicted for allegedly stealing and erasing sensitive government data. Prosecutors say one of the men tried to cover his tracks by asking an AI tool “how do i clear system logs from SQL servers after deleting databases.” Shortly after, he allegedly asked the tool, “how do you clear all event and application logs from Microsoft windows server 2012.” Investigators were nevertheless able to trace the defendants’ actions.
In May, a man pleaded guilty to hacking a Walt Disney Company employee by tricking the person into running a malicious build of a widely used open-source AI image-generation tool.
And in August, Google researchers warned users of the Salesloft Drift AI chat agent to treat all security tokens tied to the platform as compromised after discovering unknown attackers had used some credentials to access Google Workspace email. The attackers used those tokens to reach individual Salesforce accounts and then steal data, including credentials that could be reused in other breaches.
There were also multiple instances of LLM vulnerabilities that ended up harming the people using them. In one case, CoPilot exposed the contents of more than 20,000 private GitHub repositories from companies including Google, Intel, Huawei, PayPal, IBM, Tencent, and, ironically, Microsoft. The repositories had initially been accessible via Bing as well. Microsoft eventually removed them from search results, but CoPilot kept exposing them regardless.