GitHub is awash with leaked AI company secrets – API keys, tokens, and credentials were all found out in the open

Wiz research suggests AI leaders need to clean up their act when it comes to secrets leaking


A host of leading AI companies are leaking key data on GitHub and lack proper disclosure channels to even be notified of potential security problems.

That's according to research by cloud security firm Wiz, which examined 50 AI companies and found that 65% had leaked "verified secrets" on GitHub. Wiz said that could include data like API keys, tokens and credentials, many of which were buried deep in "deleted forks, gists and developer repos".

"Some of these leaks could have exposed organizational structures, training data, or even private models," Shay Berkovich, threat researcher at Wiz, and Rami McCarthy, principal security researcher at Wiz, said in a blog post.


The Wiz research follows a report earlier this year from Palo Alto Networks that showed data loss issues pinned on generative AI had more than doubled in early 2025, with AI data security incidents accounting for 14% of such problems across all software-as-a-service traffic.

Separate research last month suggested that AI coding tools are wreaking havoc themselves, with one in five CISOs saying they've suffered a major incident caused by AI-generated code. Taken together, such reports highlight the security risks posed by the rapid adoption of AI.

GitHub is a goldmine for threat actors

The Wiz researchers said they started from the assumption that any big company with a large GitHub footprint likely has some exposed secrets, then tested that assumption against the Forbes AI 50 list of leading AI companies, which includes big players like Anthropic alongside smaller startups.

Some didn't appear to use GitHub, but those that did were investigated by Wiz.

Leaks included keys that allowed access to insider data such as organizational members, which the researchers noted could be used by threat actors to target the company.

In another case, ElevenLabs API keys were listed in plaintext, which Wiz said suggested a relationship between vibe coding and leaking secrets.

Another company was leaking HuggingFace tokens in a deleted fork that allowed access to a thousand private models, as well as other data that revealed training details for private AI models.

One company had no public repositories and just 14 team members listed but still managed to leak sensitive data.

"Conversely, the company with the largest footprint without an exposed secret had 60 public repos and 28 organization members," researchers said.

What should companies do?

Wiz recommended that companies run their own secret scans to ensure they aren't leaking such information, advice aimed especially at anyone using a public version control system (VCS) but more broadly at all AI service providers.

"Too many shops leak their own API keys while 'eating their dogfood'," the researchers noted. "If your secret format is new, proactively engage vendors and the open source community to add support."
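In practice, teams typically reach for dedicated open source scanners such as gitleaks or TruffleHog, but the core idea of a secret scan is simple pattern matching over repository contents. The sketch below is a minimal, illustrative example only, not a substitute for a real tool: the patterns shown cover a few well-known key formats, and the function names are hypothetical.

```python
import re

# Illustrative patterns for a few well-known credential formats.
# A real scanner ships hundreds of patterns plus entropy checks.
SECRET_PATTERNS = {
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "github_token": re.compile(r"\bghp_[A-Za-z0-9]{36}\b"),
    "slack_token": re.compile(r"\bxox[baprs]-[A-Za-z0-9-]{10,}\b"),
}


def scan_text(text: str) -> list[tuple[str, str]]:
    """Return (pattern_name, matched_string) pairs found in text."""
    findings = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.finditer(text):
            findings.append((name, match.group()))
    return findings


# Example: a fake key embedded in config-like text.
sample = 'aws_key = "AKIAABCDEFGHIJKLMNOP"'
print(scan_text(sample))
```

A real deployment would run such checks in a pre-commit hook or CI pipeline, and scan full git history rather than only the current working tree, since the Wiz findings show secrets often survive in deleted forks and old commits.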

Wiz disclosed the leaks to all impacted companies, but in half of the cases it either received no response or the message failed to reach anyone who could act on it.

"Many lacked an official disclosure channel, failed to reply, and/or failed to resolve the issue," researchers said, calling for companies to ensure they have disclosure channels open and ready from day one.
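One lightweight way to open such a channel is the security.txt standard (RFC 9116): a small plain-text file served at `/.well-known/security.txt` that tells researchers where to report issues. A minimal example, with placeholder values, might look like this:

```txt
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59.000Z
Policy: https://example.com/security-policy
Preferred-Languages: en
```

The `Contact` and `Expires` fields are required by the RFC; GitHub organizations can complement this with a `SECURITY.md` file in their repositories, which GitHub surfaces to anyone filing an issue.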

"For AI innovators, the message is clear: speed cannot compromise security."


Freelance journalist Nicole Kobie first started writing for ITPro in 2007, with bylines in New Scientist, Wired, PC Pro and many more.

Nicole is also the author of a book about the history of technology, The Long History of the Future.