Google’s Hugging Face partnership shows the future of generative AI rests on open source collaboration
Closer ties between Google and Hugging Face show the tech giant is increasingly bullish on open source AI development


Google’s new partnership with Hugging Face marks another significant nod of approval from big tech on the potential of open source AI development, industry experts have told ITPro.
The tech giant recently announced a deal that will see the New York-based startup host its AI development services on Google Cloud.
Hugging Face, which describes the move as part of an effort to “democratize good machine learning”, will now collaborate with Google on open source projects using the tech giant’s cloud services.
With such a sizable number of models already available on Hugging Face, the attraction on Google’s end is clear, Gartner VP analyst Arun Chandrasekaran told ITPro. The deal marks another example of a major industry player fostering closer ties with high-growth AI startups.
“Hugging Face is the largest hosting provider of open source models on the planet - it's basically GitHub for AI,” he said.
“This is why big firms like Google want to partner with it, because it can leverage the huge range of models available on the platform,” he added.
For Hugging Face, the deal will offer speed and efficiency for its users through Google’s cloud services.
Many Hugging Face users already use Google Cloud, the firm said, and this collaboration will grant them a greater level of access to AI training and deployment through Google Kubernetes Engine (GKE) and Vertex AI.
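Neither company has spelled out what that workflow looks like in practice, but a minimal sketch of serving a Hugging Face model through the Vertex AI Python SDK might resemble the following. The project ID, region, bucket path, and serving container image below are illustrative placeholders, not details confirmed by either company.

```python
# Minimal sketch (illustrative only): deploying a Hugging Face model to a
# Vertex AI endpoint using the google-cloud-aiplatform SDK. All project,
# bucket, and container values are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-gcp-project", location="us-central1")

# Register model artifacts (e.g. weights exported from the Hugging Face Hub
# and copied to a Cloud Storage bucket) as a Vertex AI Model resource.
model = aiplatform.Model.upload(
    display_name="hf-text-classifier",
    artifact_uri="gs://my-bucket/hf-model/",  # placeholder bucket path
    serving_container_image_uri="gcr.io/my-project/hf-serving:latest",  # placeholder image
)

# Deploy the model to a managed endpoint for online prediction.
endpoint = model.deploy(machine_type="n1-standard-4")
print(endpoint.resource_name)
```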
Google is also nipping at the heels of AWS with this partnership. Amazon’s cloud subsidiary announced a similar agreement with Hugging Face to accelerate the training of LLMs and vision models in February 2023.
Again, the stated motivation was one of AI altruism. Both Hugging Face and AWS cited a desire to make it “easier for developers to access AWS services and deploy Hugging Face models specifically for generative AI applications”.
Big tech firms have sharpened their focus on open source AI development over the last year, and there are a number of contributing factors to this, according to Chandrasekaran, especially with regard to driving broader adoption of generative AI.
“Rewind back nine months, and the AI landscape was dominated by closed source LLMs - think OpenAI or, to a lesser extent, Google,” he said.
“Since then, open source models have increased in volume and quality. Increasingly, they’re being used for enterprise because open source content is often licensed for commercial use,” he added. “At the same time, the quality of the actual models is improving.”
While open source models aren’t by definition licensed for commercial use, they are more likely to be than their closed counterparts. This makes them an attractive option for enterprises, which know they can roll the models out freely within their own organizations or on the open market.
Meta, for example, was among the first major tech firms to make a big statement on open source AI development last year with the launch of Llama 2.
Google eyes open source as the ticket to overcoming 'AI obstacles'
With issues of cost and scalability, as well as increasing regulation around the corner, open source seems to answer many of the big questions hanging over AI.
“Why is everyone interested in open source AI? The same reason they’re interested in open source in general,” Chandrasekaran said.
“It allows customizability, ease of use, and adaptability in the face of regulation,” he added.
Matt Barker, global head of cloud native services at Venafi, told ITPro there’s already a symbiotic relationship between open source and AI.
“Open source is already inherent in the foundations of AI,” he said. “Open source is foundational to how they run. Kubernetes, for example, underpins OpenAI,” he added.
“Open source code itself is also getting an efficiency boost thanks to the application of AI to help build and optimize it. Just look at the power of applying Copilot by GitHub to coding, or K8sGPT to Kubernetes clusters. This is just the beginning.”

George Fitzmaurice is a former Staff Writer at ITPro and ChannelPro, with a particular interest in AI regulation, data legislation, and market development. After graduating from the University of Oxford with a degree in English Language and Literature, he undertook an internship at the New Statesman before starting at ITPro. Outside of the office, George is both an aspiring musician and an avid reader.