What role will the cloud play in the AI era?
For most businesses, cloud technology will be essential for leveraging AI without making huge investments
When cloud computing first moved beyond theory and into practical business use, it offered organizations a revolutionary proposition: access to huge amounts of compute power with little to no infrastructure commitment.
Understandably, the cloud has skyrocketed in popularity and is now ubiquitous in the tech landscape. Startups and enterprises alike turn to cloud computing to develop applications, use services, and store vital data.
With generative AI now the trending topic in tech, it may be tempting to think the cloud’s moment in the sun is coming to an end - but that’s far from the case. More than ever, businesses will need to leverage the flexibility of cloud computing.
Despite fears of a cloud slowdown amid the economic disruption of 2023, this past year has seen growth pick up again on the back of generative AI. Research from Canalys earlier this year predicted a 20% increase in cloud spending in 2024, in line with increasing AI adoption.
AI - particularly generative AI - is a notoriously demanding technology, requiring vast amounts of compute and graphics processing unit (GPU) infrastructure for training and inferencing. Few firms are in a strong enough position to deliver that foundation for AI themselves.
It is no coincidence that the firms which can are the established cloud providers: their long-standing cloud credentials mean they have vast portfolios of data centers that lend themselves to AI, with the physical space and systems to meet demand.
Most firms will therefore see AI delivered through the cloud, whether that’s by renting GPU compute power from bigger firms or simply by using other firms’ AI tools via cloud platforms.
AI infrastructure in the cloud
For organizations looking to train and develop their own customized AI models, renting GPU capacity through the cloud may be the most attractive option. This gives businesses flexibility in AI development without the need to manage the underlying infrastructure themselves.
Much like the software-as-a-service (SaaS) model that came before it in the cloud computing world, a new type of ‘as-a-service’ offering is emerging that allows businesses to tap AI infrastructure via the cloud.
Gartner defines it as machine learning infrastructure-as-a-service (ML IaaS), an infrastructure delivery model that could provide virtualized resources for compute-intensive AI work.
Through ML IaaS, the cloud ecosystem becomes vital to the ever-changing AI landscape. Granted, it will be an ecosystem altered from the traditional cloud, but fundamentally many businesses will use the cloud to leverage AI infrastructure.
Nvidia is a key example. The firm’s ‘as-a-service’ offering gives customers access to supercomputing power to train large language models (LLMs) and develop AI technologies via its infrastructure.
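To make the GPU-renting model more concrete, the sketch below shows what provisioning a single GPU instance from a public cloud might look like, using the AWS boto3 SDK purely as an illustration; the AMI ID and key pair name are placeholders, and other providers (including Nvidia’s own service) expose equivalent workflows through their own APIs.

```python
# Minimal sketch: renting GPU compute from a cloud provider instead of
# buying hardware. AWS EC2 via boto3 is used only as an illustration;
# the AMI ID and key pair name below are placeholders, not real values.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder deep learning AMI
    InstanceType="p3.2xlarge",        # a single-GPU instance type
    KeyName="my-training-key",        # placeholder SSH key pair
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched GPU instance {instance_id} for model training")
```

The point of the pattern is that the instance can be spun up for a training run and terminated afterwards, so the business pays only for the compute it actually uses.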
AI services in the cloud
For small and medium-sized businesses (SMBs), the cloud will play a part in how AI services are delivered - regardless of provider, most AI tools will find their way to end users via the cloud.
Take an enterprise-focused chatbot, the likes of which many of the big names are beginning to roll out. While the underlying AI model and technology fueling the chatbot will live on hyperscale compute systems, the chatbot will reach end users via that same firm’s cloud platform.
Few firms will have large-scale AI infrastructure of their own, and not many will rent GPUs - the vast majority of businesses will instead integrate existing AI tools into their workflows via the cloud.
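As a rough illustration of what consuming AI via the cloud means in practice, the sketch below posts a question to a hosted chatbot endpoint rather than running any model locally; the URL, API key, and response shape are hypothetical placeholders, not any particular vendor’s API.

```python
# Minimal sketch of calling a cloud-hosted AI service rather than running
# the model yourself. The endpoint, credential, and response shape are
# hypothetical placeholders for whichever provider a business uses.
import requests

API_URL = "https://api.example-cloud.com/v1/chat"  # placeholder endpoint
API_KEY = "your-api-key-here"                      # placeholder credential

def ask_chatbot(question: str) -> str:
    # Send the user's question to the hosted model and return its reply.
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"messages": [{"role": "user", "content": question}]},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["reply"]  # assumed response shape

print(ask_chatbot("Summarise this quarter's support tickets."))
```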
Such a demand for these tools is already driving significant cloud growth. Earlier this year, analysts at Gartner told ITPro that growth in the global cloud computing market is expected to continue owing to high demand for AI.
“Already we’ve seen growth accelerating in 2024, and we’re projecting this to continue through 2025 as enterprises integrate generative AI into their technology portfolio,” Gartner analyst Hardeep Singh said.
Singh said that investments into AI-powered enhancements for existing hyperscaler platforms and developer-focused offerings were also seen across the board.
More recently, research from IDC put total global revenue in the public cloud services market at $669.2 billion in 2023, just shy of $700 billion and up 19.9% on the previous year. Again, analysts put this down at least in part to AI, particularly in platform-as-a-service (PaaS) spending, which saw the fastest year-on-year revenue growth.
"In large part due to end-user investment in AI, PaaS revenue growth continues to outpace the overall cloud market,” IDC analyst Adam Reeves said.
Data storage
Data is hugely important to leveraging AI: to build or customize an AI model that delivers the most value, businesses need access to all their data in a form that is easily navigable and comprehensible to the AI system.
Take retrieval augmented generation (RAG), through which an AI model can be prompted beyond its training knowledge and given access to data that allows it to make real-time, informed decisions for business processes.
Using such a technique in a business setting requires clean, well-organized data. Vector search demands similar control over data and, in some cases, allows AI to trawl unstructured data sources.
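For readers who want a sense of the mechanics, here is a minimal, self-contained sketch of the RAG pattern described above, with toy stand-ins for the embedding model and the LLM; a real deployment would swap in a cloud-hosted embedding service, a vector database, and a production model.

```python
# Minimal sketch of retrieval augmented generation (RAG): embed a query,
# find the closest company documents by cosine similarity, then pass them
# to a language model as extra context. embed() and generate() are toy
# stand-ins for a real embedding model and a real LLM.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Toy stand-in: hash words into a small fixed-size vector.
    vec = np.zeros(64)
    for word in text.lower().split():
        vec[hash(word) % 64] += 1.0
    return vec

def generate(prompt: str) -> str:
    # Toy stand-in: a production system would call a cloud-hosted model here.
    return f"[model response based on a prompt of {len(prompt)} characters]"

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query_vec: np.ndarray, doc_vecs: list, docs: list, top_k: int = 2) -> list:
    # Rank documents by similarity to the query and keep the best matches.
    scores = [cosine_similarity(query_vec, v) for v in doc_vecs]
    ranked = sorted(zip(scores, docs), key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in ranked[:top_k]]

def answer(query: str, docs: list, doc_vecs: list) -> str:
    context = "\n".join(retrieve(embed(query), doc_vecs, docs))
    prompt = f"Answer using this context:\n{context}\n\nQuestion: {query}"
    return generate(prompt)

docs = ["Q3 revenue grew 12% year on year.",
        "The support backlog fell after the new chatbot launched.",
        "Cloud spending is forecast to rise in line with AI adoption."]
doc_vecs = [embed(d) for d in docs]
print(answer("How is cloud spending changing?", docs, doc_vecs))
```

The hard part in practice is not the retrieval loop itself but keeping the underlying data clean, well-organized, and accessible to the model, which is where storage decisions come in.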
While such data does not necessarily need to be kept in the cloud - many will opt for on-prem storage, particularly when dealing with sensitive data - the cloud will no doubt play a part in the data conversation surrounding AI.
The main advantage of cloud storage is, after all, flexibility, and that flexibility lends itself naturally to deploying AI tools and technologies: elastic storage helps when training new models or drawing on new data.
Take a recent announcement from Oracle, which saw the firm roll out a database offering specifically tailored to vector search. Through this platform, customers can use vector search inside data sources, including cloud-based ones.
While many still undertake some business operations in on-premises data centers, cloud computing is now a core part of how most digital services are delivered. As generative AI continues to gather pace, it will certainly be no different.
George Fitzmaurice is a staff writer at ITPro, ChannelPro, and CloudPro, with a particular interest in AI regulation, data legislation, and market development. After graduating from the University of Oxford with a degree in English Language and Literature, he undertook an internship at the New Statesman before starting at ITPro. Outside of the office, George is both an aspiring musician and an avid reader.