NetApp is on a mission to drive hybrid cloud adoption – but first, the data silos have got to go
NetApp customers took to the stage to talk about their AI journey, how hybrid works for them, and why siloed data architectures are still a major stumbling block
At last year’s NetApp Insight conference in Las Vegas, CEO George Kurian told ITPro hybrid cloud models would be central to unlocking the true value of enterprise AI.
Projections suggest he was right, with the hybrid cloud market set to grow at a 17.2% CAGR, reaching $414.1 billion (£308.8 billion) by 2032, according to Allied Market Research.
But hurdles remain around how businesses are shifting to hybrid models and transforming their data infrastructure, namely the innumerable silos they create each time they add a new application, which inhibit the mobility of the data AI systems rely on.
Speaking to ITPro, Matt Watts, chief technology evangelist at NetApp, harkened back to the birth of virtualization and the proliferation of tooling and complexity it helped eliminate.
Watts noted it looks as if the enterprise IT space has run into the same problem with the cloud.
“Now we’re in a world where people are using on-premises technology, and they’re using stacks of technology that they're building in AWS, Microsoft Azure, and Google Cloud. That’s a really similar problem that we’re creating that existed before virtualization.”
“So we’ve got a lot of companies – and they’re all at different stages – who are now looking at this [hybrid] environment where different people have gone out to build different environments, across different clouds, using different tools.”
Reimagining this data infrastructure to unify data and simplify the environment will be key, Watts argued, and NetApp is positioning itself to capitalize on this demand.
“I think for most companies, they’re looking at it and saying, would there be layers of standardization that we could create that would simplify this kind of new hybrid, multi-cloud environment that we’ve built.”
Hybrid cloud is the way to go, but that “key challenge” still remains
This year at NetApp Insight 2024 in Las Vegas, Harv Bhela, chief product officer at NetApp, invited a number of customers on stage to share their AI adoption stories and explain where hybrid cloud fits into the picture.
Monica Jain, director of R&D for data science at pharmaceutical giant Johnson & Johnson, outlined the specific challenges she faces orchestrating the company’s data strategy.
Jain specifically highlighted difficulties concerning the firm’s distributed operating model, as well as its place in a heavily regulated industry bound by stringent data protection responsibilities.
The key capabilities she looked for when designing this AI infrastructure revolved around agility and scalability, which she said were critical for supporting the evolving needs of the firm’s data scientists and AI applications, all while maintaining robust security.
Hybrid cloud’s mixture of on-prem and cloud storage presented an enticing solution for Johnson & Johnson, with the scalability and flexibility of the cloud complemented by the cost efficiency and security of on-prem infrastructure.
But Jain stressed the implementation process has not been simple. First among these challenges was ensuring its AI tools had access to clean, high-quality data that could be used to generate accurate insights. This is vital when the consequences of drift could be potentially life threatening.
Jain added that data silos were another key factor inhibiting her organization’s data mobility and ability to meaningfully leverage AI.
“Another key challenge is data silos. We have data coming from different departments and systems, and integrating it into a unified, accessible format for AI applications can be very complex, especially in the R&D world,” she said.
Shortcuts lead to silos – unified data is the true AI enabler
Speaking during a media briefing at Insight 2024, Kurian posited that enterprises taking what appears to be the easy route leads to them struggling with siloed data architectures and wider dysfunction down the line.
“Silos are the result of taking a shortcut. The right way to do something is to unify it, but when it’s organizationally expedient, easier technically or budget-wise, [firms say] ‘Hey, I’m just going to stand up my own silo,’ and the result of that is usually it works for a short period of time and then collapses.”
Silos have long been a problem plaguing IT estates globally, but Kurian speculated that the ongoing AI boom could be the driving force the industry needs to finally tackle the problem.
He laid out NetApp’s philosophy in what he described as the ‘third wave’ of the modern data revolution: unified data that’s wide in scope as well as long in trend to unlock the true power of AI.
“We have always believed that you need to unify your data, because to derive insight from it, it’s better to have a wide scope of data, not just a long trend of data,” he explained.
“That has been proven completely true, especially in the advent of generative AI because what you find is you can analyze the unstructured data that is about 90% or 85% of an organization’s data.
“As a result, if you are only analyzing 20% of the data, you’re way behind.”
Solomon Klappholz is a Staff Writer at ITPro. He has experience writing about the technologies that facilitate industrial manufacturing, which led him to develop a particular interest in IT regulation, industrial infrastructure applications, and machine learning.