The future is flexible
When infrastructure is available on demand, businesses become more efficient and productive

If there’s one thing the past 18 months have taught us, it’s the importance of flexibility. The ability to adapt to a rapidly changing environment has determined which businesses survived, and even thrived, during the pandemic, and which struggled or failed.
The importance of flexibility is a lesson every organisation should carry forward as the world begins to return to normal, and incorporate into every area of its operations. A central pillar of any exercise in flexibility should be IT that can flex too. This is sometimes easier said than done, however: your IT estate may be seen as a ‘sunk cost’ that would be uneconomical to replace. You may be concerned that the resources you need won’t be available when you need them unless you build in spare capacity ‘just in case’ when provisioning. Or perhaps you’re not convinced that the more flexible models of IT consumption will actually be able to meet your needs.
When you’re talking about business-critical applications and workloads, these are fair considerations. Indeed, not that long ago these kinds of reservations would have been well founded. However, on-demand IT delivered in a hybrid model isn’t the same as using old-style colocation infrastructure, nor does it mean moving everything over to one of the public cloud hyperscalers. Instead, you can now bring a consumption-based, cloud-like experience to your own data centre and make it available for almost every use case you may need.
Digital transformation
Digital transformation has been a cornerstone of business development for the past several years and rightly so. Any organisation that doesn’t put digital at the heart of its operations and seek to adapt to the way business is done in an online-first world is going to struggle no matter its sector or size.
Flexibility and right-sizing are key tenets of digital transformation. Organisations need to have resources available when a project needs doing, but equally shouldn’t be weighed down by feeling they need to invest in masses of new infrastructure.
Cloud is often put forward as the answer to this particular question, but it’s not without its limitations. Used correctly, particularly for short periods, public cloud compute services can be very cost effective. Frequently, however, this isn’t how they’re used: overprovisioning, or running the wrong kinds of workloads, often makes public cloud a less cost-effective choice than it first appeared. There are other important issues that can arise from using public cloud, too, such as the difficulty and cost of data egress and questions over whether data sovereignty and data residency requirements are being met – particularly in highly regulated industries.
For this reason, organisations are increasingly choosing to adopt a hybrid cloud model. Software as a service (SaaS) is used to meet common business needs like customer relationship management (CRM), payroll, and some HR functions, while some workloads may be put in the public cloud and the rest remain on premises in a private cloud.
The private cloud element of this mix has sometimes proved controversial. Some industry watchers have argued it’s just a rebadging of co-location that long predates the cloud, or even a euphemism for a standard data centre. Infrastructure on demand, such as that offered by HPE GreenLake, overcomes many of these arguments, however. It offers a public cloud-like experience either on-premises or in a managed service provider’s facility, with the ability to scale up and down with a simple click of a button.
In fact, GreenLake goes further than some public cloud offerings as it’s not just compute or storage that can be turned on and off as needed, but even individual cores with the recently announced Silicon on Demand. In the words of HPE, it’s “the cloud that comes to you”.
DevOps and agile development
Another key trend in IT over the past few years has been DevOps. Sometimes seen as part of digital transformation, sometimes as its own separate initiative, it brings together the development and operations teams to reduce the time it takes to put an application into production. Compared with traditional waterfall development, it allows for a more iterative, less disruptive, and more innovative approach to the creation, management, and release of business apps.
As highlighted by research firm Futurum’s Why Everything-as-a-Service? Why HPE? report, the infrastructure demands of DevOps are, by their very nature, more variable than those of waterfall development. Having scalable resources on hand is therefore critical if you’re going to avoid the excess costs associated with overprovisioning. Once again, public cloud infrastructure is often put forward as an answer, but a private cloud underpinned by infrastructure on demand is often a better solution.
One of the problems with public cloud is latency, which is the enemy of the rapid development and deployment central to DevOps. Bringing scalable, on-demand private cloud infrastructure such as HPE GreenLake to the workloads instead, rather than sending them to a public cloud data centre that could be hundreds of miles away, reduces latency and lets DevOps projects achieve their full potential.
Artificial intelligence and machine learning
While still emerging technologies, artificial intelligence and its subcategory machine learning are attracting increasing interest – and investment – from businesses across the board.
As it will be a new initiative for most businesses, this offers a true opportunity to invest in the right IT infrastructure and platforms. As with DevOps, AI can be highly sensitive to latency and some applications are only suited to deployment on-premises in the data centre or at the edge. Similarly, it can have variable capacity needs depending on the nature of the project that’s underway – an intensive machine learning initiative that involves training the software on huge volumes of data will be more demanding than a more basic AI application running in a production environment. Therefore there’s still a need for responsive, scalable infrastructure.
A taste of the future
While these may be the three dominant IT trends impacting businesses currently, this isn’t where our flexible future ends. As time goes on, new technologies and ways of doing business will emerge – increasingly led by user experience according to Futurum’s research – and IT infrastructure will need to be responsive to this.
On-demand infrastructure will undoubtedly prove key in facilitating all of this, bringing a true cloud-like experience across all elements of a hybrid-cloud setup and the ability to run workloads in the best possible environment, whatever they may be and wherever that is.
Find out more about the flexibility that HPE GreenLake can provide for your business