Public cloud does not hold all the answers


Only when leaders rigorously scrutinize their assumed knowledge can they properly determine the best path for their organization's IT environment. This scrutiny enables firms to make more informed decisions, with real confidence that those decisions will support their business objectives. Accepting ‘received wisdom’ may not be a wise act in itself.

Public cloud has been the default option for many organizations over the past decade. The much-touted benefits of cloud computing, combined with the dominance of a handful of major players, swallowed up most of the market through brand recognition, lock-in via the wider technical ecosystem, and partnerships with just about every MSP and channel partner in the world.

Assumptions were, and continue to be, made across the board about the inevitability of using public cloud services. The same names appear on everyone’s lips, together with an implicit assumption that such services will meet most, if not all, requirements.

And they do, for some. For many, even, especially those with relatively lightweight requirements. But when they don’t, when they lead to underperformance, downtime, and burst budgets, there is a reckoning that significantly impacts end users, MSPs, and channel partners.

This is why there has been a recent movement away from public cloud. Many organizations realize there are solutions available that better fit their needs, and that the big tech vendors don’t necessarily hold all the answers.

Dynamic services can be an expensive illusion

Dynamic public cloud resources appear to be a perfect solution – only paying for the services you need, when you need them. Hardware resources are transformed from an unsightly on-premises space-eater into something to be spun out of the air when required. A kind of technological magic trick or illusion, if you will.

To provide dynamic resources, the public cloud vendor must pre-provision large amounts of hardware to be ‘available’ for dynamically scaling customers. Much of that equipment won’t be in use much of the time, sitting on standby, losing value and taking up space in an expensive data center. To be profitable, the vendor must price usage of that equipment high and over-provision it.

This equates to high usage costs, which are not always apparent when forecasting budgets from granular pricing. Additionally, there is a risk of underperformance, which worsens with the under-provisioning customers often resort to in an attempt to keep monthly bills low.
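As a rough sketch of that budgeting trap, consider the arithmetic below. All figures are assumptions invented for illustration (the $0.40/hour on-demand rate, 90% utilization, and $180/month flat rate are not real vendor prices): a per-hour price that looks trivially small in a price list compounds over a 730-hour month, and the break-even point against a flat-rate alternative arrives sooner than the granular figure suggests.

```python
# Illustrative arithmetic only: every price here is an assumption,
# not a real vendor rate.

HOURS_PER_MONTH = 730

# Hypothetical on-demand instance: $0.40/hour looks small in a price list...
on_demand_hourly = 0.40
utilization = 0.9  # ...but a production workload rarely sleeps

# Hypothetical dedicated/private-cloud host at a flat monthly rate
dedicated_monthly = 180.0

on_demand_monthly = on_demand_hourly * HOURS_PER_MONTH * utilization
print(f"On-demand: ${on_demand_monthly:,.2f}/month")  # ~$262.80
print(f"Dedicated: ${dedicated_monthly:,.2f}/month")

# Below this utilization, per-hour billing wins; above it, flat rate wins
break_even = dedicated_monthly / (on_demand_hourly * HOURS_PER_MONTH)
print(f"Break-even utilization: {break_even:.0%}")  # ~62%
```

On these made-up numbers, the ‘cheap’ per-hour instance costs roughly $263 a month against $180 flat, and only wins below about 62% utilization. The point is not the figures themselves but that granular prices obscure the monthly total.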

To ensure profitability, this equipment will also be kept in service for as long as possible, meaning customers are often offered base components that no longer meet current demands. This can leave many organizations paying much more for systems that perform much worse than what they had before moving to the public cloud.

Against vendor dominance

Beyond the issues of cost and performance, the dominance of certain players now constitutes a growing threat to the overall cyber resilience of global business. The recent global outage of Microsoft services caused by a faulty CrowdStrike update, followed quickly by a global Azure outage, clearly demonstrates the risk of vendor dominance in any market.

It goes without saying that handing any one player near-monopolistic dominance over a market is bad for competition, innovation, and specificity. In such scenarios, a small number of organizations win big and the rest lose.

Private cloud and sector experts

Organizations can mitigate this risk by investing in a wider range of technologies and supporting smaller, innovative vendors alongside the big players. This fosters a healthier, more varied environment and pushes the market to improve on cost, performance, and security. It also allows sector expertise to thrive and evolve, unlocking benefits that might otherwise be overshadowed by big-tech generalism.


Private clouds may be less dynamic, but they are more likely to be fit for purpose when managed by sector-specific experts who can spend time with you to design solutions that meet both performance and budget requirements. Alongside better-informed solution design, private cloud offerings are also more likely to be supported by responsive, available experts rather than rabbit holes of outdated knowledge bases, best-guessing AI chatbots, and layers of outsourced non-experts trying to communicate over muffled telephone lines.

Cloud benefits do exist

Much of the promise of cloud computing still holds. Given the right technology and solution design, centralization of resources can enable better collaboration, tighter security management through fewer locations and endpoints, more efficient estate administration, an overall reduction in hardware, and increased resiliency. Cloud-hosted services also better serve modern hybrid and remote working.

Scalability is also easier, even if not fully dynamic; the best solutions tend to be an informed balance of dedicated and dynamic resources, as the sketch below illustrates. Finally, cloud services support greater flexibility, a critical factor in a world that is changing ever faster.
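To make that balance concrete, here is a minimal sketch under stated assumptions (the VM counts, the $150/month dedicated rate, and the $0.45/hour dynamic rate are invented for illustration), comparing an all-dynamic estate with a blended one in which dedicated capacity carries the steady baseline and dynamic resources absorb only the bursts:

```python
# Illustrative sketch of the "dedicated baseline + dynamic burst" idea.
# All figures are assumptions for the sake of arithmetic, not vendor prices.

HOURS_PER_MONTH = 730

baseline_vms = 10     # steady-state demand, running all month
peak_extra_vms = 6    # extra capacity needed only during bursts
burst_hours = 80      # hours per month the burst capacity is actually used

dedicated_per_vm = 150.0    # hypothetical flat monthly rate per dedicated VM
dynamic_per_vm_hour = 0.45  # hypothetical on-demand hourly rate per VM

# All-dynamic: everything pays the on-demand premium, all the time
all_dynamic = (
    baseline_vms * HOURS_PER_MONTH + peak_extra_vms * burst_hours
) * dynamic_per_vm_hour

# Blended: dedicated covers the baseline, dynamic covers only the burst
blended = (
    baseline_vms * dedicated_per_vm
    + peak_extra_vms * burst_hours * dynamic_per_vm_hour
)

print(f"All-dynamic: ${all_dynamic:,.2f}/month")  # ~$3,501
print(f"Blended:     ${blended:,.2f}/month")      # ~$1,716
```

On these assumed figures the blended estate costs roughly half as much; the broader lesson is that dynamic capacity earns its premium only for the hours of demand you genuinely cannot predict.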

Reverting to on-premises infrastructure after being burnt by public cloud might be the right answer for your organization. But it could also be cutting off your nose to spite your face. Research private cloud options first, ask the difficult questions, and make sure the choice is right for you.

Tim Whiteley
Co-founder, Inevidesk

Tim Whiteley is the co-founder of Inevidesk, where he leverages over a decade of technical expertise to make virtual desktops accessible to organisations with high graphical demands. Through a combination of deep industry knowledge and strategic partnerships, Tim is helping businesses transition from physical workstations to GPU-powered Virtual Desktop Infrastructure (VDI), enabling them to work flexibly from anywhere without sacrificing performance. His focus is to drive agility, productivity, and competitiveness by delivering scalable, cost-efficient VDI solutions that are easy to deploy and deliver long-term business value.