Businesses are gambling with application availability
New research has found that a high frequency of application downtime is caused by risky business practices and lax IT processes.
Major new research into the causes of application failure and downtime has also highlighted how it is affecting business productivity, damaging brands and hitting the bottom line.
Over half (57 per cent) of more than 1,200 IT professionals questioned by analyst Freeform Dynamics about the frequency with which IT system failures impact their businesses said disruptions occurred on at least a monthly basis. Of these, 24 per cent said they happened at least once a week.
As well as citing downtime as having a direct impact on productivity, increasing IT overhead and causing knock-on effects such as delays to processes, schedules and plans, one in five organisations said they suffered brand damage or tangible financial loss on at least a quarterly basis.
Martin Atherton, Freeform Dynamics research director, told IT PRO the reported frequency and impact of application downtime was surprising. "This research shows this is an issue begging for action. IT organisations need to think more about application resiliency and procedures for fixing failures."
He said much of the exposure leading to high failure rates is down to the fact that system availability is only considered towards the end of the project lifecycle. "Application downtime is perceived with so much frustration by operations because they are not involved in the development side of the lifecycle," said Atherton.
"The operational side of the IT organisation is also the one that keeps systems running and so understands the end users' needs," he added. "But the resiliency they need is generally only considered after the application falls over, by which time it's too late for the 20 per cent who said downtime results in tangible business damage."
The research uncovered issues ranging from inadequate application lifecycle management, leading to software that isn't 'operations ready', to simple things like an absence of failover systems for key applications or a lack of effective monitoring to pre-empt potential failures.
Larger organisations (26 per cent of respondents had 5,000 employees or more) were more likely to be hurt by the failure of heavily integrated and customised core business applications. Smaller organisations (44 per cent of respondents had fewer than 250 employees and 30 per cent had between 250 and 5,000) found point applications like email more likely to cause business disruption.
Atherton advised: "Don't try and boil the ocean. Look at problem areas first and make sure you have the right tools in the right places without creating islands of information. The research showed that those organisations with targeted IT SLAs [service level agreements] aligned with business objectives did best."
The research, sponsored by business continuity vendor Neverfail, included 49 per cent of responses from UK organisations, 16 per cent from the rest of Europe, 20 per cent from the US and 15 per cent from the rest of the world.
A 25-year veteran enterprise technology expert, Miya Knights applies her deep understanding of technology gained through her journalism career to both her role as a consultant and as director at Retail Technology Magazine, which she helped shape over the past 17 years. Miya was educated at Oxford University, earning a master’s degree in English.
Her role as a journalist has seen her write for many of the leading technology publishers in the UK, such as ITPro, TechWeekEurope, CIO UK and Computer Weekly, as well as a number of national newspapers including The Times, The Independent and the Financial Times.