How the cloud helps build resilience: clearing up misconceptions about the costs and risks
Customers who are still running on-premises infrastructure don’t always understand how a cloud strategy can contribute significantly to organizational resilience. In fact, they often come to the conversation with misconceptions about efficiency, security, and cost that lead them to conclude the cloud actually undermines resilience.
One might call these myths, but I believe they arise from a lack of exposure. Finance leaders sometimes apply old models to new environments and overlook critical modern factors. As a result, they make weaker strategic decisions because they don’t have the right information.
I want to clear up six of those misconceptions so more organizations can take advantage of the cost savings and risk management that the cloud offers. I’m confident that when finance leaders have all the relevant facts, they can better assess the opportunities of the cloud.
Misconception 1: “I will be more secure if I control my own environment.”
The risk teams that support CFOs and boards understand the significant threats out there. An intrusion or data leak costs far more to recover from than to defend against in the first place. The constant stream of news about security breaches and regulatory fines helps drive that message home.
However, the question here is whether you can manage those risks better in an on-premises environment you control, or in a cloud environment. To answer that question, you have to consider two distinct aspects of security.
From a platform security standpoint, a single company has limited visibility into different threat vectors in the marketplace. But a large cloud vendor like Microsoft can aggregate the signals coming from thousands of large enterprise customers with varying levels of security risk. We regularly see bad actors trying to intrude into these environments and can learn from those signals and inoculate against them faster than a company could do on its own. We can also come up with more sophisticated security mechanisms to control, investigate, and detonate malicious code.
From an employee security standpoint, a company would have to assemble many different point solutions for identity management, content management, application management, and password management to monitor and control access. This piecemeal setup comes with additional complexity and cost, and the seams between tools create weaknesses that attackers can exploit. When a single vendor provides those capabilities with seamless integration built in, you get the benefit of a simpler, more defensible solution.
So, given that threats are evolving quickly, and employee environments are becoming more complex, outsourcing security to those who are on the front line often lowers both risk exposure and cost.
Misconception 2: “My server capacity should be determined by peak demand.”
Lead times and growth force on-premises environments to buy capacity today for the maximum demand of the future, leaving much of that capacity idle until growth materializes. Beyond the actual production environment, there are also requirements for disaster recovery, business continuity, and failover capacity.
However, the promise of the cloud is scalability. By aggregating demand across many customers, a cloud provider can run more efficiently, allowing customers to acquire capacity based on their current average demand and scale up and down by the minute as peaks come and go.
There is also different pricing and packaging for consistent workloads, which can be run on compute reservations at a significant discount. So, cloud server capacity can be determined and paid for based on actual granular usage and not averaged up to peak requirements.
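To make that math concrete, here’s a minimal sketch in Python. All of the numbers (the demand profile, the failover factor, the core counts) are hypothetical assumptions for illustration, not benchmarks:

```python
# A back-of-the-envelope comparison of buying for peak on-premises
# versus paying for autoscaled capacity in the cloud.
# All numbers below are hypothetical assumptions for illustration.

HOURS_PER_MONTH = 730

baseline_cores = 400        # cores needed most of the day (assumed)
peak_cores = 1_000          # cores needed during a short daily peak (assumed)
peak_hours_per_day = 2      # assumed duration of the peak

# On-premises: capacity must cover the peak at all times, plus
# failover capacity for disaster recovery.
failover_factor = 1.5       # assumed DR/failover overhead
on_prem_cores = peak_cores * failover_factor
on_prem_core_hours = on_prem_cores * HOURS_PER_MONTH  # paid for 24/7

# Cloud: pay for the baseline, scaling up only during peak hours.
# (Real autoscaling is per-minute; hourly granularity keeps this simple.)
peak_hours = peak_hours_per_day * 30
cloud_core_hours = (baseline_cores * (HOURS_PER_MONTH - peak_hours)
                    + peak_cores * peak_hours)

print(f"On-premises core-hours paid per month: {on_prem_core_hours:,.0f}")
print(f"Cloud core-hours paid per month:       {cloud_core_hours:,.0f}")
print(f"Share of on-prem capacity paid for but unused: "
      f"{1 - cloud_core_hours / on_prem_core_hours:.0%}")
```

On these assumptions, roughly 70 percent of the on-premises capacity is paid for around the clock but never used.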
Making core IT infrastructure more modular while maintaining high availability is an important driver of digital resilience.
Misconception 3: “Impact on EBITDA should determine what cloud strategy is adopted.”
EBITDA creates the illusion of cost efficiency by ignoring the depreciation behind massive IT capital expenditures. Because server purchases are capitalized and their depreciation sits below the EBITDA line, while cloud services are an operating expense that hits EBITDA directly, the metric makes on-premises environments look cheaper than they really are. This illusion typically results in financially inefficient IT environments whose costs are rarely measured properly, because EBITDA provides no incentive to do so.
Instead, cash flow should be the leading indicator for evaluating a cloud strategy, because it enables a proper financial comparison. Read on for more details as to why.
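Here’s a minimal sketch of how the two metrics can point in opposite directions for the same decision. Every figure (the capex, the depreciation schedule, the opex, the cloud bill) is a hypothetical assumption:

```python
# How the same cloud decision can look on EBITDA versus cash flow.
# Every figure is a hypothetical assumption for illustration.

capex = 5_000_000          # assumed on-premises server purchase
depreciation_years = 5     # assumed depreciation schedule
on_prem_opex = 400_000     # assumed annual power, space, and maintenance
cloud_opex = 1_200_000     # assumed annual cloud bill for the same workloads

# EBITDA view: depreciation is excluded, so only opex shows up.
on_prem_ebitda_cost = on_prem_opex          # looks cheap
cloud_ebitda_cost = cloud_opex              # looks expensive

# Cash view: the capex is real money out the door, spread over the
# same five-year horizon for an apples-to-apples comparison.
on_prem_cash_per_year = capex / depreciation_years + on_prem_opex
cloud_cash_per_year = cloud_opex

print(f"EBITDA impact per year: on-prem ${on_prem_ebitda_cost:,} "
      f"vs cloud ${cloud_ebitda_cost:,}")
print(f"Cash cost per year:     on-prem ${on_prem_cash_per_year:,.0f} "
      f"vs cloud ${cloud_cash_per_year:,}")
```

With these assumptions, the on-premises option looks $800,000 cheaper per year on EBITDA while actually costing $200,000 more per year in cash.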
Misconception 4: “My on-premises environment is efficient because it runs at a very high utilization.”
Just because your capacity is being used, that doesn’t guarantee it’s delivering value. It’s important to distinguish the portion of the capacity used for real demand from the capacity that is running obsolete or unnecessary applications, or that is being pushed to users just because it is available.
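Here’s a minimal sketch of that distinction, using a hypothetical application inventory; the app names, core counts, and value judgments are all illustrative assumptions:

```python
# Separating raw utilization from capacity that actually delivers value.
# The inventory below is hypothetical: (app name, cores used, delivers value?)

apps = [
    ("order-processing", 120, True),
    ("reporting-legacy",  80, False),  # obsolete, kept "because it runs"
    ("crm",               60, True),
    ("batch-archive",     90, False),  # candidate for sunsetting
]

total_cores = 400  # assumed installed capacity

used_cores = sum(cores for _, cores, _ in apps)
valuable_cores = sum(cores for _, cores, keep in apps if keep)

print(f"Raw utilization:    {used_cores / total_cores:.0%}")      # looks healthy
print(f"Value-adding usage: {valuable_cores / total_cores:.0%}")  # the real number
print("Sunset candidates:", [name for name, _, keep in apps if not keep])
```

In this sketch the environment reports 88 percent utilization, but only 45 percent of capacity serves applications worth keeping.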
Often, IT teams will map their infrastructure and count how many cores are running without evaluating whether the applications soaking up that capacity are actually worth keeping. When we engage with customers, we analyze all the apps running and ask about future usage and load. If we find apps that aren’t delivering value, customers may decide to sunset those apps while continuing to run them on-premises instead of migrating them. That’s a good way to take advantage of fully depreciated servers (which leads us to the next misconception).
Misconception 5: “An on-premises environment running fully depreciated servers is very cheap.”
While the useful life of some servers may exceed their depreciation life, allowing for a few years of “depreciation savings,” other costs such as warranties, maintenance, and downtime become significant as failure rates increase. In addition, outdated hardware can introduce security vulnerabilities and high risk for critical applications.
We sit down with customers to get a cash-basis view of what it costs to run their servers now, and what it will cost to replace and run them in the future, so they can see the full total cost of ownership. We also talk about the cost of forgoing the agility and increased security that come with using a cloud vendor. When you don’t think about what’s going to happen five years from now, you end up understating the true cost.
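As a minimal sketch of that fuller picture, with purely hypothetical warranty, maintenance, and downtime figures:

```python
# The "free" fully depreciated fleet, priced on a cash basis.
# All figures are hypothetical assumptions for illustration.

servers = 200
extended_warranty_per_server = 600   # assumed, per server per year
maintenance_per_server = 450         # assumed, rising with failure rates
downtime_hours_per_year = 40         # assumed fleet-wide outage hours
downtime_cost_per_hour = 5_000       # assumed lost productivity and revenue

annual_cash_cost = (
    servers * (extended_warranty_per_server + maintenance_per_server)
    + downtime_hours_per_year * downtime_cost_per_hour
)

years_past_depreciation = 3          # assumed extra years of service
print(f"Annual cash cost of the 'free' fleet: ${annual_cash_cost:,}")
print(f"Over {years_past_depreciation} extra years: "
      f"${annual_cash_cost * years_past_depreciation:,}")
```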
Misconception 6: “Cost per unit is better with a private cloud than a public cloud.”
It’s common for private cloud operators to compare their theoretical unitary costs (cost per virtual machine, or VM) with public cloud pricing. Unfortunately, unitary costs mislead customers into ignoring the volumetric benefits of a public cloud, for three main reasons (the sketch after this list illustrates the first):
- Most private cloud environments oversubscribe the volume of VMs well beyond the physical infrastructure capacity behind them. Hence, private cloud costs get allocated across an artificially high number of VMs that cannot run concurrently, creating the false perception that unitary costs are much lower than they truly are when looking at physical infrastructure.
- When private cloud resources are underutilized, a unitary cost comparison ignores the fact that a public cloud can produce the same amount of work with significantly fewer resources.
- The total costs allocated are not comprehensive. For example, it is common to ignore some centralized costs (such as personnel, supply chain, and procurement) or opportunity costs.
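A minimal sketch of the first effect, with hypothetical figures for cost, physical capacity, and the oversubscription ratio:

```python
# How oversubscription deflates the apparent cost per VM.
# All figures are hypothetical assumptions for illustration.

private_cloud_annual_cost = 2_400_000  # assumed all-in cost
physical_vm_capacity = 1_000           # VMs the hardware can truly run at once
oversubscription_ratio = 3.0           # assumed provisioned-to-physical ratio

provisioned_vms = physical_vm_capacity * oversubscription_ratio

# Allocating cost across VMs that cannot all run concurrently
# hides the real unit cost of the physical infrastructure.
apparent_cost_per_vm = private_cloud_annual_cost / provisioned_vms
true_cost_per_vm = private_cloud_annual_cost / physical_vm_capacity

print(f"Apparent cost per VM: ${apparent_cost_per_vm:,.0f}")
print(f"True cost per VM:     ${true_cost_per_vm:,.0f}")
```

On these assumptions, the allocated cost per VM looks three times lower than what the physical infrastructure can actually support concurrently.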
Once these misconceptions have been cleared up in conversations with customers, the conclusion is often inevitable: the cloud offers more efficiency and cost savings without compromising security. Strengthening financials, shifting to scalable infrastructure, and improving risk protection all contribute substantially to an organization’s ability to stay resilient in the face of disruption.
Learn more about the Microsoft approach to finance transformation. Visit the Microsoft Modern Finance website and find tools and resources to improve your operations, increase efficiencies, and drive greater impact in your organization.