The Myth of the Cost Saving Cloud

"We starting using the cloud and saved a bunch of money" is a common refrain you might hear from executives these days. On the surface, it sounds pretty great. Who wouldn't want to an opportunity to reduce capital expenditure that only required writing a check to a company in Seattle. But is it truly that easy?

The cloud has the potential to offer impressive efficiency, but this benefit isn't automatic. Because of this, too many companies never fully realize the gains. The good news, however, is that compared to legacy IT, those benefits are vastly simpler to achieve with the right techniques.

A quick Google search suggests where this notion comes from. CIO magazine quickly pops up with an article explaining how cloud computing can provide added value in cost savings. One statement stands out: "For example, users only pay for what they use with a cloud platform, and you can see exactly what the power is costing you through the transparency of a cloud provider's interface."

This is a true statement, depending on how we interpret the word "use". What constitutes use? With many cloud providers, "use" means someone has allocated resources, but they might not in fact be actually using them. It is similar to renting an apartment: the agreement guarantees the exclusive right to the space, but the owner has little concern whether you actually spend any time there.

Now, to take the apartment analogy further, you might point out that renting that apartment carried far less capital risk than building and owning the property yourself. Also true; but perhaps this is where the analogy breaks down. Unlike real estate, the cloud has a relatively low transaction cost. It's extremely easy to allocate server or storage resources in the cloud; with automation it can take seconds and a couple of clicks. Yet once allocated, those resources stay that way until we take another action to deallocate them when we are done. This leads us to one of our next challenges: dealing with unused resources.
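To make that low transaction cost concrete, here is a minimal sketch of what allocating a server looks like through a cloud SDK. It assumes AWS and the boto3 Python library purely as an example; the AMI ID and instance type are placeholders, not recommendations.

```python
import boto3

# A minimal sketch (assuming AWS and boto3): with credentials configured,
# allocating a server is a single API call that finishes in seconds.
ec2 = boto3.client("ec2", region_name="us-west-2")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="m5.xlarge",         # often sized "generously, just in case"
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Allocated {instance_id}; it keeps billing until someone terminates it.")
```

Deallocating is an equally short call (ec2.terminate_instances), but deciding whether it is safe to make that call is where the friction lives.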

The friction inherent in deallocating cloud resources is one of the obstacles to keeping cloud use efficient. If I accidentally remove a resource that was in fact in use, the consequences to the business can be severe. I've been in many IT orgs in my career where the attitude was that it simply wasn't worth cleaning up unused cloud resources because of the associated risk. An enterprise IT organization has to constantly work against the cloud's almost magnetic force, which collects new virtual machines and storage without regard for usefulness. Side note: it is this behavior of clouds that makes them such appealing investments to Wall Street.
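Tooling can lower that risk without taking the human out of the decision. The sketch below, again assuming AWS, flags instances whose CPU has stayed low for two weeks and tags them for review rather than deleting them. The threshold, lookback window, and tag name are invented for illustration, and low CPU is only a rough proxy for "unused".

```python
import datetime
import boto3

# A conservative sketch (assuming AWS): find running instances whose average
# CPU stayed below a threshold for two weeks and tag them for review rather
# than terminating them -- deletion stays a deliberate human decision.
ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")

IDLE_CPU_PERCENT = 2.0
LOOKBACK = datetime.timedelta(days=14)
now = datetime.datetime.now(datetime.timezone.utc)

reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)["Reservations"]

for reservation in reservations:
    for instance in reservation["Instances"]:
        instance_id = instance["InstanceId"]
        datapoints = cloudwatch.get_metric_statistics(
            Namespace="AWS/EC2",
            MetricName="CPUUtilization",
            Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
            StartTime=now - LOOKBACK,
            EndTime=now,
            Period=86400,  # one datapoint per day
            Statistics=["Average"],
        )["Datapoints"]
        if datapoints and all(p["Average"] < IDLE_CPU_PERCENT for p in datapoints):
            # Tag for review instead of terminating: cheap to undo, hard to regret.
            ec2.create_tags(
                Resources=[instance_id],
                Tags=[{"Key": "review-for-cleanup", "Value": "low-cpu-14d"}],
            )
            print(f"{instance_id}: flagged as possibly idle")
```

The point is the posture: the script only gathers evidence and leaves a breadcrumb; termination remains a separate, reviewable step.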

Another inherent challenge is sizing the allocation in the first place. To use the apartment analogy again: if I rent a place that I quickly outgrow, it will be costly to move again, so I might want to over-allocate in anticipation of future needs. Over-allocation is a common IT practice in the cloud, but it's fraught with risk: the sizing is often anecdotal ("I'll just double the number I need") and done so frequently that the total overhead becomes substantial at scale. There must be a better way.
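A quick back-of-the-envelope calculation shows how "just double it" compounds. The numbers below are entirely hypothetical, but the shape of the result will be familiar to anyone who has audited a cloud bill.

```python
# Hypothetical numbers: what "I'll just double it" costs once everyone does it.
services = 200                 # teams/services doing their own sizing
needed_vcpus_per_service = 8   # what each one actually requires
overprovision_factor = 2.0     # "double it to be safe"
cost_per_vcpu_month = 25.00    # assumed blended $/vCPU/month

needed = services * needed_vcpus_per_service
allocated = needed * overprovision_factor
waste = (allocated - needed) * cost_per_vcpu_month

print(f"Allocated {allocated:.0f} vCPUs to cover {needed} actually needed")
print(f"Monthly spend on the safety margin alone: ${waste:,.0f}")
# -> Monthly spend on the safety margin alone: $40,000
```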

There are a lot of solutions that try to solve these challenges, including tools to track cloud assets, report on idle systems, and so on. But one thing many don't realize is that Cloud Native technologies such as containers and Kubernetes can help solve them as well, in addition to their other benefits for application teams. For example, if my application is well packaged and easy to move from server to server, the desire to pre-allocate decreases, because I can respond to increased demand just as quickly by moving to a larger server. Additionally, features such as auto-scaling let the system monitor actual usage of the application and allocate as well as deallocate resources as needed. This removes the cognitive burden from an engineer who has to decide to clean up resources, as it's done scientifically and programmatically (reducing the chance of error).
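To illustrate what "scientifically and programmatically" means in practice, here is a conceptual sketch of the feedback loop an autoscaler runs. This is not Kubernetes' implementation, and the target, bounds, and sample numbers are invented, but the proportional rule mirrors the one described in the Horizontal Pod Autoscaler documentation.

```python
import math

# Conceptual sketch of an autoscaler's decision: scale replica count toward a
# target utilization, within bounds, based on what is measured -- not guessed.
TARGET_UTILIZATION = 0.60
MIN_REPLICAS, MAX_REPLICAS = 2, 20

def desired_replicas(current_replicas: int, observed_utilization: float) -> int:
    """Proportional rule: desired = ceil(current * observed / target),
    clamped to the allowed range."""
    raw = math.ceil(current_replicas * observed_utilization / TARGET_UTILIZATION)
    return max(MIN_REPLICAS, min(MAX_REPLICAS, raw))

# Demand spikes: utilization observed at 90% across 4 replicas.
print(desired_replicas(4, 0.90))   # -> 6  (scale up)

# Demand falls off: utilization drops to 15%.
print(desired_replicas(6, 0.15))   # -> 2  (scale down)
```

Scaling down is just the same rule run in reverse, which is exactly the cleanup step humans tend to skip.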


And with Cloud Native design, the benefits are not limited to operational efficiency. Developers benefit as well from the composable, easy-to-test, easy-to-prototype building blocks in Kubernetes and similar technologies. This means that even before your application reaches the cloud, you have increased your development velocity and improved the experience for your developers.

I hope these examples show that the journey to the cloud holds great opportunity, and that the opportunity can be maximized by using the right tools for the job. There is a lot of untapped potential to make things better, so go out there and do something awesome!

