Tracking Cloud and Data Center Spending Realistically

Numbers are always interesting, and sometimes they're even more interesting when you look at them in a different context. There's a really interesting piece in The Next Platform on data center shifts, and I'd like to take that different look at it to raise some points I think are important.

The article's thrust is that the public cloud is the leading factor in data center deployment, despite the fact that the IDC data it cites seems to show that investment in the public cloud declined versus last year, while spending on traditional data centers grew. My own contacts with enterprises also suggest that their incremental spending on the public cloud (which of course would likely drive cloud infrastructure spending) grew last year and is growing more slowly this year. So far, we're consistent.

Where I have a different perspective is in the interpretation of the numbers. Let me summarize what I've heard from enterprises over the last couple of years to illustrate.

First, enterprises almost universally accept that the cloud is a better platform on which to deploy “front-end” technology used to present information and support customer, partner, and even worker application access. This has always been the dominant enterprise mission for the cloud. During the pandemic and lockdowns, changing consumer behavior (toward online shopping) and work-from-home combined to accelerate changes in application front-ends, and since those were dominantly cloud-hosted, that drove increased cloud spending. The cloud providers responded by building out capacity.

Second, what we've traditionally called "core business applications" have always been divided (like all business applications) into "transaction processing" and "user interface". The latter tends to change faster, and more often, than the former, and you can see that by reflecting on a widely used application like check processing (demand deposit accounting, or DDA, in banking terms). What's involved in recording money movement doesn't change much, but how you interact with the process can change quickly. The point is that during the same period, the transaction-processing piece of applications, almost always hosted in the data center, didn't require more resources. Keeping things up to date, meaning the replacement of obsolete gear, was the only driver.
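To make that division concrete, here's a minimal sketch in Python, using entirely hypothetical names and a toy ledger, of how a DDA-style application might split into a stable transaction-processing core and a fast-changing front end. It's an illustration of the architectural point, not a description of any real banking system.

```python
from dataclasses import dataclass
from datetime import datetime

# --- Transaction-processing core: the stable, data-center side ---
# Recording money movement changes rarely; this logic could sit on
# premises, next to the ledger data it updates.

@dataclass
class Posting:
    account: str
    amount_cents: int          # negative = debit, positive = credit
    posted_at: datetime

class LedgerCore:
    def __init__(self):
        self.balances = {}     # account -> balance in cents
        self.journal = []      # append-only record of postings

    def post_check(self, account: str, amount_cents: int) -> Posting:
        """Apply a cleared check against an account and journal it."""
        posting = Posting(account, -amount_cents, datetime.now())
        self.balances[account] = self.balances.get(account, 0) + posting.amount_cents
        self.journal.append(posting)
        return posting

# --- Front end: the fast-changing, cloud-friendly side ---
# How users see and trigger the same transaction changes often:
# web today, mobile tomorrow, a chat channel after that.

def web_front_end(core: LedgerCore, account: str, dollars: float) -> str:
    posting = core.post_check(account, int(round(dollars * 100)))
    return f"Check for ${dollars:.2f} posted to {account} at {posting.posted_at:%H:%M}."

if __name__ == "__main__":
    core = LedgerCore()
    print(web_front_end(core, "ACCT-001", 125.40))
```

The core's interface barely changes even when the front end is rewritten for a new channel, which is why the two halves can live on different infrastructure and follow different spending curves.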

Third, the pandemic put a lot of financial pressure on companies, and that pressure encouraged them to control their spending. If sustaining the top-line revenues in the face of change demanded more be spent on the cloud, the changes to the data center would logically be deferred as much as possible. They were, resulting in data center spending dips last year. However, what’s deferred isn’t denied, and so this year the data center got its turn at the well.

The article, like much that’s written on this topic today, seems to take the position that we’re on the road to moving everything to the cloud, that company data centers will pass away. The current shift is seen more as a bump in the road than an indication that the total-replacement theory isn’t correct. Recall that recently, the CEO of Amazon’s cloud business admitted that some workloads would never move. The article says “At some point in the future, maybe in 2030 or 2035, maybe this way [the data center way] of computing will go away. Or, maybe it will just live on forever as a niche for doing back office stuff.” I think the total-replacement theory is wrong, but that doesn’t mean that over time we don’t see more cloud in our infrastructure. It means we have to think of applications, clouds, and data centers differently.

Let me quote another line from the article: “We were born in the mainframe era, and there is a good chance that when we colonize Mars, the payroll will be done by some punch card walloping mainframe that is gussied up to look like an iPhone XXVI with a pretty neural interface or some such nonsense.” This view, which you have to admit sounds a bit condescending, reflects another misconception that relates to how we think about applications and how they’re hosted.

Let's go back to the bank example. If you look at our DDA application, you realize that while there are a lot of transactions associated with check-cashing, a lot of doing business is tied not to the transactions but to their results. There are statements, there are regulatory reports, there's that payroll and personnel stuff, and there are the myriad analytical reports that are run to manage the business, optimize profits, and plan for the future. One bank told me that the result of a check being cashed was referenced, on average, almost a dozen times by analytics and reporting applications. In a cloud world that charges for data movement, all those references would raise costs considerably if the transaction data were moved to the cloud.
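A back-of-envelope calculation shows how that reference multiplier interacts with data-movement pricing. Every figure below is an assumption of mine for illustration; the transaction volume, record size, and per-GB egress rate are made up, not taken from the article or from any provider's price list.

```python
# Rough estimate of data-movement cost if DDA transaction results lived
# in the cloud and were pulled out for reporting and analytics.
# All inputs are illustrative assumptions, not quoted prices or volumes.

transactions_per_day = 20_000_000   # assumed daily posting volume
bytes_per_record = 4_000            # assumed size of one result plus context
references_per_record = 12          # "almost a dozen" downstream reads
egress_rate_per_gb = 0.09           # illustrative per-GB egress charge (USD)

daily_gb_moved = transactions_per_day * bytes_per_record * references_per_record / 1e9
daily_cost = daily_gb_moved * egress_rate_per_gb
annual_cost = daily_cost * 365

print(f"Data moved per day:   {daily_gb_moved:,.1f} GB")
print(f"Egress cost per day:  ${daily_cost:,.2f}")
print(f"Egress cost per year: ${annual_cost:,.2f}")
```

The exact result doesn't matter; the point is that the dozen-reference multiplier and the per-GB rate act linearly on the bill, and real reporting and analytics jobs typically scan far more data than just the records they ultimately reference, so an estimate like this is a floor rather than a ceiling.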

We just had a couple of major problems with online services, most recently the big Facebook outage, caused by Internet-related configuration errors. If you lose your network access to the cloud, you lose your applications. Stuff in the data center could be impacted by network problems, of course, but the data center network is more contained, has fewer components, is simpler, and is under a company’s direct control. Losing the cloud part of an application is bad enough, but to lose all data access? Company management tends to worry about that sort of thing.

Then there’s the security issue. Just last week, Amazon’s streaming game platform, Twitch, was hacked, creating what was described as the largest data compromise to date. This is Amazon, a cloud provider. Sure, enterprises have also suffered data losses, ransomware, and so forth, but most businesses are more comfortable with risks that are under their control than risks that they can’t control.

All these points, in my view, illustrate the most important conclusion we can draw from the data in the article. The division of application functionality between cloud and data center has been responsible for cloud growth; the emphasis on application front-ends and the user experience concentrates investment in the cloud because the cloud makes a great front-end. To change that division, and the spending pattern it creates, there would have to be a fundamental shift in cloud pricing and perception, a shift that I think would likely hit the cloud providers' bottom line immediately, and might well not generate enough change to offset the loss.

Technology change isn’t mandatory for businesses. We adopt new things because they serve us better, and when we understand just why that is, we can optimize those new things so that we can justifiably adopt them faster and more broadly. The cloud is such a thing. There’s no reason to assign it a mission it’s not going to fulfill, a mission that hides its real value and potentially reduces development of the tools that would magnify that value. Reality always wins, and facing it is always the best idea.