I’m a programmer and software architect by background, and a student of computing history. I’ve seen computing transform from the retrospective analysis of “keypunched” paper records of retail transactions to highly interactive assistance with online, real-time activity. Along the way, computers have migrated from the data center to the desktop, to the smartphones in our hands, and to the cloud.
All of this has totally changed our relationship with computers. Despite that, our thinking about computing still centers on the data center, the same data center that held those old-line “mainframes” with punched-card readers sixty years ago. Can we exploit the current state of computing and networking, the state that has given us the Internet and the cloud, while we’re still focused on data centers? That’s the real question that “cloud first” planning should be addressing, and it’s the one we’ll address here.
The cloud is not, on a system-by-system basis, cheaper than a data center. Run the queuing-theory math and you’ll see that most enterprises could achieve economies of scale comparable to those of the cloud providers. What makes the cloud different is that it’s elastic in capacity and distributed in space. Achieving those properties in a data center would require sizing resources well beyond “average usage” levels, adding to data center costs merely to duplicate what’s already an inherent property of the cloud. That means the proper cloud applications are those that benefit from those cloud properties, and that’s where changes in behavior come in.
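To make the queuing-theory point concrete, here is a minimal sketch using the standard M/M/c (Erlang C) model with illustrative numbers of my own choosing; the load figures and wait-probability target are assumptions for demonstration, not data from any enterprise. It shows why pooled, shared capacity needs fewer servers than the same load split across separately provisioned sites:

```python
import math

def erlang_c(c: int, a: float) -> float:
    """Probability an arrival must wait in an M/M/c queue
    with c servers and offered load a (in Erlangs)."""
    if a >= c:
        return 1.0  # unstable: everyone waits
    s = sum(a**k / math.factorial(k) for k in range(c))
    top = a**c / math.factorial(c) * c / (c - a)
    return top / (s + top)

def servers_needed(a: float, target_wait_prob: float = 0.05) -> int:
    """Smallest server count keeping the wait probability under target."""
    c = max(1, math.ceil(a))
    while erlang_c(c, a) > target_wait_prob:
        c += 1
    return c

# Ten sites, each carrying 8 Erlangs of load, provisioned separately:
separate = 10 * servers_needed(8.0)
# The same total load pooled into one shared, cloud-like resource:
pooled = servers_needed(80.0)
print(separate, pooled)  # pooling needs noticeably fewer servers
```

The gap between the two numbers is the economy of scale; a large enterprise pooling its own load captures much of it, which is why the cloud’s real edge is elasticity and distribution rather than raw unit cost.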
The changes in computing’s mission that we’ve seen boil down to two things. First, we have replaced “shopping” with its Internet equivalent. We now browse products, decide what to get, and ever more often complete the order entirely online. Second, we have exploited portable and mobile devices to move information delivery outward to workers’ points of activity. We no longer ask workers to go to the computer; it comes with them. These mission shifts are what has been driving the cloud.
In the old days, a shopper might go to a counter and ask a salesperson to see (for example) a watch. They’d look it over, ask questions, and if they were satisfied, make the purchase. The chances of a successful purchase thus depend on that at-the-counter experience, and when you eliminate that in favor of an online experience, you need to give the prospect the same level of confidence that they’re making the right decision. You need to move the online process close to the prospect so it can hand-hold them through product evaluation and sale, and so you need “the cloud”.
In the old days, a worker would have a desk in an office, and when they needed company information to do their job, they’d go to that desk in that office and use a computer there. If they needed some information to be taken away from the office, to a prospect or even to a warehouse, they’d print something out. Today, the worker would simply use a phone, tablet, or laptop, and carry it with them everywhere. If they needed information, they’d use a “portal” to the company databases. That portal would have to be customized to how they use the information in order to optimize their productivity. They’d need the cloud too.
Both our missions need the cloud, but neither mission needs the cloud to absorb all of our IT processes. If we consider the evolution of computing I cited above, we see that the basic information associated with a business transaction (like the purchase of a watch) hasn’t really changed. It’s how IT relates to the purchase process that has changed, which means it’s the part of IT that does the relating that’s a cloud candidate. The central database, the processing of transactions against it, and the regulatory and management reporting needed to run the business are all largely the same. In fact, in most cloud applications today, the cloud creates a front-end to legacy applications. So what we should have expected from the start is that enterprise cloud applications are hybrid cloud applications. We should thus visualize the application model of the future, the one that’s already evolving, as a model where the Internet and the cloud form a distributed virtual GUI that is then linked back to the data center. Given that, we can see that “the network” is no longer a VPN that links all the branch locations; it’s now part of the Internet and the cloud.
From a compute perspective, then, the goal should be to gradually, through regular application modernization, slim down the core data center applications to be nothing but transaction processors and analytics associated with the corporate databases that are collected from transaction processing. Beyond this, the logic should be pushed out into the cloud where it can better support the real-time relationships with customers/prospects, partners, and employees. This is already being done, but I don’t see much awareness of why, and so the process of modernizing data center apps isn’t necessarily being directed optimally.
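The split described above can be sketched in a few lines of code. This is an illustration of the architectural boundary, not anyone’s actual system: the class names, the tiny catalog, and the prices are all hypothetical. The data-center core does nothing but post transactions and report on them, while all the real-time, customer-facing logic lives in the cloud front-end:

```python
from dataclasses import dataclass, field

@dataclass
class LegacyTransactionCore:
    """The slimmed-down data-center side: transaction processing
    and reporting against the database of record, nothing more."""
    ledger: list = field(default_factory=list)

    def post_transaction(self, sku: str, qty: int, price: float) -> int:
        self.ledger.append({"sku": sku, "qty": qty, "price": price})
        return len(self.ledger)  # transaction id

    def report_revenue(self) -> float:
        return sum(t["qty"] * t["price"] for t in self.ledger)

class CloudFrontEnd:
    """The cloud side: browsing, hand-holding, order assembly — the
    'relating' logic — which only hands finished transactions back
    to the legacy core."""
    def __init__(self, core: LegacyTransactionCore):
        self.core = core
        self.catalog = {"watch-01": 199.0, "watch-02": 349.0}

    def browse(self) -> dict:
        return dict(self.catalog)

    def checkout(self, sku: str, qty: int) -> int:
        price = self.catalog[sku]
        return self.core.post_transaction(sku, qty, price)

core = LegacyTransactionCore()
shop = CloudFrontEnd(core)
txn = shop.checkout("watch-01", 2)
print(txn, core.report_revenue())  # 1 398.0
```

Everything above the `checkout` call could be scaled elastically and placed close to the prospect; the core behind it stays where the database of record lives.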
From a network perspective, this means that corporate VPNs are going to diminish because office worker traffic will come in through the Internet/cloud front-ends. Businesses like banks might, for a time, still justify VPNs for specialized devices, but even that is likely to vanish as we start building true IoT and online appliances designed to be used on the Internet and with cloud applications. What will replace them is a virtual network model, much like SD-WAN, but likely with a cloud-hosted security component that makes the edge what’s popularly known as a Secure Access Service Edge or SASE. Thus, it would be smart for enterprises to be planning for this evolution.
If we consider the computing and networking pieces together, the best approach for planners to take, and for vendors to support, would be to start moving workers to Internet/cloud portals and SASE rather than office-based VPNs, and to adopt SD-WAN to replace MPLS VPNs where VPNs are still needed because applications haven’t fully migrated or specialized devices still require them.
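The SD-WAN idea referenced above reduces, at its simplest, to policy-based path selection across multiple underlays. The sketch below is purely illustrative; the path names, latency and loss figures, and preference lists are assumptions, not any vendor’s configuration schema:

```python
from dataclasses import dataclass

@dataclass
class Path:
    name: str
    latency_ms: float
    loss_pct: float

def pick_path(paths, preference, max_latency_ms, max_loss_pct):
    """Take the first path in the preference order that meets the
    application's latency/loss policy; fall back to the least-lossy
    path if none qualifies."""
    by_name = {p.name: p for p in paths}
    for name in preference:
        p = by_name[name]
        if p.latency_ms <= max_latency_ms and p.loss_pct <= max_loss_pct:
            return p
    return min(paths, key=lambda p: p.loss_pct)

paths = [
    Path("mpls", latency_ms=20, loss_pct=0.01),
    Path("broadband", latency_ms=35, loss_pct=0.2),
    Path("lte-backup", latency_ms=60, loss_pct=1.0),
]
# A voice-class policy (tight latency and loss) lands on MPLS;
voice = pick_path(paths, ["broadband", "mpls", "lte-backup"], 30, 0.1)
# a bulk-transfer policy tolerates the cheaper broadband link.
bulk = pick_path(paths, ["broadband", "mpls", "lte-backup"], 100, 0.5)
print(voice.name, bulk.name)  # mpls broadband
```

A SASE deployment layers cloud-hosted security (inspection, access policy) on top of exactly this kind of edge decision, which is why the two are usually planned together.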
Of course, service providers (particularly the ISPs and other network operators) will have a role in all of this. Some managed service providers (MSPs) are already offering SD-WAN services that include multiple Internet connections, wireless backup, and so forth. As the quality of Internet access improves, it paves the way for truly making the Internet the universal dial tone of the future. That’s a good thing, because it’s inevitably going to become just that, and operators and enterprises alike should start planning for it. The fusion of the cloud and the Internet will create, finally, one network.
This fusion will facilitate a more complete mission transformation, at the user level as well as at the technology level. The cloud, edge computing, enhanced devices, new software models…all of these things combine to get information power close to us so it can be a part of our lives. In a sense, it’s a reversal of the metaverse model of moving us into virtual reality. Instead, we move virtual reality out to touch us, so the power of AI and digital twinning can make computing more effective in supporting what we do. It’s a future we can approach slowly or quickly, depending on how effectively we can direct the sum of cloud and network toward achieving it. I’d love to see that be a target of venture capital, because that’s where startup innovation would serve us all best.