Three Steps to Maximize 5G (That Nobody Will Like!)

I think there are three steps to 5G success.  Some of the steps are intuitive, some are counter-intuitive, and one may be downright offensive to some of the 5G proponents.  I’m sorry about that, but I have to call things as I see (or, in this case, model) them.

The first step is to recognize that supply-side thinking in telecommunications is dead forever.  Networks once were fulfillments of natural, unequivocal demand.  If your goal is to support people talking and listening, nearly everyone is equipped to play their role.  If your goal is to provide them with information and entertainment, we have a series of objective steps that can identify the service characteristics and total addressable market.  When we try to promote network technology with value propositions beyond the things we can quantify, we’re in a different world, a world that I can’t visit.

5G standards, like every other standards process I’ve either participated in or been aware of for the last 15 years, were flawed by supply-side bias, the classic “Field of Dreams” approach.  I understand why that is; it worked for decades and it eliminates the need for network technologists to deal with uncomfortable questions and issues.  But whatever else happens now, it should be clear that 5G standards haven’t served the industry well.  We can’t go back and fix that at this point, but what we can do is ensure that we don’t throw other technologies under the standards-organization bus.

Rakuten is already demonstrating what’s arguably another model, focusing on delivering service features more than rigid standards compliance.  It illustrates a key point: a cellular network’s standards must define the interface between network users and the network, and the way services cross provider boundaries.  The other role of standards, which is to ensure interoperability among devices within the network, is likely to be less valuable as those devices are implemented as cloud-hosted functions.

But don’t take this as my recommending that 5G is itself immune from standards-created risks.  A number of my telco friends tell me that the 5G vendors are pushing for a “complete commitment” to 5G, meaning adoption of all the 5G standards, including 5G Core.  Even when operators say they’re concerned about the business case and would prefer to ease in via 5G New Radio and NSA (Non-Stand-Alone, meaning 5G NR with a 4G core), vendors push for more commitment.

Vendors have been reluctant to abandon that Field of Dreams, the old notion that infrastructure is always built in anticipation of demand.  That’s not been true for at least fifteen years, people.  Get over it.  The choice today is to face the need to do something meaningful at the application/service-feature level, or fail.  The problem is that for vendors focused on making their numbers in the current quarter, neither of these choices seems viable.  So instead, they just ignore the facts and hope buyers will do the same.

The second step is to embrace open-model technology for 5G.  There are very few 5G vendors who will like that idea, of course, but the fact is that all technologies that depend on broad adoption for their success are going to be under cost pressure.  The more evolutionary they are in terms of their application, the more cost pressure is likely.  5G may have the potential to do new things, but if we picked a random smartphone user and gave them 5G, they’d likely never know they had it.  We need cheaper 5G gear, as Huawei’s success in the space shows.  We’re not going to get gear cheap enough without an open approach.

We have open options, too.  Fierce Telecom described the latest marriage of an open-source device operating system (ONF Stratum) and a Telecom Infrastructure Project (TIP) open-model device (the Cassini optical transport gear).  The two organizations have been increasing their level of cooperation, and operators have been showing broader interest in both groups, particularly in activities arising out of that cooperation.

The overall approach taken by the ONF with Stratum is not dissimilar to the approach taken by traditional operating systems to hardware variability; you have a “driver” that represents a class of hardware interfaces, and this driver then presents a single interface upward to the operating system and any applications.  Stratum embraces the P4 flow programming language this way, and that means that enhancements to network forwarding created at the chip level can be accommodated in an open way.
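
To make that driver analogy concrete, here’s a minimal, purely illustrative Python sketch of the pattern.  The class and method names are my own invention, not Stratum’s actual (gRPC-based) interfaces; the point is only that the software above the driver talks to one abstract interface while the hardware specifics live in interchangeable drivers.

```python
# Illustrative only: hypothetical names, not Stratum's real API.
from abc import ABC, abstractmethod

class ForwardingDriver(ABC):
    """One 'driver' standing in for a whole class of forwarding hardware."""

    @abstractmethod
    def load_pipeline(self, p4_program: str) -> None:
        """Install a P4-defined forwarding pipeline on the target device."""

    @abstractmethod
    def insert_entry(self, table: str, match: dict, action: str) -> None:
        """Program a single forwarding-table entry."""

class MerchantSiliconDriver(ForwardingDriver):
    def load_pipeline(self, p4_program: str) -> None:
        print(f"compiling {p4_program} for a merchant-silicon switch chip")

    def insert_entry(self, table: str, match: dict, action: str) -> None:
        print(f"[ASIC] {table}: {match} -> {action}")

class SmartNicDriver(ForwardingDriver):
    def load_pipeline(self, p4_program: str) -> None:
        print(f"compiling {p4_program} for a SmartNIC target")

    def insert_entry(self, table: str, match: dict, action: str) -> None:
        print(f"[NIC] {table}: {match} -> {action}")

def provision(driver: ForwardingDriver) -> None:
    # The control software never changes when the hardware underneath does.
    driver.load_pipeline("user_plane_forwarding.p4")
    driver.insert_entry("ipv4_lpm", {"dst": "10.1.0.0/16"}, "forward_to_gnb")

provision(MerchantSiliconDriver())
provision(SmartNicDriver())
```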

This hardware/software symbiosis is critical if an open-model approach to 5G is to be successful.  There are many missions associated with 5G, and many approaches to fulfilling them.  Hardware/software innovation is good not only for missions but also for vendors.  Specialization of open hardware with chips, or of open software via higher-level (above Stratum) elements, represents the best response network vendors could have to the inevitable price pressure of 5G, price pressure that’s intrinsic to the space’s dependence on the mass market, not due just to open-model competitors.

Stratum isn’t the only option, of course; DANOS is another network operating system that’s been open-sourced.  I’m less concerned about having a single focus here than about having an optimum hardware-software marriage, and with two initiatives in play we may improve our chances of getting one.  We may also improve the chances that what we end up with can be built up to support features above the network layer.

That brings up the last step, which is to build above the connection layer.  Look for a moment at the Internet.  As a “network”, it’s pretty much the same connectionless IP that’s been around for decades, and in that form, it would have very limited utility, and likely zero mass-market appeal.  Imagine socially minded teens saying “Hey, let’s go push some connectionless packets!”  What made the Internet, made it a cultural revolution, was not IP, but things like DNS, HTML, search engines, SIP servers, and other stuff that represented a higher layer of functionality.  The higher layers build mission support, and missions are what have business cases.

For years now, I’ve been harping on the notion that personalization and contextualization were the critical elements of whatever comes next for both business productivity and consumer entertainment and empowerment.  I’ve used the concept of “information fields” to illustrate how workers or consumers would move through a virtual world made up of what we know about them, their goals, their surroundings.  Obviously, I like the approach, but feel free to suggest a different model.  The point is that by picking the “information fields” model, I can then postulate the specific technical elements of that higher service layer.  From that, it would be possible to architect the layer, defining not only the features that it, in turn, would publish to applications above, but the features it would expect from below.  That’s how we get 5G into the picture.
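
To show what I mean by architecting that layer, here’s one hypothetical way of writing down its boundary.  Every name below is my own illustration of the “information fields” idea, nothing standardized: the layer would publish field publication and subscription upward to applications, and would expect location and latency-bounded delivery from the network below (4G today, 5G where it pays).

```python
# Hypothetical sketch of an "information fields" service-layer boundary.
from dataclasses import dataclass
from typing import Protocol, Tuple

@dataclass
class Field:
    topic: str                    # e.g. "work-order-7731" or "lunch-offers"
    origin: Tuple[float, float]   # where the field is anchored
    radius_m: float               # how far it "extends our senses"
    payload: dict                 # the information the field carries

class NetworkFacilities(Protocol):
    """What the layer would expect from below."""
    def locate(self, user_id: str) -> Tuple[float, float]: ...
    def deliver(self, user_id: str, payload: dict, max_latency_ms: int) -> None: ...

class InformationFieldLayer(Protocol):
    """What the layer would publish upward to applications."""
    def publish_field(self, field: Field) -> None: ...
    def subscribe(self, user_id: str, topic_filter: str) -> None: ...
```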

There are a lot of things needed to create a virtual-world-to-real-world parallelism and exploit it in applications.  Location of the worker/consumer is a big piece, and relative location (the worker to the thing that’s to be worked on, the consumer to the shop or restaurant, the car to the intersection) is the biggest piece of all.  I went with the “information fields” analogy because it’s convenient to think of empowering data as a “field” that extends our senses.  We have a visual field, so why not an information field?  Creating that field involves issues like where the intersection of fields is processed (the edge, the device?), how a user communicates sensitivity to a field (do they “look” for it, or does their behavior imply they want it?), and how that sensitivity translates into having the field information actually made available.  These are the application model issues I’m hoping we’ll start thinking about.
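
As a starting point for that thinking, here’s a deliberately naive sketch of the “intersection of fields” step, framed as if it ran on the device or at the edge.  All of the names, the data shapes, and the flat-plane distance math are placeholders for whatever a real design would use.

```python
# Hypothetical sketch: which fields does a user intersect, and care about?
import math

def fields_in_range(user_pos, user_topics, fields):
    """Return the fields the user is inside of and has shown sensitivity to."""
    hits = []
    for f in fields:
        # Flat-plane distance is enough for illustration; a real edge service
        # would use geodesic distance and proper spatial indexing.
        if math.dist(user_pos, f["origin"]) <= f["radius_m"] and f["topic"] in user_topics:
            hits.append(f)
    return hits

fields = [
    {"topic": "work-order", "origin": (120.0, 45.0), "radius_m": 50.0,
     "payload": {"task": "replace valve 12"}},
    {"topic": "lunch-offers", "origin": (300.0, 80.0), "radius_m": 100.0,
     "payload": {"deal": "2-for-1 special"}},
]

# A field technician standing near the valve, sensitive only to work orders:
print(fields_in_range((110.0, 40.0), {"work-order"}, fields))
```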

The presumption that 5G applications are things that can be done only with 5G is destructive, not helpful.  Again, I understand the lure of the “killer app”, but it’s been a long time since we’ve had a single unrealized opportunity that could pull through a mass technology change in a single revolutionary stroke.  What we need to have is not a 5G app, but a 4G mission whose early fulfillment would inevitably lead to the need for specialized 5G support.  If we get started on this now, we can still boost 5G, IoT, edge computing, and AI to near-optimum levels.

My frustration in all of this is that my modeling suggests that every feature of 5G and every anticipated mission for things like IoT and edge computing could be justified.  There are credible applications and services that would generate the business cases needed, and in most cases these applications and services could be realized with less effort overall than we’ve spent on things like NFV.  The difference is that rather than trying to promote a massive early buildout based on the hope of a business case, these applications/services would address the business cases immediately, then pull through technology changes down the line.  This is the logical approach, and I guess I’ll never understand why vendors and operators alike are refusing to consider it.