Is 6G Already on a Trajectory to Doom?

Let’s be honest, 5G has been a bit of a disappointment. All manner of wild claims were made for it, starting when it was a gleam in the eye of standards-writers and continuing until real deployment started. Then people started to realize that the average mobile user would never know they had it at all, and that for operators, full 5G Core implementations with network slicing were going to be yet another technology looking for a justification. Then the media realized that they’d said all the good things they could about 5G, so it was time to start questioning the whole thing…and looking for a replacement.

6G follows the recent trend in generational advances in cellular networking, and if you look at the definitions of 6G (like the one on Wikipedia), it’s hard not to compare them to early 5G promises. 6G is “the planned successor to 5G and will likely be significantly faster.” Wasn’t that the big deal with 5G versus 4G? Or “6G networks are expected to be even more diverse than their predecessors and are likely to support applications beyond current mobile use scenarios, such as virtual and augmented reality (VR/AR), ubiquitous instant communications, pervasive intelligence and the Internet of Things (IoT).” Again, wasn’t that the 5G story?

The truth is that we’re already on track to create a 6G mythology that will lead to hype, exaggeration, and disappointment. Is it too late to save 6G, and is there really any reason to save it at all? Let’s try to answer that question.

6G, like 5G, moves cellular to higher frequencies, higher cell capacities, and a larger number of potential user/device connections. It will likely lower network energy consumption, improve scalability, and perhaps allow for tighter integration between the network and what I call contextual services. A realistic summary of what to expect from 6G is that it relieves the limitations that advanced applications built on 5G would hit as those applications became pervasive. That, it turns out, is the critical point about the whole 6G thing. 6G is in a race.

As I’ve pointed out in many of my previous blogs, there exists a “digital-twin-metaverse-contextual-services” model that could evolve to tie information technology more tightly into our lives and our work. In that model, we would all move through a whole series of “information fields” that could supply us with critical information at the very point where we interact with the world (and job) around us. These fields and our ability to use smartphones to tap into them would build that “hyper-connected future” that some vendors are already pushing. The problem with that exciting vision is that such a framework needs a whole lot more than a new generation of wireless. In point of fact, what we could really do in the contextual services space today wouldn’t likely tax even 4G, much less leave us sitting around waiting for 6G to explode onto the scene.
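To see why, consider what an “information field” actually has to do at this stage: answer the question “what here is relevant to this user right now?” Here’s a minimal sketch of that lookup; every name, coordinate, and payload in it is invented for illustration, not drawn from any real platform:

```python
import math
from dataclasses import dataclass

@dataclass
class InformationField:
    # A hypothetical "information field": contextual data anchored to
    # a place, delivered when a user moves inside its radius.
    name: str
    lat: float
    lon: float
    radius_m: float
    payload: dict

def distance_m(lat1, lon1, lat2, lon2):
    # Equirectangular approximation; fine at the city scale used here.
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return math.hypot(x, y) * 6_371_000  # mean Earth radius in meters

def fields_in_range(fields, lat, lon):
    # Answer "which fields is this user standing inside right now?"
    return [f for f in fields if distance_m(lat, lon, f.lat, f.lon) <= f.radius_m]

fields = [
    InformationField("loading-dock", 40.7128, -74.0060, 50.0,
                     {"hint": "pallet 7 is due at bay 3"}),
    InformationField("substation", 40.7130, -74.0100, 30.0,
                     {"hint": "breaker 12 tripped twice today"}),
]
for f in fields_in_range(fields, 40.7129, -74.0061):
    print(f.name, "->", f.payload["hint"])  # only "loading-dock" matches
```

A query like this moves a few hundred bytes. Whatever eventually makes contextual services demanding, the early versions are not a radio problem.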

As we wait, as 6G waits, it is subject to the same trajectory to doom that 5G followed. You start with unrealistic claims, convert them into wild expectations, stir in a lot of hype, and end with disappointment, disillusionment, and finally the worst thing that can happen to a new technology: boredom. We are already on that path today, pushing the characteristics of something that has absolutely no technical substance yet. But whatever it is we need in networking, be it excitement, profit, or transformation, we can assign to 6G. Because it has no technical substance, you can’t disprove any claim made for it, and the wildest stories get the most publicity.

How is this a race? Well, 6G is following its trajectory to doom, and at the same time we’re starting to glimpse the elements of the sort of “contextual services” that could eventually exploit, even drive, it. We can’t justify radical technologies except through radical business cases. You can’t evolve your way to a revolution, so we either abandon the revolutionary or abandon the safe, comfortable, and above all slow path of building justifications. The question is whether the contextual framework can outrace that doom, and sadly, it’s not a fair race at all. 6G hype is likely to win.

The problem with contextual services can be explained by asking just what that “hyper-connected future” is connecting, and what value the connectivity brings. Technology advances that don’t frame any real benefits, and that can be deployed only by incurring real costs, tend to stall out. That’s been the problem with all the new network services of the last couple of decades. I’ve offered contextual services as an example of what a hyper-connected application would look like; think of all the pieces that are needed. You need contextual applications, you need sensors, you need “information fields”, you need network agents representing users and other elements, you need edge computing…the list goes on. If you have all that and it’s adopted, then 5G would indeed have to give way to 6G. But if deploying 6G is a major investment, what kind of investment is needed for all that contextual stuff?
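To give a sense of how just one of those pieces behaves, here’s a hedged sketch of a network agent representing a user, filtering sensor events against the user’s context. It’s a toy built on invented assumptions; the event shape, the zone-matching rule, and all the names are hypothetical:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class UserAgent:
    # A hypothetical network agent for one user: it holds the user's
    # current context and forwards only the sensor events that match it.
    user_id: str
    context: dict = field(default_factory=dict)
    handlers: list[Callable[[dict], None]] = field(default_factory=list)

    def on_update(self, handler: Callable[[dict], None]) -> None:
        self.handlers.append(handler)

    def sensor_event(self, event: dict) -> None:
        # A sensor (or its IoT gateway) reports something; the agent
        # decides whether it matters to this user's current context.
        if event.get("zone") == self.context.get("zone"):
            for handler in self.handlers:
                handler(event)

agent = UserAgent("worker-17", context={"zone": "bay-3"})
agent.on_update(lambda e: print("notify worker-17:", e["message"]))
agent.sensor_event({"zone": "bay-3", "message": "forklift inbound"})  # delivered
agent.sensor_event({"zone": "bay-9", "message": "door open"})         # filtered out
```

Even this trivial version implies decisions nobody has made yet: who runs the agent, who defines the context, and who pays for the edge capacity it runs on.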

You can’t justify a new technology by citing the driver that other technologies create, if those other technologies also need justification. That’s particularly true when you’re trying to justify something that’s on the tail end of a long requirements chain. 6G, when you cut through the crap, is something that does more of what 5G was supposed to do, and we haven’t managed to get 5G to do it all yet. Or even a little of it.

We have technology in place to start developing contextual services. A team of maybe three or four good software architects and two dozen developers could be working on a prototype by the end of the summer. We could identify places where the hosting and connectivity resources are already available, and where the value proposition for contextual services could be readily validated (or proved to be nebulous). If we expended on a contextual-services model even a fraction of the resources that 6G will surely suck up, we could actually advance not only 6G but edge computing.

The notion of supply-side market drivers presumes one critical thing: pent-up demand. If you offer something that involves a significant “first cost” deployment investment, you have to assume that you’ll sell something that generates ROI, and quickly. But pent-up demand is something we can usually measure and model, and the most telling symptom is what I’ll call the “verge clustering” phenomenon. If Thing X is needed and there’s really pent-up demand for it, then users and applications will cluster on the verge of Thing X, pressing as close to the needed capability as current technology allows. We should be seeing 5G insufficiency before we presume pent-up demand for 6G, and in fact we’re still trying to justify 5G.
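The verge test is simple enough to state as arithmetic. A minimal sketch, with an invented threshold and invented utilization numbers (no real operator data here):

```python
def verge_clustering(utilizations, verge=0.9):
    # Fraction of cells running at or above `verge` of current capacity.
    # A high value means users are pressed against the technology's
    # limit, which is the symptom of genuine pent-up demand for the next one.
    near = sum(1 for u in utilizations if u >= verge)
    return near / len(utilizations)

# Invented per-cell utilization samples, not measurements.
cells = [0.22, 0.31, 0.18, 0.45, 0.27, 0.93, 0.12, 0.38]
print(f"verge clustering: {verge_clustering(cells):.0%}")
```

If cells were routinely pinned at their 5G ceiling, that figure would be high and 6G would have a real demand story; the argument above is that we’re nowhere near that.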

Sadly, the same thing is true for contextual services, and perhaps something worse. With contextual services, it’s hard even to define what “the verge” is, because we’re largely ignoring the concept. We talk about IoT, which is just a sensor-and-sensor-network technology, not an application. We need to talk about the utility of IoT, and of 6G, and of contextual services.

Why don’t I fix this, or try to? I’ve written a lot about the cloud, the edge, 5G, and more, and I think that most of my readers will say that I’ve been on the right side of most of the issues that have developed in any of these areas, well before they were widely accepted. I think I’m on the right side of this one too, but I’m a soothsayer; I can’t move mountains, only predict where they’ll go. But if you have a respectable plan for moving a bunch of rock, I’m willing to listen.