Sprint is yet another operator to announce an IoT platform, but we still don’t have a real handle on what a genuinely useful IoT platform would look like. So far, perhaps not surprisingly, what’s coming out focuses on what the operator does in the RAN to support IoT sensors. Is that all there is, or even the best place to focus? I don’t think so.
It’s hardly surprising that operators look at IoT as a connectivity problem; to a hammer salesperson, everything is about nails. You need lots of devices, so lots of radios (and bills, of course). You need low latency so you need to pack everything toward the edge. You need security for public sensors, meaning that you likely need some specialized chips to keep cost and power down for your devices. You need connectivity, the ability to manage connectivity, and the ability to manage what’s using the connectivity. All this was the focus of Verizon’s early IoT stuff, and it’s also the focus of Sprint’s Curiosity platform for IoT.
You can forgive the operators for this perspective. Telephones, after all, were provided to users who then decided what to do with them. The connectivity was the operators’ contribution; everything else was up to you. That makes the biggest question for IoT hopefuls this: “Is IoT like telephony, just providing connectivity to an opportunity base?” We have to break that one down to answer it.
The first question is whether IoT is about deploying sensors on a RAN at all, and that rests on four second-level points. First, is there a business case for someone to deploy “public” sensors, meaning sensors that would be on a cellular network and at least potentially accessible to others? Second, would any such business cases be better served with a different approach? Third, are the security, privacy, and public policy issues associated with public sensors so daunting that only an alternative approach could manage the risks? Finally, could we hope to bootstrap these applications to manage first cost?
There are a lot of things you can do with public sensors, but that’s not the same as saying there are a lot of business cases. Autonomous vehicles? Augmented reality? Yes, those are credible reasons to deploy public sensors in volume, but how rapidly will they develop? Let’s take them one at a time.
Autonomous vehicles aren’t quite as rare as being struck by lightning, but they’re up there. This, in an industry where the inertia of past purchases is strong. Over 90% of all successful vehicle models ever manufactured are still on the road 15 years later. Today, industry estimates are that one car out of 500 is electric, after almost a decade of availability. How long would autonomous vehicles take to build a critical mass, and even if you see that as possible, who pays for the deployments? Do manufacturers kick in to build out the sensor network? You see the problems.
Augmented reality? This is an exciting topic, but it’s far from clear that it’s something that drives IoT. What people want augmented about reality is in the eye of the beholder. Landmark recognition? We can do that with phones today. Alerting people to sales in stores? Same answer. Substituting icons for people to make the real world look like a game? That’s a graphics problem more than an IoT problem, and that’s true of most augmented reality applications.
Then there’s our second point, which is whether the business case for “public sensor IoT” could be more easily made with something else. That probably depends on where the sensors are. In areas where human population density is high, and if the sensor is associated with something like an intersection where there’s normal power line wiring, I’d submit that cellular connectivity probably isn’t the cheapest option. Certainly we’d be refreshing traffic lights fast enough to keep pace with the onslaught of autonomous vehicles! In my view, overall, neither of our two key applications for IoT would require cellular sensors; something cheaper would be available.
The third point, then, may be redundant. We don’t need security, compliance, or public policy concerns to tip the balance. For the near term, there is nothing compelling in those two applications that would justify IoT deployment using public, cellular-connected sensors.
There are other pedestrian (meaning not worthy of media attention so you never hear about them) applications of IoT. I’ve worked on transportation applications of sensor technology for decades, for example. If you have stuff that moves around on rails, roads, or in the air, you need cellular connectivity for a sensor on it. There tend to be a lot of these things, so they create a viable market, but do they require a whole new 5G infrastructure? Not so far because we already have them, and have had them for decades. There’s even more industrial control stuff that uses sensors, and still more stuff that’s used in residential monitoring and security, but all of that stuff is already served by wires, by local sensor protocols like Zigbee, or by WiFi.
Having dispensed (to my own satisfaction at least) with the justification for a 5G connection bias to IoT, we then turn to the question of what an IoT platform would look like. The answer to that question is fairly simple; it would have to look quite a bit like one of the function-cloud services we have out there, from Amazon, Google, and Microsoft. That means that an IoT platform would be a collection of hosting resources (edge, metro, core), a set of features/functions, and an orchestration process that brings all this together to create applications. Events from IoT trigger, via orchestration, features/functions that eventually do something, whether it’s a control response or a traditional transaction.
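To make that a little more concrete, here’s a minimal sketch, in Python, of what that event-to-function orchestration might look like. Everything in it, the tier names, the route table, the handler functions, is a hypothetical illustration under my own assumptions, not anyone’s actual platform API.

```python
# Hypothetical sketch of IoT event orchestration: an event arrives, an
# orchestration table says which feature/function handles it and at which
# hosting tier. All names here are illustrative assumptions.

from dataclasses import dataclass
from enum import Enum
from typing import Callable, Dict


class Tier(Enum):
    EDGE = "edge"    # lowest latency, closest to the sensors
    METRO = "metro"
    CORE = "core"    # highest latency, cheapest capacity


@dataclass
class SensorEvent:
    sensor_id: str
    event_type: str
    payload: dict


@dataclass
class FunctionBinding:
    handler: Callable[[SensorEvent], dict]  # the feature/function to invoke
    tier: Tier                              # where orchestration should host it


def brake_alert(event: SensorEvent) -> dict:
    # A "control response": must run at the edge to meet latency targets.
    return {"action": "brake", "vehicle": event.payload.get("vehicle_id")}


def log_reading(event: SensorEvent) -> dict:
    # A "traditional transaction": a core-hosted record of the reading.
    return {"action": "store", "value": event.payload.get("value")}


# The orchestration table: which function an event triggers, and at which tier.
ROUTES: Dict[str, FunctionBinding] = {
    "obstacle_detected": FunctionBinding(brake_alert, Tier.EDGE),
    "periodic_reading": FunctionBinding(log_reading, Tier.CORE),
}


def route_event(event: SensorEvent) -> dict:
    """Dispatch an incoming sensor event to its bound feature/function."""
    binding = ROUTES[event.event_type]
    # A real platform would place the function on a host in binding.tier and
    # forward the event there; this sketch simply invokes it in-process.
    return binding.handler(event)


if __name__ == "__main__":
    print(route_event(SensorEvent("cam-17", "obstacle_detected",
                                  {"vehicle_id": "v-42"})))
```

The point of the sketch is just the shape of the thing: events, bindings, and a placement decision, with connectivity implied by where the functions land.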
There’s connectivity involved with this, of course. When you host something, you have to get the information that triggered the hosting decision to wherever you put the feature/function. When that feature/function generates an event, you repeat the process until the chain of actions and reactions comes to its logical conclusion. Connectivity can be visualized on a per-hosting-site basis or on a per-process basis, meaning you either network the resources themselves or provide network-as-a-service features to the processes as part of the resource layer.
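Purely as an illustrative sketch (the class and field names are my own assumptions, not any vendor’s data model), those two connectivity views might be expressed like this:

```python
# Hypothetical sketch of the two connectivity views: networking the hosting
# sites themselves versus binding network-as-a-service to each process step.

from dataclasses import dataclass, field
from typing import List


@dataclass
class SiteNetwork:
    """Per-hosting-site view: the resource pools are wired together up front."""
    site: str                                        # e.g. "edge-denver-03"
    subnet: str                                      # address space for the pool
    peers: List[str] = field(default_factory=list)   # sites it connects to


@dataclass
class ProcessConnection:
    """Per-process view: NaaS intent attached to one step in the event chain."""
    source: str           # the function emitting the event
    destination: str      # the function consuming it
    max_latency_ms: int   # a service-level intent, not a specific circuit


# Per-site model: network the resources once, then host anything anywhere.
fabric = [
    SiteNetwork("edge-denver-03", "10.1.0.0/24", peers=["metro-denver"]),
    SiteNetwork("metro-denver", "10.2.0.0/22", peers=["core-west"]),
]

# Per-process model: connectivity is requested as part of each function chain.
chain = [
    ProcessConnection("obstacle_detector", "brake_alert", max_latency_ms=10),
    ProcessConnection("brake_alert", "incident_logger", max_latency_ms=500),
]
```

Either view works; the difference is whether connectivity is an attribute of the resource layer itself or of the application chain riding on it.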
Obviously this isn’t what Sprint is describing, though it’s fairly clear that they could add it on if they needed to. I don’t understand why they don’t see the need, or at least the benefit, because IoT is really about the applications and not the “I’s” or “T’s”. Experiences augmented by sensor activity are the real goal, the only thing that the users of IoT will pay for. The top of the revenue food chain then has to drive any lower-level charges for connectivity. That’s why omitting the real IoT platform is dumb. Somebody else will have to come up with that platform, or the whole IoT strategy for Sprint, Verizon, and everyone else is just a pipe dream.
If somebody else does come up with the strategy, then aren’t their platform, resources, and so forth likely to include inter-process connectivity? Hasn’t the decision to stand off from the real IoT platform then compromised everything an operator could provide, other than (perhaps) the cellular connection, if one is even needed? How many times have we seen operators come up with a strategy that builds in their own disintermediation? This sure looks like another one.