In earlier blogs, I’ve noted that AI is about as overhyped as a concept can be. That wouldn’t make a Cisco announcement in AI surprising; Cisco has always been willing to surf the hype wave. This time could be different, though. Cisco may actually be onto something very smart…emphasis on “may”.
To start with the hype angle, Cisco quotes a Gartner report that says “only 4% of CIOs worldwide report that they have AI projects in production.” Given the prominence of AI in stories, you’d think that number would be ten times higher. Add in the fact that AI technology is really not much different from what it was a decade ago, and you have what might seem like a total yawn in the making, reality-wise. Not so.
The most telling part of Cisco’s release on their new AI-and-machine-learning server is the quotes at the end, or rather who they’re from. One is (predictably) from Nvidia, whose GPUs power the Cisco server. Another is from a university, whose AI research is expected to be a driving force behind the harnessing of AI for the real world, and the last is from a cloud provider, who represents who Cisco is really targeting with this stuff.
What Cisco is announcing is a server built on GPUs, the specialized chips used in complex computational applications. It's designed to bridge the task of running the most common and most forward-looking AI/ML software platforms we have today, the needs of developers to create a reasonable application ecosystem to justify AI, and the needs of those who will likely deploy this kind of stuff first. It's also a bridge between two very distinct market periods.
Right now, AI is just hype, as Gartner says (and as I’ve been saying). That doesn’t mean it’s not valuable to a vendor. Sales calls sell products, but what sells sales calls is something new and interesting. Nobody is going to let a salesperson call with the mission of giving them the same update on the same stale crap they’ve been hearing about for years. Once you get in the door, you can dazzle the prospect with novel things and then get down to business and make your quota. That’s reason enough to support the hype phase of the AI market, but there’s more.
Servers have to deploy in data centers. Where will the largest data center opportunity come from over the next 20 years? Answer: carrier cloud. Not only that, whoever else you think might be a winner in deploying AI will surely be a cloud provider of some sort. The most important element in the future of AI is who will deploy it, and the answer is "a cloud provider". What class of provider represents the largest opportunity of the cloud provider group? Network operators or telcos or carriers or whatever you'd like to call them. They're the future that AI has to connect with.
Cisco is a bridge vendor. They have a strong network position, strong engagement with the big network operators, and yet they also are a strong provider of servers and software platforms. UCS has been a success for Cisco, a way for Cisco to acknowledge that network operators in particular are looking beyond pushing bits as a source of future revenues. Experience hosting is the future, and to do it you need…well, obviously…hosts. That Cisco has them gives Cisco a good start on engagement, but it also puts competitors in the deep brown.
IT vendors like VMware (which has its own aspirations in the cloud and with network operators) have a platform but no real engagement with the buyer. Network rivals like Juniper have engagement but nothing to engage on. Cisco has that bridge, and by productizing it and linking it to AI, they nail their competition to their deficiencies in one area or the other.
Bridges have to take you somewhere, of course, and a hype-driven first phase of AI hosting isn't going to be helpful if nothing comes of it. That gets us to the second market period. Starting in 2021, we can expect to see a sudden upsurge in one particular driver of carrier cloud: contextual services. Today it's at the statistical-noise level in terms of influence on deployment, but by 2021 it will be double what it is today, and by 2024 it will be the largest single driver in the carrier cloud market.
Contextual services are services that take as input detailed information about user requests, but also about the context of those requests. Where is the user? What is the user doing, where is the user going, what's influencing what they're thinking, what are they seeing, how are they feeling? Answers to these questions, if they're available, can be crunched into the critical answer to the real question, which is what the user actually wants. Not on the surface, but deep down. Contextual services are optimally relevant, and so they're optimally valuable.
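To make that concrete, here's a minimal sketch (purely my own illustration, not anything from Cisco's material) of what the input to a contextual service might look like as a data structure. Every field and class name here is hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ContextSnapshot:
    """Hypothetical bundle of situational data a contextual service would ingest."""
    user_id: str
    location: Optional[str] = None        # where is the user?
    destination: Optional[str] = None     # where are they going?
    activity: Optional[str] = None        # what are they doing?
    attention: List[str] = field(default_factory=list)  # what are they seeing or thinking about?
    mood: Optional[str] = None            # how are they feeling?

@dataclass
class ContextualRequest:
    """The explicit request plus the context that gives it meaning."""
    query: str                  # what the user asked for, on the surface
    context: ContextSnapshot    # the situational data used to infer what they actually want
```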
Contextual services are also at the heart of everything we’re hearing about. Autonomous vehicles are darn sure driven by contextual services. So is IoT success, and 5G success, and augmented reality, and personalization of advertising, and suggesting relevant content to viewers. Future applications to enhance buyer productivity and seller efficiency will be driven by contextual services, too. Altogether, contextual services represent over a trillion dollars a year in potential service revenues. If an operator today spends 19% of revenue on capex, that equates to almost two hundred billion dollars in equipment. Cisco would be stupid to ignore that.
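For what it's worth, the arithmetic behind that equipment number is simple; this little sketch just restates the figures in the paragraph above (roughly a trillion dollars a year in potential service revenue, 19% of revenue spent on capex).

```python
potential_service_revenue = 1_000_000_000_000   # roughly a trillion dollars a year
capex_share = 0.19                              # operators spend about 19% of revenue on capex

equipment_opportunity = potential_service_revenue * capex_share
print(f"${equipment_opportunity:,.0f}")         # $190,000,000,000 -- almost two hundred billion
```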
Remember my point about contextual services being the heart of a bunch of significant stuff? The corollary point is that the networking industry isn’t going to be able to create silo systems for every possible one of those contextual-related applications. We need a single framework, a unified model for contextual services that can then drive everything. That way, early applications grease the skids for the later ones, letting those who have to deploy all this stuff reap the benefits of picking the best returns early on, and then exploiting what they’ve done with things that can’t generate enough ROI to justify a build-out.
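As an illustration only (nothing like this appears in Cisco's announcement), a unified contextual-services framework would look less like a set of per-application silos and more like a common layer that any application can draw on. The class names here are hypothetical.

```python
from abc import ABC, abstractmethod
from typing import Dict, List

class ContextSource(ABC):
    """A pluggable source of context: location, IoT sensors, calendars, traffic feeds, etc."""

    @abstractmethod
    def read(self, user_id: str) -> Dict:
        ...

class ContextualServiceFramework:
    """One shared framework that many applications (vehicles, AR, ad targeting) can reuse."""

    def __init__(self) -> None:
        self._sources: List[ContextSource] = []

    def register(self, source: ContextSource) -> None:
        # Each deployment adds a source once; every later application benefits from it.
        self._sources.append(source)

    def context_for(self, user_id: str) -> Dict:
        # Merge whatever each registered source knows into a single picture of the user.
        merged: Dict = {}
        for source in self._sources:
            merged.update(source.read(user_id))
        return merged
```

The point of the sketch is the economics, not the code: the early, high-ROI applications pay to populate the framework, and the later, thinner ones simply reuse it.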
The first phase here is then an ecosystem-building phase, from a technology perspective. Cisco’s not getting into networks with AI servers this year or next, but they’re going to get into labs (as the quote in the release demonstrates). What goes into networks will have to pass through labs, and so that’s a very important strength this announcement brings to Cisco. It’s also the place where their current weakness is exposed.
Servers are like oil tanks. They won't heat the house, but they hold what does the job. There's other stuff needed to connect the oil tank to a comfortable living room, and so while you can engage someone seeking comfort with talk of the tank (so to speak), you'll quickly have to add some meat to those bones. Cisco needs not an AI architecture (which the labs can provide) but a contextual services architecture. That's not something that universities or network operators are likely to figure out. Even cloud providers like Amazon probably can't do the intellectual heavy lifting needed to turn an experiment in AI into a multi-hundred-billion-dollar industry.
Can Cisco? That’s my “maybe”. They’ve always been the top sales firm, one of the most aggressive market-hype exploiters (and often generators). They can sing any story beautifully and, some would say, cynically. Can they conceptualize the real pathway to AI success? There’s nothing in Cisco’s material to suggest they even know what contextual services are. There’s nothing to suggest that they can divide them into meaningful functional classes, devise a way to host each class, devise a mechanism to integrate them to create various applications of contextual services, or productize what buyers would really need to deploy the stuff. And they’ve given everyone else notice that there’s an opportunity to jump past the hype and deliver reality. Might another competitor do just that?
Remember, AI and contextual services require an ecosystem, a platform with clear developer rules and clear business connections. Without that, it’s just a bunch of rods and joints seeking to become a jungle gym. This can’t be, for Cisco, one of those famous-of-old five-phase-plan announcements where Cisco always placed themselves in phase two and never got beyond phase three. This is going to require real software architecture work, which is why it hasn’t been done already. I don’t know whether Cisco has the expertise to do it, or the management risk tolerance to even try. In the end, will having those salespeople get their meetings, sing their song, and then sell something else to make quota be enough for Cisco as well as for the salespeople? The answer to that question will determine whether Cisco really makes a seminal product in the AI space, or just erects an attractive billboard.