How “New” is the Newest Technology Publication?

Remember when I asked if we needed a new tech news site?  Well, we got one.  Protocol launched on February 5th, and it certainly looks different from the mainstream tech sites.  The question, which only time will answer, is whether it really offers not only more than we can get now, but also what we need now.

The tag line for Protocol is “A new media publication focused on the people, power and politics of tech.”  Since this is coming from the publisher of Politico, a politics site I read every day, the emphasis isn’t surprising.  Is Protocol something that aims, in the end, to be a kind of tech-industry gossip column, or does it actually intend to serve a need?  If the latter, what need does it serve and does it look like it’s going to succeed?

Perhaps not surprisingly, Protocol looks a bit like Politico, and Source Code (a section also available for email delivery) looks a bit like the Playbook newsletters on Politico.  Source Code offers news snippets like Playbook, but if Protocol is to set itself apart from the current crowd of technical publications, and fill a need in tech overall, the rest of it has to offer more insight, the kind of depth Politico provides.

There is a need in tech for context and insight, because we have to energize a mass market and educate an elite group at the same time.  Ad sponsorship tends to push material toward the mass market, so we don’t get the information that the real movers and shakers need, and that’s hurting us as we try to advance networking and IT to the next wave of growth.  Buyers of network and IT products today tell me that they’re aware of new developments, but have a problem fitting them into an organized plan.  It wasn’t always that way.

In the early ‘80s, the total circulation of the best of the networking rags of the time (BCR, Data Communications, Network Magazine) was roughly equal to the number of true decision and planning professionals in the industry.  These publications spoke their language.  Today, everyone with a phone or a computer makes technical decisions, and this mass market now dominates advertiser thinking.  Extensive detail on something new, such as cloud-native, doesn’t generate the clicks that a flashy story does, and so we end up with more flash than substance.

We can’t move to complex things like cloud-native or services composed from virtual features by exploiting the average.  The smartest people in the industry are coming up with good new stuff, and they need to be able to communicate it effectively to the best and brightest on the buyer side.  If I want to teach quantum physics to a company, I don’t hold a mass class in the auditorium; I get the people with physics training together in a classroom.  Hopefully that elite group will exploit the technology in such a way as to support a mass-market impact.

Can Protocol give us an ad-sponsored site that can somehow build knowledge and insight in those who need it most, and on whom we’ll rely in building our tech future?  Let’s look at some of the early material to see.

In the first issue, Protocol covers an important topic, saying that the US (and in fact the networking industry) has been looking for an alternative to Huawei for 5G.  The main focus is to say that industry interest in an open-model networking solution to 5G is now getting government support.  It’s useful, but it could have been better.  I did my own take on this topic in my blog yesterday, by the way.

The piece says that “no American company is set up to compete head-on with Huawei in the 5G infrastructure business,” and that’s not really true.  Huawei doesn’t have any technology magic; their advantage lies in pricing.  Network equipment vendors have difficulty making a profit if they cut their prices to match Huawei, and that’s been true from the first.  Everyone knows the reasons Huawei’s critics give for Huawei’s price advantage, but whatever is true, cheapness sells.  Startups in the space have the same problem; no VC wants to fund an entry into a commoditizing market.  The only solution is open-model 5G.

Open 5G is only an extension of open-model networking, which is the combination of commodity “white-box” hardware and open-source software.  Everything in 5G is really part of either “network equipment” or the 5G New Radio (NR) space.  We’ve had the former for almost a decade now; the Open Compute people have network device specs, and there are a number of open-source projects on the software side.  The OpenRAN initiative is in the process of giving us the latter.

We’re not 18 months from a solution here; parts of it are already deploying.  The real challenge for open-model networking is credibility.  Who stands behind it?  Who’s to say it will advance in the directions needed?  Who integrates all the pieces?  What we need for open-model 5G to work is first a credible OpenRAN model, which I think we’ll have by year-end, and then a credible set of integrators, which may be harder to get.  Will operators pay for integration when they want 5G on the cheap?  To me, that means open-model 5G supported by one of the major cloud vendors is the only answer.
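To make that decomposition concrete, here’s a toy sketch in Python (my own illustration, not any real project’s API) of the open-model layers and where the gap sits:

```python
# A toy model of the open-model 5G decomposition described above:
# commodity white-box hardware plus open-source software, with
# integration called out as its own, currently unfilled, layer.
# Layer names and labels are my own invention, for illustration only.

REQUIRED_LAYERS = [
    "white-box hardware",       # e.g., Open Compute network device specs
    "network software",         # e.g., the open-source network OS projects
    "5G New Radio",             # e.g., what OpenRAN is working to provide
    "integration and support",  # who stands behind the assembled whole?
]

# Layers with a credible open answer today, per the argument above.
AVAILABLE = {
    "white-box hardware": "Open Compute specs",
    "network software": "open-source projects",
    "5G New Radio": "OpenRAN (maturing)",
}

def gaps(required, available):
    """Return the layers that still have no credible open answer."""
    return [layer for layer in required if layer not in available]

if __name__ == "__main__":
    for layer in gaps(REQUIRED_LAYERS, AVAILABLE):
        print(f"missing: {layer}")  # prints: missing: integration and support
```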

The second issue offers another piece that’s interesting but not as insightful as it could be.  A Google spinoff, Replica, is offering statistical information on the movement of people and vehicles to urban planners, by using cellphone location information.  The article points out that the wealth of information available is actually intimidating and confusing to many planning boards, and that the future of the company is uncertain because the value proposition isn’t clear.  All true, I think, but it only gets tantalizingly close to the key point.

All of advertising, all of augmented reality, all of productivity enhancement, depend on the ability to create, for each of us, a parallel and contextually linked online universe.  The real world has to be known even to an AI process that wants to exploit or augment it in some way.  What Replica is showing is that it’s possible to know a lot about the overall movement of people, and by inference the movement of any given set of people, through cellphone location data.  This is somewhat helpful for urban planners, but how much depends on just what the knowledge could be used for.  In terms of providing my “information fields” in this parallel online universe, it’s critical.  Nothing matters more in contextualization than the physical location of things.

This truth ties into a parallel truth, which is that there’s probably nothing more sensitive than that location information.  The big wireless providers have been accused of selling user location data, and selling or even using the location of any specific person is certainly a major privacy risk.  Suppose, though, that you construct a set of services that, instead of providing the raw (and dangerous) data, provide insights from it.

There are hundreds of things that could be done with that, from real-time traffic avoidance to helping friends meet.  In the former case, you don’t care who you’re about to collide with, only that a collision is imminent, so anonymized data is fine.  In the latter example, if two people agree to share location information, then services that use both locations to facilitate a meeting (or even avoidance) are also fine.  It’s easier to manage oversight and permissions through service abstractions than to make IoT elements themselves aware of the need.
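To make the abstraction concrete, here’s a minimal Python sketch of an “insights, not raw data” service.  Nothing here reflects Replica’s actual API; the class, the method names, and the 25-meter threshold are all invented for illustration:

```python
# A minimal sketch of a location service that holds raw coordinates
# privately and exposes only derived insights: an anonymized proximity
# alert, and a mutual-consent meeting point.  All names are hypothetical.

import math
from typing import Dict, Set, Tuple

Location = Tuple[float, float]  # (latitude, longitude) in degrees

def _distance_m(a: Location, b: Location) -> float:
    """Rough equirectangular distance in meters; fine at short range."""
    mid_lat = math.radians((a[0] + b[0]) / 2)
    dx = math.radians(b[1] - a[1]) * math.cos(mid_lat) * 6_371_000
    dy = math.radians(b[0] - a[0]) * 6_371_000
    return math.hypot(dx, dy)

class LocationInsightService:
    """Raw locations stay inside; callers get answers, not coordinates."""

    def __init__(self) -> None:
        self._locations: Dict[str, Location] = {}  # never exposed directly
        self._consents: Dict[str, Set[str]] = {}   # user -> users they share with

    def update(self, user: str, loc: Location) -> None:
        self._locations[user] = loc

    def grant_consent(self, user: str, other: str) -> None:
        self._consents.setdefault(user, set()).add(other)

    def collision_imminent(self, user: str, radius_m: float = 25.0) -> bool:
        """Anonymized insight: is anyone within radius?  No identities returned."""
        me = self._locations[user]
        return any(
            other != user and _distance_m(me, loc) < radius_m
            for other, loc in self._locations.items()
        )

    def meeting_point(self, a: str, b: str) -> Location:
        """Consent-gated insight: midpoint between two users who both agreed."""
        if b not in self._consents.get(a, set()) or a not in self._consents.get(b, set()):
            raise PermissionError("both parties must consent to location sharing")
        pa, pb = self._locations[a], self._locations[b]
        return ((pa[0] + pb[0]) / 2, (pa[1] + pb[1]) / 2)

if __name__ == "__main__":
    svc = LocationInsightService()
    svc.update("alice", (40.7128, -74.0060))
    svc.update("bob", (40.7130, -74.0059))
    svc.grant_consent("alice", "bob")
    svc.grant_consent("bob", "alice")
    print(svc.collision_imminent("alice"))    # True: someone is within 25 m
    print(svc.meeting_point("alice", "bob"))  # midpoint, both parties consented
```

The point of the design is that oversight lives in one place: you can audit or rate-limit the insight methods without ever letting raw coordinates leave the service.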

Replica could be an example of a critical step forward for so many of the technologies we’re trying to promote that the list would be as complicated as Replica’s urban planning value proposition.  It’s this broader utility that makes the story important, and this is what I’d have liked Protocol to have pointed out.

My final example is very pertinent to the cloud focus I mentioned earlier in this blog.  Protocol ran, on Friday of last week, a piece on “What earnings reports tell us about the state of the cloud”.  Those who read my blog know I regularly analyze these documents for insights, and I think they’re a valuable source.  In fact, I analyzed some of the same things that Protocol did, but I think my take was rather different.  I’m not saying that I’m right and they’re wrong, but that I tried to get under the surface facts, and I think they missed the boat.

The Important Truths about cloud computing are that 1) very few enterprises will ever move totally to the cloud, 2) hybridizing public cloud services and data center applications is the real future, and 3) we’re still in search of the broad software architectural model that’s needed for that.  I don’t think any of these are captured in the piece, and some may even be contradicted.

“Cloud computing is still on the rise and starting to eclipse traditional enterprise technologies” is the thing that Protocol says is the takeaway from earnings.  The truth, I think, is the opposite.  Cloud computing is changing because it’s being fit into traditional enterprise technology, via the hybrid cloud.  That’s what’s driving an increase in cloud adoption.  The cloud is adapting to us, not the other way around.

How that adaptation is working, or should be working, is our key cloud challenge.  We’re too fixated on the idea that stuff is “moving to the cloud” when the key to cloud growth is what’s being written for the cloud.  The highly visual and user-interactive pieces of applications, particularly the mobile and web front-end piece, are “new” development, and the cloud is the perfect place for them.  The traditional transaction processing and business reporting that’s still the key to enterprise IT has stringent security/compliance requirements that the cloud doesn’t meet (and may never meet), and the cloud’s pricing model means that moving these apps would significantly increase costs.  Accommodating both these points means creating explicit hybrids.
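To illustrate what an explicit hybrid division of labor might look like, here’s a minimal Python sketch of a cloud-hosted front end that owns the presentation logic but hands every transaction to an on-premises system of record.  The endpoint URL and payload shape are hypothetical:

```python
# A minimal hybrid-cloud sketch: the cloud piece validates and shapes a
# request, while the security- and compliance-sensitive transaction is
# settled in the data center.  The URL and payload format are invented.

import json
from urllib import request

# Hypothetical on-premises transaction endpoint, reached over a private link.
ON_PREM_TRANSACTION_URL = "https://erp.internal.example.com/transactions"

def handle_order(user_id: str, cart: list) -> dict:
    """Cloud front end: screen the request, but never settle it here."""
    if not cart:
        return {"status": "rejected", "reason": "empty cart"}

    # Forward a well-formed transaction; the system of record applies the
    # business rules and compliance controls that stay in the data center.
    payload = json.dumps({"user": user_id, "items": cart}).encode()
    req = request.Request(
        ON_PREM_TRANSACTION_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req, timeout=5) as resp:
        return json.load(resp)
```

The user-facing half can scale elastically in the cloud while the transactional half keeps its existing security and cost profile; the missing piece is the application model framework that ties the two together, which is where the next point comes in.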

Microsoft, which has long had an enterprise data center presence, got the message a bit faster than Amazon, which has been focusing on expanding support for new cloud development (in that front-end piece) rather than on hybrid cloud.  It’s paid off for Microsoft, at least somewhat, but it won’t fully level the playing field with Amazon until Microsoft can present the hybrid application model framework that would govern the hybrid cloud overall.

The article also implies that enterprises’ turning to SaaS is a big factor, and that includes things like Microsoft’s Office 365 and various online video collaboration services.  The former is a better example of the software industry’s move to subscription services than of SaaS success, IMHO, and the latter is unsurprising, since collaboration is a natural fit for a service feature given its broad scope, the specialization of needs by application, and the variability of usage.  Enterprises are interested in SaaS, as Salesforce’s success demonstrates, but primarily for applications peripheral to their core business, and it’s the core business that drives tech policy and spending.

Where does this leave us with Protocol and its mission, then?  I had a brief exchange with one of the editors when the publication was first announced last year.  In it, I said “What I think is most important in tech coverage is context.”  He responded “I completely agree — this is one of the things we want to do really well, making sure we try to tell the whole story instead of tiny pieces of it.”  So far, I don’t see that happening in the stories.

News with context is actionable insight, something we surely need in tech these days.  News without it is glorified gossip.  I don’t think Protocol has fulfilled its promise yet, but it’s early days, and they’re still finding their legs.  I’ll be keeping an eye on things, and if they change their approach a bit, I may change my mind; if so, I’ll blog on the topic again.