Just What Does “Open” Technology Mean?

One interesting point of agreement across both the network operator and enterprise communities is that "There's open, and then there's open." The meaning should be clear: not everything touted as "open" is equally open. Given that both operators and enterprises rank "openness" among their top two requirements, it's worth looking at just what they think it means, and what buyers' views mean for the market overall.

It’s no surprise that the top meaning attributed to “open” in both networking and IT is “not proprietary” or “no lock-in”. Buyers always tend to feel exploited by sellers, but that’s been a particular factor in both networking and IT for over two decades, according to my surveys. This definition isn’t particularly helpful, though, because it turns out that many “open” technologies (as classified by the buyers themselves) aren’t free of proprietary lock-ins.

In my survey attempts to dig down a bit, I found that there was some consensus on the meaning of the term I often use in my blogs, which is "open-model". Buyers say that an "open-model" technology is one built from components that each have multiple sources and that can be freely interchanged. Consensus is nice, but even this definition has nuances. For example, "router networks" are considered by network operators to be open-model (three-quarters say that), but not by enterprises (two-thirds say they're not). The difference in viewpoint arises because enterprises believe that router vendors will add technologies (usually operations/management tools) that lock them in.

To be an open-model technology by the definition of most buyers, you’d have to expand the definition to say that in addition to having interchangeable components with multiple sources, all functional elements have to be built this way, not just the primary elements. Thus, an “open-model” router network would have to be made up only of interchangeable components, including the management tools.

"Open source" is another concept whose definition has drifted into ambiguity. Strictly speaking, open-source means that the source code is freely available to anyone who uses the software, and can be modified as desired, subject to the terms of a license. There are at least a dozen different open-source licenses, though, and we now also have "dual licensing," which permits both an "open-source" version of something and a more commercial version.

Enterprises believe that “open-source” should mean that the software has freely available source code, is supported by community development, and that only integration and support services are sold, not functional extensions. The network operators are a bit less concerned about the last point; they don’t mind commercial extensions as long as the interface to them is open and open-source.

There is, however, a general erosion of confidence in open-source software, even the stuff that meets buyer definitions. A decade ago, over 80% of both enterprises and operators believed that open-source software was secure and that governance was "strong". Today, less than half of operators and just over half of enterprises believe that. Both groups are three times as likely to want to acquire open-source software from a third-party source (Red Hat, for example), and those buyers cite governance and security risks as reasons not to go directly to the source. Open-source foundations like the Linux Foundation, the Apache Software Foundation, and the CNCF are rated as "critically important" to software quality and security by at least two-thirds of buyers today; a decade ago, a quarter or fewer rated them that way.

Where things get really complicated is on the network side, with the white-box and disaggregated movements. Enterprises believe that white-box hardware and open-source networking software would create an open network device. They believe that white boxes are critical to this happy situation because they don’t believe that any open network software would be available unless there was a significant white-box community to drive interest.

Operators, whose focus is on the capex side, are more interested in open hardware, meaning white boxes, than in open software. That's one reason they aren't concerned about the DriveNets model, where a software vendor uses open white-box hardware. For operators, dependent on capacity and switching speed, the critical software issue is the compatibility of open-source software with the networking chips used; they like the ONF P4/Stratum notion and model, but they're not confident it will drive much market change. They don't see anyone stepping up to drive the model forward, and what they'd like is for a software player like Red Hat or VMware to do that, meaning they'd like to see both players field open-networking software and support a variety of white boxes.

As for the disaggregation promises of the major network vendors, both groups agree that offering routers or other network devices as separated software/hardware elements is "open" only if there is proof that the separated pieces will run in other combinations. Both groups characterize vendors' disaggregation stories as "cynical" or even "misleading".

Negative views can be helpful, but as far as what “open” doesn’t mean, the views of both operators and enterprises are pretty vague. Based on their high-level definition, it means “not proprietary”, but just what that means is tangled. On questioning, both groups say that open physical interface standards don’t create openness. Both groups say that the server side and software side of the tech space are much more “open” than the network side. That links their view here back to the goal of having open hardware models and open-source software, which both groups say characterize the server space.

As I said at the beginning of this blog, "open" ranks no lower than second in desirable attributes, so one might think it would be decisive when selecting new technology. In the server and platform-software spaces, that tends to be true, but not in networking. When I asked the same operators and enterprises whether their last purchase of network equipment was "open", well over three-quarters said it was not. When I asked what the decisive factor was in the purchase, responses were split between "TCO" and "integration", with the latter really meaning "incumbency".

The "open" push in networking actually peaked in 2019, according to my surveys. Since then, its importance relative to other factors has declined. In the last year, in fact, buyers who actually purchased open network technology cited price as a bigger factor in their decision than openness. In cases where open technology lost deals to proprietary devices, the reason was that the vendors discounted their products to win the deal.

It’s difficult to say conclusively why open technology passed its peak of interest, but one big factor seems to be media coverage. Vendors push on what has editorial support, and at the moment that’s mostly 5G. This, despite the fact that only a bit over half the network operators and only about 10 percent of enterprises cite 5G as being a key strategic interest to them.

Openness was never everything it was claimed to be; few things are these days. It’s still a better process than the traditional commercial processes, and the issues evolving in open-model technology are arising largely from the same forces that create proprietary abuses. The lesson for everyone is that there’s no easy answer for buyers…or sellers.