Are all government broadband initiatives doomed? After I commented a bit on Australia’s NBN as an example of why you can’t count on a form of nationalization to save broadband, I got additional material on Australia, as well as commentary from people involved in various government-linked broadband initiatives here in the US. I think the sum of the material sheds light on why so many (most, or even all?) such plans end up in failure. That could serve to deter some, and perhaps guide the initiatives of governments determined to give it a try.
The single point I got from all my sources is that every study commissioned to evaluate a government broadband program will be flawed. In every case where such a study was done, it forecast a shorter period to full deployment, a better outcome, and lower costs than were actually experienced. According to my sources, the average study missed the mark by almost 100% in each of these areas. That's too large an error to be simple estimating difficulty; there seems to be a systemic issue. In fact, there are several.
The issue most sources cite for study errors is that there's a desired outcome for the study, and it's telegraphed to the bidders. I've seen this in study RFPs I've received in the past, when my company routinely bid on these sorts of things. “Develop a report demonstrating the billion-dollar-per-year market in xyz” is an extreme example, but it's a near-quote of the opening of one such RFP.
The second main source of study errors is that the organization requesting the study has no ability to judge the methodology proposed or the quality of the resources to be committed. It does little good for an organization or government entity to commission a study when it couldn't recognize a plausible path to a good outcome if it saw one, and yet the majority of studies are commissioned by organizations with no internal qualifications to assess the results. In some cases, that's not a barrier because (as my first point illustrates) the desired result is already known and the study is just going through the motions. In other cases, the organization requesting the study is simply duped.
The second-most-cited reason for the failure of government broadband projects is that a vendor or integrator misleads the government body about the capabilities of the technology. Everyone who's ever done any kind of RFP knows that vendors will push their capabilities to (and often past) their limits. “To a hammer, everything looks like a nail” is an old saw that illustrates the problem. Go to a WiFi specialist and you get a WiFi-centric solution, whether it's best or not.
This vendor-driven technology bias is the biggest technical problem with government broadband. Sometimes it's the result of underestimating the pace of progress in technology relative to the timeline of the project. If you embark on a five-year effort, the fast-moving world of network technology is likely to render your early product choices obsolete before the project ends. Sometimes there are fundamental architectural issues that should have been recognized and were simply missed, or swept under the rug.
The third-most-cited source of problems with government broadband is a lack of flexibility in dealing with unexpected issues. This covered a number of more specific points. First, government projects tended to push issues under the rug when they arose to avoid compromising the plan, which in fact made the issues nearly impossible to address when they finally blew up. Second, government projects were slow to adapt the plan when changing conditions clearly indicated adaptation was necessary. Third, government broadband didn't properly consider new technical options when they arose.
Then, of course, there's the general complaint that all government broadband is too political. This issue came out very clearly in Australia's NBN, where the whole topic became a political pawn. Politics tends to polarize decision-makers onto extreme opposite sides of any issue, and with broadband that promotes a kind of all-or-nothing mindset at every step of the project.
The input I got suggests that most of those involved in government broadband projects agreed with my point, which was that the best strategy is likely incentive payments to competing operators to induce the behavior the government wants, rather than shouldering the free market aside and taking over. A number of Australia's operators tell me they believe the broadband situation would be better had the government done nothing at all, and that a positive approach to dealing with the specific issues of a low-demand-density market would have served far better.
What, then, could a government do to optimize its chances of succeeding? There are some specific points that seem consistent with the experiences my contacts related.
The step that’s suggested most often is perhaps the simplest: Governments need to contract for a service level. The most-cited success story in government/network partnerships is the one involving Google Fiber. People will argue that Google cherry-picks its sites, but that’s not a reason to say that Google Fiber isn’t a good approach, only a reason to say it can’t be the only one.
Google Fiber tends to go after areas that have reasonable demand density but are under-served by traditional telco and cableco providers. That there are such areas is proof that the competitive market doesn’t always create optimum strategies. Some telco/cableco planners have confided that many, even most, Google Fiber targets were considered for new high-speed broadband, but that the market areas were too small to create sufficient profit, and there was a fear that other nearby areas would complain.
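To make the “too small to create sufficient profit” point concrete, here's a minimal back-of-the-envelope sketch of the kind of payback calculation a planner might run before passing on an area. Every number in it (cost per home passed, take rate, ARPU, margin) is an illustrative assumption of mine, not data from any operator, and the payback_years helper is hypothetical.

```python
# Rough payback sketch for a broadband build in a candidate area.
# All inputs are illustrative assumptions, not operator data.

def payback_years(homes_passed, cost_per_home_passed, take_rate,
                  monthly_arpu, gross_margin):
    """Years of gross margin needed to recover the build cost (ignores financing)."""
    build_cost = homes_passed * cost_per_home_passed
    subscribers = homes_passed * take_rate
    annual_margin = subscribers * monthly_arpu * 12 * gross_margin
    return build_cost / annual_margin

# A dense suburb: cheaper to pass each home, more subscribers per mile of plant.
dense = payback_years(homes_passed=20_000, cost_per_home_passed=700,
                      take_rate=0.40, monthly_arpu=70, gross_margin=0.5)

# A low-demand-density area: long runs and a lower take rate push costs up.
sparse = payback_years(homes_passed=3_000, cost_per_home_passed=2_500,
                       take_rate=0.30, monthly_arpu=70, gross_margin=0.5)

print(f"Dense area payback:  {dense:.1f} years")   # roughly 4 years
print(f"Sparse area payback: {sparse:.1f} years")  # roughly 20 years
```

The point isn't the specific numbers; it's that the same service, priced the same way, can pay back in a handful of years in one area and in decades in another. That gap is exactly what an incentive payment or a government partnership is meant to close.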
New technology of some sort, however, is almost surely required for improving broadband service quality in low-demand-density areas. There’s too often a focus on reusing the copper-loop technology left behind by the old voice telephone services. Rarely can this plant sustain commercially useful broadband quality, so a bid for a given service level has to be assessed considering the real capabilities of the technology to be used.
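As an illustration of assessing a bid against the real capability of the plant, here's a minimal sketch that checks a promised service level against rough rate-versus-loop-length figures for common DSL variants. The rate table and the meets_service_level helper are assumptions for the example only; a real assessment would use loop qualification data for the actual plant.

```python
# Sanity-check a promised service level against what copper loops can
# plausibly deliver. Rate figures are rough, illustrative assumptions;
# real plant has to be qualified loop by loop.

# (max loop length in meters, approximate achievable downstream Mbps)
ROUGH_DOWNSTREAM_MBPS = {
    "ADSL2+": [(1000, 20), (2000, 12), (3000, 6), (5000, 2)],
    "VDSL2":  [(500, 80), (1000, 40), (1500, 20), (2500, 8)],
}

def estimated_rate(technology, loop_length_m):
    """Return a rough downstream estimate for a loop, or 0 if out of reach."""
    for max_length, mbps in ROUGH_DOWNSTREAM_MBPS[technology]:
        if loop_length_m <= max_length:
            return mbps
    return 0

def meets_service_level(technology, loop_length_m, promised_mbps):
    return estimated_rate(technology, loop_length_m) >= promised_mbps

# A bid promising 50 Mbps over existing copper with 2 km loops fails the check.
print(meets_service_level("VDSL2", 2000, 50))   # False
print(meets_service_level("VDSL2", 400, 50))    # True, but only on short loops
```

The takeaway is the check itself, not the table: if the promised service level holds only on loops shorter than most of what the plant actually has, the bid is really a commitment to rebuild the outside plant, and it should be priced and scheduled that way.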
Perhaps the most important lesson of Google Fiber is that if a network company can exploit new technology to serve an area, it should be encouraged to do that, even to the point where the encouragement is a partnership with government. I think that millimeter-wave 5G in conjunction with FTTN could well open up many new areas to high-speed broadband. Since technology companies are more likely to understand this than governments, a corollary to this point is that governments should invite proposals from network companies rather than mandating something specific.
The second step is only a little behind the first: Think small, meaning try to get local government initiatives going before looking for something broader. A specific city or county is more likely to be suited to a given broadband strategy than an entire country. A solution applied nationally tends to spread the approach into areas that weren't really disadvantaged in the first place. Did Australia have to create a national NBN, or should it instead have focused on regional solutions where Telstra and the other operators couldn't create commercial services profitably?
It may be that “little government” is always going to do better with broadband programs, not because its people know more, but because they recognize their own limitations more readily. It may also be true that the best thing a national government can do for broadband is to step aside and let the little guys drive the bus. That, in my own view, is the lesson Australia should teach us all.