Is There a Practical Pathway to Fiber-to-the-Premises?

Despite all the hype around Google Fiber, the fact is that getting fiber to the premises (FTTP) is a major economic challenge.  While everyone wants fast Internet, nobody really wants to pay for it, and with the Internet (unlike, say, the auto industry) there’s a deep-seated public perception that somehow what they want should be free, or nearly so.  We started the broadband revolution by exploiting wireline infrastructure deployed for other reasons—DSL exploited phone lines and cable broadband exploited broadcast TV delivery via CATV.  Fiber, as a new deployment, poses a completely different challenge: paying for something that’s expensive with a service that people want to get for nothing.

The fiber profitability challenge is created by something I’ve researched for well over a decade—“demand density”.  Roughly stated, this is the dollars of GDP per mile passed, and it’s a measure of the ability of infrastructure to pay back on its investment.  A high demand density (Japan, Korea, Singapore) means you can easily connect enough users to earn a respectable return, and a low one (Australia, Canada, the US) makes getting to that happy point much harder.
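To make the demand-density point concrete, here’s a minimal sketch of how GDP per mile passed drives payback on a fiber build.  All of the numbers (GDP per mile, capture rate, cost per mile, margin) are hypothetical and purely illustrative, not actual carrier or national data; only the concept of demand density itself comes from the discussion above.

```python
# Illustrative model: demand density as dollars of GDP per route-mile
# passed, and the payback period it implies for a per-mile fiber build.
# All numbers below are hypothetical, chosen only to show the mechanism.

def payback_years(gdp_per_mile, capture_rate, cost_per_mile, margin):
    """Years to recover the per-mile deployment cost.

    gdp_per_mile  -- demand density: dollars of GDP per mile passed
    capture_rate  -- fraction of that GDP spent on network services
    cost_per_mile -- capital cost to pass a mile with fiber
    margin        -- operating margin on the revenue captured
    """
    annual_return = gdp_per_mile * capture_rate * margin
    return cost_per_mile / annual_return

# A dense urban mile vs. a thin suburban/rural mile, with the same
# deployment cost and the same capture rate and margin:
dense = payback_years(gdp_per_mile=20_000_000, capture_rate=0.005,
                      cost_per_mile=150_000, margin=0.4)
sparse = payback_years(gdp_per_mile=2_000_000, capture_rate=0.005,
                       cost_per_mile=150_000, margin=0.4)

print(f"Dense area payback:  {dense:.1f} years")   # fundable
print(f"Sparse area payback: {sparse:.1f} years")  # likely uninvestable
```

With everything else held equal, a tenfold difference in demand density produces a tenfold difference in payback period, which is the whole story of why high-density markets get fiber and low-density markets struggle to.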

Where demand density is lower, the market seems to have identified two paths toward achieving good FTTP penetration.  The first is to make the access network what’s essentially a public utility, with government support, and this was adopted in Australia in the form of “NBN” or the National Broadband Network in 2009.

NBN was, of course, a child of the political process and so it generated a lot of debate and disputes from the first.  The goal of covering 93% of Australia’s households by 2021 was ambitious.  The fact that NBN was essentially taking over the incumbent telco’s (Telstra’s) role in access networking required compensation, the amount of which was controversial.  Cost estimates supplied for the project were disputed in the industry (including by me).  The head of NBN was Mike Quigley, who had been a top executive at Alcatel-Lucent, which also raised conflict-of-interest questions.  Finally, in the lead-up to the 2013 elections, the opposition coalition (which went on to win and form the government) came up with “MTM”, meaning “Multi-Technology Mix”, which pulled back from full FTTP and introduced a mixture of other, less costly technologies, including HFC, FTTN, and even satellite.

Cost overruns and technology issues have marred NBN according to most sources.  In 2014 an independent cost-benefit analysis estimated that the best option would be to continue an unsubsidized rollout, presuming that this could be made to work.

This year, former NBN head Mike Quigley defended NBN in a talk.  In it, Quigley blamed most of NBN’s woes on the way it was covered in the media.  His quote, attributed to Lenin, is illustrative of the view: “A lie, told often enough, becomes the truth.”  Quigley’s data shows that the original FTTP concept was actually working, though he admits there were some issues (like discovering asbestos contamination in some facilities) that had to be addressed.  He does note that while the original NBN deal with Telstra would have protected NBN from overruns and quality issues, the renegotiation of the deal reduced these protections, which put the project more at risk.  In any event, under the MTM plan, the reality was that most of the success came from the original FTTP passes.

I’ve looked at the project a number of times, and I think that Quigley makes some valid points and perhaps doesn’t emphasize one critical one.  The valid point is that the FTTP technology target does appear to be a better option than the MTM plan, in hindsight.  The issue that made the difference is that the cost and time to roll out the other MTM options was underestimated badly.  Bad estimates are also behind most of the cost issues; it wasn’t that cost control was bad as much as that estimates were unrealistic.

The critical point is that political projects have to sustain popular support.  It’s not enough to say that you were right a half-decade or more after the issues were debated.  You have to make your position clear, sell the constituents, and then work to keep them onboard.  Companies understand marketing, but governments don’t always seem to get the picture.

Where does this all lead?  NBN wasn’t costed out properly at the beginning, which is hardly a surprise given the typically large cost overruns on government projects globally.  The decision to change horses to MTM in 2013 was based on overly optimistic data on the rate and cost of deployment.  The project goals could probably have been realized at acceptable rates of return had NBN stuck with FTTP.  But the big issue is that government projects are rarely run well, though Quigley presented some examples (like the Erie Canal, which most people would say isn’t easily made relevant to broadband!) to the contrary.  In the net, I would have to say that NBN proves that a successful government project for broadband deployment would be very difficult to run.  I have doubts even about municipal programs, though it might be possible to get a better handle on costs and benefits in a smaller geography.

That brings us to the second option for FTTH justification, which is to improve the ROI of the project by controlling costs and elevating benefits.  Demand density is a measure of economic power available to exploit, per mile of infrastructure.  If you could exploit more of that economic power, and do so at a better cost point, you could do more fiber deployment.

In the US, you can see both the demand density issue and the exploitation opportunity emerging, in a comparison between Verizon and AT&T.

The demand density of the Verizon “core wireline” territory is, on the average, seven times that of the AT&T territory.  In some areas, the difference is as much as eleven times.  It’s hardly surprising that Verizon’s original broadband strategy was to run FiOS to the places where density was high and leave the rest alone.  Not only that, Verizon sold off lines in areas where its demand density was far lower.  AT&T’s approach was far less fiber-intensive, and they’ve now taken the course of using satellite to deliver TV across their territory at large.

The TV dimension here is important.  The current data suggests that about 25% of users will pay for more than basic broadband wireline services, but this depends on how much of a premium has to be charged.  The challenge in fiber deployment is that current pass costs for fiber are probably five times or more the cost of an HFC pass of the kind cable providers rely on.  Plant maintenance is lower with fiber, though, and the Internet service is better.  What’s made both HFC and FTTH work in the US has been broadcast TV delivery.
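A back-of-envelope sketch shows why the pass-cost gap matters so much.  Only the roughly 5x fiber-to-HFC pass-cost ratio and the ~25% premium take rate come from the discussion above; the dollar figures, payback period, and margin are hypothetical, chosen just to illustrate the arithmetic.

```python
# Sketch: how pass cost and take rate set the monthly revenue each
# subscriber must carry to pay back the plant.  All dollar amounts are
# hypothetical; only the ~5x pass-cost ratio and ~25% take rate are
# taken from the surrounding discussion.

def required_monthly_revenue(pass_cost, take_rate, connect_cost,
                             payback_years=5.0, margin=0.4):
    """Monthly revenue per subscriber needed to recover plant cost.

    pass_cost    -- cost to pass one home (borne whether or not it buys)
    take_rate    -- fraction of homes passed that actually subscribe
    connect_cost -- per-subscriber drop/installation cost
    """
    # Non-subscribing homes don't pay, so each subscriber carries the
    # pass cost of 1/take_rate homes, plus its own connection cost.
    capital_per_sub = pass_cost / take_rate + connect_cost
    months = payback_years * 12
    return capital_per_sub / (months * margin)

hfc = required_monthly_revenue(pass_cost=200, take_rate=0.25,
                               connect_cost=400)
fiber = required_monthly_revenue(pass_cost=1000, take_rate=0.25,
                                 connect_cost=600)

print(f"HFC needs   ${hfc:.2f}/month per subscriber")
print(f"Fiber needs ${fiber:.2f}/month per subscriber")
```

At the same take rate, the fiber subscriber has to carry nearly four times the monthly revenue, which is exactly the gap that a TV bundle has historically filled; strip out the TV revenue and the fiber number becomes very hard to reach with broadband alone.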

The dependence of fiber and HFC on TV is troubling given that Verizon’s data shows that well over a third of its customers have elected its low-cost “Custom” bundle, and that conversions to this lower price point are numerous enough to offset the gains from new subscribers.  It would appear that attempts to take FiOS down-market to get better penetration in the areas where FiOS is offered are instead converting existing customers to lower-priced plans.  If broadcast TV is in fact in jeopardy, then the TV justification for HFC and FTTH is too.

One possible answer is offered by 5G, as Verizon has said.  If you were to feed 5G microcells with FTTN technology, you could hop into the home at a lower cost and get a microcell community that could be exploited for other services too.  Since mobile broadband and the user habits it’s enabled are arguably a big factor in the drop of interest in broadcast TV, this would exploit the cause to mitigate the effect.  However, shifting users to an on-demand TV model of any kind reduces the role and power of the TV provider, shifting it more to the networks who develop the programming.  In the long run, if the Internet becomes the delivery mechanism for programming, the effect is to switch users to a service with a lower profit, further complicating the justification for fiber.

Nobody is committed to universal fiber in the US, nor does anybody in the provider community believe it is possible—and that includes Google.  We’ve probably picked most of the low-hanging fruit in a demand-density sense, and while a symbiosis of 5G and fiber to the node/neighborhood is promising, we can’t say how much it would save in cost or drive in incremental revenue.  We also know that viewing trends seem to be shifting away from broadcast TV, albeit slowly, which threatens the biggest justifier for fiber.  So, in the end, this path to universal fiber is risky too.

So is there hope?  Quigley makes some interesting points in the “hope” area, and there’s support for them in other industry trends.  First, pass cost with fiber is coming down because of technology improvements; Verizon’s pass cost for FiOS is said to be less than half what it was originally, perhaps as low as a third.  Second, the original cost estimates for NBN could still be met with FTTP today.  Taken together, though, these points say that if technology has cut pass costs sharply and yet an FTTP build would only now hit the original estimates, most of the realizable fiber-technology savings on the table are being eaten up by other deployment costs.  That suggests that revenue is critical, and the US experience as reported by Verizon is troubling in that area.

We may, if broadcast TV continues to slip slowly as the cash cow for access infrastructure, have to consider the unthinkable, which is Internet that stands on its own.  Those who might argue that broadband-only service is already available from cable and telco providers should consider that the customers who do take TV fund a big chunk of the access infrastructure, from which broadband-only service can then be delivered at a lower marginal cost.  What happens if that subsidy goes away?  If Verizon’s experience with its low-cost Custom plan is an indicator, we’re seeing new pricing pressure on TV.

I reported on Verizon’s hope of marrying FiOS with 5G earlier, and that could allow Verizon to lower the connection cost for homes by eliminating a lot of the in-home elements and even the service call.  In the long run, though, it may be that we need 5G mobile services to play off those FTT-something microcells.  It’s possible to conceptualize mobile services created more by microcells than by traditional macrocells, but the technology issues are significant—roaming when you might pass through ten microcells just to get to the end of your street is an example.

Something in the mobile space is needed, IMHO.  Mobile infrastructure is getting the investment.  Mobile services are the direction the consumer is going.  If wireline broadband in general and FTTP in particular are the goal, then we’ll have to meet that goal increasingly with mobile contributions.  The technical issues, and even the ownership and regulatory issues, will have to be overcome in some way, and we should probably start thinking about that now.