Bad Numbers Mean Bad Decisions

Anyone who’s followed my writing knows that I’m no fan of the National Broadband Plan.  My main issue is with the data that’s been presented to back that plan, and some recent work I’ve been doing is making me even more skeptical—if that’s possible.

What started me off was a comment by a White House science type.  He said he was sure there were billions to be gained in productivity and jobs if broadband were more available, though he admitted he didn’t know exactly how those benefits were calculated or realized.  OK, I said, let’s take a look at broadband versus economics and see if there’s a correlation.  The FCC has data that shows, by zipcode, how many broadband providers are available.  Other agencies provide household income data, also by zipcode.  Suppose we correlated the two?

If broadband availability is in fact an economic benefit, we should see some correlation between the number of providers and the household income of consumers.  We do, but it’s the wrong kind: the correlation runs overwhelmingly the other way.  The areas with the most broadband providers available are the areas with the lowest household income.

Let me illustrate with a random example from my own area.  Take two suburban communities in southern NJ.  One, an upper-middle-income community, has 11 providers according to the FCC.  The second, arguably the richest community in the area, has only 10.  Grab a random residential zipcode from across the river in Philadelphia, where the household income is a quarter that of the first community and a sixth that of the second, and you find it has 12 providers!

Now I’m not saying broadband is making people earn less, though that’s actually a more supportable view given the data than the contrary assertion that it helps them earn more.  I’m not even saying that the urban poor generally have better broadband, the FCC’s rhetoric notwithstanding.  What I’m saying is that even a simple review of the data we’ve collected shows it doesn’t support the popular views on the role and value of broadband Internet, or the views the FCC is presenting in its National Broadband Plan.

The data also seems to suggest that geographic factors like population density are by far the most significant forces in determining where broadband competition will develop.  Even in very poor zipcodes we see a lot of providers—more than in most of the richer ones.  Why would operators focus their efforts on places where household income is the lowest, if not because those places have population densities that overcome even four-to-six-to-one income disparities?   That proves our long-standing point that demand density means everything.

We’re also concerned that the FCC’s data includes non-facility providers of broadband, which in our view distorts the picture considerably.  The only way to get broadband to the user is to deploy infrastructure.  Riding as a wholesaler on someone else’s infrastructure doesn’t create new options, only “new” providers.  In fact, there’s every reason to believe that multiplication of wholesale players might erode margins and further limit investment.  It certainly distorts the figures, and most people where I live couldn’t name more than two wireline and four wireless providers, which totals six and leaves at least five unaccounted for.  Who are these providers, one must wonder?

The biggest problem here is the lack of clarity of data, or the reliance on incomplete or just bad data—it’s hard to say which.  The FCC appears to have gathered a lot of information through third parties, and also appears to have muddled its own data collection.  As I noted, it’s hard to say whether this was ineptitude or deliberate.  What’s easy to say is that bad policies are inevitable if bad data fuels them.

A “European Approach” for Us All?

Speaking yesterday at BBWF, Alcatel-Lucent’s CMO Stephen Carter talked about the need for creating a “European approach” to 4G broadband.  Some of the specific points in the talk weren’t new; we need to move beyond all-you-can-eat pricing, we need to add some specific partnership and settlement processes, and we need to recognize the intrinsic differences in the major markets.  What is interesting to me is that all of this is coming to a head right now.  Why that is might be the most interesting thing of all.

The reason is mobile 4G services, and the fact that these services are being driven by smartphones and tablets and even e-readers—appliances.  Mobile disintermediation via appliances is a real risk, and 4G bandwidth levels mean that there is truly an opportunity to create a new model of the user’s relationship to the network.  The risk is that the new model might end up being a reprise of the OTT-dominated wireline broadband market, and that risk is very real now.  Further, 4G deployment offers operators a chance to reset the pricing and service relationships—to a point.  Operators either have to take the opportunity and level-set 4G differently, or they have to avoid 4G investment as something unlikely to pay off for them in ROI terms.

Which of course is Alcatel-Lucent’s issue here.  Arguably, companies like Alcatel-Lucent have been most successful in the wireless area, and an operator trend toward stagnation of wireless investment would be a major barrier to Alcatel-Lucent’s future profitability.  But the truth is that Alcatel-Lucent isn’t the only vendor with a bet in the 4G game.  With the exception of Cisco, whose ambitions for revenue growth are spreading to markets adjacent to networking, every one of the major network vendors is a slave to wireless capex growth, because wireline growth is not going to sustain even their current numbers.

What is clear to me is that everyone in the broadband game realizes that 4G is the watershed issue, the place where we either get control of network evolution in an economic sense or admit we can never control it.  In the latter case, it’s clear we’ll see sharp capex declines beginning (according to our model) in 2012 as ROI pressure on operators constrains network investment.  In the former case, we could see the very thing Carter says we need—immersive broadband that touches all of us in all aspects of our lives, because it can profitably be made to do so.  It’s not a glamorous vision for the US market, because we want to believe everything’s free.  It’s not simplistic like Cisco’s vision of driving infrastructure investment simply by forcing more traffic onto the network regardless of the ROI.  But it’s a true vision, and Alcatel-Lucent is perhaps the best in the industry at articulating it.

But can they deliver it?  The principles of Application Enablement are surely relevant to creating what Carter hopes for, but as they stand they’re not a sufficient condition.  There are too many holes in the story of the “European way” when the rubber meets the road.  Potholes are, in a way, a bigger threat to ROI than the current disorder, because without a clear path to investment everyone will hunker down and wait for that path to become clear.  That could hurt capex even earlier.  Four vendors (Alcatel-Lucent, Ericsson, Juniper, and NSN) have the assets to build the kind of future Carter talks about, and not just for Europe.  Which one will come through?  We’ll likely know by spring.  Carter’s speech is proof that the issue is too acute to be ignored any longer.

Ecosystemic Security

Juniper announced a mobile security suite, building on its Junos Pulse agent/client software that operates across a wide variety of mobile and PC platforms.  The elements of the suite (the anti-virus, firewall, etc. that are common to most PC suites) are less news than the framework in which they’re being provided.  What Juniper is doing is binding security into a device agent, then coordinating it through central management of that agent so that it’s effectively part of a collective network- or organization-wide security program.

The newest problem both enterprises and operators face these days arises from the fact that a single user now extends across multiple appliances, and increasingly uses those appliances as facets of a virtual personality.  That’s true of social-driven consumers but also, increasingly, of productivity-driven enterprises.  Point-solution security not only fails to secure the full range of devices, it forces those who want security to integrate disparate policies and processes to create a secure framework, and a single miss not only destroys collective security but risks cross-contamination of the other channels to the user.

I like the Juniper approach here not because of its capabilities, or because of the need its sponsored research was designed to validate; we have security on devices, and we’ll have it on all of them eventually, and the problems of device security are hardly a surprise even without new research.  What I like is that Junos Pulse extends “the network” to the device itself and makes it an agent of network policy and services.  That seems the only long-term solution both to security issues and to creating service value-add.  Plus, the multiple device faces of the user are going to pop up in a lot of future service missions, and they will be problematic for those without a device-integrated approach.

It’s hard to pull this story out of the Juniper talk, in part because it’s focused so much on security needs and the point-solution remedy.  The real story is the ecosystem.

Is Ozzie Right?

Microsoft tech guru Ray Ozzie is leaving Microsoft, and in the wake of the announcement a memo from Ozzie was leaked to the media.  In the memo, Ozzie asks Microsoft to confront an age without PCs, an age where traditional Microsoft PC incumbency would be meaningless.

What Ozzie is looking at is whether appliances like smartphones and tablets, combined with cloud-hosted services, could change the appetite of the public for personal computing.  I think that the answer is already known, but it’s ambiguous.

The question is whether cloud services can absorb all the functionality of local applications.  In theory?  Sure.  In practice, the problem is one of willingness to pay, and of profit.  If the total market for computing and applications among consumers is seen as ad-sponsored, we’ve collapsed a multi-billion-dollar industry into something likely a tenth its current size, simply because you can’t expect ads to sponsor all of content, all of software, and all of everything else when the world’s ad spend is only about $680 billion and isn’t even growing as fast as world GDP.  Thus, we’d have to expect the consumer to pay in some direct way for the incremental application services.  The question then becomes whether that direct payment would be less than the cost of centrally hosting the applications.

To answer it, we say that central IT resources are always cheaper—economy of scale, after all.  But the Erlang curve shows that economies of scale taper off at volume, meaning there’s a point beyond which no further economy can be gained.  And you still need a screen, a keyboard (even if it’s virtual and on-screen), a CPU chip, and memory to create a network appliance.  The incremental cost of making that into a computer isn’t enormous.  I can buy a netbook for three hundred bucks and get free or cheap software for writing, calculating, photo-editing, and more.  Sure, I have to sustain the software, update it, secure it, and so on.  But most of the threats to security come from the Internet, so don’t I have to secure my appliance anyway?
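The Erlang point deserves a number or two.  The sketch below is my own illustration (not anyone’s published model): it uses the classic Erlang-B formula to find how heavily server pools of various sizes can be loaded while holding blocking to 1%.  Utilization improves with scale, but each tenfold jump buys less than the one before:

```python
def erlang_b(servers, load):
    """Erlang-B blocking probability via the standard stable recursion."""
    b = 1.0
    for n in range(1, servers + 1):
        b = load * b / (n + load * b)
    return b

def load_for_blocking(servers, target=0.01):
    """Bisect for the offered load (in Erlangs) that hits the target blocking."""
    lo, hi = 0.0, 2.0 * servers
    for _ in range(100):
        mid = (lo + hi) / 2
        if erlang_b(servers, mid) > target:
            hi = mid   # too much load; blocking too high
        else:
            lo = mid
    return (lo + hi) / 2

for n in (10, 100, 1000):
    a = load_for_blocking(n)
    util = a * (1 - 0.01) / n   # carried load per server at 1% blocking
    print(f"{n:5d} servers: offered load {a:7.1f} Erlangs, utilization {util:.0%}")
```

With these assumptions, utilization runs in the mid-40-percent range at 10 servers and the low-80s at 100, and the step from 100 to 1,000 adds far less than the step before it; that taper is exactly why “central is always cheaper” stops being true at some point.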

My point is that Microsoft is as much at risk of over-reacting to the future as of under-shooting it.  Its biggest problem is the same one it had before all the Internet appliance stuff hit the market—once everyone who needs a PC has one, what’s your strategy for growing revenue?  Microsoft needs to capture the incremental revenue from the appliance-and-cloud craze, not substitute that revenue for its current revenue stream.  If it does the latter, it dies, pure and simple.

Revolutionary stuff is interesting, and in this mindless media age the only thing that matters is “interesting”.  Truth won’t create click-throughs.  But truth is what creates markets.

The Week Ahead: October 25th

There’s a significant potential for some swings in stock prices this week (not that we haven’t seen them in the past!) because of the volume of economic news and the number of earnings reports due.  The number that’s likely to be watched most closely is the 3Q GDP growth, which our model pegs at about 2.1% in annualized form.  While very few now expect a double-dip recession, this number will be seen by the stock market as an indicator of likely near-term future economic health, and hedge funds will certainly short the market aggressively if it dips.  I think it’s a bit of a tempest in a teapot over this one; whatever it is, it’s almost certainly better than 1Q and worse than 4Q so we’re slowly recovering.

The FCC is getting itself behind a wireless-based thrust at national broadband ills, but I don’t much like Genachowski’s style here.  He opened a recent talk with comments about the slide in the US economic standing worldwide, and then jumped to spectrum.  To me, that implies that we can lever our way into the top economic spot with wireless broadband, and if there’s any truth to that it’s yet to be substantiated by one piece of objective data.  Sure we might start a wireless bubble, but it’s not going to transform our economy to facilitate Twitter updates or let teens watch music videos.  If we have an economic problem (which clearly we do) it’s because we can’t produce substantive stuff any more; we’re trapped in social networks and deceptive advertisements for herbal supplements and consumer product gimmicks.  We became an industrial and economic giant by building the fundamentals—steel, cars, ships, planes.  Heavy industry is the base of everyone’s economic stability, and figuring out how to provide incentives (government and technology) there should be our top priority.

Does Apple’s Lion Strategy Threaten More Disintermediation?

Apple’s moves to converge its iOS and MacOS platforms over time and to create a unified developer environment between their disparate devices is a smart move that responds to the reality of the market and competitive environment.  The question is how far they’ll go and what impact the efforts will have on the appliance space, the developer community, and even the service provider market.

The iPhone launched the smartphone revolution, which in turn launched the applet/widget revolution, which in turn is opening the question of whether device-resident intelligence will play a commanding role in the development of what the buyer/user perceives as “services”.  The iPad has had a similarly transforming effect in the tablet space, and competitors have already established themselves in smartphones—primarily via Android-based phones in the broad market and RIM building on its enterprise incumbency.  Competition is also increasing from both sources in the tablet space, with pretty much the same cast of competitive characters.

What creates Apple’s platform dilemma is that broader installed bases beget greater support for developer opportunity, and thus a larger application community.  As I’ve noted before, this was one of the factors behind Apple’s loss of its early lead in the PC market to the IBM-compatibles.  An open framework attracts support because it is open, but it also reduces the originator’s ability to control and monetize its own market, which is why Apple has traditionally rejected such an open approach.  But a marriage of its Mac operating system and the OS used for its appliances, and the harmonizing of a development environment across both, would have the effect of increasing Apple’s developer mass.

The challenge is that it will also almost certainly cause Google to prioritize Android as a tablet OS, thus exacerbating the competition between these two industry giants.  The further the Android OS goes in terms of supported hardware, the harder it will be for Apple to sustain itself as an appliance walled garden.  Some gestures of openness exist through the developer program, but Apple’s long-standing feud with Adobe over Flash illustrates where walled-garden thinking can take you and how it can create a lot of gratuitous enemies.

On the service provider side, the competition between Apple and Google (through its Android proxies) creates yet another path to disintermediation.  Ceding service-creation innovation to OTT players was a problem in wireline, and ceding it to smart device vendors and developers in the wireless space only makes things worse.  The so-far-ill-fated Microsoft phone strategy has been toying with hosted services, but probably more as a means of getting Microsoft into the OTT feature business than as a means of empowering operators.  Can operators respond with an approach of their own, and in time?  Their service-layer revenue future may depend on it.

Beware of Free

The Facebook scandal, in which popular application providers shared private data without user permission, is only the latest in a series of targeting-related breaches of privacy and violations of “policy”.  The FTC has been of two minds on the issue, with some believing that regulation is necessary to protect consumers and others believing the industry can regulate itself.  The current direction appears to be toward self-regulation, despite mounting evidence that the industry is unwilling and unable to do that.  What’s going on here, systemically?

First, to understand targeting you have to understand motivation.  The goal here is NOT to get the right ad in front of the right person; it’s to get ads in front of fewer wrong people.  The ad industry knows that things like TV commercials blast ads to the point where it’s unlikely that there’s any possible consumer who doesn’t see them.  Thus, a well-targeted ad isn’t any more likely to be seen by the prospective buyer than one that’s simply broadcast.  What is likely is that a well-targeted ad will be seen by far fewer people who aren’t likely prospects.  Even if you pay more for such ads, you spend less on “overspray”.

This goal of advertisers sweeps into a consumer market with an appetite for free stuff.  Nobody wants to pay for anything if they don’t have to, and so nobody really wants to pay for content, or Internet services, or online applications, or whatever.  They’re therefore likely to surrender a certain amount of privacy to secure what they want, believing that the cost to them is less than the benefit.  Since one could argue that the goal of regulation is to protect the public interest, it would seem illogical to regulate consumers out of the benefits they’ve elected to trade for.

But that presupposes they’ll actually get what they want, which is the big fallacy of targeting.  As I noted, all you need to do is run the numbers for online ads versus TV commercials to see that what’s happening isn’t a flight to quality in terms of consumer targets but a flight away from the non-engaged.  That flight is motivated by cost, not additional revenue, and thus it’s necessarily a less-than-zero-sum game.  And that means that success in targeting funds not more experiences but fewer.  Consumers are giving away their secrets to lose, rather than gain, and that’s something regulators should be dealing with.
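Here’s what “running the numbers” looks like in miniature.  Every figure below is hypothetical; the audience size, prospect rate, and CPMs are inventions chosen only to show the shape of the arithmetic:

```python
# All figures are hypothetical, chosen only to illustrate the arithmetic.
audience = 1_000_000        # total impression pool
prospect_rate = 0.01        # 1% of the audience are genuine prospects

# Broadcast model: pay to reach everyone, prospects and non-prospects alike.
broadcast_cpm = 20.0                                       # $ per 1,000 impressions
broadcast_cost = audience / 1000 * broadcast_cpm           # $20,000
broadcast_prospects = audience * prospect_rate             # 10,000 prospects

# Targeted model: a higher CPM, but far fewer impressions, capturing
# most (not all) of the prospects.
targeted_cpm = 50.0
targeted_impressions = 50_000
targeted_cost = targeted_impressions / 1000 * targeted_cpm  # $2,500
targeted_prospects = broadcast_prospects * 0.8              # 8,000 prospects

print(f"Broadcast: ${broadcast_cost:,.0f} spent, "
      f"${broadcast_cost / broadcast_prospects:.2f} per prospect reached")
print(f"Targeted:  ${targeted_cost:,.0f} spent, "
      f"${targeted_cost / targeted_prospects:.2f} per prospect reached")
```

Cost per prospect falls from $2.00 to about $0.31, but total ad dollars fall from $20,000 to $2,500.  The advertiser wins; the total money funding “free” experiences shrinks, which is the less-than-zero-sum point.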

Regulators need to be thinking more about the future of the industries they regulate; the financial crisis proves that point.  Those in the industry need to think a bit too, because the real opportunities in the long run are created by the long-term money flows.  Cost conservation never leads to anything but commoditization, no matter what part of the network food chain you’re in.

Reading the Apple-IBM Tea Leaves

Chicken Little got everyone upset by spreading the rumor that the sky was falling.  Apple and IBM both reported their earnings yesterday, and there were immediate calls for a Chicken-Little-like response.  Duck and cover?  It’s not that simple.

Both Apple and IBM beat estimates, for starters.  It’s nothing new for Wall Street to sell off companies that meet and even beat expectations.  The classic “buy on the rumor, sell on the news” mindset is exaggerated these days because hedge funds don’t hold stocks as an investment; they trade them to take advantage of either price increases or dips.  Once earnings have been reported, prices tend to settle and news is scarcer, so it’s fashionable to sell off and move to something else.  All this means that sudden stock movements don’t necessarily mean anything, which is comforting for Apple, whose shares dipped over 5% after hours.

We need to see how Apple’s or IBM’s outlook might reflect on 4Q.  The holidays are critical, not only to draw down inventory but also because consumer attitude can’t easily be turned around except in November and in May.  If we miss the current opportunity, we’ll have six months of slow growth and 2011 won’t measure up.  If we meet or exceed holiday expectations, our model says we’ll have an upside surprise next year in tech and in the economy overall.

Weekly Economic Update

We’re coming into the heart of earnings season and also into a new round of economic numbers this week, which will give us a better idea of just where things might be headed for the economy.  So far it looks like the recovery is on track, even though employment is lagging in the US in particular, but there’s still the question of whether consumers are really getting their hearts into the holiday period or simply saving in one place to spend in another, with less overall spending growth as a result.

The mortgage situation is a troubling factor.  The flood of foreclosures created a flood of shortcuts in their handling, and now it appears that the process was violated.  Since that process is a statutory requirement for foreclosure, a problem here creates a cascade effect, and some of the lower-tier impacts could hurt.

 The top of the pyramid isn’t bad; foreclosures might be limited or halted for the time it takes to get the paperwork legally straight.  That would delay entry of homes onto the market, keep people in their homes longer, and otherwise do fairly good stuff.  But for homes already foreclosed, it clouds the title the banks can give to purchasers.  That means title insurance may be impossible, making mortgages difficult.  That could slow the housing process down, given that foreclosed homes make up nearly a quarter of current-period sales in some areas.

 Another question is whether the combination of the credit crunch and stalled spending plans might not be reducing liquidity in the economy overall, thus contributing to what’s called “deflation” or an effective decrease in the cost of goods and services that’s caused by money appreciation.  Deflation typically stagnates economic growth, in part by encouraging deferring of spending to a future time when stuff will be even cheaper.

 If we (somewhat simplistically) say that inflation is excess liquidity in the economy and deflation is insufficient liquidity, then the classic mechanism to fix deflation is to do something inflationary like reduce the Fed rate to increase the money in circulation.  When that’s not possible because of pre-existing reductions that have taken the rate to near-zero, the next step is “quantitative easing”, in which the Fed prints money (in effect) and uses it to buy debt, usually its own bonds.  That gets new money into circulation and hopefully increases overall liquidity.

Some of the problem with liquidity today is likely created by cautious consumer spending and tight credit, and the flight of investment dollars offshore to developing markets with higher returns is another factor.  All of these factors could be reduced by a resumption of growth, but that’s the classic Catch-22, and it’s what the Fed hopes to break with additional asset purchases.  This is a good time, I think, because it corresponds to the holiday period, when there’d normally be a bit of an uptick in consumer spending.  But it’s clear that the Fed isn’t 100% sure how much easing, if any, is needed, or it would have acted already to get ahead of the curve.

 There is some information to suggest that businesses may be holding to their 2010 IT spending, but the 4Q pattern will tell the tale since there’s a larger un-spent budget excess than normal coming out of 3Q according to the survey data we’ve collected so far (which isn’t complete, so it’s suspect).  That will be released if economic trends continue to show some improvement, and it will also impact budget planning for next year.  If we win in 4Q we’ll win in 2011 overall, in short.  This week’s data may tell the tale.

A New Case for Agility

It’s always fascinating to listen to network operators and even large enterprises talk about their infrastructure projects, and I’m just starting to analyze the first round of our fall strategy survey so I’m getting that chance on a large scale.  It’s too early to say how everything is going to come out, but one thing that does strike me at this point is the discrepancy between how real network-builders see their networks and how vendors see them.

 In the service provider space, the classic vision of networks is the hierarchy—access to aggregation/metro to core.  The goal is supposed to be the creation of uniform connectivity and good bandwidth economy of scale, and the practice dates from the earliest days of packet networking.  It’s also nearly always at odds with what’s going on in the real world.  Today, hierarchy is being replaced with delivery.  It would be safe to say that were video content the only traffic source (or even the overwhelming majority of traffic) the Internet would look more like a CDN than a hierarchical network; all traffic would be user-to-cache and the only “core” traffic would be to populate caches.

 In the enterprise space, people are starting to realize that the applications that consume a mass of incremental bandwidth aren’t universally distributed through the business.  Enterprises tell us that 73% of all their data center traffic and 81% of their collaboration traffic moves less than 10 miles.  The data center network is growing very fast as inter-process and storage traffic multiply, and the LAN traffic in major headquarters facilities is growing nearly as fast, but branch traffic is growing at a much slower rate.  You need only reflect on what happens in a local sales office or branch bank to understand why.  Most remote-office traffic is transactional and thus doesn’t expand with anything but business activity.  Who does a bank teller collaborate with if not the teller in the next station, who’s hardly a communications destination?  Who does a real estate office manager or manufacturers’ rep telepresence with?

It’s never a good idea to be disconnected from market reality.  We can’t build optimum networks for anyone without understanding what their networks are really doing, and it’s not a problem confined to the network space.  Enterprises tell us that vendors are proposing data-mining benefits across an employee population whose job activity doesn’t involve any of the data they’re proposing to mine.  Providers tell us that vendors are suggesting operations cost savings that exceed their total operations budgets, because the vendors don’t understand how much is really spent on operations, what the word even means to the network operators themselves, or what new services will do to the space overall.

 The appliance players like Apple, Google (via Android), Microsoft, and the host of tablet and phone vendors are creating a more responsive consumer playground to cater to instantaneous trends, or to launch those trends where possible.  With a low-inertia alternative developing, we could end up with traditional networking and even IT simply getting out-danced.  You can see already that metro networking is becoming Ethernet networking.  Alcatel-Lucent’s recent super-switch shows that vendors realize that transport/connection is getting pushed to lower OSI layers, but do they realize that differentiation/innovation is harder to deliver there?  The more we want to dazzle the consumer, the more we need to create new in-network value, the more we need to examine a new conception of what a network is and what it contains.