Neutrality: Shifting Sand or Quicksand?

Net neutrality has been a thorny issue for the industry from the first, and the importance of finding a rational policy increases as network operators come closer to the point of “ARPU turnaround”, when the revenue-per-user curve flattens and then falls.  Since traffic per user is increasing, this turnaround point marks the time when future service profits from the current model become unlikely.  “Neutrality” closes off some future service models, and so could threaten investment.  But it also protects consumers and the industry against anti-competitive behavior and the price gouging that falling profits incentivize.

The FCC’s neutrality ruling is still under appeal, but it’s one that neither side of the issue (pro-investment or pro-consumer) seems to like.  Thus, both sides have been pushing their agendas in their own favorite ways.  For those opposed to neutrality (largely Republicans), that’s meant introducing bills or amendments designed to curtail or roll back FCC actions.  For supporters of neutrality the reaction has been more grass-roots, and one prong of the effort has been to put pressure on telcos in particular to vote on neutrality compliance at the shareholder level.  We’ve had developments in both these spaces this week.

The SEC has ruled that neutrality is a significant policy issue and can therefore fairly be put to shareholder vote.  That doesn’t necessarily mean that it will be, and even if it is, that doesn’t mean the measure would pass.  In fact, my model says that shareholder votes on neutrality would likely spawn a campaign to explain the issues in terms of stock price, and that would likely result in a defeat for neutrality advocates.  Certainly the large-bloc holders would vote against it.  However, the efforts here would likely push the topic to the forefront (again), and that might have an impact on telco planning.

The other side of this is the legislative efforts to roll back or restrain neutrality, and here the situation is reversed in that anti-neutrality forces have won a victory.  But here again the impact of that victory is unlikely to be meaningful.  Attempts to amend the spectrum legislation before Congress to require that the FCC not impose any new neutrality rules on the services offered with the spectrum were withdrawn.  That doesn’t mean that neutrality rules will be extended to wireless, nor that the efforts to prevent that will end.  It means more debate.

Forgetting the politics of this for the moment (likely a good idea these days if you want to address anything rationally), the question here is how to ensure that the Internet remains a fertile ground for innovation.  Unbridled neutrality doesn’t do that, nor does the complete lack of rules to ensure neutral behavior.  I think that the place the FCC has gone astray here is in the ruling on third-party payment and settlement.  We should allow websites to pay for priority handling of their traffic and also allow for inter-provider settlement for traffic-handling, because that would be in the best interests of consumers.  It might hurt the VCs, a group that has IMHO become not much (if at all) better than the bankers who gave us the 2008 crash, and it’s likely no coincidence that Genachowski is from that genre.  I’ve heard rumors that he may leave the FCC in 2013 even if Democrats retain control, and that could be a good thing for the debate.

This week, we got news that ad appearances in streaming video doubled in 2011, and that raises two significant questions.  First, why?  Second, does this mean that we actually need a renewed debate on neutrality?  The “why” of the process could be either recognition that streaming is becoming a real factor in viewing for at least some market segments, or the result of broader application of TV Everywhere principles.  The “do we need to talk” question, I think, has a clear answer.

Anything that drives up traffic without driving up revenue reduces the ROI for the network operator and disincentivizes infrastructure upgrades.  What’s interesting is that until the last year or so, streaming was largely “free”, meaning that there was little revenue generated by it and so little potential for settlement.  Then suddenly Netflix burst on the scene and became the largest single source of traffic.  Now we see paid advertising in streaming opening the door for ad-sponsored use of the Internet as a delivery vehicle.  Add Aereo into the picture, a company that wants to make a money-making business out of taking free over-the-air TV onto free streaming, and you have a picture of a behavior trend that’s shifting from symbiotic to parasitic.  Yes, the Internet is creating innovation, but is that innovation focused on gaming the pricing model or on advancing the technology?
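The revenue-per-bit squeeze behind this can be sketched with a toy calculation (all numbers are hypothetical, purely for illustration): flat per-user revenue divided by growing per-user traffic means the value of each delivered gigabyte falls fast.

```python
# Toy model: flat ARPU plus growing traffic means falling revenue per GB.
# All figures here are invented for illustration only.

arpu = 40.0          # monthly revenue per user, assumed flat (dollars)
traffic_gb = 10.0    # monthly traffic per user in year 0 (GB)
growth = 0.40        # assumed 40% annual traffic growth per user

for year in range(4):
    rev_per_gb = arpu / traffic_gb
    print(f"year {year}: {traffic_gb:.1f} GB/user, ${rev_per_gb:.2f}/GB")
    traffic_gb *= 1 + growth
```

Under these assumed numbers, revenue per gigabyte drops by more than half within three years even though total revenue is unchanged, which is exactly the arithmetic that makes operators wary of unsettled streaming traffic.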

 

Market Shifts: LightSquared, Cloud, and Comm Standards

The FCC has reportedly (finally) said it won’t approve the LightSquared broadband-in-the-sky plan because of interference with GPS services.  I’ve never been a fan of the whole notion (two years ago I shocked one of the equipment vendors by telling them I didn’t think it would ever happen) because I don’t think the service would be commercially competitive, but certainly the FCC ruling would create angst among vendors (not to mention at LightSquared and Sprint, which is on the hook for a $65 million payment to LightSquared).

The GPS interference problem stems from the fact that GPS devices are small, low-powered, and easily overwhelmed by strong signals nearby.  That means that they could be impacted not so much by the satellite downlink but from broadband uplink traffic from devices near the GPS units.  The FCC has been wary of this issue for some time, and has finally decided it’s critical.  That leaves LightSquared with the ugly choices of a) hoping they’ll change their minds, b) trying to swap the spectrum, or c) shutting down.

IMHO, even if the spectrum issues were solved, the competitive problem is harder to pin down.  The founding notion of LightSquared was that of a wholesale LTE framework that could enable smaller operators to obtain a national footprint without deploying national infrastructure.  Obviously this would mean wholesale use of device-to-satellite connectivity, and with data usage growing you have to wonder how many data connections a bird could handle.  Some of the more recent LightSquared options have been described as “hybrid” LTE and satellite and said to include fiber deployment, wholesaling capacity from Sprint, etc.  It’s never been clear to me exactly what LightSquared thought it would deploy and how it thought the thinning margins of mobile broadband could possibly support a wholesale middleman player.  Could this all be hype, dare we say, in our hype-ridden market?  Try to dig out the details on the network for yourself; you’ll likely come away with more questions than answers.  That’s troubling considering how far this process has gotten.

There are other hype spaces, of course, and none are more hype-ridden than the cloud.  Yet according to the cloud pundits, quoted in the cloud media, there’s plenty of opportunity for network operators in the cloud space.  At one level, that’s a kind of useless assertion; if there’s any opportunity at all, there clearly has to be some for the guys who own the networks.  At another level, it’s false because the theme of the stories is the disruption of the IaaS model and incumbent Amazon.  That’s not an opportunity, it’s a trap.

Some operators, at least, believe this.  One told me that they don’t want to move from depending on one service with low-and-shrinking margins (broadband) to depending on a new one—the cloud.  Yes, the operators have a natural advantage in any market that’s a race to the bottom; they do have scale and they also have a lower internal rate of return expectation.  But that doesn’t mean they have to eat sawdust if there’s caviar to be had.

Speaking of communications services, Cisco has decided to appeal the EU approval of the Microsoft acquisition of Skype, saying that there should have been more conditions set to guarantee interoperability.  The decision is unsurprising at one level; these days, everyone always says everyone else is either anti-competitive or infringing on some patent or another.  At another level it’s a thorny issue because the deal would likely accelerate a shift away from TDM overall (videoconferencing and voice), something that would clearly benefit Cisco.

The train has left the station on IP communications interoperability, in my view.  Regulators a century ago recognized that you can’t have a universal and efficient phone network if every town builds its own system using its own technology, and we got out in front with the notion of the Bell System.  Internet communications grew up in the age of deregulation, and so we’ve deliberately sacrificed standardization for innovation potential.  To impose a change in anyone’s protocol for anything at this point is more hurtful to the consumer than risking a kind of monopoly-by-acclamation where everyone flocks to one banner.  It’s a price we’ve paid willingly up to now.  What, after all, is Google or Facebook if not that?  Standardization efforts on something as simple as IM have had little impact, I’d assert.  How much would video or telepresence standards really do?

The real issue here isn’t “standards” per se, it’s the approach to community-building in complex services.  That, as I’ve said before, is a problem of a federation strategy and not an interworking strategy.  You can’t interwork things that aren’t based on the same essential feature and client framework, and in future services the personalization options offered to users alone will eradicate any chance of consistency.  You have to compose interworking of elements into service definitions, not interconnect the services that result from those definitions.

Apple is bringing communications features from its iOS appliances to the Mac, something that will certainly be managed more and more through iCloud.  Whatever Cisco’s chances of success with the EU are (slim, IMHO), there’s little or no chance that Apple won’t field its own vision of teleconferencing, communication, and collaboration.  The only high ground Cisco could seize here is in the federation space, and that’s more a marketing/positioning stance than one before the bar.

Alcatel-Lucent’s New Broadband and IBM’s New Cloud

I’ve been speculating on the role that WiFi might play in the future mobile broadband ecosystem, and Alcatel-Lucent has apparently been doing the same.  The result is their latest enhancement to their lightRadio line, which they call “lightRadio WiFi”, a development that addresses the reality of mobile broadband—it’s not all about 3G and 4G but rather about appliances.

Smartphones and tablets have long used WiFi to escape over-the-air charges during periods of high Internet use, and of course a growing percentage of tablets don’t have any other connection option.  Given this, it follows that WiFi resources need to be managed not as independent broadband oases but rather as a kind of sparse mobile network, one that offers coverage in some places but not (geographically) in most.  At the same time, it’s clear (from trials in places like Florida) that it will soon be possible to cover a large geographic area with overlapping WiFi “cells”—and all of this is driven by the appliance side.  What Alcatel-Lucent has done with lightRadio WiFi is to bring WiFi into the fold in terms of connection management, to embrace fully the notion of hand-off and potentially even roaming and charging within a 3G/4G/WiFi mixed enclave, or even in theory for WiFi alone.

They’ve also illustrated why it’s so difficult to be fully engaged in the appliance revolution if you’re a vendor who has neither appliance nor RAN assets.  This new vision of a RAN could be transforming, and anyone who drives the transformation bus gets to decide who sits where.  That makes things harder for other vendors who have less-developed positions with the RAN and service/handoff/roaming control, and much more difficult for those who have no clear asset base on which to develop these capabilities.

In another critical market area, the cloud, IBM sponsored a survey, to be published shortly, showing that the key role of the cloud is not (or should not be) to make current business models cheaper, but to enable new ones.  This is the very point that I’ve been making based on our own surveys.  As our February Netwatcher edition will show, it’s IT projects that unlock new benefits that change the IT spending trend line overall, and these projects are falling off at an alarming rate.  If the current trend continues, then IT will become another business commodity item, something that IBM for sure doesn’t want to see.  It’s also interesting that this talk is coming out of IBM just after it’s made a leadership change.  Users in our surveys have reported that IBM has fallen from its once-unique position as leader in articulating the business-value-to-technology-decision evolution that’s critical in driving new productivity paradigms.  Does IBM now want to get this back?

That’s what we’re hearing.  Ginni Rometty, who came from Global Services, may have a better-than-average insight into what IBM needs to do to get businesses thinking that IT is something that can truly ramp up their bottom line, and that can be done only by showing that there are new benefits to unlock.  That these benefits could drive IT spending up an average of almost 25% over the next seven years versus current trends wouldn’t hurt IBM either.  But as good as this news could be for IBM and even for IT at large, it’s likely not good news for networking.  All of the IT-driven waves of the past involved networking as necessary connecting glue and thus (through several intermediate phases) drove universal broadband.  The next wave would need to drive a new network/IT relationship, and an IT company like IBM is more likely to push that relationship strongly in IT’s favored direction.
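For scale, that 25% figure is cumulative over seven years; a quick check (my arithmetic, taking the figure above as given) shows the implied annual premium over trend is fairly modest:

```python
# If IT spending ends up 25% above the current trend line after seven
# years, what annual growth premium over trend does that imply?
# Simple compound-growth (CAGR) arithmetic.
cumulative_lift = 1.25
years = 7
annual_premium = cumulative_lift ** (1 / years) - 1
print(f"{annual_premium:.1%} per year above trend")  # prints "3.2% per year above trend"
```

A sustained three-point-plus annual premium over a flat trend line would be a substantial shift for the IT market, but it is not a hockey stick.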

 

Another Look at TV, and the Cloud

The speculation on Apple’s and Google’s plans for entertainment products seems endless, but it also seems to be ramping up.  As is always the case with rumors and speculation, it’s far from clear that there’s any substance behind the stories, but there are at least some grounds to wonder a bit.  After all, there aren’t many consumer markets left.

At the high level, the issue for both companies seems to revolve around the viewing experience.  Google makes money on ads, and commercials are ads.  Apple makes money on devices, and TVs are devices.  Both companies covet some presence in the space of the other, and the best defense is a good offense.  The question is how both companies could deal with the margin and competition issues, both of which would be raised by a play.  And remember that with their MMI deal ratified by regulators in both the US and EU, Google is free to manufacture stuff on its own.  Is that threat enough to force Apple to be more aggressive?

Maybe, but it’s not clear that TV is the place for that aggression to play out.  I’m not sure I believe that Google wants to make TVs.  I’m similarly unsure that they want to make STBs, so for Google I think an Android-for-TV-and-STB strategy might be the strongest play, buttressed by an appliance approach that’s an evolution to Google TV in some way.  Apple may want to make TVs, but whether they do or not they face the challenge that if Google has any TV strategy at all, then Apple has to either make a TV or create a licensed iOS version to host on one.  That’s something Apple has always resisted.

On the STB side, the whole opportunity rests on the presumptions that (first) there’s a clear value in creating an ad link between streaming and RF TV, and (second) that an STB is the best way to do it.  Technically, having an STB create a merged HDMI stream from the Web (for ads) and traditional linear RF TV is feasible.  The question is whether it’s valuable.  For online ads inserted in video, advertisers have focused on using the targeting benefits to reduce their overall cost rather than to increase ad spending justified by additional sales.  That suggests that were inserted ads to replace broadcast commercials, the net receipts by the networks could well be lower.  If that’s true, it’s hard to see why anyone would build an offering here.
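To see why targeting can shrink receipts rather than grow them, here’s a back-of-the-envelope sketch with invented CPMs and impression counts (none of these figures come from real ad data):

```python
# Toy ad-economics model: advertisers use targeting to buy fewer,
# better-aimed impressions rather than spending more overall.
# All numbers are hypothetical.

def receipts(impressions_millions, cpm_dollars):
    """Total ad revenue in dollars: impressions (millions) priced per thousand."""
    return impressions_millions * 1_000 * cpm_dollars

broadcast = receipts(100, 20.0)  # 100M untargeted spots at a $20 CPM
targeted = receipts(40, 35.0)    # 40M targeted spots at a premium $35 CPM

# Even with a 75% CPM premium, total receipts fall from $2.0M to $1.4M.
print(broadcast, targeted)
```

Under these assumed numbers the networks net 30% less, which is why replacing broadcast commercials with inserted online ads isn’t obviously attractive to them.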

Cisco jumped on the usage-price trend with another of its releases that seem to tell operators to suck it up and spend on Cisco.  The latest one seems to say that usage pricing is somehow going to increase data usage, logic that I’m frankly unable to decipher.  Operators are generally opposed to usage pricing as a long-term revenue strategy, according to my surveys, and they’ve been that way from the first.  That doesn’t mean that they don’t see a role for it, in mobile in particular.  The majority would rather see third-party charges, meaning the right to charge OTTs for access to users in some way, and that’s something the FCC says it doesn’t want.  All want to get into the higher-level service business themselves, and most of those with usage-price plans would see them more as a combination of bridge to a service-driven transformation and a barrier to rampant traffic growth driven by OTT efforts.

Cloud computing is one of the operators’ specific revenue goals outside their normal bit-pushing, and there are renewed claims that PaaS is “about to take off” as a cloud service.  True, perhaps, but not because of technology maturity.  The real driver is the maturing of the application of the cloud, from the early web-oriented apps toward the business core.  A realistic core application cloud would have to be one that spreads across the enterprise and one or more cloud providers to create elastic, fail-proof resources and also support new applications better suited for cloud deployment than for central IT.  That combination of requirements would be met most easily by a software platform that subsumed the whole notion of computers, OSs, and middleware into “what the application runs on”, which is a pretty good description of PaaS.  Beware, EC2 competitors!

 

Reading the Earnings

Cisco and Alcatel-Lucent both delivered their quarterly numbers late last week, and in both cases the numbers were decent, but the fortunes of the two companies’ stocks were different.  Cisco’s declined after its report, and Alcatel-Lucent climbed significantly.  The question is whether there was a difference in the numbers that justified the different investor reaction, and I think there may have been.

Alcatel-Lucent is arguably the broadest-based player among the service provider network equipment establishment, and it also has consistently scored high in strategic influence.  The problem the company has is one of cost, and no small part of that problem can be attributed to the now-long-past merger.  The Street believes that Alcatel-Lucent is making progress on the cost side, and since that’s the problem the Street sees, it rewards the progress.

Cisco’s problem, according to the Street, is growth.  In the service provider equipment space, the latest Street forecasts are predicting zero capex growth.  Enterprise spending on networking is somewhat better, but certainly not threatening to reach double digits.  For Cisco to sustain even 10% profit growth annually, it would need to demonstrate that it’s taking market share without sacrificing margins.  That didn’t come through in the report, and so Cisco was not rewarded.

To get a bit more color on the picture, we can add in the fact that two other players in the telecom equipment space (Ericsson and Juniper) both missed, with both companies citing a difficult carrier spending environment as the cause.  Add this to the Street perspective that there will be negligible to zero growth in capex for 2012 and you have another perspective on Street reaction to the quarterly numbers of Alcatel-Lucent and Cisco.  The organic growth potential for the market is minimal; cost reduction is thus the only path to progress, and high-margin, growth-dependent giants like Cisco aren’t favored by that scenario.

The thing that’s interesting, and perhaps a bit disheartening, about all of this is that the True Path to Street Success can only come by addressing the problem with capex, and that can be addressed only by providing operators some path toward higher revenue per bit.  I think that Alcatel-Lucent’s and Cisco’s calls acknowledged that in their own way—customer satisfaction or holistic approach both add up to looking beyond your own sales to the customers’ value proposition.  The question is whether either company will be able to put together a strong story that really addresses the revenue per bit problem.  I think both companies have the ingredients.  I think Alcatel-Lucent has a stronger foundation for the story at this point, and higher strategic credibility with the buyer, but I think Cisco has been gaining traction because it’s making its sales organization more articulate at the strategy level.

Ericsson’s problems suggest that what operators have told us in surveys is really true; they want their integrators to be a big (if not their biggest) network equipment supplier.  We also hear that the operators want a service-layer strategy and a developer strategy, and that they aren’t hearing that from Ericsson yet.  It may be that Ericsson is being taken to task for being the only one of the big wireless players who doesn’t have an explicit service-layer and content approach.  NSN and Alcatel-Lucent both do, and in a market where mobile and content seem to be merging, the combination of both is a critical requirement for gaining strategic credibility.

The situation with Juniper, I think, is more complicated.  The company doesn’t have a real mobile asset base because it lacks the RAN and IMS elements, and without those it’s hard to engage convincingly in mobile plays.  Their new ACX line is targeted at mobile backhaul, but I think there needs to be more in the package to overcome the fundamental fact that competitors have more of the RAN/IMS combination Juniper lacks.  The deal Juniper did with BitGravity is aimed at improving their content position, but how much the service management piece they acquired will help things is also a question, in part because it’s not fully exposed and in part because it’s not clear what Juniper plans to do with it.  It’s not that Juniper lacks technology (they have some of the best) but rather that they are not showing the ecosystemic and customer-revenue-centric positioning that their main competitors, Alcatel-Lucent and Cisco, seem to be gaining traction with.  You can’t present solution elements in a market demanding total solutions, and Juniper still needs to bring this all together in a compelling vision.

But this is a side-show to the big event.  The real question here, I think, is whether ANY vendor will promote a real service-layer solution for operators.  Does Cisco believe it can just gain market share and avoid the risk of taking a strong stand in services?  Does Alcatel-Lucent believe they’ve done enough and the market will step to their window eventually?  Does NSN think it can ride a service-layer vision to professional services success, and does Ericsson think it can get to that same goal without a strong position anywhere in the service layer?  Huawei, meanwhile, looms for those who think that nothing is needed beyond bits.  No matter how you couch cost-based equipment marketing, it ends up spelling “commoditization”, and “Huawei wins” at the same time.

 

Strategy or Tactics?

Juniper’s partner conference this week has managed to catch the eye of Wall Street, which makes it more important than these sorts of events typically are.  It’s no accident; Juniper has been promoting its event more than usual too.  The new J-Partner program will spend more, promote partnerships more, and be dedicated to “driving deeper, more profitable relationships” between Juniper and its partners.

The Juniper event comes on the heels of Alcatel-Lucent’s announcement of ng Connect, an ecosystem of companies designed to create solutions and services for the NGN.  What’s interesting isn’t that I believe the two companies are vying for the same media attention, but rather that the two activities are so close in time and so different in direction.  They are so different, in fact, that you could say they’re not competitive, and yet the events may frame a contest that will have a real winner and loser.

In some ways, Alcatel-Lucent and Juniper have a similar problem.  The network equipment market is definitely under pressure, and the pressure is created by a lack of “new money” to drive buyers.  For service providers, the issue is declining revenue per bit.  For enterprises, it’s the lack of new productivity benefits to drive new project spending.  Both Alcatel-Lucent and Juniper are facing their own sales and profit pressure, created by the market conditions.  But the responses are very different.  Alcatel-Lucent has taken a very strategic step with ng Connect, and Juniper is taking a very tactical one with J-Partner.

Alcatel-Lucent seems to believe that the problem in the market is one of finding a buyer business model, a strategy that sells by helping the buyer make the case to buy.  Their ng Connect program isn’t about products, it’s about finding a path toward solving buyers’ business problems.  The approach seems to be to build a solution/service ecosystem, and it’s a step that suggests that the company is prepared to go toe to toe with competitors once the buyers get an idea of what their ecosystem would need in the way of product.  To Alcatel-Lucent, the problem is coming to terms with NGN.

Juniper has always been a tactical player, sales-driven, much like its arch-rival Cisco.  A partner program is an approach to quickly increasing sales, it’s a “channel program” because it channels products to the buyer.  The buyer business problems in the tactical world are a given; they’re whatever’s driving the hand that signs the check.  The goal of a partner/channel program is to get the product out there in a lot of partner hands, so at least one partner will intersect with that first hand, the one that’s holding the pen.  Feet on the street.

So how do strategy and tactics compare?  The obvious point is that tactics could pay off quicker than strategy because they influence near-term buying.  Whether you build channel programs on “solutions” or “products”, meaning whether you have partners adding significant functional value or not, is less an issue than whether there are near-term opportunities you can grab onto.  Do the buyers have money to spend, or do they need help finding a paradigm to invest in?  In the service provider space, I think the latter issue dominates.  That, of course, is where Alcatel-Lucent and Juniper compete head-to-head.  But Juniper is also an enterprise player.  Does the enterprise have a paradigm in play that will drive network purchasing, or do they need a strategy too?

That’s the question that 2012 will likely answer.  Everyone, myself included, has said that enterprise IT spending is being driven by data center evolution—first in the form of consolidation and virtualization and now in the form of the cloud.  But cloud projects, according to my survey, are the most behind of all projects and the most likely to fail or fall short in terms of benefit realization.  How much strength is there in the cloud as a driver of change, a driver of network spending in the enterprise?  That’s one big question for Juniper, because tactical channel programs will fail if tactics aren’t enough.

Strategy has its limitations too, though.  Look at Yahoo and Jerry Yang.  The company’s stock went UP yesterday when it was announced that Yang was departing.  His strategy, his vision, turned out to have feet of clay and his idiosyncratic views were certainly a factor in making a once-giant firm into something that’s teetering on the edge of being an also-ran in a market it arguably helped create.  With tactical approaches you can at least see whether you’re being successful and make changes quickly.  With strategic initiatives, it’s too late to fix a problem once it’s been recognized.  So will Alcatel-Lucent’s history of vision carry the day here, or Juniper’s focus on sales?  We’re going to know for sure in 2012, folks.  Somebody may be joining Yahoo in the Tech Hall of Declining Relevance.

 

More Facets of Video Future

Video and streaming are obviously going to be hot topics for a long time, and there’s interesting stuff happening all through the food chain.  The question is whether the ecosystem that’s being pushed in so many directions at so many levels is going to converge on anything that all the players can survive in.

CES suggested that there are going to be more options for video connectivity offered in the future.  Sets running Google TV (or Apple TV, of course, but they weren’t at CES), streaming to tablets in the home…maybe even smart car entertainment.  We’ve also seen increased attention to the broader video ecosystem from the vendor community.  Alcatel-Lucent has been promoting its “ng Connect” program, a framework for linking developers with bigger players who might be of help in sponsoring an innovative service notion, then moving the whole thing forward as a cooperative project.  The concept isn’t limited to video, but clearly video is a main target.  Cisco has announced a partnership with ActiveVideo, a “cloud video” player that creates better integration between legacy RF and IP streaming, to expand the functional scope of its Videoscape offering.  Hulu, which seemed to be on the block, now seems to be getting more financing from the very players who were trying to sell it.

There’s clearly going to be a new set of video options, but I still think that we’re sensationalizing the impact of streaming, given the dire effects it would have on traffic.  We can deliver RF multi-channel on cable or fiber pretty darn easily and relatively cheaply.  To deliver exactly the same material to the same number of users with IP streaming would require significant investment in some combination of metro infrastructure and CDN caching.  Even in the US, TV delivery on what could be called “partitioned IP”, meaning off-Internet like U-verse, wouldn’t violate neutrality, but it would require that access providers get into the game, and that’s the segment that’s been looking elsewhere.  TV Everywhere isn’t intended to be “everywhere” to the point of excluding linear RF for these guys!  It’s sort of “everywhere-my-RF-isn’t” instead.  I think that the Bell Ladies have to sing before we can call the ball on the future of streaming.

 

Bumps in the Internet Video Road?

Apple may be moving into the educational market in a different way, going after the enormous, highly politicized, and highly profitable textbook business.  The details of this move aren’t known at this point but there are some interesting questions about the whole textbook-ebook thing that bear review.  It might be something actually useful in education.

Leaving the question of what goes in a book aside, one of the issues with textbooks of all types is that they are highly inertial.  You don’t want to spend a bundle on books only to replace them every year, and yet there’s both a damage factor and obsolescence of content to consider.  The cost of books tends to induce schools to standardize more on material and to narrow the scope of what’s offered or made available for projects.  Imagine a school whose library and texts were all ebooks, all available to update as quickly as the editions could be changed electronically, all flexible in terms of who gets what, based on who needs it.  That’s probably the sort of vision Apple is looking at.

The challenges are formidable, though.  This sort of thing will not play well with the schoolbook publishing firms, all of whom are comfortably entrenched in their political games.  It’s also not going to be easy to answer the question of who supplies the devices to read the things.  In some schools, kids have laptops or tablets on a routine basis, and arguably that should be true more broadly, but the cost of the gear and the risk of theft or damage are very high when you’re talking about devices like this.  If Apple can’t figure out a way to create a populist educational book market, they risk creating an elitist tier of educational tools that not only won’t catch on but will likely create some back-pressure on the company.  Right now, Wall Street has high expectations for this announcement because they want to see if Apple innovation survived Steve Jobs’ passing.  We’ll find out soon.

Microsoft’s decision to put its subscription TV plan on hold might be linked both to issues of Apple innovation (fear of too much of it) and online business model problems, but most likely it’s what the rumors say it is—licensing.  Content owners are finally figuring out that many of them are being taken just like the access providers.  They make major investments that others leverage for cents on the dollar, so they believe, and the solution to that problem is to raise the rates to license material.  There’s a deeper issue here, though, which is that Microsoft likely fears that the whole of the entertainment ecosystem is in danger of destabilizing.  Too much free content kills the producers of content and the transporters of bits, and puts all the power in the hands of the appliance players.  Microsoft isn’t a winner in the appliance space yet; they’re still trying to get a phone and tablet strategy cobbled together.  If they spin out their story now, can they fully exploit the market they help to create?  I don’t think so, and I don’t think they see a clear path to success here either—yet.  Wait till the fall.

The Street is asking whether there are fundamental flaws in the whole carrier router model, and if there are I have to say that the FCC is in many ways at fault.  Genachowski is a VC at heart, one who wants novelty and dynamism in the industry perhaps more than health.  The FCC is rare among regulatory bodies in that it is charged to sustain a healthy industry and not just to protect consumer interest, but under Genachowski “the industry” has meant the OTTs.  It would be unfair to say that routers as a product class are being killed by regulatory stupidity, but it would be fair to say that the vision of the Internet as a high-speed, high-quality, traditional any-to-any grid has likely already been killed.  That, as I’ve said before, tends to push deployments down the OSI stack to the optical and Ethernet layer, because most traffic is going from an edge aggregation device like a BRAS to a cache or POP.  Those are fixed point-to-point flows, so you don’t need IP addressing to carry them.  We’re building an Internet a world wide and a metro deep, and that is eventually going to really hit the vendors hard, particularly those without RF or optical positions and without any service layer tools.

Router players have hardly been forthcoming on this trend despite the fact that their own product moves validate it.  That raises the question of whether they’re supporting a vision of market growth that’s already failing in the first step toward the future—the present.  It raises the question of whether their visions of enterprise trends in networking are any better.  It raises the question of whether new products that are supposed to ramp in 2012 are targeting any real opportunity.  My view is that we do have a very pervasive failure of market perception here, one that’s been developing for half a decade, and it’s about to bite us.  Is 2012 the year it does?  Could be.


Learning from Microsoft’s Mistakes

The story that PC sales slipped in 4Q, first raised as a Microsoft comment yesterday, is now being quasi-confirmed by more detailed shipment data released by various Wall Street researchers.  One, citing Gartner, says that y/y growth in PC sales was well below seasonality in the quarter, and the expectation overall is that PC sales in 2012 will be very near to flat.  Some forecasts are putting sales down a trifle.  Nobody doubts that this is due to tablets, but as I noted yesterday, the thing driving this change is less a movement away from PCs than a movement to defer upgrading PCs to focus instead on a cooler device.

What this does do, whatever the cause, is hurt players like Intel and Microsoft who have built their business substantially on PCs and who now face slower growth in revenues from that source.  Both companies have tried to move to catch the wave on smartphones and tablets, and in both cases it’s too early to say that they’ll fail.  It’s also far too early to say they’d succeed, and the odds against them have lengthened because they delayed so long in recognizing what was happening in their markets.

Microsoft is now said to be preparing for a major marketing realignment that will include some layoffs, though not large numbers.  This to me seems another example of our industry’s tendency to shoot behind the duck.  The time to fix marketing problems is when you see market changes, not two or three years afterward.  The pace of change has advanced so far at this point that Microsoft may have a major problem creating an organization before its mandate has been rendered moot by further change.  They’ve never been able to think level with the market; how will they now learn to think ahead?

Network vendors might want to ponder that point.  In the network equipment space, the Street’s view is mixed, due in part to conflicting data and seemingly contradictory trends.  On the one hand, some analysts see the continued drop in ROI for operators, particularly in the mobile space, creating investment pressure and lowering capex.  Some think that video will increase traffic and increase spending, but then there are those who say that might be true but that the fruits of the trend will fall in the laps of Huawei and ZTE.  There was a comment today on Alcatel-Lucent, which carries a lot more cost than the Street likes.  While the company says it won’t do what NSN has done, which is to cut back sharply on what it sells and reduce costs to focus on key areas, I think that unless they figure out a way to capitalize on their positives more quickly, it’s going to come to that.  They have too much EU exposure to take market risks; the economic fundamentals are against them in their home market.

The direction of service provider equipment seems clear to me, but enterprise sales are harder to figure.  Our survey data indicates that projects in 2011 were pushed and that there will be fewer projects approved in 2012.  That creates a downward pressure this year, clearly, but more significantly it’s the projects in Year One that drive upward budget momentum in Year Two and beyond.  If we stay with a persistent under-supply of new projects we’ll be lowering future budgets too, which is a problem that takes a couple of years to work out once it gets started.  It’s been 20 years since budget spending in networking exceeded project spending, but we’re going to have that in 2012 according to the enterprises we survey.  There will be plenty of bright spots, and still chances for companies to improve market share, but the market is going to be harder to navigate in 2012 than it was in 2011, we think.  Since economic conditions are likely to improve we may be somewhat insulated from this, but if Huawei and ZTE (as expected) make a move on the enterprise space, it could be trouble.

Is the Cloud the Star of CES?

What’s the lesson of CES so far?  That a tablet is a window on the cloud.  Eric Schmidt danced around that point with his notion of Android making your house cooperate with you by essentially sensing your behavior.  Walk into a room and it’s like the old song; “The room was singing love songs…” because your favorite music or show comes on, the lights adjust…you get the picture.  This, of course, is just another entry point into the mobility/behavioral transformation I’ve been describing.  An appliance can “sense” you, but to respond to your needs it has to be able to evaluate your behavior, and it’s clear that this will become more complex and social in nature as you move from living alone to being in the real world, at home or at large.  The true future mission of the cloud is to marshal technology to improve our lives.  It’s not to run old IT stuff, but new stuff that’s never been run, never been seen, never been considered.

There are some signs that operators, like Eric, are nibbling on the edges here.  AT&T commented at CES that it plans to create, via its Cloud Architect cloud/developer ecosystem, a global mobile cloud, which is nothing surprising.  Something that might be a bit surprising is the claim that they could deliver “entitled content” to anyone anywhere in the world.  That suggests to me that AT&T is talking about large-scale video federation, and if it’s true then that could be huge.

Cloud Architect is going to include OpenStack, and that puts AT&T behind the open-source cloud tool set.  Whether this is a good thing depends on whether you see the optimal cloud as a bunch of virtual machines with a workflow controller in front.  Virtualization is a kind of afterthought form of multi-tasking, a strategy designed to make two or more things run together when they were architected to run independently.  It’s a great strategy for server consolidation, but I think it’s selling the cloud short by a long shot.

AT&T’s Cloud Architect is essentially a developer community cloud that’s designed to provide those who support its service ecosystem a place to run stuff.  They could in fact run it elsewhere, like on Amazon, but the best place might be a platform ecosystem that doesn’t use virtualization at all.  Newly developed apps could in theory be run simply as tasks or threads in a multi-programming OS.  They largely aren’t because it’s hard to prevent interaction among them, and that interaction would likely be undesirable.  So is security the only reason for using virtualization in the cloud?  It may well be one, but probably the biggest reason today is that we don’t know how else to build a cloud.  The platform-like architectures, including Microsoft’s Azure, Joyent’s SmartOS and SmartDataCenter, and in fact any of the Solaris morphs, are in my view far better platforms for the cloud in general, and developer/service-layer clouds in particular.  OK, I admit that I’m not an OpenStack fan; I think the concept is just benefiting from mindless media clustering on something that sounds cloudy and populist at the same time.  So is a smoke lodge, and mysticism in any form is the wrong thing to build a service future on.
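The isolation problem behind the tasks-versus-VMs point is easy to demonstrate.  Two tenant apps run as threads in one process share all process memory, so one can tamper with the other unless the platform polices every interaction; a hypervisor avoids the problem by giving each tenant its own machine image.  Here’s a minimal sketch in Python, with purely hypothetical tenant functions and shared state invented for illustration:

```python
import threading

# Shared module-level state: co-resident tenant tasks in one
# process see the same memory, just as threads in a
# multi-programming OS would.
shared_config = {"rate_limit": 100}

def tenant_a():
    # Tenant A reads the configuration it believes it owns.
    return shared_config["rate_limit"]

def tenant_b():
    # Tenant B (buggy or malicious) overwrites the same state;
    # nothing in the thread model prevents the interaction.
    shared_config["rate_limit"] = 0

t = threading.Thread(target=tenant_b)
t.start()
t.join()

# Tenant A now observes B's tampering: the value is 0, not the
# 100 it started with.  Virtualization buys exactly this missing
# isolation, at the cost of running a full OS image per tenant.
print(tenant_a())
```

The sketch is deliberately trivial; real platform clouds like Azure or SmartOS impose isolation at the runtime or OS-container level instead of accepting the per-tenant overhead of full virtual machines.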

Alcatel-Lucent is also promoting its vision for connected futures in “ng Connect”, an ecosystem designed to promote cooperative development and deployment of services through what seems like standards-like interactions without (so they hope, I’d bet) the usual standards-body politicking.  The technology framework for ng Connect is a bit more flexible, I think, but I’m not sure whether the actual program will end up settling on a primary environment.  I’m also not sure how much cooperation and exchange is actually architected into the platform versus simply permitted ad hoc.  The thing is, it’s the first drive by a vendor to create a service partnership on an open scale, and it will be interesting to follow.