Hotspots and Standards

Cisco followed up on Chambers’ vague comments about small-cell support with an announcement of its own Hotspot 2.0 WiFi roaming products, particularly a gateway designed to manage small-cell connections into a mobile network.  The move comes as network equipment vendors work hard to address the changes in networking being driven by the increased emphasis on mobile broadband.

As I noted earlier, it’s pretty obvious that increased use of mobile devices for video, something that tablet deployment will certainly encourage, generates a major cell congestion problem for operators.  The problem is unique in that it tends to focus congestion on areas where users can take a moment to look at something, and so it creates a highly uneven bandwidth demand gradient.  Traditional cell coverage plans have focused simply on having every area in range of something, and in many cases the expansion to broadband was planned based on the presumption that cell patterns would be relatively consistent but capacity per cell would increase.  Now we know that cell demand doesn’t increase uniformly, but rather focuses on specific places.  WiFi and femtocells are approaches to increase capacity in locations where users would be expected to congregate.  And feeding these cells, which could be ten to twenty times more numerous than current cells, might be the largest packet connectivity mission in all of metro networking, so getting traction here is important.

Traction is easy to come by if you happen to have a RAN and roaming system in your bag of tricks, but much more complicated if you don’t.  Cisco of course fits in the latter category, and so WiFi is a good way to establish a position in the “new-cell” business without having to buy up radio network companies.  The strategy may be particularly useful given the fact that in the tablet space the great majority of devices don’t come with 3G/4G radios at all, and so can be used only in WiFi mode.  I think that this could drive an eventual shift in the market toward the use of WiFi rather than femtocells, and the Hotspot 2.0 standard could support even a commercial alliance of hotspots with billing and roaming.

There’s still the question of how carriers will make money in this new world, and how they’ll manage the new services that would presumably be their incremental monetization hope.  Juniper will be talking about a three-element mobile profit strategy later in the show, one based on security, open programmability, and upload/download symmetry.  There are no real details yet on this, or on how it would tie into Juniper’s MobileNext strategy; financial analysts didn’t get anything further at the Juniper event held to brief them.  I think that security is a strong issue for mobile, but it really has to be addressed in the context of creating a feature-composition community or mobile cloud.  I don’t know if that’s what’s being proposed here; we may find out on Thursday.

The operators are still struggling with their own perspectives on mobile broadband and services.  A Light Reading story today talks about DT’s concern over the lack of standards defining the linkage between OSS/BSS and infrastructure.  This particular area has been the domain of the TMF, and I’ve noted before that the TMF (and, to be fair, all standards bodies) tends to move at a pace that makes glaciers and turtles seem positively hasty.  My own view is that the problem here is less with the specific OSS-to-network interfaces and more with the broader question of how you operationalize next-gen mobile services whose competitors will be cloud-based offerings from Apple and Google.  Look at the classic OSS/BSS approach and you see something that simply cannot scale to consumer-service levels, nor deliver functionality at a competitive cost point.  There’s a carrier coalition, the Next Generation Mobile Networks group (of which DT is a member), that seems to be trying to drive progress, but I’ve seen the operators do that before and fail to move the ball much.  The notion that RFPs might be issued in 2015, as the article suggests, doesn’t pose much of a threat to either standards-writers or vendors.  Are there other pressures behind the scenes?

 

Tablet Impacts on Mobile Networks

The second day of MWC is demonstrating a show that’s seemingly polarizing between appliances and tiny cells.  Obviously the trends are linked, and obviously the industry’s long-term health and direction may depend on how—and how well—the marketplace manages to link them.

We’re seeing an explosion in tablet sizes as vendors try to figure out what the optimum device might look like, but the only real advance in the tablet space may come from Huawei, which now has its own quad-core chip in its own Android tablet.  And Huawei’s numbers just came out; revenue hit $32 billion, half again as much as Alcatel-Lucent’s and nearly as much as market-leader Ericsson’s.  Huawei is everywhere in equipment, and equipment is what’s driving the market, even if it’s not the old traditional kind.

The tablet is what I think holds the real story in the appliance space.  Unlike phones, tablets are still struggling to find a real mission, and the mission of the tablet may be the determinant of overall market direction.  As consumers shift to social networking (Facebook, recall, is giving the first social-network keynote at MWC) they shift to socially mediated communication, and that undermines operators’ normal voice and SMS revenues.  Tablets aren’t equipped with TDM or traditional voice at all, so voice on tablets is creating an alternative approach even for those who want to talk.  Video/voice is of course a totally new model, and despite talk by players like AT&T that we need special facilities for pairwise video calling, everyone knows in their hearts that’s not going to happen.

The problem or challenge is that tablets can’t easily be held up to your ear.  Reviewers of large form-factor phones said they felt dorky using the gadgets to talk; imagine even a seven-inch tablet!  But equip a tablet with Bluetooth and you could carry one and still talk.  Will people do that?  We don’t know.  Will the increased horsepower and screenpower of the tablet be enough to overcome the difficulties using one while moving, and create a polarization of behavior that preserves a separate phone/tablet space?  Where will the boundary be, and how will it impact operator monetization?

Cisco may be hinting at its own take on this.  Chambers talked about the cloud and wireless, a topic we think could be critically important if one linked it with the whole mobile/behavioral transformation and the potential the cloud and mobility have, in combination, to redefine the way workers do their jobs.  It’s too early to know whether Cisco will make any announcements of substance in the functional blending of the cloud and mobile, but it seems likely that they intend to blend their cloud model with their RAN and offload model.  That suggests two things: first, that Cisco knows you need to be a RAN player to have any credentials in the mobile broadband revolution, and second, that Cisco knows the mobile broadband revolution will be defined by what we DO with the combination, not by how the bits are pushed around.

All of this could be good news for Alcatel-Lucent, and even decent news for NSN, which has just said it will continue to sell off non-critical assets and slim down into a mobile broadband player.  That’s a great idea if you can provide everything the mobile broadband ecosystem needs, but NSN doesn’t have the equipment scope that Alcatel-Lucent has.  If NSN is to take on Cisco, Ericsson, and Huawei, it needs an optimum play where it plays, because it plays on a narrower front than the others.  Where will NSN try to optimize?  It can’t just claim a better RAN; it doesn’t make its own transport gear for the most part.  But it does have a good mobile services strategy and a good multi-screen content strategy.  Can it add cloud and become whole, market-scope-wise?  We’ll see.

 

Handsets Fiddle at MWC; Do Networks Then Burn?

MWC kicks off this week, a show working to transition itself to relevance in a market that’s trying to do the same.  The questions are first whether either of the two transitions is possible, and second whether there’s a single direction that accomplishes both.

For the show, relevance means embracing social networks, handsets, developer programs, and all the things that speak “mobile” to consumers.  Facebook will give a keynote this year, the first time a social network player has had that honor.  But hey, a recent study says that one out of every eleven web accesses is to Facebook.  Nevertheless, simply saying that consumerism in general and social networks in particular are major drivers for mobile networking doesn’t address the problem of the mobile market, which more and more pundits and insiders are now willing to admit.  Look at what GigaOm said:  http://gigaom.com/2012/02/25/its-the-end-of-the-line-for-telco/

Mobile is falling into the same black hole as wireline, a black hole that has seriously eroded wireline capex—to the point where it’s been experiencing negative growth for years now.  Operators in high-competition market zones like Europe have long been fleeing to emerging markets for improved margins, but this year it’s clear that even the higher margins available in emerging markets in boom times won’t compensate for erosion in profits at home.  And these days are hardly boom times for emerging markets.

In my view, you can see battle lines being drawn.  Everyone, including giants like Google, knows that traffic growth cannot continue without revenue growth, but everyone is hoping that the problem will solve itself, and in a way that hurts nobody, or at least only the other guy.  OTTs hope that operators will somehow make heaps more bits for heaps less bucks.  Equipment vendors hope that operators will magically make users pay for QoS.  Operators hope that things like usage pricing will save them.  All of these players acknowledge that their favorite remedy would work major hardship on everyone else in the ecosystem.

If Wall Street is right, and I think they likely are, then 2012 is the year we either solve the problem of network profit once and for all, or reach a point where there’s no longer time to apply a solution before the profit crunch hits operators and consolidation and commoditization result.  That’s alarming given that few of the monetization projects launched to add revenue sources have advanced even to a proof-of-concept trial, and we estimate that even by the end of 2012 we’ll have little progress.  Remember, mobile ARPU growth is slated to turn negative around the end of this year.  Time is passing the network by.

Even at MWC, the mobile network show.  The big media splash from MWC has been phones, predictably, and that only exacerbates the problems of the operators.  To most consumers, the phone is the service.  To most of those who feel otherwise, it’s the portal.  The operator is rarely on the list, unless something goes wrong.  Then the operator gets the call no matter what the symptoms or source; that’s how operators see it anyway.  Sadly, they’re right.

 

Is Cisco’s Lightwire Move a Smart One?

The M&A in network equipment continues, but it also continues to involve primarily specialty firms being picked up by the giants.  The latest is Cisco’s buy of optical specialist Lightwire, a company that specializes in optical interfaces whose low power dissipation (heat) means they can be packed more densely.  That’s a pretty solid indication that Cisco believes it will be deploying higher-density optical interfaces in products, and that in turn could offer some insights into its view of the market.

This sort of high interface density could arise in two areas of the network: the aggregation edge, driven largely by the demands of mobile deployment (particularly small cells), and the data center.  For Cisco, a higher-density product in either space could give them some additional differentiation at a point where success will likely be key—the point where new service plans first impact infrastructure.

The question now is what’s driving Cisco here, other than this obvious differentiation point.  Do they intend to try to drive the market themselves with something that could improve their cloud momentum or mobile/behavioral creds, or do they simply plan to let nature take its course and use the interface-density point as a way to intercept opportunity?  If they do the former, they gain not only the ability to time the market on their own terms but also a chance to get the higher-level story that is most likely to generate press attention.  The price is the greater risk they then take; if you want to jump-start a market you have to give it the Kiss of Life for a while no matter how repulsive that may seem.

This comes as UBS reports that Cisco is at least threatened at the switch level.  Campus and LAN switching accounts for most of the spending in the enterprise Ethernet space, but our surveys have consistently shown that requirements here are increasingly driven by data center modernization, which in turn is increasingly driven by IT business model changes and cloud computing.  UBS says that in 2011 Cisco lost some share to rivals like Juniper, so a strong play in the data center might be a very good idea for Cisco.  That’s particularly true given that Juniper was downgraded just yesterday by one Street research firm for delays in getting its new products out.  QFabric, Juniper’s data center switching keystone, is one of the products the Street now expects to lag.  Does that give Cisco time to counter, and force rivals into counterpunching against a Cisco initiative?  Could be.

Then there’s wireless and backhaul.  We’re seeing most of the big RAN players focusing on making their wireless gear more suitable to smaller-cell deployments.  Even LTE cells are easily congested; users who expect to get double-digit download speeds could use up a complete cell with even accidental concentrations at a pedestrian crossing, much less a collective rest in a hospitality location.  The more cells, the more backhaul, the more potential for optical feeds…the more cost.  This also ties back to Cisco’s Lightwire strategy; high-density cells mean high-density backhaul.  Huawei has joined the rest in coming out with a small-cell strategy, so you know the market is getting serious.
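The congestion arithmetic behind that point is easy to sketch.  Here’s a minimal back-of-envelope calculation; the cell-capacity and per-user figures are illustrative assumptions, not measurements from any deployment:

```python
# Back-of-envelope sketch; the capacity and per-user figures below are
# assumptions for illustration, not measured values for any real LTE cell.
def users_per_cell(cell_capacity_mbps: float, per_user_mbps: float) -> int:
    """Simultaneous users at a target rate that fit in one cell's shared downlink."""
    return int(cell_capacity_mbps // per_user_mbps)

# Assume roughly 75 Mbps of usable downlink in a 20 MHz LTE cell, and
# users expecting 10 Mbps each:
print(users_per_cell(75, 10))  # -> 7
```

Seven or eight users at a crosswalk is an “accidental concentration” by any standard, which is why the small-cell math, and the backhaul math behind it, gets serious so quickly.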

Might we then expect to see more equipment startups, more “network technology” instead of social-overlay stuff, in the future?  There have been a few indications that could be true, one of which is the sudden investor interest in social IPOs.  A newer one is the momentum of “do not track”, both in terms of Federal policy and in terms of “voluntary” industry steps.  The fact is that absent a continuing supply of additional targeting and behavioral data on which to differentiate, the online ad market will see price commoditization in a couple of years that will largely offset its growth in ad appearances.  I think this is why we already see a few brave VCs looking at things like OpenFlow switching.  Frankly, I’d love to see some innovation at the lower layers here; we might even be able to do a better job of making the service layer work with something like OpenFlow.

 

Projects, Media, Gadgets, and M&A

HP’s numbers, like those from rival Dell, disappointed the Street, and this raises the question of whether tech capital spending might be showing an impact from the “project” issues I outlined in Netwatcher this month.  Projects that advance IT overall just aren’t being launched as much, so spending is stagnating.  The explanations offered by HP and some Street analysts differ; it was the hard drive shortage, or it was the impact of tablets.  I don’t buy either one, because only software escaped a shortfall versus expectations.  I’m seeing caution in data center investment, and I’m also seeing a general focus on applications (meaning software) for the project spending that remains.  These are signs of caution, and the caution means that CIOs and executives may still not be confident about 2012.  Certainly they’re conservative about tech.

Juniper joined the M&A announcement craze, but its focus was on beefing up its security portfolio with Mykonos Software, a company that provides a “tar trap” deception-based technique for identifying would-be attackers.  The concept is applicable to either enterprises or service/hosting/cloud providers, and so it could be a valuable addition to Juniper’s portfolio.  That’s particularly true given that Juniper has no conspicuous mobile assets to leverage and has so far been unable to create a captivating cloud positioning for itself.  Security is a viable strategy for adding value above the network.  Users in our survey think that security is at least in part a network issue.  Given that, network-hosted cloud security should be a slam dunk to sell if it’s positioned well.  We’ll see if Juniper can do that.

More M&A should be expected in networking, according to a Silver Lake report cited by Bloomberg.  What’s interesting is that the report postulates that bigger players like Alcatel-Lucent, RIM, and NSN might be on the block.  The driver would be the consolidation that accompanies commoditization in a given market.  I agree with the commoditization point, but I’m not sure that it’s advanced far enough for consolidation to take place, or that the biggest players would be where everyone would start.  If you need IP and Ethernet, why not buy somebody smaller and cheaper?  And ultimately, why buy into a commodity industry if you’re not there already?  Who then buys these big players?  Ericsson is the only one larger than the others, and it seems to have been working to shed a lot of equipment baggage.  Also, Alcatel-Lucent’s creation-merger demonstrated how difficult it is to pull off a big merger in the tech space.

In the video space, Comcast has decided to deploy its own streaming video service, but to say it’s a Netflix competitor as some have done is IMHO incorrect.  What Comcast is doing is creating a streaming TV Everywhere service for its current cable subscribers.  Like TV Everywhere overall, the new service (Xfinity Streampix) is designed to supplement broadcast TV consumption by making shows available online, not to encourage cord-cutting.

It’s not yet clear just how the repertoire of material in Streampix will compare with normal in-home VoD from Comcast, or how the content will match up against services from Hulu or Netflix or Amazon…you get the picture.  It seems to me that if Comcast were to put its broadcast material on the new service en masse, it could virtually stall the use of other streaming services by Comcast customers, which would be pretty smart.

In the tablet wars, Barnes & Noble has launched a down-market model of its high-end Nook to better compete with the Kindle Fire, and both these products are said to be in jeopardy if Apple comes out with a 7-inch iPad.  To make matters even more complicated, Sony’s Vita portable gaming device might end up impacting the whole space if gaming were to be a decisive need for the entertain-me-on-the-move set.  It’s primarily for games, but it also already has some social network and media apps.  Sounds like a tablet wannabe?  The tablet guys probably think so.

To make a tablet a competitive gaming device, you need to make it smaller and much better in terms of graphics performance, and you might also have to customize the GUI for easy access to controls, perhaps with a “gaming dock” that added buttons and other goodies.  All of this would throw the whole tablet space up in the air because current devices don’t have any of that.  So will that validate Sony’s Vita as another gadget, will it open up the tablet market for more competition, or both?  Too soon to say, but right now it doesn’t look like Vita is setting the world on fire.

 

How We Read Earnings and MWC Tea Leaves

Continuing our media ramp-up to MWC, Ericsson announced it’s buying WiFi player BelAir, a step that follows a previous Alcatel-Lucent move in the WiFi direction.  As I said at the time, WiFi is a kind of collateral success, pulled through by a combination of bandwidth-hungry appliances like smartphones and tablets and the inevitable truth that cellular bandwidth isn’t going to be all-you-can-eat in the same sense that wireline’s bandwidth is.  The big change now is that WiFi has become an integral part of the wireless strategy, and in multiple dimensions.

In the WAN, WiFi is both a means of connecting data devices when you don’t have the spectrum or money to deploy micro- or picocells and an offload approach for when you have too much traffic.  In either mission, having the ability to “roam” between the two is important because it’s virtually inevitable that operators transition to support of simple VoIP.  Why?  Because users will transition to it, and because complicated voice services are just not going to sustain ROI.

In the enterprise, WiFi is a way to create a corporate information fabric that covers workers within any facility, and if there’s a kind of auto-roam between facility WiFi and 3G/4G VPN services, you have a potentially global extension to the corporate WAN.  Here again pragmatics drives the operator; appliance players would be happy to manage this kind of seamless service interchangeability and in the process pull traffic that many plans still charge for off the cellular operator’s bill and onto WiFi.

What’s surprising to me here is that the data-oriented network equipment vendors didn’t see the opportunity from the other side, and push their own approaches to WiFi as a means of engaging in the mobility game without having to buy RANs and IMS elements.  The myopia is evidence of a general tactical illness, I think.  Nobody wants to take any step that’s not an immediate product hit, and an immediate sales gain.  As logical as making Wall Street happy with nice quarterly growth may be, the longer-cycle visionaries in the carrier equipment side may be able to use WiFi to do a kind of reverse strategy; engage the high-value side of enterprise networking without the need to deploy all the now-commoditizing LAN and branch products.  Could Alcatel-Lucent and Ericsson become de facto enterprise winners by playing the WiFi card?  I think they could, but who knows whether they will.

Operators in Europe are showing some stress cracks in their own financial-market images.  FT’s profits were down and DT’s CTO has left as the company undertakes a review of their technology direction.  Some of the problem in Europe may be their lagging recovery, but I think the greater challenge is that wireless there is more competitive and thus further ahead on the commoditization curve.  Something radical needs to happen here, but momentum to create that something seems hard to develop.  We’re told by lower-level visionaries that there are recommendations in place for transformation of operator business models in all three of the key market areas, but getting these from plan into launch and then into trial is difficult (see this month’s Netwatcher for details).  If that problem remains through 2012 it’s going to keep capex under pressure.

 

F5 Acquisition of Traffix Sets the Stage for MWC

F5 is buying signaling-network specialist Traffix, a company that focuses on the Diameter protocol widely used in mobile IMS networks, a move that signals F5 intends to attack the backhaul and mobile-service infrastructure space in earnest.  What Traffix provides is effectively a signaling overlay network, and we think this could be a very smart play given that future service-layer applications, and even current CDN-based content delivery, are arguably signaling applications in which the service level coordinates commodity IP data delivery.  F5 has done a good job of positioning itself in the cloud space, and it’s not a big stretch to presume that it would expand the Traffix mission to include signaling in the cloud and within CDN-cloud enclaves.  If it did, that would represent a significant market move and create a major threat to the major network equipment vendors by tapping off high-value applications.

The upcoming Mobile World Congress is already bringing out some developments, and it’s likely that more will follow as the marketplace prepares for the show that represents the last current bastion of carrier capex.  What I’ll be looking for is signs that vendors may finally be paying some attention to creating higher-layer service value-add for network operators.  The ingredients are in place for many of the vendors, but nobody has the magic formula yet, and I offer here a few brief comments on what would be needed.

Alcatel-Lucent has the technology foundation for the service layer in place, but it’s kind of trapped between a very high-level articulation (High-Leverage Networking) and an almost-software-level Service Composition Framework.  Cisco has all of the pieces of a service layer but no enveloping message to justify one.  They also lack precision in positioning their individual monetization-targeted offerings for content, mobile, and the cloud.  Ericsson’s biggest problem is its increased reliance on professional services, something its Telcordia deal isn’t likely to fix.  Huawei could put a major hurt on all its competitors with a strong service-layer strategy for mobile, but I doubt they’ll do that because they’re winning the price war against disorganized strategic resistance and have no incentive to rock the boat.  Juniper appears to have retreated from its service-layer position to take a more operations-driven stance for its Junos positioning.  NSN has to solidify its new positioning that mobile is the heart of its opportunity.

Swing for the seats, everyone.  You’re running out of market.

New Networking Questions

The LightSquared drama continues with news that the company has failed to make an agreed payment to satellite provider Inmarsat, and that some investors in the hedge fund that launched the venture are suing the fund.  There are also indications that some vendors, NSN in particular, may be under pressure from the likely failure of the venture.  Read all the coverage on the topic and you’re left with the conclusion that this was booted from day one.  Maybe, but not for the reasons being tossed around.

The coverage on this deal has vacillated between saying there’s a big political gorilla in the room to saying that the GPS interference angle was inevitable from day one and should have been recognized.  The big question in my view is more one of margins.  Here’s this industry (telecom) that’s been mired in commoditization pressure for a decade.  Here’s a proposal to enter the market with technology that has a very long build-out cycle, and thus would hit the market even further along the downward spiral into marginal ROI.  So what’s the business model?  Wholesale, which divides up the minimal margins available even further.  It’s like saying “I can make money selling dirt because I have a new way to produce it!”

There are some successes, at least for the moment, in the networking space.  Brightcove’s IPO last week saw the stock rise by nearly a third, and it’s worth wondering whether the gain here is linked in some way to the “LightSquared mindset”.  There’s a perception that anything having to do with broadband, especially broadband video, is going to be great simply because everyone “knows” that it’s going to be REALLY big in the next couple of years.  The problem isn’t so much big-ness as profit, of course.  The streaming model isn’t going to create indiscriminate winners any more than other tech models did, but it will generate hype among the non-discriminating.  With Brightcove the value is the linkage to a TV Everywhere-like syndication of video rights.  The business model for streaming media that shows the best chance for near-term success is the model that uses it to extend and not to replace standard linear-RF multi-channel TV.

One of the big questions now is whether this revolution in video attitude will create an opportunity for the telcos.  TV Everywhere is a syndication approach, something that says that you can see a show because you’ve subscribed (implicitly or explicitly) to it.  The delivery of the material is secondary to rights determination, meaning that you can invoke delivery if you can validate rights.  Telcos and cable companies could easily expect to play in the syndication space, and cable companies like Comcast have in fact managed to create a framework for rights-mediated streaming and even offer it to partners.  The telcos are a bit behind the curve here, it seems, and the next question will be whether they can catch up or whether the networks themselves will take over rights management, on their own or via a third party.  In that case, someone wanting to stream something would come to the network for validation, possibly through the streaming provider (Hulu, Amazon, YouTube), and the operators could end up disintermediated again.

Why does this sound like Groundhog Day?  Operators have been waving to the departing trains of opportunity for five years now, and the reason is that they have yet to make a success of their monetization plans.  For the most part that’s because they’ve yet to make a success of a service-layer approach.  The need for an architecture to create NGN services is nearly universal.  The essential requirements for this architecture at the functional level are almost universally accepted.  The lack of progress in this area is, if not universal, at least widespread.  I’m wondering what it’s going to take to fix the problem, and whether there’s any chance of a fix while the operators still have a chance to participate in NGN services.

 

Neutrality: Shifting Sand or Quicksand?

Net neutrality has been a thorny issue for the industry from the first, and the importance of finding a rational policy increases as network operators come closer to the point of “ARPU turnaround”, when the revenue-per-user curve flattens and then falls.  Since traffic per user is increasing, this turnaround point marks the moment when future service profits from the current model become unlikely.  “Neutrality” closes off some future service models, and so could threaten investment.  It also protects consumers and the industry against anti-competitive behavior, the price gouging that falling profits incentivize.
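To make the “ARPU turnaround” concrete, here’s a minimal sketch of how flat revenue per user and compounding traffic per user eventually cross.  Every dollar and growth figure below is an illustrative assumption, not operator data:

```python
# Illustrative only: all figures are assumptions, not operator data.
def quarters_until_loss(arpu: float, cost_per_gb: float,
                        gb_per_user: float, quarterly_growth: float) -> int:
    """First quarter in which per-user delivery cost exceeds flat ARPU,
    assuming traffic per user compounds each quarter."""
    quarter = 0
    while cost_per_gb * gb_per_user <= arpu:
        gb_per_user *= 1 + quarterly_growth
        quarter += 1
    return quarter

# Assume $30/month ARPU, $2/GB delivery cost, 5 GB/user/month today,
# and 10% traffic growth per quarter:
print(quarters_until_loss(30.0, 2.0, 5.0, 0.10))  # -> 12
```

Under those assumed numbers the crossover arrives in about three years, which is why the policy debate feels urgent to operators even while revenue still exceeds cost today.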

The FCC’s neutrality ruling is still under appeal, but it’s one that neither side of the issue (pro-investment or pro-consumer) seems to like.  Thus, both sides have been pushing their agendas in their own favorite ways.  For those opposed to neutrality (largely Republicans), that’s meant introducing bills or amendments designed to curtail or roll back FCC actions.  For supporters of neutrality the reaction has been more grass-roots, and one prong of the effort has been to put pressure on telcos in particular to vote on neutrality compliance at the shareholder level.  We’ve had developments in both these spaces this week.

The SEC has ruled that neutrality is a significant policy issue and can therefore fairly be put to a shareholder vote.  That doesn’t necessarily mean it will be, and even if a vote is held it doesn’t mean the measure would pass.  In fact, my model says that shareholder votes on neutrality would likely spawn a campaign to explain the issues in terms of stock price, and that would likely result in a defeat for neutrality advocates.  Certainly the large-bloc holders would vote against it.  However, the efforts here would likely push the topic to the forefront (again), and that might have an impact on telco planning.

The other side of this is the legislative effort to roll back or restrain neutrality, and here the situation is reversed in that it’s the neutrality supporters who have won a victory.  But here again the impact of that victory is unlikely to be meaningful.  Attempts to amend the spectrum legislation before Congress to require that the FCC not impose any new neutrality rules on services offered with the spectrum were withdrawn.  That doesn’t mean that neutrality rules will be extended to wireless, nor that the efforts to prevent that will end.  It means more debate.

Forgetting the politics of this for the moment (likely a good idea these days if you want to address anything rationally), the question here is how to ensure that the Internet remains a fertile ground for innovation.  Unbridled neutrality doesn’t do that, and neither does the complete lack of rules to ensure neutral behavior.  I think the place the FCC has gone astray here is in its ruling on third-party payment and settlement.  We should allow websites to pay for priority handling of their traffic and also allow for inter-provider settlement for traffic handling, because that would be in the best interests of consumers.  It might hurt the VCs, a group who have IMHO become not much (if at all) better than the bankers who gave us the 2008 crash, and it’s likely no coincidence that Genachowski comes from that world.  I’ve heard rumors that he may leave the FCC in 2013 even if Democrats retain control, and that could be a good thing for the debate.

This week, we got news that ad appearances in streaming video doubled in 2011, and that raises two significant questions.  First, why?  Second, does this mean that we actually need a renewed debate on neutrality?  The “why” of the process could be either recognition that streaming is becoming a real factor in viewing for at least some market segments, or the result of broader application of TV Everywhere principles.  The “do we need to talk” question, I think, has a clear answer.

Anything that drives up traffic without driving up revenue reduces the ROI for the network operator and disincentivizes infrastructure upgrades.  What’s interesting is that until the last year or so, streaming was largely “free”, meaning that it generated little revenue and so offered little potential for settlement.  Then Netflix burst on the scene and became the largest single source of traffic.  Now we see paid advertising in streaming opening the door for ad-sponsored use of the Internet as a delivery vehicle.  Add Aereo to the picture, a company that wants to make a money-making business out of taking free over-the-air TV onto free streaming, and you have a behavior trend that’s shifting from symbiotic to parasitic.  Yes, the Internet is creating innovation, but is that innovation focused on advancing the technology or on gaming the pricing model?

 

Market Shifts: LightSquared, Cloud, and Comm Standards

The FCC has reportedly (finally) said it won’t approve the LightSquared broadband-in-the-sky plan because of interference with GPS services.  I’ve never been a fan of the whole notion (two years ago I shocked one of the equipment vendors by saying I didn’t think it would ever happen) because I don’t think the service would be commercially competitive, but the FCC ruling will certainly create angst among vendors (not to mention at LightSquared and at Sprint, which is on the hook for a $65 million payment to LightSquared).

The GPS interference problem stems from the fact that GPS receivers are small, low-powered, and easily overwhelmed by strong signals nearby.  That means they could be impacted not so much by the satellite downlink as by broadband uplink traffic from devices near the GPS units.  The FCC has been wary of this issue for some time, and has finally decided it’s critical.  That leaves LightSquared with the ugly choices of a) hoping the FCC will change its mind, b) trying to swap the spectrum, or c) shutting down.

IMHO, even if the spectrum issues were solved, the competitive problem would be harder to pin down.  The founding notion of LightSquared was a wholesale LTE framework that could let smaller operators obtain a national footprint without deploying national infrastructure.  Obviously this would mean wholesale use of device-to-satellite connectivity, and with data usage growing you have to wonder how many data connections a bird could handle.  Some of the more recent LightSquared options have been described as “hybrid” LTE-and-satellite and said to include fiber deployment, wholesaling capacity from Sprint, and so forth.  It’s never been clear to me exactly what LightSquared thought it would deploy, or how it thought the thinning margins of mobile broadband could possibly support a wholesale middleman.  Could this be hype, dare we say, in our hype-ridden market?  Try to dig out the details of the network for yourself; you’ll likely come away with more questions than answers.  That’s troubling considering how far this process has gotten.

There are other hype spaces, of course, and none are more hype-ridden than the cloud.  Yet according to the cloud pundits, quoted in the cloud media, there’s plenty of opportunity for network operators in the cloud space.  At one level, that’s a kind of useless assertion; if there’s any opportunity at all, there clearly has to be some for the guys who own the networks.  At another level, it’s false because the theme of the stories is the disruption of the IaaS model and incumbent Amazon.  That’s not an opportunity, it’s a trap.

Some operators, at least, believe this.  One told me that they don’t want to move from depending on one service with low and shrinking margins (broadband) to depending on a new one: the cloud.  Yes, the operators have a natural advantage in any market that’s a race to the bottom; they have scale and they also have a lower internal rate of return expectation.  But that doesn’t mean they have to eat sawdust if there’s caviar to be had.

Speaking of communications services, Cisco has decided to appeal the EU approval of the Microsoft acquisition of Skype, saying that there should have been more conditions set to guarantee interoperability.  The decision is unsurprising at one level; these days, everyone always says everyone else is either anti-competitive or infringing on some patent or another.  At another level it’s a thorny issue because the deal would likely accelerate a shift away from TDM overall (videoconferencing and voice), something that would clearly benefit Cisco.

The train has left the station on IP communications interoperability, in my view.  Regulators a century ago recognized that you can’t have a universal and efficient phone network if every town builds its own system using its own technology, and we got out in front of the problem with the notion of the Bell System.  Internet communications grew up in the age of deregulation, and so we’ve deliberately sacrificed standardization for innovation potential.  To impose a change in anyone’s protocol for anything at this point would hurt consumers more than risking a kind of monopoly-by-acclamation, where everyone flocks to one banner.  That’s a price we’ve paid willingly up to now; what, after all, is Google or Facebook if not that?  Standardization efforts on something as simple as IM have had little impact, I’d assert.  How much would video or telepresence standards really do?

The real issue here isn’t “standards” per se, it’s the approach to community-building in complex services.  That, as I’ve said before, is a problem of a federation strategy and not an interworking strategy.  You can’t interwork things that aren’t based on the same essential feature and client framework, and in future services the personalization options offered to users alone will eradicate any chance of consistency.  You have to compose interworking of elements into service definitions, not interconnect the services that result from those definitions.
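The composition-versus-interconnection distinction above can be sketched in a few lines.  Every name in this snippet is invented for illustration: the point is only that federation happens at the level of the feature elements services are composed from, so each side’s personalization layer never needs to be reconciled.

```python
# Toy sketch of federation-by-composition (all names hypothetical).
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Element:
    """A reusable service feature with an agreed, federated interface."""
    name: str

@dataclass
class ServiceDefinition:
    """A service composed from shared elements plus local personalization."""
    name: str
    elements: frozenset
    personalization: dict = field(default_factory=dict)

def can_federate(a, b):
    """Federation operates on the elements two definitions share; the
    personalization layers stay local and are never interworked."""
    return a.elements & b.elements

presence, video, im = Element("presence"), Element("video-session"), Element("instant-message")

ours = ServiceDefinition("our-collab", frozenset({presence, video, im}),
                         {"theme": "dark"})
theirs = ServiceDefinition("their-telepresence", frozenset({presence, video}),
                           {"layout": "grid"})

print(sorted(e.name for e in can_federate(ours, theirs)))
# → ['presence', 'video-session']
```

Trying instead to interwork the finished `ours` and `theirs` services would mean reconciling every personalization option on both sides, which is exactly the consistency the text argues will never exist.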

Apple is bringing communications features from its iOS appliances to the Mac, something that will increasingly be managed through iCloud.  Whatever Cisco’s chances of success with the EU are (slim, IMHO), there’s little or no chance that Apple won’t field its own vision of teleconferencing, communication, and collaboration.  The only high ground Cisco could seize here is the federation space, and that’s more a marketing/positioning stance than one before the bar.