Neutrality Order Text Released

I had a chance to review the full text of the FCC’s Net Neutrality Order (10-201 if you’re into the FCC’s numbering system) and there were no real surprises in the material versus the commentary that was provided in the public meeting.  I’m still concerned that the FCC hasn’t created a solid legal foundation for the order, which means that it would be at risk in an appeal.  There are plenty on both sides who say they might appeal the matter, but in truth it does take some financial resources to fund an appeal process if you’re earnest about getting results.  The FCC, as I’ve noted in the past, is holding open the docket on reclassifying broadband under Title II, perhaps to threaten the ISPs (who have the deep pockets to appeal).

Another tactic the text may reveal is the Commission's decision to address many of the key issues, like what constitutes traffic management, on a case-by-case basis rather than through meticulous detail in the order.  That means anyone who wants to dispute something will either have to file for a declaratory ruling or wait to get zapped by the FCC and then make their case—to the FCC or on appeal.

One thing that does seem clear from the text of the order is the policy of the FCC on pay-for-priority systems.  What I gather is that such systems would be fine with the FCC if they were initiated by the consumer, but not if they were sold to content providers like Google.   What seems to be shaping up here is that if the consumer has a choice to pay for priority handling and that choice is explicit, then it’s probably OK for the content provider to collect the money and pay on the user’s behalf.

The issue of content-provider-pays is a bit murkier.  As I noted above, the provider could probably act as a payment agent or intermediary.  The FCC also stopped short of saying that there were no conditions under which a provider could pay.  That suggests to me that they might allow the content provider to “pay” for premium handling in a bundled content service.  The key point here seems to be that if the consumer isn’t paying anything for content (free Hulu versus paid Hulu or Netflix) then the prioritization would clearly be paid by the content provider and would clearly be linked with “Internet content” as opposed to being a “specialized service”.

This is the area where appeals to the order seem the most likely.  An ISP that has no specific content strategy could well offer streaming video players an opportunity to obtain special handling for premium service, as well as offering the consumer subscription premium options.  If competitors with their own channelized TV offerings didn't like this (which they likely would not), they might file an appeal to the FCC.  Or the ISP that wanted to do the prioritizing might apply for a declaratory ruling.  In either case, if the FCC doesn't go along with the operator on the issue, there's always the Court of Appeals, and there the uncertainties begin.

Week in Review: December 23rd

At CES, Microsoft will confront a demon that’s been haunting it from the early ‘90s, and how it does that will likely have a major impact on the future of the company and of Windows and its ecosystem.  The demon is the GUI.

Most OSs can be visualized as a kernel and a shell, with the latter providing the human interface to the OS service set.  There are APIs in both the kernel and shell, and developers write software to these APIs.  Thus, the software is dependent on the features exposed by those APIs.  Since the shell/GUI APIs are specific to the GUI model the OS uses, the APIs depend on how the user-to-system interaction looks and works.  It’s all an ecosystem.
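The kernel/shell/API layering above can be sketched in a few lines.  Everything here is hypothetical (none of these class or method names correspond to a real Windows API); the sketch only illustrates why software written to one shell's API doesn't port to a shell with a different interaction model:

```python
# Hypothetical sketch of the kernel/shell layering described above.
# The names are invented for illustration only.

class Kernel:
    """OS services common to every shell (files, processes, memory)."""
    def read_file(self, path):
        return f"contents of {path}"

class DesktopShell:
    """A shell whose API assumes keyboard/mouse navigation."""
    def __init__(self, kernel):
        self.kernel = kernel
    def on_mouse_click(self, handler):
        self.handler = handler  # desktop apps bind to mouse events

class TouchShell:
    """A shell whose API assumes finger gestures instead."""
    def __init__(self, kernel):
        self.kernel = kernel
    def on_swipe(self, handler):
        self.handler = handler  # touch apps bind to gestures

kernel = Kernel()
# A desktop app binds to the desktop shell's API...
desktop_app_works = hasattr(DesktopShell(kernel), "on_mouse_click")
# ...but that API simply doesn't exist in the touch shell,
# even though the kernel underneath is identical.
ports_to_touch = hasattr(TouchShell(kernel), "on_mouse_click")
print(desktop_app_works, ports_to_touch)
```

The kernel is shared; it's the shell-level API dependency that breaks the ecosystem when the GUI model changes.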

The challenge this poses is that Windows has always had a GUI designed for keyboard/mouse use.  Touch-screen versions of Windows offer some relief from this link, but everyone realizes that the Windows GUI doesn’t scale down well to smartphones or even tablets because the navigation is too display-intensive to work when the display is very limited in real estate.  The GUI also doesn’t include handy finger-features for navigation that tablet or phone users would demand.

Tablets are the biggest thing in computing, and as I noted in our Annual Technology Forecast this month, they're the focus of the most significant technology issues and product sectors for the coming year.  But Windows 7 is Microsoft's success story, the thing that pulled the company back from the cliff edge where the earlier Vista release had left it hanging.  Yet Windows 7 isn't compatible at the GUI level with tablets; the navigation doesn't port.  That means the APIs don't port, which means the programs don't port.  So Microsoft has to either create a new OS for tablets with a tablet-specific GUI and let tablets then step on Windows 7's success, or try to make a Windows 7 GUI compatible with tablets and risk losing the critical tablet market completely.

If they do the former, they'll be following Apple's lead with iOS versus OS X, but they'll have no real software base to work with.  If they do the latter, they'll be risking the future of Microsoft.  Clearly they have to do something closer to the first choice, but how?  The obvious approach would be to keep the Windows 7 kernel and create a new GUI shell, then come up with new APIs that could map to either the older Windows GUI or the newer tablet GUI.  I've heard some rumors that this might be the track they take, but Redmond is a pretty close-mouthed shop, so we'll probably not know for sure until CES.

Economic news remains largely on track.  The Eurozone debt crisis isn’t getting any worse; Greece passed austerity measures and Irish courts approved a bank bailout.  In the US, economic data has been pretty much in line with expectations; a slight upward revision in the last quarter’s GDP, a dip in durable goods orders, and a small dip in unemployment claims.  Consumer spending and income levels were up in November, and October’s number for spending growth was revised upward.

In the carrier world, we had an interesting counterpoint to the net neutrality flap in a major Skype outage this morning.  The reason this is interesting is that it reflects the truth that network infrastructure investment at all levels has to be economically justified.  Could Skype have created a more bulletproof server hierarchy?  Sure, but it’s hard to justify a lot of spending to make a free service bulletproof.  The telephone network has never had an outage of that scale in its history.  I’m not criticizing Skype here; I’m just pointing out that investment in infrastructure depends on some mechanism to generate a return, and where that mechanism is limited so is the investment.  That’s the issue the FCC has to balance in net neutrality.

The sad thing about the whole debate over net neutrality is that virtually none of the debaters have any notion of how the Internet works, how regulatory processes work, or how business works, nor do they want to.  This is all about publicity and rhetoric, which means that there’s no contribution to be expected from these discussions in advancing the real needs of the market.

Internet and broadband policies are complicated for sure, but that’s no excuse for having useless debates based on extreme-at-best and wrong-at-worst positions.  We can’t expect good public policy from bad public participation, and the latter is inevitable without some understanding of the issues.

In the technology space, our deep analysis of our fall survey results seems to indicate that enterprises find vendors broadly at fault with respect to providing strategic guidance on either technology or its application to specific productivity problems.  A comprehensive look at the collaboration space—one of the “hot buttons” for a lot of vendors—shows that enterprises are having problems getting solutions to fit their requirements even as they’re getting those requirements stated more clearly, though they think Cisco with Quad might be approaching it.  The challenge, enterprises say, is that Cisco doesn’t focus its Quad story on general collaboration and then fit telepresence in but tends instead to focus on the telepresence.

I think this complaint should be the tagline for the last decade, an indication that vendors have let the NASDAQ crash and the bubble-related legislation induce them to pull in their activities and focus only on tactical sales issues.  If a strong strategic portfolio can't be valued by investors because it might be a bubble, then why have one?  Companies are responsible for boosting their stock prices.  But this year we're entering a critical phase for tech, one that demands looking ahead in a rational way and not just looking as far as your wallet.

I'll be in a period of reduced coverage over the holidays, simply because there's likely to be less news.  From CIMI Corporation, Happy Holidays!

Well, Neutrality is (sort of) Here!

The FCC’s neutrality vote went as expected, with commentary by various people involved in the process, including the Commissioners.  I found a lot that I agreed with, but I disagreed with at least some of what virtually everyone said.  It’s not a disappointing order, though I’m sure that most will characterize it that way.  The only thing that’s disappointing is that it doesn’t in my view address the issue of the FCC’s authority to act.  The loss of the previous neutrality doctrine was a result of the Court of Appeals having overturned that doctrine for lack of authority to act.  I don’t think the current order establishes a strong position, and certainly there will be no lack of players to appeal the order.

The FCC's position is pretty much as expected based on prior comments by the Commissioners.  The FCC will require that wireline broadband services be subject to handling rules that are transparent and non-discriminatory in terms of sites, devices, and traffic types.  For mobile services, the transparency rules are in force, but non-discrimination is weakened a bit to reflect the special nature of wireless.  For mobile, blocking of traffic that's competitive with the ISP's own service is prohibited, but other blocking for traffic management may be allowed if the need can be proved.  The "specialized services" that flow in parallel with the Internet will be reviewed, but nothing will bar either payment for priority or tiered pricing per se.

The jurisdiction issue here is going to seem trivial, but it’s really central.  The current move is based on Section 706 of the Telecom Act, which the FCC itself has never before said offered it any independent authority to make new broadband rules (the Court of Appeals pointed this out in the Comcast ruling).  Further, Section 706 applies explicitly to telecommunications services, and in 2005 the FCC said that Internet broadband was not such a service.  Commissioner Copps took the strong stance that a return to Title II regulation was the right approach.  I agree.  The FCC’s “third way” would have given the order absolute legal foundation and would not have subjected the Internet to being regulated like a telephone network.

But Copps also said that we needed wholesaling for competition, which I’m not sure is true, and that we needed equal regulation in mobile services, which I’m pretty well convinced is not true. The Republican Commissioners laid out objections that boil down to “no neutrality” or “let the kids play”.  “Nothing is broken in the Internet access market that needs fixing” is one of the comments.  I don’t agree with that either.  So what we had was a bunch of political comments about a decision that was likely about as strong as the realities of politics could have allowed it to be.  If we saw the rules enforced, they’d likely not hurt anything, would almost certainly prevent egregious behavior, and might even help.  I’m not sure they can be enforced, and that’s my problem.

It’s not clear they even need to be enforced.  One valid point raised by the opponents of the order was the fact that the FTC and DoJ anti-trust regulations would cover consumers against anti-competitive behavior by ISPs.  That’s likely true, and thus you could reasonably say that the FCC’s order could simply be another round in a long-standing battle between the FCC and FTC for control over the telco markets.

So the Democrats, with Copps speaking to the impassioned Internet supporters, say that much more regulation is needed to keep the evil ISPs from our door.  Baloney.  Two of the three Democratic Commissioners said they wanted even more neutrality control than the order provides, but went along with the deal because it was the best available.  The Republicans say that these rules will kill the Internet, kill investment, kill society (online at least) as we know it.  Baloney.  The FCC that gave us the four principles was led by Republican-appointed Commissioners.  Were they in favor of industry-killing then, and have now changed their minds?  A pox on all politicians, and sadly the FCC Commissioners are politicians despite the fact that they’re appointed and not elected.

Might the politicians in Congress now jump in?  Sure, and they might pass other legislation despite their record of not getting much done.  Both parties can block action of the other here, and the division of the Commissioners by party makes it pretty clear that both parties would block Congressional action they didn’t favor.  There are some who believe that the Congress will move to give the FCC specific authority to cover the order, mooting any appeals, but I don’t think that’s likely.  We’ll have to wait until a Court of Appeals rules here, if not the Supreme Court, before we’ll see any differences in broadband as a result of the order.

How different is the new broadband under the order, anyway?  Despite all the hype on both sides, it’s not very different at all.  Likely the biggest changes will be the drive toward more settlement and payment options, moving away both from the unlimited-usage pricing and the bill-and-keep models of the past.  But even these changes may be modest until some legal validation of the order is available.  Thus, don’t expect to see very much from this in the near term.

Oracle Clouds, Neutrality-Eve, and NSN’s Vision of Three

We're starting off what will likely (but you never know these days!) be a quiet week in the markets.  Top of the news is the announcement by Oracle that it will be supporting at least some of its PeopleSoft and JD Edwards applications on Amazon's EC2.  This seems a reversal for the company, which had initially appeared to reject the cloud model, and I think it's worth looking at for some hidden truths.

First, the revenue impact of the decision isn't significant on its face, because Oracle will treat EC2 virtual machines just like customer virtual machines; the same license terms and rules apply.  So what we're seeing here is a model that accepts infrastructure as a cloud service rather than one that promotes public cloud-based enterprise apps in SaaS form.  But why even do that?  I think Oracle is realizing that the hybrid cloud is its path to enterprise prominence, and in particular a path leading past HP in a competitive sense.

Another factor is that Oracle’s database appliances are selling strongly.  These appliances provide DBMS-as-a-service, and thus could make it much more practical to have a cloud application access an on-premises database with reasonable performance.  Thus you could argue that the hybrid cloud model is perfect to socialize Oracle’s appliances in a market that already seems to be catching on to their value.

The FCC will be releasing its net neutrality order tomorrow, though it's not fully baked at this point and might still be pulled from the agenda.  The order appears to be a curious mixture of logical application of neutrality and illogical legal foundation.  I've reviewed the Court of Appeals ruling in the Comcast case and it's hard for me to see how this dodges the legal issues the court has already raised.  The only avenue forward would be for the FCC to now assert (and justify) the view that Section 706 of the Telecom Act gave the FCC "new" powers to encourage broadband and not just a specific justification to exercise the powers it already had.  The FCC has taken the opposite position consistently.

Republicans in Congress are rattling their sabers, threatening to pass a bill that offers no funding for the FCC’s neutrality rules.  Apart from whether this is even legal, it’s pretty obvious that in the divisive political world of Washington it could never pass.  Similarly, it’s clear that neutrality legislation more aggressive than the FCC proposes (mandating no traffic management, no premium handling except for free, and full wireless regulation) wouldn’t pass either.  So whether either extreme is the right answer doesn’t matter.  What does is having a set of rules that will pass legal muster, and that’s where I’m concerned here.  The FCC’s “third way” was the right answer; it was clearly legal and it would have offered exactly what the situation needed.  Some of the Democratic Commissioners want it, and frankly I’d rather they held out.  I disagree that this order is better than no order—if it’s not enforceable then it is “no order”.

Economically, the EU sovereign debt problem is continuing to cloud things a bit, but even European stocks are up this morning and so are US futures.  I think that the only real question on the table has been whether Europe would let the EU sink rather than have the stronger countries guarantee the weaker ones.  That question appears to have been answered to the point where speculators aren’t quite as willing to play chicken.  The good news is that if the debt problem were to be put solidly at rest, Europe would likely start recovering faster.  That would be important because the bad news is that the austerity programs that would be demanded as a condition for loan guarantees to the weaker nations would certainly create social unrest, and possibly weaken the ties that bind them to the EU.  Would that hurt?  Truth be told, not much.  It’s doubtful that any of these nations could go it alone, and I think the voters there would draw back from the brink.  No question, though; better times would help a lot to ease tensions.

Alcatel-Lucent continues to showcase the developer side of its Application Enablement approach, including its Open API program that federates application services across multiple developers.  There is no question that the company has started to gain some traction in the market with this, but there is still a question in our mind regarding how quickly the program can adapt to market conditions.  The thing that made OTT players successful in the service layer is that they've dodged inertia.  They don't worry about standards beyond blowing a casual kiss here and there, and thus they can expose features via APIs very quickly.  If you wait for industry consensus on APIs, you're putting yourself at the tail end of a multi-year process and then claiming you're running at market speed.  I'd like to see Alcatel-Lucent open up more regarding how it will create features in Application Enablement and how quickly it can expose them using RESTful APIs.
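To make the "expose features via APIs very quickly" point concrete, here's a minimal sketch of REST-style feature exposure: a tiny in-process router where a new feature goes live with one function and one registration.  The route path and the "presence" feature are entirely hypothetical, not anything Alcatel-Lucent actually ships:

```python
# A toy sketch of fast REST-style feature exposure.
# All route names and features are invented for illustration.
import json

ROUTES = {}

def route(method, path):
    """Register a feature function under an HTTP-style method and path."""
    def register(fn):
        ROUTES[(method, path)] = fn
        return fn
    return register

@route("GET", "/features/presence")
def presence():
    # A new service-layer feature: one function, one decorator,
    # no standards-body cycle in the middle.
    return {"user": "alice", "status": "available"}

def dispatch(method, path):
    """Resolve a request to a feature and return (status, JSON body)."""
    fn = ROUTES.get((method, path))
    if fn is None:
        return 404, json.dumps({"error": "not found"})
    return 200, json.dumps(fn())

status, body = dispatch("GET", "/features/presence")
print(status, body)
```

The point of the sketch is the turnaround time: the OTT model adds a feature by adding a route, while a consensus-driven model waits until the API itself is agreed.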

NSN's CEO has recently suggested that the telecom sector will consolidate with only three major players remaining: Ericsson, Huawei, and NSN.  I think that vision of the future is a tad self-serving in terms of the players, but it is very clear that somewhere around three players is what we could expect if the industry can't find better feature differentiation.  If Alcatel-Lucent wants to make the cut here, they definitely need to make Application Enablement work, and it's frustrating to me how close they are to that, and yet how far.  But it's not atypical of service-layer strategies in the big vendor space.  Nobody has it right there.

One might ask where this consolidation would leave Cisco and Juniper, the other big players in the IP layer at least.  I think that’s another area where the NSN comments oversimplify.  We’re seeing, in operator trends toward “procurement zones” for buying, an attempt to create a market where a single giant with a full product line can’t dominate everything.  The operators would like innovation, particularly with respect to the service layer, and they can’t get it by having everything collapse into a single giant commoditized space.  But if the specialty guys like Cisco and Juniper can’t make a case in the service layer, then they can’t defend their narrower position in a commoditizing market.  Thus, we could see the NSN “vision of three” being right, even if the three turn out to be different from what NSN expects.

Economics and Profits

Even as economic conditions worldwide appear to be improving at the macro level, there are renewed pressures on the Eurozone sovereign debt issue, and concerns that managing a global shift from stimulus to the control of debt and inflation will be challenging.  Bond ratings for Ireland sank, and there may be further revisions for Greece, and even for Spain and Portugal.  Some financial experts think that the "edge countries" in the EU will all require support as sluggish economic growth and relatively expensive social programs create a gap that only borrowing and austerity can fill.

One proposal now gaining strength is the issuing of a Eurozone bond that would be used to fund a large rescue fund, essentially transferring the faith and credit of all the major (and more successful) EU economies to the debt of the weaker members.  This won’t mean that austerity programs won’t kick in where the funds are transferred, though, and as a result the measure will trade the tension of disparate debt ratings for a new tension in disparate quality of life, something labor in the impacted countries is already protesting.  But a debt crisis would produce a lifestyle crisis too, so the choice is the latter alone, or both.

In the US, we had a rare show of partisan cooperation with the passage of the tax bill, a bill that includes a Social Security payroll tax reduction of 2% for next year that is a form of stimulus and also an extension of unemployment benefits.  The general view is that this will keep the US economy on track in 2011, and that’s what our model says.  It probably adds about 0.2% to GDP growth for next year, and may reduce unemployment by a half-percent according to our own numbers.

Another significant event in Congress is the fact that the behemoth spending bill that was prepared to fund the government has been pulled in favor of interim funding because it cannot be passed over Republican opposition.  The problem here, at least on the surface, is that the bill contains billions for projects of questionable value, and likely millions in special earmarks that were a specific target of Tea Party activists who were elected to the next Congress.  Some kind of reform of the bloated federal budget process may be forthcoming, which couldn’t hurt.  It may also be a sign that Congress is going to work harder to be bi-partisan in 2011 and beyond.

Alcatel-Lucent may be looking to change video collaboration, announcing that Bell Labs and the Belgian research institute IBBT will collaborate on applications to "bring a new dimension to video communications".  The scope of the work appears to range from things likely useful in the near term (like video content analysis and management of each user's view of a conference) to things like immersive panoramic experiences, ultra-high-def, and even 3D, which we think may simply be going too far to be relevant to people who don't want to be on camera when they're feeling ugly.  Our research has long shown that a better and more socially linked collaborative dynamic would be highly valuable, and in fact might kick off a wave of productivity-based IT investment that would restart an industry stalled in underperformance relative to its glorious past.  The question is whether the research process will deal with the real and current market issues; the future of 3D telepresence is still a bit off, I think.

Oracle is clearly not off at all.  Their revenues were up 47%, in large part on strong sales of Sun hardware.  Pipeline deals for the Exadata servers were about $2 billion.  Clearly Oracle is a Big Player now, and clearly they're a special threat to HP, at whom Ellison took a shot during their call.  HP's weakness is software in our view, which is Oracle's strength, and there is very good reason to believe that Oracle's special strength in middleware is the secret sauce for the company's diet of competitors in the data center.  IBM matches Oracle's credentials here, but Oracle poses a threat to everyone else's data center plans, including Cisco's.  The Cisco comment raises the key point, one I've been raising with respect to Oracle for a year now.  What will they do in networking?  If they want to be a full-scale data center player, they need a network strategy.

A New NSN?

There are renewed stories that NSN is looking to sell about a third of itself to a private equity consortium.  The stories aren’t indicating at this point how the share would be divided among the buyers, nor where it would come from in terms of Nokia and Siemens.  It’s a classic good news versus bad news item no matter how it divides, though.

The good news side of this is that nobody buys something that's worthless.  NSN does in fact have strong assets, and certainly those assets could be leveraged to produce a good return on any private equity investment.  The bad news is that if you've got good assets that could produce a good ROI, why aren't they already producing one for you, unless you're messing up?  Clearly neither Nokia nor Siemens would be looking to sell off a stellar activity.

But there are reports that the "managed services" space that Alcatel-Lucent, Ericsson, and NSN all crave a share of is expanding; Ericsson won a 3 Italia deal to revamp their IT processes.  Not exactly a giant deal, and in any case it isn't a broad endorsement of an outsource-based service-layer strategy.  Operators tell us that they're happy to outsource stuff that's a cost center, that has no direct competitive impact, and that depends on skills they don't have and don't want to develop.  They're less sanguine about outsourcing what makes them profitable.  I think the question here is whether the private equity guys are drinking the PR Kool-Aid on managed services or whether they see that changes need to be made in NSN's service-layer positioning and are confident they can make them.

We said in our 2009 analysis of vendors that NSN needed to sing prettier at the strategy level to create service-layer-strategic traction with buyers.  We also said that such traction would be increasingly critical to success and to sustaining margins at lower layers in the network.  The problem is that our surveys have shown that NSN lost credibility in the period since that analysis.  While their worst dip was from the fall of 2009 to the spring of 2010, they’ve gained little ground between spring and fall, and in some key areas (like the radio network in mobile infrastructure) they actually lost slightly.  There’s absolutely nothing wrong with their product line or their technical skills here—their problem is purely marketing/positioning.

That's the centerpiece of the dilemma that confronts any organization that buys a piece of NSN.  You can believe that managed services tides will lift all boats, including NSN, and that you see this great truth even though neither of the current partners does.  Or you can believe that the problem of the service layer can be solved for NSN by singing their song more effectively.  Given that, I'd be looking at creating an NSN choir if I were senior management there!  Otherwise a deal could go sour simply by having the current NSN trends continue in the face of a newly aggressive position by one of the competitors.

Plucking the Differentiation Fruit

Enterprises are pushing through a set of complex political and project dynamics in 2011 according to our surveys.  The changes and their motivations offer us an interesting view on the cross-currents that really define what enterprises buy and how they buy it.  Thus, they offer a vision of what we could expect in terms of competitive dynamics for the balance of this decade, at least.

Over the past couple of decades, spending on IT and networking has oscillated between modernization-driven and benefit-driven.  In rough terms, the former is reflected in the “budgets” for IT spending that are assigned to the IT organizations themselves.  The latter represents special off-budget activity that carries an IT cost component but generally is justified by an operations benefit case.  Over the years, there’s been increased pressure on the budget side, pressure to deliver more applications at a lower overall cost.  It’s this pressure that has created things like server consolidation and its successor concepts.  Over the years, this pressure has been relieved to a degree by the growth in the mission of IT, its expansion to new operations areas.  That pushes up spending and increases the role of IT within the company.

The relationship between IT spending overall and budget spending has proved to be a fairly reliable indicator of whether IT is in an expanding or a consolidating mode in the market overall.  Expanding IT means that feature differentiation is easier, because new missions are not yet committed to current vendors and not necessarily supported by current features.  Consolidating IT tends to empower incumbents and makes TCO the only strategy to argue.  In the last 50 years, the thing that drove the expanding/consolidating cycle was the advent of new productivity-augmenting IT paradigms.  We've had cyclical budget behavior based on that for nearly all of IT history, until 2002.

Enterprises have been following a path toward a different paradigm of computing, but their route has been complicated by the fact that like most consolidation measures it has a short focus.  You can get from NY to LA by making a decision at every intersection based on local conditions, but it’s not likely to be a happy (or short) journey.  Server explosions created in the heyday of falling server costs were stemmed by server and data center consolidation.  That’s because support costs were now higher than capital equipment costs.  Now, we’re seeing consolidation in the form of static VM assignment to applications giving way to virtualized resource pools, and enlightened enterprises see these as giving way to private and hybrid clouds.
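The economics behind that progression from static VM assignment to pooled resources can be shown with a toy calculation.  The loads and the first-fit packing here are assumptions for illustration, not survey data:

```python
# A toy illustration (assumed numbers, greedy first-fit packing) of why
# pooled virtual resources beat static one-VM-host-per-application assignment.

app_loads = [0.30, 0.20, 0.25, 0.10, 0.40, 0.15]  # fraction of a server each app needs

# Static assignment: each application gets its own dedicated server.
static_servers = len(app_loads)

def first_fit(loads, capacity=1.0):
    """Pack loads onto shared servers, first-fit decreasing; return server count."""
    servers = []  # remaining capacity per server
    for load in sorted(loads, reverse=True):
        for i, free in enumerate(servers):
            if load <= free:
                servers[i] -= load  # reuse an existing server's spare capacity
                break
        else:
            servers.append(capacity - load)  # no room anywhere; add a server
    return len(servers)

pooled_servers = first_fit(app_loads)
print(static_servers, pooled_servers)  # pooled needs far fewer servers
```

The gap between those two numbers is the consolidation benefit; the cloud step then adds the ability to rent the peak rather than own it.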

I think that most of us realize that if you follow a path long enough you get a sense of its destination even if you didn’t have that from the first.  I think most would agree that when that sense of destination is achieved, progress along the path is faster because it’s backed by greater confidence.  So it is here.  But the question for the “market” is how this greater confidence and speed of progress might impact the sales of IT components, and the progress of IT evolution.

The number of IT executives who realize that something more profound than "modernization" is occurring (and is required) has grown significantly in the last year.  We're creating a new architecture for IT by trying to make IT less expensive and more easily supported.  While those aims are tactical, the same changes in IT paradigm could empower new mechanisms to improve the IT/operations link, and business productivity, in the future.

Any time a new paradigm is on the rise, differentiation opportunity can also be expected to be higher.  The consolidation-project differentiation apples may not be as easy to pick as those on the productivity-differentiation side, but they're just as sweet.  For all vendors, they represent the space that has to be attacked to gain market share in 2011, and the space that must be defended to sustain it.  There has probably never been a year in recent IT history when the balancing of strategic and tactical demands focused on the same issue set.  Next year will be one.

Everything Changed Will Change Again

In the last week we’ve seen web attacks, password and private data theft—in all, a lot of things that raise the fair question of whether the Internet is becoming the Wild West.  It has been for some time, of course; what’s happening online now is simply a continuation of a set of problems that the Internet community refuses to solve and that the governments of the world are unwilling to confront.

Ultimately we’ll have to deal with the issues of security and privacy that the Internet is presenting us, even though those issues have become more divisive for our having delayed so long to address them.  The question is how much worse the problems will have to get before the public marshals support for change.  We are, I think, only a couple of years of neglect away from doing real harm to the basic principles of the Internet—the openness and the lack of a tie to a specific business model.  It would be tragically ironic if we lost those benefits largely because of the unenlightened way we’re pursuing them.

We may see some changes at least in the US in 2011, and there are also signs that Europe may be taking some steps.  I could offer as proof all manner of arcane regulatory comments and trends, but more convincing is the sudden decision by Google to be much more accommodating to the telcos, and even to ally itself with Verizon in proposals for neutrality.  Google is also obviously planning its own transition to a broader business model than advertising, recognizing that only paid services can expand its total addressable market fast enough to sustain its stock price.  Google knows that it’s one thing to offer free best-efforts delivery of content and another to offer paid delivery—in the latter case you’ll have to provide some assurance things will work, but more significantly you’ll have to share the revenue.

Speaking of changes, it’s interesting to see that the Comcast vision of the future of video seems to be emerging.  First, Comcast has forced Level 3 to pay more to enable delivery of Netflix to Comcast’s customers.  Second, Comcast has been running an experiment in socially-linked video as a means of further differentiating its TV Everywhere online offerings.  But just as the biggest proof point for regulatory changes was indirect via Google, the biggest proof point for a Comcast change may be Verizon’s Seidenberg and his comments on a future Verizon model.  Verizon seems to be saying that they’re prepared to be much more “granular” in their video offerings and in their broadband pricing.  On the surface that would seem to be undermining their own FiOS model, but what it’s really doing is exploiting the fact that OTT competition in video hurts the down-market competitors more than up-market Verizon.  If Comcast is one of those, then Comcast has to embrace a bit of the technology of cord-cutting to avoid losing to the business model the technology represents.

TV is changing, but I wonder if the changes are as radical as some say; certainly my own research doesn’t bear that out.  Does the average household watch 13 hours of TV per week as one study shows?  I don’t know of any typical household where that would be true, do you?  It is true that people are spending more time online.  It is true that online time is pulling some viewers away from TV, but so far as I can tell this is what I’ll call “settle-for” viewers.  There’s nothing on they like.  They used to settle for something they sort-of-liked, but now they check Facebook instead.  That’s not destructive to TV viewing; wait till they skip their favorite shows to do something online before you start to worry.  Thus, Comcast’s experiments with “social viewing” may be at least on one potentially valuable path.  We’ll probably see many more experiments like that in 2011.  Meanwhile, reports that things like Netflix are going to kill channelized video are, to quote Time Warner’s CEO, like thinking “the Albanian Army is going to take over the world”.  The establishment has time to work some magic for sure.

Microsoft is also trying to change, and according to the latest rumors from the WSJ it will be launching not only a new line of tablets at CES but also a preview of Windows 8.  The challenge for Microsoft in the tablet space is formidable because tablets are seen today as a kind of fat smartphone without voice rather than as a laptop without a keyboard.  Any tablet win that promotes that simple model is a loss for Microsoft.  It’s not a big player in the smartphone space, it’s not a recognized consumer cloud powerhouse, and a tablet strategy would almost have to be synthesized from both of these fundamental elements.

But why Windows 8?  The problem that’s been reported is that Windows 7 is too gadget-intense for a tablet GUI where real estate is limited; the buttons become too small to manage.  Some have pointed to the fact that when netbooks with Win 7 appeared, they often ran into trouble with applications whose window-sizing strategy assumed a specific display form factor, cutting off the bottom of menus and other windows on a netbook screen.  But just having a new version of Windows doesn’t establish a new GUI; developers would still have to embrace the change, and Microsoft would be breaking the momentum of Windows 7 at a time when that momentum may be critical.  How long would it be before users realized Microsoft was going to churn OSs every couple of years, and jumped ship to the thin-client-and-cloud approach?  Which would take us back to the tablet as the ultimate thin client.
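The netbook problem described above comes down to a missing defensive step: the application should query the actual display size at startup and clamp its windows to it, rather than assume a design-time form factor.  Here's a minimal sketch of that pattern; the design dimensions and function name are illustrative, not any Windows API.

```python
# Illustrative sketch of defensive window sizing (not a real Windows API):
# clamp the design-time window dimensions to the actual display instead of
# assuming a fixed form factor such as 1024x768.

DESIGN_WIDTH, DESIGN_HEIGHT = 1024, 768  # assumed design-time form factor

def fit_window(screen_w, screen_h):
    """Return a window size clamped to the actual screen.

    Apps that skip this step render at 1024x768 even on a 1024x600
    netbook panel, which is how menus and dialogs lose their bottoms.
    """
    return min(DESIGN_WIDTH, screen_w), min(DESIGN_HEIGHT, screen_h)

# On a 1024x600 netbook panel the 768-pixel design height gets clamped:
print(fit_window(1024, 600))   # (1024, 600)
```

On a full-sized display the function is a no-op, which is why developers who never tested on small panels didn't notice the assumption they'd baked in.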

Everything is circular, I guess.

Leading Up to a Critical Decision

The holiday season is always dominated by consumerism, but it should be pretty clear to everyone that networking itself is increasingly dominated by the consumer.  I think we’re headed very quickly for a time when the consumer essentially funds all public networking and so creates its design paradigms and economic trade-offs.  Along the way, though, we’re facing some potentially significant hurdles and shifts in course.

The Internet has already made public IP infrastructure the basis for public networks, though of course that infrastructure tends to be less homogeneous than many see.  Ethernet is a smarter edge strategy, for example, because most consumer services will haul traffic to either a metro off-ramp or a metro cache/server farm.  You don’t need a lot of connectivity to get to one place.  Still, the Internet has won IP a victory at the service layer, where the IP address space is the only framework we could expect to see in the network of the future.

This month, we’re heading to a kind of financial watershed with public network services.  There’s been a surge of growth in online services funded by advertising, but advertising represents only a fraction of the money needed to fund a public network, and recent legal disputes (on Interclick’s history-tracking, for example) show that advertising-related sites are pushing the limits of public and judicial tolerance in a quest to tie up those limited dollars.  Ultimately people have to pay for stuff to fund a three-trillion-dollar-worldwide industry like networking.  The FCC is likely to set the boundaries of where pay works and where it doesn’t in its December 21st order on net neutrality.  But whatever they do, there’s no turning away from the fact that advertising isn’t ever going to fund the public network, so something else has to.

Consumers would love a free Internet, just like they’d like free automobiles, homes, or cheese.  That doesn’t make the concept practical, even in a political climate where give-aways are the rule and not the exception.  We’ve taken free-ness about as far as we can at this point; even Google I think understands that it has to move from being totally ad-driven to having some set of for-pay products and services.

What the FCC’s order will do is establish the legal framework for an Internet that’s cooperative in a broader way than at the pure connectivity level.  What’s needed is the same today as it was back in the mid-90s when I participated in an attempt to bring financial order to the Internet by creating a formalized mechanism for peering and settlement that included QoS.  We have the technical means to do what’s necessary, but we don’t have regulatory air cover.  The question now is whether we can get it.

Genachowski’s attitude on net neutrality appears to have undergone a transformation, and at the same time the Comcast/Level 3 settlement seems to open the door for settlement between content providers and CDNs and access providers.  Any settlement at all here would be better than what we have, but Comcast/L3 doesn’t go far enough.  It comes down to a question of whether the relationship is “peering” or “transit”, and neither concept goes far enough because both are simply different ways of viewing the permitted traffic balance.  There’s still no QoS-based settlement, and without that the Internet can’t provide pan-provider quality of experience.  If we can’t settle QoS-based relationships across ISP boundaries, there will be a transformation of investment and of Internet architecture: the limitation favors investment in caching over interconnection, and favors larger and larger players, creating fewer and fewer inter-provider boundaries.  We may start to hear some details of the forthcoming order leaked this week, in advance of the meeting.  Pay attention; it could be critical.
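The peering/transit/QoS distinction can be made concrete with a toy settlement model.  Everything here is an illustrative assumption—the rates, the tolerance ratio, and the class names are invented for the sketch, not drawn from any ISP's terms or from the order.  The point is structural: peering and transit settle only on traffic volume and balance, while a QoS-based settlement also prices traffic by its handling class, which is what would let a priority guarantee be compensated across a provider boundary.

```python
# Toy inter-provider settlement models; all rates and classes are hypothetical.
RATE_PER_GB = {"best_effort": 0.0, "priority": 0.02, "premium": 0.05}

def transit_fee(gb_sent, rate_per_gb=0.01):
    """Transit: the customer pays the provider for every GB carried."""
    return gb_sent * rate_per_gb

def peering_fee(gb_a_to_b, gb_b_to_a, imbalance_rate=0.01, tolerance=1.25):
    """Peering: settlement-free while traffic is roughly balanced; the
    heavier sender pays only on the excess once the ratio exceeds tolerance."""
    heavy, light = max(gb_a_to_b, gb_b_to_a), min(gb_a_to_b, gb_b_to_a)
    if light == 0 or heavy / light <= tolerance:
        return 0.0
    return (heavy - light * tolerance) * imbalance_rate

def qos_settlement(gb_by_class):
    """QoS-based settlement: payment follows the handling class, so premium
    delivery is compensated even when raw traffic volumes are balanced."""
    return sum(RATE_PER_GB[cls] * gb for cls, gb in gb_by_class.items())
```

Notice that `qos_settlement({"best_effort": 1000, "priority": 100})` yields a payment while `peering_fee(1100, 1100)` yields nothing: under pure peering the priority traffic rides free, which is exactly the gap in today's inter-provider arrangements the paragraph above describes.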