Facing Some Broadband Reality

The latest political wisdom, arising from the results of the mid-terms, is that net neutrality as an issue hurt those who supported it.  That combines with the Democratic loss to create a loss of momentum—at least according to popular wisdom.  Actually, net neutrality never had any momentum.  Congress doesn’t like to intervene in telecom because problems there are too easily created and too hard to fix.  That’s why we have an FCC.  When there’s a problem with FCC authority, Congress is inclined to stand by and let the Commission do its best, as I think has been the intent here all along.  That the problem is being complicated by things like the Fox/Cablevision brouhaha only makes “becoming a tree” a more attractive option for our leaders on the Hill.  The FCC closed comments on this, and they’ll no doubt issue a Notice of Proposed Rulemaking at some point, but don’t expect magic.

One of the complications is illustrated in Australia, where the boundaries are blurring between the new NBN, a public access network intended to provide good broadband by bypassing the commercial process, and the national carrier, Telstra.  NBN now wants to get into inter-city transport, creating a backhaul network that would link metro broadband customers to the Internet by hauling them between cities to get to a big on-ramp.  At one level it’s not a serious issue; that type of traffic is often not particularly profitable.  On the other hand, it illustrates that when politics gets involved in broadband, the normal political tendency to build one’s own empire rolls over into the broadband space.  Does NBN’s head (Quigley, formerly of Alcatel-Lucent) see himself as the czar of Australian broadband, and perhaps even of networking?  Minister of Networks?  Aside from the fact that this sort of thing would devalue the investment of millions of Australians who bought into Telstra’s privatization with the encouragement of the government, it raises the question of how far NBN will really go, and how much taxpayers will have to kick in.

Worldwide, the tension between consumers/voters who want everything for nothing and businesses who want something for everything isn’t going to be resolved through government ownership.  If transport/connection isn’t profitable we need to figure out how to achieve public policy goals within the framework of networking today, because dismantling that framework at this point is simply not possible.  If broadband were a good business, VCs would be fighting over the carcass as we speak.  Instead, they’ve long since gotten out of Dodge.  Think about it.

Economic Update

It’s time for our weekly financial summary.  The Fed’s move in QE2 is being criticized internationally, though we think the criticism right now is pro forma.  Countries with large trading surpluses with the US have been upset by the US move to effectively devalue its currency, and that includes Germany, Brazil, and of course China.  We noted earlier that QE had the effect of manipulating exchange rates, though of course other countries could take steps to devalue their own currencies to compensate, or initiate their own QE programs.

The pressure that will be brought to bear on the US at the G20 meeting isn’t really expected to cause it to reverse its decision, or probably even to discourage further similar moves.  It’s all part of a global trend to try to gain advantage through exchange rates—the “currency war” that the ECB has been concerned about.  Emerging markets with large trade surpluses want a strong dollar, meaning one that favors their exports into US markets.  US manufacturers would benefit from a weaker dollar, though consumers might like a strong dollar because it would lower the prices for popular foreign-built items.  The real lesson of the G20 protest is that we’re not out of the woods on broader currency battles yet.

In the US, jobless claims fell sharply, largely in the retail sector as companies hired to fill sales slots in anticipation of the holiday season.  It would be better to see manufacturing jobs, of course, but retail gains are a sign that companies expect a better holiday season than last year.  That could draw down the inventory levels that manufacturing gains in the early part of this year had built up, which would then encourage further manufacturing growth in 2011.

Things are a bit spottier in Europe, where the improvements in the Greek debt problem aren’t yet spreading to other at-risk countries like Ireland, Portugal, and Spain.  I think that the US believes the Eurozone should be doing more to boost its own economy; that it’s relying on stronger US imports (boosted by a strong dollar relative to the Euro) to drag the rest of the world up.  The debt problem illustrates the challenges in that approach, because the individual countries in the zone are still sovereign and independent with respect to their own budgets and financial policies, and too much public stimulation that creates debt will threaten the weaker members further.  Germany, which is one of those opposing US actions while depending on exports to the US, could surely do more to help its Euro partners, but the decision would be politically impossible there, just as a decision to let the US economy stagnate would be indefensible here.

Every country, including the US, does pretty much what it thinks it has to in order to support its own interests.  The excesses this could create are mitigated largely by fears of starting a global trade/currency war.  What we have to look for now aren’t signs of business as usual, but signs that something new and bad might be happening.  So far, I don’t see it.

Tech Week: Broadband

Broadband continues to be in the news, both in terms of policy and in terms of business model.  The two, of course, should have some relation to each other, but it’s increasingly clear that’s not going to be the case in many world markets.  The US elections and the announcement by Clearwire that they’d be cutting staff illustrate the issue perfectly.

The loss of the House for the Democrats means that Republicans will now chair the House committee that oversees the FCC, and this has already stimulated predictions that net neutrality is dead in Congress.  Not true; it’s been dead in Congress all along.  We’ve never believed there was much of a chance that Congress would step up on the issue, and frankly we’d be just as happy they didn’t.  The reason we have a federal commission in charge of communications is that Congress isn’t likely to be able to address complex technology issues well.  We saw how complex the issues here are with the Fox/Cablevision standoff.

Fox cut access to some of its websites for Cablevision customers, at least for a time, and also cut off its TV programming as the companies battled over how much Cablevision would pay for carriage rights.  The FCC admitted it had no authority to act here, even though most would say that non-neutral behavior by content providers has the same effect as by access providers.  That means that meaningful net neutrality rules would have to come from Congress, and they’d have to break totally new ground.  That alone would be likely to send Congress into a tizzy; combine it with heavy industry lobbying revolving around wireless impacts and consumer desires to have everything both neutral and free, and you have a political minefield that there’s no current incentive to address.  The election, after all, is over.

The Clearwire dilemma shows that some attention is needed here, though.  Broadband services of any sort create a classical S-curve of cash flow, with “first cost” driving a provider far into the red as they build out infrastructure to credible levels and fund marketing campaigns.  The hope is that success will turn this around, but the problem in broadband is that even “success” looks a lot like failure when it’s time to add up the numbers.  There’s not much margin, and that means it takes a very long time to recover early costs.  Clearwire needs to go back to the well, and neutrality issues with wireless aren’t going to make it any easier to do that.
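To make the shape of that problem concrete, here is a toy cash-flow model in Python.  Every number in it (build cost, subscriber ramp, per-subscriber margin) is an illustrative assumption, not Clearwire’s actual economics; the point is only the shape of the curve—deep early losses and a long crawl back to break-even.

```python
# Toy model of the broadband cash-flow S-curve: heavy "first cost" up front,
# thin margins afterward.  All figures are assumptions for illustration.

def cumulative_cash_flow(years, build_cost_per_year, build_years,
                         subs_at_maturity, margin_per_sub_per_year):
    """Return cumulative cash flow at the end of each year."""
    flows = []
    total = 0.0
    subs = 0
    for year in range(1, years + 1):
        capex = build_cost_per_year if year <= build_years else 0.0
        # Subscribers ramp linearly toward the mature base.
        subs = min(subs_at_maturity,
                   subs + subs_at_maturity // (build_years * 2))
        total += subs * margin_per_sub_per_year - capex
        flows.append(total)
    return flows

# Assumed: $500M/year of build for 3 years, 2M subscribers at maturity,
# $120/year of contribution margin per subscriber.
flows = cumulative_cash_flow(12, 500e6, 3, 2_000_000, 120.0)
payback_year = next((y + 1 for y, f in enumerate(flows) if f >= 0), None)
```

Even with these generous assumptions the model stays in the red for the better part of a decade, which is why “success” in broadband can still look like failure on the books.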

The decision by T-Mobile to tout its HSPA+ offering as “4G” is another indicator.  Most operators would agree that HSPA+ is a less costly transition than a full migration to LTE, and if one accepted the 3GPP definition of 4G none of the current LTE offerings could qualify either.  But a marketing slogan may help sales, and sales could help turn that S-curve of cost around quicker.  Wireline services are already less than marginal in terms of profit, and wireless could easily move into negative territory as well.  We may see wireless capex slip in 2011 and beyond if we don’t get clarity on this issue.

The Handwriting on the Tablet

One of the prime areas of focus for tech recently has been the tablet space.  Tablets are far from new, and in fact some of the “new” models are more like reprises of earlier tablets in that they’re little more than a keyboard-less notebook.  The iPad, of course, created an alternative vision of a tablet as a kind of over-fed smartphone, a device that’s all display and designed to be a conduit of information to the user with a relatively sparse capability to move data the other way.  Some see tablets as consumer devices, and some like the model of enterprise use.  The vendors are struggling with which model to support; ViewSonic expects to offer both 7-inch and 10-inch form factors and both Android and Windows 7 (even dual-boot, so the rumor goes).

However the tablet goes, the big news will be the network and the impact of tablets on user behavior.  Movement to tablets on a large scale means movement to ubiquitous wireless, but we’ll need to look hard at just what “ubiquitous” means.  As I’ve noted, there’s an opportunity for hospitality-Fi networks to play an enormous role in future tablet networking.  I think wireless providers and equipment vendors realize that and are trying to figure out how to promote a truly compelling case for 3G/4G wireless versus WiFi.  The problem is that it’s going to be an uphill battle, because device vendors have everything to gain by pushing WiFi versions of their devices to get a larger near-term market share.

Behavior and mobility and devices all create an interdependence.  The consumer isn’t set on tablet use, wireless models, or behavioral patterns at this point.  That means that giving them support for a specific usage model can condition them to consume that model, whatever form that support may take.  An explosion in tablet competition could empower a host of competitors, create a hospitality-Fi wave, and erode the business model for 4G.  It could foster a different model for mobility that focuses on roaming data sessions between WiFi hotspots, independent of traditional mobility tools of the past, and of IMS.  It could even erode the operators’ positions in the service layer, because WiFi is traditionally an OTT framework tied to no operator in a technical sense.

Alcatel-Lucent, whose quarter showed some real promise for growth, seems to recognize that.  They announced a program with Eurozone provider KPN that demonstrates the exposure of provider network assets through Alcatel-Lucent’s Application Enablement and Open API program.  This is the first large-scale success of a provider API program to deliver premium network features up the stack to the service layer.  The application itself is still a bit simplistic, focusing again on QoS and bandwidth rather than on the more complex areas of identity, federation, CDNs, and application-service feature creation, but it’s a convincing demonstration that operators do have a path to monetize their underlying network assets either by offering high-level services that exploit them or by wholesaling them to somebody else.  This kind of capability may be critical if things like tablets and hospitality-Fi start to erode the traditional mobile opportunity.

What Now?

In some ways, we’ve witnessed a historic election.  The margin of victory in the House for Republicans hasn’t been seen since Roosevelt’s time, after all.  But in a very important way we saw nothing but business as usual.  For the last three elections, US voters have turned out the party in power.  We’re never happy with our leadership these days, and for good reason.  Congressional popularity has traditionally been below 25%, and the public sees Congress as the most contemptible branch of government.  But we elected them, after all.

The question now is how this will play for the US economy and for tech.  Specific party politics toward technology are, in my view, a minimal factor in this election because Congress isn’t any more likely now to pass tech-specific legislation like net neutrality than before; if anything, they’re less likely.  Tech will be swept along by the economic forces.

Where will those forces sweep us?  It’s too early to say because of the split within the Republican Party that the Tea Party activists represent.  We’ve had conservative groups grab headlines within the Republican ranks before; Gingrich’s over-reach in the ‘90s is a potential poster child for what might happen now.  Will activists compromise to pass legislation or will both parties do their usual political showboating?

On the positive side, Republicans are often seen by businesses themselves as being pro-business, and thus it’s possible that business may be more willing to invest and expand under a Republican House, but the House alone doesn’t change Congress, much less the government.  On the negative side, conservative activism on the budget could cripple any economic recovery.  Attempts to repeal health care and financial reform, both promised by at least some Tea Partiers, would create gridlock for no likely gain.  But mainstream Republican leadership knows that.  In short, we don’t know yet how this will fall.

The Fed today will likely announce steps for quantitative easing, the purchase of old assets with “new money” to boost the money supply.  The question is how they’ll do it, and whether it will work.  Ultimately consumers have to start believing again.  This election isn’t going to make that happen; it was a vote against something rather than a vote for it, and that’s been the case for the last three elections, as I’ve noted.  The moderate center of America doesn’t want left- or right-wing ideology, but the parties are controlled by ideologues.  We have no candidates of our own kind to vote for, and that’s plenty of reason to be in a funk, unemployment notwithstanding.

In December I expect I’ll be able to run the model on the economic and technology future and get some numbers; in the meantime it will be watchful waiting here at CIMI as with all of the rest of you.

Clouds and Chips

The IT world has provided us with a number of interesting developments this week, starting with a Google suit filed over a proposed Department of the Interior messaging system award to Microsoft.  Google feels that its own Apps could have been used for this, and that they should have been given the opportunity to demonstrate their compliance with federal security requirements and bid on the contract.  Thus, the lawsuit.

Some in the DoI have suggested to us that the problem with Apps is the same one that faces users of Google’s online competitors to Office: the features Google provides are a subset of those already in use rather than the full set.  What’s not totally clear is whether the missing features are actually used at DoI, but in some ways you have to be sympathetic with the department; how easily could they find out whether all or only some features were used?  The suit may thus be an important one for cloud services in general.  Many (probably most) cloud-based alternatives to popular installed software tools are functionally more limited than the stuff they’re intended to replace.  That’s also true of most open-source tools.  I’ve tried Google’s document tools and they won’t properly process either our spreadsheets or our presentations, and they create problems with some publication/paper styles as well.  Same for OpenOffice.  But there’s no question that you could do most of what I do in either Apps or OpenOffice if you started from scratch.

A ruling that cloud applications are acceptable when they offer relatively full functionality, even if they deliver it differently, or when they offer at least some way of doing what buyers actually do rather than everything they could do, would be a boost for the cloud.  Without that kind of ruling, it may be hard to promote the cloud version of many apps unless the cloud providers step up and fully duplicate capabilities.  Frankly, that’s what they should do.  You can’t sue your buyer into submission as a long-term business strategy.

The other interesting development is in the chip space, and the two vendors making the news were Intel and Oracle.  Intel abandoned a long practice of keeping its fab to itself by doing a deal with FPGA specialist Achronix for 22nm capacity.  The actual volume of fabrication here is small, but what may be interesting is that Achronix is perhaps the speed king of FPGAs, which are field-programmable chips that can be used for fast responses to market needs or for applications where volumes won’t justify a custom ASIC.  It’s not hard to see that such chips might be very valuable in the consumer device market, which could mean either that Intel may want a stake in Achronix later on, or that it may itself be thinking about getting into the consumer space on a larger scale.  Recall that Intel has its own mobile OS and that it’s often been said to have aspirations of being a player in the retail space.  What better one than devices?

Oracle’s move is if anything even more interesting; they’ve taken a stake in Mellanox, one of the key providers of chips for InfiniBand data center switches.  Mellanox has been a Sun partner and also supplies components for Oracle’s storage appliances, but as we’ve noted before, Oracle is the only data center player with no position in networking, and nowhere is that position more important than in the data center.  InfiniBand is a superior technology to at least the current generation of Ethernet in terms of latency and capacity, and were Oracle planning to build a big flat fabric for the data center, Mellanox would be a likely player in that decision.  It’s also interesting to note that the deal includes Mellanox supporting Solaris as one of its host OSs.  That suggests that Oracle may be planning to continue to field Solaris as an alternative to Linux.  We think that’s smart; Solaris has a good following, and for specialty applications like OLTP we think it’s the best OS out there.  Could Oracle be planning a major data center move?  It certainly could be.

Hopeful Economic Signs?

Economically speaking it would be hard to characterize last week as great, and yet it was better than expected and certainly better than many had feared.  The critical number, the 3Q GDP, came in above last quarter’s level, and that pretty much laid the double-dip recession theory to rest.  Far from showing wild swings of volatility, the stock market was remarkably stable, varying only about 250 points on the Dow between lowest and highest levels and closing only about 70 points lower.

This week, of course, the elections in the US will likely drown out any economic data released.  The campaign has been among the most bitter in memory, with negative ads souring virtually all of the voters polled.  Democrats hold a significant edge in voter registration, so it’s very likely that were turnout to be high they’d hold on to most if not all seats.  The challenge for them is that the party who wins a Presidential election in the US nearly always loses seats in Congress in the mid-terms.  The question is how many, whether it would be enough to give Republicans a chance at putting forward their own agenda, or whether Democrats would work with Republicans to support at least some sort of legislative progress.

Republican priorities, insofar as they’ve been hinted at, seem to be focused on show.  Repeal of the financial reforms or health care is next-to-impossible lacking veto-proof majorities in both House and Senate, and nobody is predicting that level of Republican win.  Democrats really haven’t articulated any substantive agenda either, in my view; likely they don’t think they’ll be in a position to promote one.  Thus, we can’t expect much but reactive politics no matter who wins.

For the economy in general, and for tech in particular, that might not be bad.  We need better financial reform than we got; hedge funds that only millionaires can invest in manipulate the markets, and their “bets against the market” are really bets against the average investor—and we know who’s been losing.  We needed better healthcare reform too.  We have the most expensive healthcare system of any industrial nation, and yet we aren’t anywhere close to the healthiest or longest-lived among them.  But neither of these areas is going to be fixed further, and so having at least a stable framework is better than being in a constant state of flux.  The economy will now likely slowly recover, but we do believe that restoration of “normal” employment levels may take five years—if it ever comes.  The US is shifting away from being a producer economy because productivity gains aren’t keeping more expensive US labor competitive with emerging economies.  As we’ve noted before, spending on IT since 2001 to enhance productivity has not kept pace with past history.  That has to change to increase jobs here.

An article (http://www.businessinsider.com/m2-velocity-suggests-a-stronger-q4-gdp-2010-10) has correctly noted that the M2 money supply trends can be correlated reasonably with economic conditions.  M2 is a broad measure of money supply, and when it sinks sharply it’s an indication of money being hoarded.  It did shrink during the downturn, and it’s now expanding again, which is a good sign.
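For readers who want to see what the linked article is actually measuring: M2 velocity is simply nominal GDP divided by the M2 stock, so rising velocity means dollars are turning over faster—the opposite of hoarding.  The figures below are rough, assumed quarterly values, used only to show the arithmetic.

```python
# M2 velocity = nominal GDP / M2 money stock.
# Both series below are illustrative assumptions, in billions of dollars.

gdp_nominal = [14300, 14400, 14600, 14750]  # assumed quarterly nominal GDP
m2_supply   = [ 8500,  8550,  8600,  8700]  # assumed quarterly M2 stock

velocity = [gdp / m2 for gdp, m2 in zip(gdp_nominal, m2_supply)]

# Rising velocity suggests money is circulating rather than being hoarded.
velocity_rising = velocity[-1] > velocity[0]
```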

We must point out, though, that our own chart on the downturn, which was published in our special report in the fall of 2008, illustrates that “wealth growth” in the broadest economic sense tends to create bubbles if it’s not accompanied by GDP growth.  We also note that neither wealth nor GDP growth correlates well with how consumers feel.  Yes, a big downturn will create a corresponding dive in sentiment, but often upturns in sentiment come during or after downturns in wealth/GDP.  The mindset of the consumer is more complicated than simple charts can show, and it’s going to be the consumer that gets us out of this eventually.

Tech, of course, is both directly and indirectly linked to an economic recovery.  Most companies will spend more when they make more, and that’s also true with households.  More succinctly, belief in future progress tends to fuel current spending.  We hope that the M2 upswing is an indication of feel-good behavior, and the fact that the 3Q GDP growth was fueled largely by consumer spending is a good sign.

Not Chicken Little Time…Yet!

This week saw what’s become the usual push and pull of supply- and demand-side issues, and perhaps a bit more than the usual confusion in the markets (financial, enterprise, and consumer) about the net outcome.  It’s not been the wild week of stock swings that could have happened had economic news been bad, but at the same time there wasn’t much that could be called a big upside of hope either.  In all—tepid probably says it best.

I’ve commented several times this week on broadband issues, many arising out of what’s increasingly clear are misleading or bad numbers about broadband deployment.  It’s not surprising that broadband would become a political football in this most political of all recent election years, but it’s bad for the industry because it’s pulling everyone’s eye off the real ball.  Despite continuous evidence that economic density is the most decisive factor in broadband market effectiveness, we continue to ignore it.  Despite the fact that there’s no clear indication that broadband has any societal value whatsoever, we continue to assert that it does.  A real plan, based on exploiting what we know and studying objectively that which we don’t know, could get the market moving.

Meanwhile, the mobile space is showing us the shape of the future.  4G is going to bring usage pricing to mobile, and it will leak back into 3G and into wireline eventually, at least in markets where economic density is low and access profits are likewise.  Smartphones are reported by one analyst firm to be creating a mobile market owned by the handset giants like Apple and Google and not the operators.  While that’s clearly an exaggeration, it’s true that smartphones are disintermediating operators in mobile just as the OTT players disintermediated them in wireline.  Operators fled wireline for mobile to escape low ROI.  If mobile gives them the same low ROI, can they then flee to telepathy or something?  Hardly likely; they’ll simply have to accept a tailing off of revenues, which means tailing off of capex.  Big telco Verizon and the cable industry overall both showed us that the Street will punish those who let capex rise as a percent of sales.

Enterprises have had their own challenges.  We’ve seen that spending on some hardware and software has been strong through the year, but that strength has been created in large part by the suppression of orderly upgrades of baseline IT infrastructure by the past economic crisis.  You can only catch up for so long; after that, growth will depend on exploiting new productivity paradigms, and the market hasn’t been very good at doing that since 2001.

I’m not playing Chicken Little here; the industry isn’t going to crash.  In fact, it’s likely that by 2012 it’s going to prosper, because any time demand overwhelms the insight of the sellers, there’s going to be a new crop of leaders created.  Incumbents in all areas of tech have gotten too comfortable with old paradigms, and new players are the ones agile enough to seize the opportunities.  Those “new players” aren’t likely to be startups, VCs having fled the equipment space to social networking and other areas with more potential for bubble-creation economics.  Instead they’ll be smaller vendors, often public companies.  Watch F5 and some of the deep-packet-inspection companies; they are looking to skim the networking cream.  In IT, watch Oracle; software has the most direct link to productivity and so software companies can transform to build new cost/benefit paradigms most easily.

Bad Numbers Mean Bad Decisions

Anyone who’s followed my writing knows that I’m no fan of the National Broadband Plan.  My main issue is with the data that’s been presented to back that plan, and some recent work I’ve been doing is making me even more skeptical—if that’s possible.

What started me off was a comment by a White House science type.  He said that he was sure that there were billions to be gained in productivity and jobs if broadband were more available, though he admitted he didn’t know exactly how those benefits were calculated or realized.  OK, I said, let’s then take a look at broadband versus economics and see if there’s a correlation.  The FCC has data that shows, by zipcode, how many broadband providers are available.  Other agencies provide household income data, also by zipcode.  Suppose we correlated the two?

If broadband availability is in fact an economic benefit, we should see some correlation between the number of providers and the household income of consumers.  We do, but it’s the wrong kind.  The data shows that the correlation is overwhelmingly in the reverse.  The areas with the most broadband providers available are the areas with the lowest household income.

Let me illustrate with an example from my own area.  Take two suburban communities in southern NJ.  One, a middle-to-upper-income community, has 11 providers according to the FCC.  The second, which is arguably the richest community in the area, has only 10.  Grab a random residential zipcode from across the river in Philadelphia, where the household income is a quarter that of the first community and a sixth that of the second, and you find it has 12 providers!

Now I’m not saying broadband is making people earn less, though in fact that’s a more supportable view given the data than the contrary assertion that it would help them earn more.  I’m not even saying that the urban poor have generally better broadband, the FCC’s rhetoric notwithstanding.  What I’m saying is that even a simple review of the data we’ve collected fails to support the popular views of the role and value of broadband Internet, or the views the FCC is presenting in its National Broadband Plan.

The data also seems to suggest that geographic factors like population density are by far the most significant forces in determining where broadband competition will develop.  Even in very poor zipcodes we see a lot of providers—more than in most of the richer ones.  Why would operators focus their efforts on places where household income is the lowest, if not because those places have population densities that overcome even four- or six-to-one income disparities?  That supports our long-standing point that demand density means everything.

We’re also concerned that the FCC’s data includes non-facility providers of broadband, which in our view distorts the picture considerably.  The only way to get broadband to the user is to deploy infrastructure.  Riding as a wholesaler on someone else’s infrastructure doesn’t create new options, only “new” providers.  In fact, there’s every reason to believe that multiplication of wholesale players might erode margins and further limit investment.  It certainly distorts the figures, and most people where I live couldn’t name more than two wireline and four wireless providers; that totals six and leaves at least four or five unaccounted for.  Who are these providers, one must wonder?

The biggest problem here is the lack of clarity of data, or the reliance on incomplete or just bad data—it’s hard to say which.  The FCC appears to have gathered a lot of information through third parties, and also appears to have muddled its own data collection.  As I noted, it’s hard to say whether this was ineptitude or deliberate.  What’s easy to say is that bad policies are inevitable if bad data fuels them.

A “European Approach” for Us All?

Speaking yesterday at BBWF, Alcatel-Lucent’s CMO Stephen Carter talked about the need for creating a “European approach” to 4G broadband.  Some of the specific points in the talk weren’t new; we need to move beyond all-you-can-eat pricing, we need to add some specific partnership and settlement processes, and we need to recognize the intrinsic differences in the major markets.  What is interesting to me is that all of this is coming to a head right now.  Why that is might be the most interesting thing of all.

The reason is mobile 4G services, and the fact that these services are being driven by smartphones and tablets and even e-readers—appliances.  Mobile disintermediation via appliances is a real risk, and 4G bandwidth levels mean that there is truly an opportunity to create a new model of the user’s relationship to the network.  The risk that this new model might end up being a reprise of the OTT-dominated wireline broadband market is very real now.  Further, 4G deployment offers operators a chance to reset the pricing and service relationships—to a point.  Operators either have to take the opportunity and level-set 4G differently, or they have to avoid 4G investment as being something unlikely to pay off for them in ROI terms.

Which of course is Alcatel-Lucent’s issue here.  Arguably companies like Alcatel-Lucent have been most successful in the wireless area, and an operator trend toward stagnation of wireless investment would be a major barrier to Alcatel-Lucent’s future profitability.  But the truth is that they aren’t the only one with a bet in the 4G game.  With the exception of Cisco, whose ambitions for revenue growth are spreading to markets adjacent to networking, every one of the major network vendors is a slave to wireless capex growth because wireline growth is not going to even sustain their current numbers.

What is clear to me is that everyone in the broadband game realizes that 4G is the watershed issue, the place where we either get control of network evolution in an economic sense or admit we can never control it.  In the latter case, it’s clear that we’ll see sharp capex declines beginning (according to our model) in 2012 as ROI pressure on operators constrains network investment.  In the former case, we could see the very thing Carter says we need—immersive broadband that touches all of us in all aspects of our lives, because it can profitably be made to do so.  It’s not a glamorous vision for the US market because we want to believe everything’s free.  It’s not simplistic like Cisco’s vision of driving infrastructure investment simply by forcing more traffic onto the network regardless of the ROI.  But it’s a true vision, and Alcatel-Lucent is perhaps best in all the industry in articulating it.

But can they deliver it?  The principles of Application Enablement are surely relevant to creating what Carter hopes for, but they’re not a sufficient condition as they stand.  There are too many holes in the story of the “European way” when the rubber meets the road.  Potholes are a bigger threat to ROI than the current disorder in a way, because without a clear path to invest everyone will hunker down and look ahead to when that path becomes clear.  That could hurt capex even earlier.  Four vendors (Alcatel-Lucent, Ericsson, Juniper, and NSN) have assets to build the kind of future Carter talks about, and not just for Europe.  Which one will come through?  We’ll likely know by spring.  Carter’s speech is proof that the issue is too acute to be ignored any longer.