FTTH and Jobs, NSN and Opportunity

One of the recurring claims in the modern world of networking is that broadband and the Internet create jobs, improve overall health, education, etc.  The latest manifestation of that focuses on FTTH, and it was a recurring theme at BBWF.  The question is important because broadband in many areas depends on public-policy subsidization, and without some clear public benefit that kind of thing is increasingly difficult to promote.  So what’s the truth?

Let’s start with a lesson in causality versus correlation.  I heard this morning that kids who go to bed and get up early are less likely to be obese.  The statistical relationship here is clear, but it’s not clear that there’s any causal effect, which means that making your kid go to bed and get up early isn’t likely to make them less obese.  Sleep habits and obesity may share a common cause rather than being causally linked themselves.
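To make the point concrete, here’s a toy simulation (purely hypothetical numbers and variable names) in which a hidden common cause produces a strong bedtime/obesity correlation with no causal link at all:

```python
import random

random.seed(42)

# Hypothetical illustration: a hidden confounder ("household habits")
# drives both late bedtimes and obesity risk, producing a correlation
# with no direct causal link between the two observed variables.
def simulate_child():
    habits = random.random()          # hidden confounder: 0 = structured, 1 = not
    late_bedtime = habits > 0.5       # caused by habits
    obese = random.random() < habits  # risk also caused by habits
    return late_bedtime, obese

kids = [simulate_child() for _ in range(10_000)]
rate_late = sum(o for l, o in kids if l) / sum(1 for l, o in kids if l)
rate_early = sum(o for l, o in kids if not l) / sum(1 for l, o in kids if not l)
print(f"obesity rate, late-bedtime kids:  {rate_late:.2f}")
print(f"obesity rate, early-bedtime kids: {rate_early:.2f}")
# The late-bedtime group shows a much higher rate, yet forcing an early
# bedtime changes nothing: the hidden habits variable is untouched.
```

Intervening on the bedtime variable in this model leaves the obesity risk exactly where it was, which is the whole point: correlation in the data, no causation in the mechanism.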

A lot of the studies I’ve seen on broadband and FTTH have that same issue.  Somebody will point out that homes with high-speed broadband have higher incomes, or have kids with better College Board scores.  The latest point is that households with FTTH are more likely to start a home business.  All of this to prove that broadband is good for us.  It may well be, but these stories don’t prove it.

People with high incomes are more likely to have broadband, and to have fast broadband, because they have more money to spend.  Their kids are better educated because the parents are, and can afford to send them.  People with FTTH likely live in high-income suburbs because that’s who’s targeted by ISPs for FTTH; you need a high ROI.  Those same people, with better skills and education, are more likely to start their own businesses.

Digging through data on broadband deployment and income from the FCC, the BEA, the Census Bureau, and others, I cannot find any correlation between any surveyed aspect of broadband and growth in jobs, an improvement in skills, or a brighter or better-educated set of children.  I believe from my own Internet use that for those inclined to use the information resources available online, the Internet and broadband are enormously powerful tools.  But I also believe that the same data in a public library was, and is, a powerful means of gaining insight and knowledge.  Availability doesn’t imply consumption, though.  I’ve seen dozens of cases, including this debate, where available online information was either ignored or unknown to the researchers, yet it was there.

The great majority of broadband is used for entertainment, and that’s the truth.  That’s where the capacity goes, why the service is purchased.  People watch stupid pet tricks, chat with their friends or look at social-network updates.  They are, for the most part, no more likely to learn new skills or start new businesses based on broadband than they were based on the availability of libraries.

But that doesn’t mean that you can’t subsidize it.  We don’t make 911 calls most of the time, and yet we have universal service rules because everyone might need to make one.  And entertainment isn’t something for the rich or the geographically fortunate only.  That’s what bothers me here.  It’s not enough that we get an answer we want, or even the right answer; we have to get the answer in the form we want it.  We’d rather base our decisions on something that’s certainly not provable and likely isn’t even true than simply accept reality.

Accepting reality may be something Nokia and Siemens are finally prepared to do with NSN.  The two parent companies are both kicking in new capital, refreshing the leadership suite, and generally working harder to make the joint venture a success.  That’s good because NSN is the most up-and-coming of all of the non-Asian telecom equipment players.  It’s a company that has seen its strategic influence rise noticeably over the last year, that has improved its positioning and articulation, and that offers its parents (frankly) better growth prospects than the parents themselves could hope for.

Huawei Steps Up Its Game

Sometimes, often in fact, really important events go unrecognized when they happen but loom large in retrospect.  I think we have one this week, and it’s Huawei’s U2Net vision.  Yes, to an extent this is a marketecture, but it’s a marketecture from a company that is, first, a skyrocket in terms of strategic influence and, second, a company whose messaging hasn’t always been equal to its capabilities.

U2Net means “Ubiquitous Ultra-Broadband Network”, and it includes an edge-to-core vision of a network that’s designed to support the enormous elasticity in connectivity and traffic that will characterize the future needs of consumer broadband.  In itself, you may think that’s hardly new; Alcatel-Lucent has had its “High-Leverage Network” for some time and NSN has just been promoting its “Liquid” concept of elasticity and flexibility.  But what U2Net does is mark a transition for Huawei, a transition from point-product competition on price to systemic competition on vision.

I’ve noted before that Huawei has come a long way, both in terms of objective capabilities and in terms of customer perception.  In emerging markets in particular, they have created teams that stand with the very best in the industry and that offer not only attractive pricing but strong support and insightful commentary.  In fact, operators in emerging markets rate Huawei sales as number one or number two in quality of insight in nearly three-quarters of all the places we survey.  That is a big jump from a year ago.

The reason this is all important for the market and not just (obviously) for Huawei is that issues and features are the only defense against price-based competition, and you lose them when you lose the issues to the price leader.  It’s very obvious that within a year, Huawei will be able to articulate its story at the strategy level with at least the refinement of the market leaders in strategic influence.  Alcatel-Lucent has held the top spot in that particular race for the last five years, but we think that they’ll be nearly tied with Huawei in our fall results and that Huawei will pull ahead in some areas in the spring of next year at the current pace of advance.

What has happened here is that all the major network players have wasted five years, pure and simple.  With buyers demanding more support for their monetization goals, the classic vendors have simply ignored the pressure and pushed boxes, and box-pushing cedes strategic differentiation because it cedes any sense of context in the sale.  There’s no such thing as a strategic box, and box-level features are hard to argue when networking is a cooperative ecosystem.

We’re also seeing signs of enterprise movement for Huawei.  While they’re not in our survey results there at any statistically significant level yet, they have appeared for the first time outside of China in (you guessed it) an emerging market.  The focus of the enterprise pitch is the cloud, and I think Huawei has picked that focus because emerging markets are disproportionately likely to be consumers of public cloud services and so it’s likely that larger businesses with current IT deployments will need to hybridize.  The cloud is a bridge between an established position with network operators and an emerging one with the enterprise.

For five years now, Huawei’s competitors have wallowed in complacency.  Now it’s time to be afraid.

Fire in the Cloud?

Well, Amazon finally announced its tablet.  The event itself might have offered some clues because Apple would have done this in the Superdome and Amazon had something that looked more like a high-school auditorium.

Bezos set the tone for the launch with a long praise-fest for the Kindle and the ebook and e-ink concepts.  Then he jumped to the Kindle Touch, an e-ink product that’s an advance on the current Kindle but much more like Barnes & Noble’s newest Nook model, a cross between a tablet and an e-reader but much more the latter than the former.  This new product has a $99 price point (for WiFi; $149 for 3G), which undercuts the new Nook.  The product senses touch via IR rather than capacitance, so I wonder how it will work for those who like (or need) to use a stylus.  With a pre-cached dictionary and Wikipedia capability, I think this device is aimed largely at the student reader.  For the truly cheap of heart, there’s a $79 version that omits the touch-screen capability.  This, it’s safe to say, is the new mainstream Kindle product, the basic e-reader.

But it’s obvious that Bezos couldn’t stop there.  Ten thousand or more media and PC analysts would likely have stormed his castle and burned him alive.  After blowing Android kisses for a while, then touting new media and app stores, Amazon Prime, and even EC2, he finally got to the point.  The future is media and cloud service offerings!  It’s Kindle Fire.  It’s not a tablet, but a Media Cloud Appliance!

Let’s come back to earth for a moment for the specs.  Fire will have a dual-core processor and a seven-inch screen, making it a less-than-iPad right there, and the announcement is likely to disappoint the many who had expected Amazon to field an iPad-like product for about the price the HP TouchPad sold at in its after-the-market-exit fire sale.  Yes, that would have been wonderful, but as my readers know, I’ve never believed for a minute that was Amazon’s intent, and clearly it was not.

Fire is based on an older version of Android, customized by Amazon, the latest version available as open source, and like the Color Nook it carries an overlay GUI that harmonizes the look and feel with what a reader-focused buyer would want.  But it’s really a bit more than books, it’s CONTENT, and it’s also a bit less than a real tablet, or the iPad in particular.  The seven-inch form factor is one big difference.  The smaller screen is essential for a reader-focused tablet; people don’t really want to read books on something the size of a coffee-table book.  But it limits the entertainment value of the device and its value as a generalized Internet portal.

The price point for the Fire is within a dollar of the level ($199 versus $200) I blogged this week as the likely floor price for a subsidized tablet/reader.  My model says that you can make money overall at that price because of the ebook sales (and Prime membership sales) you’ll then get as a follow-on.  But at that price the subsidy of follow-on sales is critical, and so that shapes the nature of the Fire.  No matter what others (including Amazon) might say, it’s a “Nook-alike”; more of a B&N competitor than an Apple competitor.
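To illustrate the follow-on-subsidy logic, here’s a back-of-the-envelope sketch.  Every number in it is an illustrative assumption of mine, not Amazon’s actual cost structure and not my full model:

```python
# Hypothetical subsidy arithmetic for a content-subsidized device.
# All figures below are illustrative assumptions, not real cost data.
device_price = 199.00        # Fire retail price
build_cost = 210.00          # assumed bill of materials plus overhead
subsidy = build_cost - device_price   # loss taken on each unit sold

ebook_margin = 3.00            # assumed gross margin per ebook sold
ebooks_per_month = 2           # assumed attach rate for a content-focused buyer
prime_margin_per_year = 20.00  # assumed net contribution of a Prime membership

monthly_follow_on = ebook_margin * ebooks_per_month + prime_margin_per_year / 12
months_to_break_even = subsidy / monthly_follow_on
print(f"per-unit subsidy: ${subsidy:.2f}")
print(f"months to recover it from follow-on sales: {months_to_break_even:.1f}")
```

The point of the sketch isn’t the specific numbers; it’s that with any plausible attach rate, the device loss is recovered quickly, and without the follow-on sales it never is, which is exactly why the Fire has to be a content funnel and not a general-purpose tablet.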

But it does redefine that competition by adding in the video content dimension that Amazon has and B&N lacks.  That makes it a kind of reader-plus or tablet-minus.  You can see that Amazon isn’t trying to say Fire is an iPad, but they’re trying to say that the Fire is a better content device than a generalized tablet, and obviously a much better e-reader.

One innovative feature of their Silk browser is the split architecture; there’s EC2 back-end processing linked to a Fire front-end.  This may be the first example we’ve seen of a cloud service backing up a tablet experience at the GUI level, and it’s also certainly a model of how the cloud hosts what I’ve always said was a “service-layer” function.  Certainly it cements the relationship between the cloud as an IT model and the service layer.  Fire cements the role of Amazon’s EC2 in the web-front-end application model, even expands it a bit.  EC2 is used to enhance the viewing experience by pre-processing stuff that would normally be done on the client, but it seems likely that the role of enhancing the experience could easily be expanded to the functional level under the same model.  Along the way, this pre-processing might reduce communications load.
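As a rough illustration of the split idea (not Silk’s actual pipeline, which Amazon hasn’t published), here’s a toy two-stage sketch:

```python
import gzip
import re

# Minimal sketch of a split-browser architecture: a cloud-side stage
# pre-processes a page, then ships a smaller payload to the device.
# Purely illustrative; Silk's real pipeline and protocol are not public.

def cloud_stage(html: str) -> bytes:
    """Back-end (EC2-style) step: strip comments and inter-tag whitespace,
    then compress, doing work the client would otherwise do itself."""
    stripped = re.sub(r"<!--.*?-->", "", html, flags=re.S)
    stripped = re.sub(r">\s+<", "><", stripped)
    return gzip.compress(stripped.encode("utf-8"))

def device_stage(payload: bytes) -> str:
    """Device step: decompress and hand the cleaned page to the renderer."""
    return gzip.decompress(payload).decode("utf-8")

page = ("<html>  <!-- analytics boilerplate -->  "
        + "<p>lorem ipsum dolor</p> " * 50 + "</html>")
wire = cloud_stage(page)
print(len(page.encode("utf-8")), "bytes at the origin,",
      len(wire), "bytes over the air link")
```

Even this trivial version shows the two levers: the cloud stage spends back-end cycles so the device doesn’t have to, and it shrinks what crosses the wireless link.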

I think it’s clear that this isn’t a direct challenge to Apple, but it may just be a formidable indirect one.  Fire is a clear partnership between content, appliance, and cloud services.  That’s what I think Apple has been aiming for with iCloud and has not yet achieved.  Why?  Because clouds are fuzzy and hard to market.  Apple had the disadvantage of having a stable of appliances in place before they fielded their cloud approach, and so pretty much had to let the cloud stand on its own.  To make it less complex they’ve kind of dumbed it down.  Amazon can make Fire the face of the cloud, which is what I think they intend to do.  That is a serious challenge to B&N but it’s also a challenge to Apple because the Amazon store retail model is much broader and more successful than Apple’s stores.  Retail is more directly suitable to profit-building than ad subsidies too, so Fire may threaten the Hulu and Netflix models as well.

Fire will disappoint many, as I’ve said, but it may also have a longer-term, and greater, impact on the industry than it would have had it simply gone head-to-head with Apple.

Mobile Broadband’s Impact, Juniper’s Mobility Launch

We’re on the eve of Amazon’s tablet, and rather than speculate now in advance of the announcement, I’ll wait until tomorrow to talk about the device and how it might impact the tablet space.  What I propose to do today is chat about the tablet space at a higher level, and in particular about the new world the tablet and smartphone are creating.

If you look at entertainment through human history for a moment, you’ll see that it’s COLLECTIVE in nature.  People sat in the Coliseum as a crowd.  They go to movies and theaters and shows and concerts.  Even television, according to research, is most often a shared event.  One might ask why that is, and there are two primary drivers.  First, people are naturally sociable and like to share experiences.  Second, many experiences are simply too expensive or impractical to have as individuals.

What mobile broadband has done is to permit socialization without physical collection.  We are building a generation that’s as comfortable with virtual as with actual, a generation that doesn’t need to be in close physical proximity to be “together”.  Our appliances are getting sophisticated enough to allow us to build relationships (to a point, obviously) through them.  The question is to what “point” virtual can replace actual.

So here’s my point.  I think that mobile broadband is creating the opportunity to virtually collectivize our lives.  That virtual collectivization will appeal to people who usually can meet but for some period can’t, and to a smaller group of people who really do prefer to keep the real world at arm’s length.  But it won’t totally transform our entertainment expectations.  The crowds that fill concerts today are the leading edge of the mobile/social revolution, and yet they’re still at the concert.  They aren’t changing their overall behavior to prefer virtuality; they’re simply adding the ability to have a virtual social relationship to the repertoire of things they can do at a given time.  It displaces the stuff at a lower priority, so somebody whose last resort was watching TV at home with the folks now has a better next-to-last resort.

What mobile broadband will change most is the connective tissue of our social interactions, not the form or nature of the interactions themselves.  I can sell a gang of friends a social service that helps them meet for a bite or a drink more easily than one that perpetually links them virtually, because they WANT to get together.  They were settling for the virtual part.  Tablets and smartphones will become, to quote a commercial for cotton, the “fabric of our lives” but not the focus of them.  Amazon’s tablet will transform reading, but ebook readers transformed it more, and while both impact “books” they don’t impact reading per se.  What will make Amazon’s tablet a revolution in a true sense is whether it offers something that really enables the social connection, and that’s the same thing that would make iPads transformational in a true and persistent sense.  Nobody has that quite locked down at the moment, in my view.

Juniper announced a new mobile workforce strategy called “Simply Connected”, one of the best marketing pushes the company has done in ages.  What they’re doing is linking switching, wireless, their Pulse client, and security into one purpose-built package.  The target is the growing number of enterprises who realize that the tablet/smartphone connective-tissue argument I’ve just made applies rather well to workers.  Here we have social relationships with a non-social motivation, and thus relationships that are often better served in virtual form.  We also have people tasked with cooperating while they’re collaterally tasked with doing things almost certain to take them in different physical directions even as they try to connect virtually.  The concept of a social bundle for mobility is a good one, and it’s particularly good because of Junos Pulse, a client-side agent that gives tablets and smartphones an anchor in a management and security sense even when the devices come from multiple vendors and enter the corporate net partly through workers’ casual use of personal devices.

This is the classical model of solution selling, and we think that Juniper needs to do more of this sort of thing.  Like most vendors these days, they tend to atomize themselves into little product silos that sing their own individual songs.  I’ve recently spent some time with enterprises and service providers outside the US and I can tell you that there’s a gap between even what VENDORS call a “solution” and what buyers consider a problem.  Ethernet networking isn’t a solution; IP networking isn’t either, and security flunks the problem-linkup test too.  It’s not that nobody is tasked to do those specific things, but rather that those things are simply ways of dealing with the way technology is applied to solving a problem.  To get control of the deal, to maximize the sales and buyer connection, you have to focus on what that PROBLEM is from the buyer perspective.  Do more of this, Juniper, and most important of all, make sure that this kind of thinking permeates your strategy and product planning and not just your marketing.

Content is (Still) King!

We’ve been seeing some interesting developments in the media space, though they’ve been a bit overshadowed in a news sense by more dramatic technology announcements.  Media and media-related consumer behavior is important because it’s the driver of many new business models (too many, I think) and also because it’s the major driver of change in network services and technology.

One interesting item is that more people are abandoning TV news in favor of online news, except for weather, traffic, and some local events.  I can understand this at one level; national news is increasingly more like a variety show than a news program.  Mobile and online news sources are obviously growing, but Pew research shows that they’re more often used as a source of information about shopping than for “news” in a strict sense.  However, apps that provide weather and traffic information could be an issue at some point for TV news programs.

The reason this could be important is that any form of live programming is an enormous stimulus to channelized delivery.  In fact, any regular consumption of TV programming, especially popular material, that’s streamed is going to create disproportionate demand for capacity.  And guess what; bits cost.

Our research continues to demonstrate that even in 20 years, channelized television will still account for over half of all material viewed.  The reason is simple: economics.  The real question is how we’ll transition to a more personalized experience for the other half of the material.  Every on-demand HD show streamed to a viewer (using 7 Mbps per show as our metric) requires quality capacity equal to five T1 lines, which would have cost a thousand dollars a month only 20 years ago.  The same show in linear form consumes no incremental bandwidth at all, no matter how many people watch.  So the question is whether loss of news credibility for channelized TV creates wider reliance on streaming.  Answer: yes, among the under-32 age groups, where channelized entertainment is too sedentary to fit their behavior patterns.  Elsewhere, our research says decisively “No!”  But it could hurt local station economics, because news advertising is one of their largest sources of revenue.  Thus, it’s the changing money flow we have to worry about here before we start speculating on revolutionary technology changes.
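The arithmetic behind that comparison is worth making explicit:

```python
# A T1 line carries 1.544 Mbps, so five of them (7.72 Mbps) comfortably
# cover one 7 Mbps HD stream, and every concurrent streaming viewer needs
# that capacity. A linear channel sends one copy regardless of audience.
T1_MBPS = 1.544
HD_STREAM_MBPS = 7.0

five_t1_mbps = 5 * T1_MBPS
print(f"five T1 lines = {five_t1_mbps:.2f} Mbps vs. a {HD_STREAM_MBPS} Mbps stream")

viewers = 1_000_000
streaming_gbps = viewers * HD_STREAM_MBPS / 1000   # one stream per viewer
linear_mbps = HD_STREAM_MBPS                       # one shared channel
print(f"streaming to {viewers:,} viewers: {streaming_gbps:,.0f} Gbps")
print(f"linear delivery to the same audience: {linear_mbps} Mbps")
```

The asymmetry is the whole economic story: streaming capacity scales linearly with the audience, while linear delivery is flat, which is why popular material streamed on demand creates disproportionate demand for capacity.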

Dish Network wants to join with Blockbuster to make its own run at the VoD space, combining satellite delivery for channelized material (still the lowest-cost option) with mail-me-the-DVD and streaming choices for on-demand.  They sense Netflix vulnerability after the “misstep” the company made in splitting off its streaming and DVD rental businesses.  They’re right, but wrong at the same time.

The Netflix problem is the classical parasitic problem.  People want to watch stuff they like.  In any streaming service or mail-delivery option, the consumer grabs the material they like best in the first month, and from there on things go downhill.  The only salvation is an avalanche of fresh content, and that can come only from television because movies aren’t made in enough volume.  But if streaming cannibalizes TV ad revenues, then the TV networks who are producing the content have no incentive to push their material through the streaming channels; they have the opposite incentive.  Netflix is facing higher costs and reduced chances of renewal as its material ages, and so they’ve just done a major Latin American expansion.  Fresh eyeballs are as good as fresh material.  The only reason Dish can even think about this sort of thing without revealing a mental impairment is that they can’t really do VoD from a bird in the sky.  A crippled strategy is better than none.

Netflix’s deal with DreamWorks is being touted by some streaming fans as proof positive that the streaming model will Eat the World, but I think these guys are also showing some signs of impairment.  The deal doesn’t kick off for two more years, and when it does it’s first for the DVD delivery mode for most material, with limited streaming releases later on.  So why the hype?  If you were Netflix’s CEO, wouldn’t you want to show off something positive after a very bad week?

What this shows is that content is still king, and that content ownership and the ability to produce new material are the one irreplaceable asset in this confusing market.  That bodes well for Comcast and other distribution giants who own media assets, and badly for the streaming up-and-comers, who almost universally do not.

HP: Not Enough Change

Well, Meg Whitman now has the responsibility for making the right move at HP, and frankly I’m not encouraged by some of her early comments.  The only thing that seems to be on the table, of all the changes that brought down Léo Apotheker’s reign there, is the fate of the PC business.  That may well be the only thing that was a smart move.  What HP has to avoid above all is business as usual, and the second-greatest risk is looking indecisive.  With one chance to get it right, Whitman seems to be getting it wrong.

PCs haven’t been a great business for years now, simply because consumerism has driven down the prices and tipped the scales of innovation to the software side.  The hardware platforms are all basically the same, no matter what Mac aficionados think; it’s the operating system and software that matters.  Even there, price pressures are formidable.  Now, with the onrush of tablets, we’re seeing the web-client dimension of PC use vanish, and with it likely even more of the profits.

HP’s problem was that it didn’t see the tablet shift coming, not that it needed to be more of a software player.  The thing that HP needed was a cloud vision, a vision of a future of network-connected appliances that could marshal a lot of power and knowledge and focus it on something a user was carrying around in a pocket, briefcase, or purse.  Yes, software is an element of that, but it’s not all of it.  I’ve learned by talking to users worldwide that everyone understands the pieces of the cloud; it’s the cloud as a conceptual whole that they don’t quite get.  It’s the notion of a new ecosystem with new relationships, new value focus points, etc.  HP might have shown us that conceptual whole, by dumping PCs and focusing on the cloud.  Instead they dumped PCs and dumped the cloud too, and Whitman is proposing to rethink the PC-dumping part.  Maybe she thinks that’s best now because it’s too late to get the cloud back, but it’s not going to work.

There’s a lesson here in the networking side.  The cloud is a symbol of the new age of communications-connected intelligence, the age that empowers every client because it’s connected to every possible service.  This is an age that networking has created, and one that’s not generating value to networking in proportion to its contribution.  That’s because network executives have been just as blind, dare we say as dumb, as HP was.  The cloud should have been their vision, but they were simply not agile enough to grasp it.  They recognized, as HP did, that somehow software was involved in the solution, but they never realized that just as “hardware” encompasses both mainframes and smartphones, doorknobs and drawer pulls, “software” means too many things to be a useful goal.  “Serviceware” or “cloudware” was what was needed, and that’s middleware software built on the cloud model.

Another tie here is that HP is now terribly wounded, perhaps even fatally, and this pulls a major competitor off the field in terms of data center competition.  HP might have been a leader here, and that’s going to be very hard to achieve in the current situation.  Autonomy is the wrong software; rethinking PC spin-off the wrong decision.  Nobody could benefit here as much as Cisco, who has been punching out at competitors with edgy marketing because they need to recover their own position.  The problem is that Cisco, to succeed, still has to face the reality that HP didn’t face and that Cisco has yet to really address—the cloud.  There is no enterprise on this planet, no network operator, that isn’t a potential buyer or seller of cloud services.  I’ve seen this proven in major markets and emerging markets.  The cloud is the symbol of the new IT, the new network.  You stand or fall—in the cloud.

Oracle’s Quarter Sends a Message, Adobe’s a Warning

Oracle and Adobe both announced their earnings after the bell yesterday, and both companies were being closely watched as possible indicators of the overall health of the tech sector in what’s pretty obviously at least an economic hiccup on the way to recovery.  Both companies reported quarters that included upside surprises, sending both stocks up in after-hours trading.  The details of the two were different, and of course so are the implications for the industry.

Oracle beat the estimates in profits and their license numbers did particularly well, which suggests that database and application software sales are still strong among enterprises.  That would be a key metric in my view because these are the two software components most directly linked to corporate productivity, and thus the two that would be most likely to reflect “project” spending rather than just orderly enhancement of IT budgets.

The hardware end of the business, acquired with Sun, was the weak point in the numbers; hardware sales dipped for the quarter.  Oracle said that it would be focusing more on profitable “appliances” in the future, which would be likely to improve both sales and margins on hardware.  It’s also, in my view, a reflection of a basic truth, which is that servers are a commodity item where it will be difficult to sustain margin and profit growth in the future.  Take note, HP!  Oracle’s decision to de-emphasize the x86 models in favor of SPARC models is a further indication of a flight from a commodity space.  Powerful SPARC appliances are likely the best strategy against the SAP/IBM and SAP/HP combinations too.

Adobe actually had a weak quarter but offered guidance at the high end of the range.  The highlight may have been that CS5.5 revenues were consistent with the run rate for CS3, the last of the Creative Suite versions to do really well in the market (it came along before 2008).  However, I’m not sure the Street is getting the full picture here.  I don’t see much indication that Adobe is pushing Creative Suite beyond the base audience it enjoyed with CS3, meaning it isn’t reaching new buyers.  Moreover, I see little sign that Adobe is capturing a growing share of the consumer editing space with its various Elements products; there again it seems to be upselling within the same audience.  Acrobat was perhaps its greatest overall success, the only place where it demonstrated it might be gaining market share in a convincing way.  Acrobat here means the “professional” or authoring piece of the PDF process.

Adobe’s other long-term problem is HTML5 and the flight from Flash.  Adobe’s professional video tools and web tools have benefitted from the dominance of Flash in web video, and it’s pretty likely that dominance is coming to an end.  Microsoft’s decision to reduce Metro’s dependence on plugins mirrors Apple’s moves, and while Android will surely continue Flash support to tweak Apple’s nose, the handwriting is on the wall here.  HTML5 will open up web video to competitors in a big way, and just having HTML5 capability on what was once a pure Flash server product isn’t going to save Adobe here.  Adobe is pushing a new version of Flash and AIR, its rich media environment, but I wonder if AIR can withstand the tablet revolution.

Upcoming Netwatcher Topics

We want to provide our Netwatcher subscribers, blog readers, and other interested parties with a summary of the topics we’ll be covering in our technology journal, Netwatcher, this fall.  Here’s how the editorial schedule is lining up so far!

In September, we’ll feature the fourth segment in our service-layer series, which looks at the practical question of implementing a service layer and at the service-layer approaches of the network equipment vendors who lay claim to some service-layer functionality.  We’ll also be looking at the application models that enterprises believe are most amenable to cloud computing, and what might happen to realize their opportunity.

In October, Netwatcher will round out our service-layer series with a look at standard and open solutions to the service layer.  We’ll also cover the OpenFlow architecture and its notion of software-defined networks (SDNs).

November’s Netwatcher will include an extra section, this one in the form of an “open letter” from me to the management of the leading network vendors.  If I were giving the executive teams private advice, this is what I’d say!  The issue also includes a feature on the enterprise mobility space, and in particular what industries represent the real mobility opportunity.  We’ll offer some practical guidance in addressing the mobility opportunity, too.  Finally, we’ll look at the so-called “emerging markets” in networking, asking what it is about these players that makes them different from the primary markets…if anything.

December is our Annual Technology Forecast, and this year for the first time we’re integrating this issue with the results of our fall survey of enterprises and service providers to provide a one-stop shop for a vision of 2012.

If your company subscribes to Netwatcher, it has the right to distribute the journal freely within the company, so you can get a copy from your internal resource.  Contact us at netwatcher@cimicorp.com to find out who that is.  If you are not with a subscribing company, this same email address can be used to request subscription information.  A sample issue and a table of contents for back issues are posted on our website.

 

NSN’s Liquid Touch

NSN is making another wave in the market, this time by extending the notion of adaptive networking in the mobile/EPC world from the RAN (where its product is Liquid Radio) to the network core, with Liquid Net (including Liquid Core and Liquid Transport).  These products are aimed mostly at the mobile/metro space, where the explosion in wireless traffic has created considerable stress on access infrastructure.  Mobile services, because of the dramatic changes in mass-user location through the day and week, can strand considerable capacity, and NSN is aiming to let operators run their metro networks more efficiently, freeing capacity and optimizing traffic routing for performance.  Underneath the covers, this is all part of a shift by NSN toward a more cloud-based network infrastructure, something we’ve noted even in the way it supports the service layer.  By hosting the “Liquid Core” logic of mobility and gateway functionality (IMS and related elements like the SGSN and MGW, plus the circuit- and packet-switched core) at the control level on ATCA platforms, NSN is focusing itself on functionality rather than hardware, an exceptionally smart move in a market that’s crying out for network vendors to embrace a more hosted mindset.

What’s particularly interesting to me here is that NSN has been a wallflower for ages, a player that barely articulated anything.  Now suddenly it’s getting not only smart but very smart.  What will be interesting to see is how this impacts the rest of the players.  NSN is strong in mobile and metro but not particularly so at the IP layer, where it OEMs gear from Juniper.  You could argue that the ATCA positioning of Liquid Core is calculated to refocus investment and management attention on something NSN does make, and away from what it doesn’t.  Might that then undermine IP devices?  Might Ericsson, which has a similar problem of IP-layer incumbency, follow suit and create a trend?  What will Alcatel-Lucent, Cisco, and especially Juniper do?

Liquid Core could also make the EPC (Evolved Packet Core) more of a mainstream issue.  There’s been some back-and-forth over whether metro infrastructure should have a lot of EPC or as little as possible, and the notion of “core liquidity,” meaning flexibility and controllability through the EPC, could catch on even outside the pure mobile space.  That would again pose questions for traditional metro players with no strong mobile position: Cisco and Juniper again.

 

Some Video Developments and Musings

Apple is always a wonderful target for rumors, and the current one is that the company is getting into the TV business for real, launching not only a service but a line of TV sets that would link to it.  The idea has the media agog, of course, but no matter who is supposed to be fielding streaming substitutes for channelized TV, there are formidable issues involved, both technical and non-technical, and there’s so far no indication that Apple is dealing with any of them.

The first technical problem is that about a quarter of US households couldn’t receive streamed HDTV properly because their Internet connections are too slow.  That means that to save money by cutting the cord, they’d have to spend more money.  And there’s no guarantee it would work for them anyway, because many broadband users streaming video at the same time could congest the networks.  Users could also get whacked with usage-over-cap charges if they watched a lot of TV.
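To see why caps matter here, consider a quick back-of-envelope calculation.  The bitrate, viewing-hour, and cap figures below are illustrative assumptions of mine, not numbers from this piece:

```python
# Rough estimate of monthly data consumed by HD streaming vs. a usage cap.
# All input figures are illustrative assumptions.

HD_BITRATE_MBPS = 5.0    # assumed bitrate of one HD stream
HOURS_PER_DAY = 5        # assumed household TV viewing time
DAYS_PER_MONTH = 30
CAP_GB = 250             # assumed ISP monthly usage cap

seconds = HOURS_PER_DAY * DAYS_PER_MONTH * 3600
monthly_gb = HD_BITRATE_MBPS * seconds / 8 / 1000  # megabits -> gigabytes

print(f"Monthly streaming volume: {monthly_gb:.0f} GB (cap: {CAP_GB} GB)")
print("Over the cap" if monthly_gb > CAP_GB else "Under the cap")
```

Under those assumptions a single-stream household already blows past the cap, and a multi-TV household would do so several times over.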

The second problem is that multi-TV households would be even more likely to see quality problems, because multiple simultaneous streams compound the congestion risk.  Remember that you can’t easily buffer live delivery of channels, because you’d fall behind the program schedule, so congestion events could be a disaster.

Then there’s the big non-technical problem, which is getting rights to the material in the first place.  The networks own their own shows, and there is no legal obligation on their part to sell streaming of contemporaneous episodes, which would mean that the material available couldn’t be the normal channelized programming.  Cutting the cord from the cable bill is one thing; cutting off your programs is another.

The final non-technical issue is advertising.  Online ads in streaming video bring in about a thirtieth of what broadcast commercials do.  If users had to pay the difference, it would cost over $170 per month, which is way more than most cable bills.
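The arithmetic behind that figure can be reconstructed by treating the thirty-to-one ratio and the $170 gap as given; the implied per-household broadcast ad revenue is my inference, not a number stated here:

```python
# Back-of-envelope: the ad revenue gap a cord-cutting household creates.
STREAM_TO_BROADCAST_RATIO = 1 / 30   # streaming ads earn 1/30th of broadcast
MONTHLY_GAP_USD = 170                # per-household shortfall cited in the text

# If streaming recovers 1/30 of broadcast revenue, the gap is 29/30 of it,
# so the implied broadcast revenue is the gap divided by 29/30.
implied_broadcast = MONTHLY_GAP_USD / (1 - STREAM_TO_BROADCAST_RATIO)
print(f"Implied broadcast ad revenue per household: ${implied_broadcast:.0f}/month")
```

In other words, the $170 gap implies roughly $176 per household per month in broadcast-side revenue that streaming ads can’t replace.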

So what’s really up here?  I think it is very likely that Apple is creating a line of TVs, and that these will be tightly integrated with iTunes.  I think it is very unlikely, bordering on impossible, that Apple plans to field a streaming TV service aimed at competing with channelized TV.  Verizon’s recent moves to beef up its VoD sales show Apple’s real aim, in my view.  Apple will leave channelized TV alone and go for VoD only, which means its sets will still tune channelized TV and won’t alter the basic economics of TV viewing very much.

Netflix launched another round of angst with a comment by its CEO that it would be virtually separating the DVD and streaming services, even to the point of separate websites.  While this is upsetting customers even more, it may reveal something about the motivation for the pricing changes.  I’m hearing that negotiations for streaming rights are getting more complicated, and Netflix doesn’t want to tie them too tightly to negotiations on DVD rental, which is a completely different issue.  Thus, the current flap may be a signal that there’s a lot going on in streaming negotiations, possibly because Apple is now looking at the model a bit more closely.  There’s been speculation all along that Apple’s iCloud should stream video; maybe it will.