Strategy or Tactics?

Juniper’s partner conference this week has managed to catch the eye of Wall Street, which makes it more important than these sorts of events typically are.  It’s no accident; Juniper has been promoting the event more heavily than usual too.  The new J-Partner program will spend more, promote partnerships more, and be dedicated to “driving deeper, more profitable relationships” between Juniper and its partners.

The Juniper event comes on the heels of Alcatel-Lucent’s announcement of ng Connect, an ecosystem of companies designed to create solutions and services for the NGN.  What’s interesting isn’t that I believe the two companies are vying for the same media attention, but rather that the two activities are so close in time and so different in direction.  They are so different, in fact, that you could say they’re not competitive, and yet the events may frame a contest that will have a real winner and loser.

In some ways, Alcatel-Lucent and Juniper have a similar problem.  The network equipment market is definitely under pressure, and the pressure is created by a lack of “new money” to drive buyers.  For service providers, the issue is declining revenue per bit.  For enterprises, it’s the lack of new productivity benefits to drive new project spending.  Both Alcatel-Lucent and Juniper are facing their own sales and profit pressure, created by the market conditions.  But the responses are very different.  Alcatel-Lucent has taken a very strategic step with ng Connect, and Juniper is taking a very tactical one with J-Partner.

Alcatel-Lucent seems to believe that the problem in the market is one of finding a buyer business model, a strategy that sells by helping the buyer make the case to buy.  Their ng Connect program isn’t about products, it’s about finding a path toward solving buyers’ business problems.  The approach seems to be to build a solution/service ecosystem, and it’s a step that suggests that the company is prepared to go toe to toe with competitors once the buyers get an idea of what their ecosystem would need in the way of product.  To Alcatel-Lucent, the problem is coming to terms with NGN.

Juniper has always been a tactical player, sales-driven, much like its arch-rival Cisco.  A partner program is an approach to quickly increasing sales, it’s a “channel program” because it channels products to the buyer.  The buyer business problems in the tactical world are a given; they’re whatever’s driving the hand that signs the check.  The goal of a partner/channel program is to get the product out there in a lot of partner hands, so at least one partner will intersect with that first hand, the one that’s holding the pen.  Feet on the street.

So how do strategy and tactics compare?  The obvious point is that tactics could pay off quicker than strategy because they influence near-term buying.  Whether you build channel programs on “solutions” or “products”, meaning whether you have partners adding significant functional value or not, is less an issue than whether there are near-term opportunities you can grab onto.  Do the buyers have money to spend, or do they need help finding a paradigm to invest in?  In the service provider space, I think the latter issue dominates.  That, of course, is where Alcatel-Lucent and Juniper compete head-to-head.  But Juniper is also an enterprise player.  Does the enterprise have a paradigm in play that will drive network purchasing, or do they need a strategy too?

That’s the question that 2012 will likely answer.  Everyone, myself included, has said that enterprise IT spending is being driven by data center evolution—first in the form of consolidation and virtualization and now in the form of the cloud.  But cloud projects, according to my survey, are the most behind of all projects and the most likely to fail or fall short in terms of benefit realization.  How much strength is there in the cloud as a driver of change, a driver of network spending in the enterprise?  That’s one big question for Juniper, because tactical channel programs will fail if tactics aren’t enough.

Strategy has its limitations too, though.  Look at Yahoo and Jerry Yang.  The company’s stock went UP yesterday when it was announced that Yang was departing.  His strategy, his vision, turned out to have feet of clay and his idiosyncratic views were certainly a factor in making a once-giant firm into something that’s teetering on the edge of being an also-ran in a market it arguably helped create.  With tactical approaches you can at least see whether you’re being successful and make changes quickly.  With strategic initiatives, it’s too late to fix a problem once it’s been recognized.  So will Alcatel-Lucent’s history of vision carry the day here, or Juniper’s focus on sales?  We’re going to know for sure in 2012, folks.  Somebody may be joining Yahoo in the Tech Hall of Declining Relevance.


More Facets of Video Future

Video and streaming are obviously going to be hot topics for a long time, and there’s interesting stuff happening all through the food chain.  The question is whether the ecosystem that’s being pushed in so many directions at so many levels is going to converge on anything that all the players can survive in.

CES suggested that there are going to be more options for video connectivity offered in the future.  Sets running Google TV (or Apple TV, of course, but they weren’t at CES), streaming to tablets in the home…maybe even smart car entertainment.  We’ve also seen increased attention to the broader video ecosystem from the vendor community.  Alcatel-Lucent has been promoting its “ng Connect” program, a framework for linking developers with bigger players who might be of help in sponsoring an innovative service notion, then moving the whole thing forward as a cooperative project.  The concept isn’t limited to video, but clearly video is a main target.  Cisco has announced a partnership with ActiveVideo, a “cloud video” player that creates better integration between legacy RF and IP streaming, to expand the functional scope of its Videoscape offering.  Hulu, who seemed to be on the block, now seems to be getting more financing from the very players who were trying to sell it.

There’s clearly going to be a new set of video options, but I still think we’re sensationalizing the impact of streaming, given the dire effects it would have on traffic.  We can deliver RF multi-channel on cable or fiber pretty darn easily and relatively cheaply.  To deliver exactly the same material to the same number of users with IP streaming would require significant investment in some combination of metro infrastructure and CDN caching.  Even in the US, TV delivery on what could be called “partitioned IP,” meaning off-Internet delivery like U-verse, wouldn’t violate neutrality, but it would require that access providers get into the game, and that’s the segment that’s been looking elsewhere.  To these guys, TV Everywhere isn’t intended to be “everywhere” to the exclusion of linear RF!  It’s sort of “everywhere-my-RF-isn’t” instead.  I think the Bell Ladies have to sing before we can call the ball on the future of streaming.
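The scale problem with unicast streaming is easy to see in a back-of-envelope calculation.  The sketch below is mine, not the article’s; the channel count, bitrate, and audience size are invented round numbers, and it ignores multicast and caching, which would soften (but not erase) the gap.

```python
# Back-of-envelope comparison: linear RF broadcast vs. per-viewer IP
# unicast streaming.  All numbers here are illustrative assumptions.

CHANNELS = 200          # assumed channels in the lineup
MBPS_PER_STREAM = 6     # assumed bitrate of one stream, Mbps
VIEWERS = 100_000       # assumed concurrent viewers in one metro

# Broadcast: every channel is transmitted once, no matter how many watch.
broadcast_mbps = CHANNELS * MBPS_PER_STREAM

# Unicast streaming: every viewer receives an individual stream.
unicast_mbps = VIEWERS * MBPS_PER_STREAM

print(f"Broadcast: {broadcast_mbps:,} Mbps")                # 1,200 Mbps
print(f"Unicast:   {unicast_mbps:,} Mbps")                  # 600,000 Mbps
print(f"Ratio:     {unicast_mbps / broadcast_mbps:.0f}x")   # 500x
```

The point isn’t the exact ratio, only that broadcast cost is flat in audience size while unicast cost is linear in it, which is why the metro and CDN investment question dominates.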


Bumps in the Internet Video Road?

Apple may be moving into the educational market in a different way, going after the enormous, highly politicized, and highly profitable textbook business.  The details of this move aren’t known at this point but there are some interesting questions about the whole textbook-ebook thing that bear review.  It might be something actually useful in education.

Leaving the question of what goes in a book aside, one of the issues with textbooks of all types is that they are highly inertial.  You don’t want to spend a bundle on books only to replace them every year, and yet there’s both a damage factor and obsolescence of content to consider.  The cost of books tends to induce schools to standardize more on material and to narrow the scope of what’s offered or made available for projects.  Imagine a school whose library and texts were all ebooks, all available to update as quickly as the editions could be changed electronically, all flexible in terms of who gets what, based on who needs it.  That’s probably the sort of vision Apple is looking at.

The challenges are formidable, though.  This sort of thing will not play well with the schoolbook publishing firms, all of whom are comfortably entrenched in their political games.  It’s also not going to be easy to answer the question of who supplies the devices to read the things.  In some schools, kids have laptops or tablets on a routine basis, and arguably that should be true more broadly, but the cost of the gear and the risk of theft or damage is very high when you’re talking about devices like this.  If Apple can’t figure out a way to create a populist educational book market, they risk creating an elitist tier of educational tools that not only won’t catch on but will likely create some back-pressure on the company.  Right now, Wall Street has high expectations for this announcement because they want to see if Apple innovation survived Steve Jobs’ passing.  We’ll find out soon.

Microsoft’s decision to put its subscription TV plan on hold might be linked both to issues of Apple innovation (fear of too much of it) and online business model problems, but most likely it’s what the rumors say it is—licensing.  Content owners are finally figuring out that many of them are being taken just like the access providers.  They make major investments that others leverage for cents on the dollar, so they believe, and the solution to that problem is to raise the rates to license material.  There’s a deeper issue here, though, which is that Microsoft likely fears that the whole of the entertainment ecosystem is in danger of destabilizing.  Too much free content kills the producers of content, the transporters of bits, and puts all the power in the appliance players.  Microsoft isn’t a winner in the appliance space yet; they’re still trying to get a phone and tablet strategy cobbled together.  If they spin out their story now, can they fully exploit the market they help to create?  I don’t think so, and I don’t think they see a clear path to success here either—yet.  Wait till the fall.

The Street is asking whether there are fundamental flaws in the whole carrier router model, and if there are I have to say that the FCC is in many ways at fault.  Genachowski is a VC at heart, one who wants novelty and dynamism in the industry more perhaps than health.  The FCC is rare among regulatory bodies in that it is charged to sustain a healthy industry and not just to protect consumer interest, but under Genachowski “the industry” has meant the OTTs.  It would be unfair to say that routers as a product class are being killed by regulatory stupidity, but it would be fair to say that the vision of the Internet as a high-speed, high-quality, traditional any-to-any grid has likely already been killed.  That, as I’ve said before, tends to push deployments down the OSI stack to the optical and Ethernet layer, because most traffic is going from an edge aggregation device like a BRAS to a cache or POP.  You don’t address anything, so you don’t need IP addressing.  We’re building an Internet a world wide and a metro deep, and that is eventually going to really hit the vendors hard, particularly those without RF or optical positions and without any service layer tools.

Router players have hardly been forthcoming on this trend despite the fact that their own product moves validate it.  That begs the question of whether they’re supporting a vision of market growth that’s already failing in the first step toward the future—the present.  It begs the question of whether their visions of enterprise trends in networking are any better.  It begs the question of whether new products that are supposed to ramp in 2012 are targeting any real opportunity.  My view is that we do have a very pervasive failure of market perception here, one that’s been developing for half a decade, and it’s about to bite us.  Is 2012 the year it does?  Could be.



Learning from Microsoft’s Mistakes

The story that PC sales slipped in 4Q, first raised as a Microsoft comment yesterday, is now being quasi-confirmed by more detailed shipment data released by various Wall Street researchers.  One, citing Gartner, says that y/y growth in PC sales was well below seasonality in the quarter, and the expectation overall is that PC sales in 2012 will be very near to flat.  Some forecasts are putting sales down a trifle.  Nobody doubts that this is due to tablets, but as I noted yesterday, the thing driving this change is less a movement away from PCs than a movement to defer upgrading PCs to focus instead on a cooler device.

What this does do, whatever the cause, is hurt players like Intel and Microsoft who have built their business substantially on PCs and who now face slower growth in revenues from that source.  Both companies have tried to move to catch the wave on smartphones and tablets, and in both cases it’s too early to say that they’ll fail.  It’s also far too early to say they’d succeed, and the odds are longer because both delayed so long in recognizing what was happening in their markets.

Microsoft is now said to be preparing for a major marketing realignment that will include some layoffs, though not large numbers.  This to me seems another example of our industry’s tendency to shoot behind the duck.  The time to fix marketing problems is when you see market changes, not two or three years afterward.  The pace of change has advanced so far at this point that Microsoft may have a major problem creating an organization before its mandate has been rendered moot by further change.  They’ve never been able to think level with the market; how now will they learn to think ahead?

Network vendors might want to ponder that point.  In the network equipment space, the Street is mixed in their view, due in part to conflicting data and seemingly contradictory trends.  On the one hand, some analysts see the continued drop in ROI for operators, particularly in the mobile space, creating investment pressure and lowering capex.  Some think that video will increase traffic and increase spending, but then there are those who say that might be true but that the fruits of the trend will fall in the laps of Huawei and ZTE.  There was a comment today on Alcatel-Lucent, who carries a lot more cost than the Street likes.  While the company says they won’t do what NSN has done, which is make a major cut in what they sell and reduce costs to focus on key areas, I think that unless they figure out a way to capitalize on their positives more quickly, it’s going to come to that.  They have too much EU exposure to take market risks; the economic fundamentals are against them in their home market.

The direction of service provider equipment seems clear to me, but enterprise sales are harder to figure.  Our survey data indicates that projects in 2011 were pushed and that there will be fewer projects approved in 2012.  That creates a downward pressure this year, clearly, but more significantly it’s the projects in Year One that drive upward budget momentum in Year Two and beyond.  If we stay with a persistent under-supply of new projects we’ll be lowering future budgets too, which is a problem that takes a couple of years to work out once it gets started.  It’s been 20 years since budget spending in networking exceeded project spending, but we’re going to have that in 2012 according to the enterprises we survey.  There will be plenty of bright spots, and still chances for companies to improve market share, but the market is going to be harder to navigate in 2012 than it was in 2011, we think.  Since economic conditions are likely to improve we may be somewhat insulated from this, but if Huawei and ZTE (as expected) make a move on the enterprise space, it could be trouble.
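The project-to-budget dynamic described above can be made concrete with a toy model.  This is my sketch, not survey data: it assumes, purely for illustration, that each year’s maintenance budget is last year’s budget (with some attrition) plus a fraction of last year’s new-project spending that converts into recurring spend.

```python
# Toy model of budget momentum: Year One projects feed Year Two budgets.
# The 0.9 renewal rate and 0.5 conversion rate are invented numbers,
# chosen only to show the shape of the effect.

def project_budget(start_budget, projects_by_year,
                   renewal=0.9, conversion=0.5):
    """Return the budget series implied by a stream of project spending."""
    budgets = [start_budget]
    for projects in projects_by_year:
        budgets.append(budgets[-1] * renewal + projects * conversion)
    return budgets

# A healthy project flow vs. the "persistent under-supply" scenario.
healthy = project_budget(100, [30, 30, 30])
starved = project_budget(100, [30, 10, 10])
print(healthy)  # budgets roughly hold their level
print(starved)  # budgets keep eroding after the project drought begins
```

The instructive part is the lag: in the starved series, budgets keep falling for years after project spending drops, which is why a shortage of new projects in 2012 is a multi-year problem rather than a one-year one.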

Is the Cloud the Star of CES?

What’s the lesson of CES so far?  That a tablet is a window on the cloud.  Eric Schmidt danced around that point with his notion of Android making your house cooperate with you by essentially sensing your behavior.  Walk into a room and it’s like the old song; “The room was singing love songs…” because your favorite music or show comes on, the lights adjust…you get the picture.  This, of course, is nothing but another place to start my mobility/behavioral transformation.  An appliance can “sense” you, but to respond to your needs it has to be able to evaluate your behavior, and it’s clear that this will become more complex and social in nature as you move from living alone to being in the real world, at home or at large.  The true future mission of the cloud is to marshal technology to improve our lives.  It’s not to run old IT stuff, but new stuff that’s never been run, never been seen, never been considered.

There are some signs that operators, like Eric, are nibbling on the edges here.  AT&T commented at CES that it plans to create, via its Cloud Architect cloud/developer ecosystem, a global mobile cloud, which is nothing surprising.  Something that might be a bit more surprising is the claim that they could deliver “entitled content” to anyone anywhere in the world.  That suggests to me that AT&T is talking about large-scale video federation, and if it’s true then that could be huge.

Cloud Architect is going to include OpenStack, and that puts AT&T behind the open-source cloud tool set.  Whether this is a good thing depends on whether you see the optimal cloud as a bunch of virtual machines with a workflow controller in front.  Virtualization is a kind of afterthought form of multi-tasking, a strategy designed to make two or more things run together when they were architected to run independently.  It’s a great strategy for server consolidation, but I think it’s selling the cloud short by a long shot.

AT&T’s Cloud Architect is essentially a developer community cloud that’s designed to provide those who support its service ecosystem a place to run stuff.  They could in fact run it elsewhere, like on Amazon, but the best place might be to run it in a platform ecosystem and not using virtualization at all.  Newly developed apps could in theory be run simply as tasks or threads in a multi-programming OS.  They aren’t in large part because it’s hard to prevent interaction among them, which would likely be undesirable.  So is security the only reason for using virtualization in cloud?  It may well be one, but probably the biggest reason today is that we don’t know how else to build a cloud.  The platform-like architectures, including Microsoft’s Azure, Joyent’s SmartOS and SmartDataCenter, and in fact any of the Solaris morphs, are in my view far better platforms for the cloud in general, and developer/service-layer clouds in particular.  OK, I admit that I’m not an OpenStack fan; I think the concept is just benefitting from mindless media clustering on something that sounds cloudy and populist at the same time.  So is a smoke lodge, and mysticism in any form is the wrong thing to build a service future on.

Alcatel-Lucent is also promoting its vision for connected futures in “ng Connect”, an ecosystem designed to promote cooperative development and deployment of services through what seems like standards-like interactions without (so they hope, I’d bet) the usual standards-body politicking.  The technology framework for ng Connect is a bit more flexible, I think, but I’m not sure whether the actual program will end up settling on a primary environment.  I’m also not sure how much cooperation and exchange is actually architected into a platform versus simply permitted ad hoc.  The thing is, it’s the first drive by a vendor to create a service partnership on an open scale, and it will be interesting to follow it.



Reading Into Juniper’s Miss

Surprise, surprise!  The latest data from the European Telecommunications Operators Association shows that costs are up and revenues are down, and the author wonders how long this sort of imbalance can go on without compromising spending.  Maybe it already has; we noted that Acme Packet showed weakness that could be a symptom of a capex problem with operators.  There’s more indication now.

Juniper, who pre-announced revenues, earnings, and margins lower than expected, has been seen by the Street as a play on the Internet’s success.  The problem is that while the Internet has been a consumer success and has driven a host of OTTs to astronomical heights, it’s been a mixed blessing for operators.  On the one hand, they needed a consumer revenue beyond voice and they got one.  On the other hand, that new model has utterly commoditized bandwidth and transformed networking from connectivity to experience delivery.  Juniper and other vendors were swept along with the benefit transformation and they’re all now facing the downside of bit commoditization.

The general view of the Street is that there’s a sector problem here (true) and also a Juniper problem (true again).  Our surveys started showing that Juniper was losing strategic influence at Level 3 of the network—the router layer it so needs to sustain.  This loss was in our view a result of a failure to fully bake their content monetization, mobile/behavioral monetization, and cloud services positions.  Players who had better credentials there, especially in the cloud, were able to gain share.  Some of the Street analysts think the proximate cause of the Juniper disappointment was that Cisco took market share.  Likely true, but the issue of why that happened is more relevant.

Cisco has the most solid cloud position of all the network equipment vendors, and my survey work is showing that the cloud is moving faster than other optimization strategies simply because operators know how to move it.  There are operators we survey who launched content projects, then cloud projects, and the content projects have yet to advance even to trial.  The cloud projects are awarded, in many cases to Cisco.  The question is whether Cisco won them or Juniper lost them.  Both companies need to try to push the outcome in their direction.  Who will?


Are We Consumer-Electronics-Overconsumed?

We’re on the eve of the Consumer Electronics Show, and it’s already clear to most of us that the event is in jeopardy.  Not only are some big names (like Microsoft) expected to withdraw after this year, the whole notion of shows in general and this one in particular is under threat.  Ironically, what’s threatening both are the things that the show helped to promote in the past.

Anyone who remembers the ‘90s knows that media has been changed profoundly by the Internet.  The ability to communicate instantly with prospective buyers stimulates the desire to do that, and that means having something instant to communicate.  In the past, it was considered smart to make big splashes at trade shows because the media would be there to cover things and you could explain your product easily.  Today nobody wants the product explained anyway, and the media is spending weeks in advance speculating or publishing leaks.  By the time the event arrives there’s more a sense of anticlimax.

That’s not all, either.  Only a few companies can really hope to create a stir at an event like CES.  If you’re one of them, you can create a stir anywhere, so why let a bunch of techno-leeches barge in on your thing?  You can control your message more in a major launch where you’re on stage alone, and you get all the attention.

This year we may be adding in another factor.  How much consumer electronics can we show?  There’s not much “new” expected at the show this year.  We’ll have perhaps new tablet models, new OS features, maybe some trendy little gadgets, but it’s not like the heady age of smartphone and tablet launches in a market that was comparatively a desert.  We’ve had HDTV and 3DTV.  What now?  Smellavision?  There are a few places we could still stick touchscreens, but it’s pretty likely we’ve passed the point of diminishing returns.  Consumer marketing is about fads, but fads are like cries of “wolf!”  You can promote them when people are bored, but every one of them raises the bar for the next.


Tale of Two TVs

I’ve got a kind of “tale of two TVs” today, if that’s not too euphonic for you!  The big story at CES may not be tablets after all, but Google TV.  And the biggest IPTV success story may be doubling down on their approach, but shifting to Microsoft’s Mediaroom.

Google has tried TV already, and it didn’t exactly work out.  The problem with their first attempt was primarily that it wasn’t offering much in terms of a different experience, meaning that at the end of the day you got something like streaming from Netflix or Hulu or Amazon.  What they’re now looking to do is to integrate a lot more Android into Google TV, creating a kind of enormous tablet from a flatscreen.  It’s not quite that, of course, but the analogy is decent.  What I hear is that there will be a host of apps from the Android Marketplace that will run on it, and that these apps will permit not only viewing and information-gathering, which would make the Google TV a direct web portal, but also communications.  I’ve also heard that some set manufacturers are looking at introducing a camera, which would take the analogy to the tablet further, and that Google has an app in the works that will allow Android tablet users to sync their tablets to the TV, so touching the screen on the tablet will perform the analogous function on the TV.

All this is interesting given that the reason I hear Microsoft is getting in with Telefonica is that they’re going to do a lot of joint TV app work.  Making IPTV a decent profit model isn’t easy since the technology is inherently more expensive than multi-channel TV over CATV cable, but it is fair to say that if there’s a way for IPTV to add value it might be in how it would be better at integrating the online and viewing experience.

That makes these two developments competitors in my view.  If IPTV needs to tap the melding of Internet and TV more effectively, then having the TV do that by itself is hardly promoting the right market model.  If both these approaches get traction we may be watching the battle, even perhaps the last battle, over whether IP streaming can replace linear RF on any scale.

Interesting to speculate.


Good-Bye Umi, Hello Strategy?

Cisco is discontinuing its personal/consumer telepresence product, Umi (or however you like to spell it; I’m not going to imitate their accent-U!), a move that’s no big surprise to most industry-watchers, including me.  The big question isn’t why (because it didn’t sell) but whether the move might represent a gradual shift by Cisco toward a more sustainable vision of the future.

First, no significant audience is going to pay the Umi price for good consumer telepresence; there are too many things with webcams out there.  I think Cisco knew that, and perhaps was hoping to spawn some sort of populist drive for premium telepresence service.  The problem with that, of course, is the neutrality issue.  But the real problem with Cisco’s telepresence approach has been that it’s traffic-push and not revenue-pull.

For a year and a half now, Cisco has been champion of what could be called the “bits-suck” theory of network infrastructure.  If something comes along that consumes bits, that just sucks the old dollars right out of operator pockets and into the pockets of network equipment vendors (mostly, of course, Cisco).  It’s not volitional, it’s Natural Law.  Hence telepresence in any form is good for Cisco, and consumer telepresence is great.  Hence, Umi.

The challenge for this approach is that driving up consumption of bandwidth in an all-you-can-eat world only drives down operator profits.  It’s kind of ironic that Cisco would promote its own profit growth through a strategy that proposes its customers abandon their own hopes for more profitable services.  Not only is it ironic, it’s ineffective, which means that at some point Cisco has to move on to something more realistic.

The Street is liking the “new Cisco”, a company that seems to be more humble, hard-nosed, and hard-competing.  It’s going to like “hard-thinking” even better, if Cisco starts to demonstrate that.  I think Acme Packet’s shortfall announcement yesterday demonstrates that oldthink in networking is going to end up taking you to some ugly investor meetings.  So Cisco needs newthink, which has to be a position more aligned with what operators can sell than what they are going to be forced to buy, or give away.

The other problem with consumer telepresence as a driver is that it’s aimed at wireline; you can hardly cart Umi around with you.  Right now, ROI on wireline infrastructure is falling too fast for operators to be even modestly interested in supporting new wireline models of any sort, and particularly those that make the operator a passive traffic conduit.  Which is of course the core of the issue here.  Network equipment vendors have stubbornly clung to the notion that traffic is valuable despite continued proof to the contrary.  It’s not.  It never will be.  Services are valuable, if the “service” itself either creates productivity improvements for business or differentiable experiences for consumers.  Simple transport and connection will always be the foundation of the network, but never again will they be the foundation of network profit, and thus they can never be the foundation for network equipment vendor profit either.  Cisco, more than any other network player, has the assets to swing for the stands here with a service story that really links the transport of yesterday to the profit of tomorrow.  They’re coming to bat this year, and we’ll watch closely to see how they do.  One thing is clear already; the space is Cisco’s to win or lose because nobody else has the leverageable assets they can push into service in time to influence the market.  This is the transitional year in service provider networking.


Why the Street’s Antsy About Tech

Acme Packet, one of the growth leaders in the network equipment space as far as the Street was concerned, did a surprise pre-announce of a less-than-expected quarter.  Supporters of the company have rallied around the notion that this is somehow just a transient blip.  Yet the company told investors that the shortfall came from a bunch of geographically scattered deals they’d had 50% confidence or more they could close, and somehow didn’t.  That sounds a bit broad to be transient to me.

I think the problem here is the old issue of ARPU.  Look at voice calling from an operator perspective and you see a market that you’d really have no reason to want to be in if you weren’t there already.  ARPU in wireline has plummeted.  ARPU growth in wireless voice is already negative in most markets, with broadband growth likely to keep overall ARPU rising for not much longer than another year.  The largest OTT voice provider, Skype, has been bought by Microsoft, and while the Windows giant might well run it into the ground, they might take it somewhere that would be transformational.  Google Voice, fearing what Microsoft might do, could jump out to try to beat Microsoft to the punch.

What seems to be true here is that if you’re a vendor who can roll out LTE you’re still engaging with carrier buyers almost without positioning effort (you’ll have competition, though).  If you’re not, then you have to fight for engagement.  The longer this problem lasts, the harder it will be for vendors to break out of the pattern, because engagement and disengagement are both contagious; you disconnect and you lose touch with the issues, which makes you less relevant, which makes you disconnect further.  I don’t doubt for a minute that we’re going to see players step up and sing new and interesting songs in the first half, and by doing so raise their relevance.  I don’t doubt we’ll see players fail to do that, and lose more market share.

The Street has also been talking a lot about enterprise. Their view, like the one I expressed in Netwatcher last month, is that IT spending in 2012 isn’t going to be strong but that there’s a chance of recovery in 2013.  Their thesis is that the 2012 problem is largely due to economic uncertainty, which I agree with.  They think that’s going to be eliminated by 2013, and I hope they’re right.  In any case, I still cling to the view that if IT improves productivity then IT should be doing BETTER in a period like this.  The fact that we’re calling 2012 a down market means we’re admitting that IT has failed to deliver on new productivity paradigms, which is also what my numbers have said.

The fact that storage, in mid-sized-and-larger firms, is the hot IT space according to the Street is an indicator that we’re placing our bets more on business intelligence or data mining.  Those are IT and processing functions, and they are great for making better business decisions if you assume that the data needed to reveal a new optimum was collected and stored even though you didn’t recognize its value at the time.  I wonder how true that is.  Productivity is like warfare; you win it with boots on the ground, at the place where the work and the worker intersect.