Is the Cloud the Star of CES?

What’s the lesson of CES so far?  That a tablet is a window on the cloud.  Eric Schmidt danced around that point with his notion of Android making your house cooperate with you by essentially sensing your behavior.  Walk into a room and it’s like the old song: “The room was singing love songs…”  Your favorite music or show comes on, the lights adjust…you get the picture.  This, of course, is just another starting point for the mobility/behavioral transformation I’ve been talking about.  An appliance can “sense” you, but to respond to your needs it has to be able to evaluate your behavior, and it’s clear that this evaluation will become more complex and social in nature as you move from living alone to being in the real world, at home or at large.  The true future mission of the cloud is to marshal technology to improve our lives.  It’s not to run old IT stuff, but new stuff that’s never been run, never been seen, never been considered.
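
To make that concrete, here’s a minimal sketch in Python of what the “evaluate the behavior” step might look like once more than one person is in the room.  It’s entirely my own illustration; the names, the preference model, and the reconciliation rule are assumptions, not anything Schmidt or Google described.

```python
# Hypothetical sketch of the "evaluate the behavior" step.  The people, the
# preference model, and the reconciliation rule are all invented for
# illustration; a real system would learn these from observed behavior.

from dataclasses import dataclass

@dataclass
class Preferences:
    music: str       # learned favorite station or playlist
    lighting: int    # preferred brightness, 0-100

# Stand-in for preferences a home would learn over time.
LEARNED = {
    "alice": Preferences(music="jazz", lighting=40),
    "bob":   Preferences(music="classic rock", lighting=80),
}

def on_room_entry(people_present):
    """Decide what the room should do when the presence sensor fires."""
    prefs = [LEARNED[p] for p in people_present if p in LEARNED]
    if not prefs:
        return {"music": "off", "lighting": 60}   # unknown visitor: defaults
    if len(prefs) == 1:
        # Living alone: just replay the one person's preferences.
        return {"music": prefs[0].music, "lighting": prefs[0].lighting}
    # The social case: some reconciliation policy is needed.  Averaging the
    # lighting and deferring to the first person's music is a crude placeholder
    # for what would really have to be a behavioral, even social, model.
    return {
        "music": prefs[0].music,
        "lighting": sum(p.lighting for p in prefs) // len(prefs),
    }

print(on_room_entry(["alice"]))
print(on_room_entry(["alice", "bob"]))
```

The single-occupant branch is trivial; the interesting work is in that last branch, which is where the social complexity I’m describing actually lives.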

There are some signs that operators are nibbling at the edges here, much as Eric is.  AT&T commented at CES that it plans to create a global mobile cloud via its Cloud Architect cloud/developer ecosystem, which is nothing surprising.  Something that might be a bit more surprising is the claim that it could deliver “entitled content” to anyone, anywhere in the world.  That suggests to me that AT&T is talking about large-scale video federation, and if that’s true, it could be huge.

Cloud Architect is going to include OpenStack, and that puts AT&T behind the open-source cloud tool set.  Whether this is a good thing depends on whether you see the optimal cloud as a bunch of virtual machines with a workflow controller in front.  Virtualization is a kind of afterthought form of multi-tasking, a strategy designed to make two or more things run together when they were architected to run independently.  It’s a great strategy for server consolidation, but I think it’s selling the cloud short by a long shot.

AT&T’s Cloud Architect is essentially a developer community cloud, designed to give those who support its service ecosystem a place to run stuff.  They could in fact run it elsewhere, like on Amazon, but the best approach might be to run it in a platform ecosystem and not use virtualization at all.  Newly developed apps could in theory run simply as tasks or threads in a multi-programming OS.  They aren’t, in large part, because it’s hard to prevent interaction among them, and that interaction would likely be undesirable.  So is security the only reason for using virtualization in the cloud?  It may well be one, but probably the biggest reason today is that we don’t know how else to build a cloud.  The platform-like architectures, including Microsoft’s Azure, Joyent’s SmartOS and SmartDataCenter, and in fact any of the Solaris morphs, are in my view far better platforms for the cloud in general, and for developer/service-layer clouds in particular.  OK, I admit that I’m not an OpenStack fan; I think the concept is just benefiting from mindless media clustering around something that sounds cloudy and populist at the same time.  So is a smoke lodge, and mysticism in any form is the wrong thing to build a service future on.
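
To illustrate the interaction problem that a bare multi-programming model would have to solve, here’s a toy sketch of my own (not taken from Azure, Joyent, or anyone else): two tenant apps run as plain threads in one process.  Because they share an address space, a sloppy or hostile tenant can trample the other’s state, which is exactly the interaction that virtualization prevents by brute force.

```python
# Toy illustration: two tenant "apps" co-hosted as plain threads in one
# process.  Because they share an address space, nothing stops one from
# trampling state the other considers private; that is the interaction
# problem that pushes clouds toward VMs or toward platforms that supply
# isolation some other way.

import threading

shared_state = {}   # one address space: every tenant sees everything

def app_a():
    shared_state["a_balance"] = 100   # app A's "private" data

def app_b():
    # A buggy or hostile co-tenant can read and overwrite A's data at will.
    if "a_balance" in shared_state:
        shared_state["a_balance"] = 0

for app in (app_a, app_b):
    t = threading.Thread(target=app)
    t.start()
    t.join()

print(shared_state)   # {'a_balance': 0}; A's state was silently clobbered
```

Hypervisors solve this by giving each tenant its own machine image; the platform clouds lean on OS- or runtime-level facilities (Solaris zones, in the SmartOS case) to get isolation without a full VM per tenant, which is part of why I think they’re the more natural fit for a developer/service-layer cloud.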

Alcatel-Lucent is also promoting its vision for connected futures in “ng Connect”, an ecosystem designed to promote cooperative development and deployment of services through what look like standards-style interactions without (so they hope, I’d bet) the usual standards-body politicking.  The technology framework for ng Connect is a bit more flexible, I think, but I’m not sure whether the program will end up settling on a primary environment.  I’m also not sure how much cooperation and exchange is actually architected into the platform versus simply permitted ad hoc.  Still, this is the first drive by a vendor to create a service partnership on an open scale, and it will be interesting to follow.
