Tech Week: Broadband

Broadband continues to be in the news, both in terms of policy and in terms of business model.  The two, of course, should have some relation to each other, but it’s increasingly clear that’s not going to be the case in many world markets.  The US elections and the announcement by Clearwire that they’d be cutting staff illustrate the issue perfectly.

The loss of the House for the Democrats means that Republicans will now chair the House committee that oversees the FCC, and this has already stimulated predictions that net neutrality is dead in Congress.  Not true; it’s been dead in Congress all along.  We’ve never believed there was much of a chance that Congress would step up on the issue, and frankly we’d be just as happy they didn’t.  The reason we have a federal commission in charge of communications is that Congress isn’t likely to be able to address complex technology issues well.  We saw how complex the issues here are with the Fox/Cablevision standoff.

Fox cut access to some of its websites for Cablevision customers, at least for a time, and also cut off its TV programming as the companies battled over how much Cablevision would pay for carriage rights.  The FCC admitted it had no authority to act here, even though most would say that non-neutral behavior by content providers has the same effect as by access providers.  That means that meaningful net neutrality rules would have to come from Congress, and they’d have to break totally new ground.  That alone would be likely to send Congress into a tizzy; combine it with heavy industry lobbying revolving around wireless impacts and consumer desires to have everything both neutral and free, and you have a political minefield that there’s no current incentive to address.  The election, after all, is over.

The Clearwire dilemma shows that some attention is needed here, though.  Broadband services of any sort create a classical S-curve of cash flow, with “first cost” driving a provider far into the red as they build out infrastructure to credible levels and fund marketing campaigns.  The hope is that success will turn this around, but the problem in broadband is that even “success” looks a lot like failure when it’s time to add up the numbers.  There’s not much margin, and that means it takes a very long time to recover early costs.  Clearwire needs to go back to the well, and neutrality issues with wireless aren’t going to make it any easier to do that.
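
To make that S-curve concrete, here’s a minimal sketch of the cash-flow dynamic; the buildout cost, per-subscriber margin, and adoption-curve numbers are purely illustrative assumptions, not any operator’s actual figures:

```python
# Sketch of the broadband cash-flow S-curve: heavy "first cost" up front,
# thin per-subscriber margin afterward.  All figures are illustrative
# assumptions, not any real operator's numbers.
import math

BUILDOUT = 2_000.0          # first cost, $M, spent over the first 3 years
MARGIN_PER_SUB = 60.0       # annual margin per subscriber, $ (thin!)
MAX_SUBS = 10_000_000       # addressable subscriber ceiling

def subscribers(year):
    """Logistic adoption curve: slow start, fast middle, saturation."""
    return MAX_SUBS / (1 + math.exp(-(year - 5)))

cum_cash = 0.0
for year in range(1, 16):
    capex = BUILDOUT / 3 if year <= 3 else 0.0
    margin = subscribers(year) * MARGIN_PER_SUB / 1e6  # $M
    cum_cash += margin - capex
    # payback arrives only when cumulative margin finally outruns first cost
    print(f"year {year:2d}: cumulative cash {cum_cash:8.1f} $M")
```

With these assumed numbers, the cumulative position bottoms out around minus $1.9 billion and doesn’t turn positive until roughly year eight, which is the dilemma in a nutshell: even a “successful” buildout spends most of a decade under water.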

The decision by T-Mobile to tout its HSPA+ offering as “4G” is another indicator.  Most operators would agree that HSPA+ is a less costly transition than a full migration to LTE, and if one accepted the 3GPP definition of 4G none of the current LTE offerings could qualify either.  But a marketing slogan may help sales, and sales could help turn that S-curve of cost around quicker.  Wireline services are already less than marginal in terms of profit, and wireless could easily move into negative territory as well.  We may see wireless capex slip in 2011 and beyond if we don’t get clarity on this issue.

The Handwriting on the Tablet

One of the prime areas of focus for tech recently has been the tablet space.  Tablets are far from new, and in fact some of the “new” models are more like reprises of earlier tablets in that they’re little more than a keyboard-less notebook.  The iPad, of course, created an alternative vision of a tablet as a kind of over-fed smartphone, a device that’s all display and designed to be a conduit of information to the user with a relatively sparse capability to move data the other way.  Some see tablets as consumer devices, and some like the model of enterprise use.  The vendors are struggling with which model to support; ViewSonic expects to offer both 7-inch and 10-inch form factors and both Android and Windows 7 (even dual-boot, so the rumor goes).

However the tablet goes, the big news will be the network and the impact of tablets on user behavior.  Movement to tablets on a large scale means movement to ubiquitous wireless, but we’ll need to look hard at just what “ubiquitous” means.  As I’ve noted, there’s an opportunity for hospitality-Fi networks to play an enormous role in future tablet networking.  I think wireless providers and equipment vendors realize that and are trying to figure out how to promote a truly compelling case for 3G/4G wireless versus WiFi.  The problem is that it’s going to be an uphill battle, because device vendors have everything to gain by pushing WiFi versions of their devices to get a larger near-term market share.

Behavior and mobility and devices all create an interdependence.  The consumer isn’t set on tablet use, wireless models, or behavioral patterns at this point.  That means that giving them support for a specific usage model can condition them to consume that model, whatever form that support may take.  An explosion in tablet competition could empower a host of competitors, create a hospitality-Fi wave, and erode the business model for 4G.  It could foster a different model for mobility that focuses on roaming data sessions between WiFi hotspots, independent of traditional mobility tools of the past, and of IMS.  It could even erode the operators’ positions in the service layer, because WiFi is traditionally an OTT framework tied to no operator in a technical sense.

Alcatel-Lucent, whose quarter showed some real promise for growth, seems to recognize that.  They announced a program with Eurozone provider KPN that demonstrates the exposure of provider network assets through Alcatel-Lucent’s Application Enablement and Open API program.  This is the first large-scale success of a provider API program to deliver premium network features up the stack to the service layer.  The application itself is still a bit simplistic, focusing again on QoS and bandwidth rather than on the more complex areas of identity, federation, CDNs, and application-service feature creation, but it’s a convincing demonstration that operators do have a path to monetize their underlying network assets either by offering high-level services that exploit them or by wholesaling them to somebody else.  This kind of capability may be critical if things like tablets and hospitality-Fi start to erode the traditional mobile opportunity.

What Now?

In some ways, we’ve witnessed a historic election.  The margin of victory in the House for Republicans hasn’t been seen since Roosevelt’s time, after all.  But in a very important way we saw nothing but business as usual.  For the last three elections, US voters have turned out the party in power.  We’re never happy with our leadership these days, and for good reason.  Congressional popularity has traditionally been below 25%, and the public sees Congress as the most contemptible branch of government.  But we elected them, after all.

The question now is how this will play for the US economy and for tech.  Specific party politics toward technology are, in my view, a minimal factor in this election because Congress isn’t any more likely now to pass tech-specific legislation like net neutrality than before—if anything, they’re less likely.  Tech will be swept along by the economic forces.

Where will those forces sweep us?  It’s too early to say because of the split within the Republican Party that the Tea Party activists represent.  We’ve had conservative groups grab headlines within the Republican ranks before; Gingrich’s over-reach of conservative power in the ‘90s is a potential poster child for what might happen now.  Will activists compromise to pass legislation or will both parties do their usual political showboating?

On the positive side, Republicans are often seen by businesses themselves as being pro-business, and thus it’s possible that business may be more willing to invest and expand under a Republican House, but the House alone doesn’t change even Congress, much less the government.  On the negative side, conservative activism on the budget could cripple any economic recovery.  Attempts to repeal health care and financial reform, both promised by at least some Tea Partiers, would create gridlock for no likely gain.  But mainstream Republican leadership knows that.  In short, we don’t know yet how this will fall.

The Fed today will likely announce steps for quantitative easing, the purchase of old assets with “new money” to boost the money supply.  The question is how they’ll do it, and whether it will work.  Ultimately consumers have to start believing again.  This election isn’t going to make that happen; it was a vote against something rather than a vote for it, and that’s been the case for the last three elections, as I’ve noted.  The moderate center of America doesn’t want left- or right-wing ideology, but the parties are controlled by ideologues.  We have no candidates of our own kind to vote for, and that’s plenty of reason to be in a funk, unemployment notwithstanding.

In December I expect I’ll be able to run the model on the economic and technology future and get some numbers; in the meantime it will be watchful waiting here at CIMI as with all of the rest of you.

Clouds and Chips

The IT world has provided us with a number of interesting developments this week, starting with a Google suit filed over a proposed Department of the Interior messaging system award to Microsoft.  Google feels that its own Apps could have been used for this, and that they should have been given the opportunity to demonstrate their compliance with federal security requirements and bid on the contract.  Thus, the lawsuit.

Some in the DoI have suggested to us that the problem with Apps is the same one that’s a problem for users of Google’s online competitors to Office; the features Google provides are a subset of those already in use rather than the full set.  What’s not totally clear is whether the missing features are actually used at DoI, but in some ways you have to be sympathetic with the department; how easily could they find out whether all or some features were used?  The suit may thus be an important one for cloud services in general.  Many (probably most) cloud-based alternatives to popular installed software tools are functionally more limited than the stuff they’re intended to replace.  That’s also true of most open-source tools.  I’ve tried Google’s document tools and they won’t properly process either our spreadsheets or our presentations, and they create problems with some publication/paper styles as well.  Same for OpenOffice.  But there’s no question that you could do most of what I do in either Apps or OpenOffice if you started from scratch.  So a ruling that deemed cloud applications acceptable if they offered relatively full functionality, even if delivered differently, or if they offered at least some way of doing what buyers actually do rather than what they could do, would promote cloud applications.  Without that kind of ruling, it may be hard to promote the cloud version of many apps unless the cloud providers step up and fully duplicate capabilities.  Frankly, that’s what they should do.  You can’t sue your buyer into submission as a long-term business strategy.

The other interesting development is in the chip space, and the two vendors making the news were Intel and Oracle.  Intel abandoned a long practice of keeping its fab to itself by doing a deal with FPGA specialist Achronix for 22nm capacity.  The actual volume of fabrication here is small, but what may be interesting is that Achronix is perhaps the speed king of FPGAs, which are field-programmable chips that can be used for fast responses to market needs or for applications where volumes won’t justify a custom ASIC.  It’s not hard to see that such chips might be very valuable in the consumer device market, which could mean either that Intel may want a stake in Achronix later on, or that it may itself be thinking about getting into the consumer space on a larger scale.  Recall that Intel has its own mobile OS and that it’s often been said to have aspirations of being a player in the retail space.  What better one than devices?

Oracle’s move is if anything even more interesting; they’ve taken a stake in Mellanox, one of the key providers of chips for InfiniBand data center switches.  Mellanox has been a Sun partner and also provides silicon for Oracle’s storage appliances, but as we’ve noted before, Oracle is the only data center player with no position in networking, and nowhere is that position more important than in the data center.  InfiniBand is a superior technology to at least the current generation of Ethernet in terms of latency and capacity, and were Oracle planning a big flat fabric for the data center, Mellanox would be a likely player in their decision.  It’s also interesting to note that the deal includes Mellanox supporting Solaris as one of its host OSs.  That suggests that Oracle may be planning to continue to field Solaris as an alternative to Linux.  We think that’s smart; Solaris has a good following, and for specialty applications like OLTP we think it’s the best OS out there.  Could Oracle be planning a major data center move?  It certainly could be.

Hopeful Economic Signs?

Economically speaking it would be hard to characterize last week as great, and yet it was better than expected and certainly better than many had feared.  The critical number, the 3Q GDP, came in above last quarter’s level, and that pretty much laid the double-dip recession theory to rest.  Far from showing wild swings of volatility, the stock market was remarkably stable, varying only about 250 points on the Dow between lowest and highest levels and closing only about 70 points lower.

This week, of course, the elections in the US will likely drown out any economic data released.  The campaign has been among the most bitter in memory, with negative ads souring virtually all of the voters polled.  Democrats hold a significant edge in voter registration, so it’s very likely that were turnout to be high they’d hold on to most if not all seats.  The challenge for them is that the party that wins a Presidential election in the US nearly always loses seats in Congress in the mid-terms.  The question is how many, whether it would be enough to give Republicans a chance at putting forward their own agenda, or whether Democrats would work with Republicans to support at least some sort of legislative progress.

Republican priorities, such as they’ve been hinted, seem to be focused on show.  Repeal of the financial reforms or health care is next-to-impossible lacking veto-proof majorities in both House and Senate, and nobody is predicting that level of Republican win.  Democrats really haven’t articulated any substantive agenda either, in my view; likely they don’t think they’ll be in a position to promote one.  Thus, we can’t expect much but reactive politics no matter who wins.

For the economy in general, and for tech in particular, that might not be bad.  We need better financial reform than we got; hedge funds that only millionaires can invest in manipulate the markets and their “bets against the market” are really bets against the average investor—and we know who’s been losing.  We needed better healthcare reform too.  We have the most expensive healthcare system of any industrial nation, and yet we aren’t anywhere close to the healthiest or longest-lived among them.  But neither of these areas is going to be fixed further, and so having at least a stable framework is better than being in a constant state of flux.  The economy will now likely slowly recover, but we do believe that restoration of “normal” employment levels may take five years—if it ever comes.  The US is shifting away from being a producer economy because productivity gains aren’t keeping more expensive US labor competitive with emerging economies.  As we’ve noted before, spending on IT since 2001 to enhance productivity has not kept pace with past history.  That has to change to increase jobs here.

An article has correctly noted that M2 money supply trends can be correlated reasonably well with economic conditions.  M2 is a broad measure of money supply, and when it sinks sharply it’s an indication of money being hoarded.  It did shrink during the downturn, and it’s now expanding again, which is a good sign.

We must point out, though, that our own chart on the downturn, which was published in our special report in the fall of 2008, illustrates that “wealth growth” in the broadest economic sense tends to create bubbles if it’s not accompanied by GDP growth.  We also note that neither wealth nor GDP growth correlates well with how consumers feel.  Yes, a big downturn will create a corresponding dive in sentiment, but often upturns in sentiment come during or after downturns in wealth/GDP.  The mindset of the consumer is more complicated than simple charts can show, and it’s going to be the consumer that gets us out of this eventually.

Tech, of course, is both directly and indirectly linked to an economic recovery.  Most companies will spend more when they make more, and that’s also true with households.  More succinctly, belief in future progress tends to fuel current spending.  We hope that the M2 upswing is an indication of feel-good behavior, and the fact that the 3Q GDP growth was fueled largely by consumer spending is a good sign.

Not Chicken Little Time…Yet!

This week saw what’s become the usual push and pull of supply- and demand-side issues, and perhaps a bit more than the usual confusion in the markets (financial, enterprise, and consumer) about the net outcome.  It’s not been the wild week of stock swings that could have happened had economic news been bad, but at the same time there wasn’t much that could be called a big upside of hope either.  In all—tepid probably says it best.

I’ve commented several times this week on broadband issues, many arising out of what’s increasingly clear are misleading or bad numbers about broadband deployment.  It’s not surprising that broadband would become a political football in this most political of all recent election years, but it’s bad for the industry because it’s pulling everyone’s eye off the real ball.  Despite continuous evidence that economic density is the most decisive factor in broadband market effectiveness, we continue to ignore it.  Despite the fact that there’s no clear indication that broadband has any societal value whatsoever, we continue to assert that it does.  A real plan, based on exploiting what we know and studying objectively that which we don’t know, could get the market moving.

Meanwhile, the mobile space is showing us the shape of the future.  4G is going to bring usage pricing to mobile, and it will leak back into 3G and into wireline eventually, at least in markets where economic density is low and access profits likewise.  Smartphones are reported by one analyst firm to be creating a mobile market owned by the handset giants like Apple and Google and not the operators.  While that’s clearly an exaggeration, it’s true that smartphones are disintermediating operators in mobile just as the OTT players disintermediated them in wireline.  Operators fled wireline for mobile to escape low ROI.  If mobile gives them the same low ROI, can they then flee to telepathy or something?  Hardly likely; they’ll simply have to accept a tailing off of revenues, which means a tailing off of capex.  Big telco Verizon and the cable industry overall both showed us that the Street will punish those who let capex rise as a percent of sales.

Enterprises have had their own challenges.  We’ve seen that spending on some hardware and software has been strong through the year, but that strength has been created in large part by the suppression of orderly upgrades of baseline IT infrastructure by the past economic crisis.  You can only catch up for so long; after that, growth will depend on exploiting new productivity paradigms, and the market hasn’t been very good at doing that since 2001.

I’m not playing Chicken Little here; the industry isn’t going to crash.  In fact, it’s likely that by 2012 it’s going to prosper, because any time demand overwhelms the insight of the sellers, there’s going to be a new crop of leaders created.  Incumbents in all areas of tech have gotten too comfortable with old paradigms, and new players are the ones agile enough to seize the opportunities.  Those “new players” aren’t likely to be startups, VCs having fled the equipment space to social networking and other areas with more potential for bubble-creation economics.  Instead they’ll be smaller vendors, often public companies.  Watch F5 and some of the deep-packet-inspection companies; they are looking to skim the networking cream.  In IT, watch Oracle; software has the most direct link to productivity and so software companies can transform to build new cost/benefit paradigms most easily.

Bad Numbers Mean Bad Decisions

Anyone who’s followed my writing knows that I’m no fan of the National Broadband Plan.  My main issue is with the data that’s been presented to back that plan, and some recent work I’ve been doing is making me even more skeptical—if that’s possible.

What started me off was a comment by a White House science type.  He said that he was sure that there were billions to be gained in productivity and jobs if broadband were more available, though he admitted he didn’t know exactly how those benefits were calculated or realized.  OK, I said, let’s then take a look at broadband versus economics and see if there’s a correlation.  The FCC has data that shows, by zipcode, where there’s a lot of broadband providers available.  Other agencies provide household income data, also by zipcode.  Suppose we correlated the two?

If broadband availability is in fact an economic benefit, we should see some correlation between the number of providers and the household income of consumers.  We do, but it’s the wrong kind.  The data shows that the correlation is overwhelmingly in the reverse.  The areas with the most broadband providers available are the areas with the lowest household income.

Let me illustrate with a random example from my own area.  Take two suburban communities in southern NJ as an example.  One, which is a kind of middle-upper community, has 11 providers according to the FCC.  The second, which is arguably the richest community in the area, has only 10.  Grab a random residential zipcode from across the river in Philadelphia, where the household income is a quarter that of the first community and a sixth that of the second, and you find they have 12 providers!
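
The cross-check I ran can be sketched in a few lines; the zipcodes and figures below are illustrative stand-ins (the real provider counts come from the FCC data and the income figures from other agencies), but the method is exactly this: join the two tables on zipcode and compute a correlation.

```python
# Sketch of the provider-count vs. household-income cross-check.
# The zipcodes and numbers are hypothetical placeholders, not the
# actual FCC or income data.

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# zipcode -> (broadband providers, median household income $), illustrative
sample = {
    "08000": (11, 110_000),   # middle-upper NJ suburb
    "08001": (10, 150_000),   # arguably the richest community in the area
    "19100": (12,  28_000),   # dense urban Philadelphia zipcode
}
providers = [p for p, _ in sample.values()]
income = [i for _, i in sample.values()]

r = pearson(providers, income)
print(f"correlation(providers, income) = {r:.2f}")  # negative for this sample
```

On this toy sample the coefficient comes out strongly negative, which is the “wrong kind” of correlation described above: more providers where incomes are lowest.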

Now I’m not saying broadband is making people earn less, though in fact that’s a more supportable view given the data than the contrary assertion that it would help them earn more.  I’m not even saying that the urban poor have generally better broadband, the FCC’s rhetoric notwithstanding.  What I’m saying is that even a simple review of the data we’ve collected shows that it doesn’t support the popular views of the role and value of broadband Internet, or the views the FCC is presenting in its National Broadband Plan.

The data also seems to suggest that geographic factors like population density are by far the most significant forces in determining where broadband competition will develop.  Even in very poor zipcodes we see a lot of providers—more than in most of the richer ones.  Why would operators focus their efforts on places where household income is the lowest, if not because those places have population densities that overcome even four-to-six-to-one income disparities?   That proves our long-standing point that demand density means everything.

We’re also concerned that the FCC’s data includes non-facility providers of broadband, which in our view distorts the picture considerably.  The only way to get broadband to the user is to deploy infrastructure.  Riding as a wholesaler on someone else’s doesn’t create new options, only “new” providers.  In fact, there’s every reason to believe that multiplication of wholesale players might erode margins and further limit investment.  It certainly distorts the figures, and most people where I live couldn’t name more than two wireline and four wireless providers, which totals 6 and leaves at least four or five unaccounted-for.  Who are these providers, one must wonder?

The biggest problem here is the lack of clarity of data, or the reliance on incomplete or just bad data—it’s hard to say which.  The FCC appears to have gathered a lot of information through third parties, and also appears to have muddled its own data collection.  As I noted, it’s hard to say whether this was ineptitude or deliberate.  What’s easy to say is that bad policies are inevitable if bad data fuels them.

A “European Approach” for Us All?

Speaking yesterday at BBWF, Alcatel-Lucent’s CMO Stephen Carter talked about the need for creating a “European approach” to 4G broadband.  Some of the specific points in the talk weren’t new; we need to move beyond all-you-can-eat pricing, we need to add some specific partnership and settlement processes, and we need to recognize the intrinsic differences in the major markets.  What is interesting to me is that all of this is coming to a head right now.  Why that is might be the most interesting thing of all.

The reason is mobile 4G services, and the fact that these services are being driven by smartphones and tablets and even e-readers—appliances.  Mobile disintermediation via appliances is a real risk, and 4G bandwidth levels mean that there is truly an opportunity to create a new model of the user’s relationship to the network.  The risk that this new model might end up as a reprise of the OTT-dominated wireline broadband market is very real now.  Further, 4G deployment offers operators a chance to reset the pricing and service relationships—to a point.  Operators either have to take the opportunity and level-set 4G differently, or they have to avoid 4G investment as something unlikely to pay off for them in ROI terms.

Which of course is Alcatel-Lucent’s issue here.  Arguably companies like Alcatel-Lucent have been most successful in the wireless area, and an operator trend toward stagnation of wireless investment would be a major barrier to Alcatel-Lucent’s future profitability.  But the truth is that they aren’t the only one with a bet in the 4G game.  With the exception of Cisco, whose ambitions for revenue growth are spreading to markets adjacent to networking, every one of the major network vendors is a slave to wireless capex growth because wireline growth is not going to even sustain their current numbers.

What is clear to me is that everyone in the broadband game realizes that 4G is the watershed issue, the place where we either get control of network evolution in an economic sense or admit we can never control it.  In the latter case, it’s clear that we’ll see sharp capex declines beginning (according to our model) in 2012 as ROI pressure on operators constrains network investment.  In the former case, we could see the very thing Carter says we need—immersive broadband that touches all of us in all aspects of our lives, because it can profitably be made to do so.  It’s not a glamorous vision for the US market because we want to believe everything’s free.  It’s not simplistic like Cisco’s vision of driving infrastructure investment simply by forcing more traffic onto the network regardless of the ROI.  But it’s a true vision, and Alcatel-Lucent is perhaps best in all the industry in articulating it.

But can they deliver it?  The principles of Application Enablement are surely relevant to creating what Carter hopes for, but they’re not a sufficient condition as they stand.  There are too many holes in the story of the “European way” when the rubber meets the road.  In a way, potholes are a bigger threat to ROI than the current disorder, because without a clear path to investment everyone will hunker down and look ahead to when that path becomes clear.  That could hurt capex even earlier.  Four vendors (Alcatel-Lucent, Ericsson, Juniper, and NSN) have the assets to build the kind of future Carter talks about, and not just for Europe.  Which one will come through?  We’ll likely know by spring.  Carter’s speech is proof that the issue is too acute to be ignored any longer.

Ecosystemic Security

Juniper announced a mobile security suite, building on its Junos Pulse agent/client software that operates across a wide variety of mobile and PC platforms.  The elements of the suite (the anti-virus, firewall, etc. that are common to most PC suites) are less news than the framework in which it’s being provided.  What Juniper is doing is binding security as an element in a device agent, then coordinating it through central management of that agent so that it’s effectively a part of a collective network- or organization-wide security program.

The newest problem both enterprises and operators are facing these days arises from the fact that a single user is spread across multiple appliances, and increasingly uses those appliances as facets of a virtual personality.  That’s true of social-driven consumers but also increasingly of productivity-driven enterprises.  Point-solution security not only fails to secure the range of devices, it forces those who want security to integrate disparate policies and processes to create a secure framework, and one miss not only destroys collective security but also risks cross-contamination of the other channels to the user.

I like the Juniper approach here not because of its capabilities or because of the need that Juniper-sponsored research was targeted at validating; we have security on devices, and we’ll have it on all eventually, and the problems of device security are hardly a surprise even without new research.  What I like is that Junos Pulse extends “the network” to the device itself and makes it an agent of network policy and services.  That seems the only long-term solution to both security issues and to creating service value-add.  Plus, the multiple device faces of the user are going to pop up in a lot of future service missions, and they will be problematic to those without a device-integrated approach.

It’s hard to pull this story out of the Juniper talk, in part because it’s focused so much on security needs and the point-solution remedy.  The real story is the ecosystem.

Is Ozzie Right?

Microsoft tech guru Ray Ozzie is leaving Microsoft, and in the wake of the announcement a memo from Ozzie was leaked to the media.  In the memo, Ozzie asks Microsoft to confront an age without PCs, an age where traditional Microsoft PC incumbency would thus be meaningless. 

What Ozzie is looking at is whether appliances like smartphones and tablets, combined with cloud-hosted services, could change the appetite of the public for personal computing.  I think that the answer is already known, but it’s ambiguous.

The question is whether cloud services can absorb all the functionality of local applications.  In theory?  Sure.  In practice, the problem is that of willingness to pay and profit.  If the total market for computing and applications among consumers is seen as being ad-sponsored, we’ve collapsed a multi-billion-dollar industry into something that’s likely a tenth its current size, simply because you can’t expect ads to sponsor all of content, all of software, and all of everything else when the world’s ad spend is only about $680 billion and isn’t even growing as fast as world GDP.  Thus, we’d have to expect that the consumer paid in some direct way for the incremental application services.  So whether that direct payment was less than the cost of central hosting of the applications becomes the question.

To answer it, we say that central IT resources are always cheaper—economy of scale, after all.  But the Erlang curve shows that economies of scale taper off at volume, meaning that there’s a point where no further economy can be gained.  And you still need a screen, a keyboard (even if it’s virtual and on-screen), a CPU chip, and memory to create a network appliance.  The cost of making that into a computer isn’t incrementally enormous.  I can buy a netbook for three hundred bucks and get free or cheap software for writing, calculating, photo-editing, and more.  Sure, I have to sustain the software, update it and secure it, etc.  But most of the threats to security come from the Internet, so don’t I have to secure my appliance anyway?
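
The taper in the Erlang curve is easy to see with the classic Erlang-B blocking formula; this sketch (with an illustrative 1% blocking target and a coarse search step, both assumptions of mine) shows per-server carrying capacity rising as the pool grows, but with the marginal gain shrinking:

```python
# Erlang-B illustration of diminishing economies of scale: the load a
# resource pool can carry at a fixed blocking target grows with pool
# size, but the per-unit gain flattens out.  Blocking target and search
# step are illustrative choices.

def erlang_b(servers, offered_load):
    """Blocking probability for `offered_load` erlangs on `servers` circuits,
    via the standard recursive form of the Erlang-B formula."""
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

def capacity_at(servers, blocking=0.01, step=0.5):
    """Approximate max offered load (erlangs) carried at the blocking target."""
    load = 0.0
    while erlang_b(servers, load + step) <= blocking:
        load += step
    return load

caps = {}
for n in (10, 100, 1000):
    caps[n] = capacity_at(n)
    print(f"{n:5d} servers: {caps[n]:7.1f} erlangs at 1% blocking "
          f"({caps[n] / n:.2f} per server)")
```

Per-server utilization climbs steeply from a small pool to a medium one, then barely improves from a medium pool to a huge one, which is exactly the point: past a certain scale, centralizing further buys very little.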

My point is that Microsoft is as much at risk for over-reacting to the future as it is to under-shooting it.  Its biggest problem is the same one it had before all the Internet appliance stuff hit the market—once everyone who needs a PC has one, what’s your future strategy for growing revenue?  Microsoft needs to capture the incremental revenue from the appliance-and-cloud craze, not to substitute that revenue for its current revenue stream.  If it does the latter, it dies pure and simple.

Revolutionary stuff is interesting, and in this mindless media age the only thing that matters is “interesting”.  Truth won’t create click-throughs.  But truth is what creates markets.