Cloud Impact on the Technology Half Life

I would like to think there is such a thing as a technology half life. That would be the amount of time required for the installed base of a given technology to decay to half. I would also imagine that, with the emergence of cloud computing, the technology half life of nearly every technology is getting a lot shorter.

Definition of Technology Half Life

To help define technology half life, I thought it would be easiest to begin with other, more commonly understood examples of half life.

Radioactive decay is often discussed in terms of its half life, which describes how long a particular substance, such as nuclear waste, remains radioactive. A radioactive substance with a half life of 100 years will decay to half its radioactivity in 100 years, then to half of that in another 100 years, and so on.

Another example can be found in pharmaceuticals with the biological half life, which is the time it takes for the activity of a substance, like a drug, to lose half its effectiveness.

Technology half life doesn’t follow a linear decay either. Like those other forms of half life, it is based on an extinction timeline, which doesn’t mean something disappears entirely. Instead it means it has ceased to be relevant.
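As a quick sketch of the math (the function name and numbers here are mine, purely for illustration), half life decay means the amount remaining after a given time is the starting amount times 0.5 raised to the elapsed time divided by the half life:

```python
# Half life decay: the remaining amount halves every half_life_years.
def remaining(initial, half_life_years, years):
    """Amount (radioactivity, drug activity, installed base) left after `years`."""
    return initial * 0.5 ** (years / half_life_years)

# The radioactive example above: a substance with a 100-year half life.
print(remaining(1000.0, 100, 100))  # half left after one half life -> 500.0
print(remaining(1000.0, 100, 200))  # a quarter left after two -> 250.0
```

The same formula applies whether the quantity is radioactivity, drug activity, or an installed base of technology; only the half life changes.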

The End of the Mainframe

A few days ago NASA CIO Linda Cureton posted on her blog The End of the Mainframe Era at NASA. Reading Cureton’s post and the many heartfelt comments from readers stirred many nostalgic thoughts for me.

It also had me puzzled as to how NASA is handling its computational and processing needs now that the last IBM z9 is going out the door.

I also wondered if the predictions of the end of the mainframe were now finally coming true. After all, the mainframe’s demise had been predicted ever since the emergence of client-server computing.

In attempting to determine whether the installed base of mainframes was in decline, I was surprised to find very little data available, and what data I did find carried a mixed message. The number of mainframe footprints in 2002 was thought to be in excess of 38,000, yet by 2007 it was reported to be only 10,000.

If client-server computing did in fact lead to the demise of the mainframe, and the numbers I cite here are even close to accurate, it took a lot longer than anyone had predicted. It also means the half life of the mainframe’s market share decay may no longer be measured in decades as before, but instead in only 5 or 6 years.

But in terms of the technology half life being a measure of relevancy, consider that the number of installed MIPS surged in 2010 and the platform has continued to evolve, leading IBM to overtake HP as the largest server manufacturer in 2011. In terms of market share and the broader trend, though, relevancy is another matter.

Cloud Impact on Technology Half Life

Now I wonder if the emergence of cloud computing solutions will similarly bring about the decline of the enterprise client-server computing model. If so, what will be the rate of decay for the current installed base before its relevancy limit is reached?

Will cloud adoption rates represent mostly new applications, growing by absorbing the expansion of enterprise needs? Or will cloud adoption reflect a replacement strategy, where cloud substitutes for the traditional lifecycle refresh?

If cloud adoption is driven by expansion, it will be a very long time before the cloud holds a dominant share of the installed “server” market. But if cloud adoption becomes a viable option in the lifecycle refresh, we could see an uptake that produces a half life of 8 to 10 years, leaving only a fraction of legacy platforms in the enterprise.
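To put rough numbers on that scenario (the 8-to-10-year half life is the guess from the paragraph above, not data), here is a short sketch of what would be left of today’s legacy platforms:

```python
# Fraction of the legacy installed base left after `years`, given a half life.
def fraction_remaining(years, half_life_years):
    return 0.5 ** (years / half_life_years)

# The hypothetical 8-to-10-year half life from the scenario above, 20 years out.
for half_life in (8, 10):
    left = fraction_remaining(20, half_life)
    print(f"half life of {half_life} years: {left:.0%} of legacy platforms left after 20 years")
```

Even under the slower 10-year half life, only a quarter of today’s legacy footprint would remain after two decades.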

But the real impact occurs once you are in the cloud. The cost of conversion is less of a barrier there, and by the nature of a cloud service (self-service, subscription-based), switching from one solution to another can occur at any time with no stranded costs.

The cloud impact on technology half life isn’t limited to the data center. Because mobile devices have grown hand in hand with cloud services, the desktop’s technology half life has already begun to shorten.

The desktop’s technology half life will then be replaced by that of BYOD devices, which currently reflect consumer refresh rates of only 1 to 2 years. This also means a relevancy half life can be triggered by something as simple as the release of a new iPad.

I want to be clear: there is no science at work here, only a perception that the rate of change is being accelerated by the forces of cloud services.

For colleges and universities that have historically struggled to fund lifecycle refresh rates of 4 to 5 years, this will present an entirely new set of problems if their financial planning models don’t adjust for it today.

And for IT strategic planning, how will you factor for the extinction of a given technology in a world that is accelerating?

PS – If you want to take a quick trip down memory lane of the IBM mainframe history I recommend scrolling through IBM Mainframes – 45+ Years of Evolution.

This entry was posted in Cloud Computing. Bookmark the permalink.

7 Responses to Cloud Impact on the Technology Half Life

  1. Eric says:

    I was just reading through your article, “Cloud Impact on the Technology Half Life”, and I was wondering: at what point in the half-life of a technology do you think we could consider it no longer relevant? Do you think it follows the Moore’s law rule of thumb of roughly every 18 months?

  2. The Higher Ed CIO says:

    Thanks for your note today. Let me see if I understand your half life point. I think Moore’s law is more about performance, which gets at the gains from normal refresh: today’s server is some amount “x” faster than the one it is replacing. The same idea applies to storage.

    The half-life idea though is more about displacement and obsolescence. Where a new technology that is disruptive causes the established technology to become irrelevant such that the majority of refresh moves to the new technology and the installed base erodes at its half-life.

    Because this can be a slow process, as with the mainframe being displaced by client-server, or JBOD by DASD, NAS, and SAN, another disruptive technology can emerge, like IaaS offering cloud storage or processing, which cuts the half-life down and speeds up the process. So mainframe displacement will speed up, and so will client-server displacement.

    The same progression applies to the dumb terminal, desktop, laptop, thin client, virtual desktop, and now the tablet and mobile device. This is driven more by disruptive changes than by Moore’s Law.

  3. Eric says:

    That is a great point about Moore’s law; it definitely is more about increased performance than obsolescence. I suppose Moore’s law would be an inverse of Jerry’s half-life law (I’m trying to picture it in my head!).

  4. Rufus Jones says:

    This is an interesting bit of reasoning, and I hate to poke at it. But have you considered the mundane possibility that the data you’re basing your analysis on is mistaken?

    A lot of facts we think we know are not actually facts. In 2011, someone tried to analyze Rush Limbaugh’s listener base (he claims 20 million daily listeners) and it turned out that no one can find data to document that figure. It appears to date back to a media kit from the mid-90s quoting “surveys” but not citing any specific ratings. Whether the actual number is higher or lower, the 20 million is almost certainly a SWAG created by an ad salesman two decades ago.

    A 2010 article in the Atlantic Monthly, “Lies, Damned Lies and Medical Science”, presented the work of Dr. John Ioannidis, a Harvard-trained MD who has spent the last 20 years analyzing the methodology used in medical research studies. His data indicates that 80 percent of non-randomized studies turn out to be wrong, as do 25 percent of randomized trials and as much as 10 percent of large randomized trials. He traces the cause to myriad factors: poor understanding of statistical techniques, errors in interpretation, deliberate bias in selection, collection, interpretation, and reporting, or just plain typos.

    This is for medical research, mind you, the research that helps determine what pharmaceuticals are licensed for use, what is prescribed, and what foods and behaviors doctors advise patients to pursue or avoid.

    If this is so (and no one in the field disputes Ioannidis, given the number of studies contradicted every year), I’d hate to think what the error rate is on some of those Gartnerian, self-reported surveys: “If you don’t estimate how many memo pads your enterprise uses every week, we won’t give you the $10 Amazon gift certificate for completing the study.”

    So, employing Occam’s Razor, the most likely explanation of the data you found so surprising is not a paradigm-shifting revelation but bad data. It’s elegant reasoning and it might be quite right, but I would not assume it is. Not long ago, I found a large enterprise still running a program on IMS and ADSO, and I remain convinced mainframes, styrofoam, and cockroaches are the only things that will survive the apocalypse.

  5. The Higher Ed CIO says:

    Poking at ideas is how they improve. I totally see your point, and I thought I had acknowledged as much myself. I did track down what looked like an authoritative market report but couldn’t get my hands on it without shelling out big bucks.

    So let’s do this by simply reasoning from our own perspectives, even if we don’t have definitive data. There are fewer mainframe vendors today. The total number of mainframes has declined, in part due to mainframe improvements but also because people use different technology. The total number of mainframes is not in the millions, and may not be in the hundreds of thousands either, while client-server installations number in the millions. A similar gross-numbers approach can be applied to other technologies.

    But I am most interested in whether you have an alternate perspective on what the data shows.

  6. Art in LA says:

    “But technology half life doesn’t follow a linear decay.”

    Your radioactivity and pharmacokinetics examples are not linear decay either; they are exponential, so the logarithm of the remaining quantity decays linearly over time.

    Interesting observations otherwise. I think the BYOD half-life might be longer, since innovations can be implemented on the cloud side and the BYOD device becomes just a viewer.

  7. The Higher Ed CIO says:

    I think you are agreeing, but it appears my wording could be creating confusion. I thought I was affirming that tech half life is not linear, just like radioactive decay or drugs. So I need to fix my phrasing. It would read better as:

    Technology half life doesn’t follow a linear decay either. Like those other forms of half life, it is based on an extinction timeline, which doesn’t mean something disappears entirely. Instead it means it has ceased to be relevant.

    Thanks for pointing it out.

Comments are closed.