I would like to think there is such a thing as a technology half life: the amount of time required for the installed base of a given technology to decay to half. I would also imagine that with the emergence of cloud computing, the technology half life of nearly every technology is getting a lot shorter.
Definition of Technology Half Life
To help define technology half life, I thought it would be easiest to begin with other, more commonly understood examples of half life.
Radioactive decay is often discussed in simple terms of its half life: how long a particular substance, such as nuclear waste, remains radioactive. A radioactive substance with a half life of 100 years will decay to half its radioactivity in 100 years, then half again in another 100 years, and so on.
Another example can be found in pharmaceuticals with the biological half life: the time it takes for the activity of a substance, like a drug, to fall to half its effectiveness.
Technology half life doesn’t follow a linear decay. Like the other forms of half life, it follows an extinction timeline, which doesn’t mean something disappears entirely. Instead it means the technology has ceased to be relevant.
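All of the half-life examples above share the same decay curve. A minimal sketch in Python (the function name is mine, just for illustration):

```python
def remaining_fraction(elapsed_years, half_life_years):
    """Fraction of an installed base still in place after elapsed_years,
    under simple exponential (half-life) decay."""
    return 0.5 ** (elapsed_years / half_life_years)

# The nuclear-waste example above: a 100-year half life.
print(remaining_fraction(100, 100))  # 0.5
print(remaining_fraction(200, 100))  # 0.25
```

The same formula applies whether the decaying quantity is radioactivity, drug activity, or an installed base of servers; only the half life changes.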
The End of the Mainframe
A few days ago NASA CIO, Linda Cureton, posted on her blog The End of the Mainframe Era at NASA. Reading Cureton’s post and the many heartfelt comments of readers stirs many nostalgic thoughts for me.
It also had me puzzled as to how NASA is handling its computational and processing needs now that the last IBM Z9 is going out the door.
I also wondered if the predictions of the end of the mainframe were now finally coming true. After all, the mainframe’s demise has been predicted ever since the emergence of client-server computing.
In attempting to determine if the installed base of mainframes was in decline or not, I was surprised to find very little data available. And what data I did find contained a mixed message. It appears the number of mainframe footprints in 2002 was thought to be in excess of 38,000 but by 2007 was reported to be only 10,000 footprints.
If client-server computing did in fact lead to the demise of the mainframe, and the numbers I cite here are even close to accurate, it took a lot longer than anyone had predicted. It also means the half life of the mainframe’s market share may no longer be measured in decades as before, but instead in only 5 or 6 years.
But in terms of the technology half life being a measure of relevancy, consider that the number of installed MIPS surged in 2010, and the platform has evolved, leading IBM to overtake HP as the largest server manufacturer in 2011. In terms of market share and the broader trend, however, relevancy is another matter.
Cloud Impact on Technology Half Life
Now I wonder if the emergence of cloud computing solutions will similarly bring about the decline of the enterprise client-server computing model. If so, what will be the rate of decay for the current installed base before its relevancy limit is reached?
Will cloud adoption rates represent mostly new applications, growing by absorbing the expansion of enterprise needs? Or will cloud adoption rates reflect a replacement strategy, where cloud substitutes for the traditional lifecycle refresh?
If cloud adoption is driven by expansion, it will be a very long time before it holds a dominant share of the installed “server” market. But if cloud adoption becomes a viable option in the lifecycle refresh, we could see an uptake that produces a half life of 8 to 10 years, leaving only a fraction of legacy platforms in the enterprise.
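A back-of-envelope sketch of the refresh-driven scenario. The inputs here are my own assumptions, not data: a 5-year refresh cycle, with roughly a third of each year’s refresh cohort substituting a cloud service instead of new on-premises gear. Under those assumptions the legacy base decays with a half life in the 8-to-10-year range described above:

```python
import math

def half_life_years(refresh_cycle_years, cloud_share_of_refresh):
    """Years for a legacy installed base to halve, assuming a fixed
    fraction of each year's refresh cohort moves to cloud instead of
    being replaced in kind."""
    # Each year, 1/refresh_cycle of the base comes up for refresh;
    # the cloud-bound share of that cohort leaves the legacy base.
    annual_retention = 1 - cloud_share_of_refresh / refresh_cycle_years
    return math.log(0.5) / math.log(annual_retention)

# Hypothetical inputs: 5-year refresh, ~35% of refreshes go to cloud.
print(round(half_life_years(5, 0.35), 1))
```

Dial the cloud share up toward 100% of refreshes and the half life collapses toward 3 years; dial it down and the legacy base lingers for decades. The point is only that the substitution rate at refresh, not the refresh cycle itself, dominates the outcome.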
But the real impact occurs once you are in the cloud. That’s because the cost of conversion is less of a barrier, and by the nature of a cloud service (self service, subscription based), switching from one solution to another can occur at any time with no stranded costs.
The cloud impact on technology half life isn’t limited to the data center. Because mobile devices have grown hand in hand with cloud services, the desktop’s technology half life has already begun to shorten.
The desktop’s technology half life will then be replaced by that of BYOD devices, which currently reflect consumer refresh rates of only 1 to 2 years. This also means a relevancy half life can be triggered simply by the release of a new iPad.
I want to be clear: there is no science at work here, only a perception that the rate of change is being accelerated by the forces of cloud services.
For colleges and universities, which have historically struggled to fund lifecycle refresh rates of 4 to 5 years, this will present an entirely new set of problems if their financial planning models don’t adjust for it today.
And for IT strategic planning, how will you factor for the extinction of a given technology in a world that is accelerating?
PS – If you want to take a quick trip down memory lane of the IBM mainframe history I recommend scrolling through IBM Mainframes – 45+ Years of Evolution.