The IT Revolution May Be Over, But It’s Still A Jungle
February 4, 2013 Alex Woodie
A JPMorgan Chase analyst recently made headlines with a paper noting that the price of computer equipment and software is decreasing at the slowest rate in a generation, which he interprets as a sign of a notable downturn in IT industry innovation. With the heyday of rapid IT innovation and easy productivity gains behind us, the question now becomes: What drivers will emerge to shape the $4 trillion information economy?
JPMorgan’s chief US economist Michael Feroli made interesting observations in his two-page paper US: is IT over?, which starts on page 13 of JPMorgan Chase’s recent Global Data Watch economic research report (a hefty 1.7 MB PDF, so beware those stuck on dial-up).
Feroli observes that the annual decreases in computer, IT equipment, and software prices that we have grown to expect from year to year have dramatically tapered off in recent years. In 1998, we were seeing 25 percent annual decreases in the price of a computer with a given capability (standardized using advanced “hedonic” adjustment methods). This phenomenon has led to the ubiquitous $1,000 PC gaining considerable new capabilities, year over year, for decades.
By the year 2010, however, standardized computer prices were dropping only 5 percent annually. Similarly, whereas we were seeing 15 percent drops in the cost of a given piece of information processing equipment or a piece of software in 1998, we are seeing only 3 percent annual drops in equivalent products today.
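To see what these rates compound to, here is a minimal Python sketch. The 25 percent and 5 percent annual decline rates come from the figures above; the $1,000 starting price and ten-year horizon are illustrative assumptions, not Feroli's numbers:

```python
def price_after(years, annual_decline, start_price=1000.0):
    """Price of a fixed-capability computer after compounding an
    annual percentage decline (quality held constant, hedonic-style)."""
    return start_price * (1 - annual_decline) ** years

# At the late-1990s rate of 25% per year, a $1,000-capability machine
# costs about $56 a decade later; at the 2010 rate of 5%, about $599.
print(round(price_after(10, 0.25)))  # 56
print(round(price_after(10, 0.05)))  # 599
```

The gap between those two endpoints is the whole story: the same fixed capability gets roughly ten times cheaper under the old decline rate than under the new one.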
“…[M]ore rapid increases in the capability of computing equipment should imply more rapid declines in the price of a computer with a given, fixed capability,” Feroli writes. “The slower decline in information technology (IT) equipment prices indicates a slower increase in the level of technology incorporated in that equipment.”
Some of the slowdown in price/performance improvement that Feroli observed can be attributed to the recent flattening of Moore's Law, which states that the density of transistors on a chip doubles approximately every 18 months. Moore's Law has been an invisible, guiding hand to chipmakers and, therefore, to computer makers in the decades since Intel co-founder Gordon Moore postulated his premise way back in 1965.
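Moore's observation reduces to a simple doubling rule. A sketch, assuming the 18-month doubling period stated above (the function name and normalized base density are illustrative):

```python
def transistor_density(years, base_density=1.0, doubling_months=18):
    """Relative transistor density after `years`, doubling every
    `doubling_months` months (Moore's Law as commonly stated)."""
    return base_density * 2 ** (years * 12 / doubling_months)

# Over six years (four 18-month periods), density grows 2**4 = 16x.
print(transistor_density(6))  # 16.0
```

When that exponent stops climbing on schedule, the hedonic price declines Feroli tracks slow down with it.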
Intel and IBM have invested heavily to shrink their processors over the years, but we are entering an era of diminishing returns as chip designs are starting to reach the physical limits of space and heat dissipation. The re-introduction of liquid cooling into high-end mainframes and supercomputers and the focus on massive parallelization in so-called "industry standard" x86 servers are two ways that system makers are pushing the computing envelope in lieu of denser processors with higher clock speeds.
To be sure, there are other factors contributing to the price that a given piece of IT equipment will command in the marketplace besides the number of transistors one can see with a microscope, including macroeconomic conditions. From 1997 to 2000, the dot-com boom, the Y2K bug, and the ERP wave spurred a massive buying frenzy for data center gear, with 25 percent year-over-year increases in IT spending. According to Feroli, innovation was high at this point, as evidenced by the large year-over-year drop in computer prices.
After the recession started in early 2001, demand for IT gear fell precipitously, and total IT spending went flat. Ordinarily, one would think that lower demand for IT gear would lead to lower prices, and of course that is true to an extent. But according to Feroli’s data, the price of computers began to fall more slowly after the recession started. And that rate of decrease has slowed ever since, except for about a two-year period from 2005 to 2006 when the housing bubble was fueling the economy (and leading to an up-tick in IT spending).
The numbers don't lie. The rate of increase in corporate IT spending is well below the historical norm for 1980 to 2005. At the same time, the price/performance curve has flattened, which means less year-to-year change in hardware and software. It is interesting to note that this phenomenon (the drop in IT spending, the slower decline in computer prices, and the drop in innovation) also corresponds with a slowdown in worker productivity. After many years of annual gains in the productivity of the American worker, productivity growth stalled last year.
On the plus side, Feroli notes that, while innovation and productivity may be down at the moment, “A temporarily slower patch of growth on the supply side will allow the economy to more quickly absorb and re-employ the vast number of underutilized workers.”
So where does that leave us? Does it mean we’re entering a new era of IT doldrums, perhaps a “lost decade” before a breakthrough can be made in some newfound computing method, such as use of carbon nanotubes or DNA strands?
That seems unlikely, considering the overall health of the IT industry, which now accounts for $4 trillion in spending globally (including communications). Feroli, the former Fed economist who also noted last year that the new iPhone 5 could boost US GDP by a half percent all by itself, left some maneuvering room to come up with another theory for how things will shape up in the future.
“What might the above argument be missing?” Feroli writes. “First, it is always possible that in spite of their efforts, the statistical agencies may be missing some aspect of quality improvement in IT equipment. Second, there could be, and likely are, lags between technological advance and productivity gains, so that there is still a vast amount of untapped potential in the computing advances that have already occurred. Both of these arguments have solid merit, but without further evidence they remain theoretical objections.”
Feroli's right. The old objective measures of price/performance will lead one to misjudge the health and vitality of the IT industry. Yes, that $1,000 PC you bought in 2007 can still run today's desktop software quite capably (thank you, Microsoft, for getting a handle on Windows OS bloat). But the real innovation today is occurring on the Web and in mobile devices. Unit sales of smartphones and tablets have already overtaken those of desktop computers, and will soon overtake them in terms of revenue as well (if they haven't already).
Meanwhile, on the server side, companies have gotten much smarter about utilizing server performance since 2005. Virtualization software is pretty much ubiquitous in the corporate data center, and has had a measurable impact on the number of CPU cycles, CPWs, or MIPS needed to satisfy workloads. And in other parts of the application stack, software is being distributed across cheaper iron, rather than consolidated on big SMP servers, even when it is not virtualized. You might be able to build Google on a cloud of IBM mainframes, but you couldn't afford to be in business if you did.
At the same time, many more companies are ditching their own data centers for the cloud, where managed service providers (MSPs) expertly use virtualization to squeeze much more work out of their servers than any individual company can. In the case of public and private cloud computing, there is definitely valuable innovation occurring, even though it doesn’t present itself with big, gaudy server sale figures for IBM, Hewlett-Packard, Dell, and Oracle.
The fact is that, following the Y2K bomb and the ERP wave, corporations realized they had overbought on computing capacity. After the first recession of the decade ended and spending recovered around 2004, corporations were smarter about building up their server capacity again. While innovation didn't really start to tail off until about 2005, according to Feroli, corporations were by then already wary of buying too much server capacity.
IT’s Still A Jungle
The IT industry has become a massively complex and interwoven beast with many heads. Innovation is alive and well, as evidenced when Apple can single-handedly goose the US economy by launching a handheld computer. At the same time, the ongoing information explosion is forcing companies to rethink how they deal with new sources of data, including users on mobile devices and social networking sites.
Companies will continue to invest in big iron and midrange systems, but the shape and makeup of that iron is changing, and the sales patterns of the future won't look like those of the past. Better compression and smarter handling of data in "big data" appliances, such as those from Oracle and IBM, may eat into some storage array sales. SAP's use of a single database, HANA, to run both production and analytic workloads will eliminate the need for separate servers and all the accoutrements they entail. IBM says it can similarly eliminate server sprawl with its "datacenter in a box," which it calls PureSystems. All of these are examples of how the IT industry is simultaneously innovating and keeping costs down for customers.
It may be tough to see it right now, but we are about to enter a period of rapid innovation. Call it punctuated equilibrium, if you will, as paleontologists Niles Eldredge and Stephen Jay Gould did when talking about life on earth. We have been sipping at the water fountain of data, but lately it has turned into a fire hose. We have not yet figured out how to make sense of all the data we are generating with all the incredibly powerful tools at our disposal, so it's natural for us to pull back a bit and regroup. And while better, faster computer gear will help us squeeze useful information from the fire hose of data, the biggest breakthroughs will come from better, smarter software. That is where the real innovations will occur, but they will largely be qualitative, subjective improvements, rather than quantitative, objective ones.