The X Factor: Economic Recession Is the IT Innovator’s Ally
August 27, 2007 Timothy Prickett Morgan
With the private equity markets drying up and the stock markets suffering from a bad case of jitters thanks to a financial crisis brought about by the writing of heaven knows how many bad home mortgages in the world, it is natural enough for everyone in the working world, as well as those who are retired, to be worried about economic recession. But not everyone will be worried. IT vendors looking to take on entrenched incumbents will, if history is any guide, do pretty well if a recession results from these financial shenanigans.
While it is difficult to establish a direct correlation between the viability of various national economies and the overall global economy and trends in data processing and information technology over the past century, it takes neither an economist nor a historian to figure out that data processing and information technology have been deployed to reduce human labor costs, to provide new kinds of products and services, and to deliver existing ones at lower cost. What started out as an exotic punch card loom for tabulating the U.S. census turned into a means of doing accounting and managing the production and distribution of products. And because of the benefits of data storage and data processing that punch cards and then electronic computers offered, the growth in the use of these technologies in the first half of the 20th century was more or less unaffected by what was going on in the economies at large, except during the Great Depression, when everything was in the tank.
But since the mainframe era began in earnest in the mid-1960s, a falling economy and phase changes in the style of computing in what came to be known as the data center have become coupled. While IT managers are always under pressure from business owners in private companies or C-class executives in public ones to cut costs, pressure turns to mandate when the economy heads south. Each national economy has a slightly different curve, and has absorbed IT in different ways at different times, but the history of the economy and IT in the United States illustrates the point.
The U.S., like every other country from 1929 through the end of World War II, was in depression. The technical definition of a recession is two consecutive quarters of negative growth in gross domestic product. A depression means that growth plummets for years, and is accompanied by widespread unemployment, wage declines, deflation, a downturn in industrial production (which further depresses wages), a tightening of credit (because people have less spending power), and the consequent desire by companies and individuals to stay away from credit because of the overall uncertainty of the economy. All of these and other factors are connected to each other, like cell biology, and do not affect each other linearly. They also create negative feedback loops that are hard to stop. A war often does the trick because it uses debt to boost production, thus reversing the depression, and a cynical person might say that governments engage in wars to goose their economies–whether they do so intentionally or not.
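That two-quarter rule of thumb is mechanical enough to express in a few lines of code. The sketch below is purely illustrative–the function name is made up and the quarterly growth figures are invented, not real GDP data:

```python
def in_recession(qoq_growth):
    """Return True if any two consecutive quarters show negative
    quarter-over-quarter GDP growth (the rule of thumb cited above)."""
    return any(a < 0 and b < 0
               for a, b in zip(qoq_growth, qoq_growth[1:]))

# Invented quarterly growth figures (percent), not actual data:
print(in_recession([0.8, 1.1, -0.3, 0.5, 0.9]))   # one bad quarter: False
print(in_recession([0.8, -0.4, -0.2, 0.5, 0.9]))  # two in a row: True
```

Economists (and the NBER, which dates U.S. business cycles) use broader criteria than this, but the two-quarter test is the common shorthand.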
In any event, the U.S. and its Allies won World War II, and being the only major economy that was not sacked by the war, the U.S. was able to reap the double benefits of funding the recoveries in Europe and Asia and of selling many of the products and services that were involved in rebuilding them. The U.S. economy surged from 1947 to 1968, with some brief periods of recession that were fixed by the Korean War in the 1950s and actions by the government (increased military spending and taxes) in the early 1960s, but then the expansion stopped.
At this time, the mainframe revolution began in earnest, as companies sought to replace legions of clerks and mountains of paper with central computers, banks of tape and disk drives, and application software that allowed a relatively small number of employees to manage a much larger amount of data.
The availability of inexpensive energy is perhaps the most critical aspect of any modern, industrialized economy. The oil crisis of 1973 was caused less by the Arab oil embargo and more by the wars in the Middle East and the belated realization that the United States had hit peak oil production around 1971 (when the U.S. also pulled out of the Bretton Woods agreement that pegged the value of the dollar to gold, causing the dollar to plummet). So, allegedly because of the Middle East crisis but more because the U.S. was producing less and less of its own oil, global oil prices quadrupled in 1974 to $12 per barrel and the stagflation-ridden U.S. economy remained in recession. By the time the Western economies were in a full tailspin in late 1975, Gene Amdahl, one of the key designers of the System/360 mainframe at IBM, was preparing to launch the world’s first IBM-compatible mainframe clones. By the mid-1970s, not only were more and more companies seeking to computerize their business operations, but they were also able to grind multiple vendors of compatible gear against each other. The right data processing machine was in the right place at the right time.
The economy in the U.S. dragged on and eventually hit another wall in 1979 when the Iranian Revolution got underway and produced another massive oil shock and eventually another recession. IBM was not only fighting off clone mainframes by then, but also a proliferation of minicomputers–including its own System/3X machines but also including Digital Equipment VAXes, Hewlett-Packard 3000s, and many other systems that offered mainframe-class computing for a lot less money. Mainframes were surrounded by other gear, and workloads started to move.
The roaring economies of the late 1980s ran out of gas in the fall of 1987 and hit a wall in 1988, and it may be a coincidence that this was also when the next round of minicomputers was coming out from IBM, Digital, HP, and others. New vendors, such as Sun Microsystems, came into the server space for the first time in the mid-1980s with new RISC architectures, and were poised to benefit from an economic shock that was on the horizon (although no one was predicting it at the time). Anyway, many of these new midrange machines were running Unix, not just proprietary operating systems, and were now powerful enough to do a lot of mainframe-sized workloads. The Unix-based open systems war had started, as had a second round of plug-compatible mainframe innovation. Even more workloads shifted off big iron to little iron, and from proprietary minicomputers to other architectures. Also, file serving and related applications running on NetWare, Unix, and other platforms started being deployed in commercial settings. It was normal for end users to work on networks by the end of the 1980s, and this was a whole new opportunity, driven in part by new uses for computers but also by the desire to do more with less because of the struggling economy.
It was no surprise, then, in 1991, when the U.S. went into another recession (helped in no small part by the savings and loan scandal, which smells a little too much like the current mortgage crisis), that IBM’s mainframe business was on the rocks and Unix servers were on their way to utterly dominating the server market for the next decade.
By the time the economy had improved in 1993–in part because of ridiculously low oil prices–PCs were powerful enough to be useful to business, and the client/server revolution got underway. Companies started figuring out how to make PCs take work off their central machines, since PCs are a lot less expensive per unit of processing power than a minicomputer and ridiculously less expensive than a mainframe. As the economies of the world roared in the late 1990s, Unix grew, and Windows and Linux burst onto the scene to challenge it for hegemony in the server space. Unix was holding its own until the economy hit the wall in early 2000 and was hammered by the 9/11 terrorist attacks 18 months later. By then, the first 64-bit Opteron processors were on the horizon, 32-bit Xeon processors were very powerful and very inexpensive, and everyone was looking around, trying to figure out how to spend a lot, lot less for servers. Unix took a severe beating, proprietary midrange servers had already taken it on the chin for the past decade, and Windows and Linux on X86 and then X64 servers were clearly ascendant.
And here we sit in 2007. The irrational exuberance of the dot-com boom was replaced by manic investment in real estate, driven by skyrocketing prices, very poor screening on some mortgage applications, and private equity and hedge funds that were more than willing to back such investments indirectly as a means of raising capital for acquisitions–one of the largest corporate buying binges in history. This has propped up the stock markets of the world and has kept the economies humming, which is a good thing, until the bill arrives.
If you believe that we are right now at or near global peak oil production, then we are in for a humongous economic shock. It is hard to say how big, but in January 2007 dollars, oil peaked at over $100 per barrel in December 1979, and the current oil price is hovering around $72 a barrel as I write this, after hitting $78 a barrel at the end of July, when the mortgage nonsense first started dominating the news. We still have a ways to go before oil is as expensive as it was in 1979, which is good. But if a recession starts because inflation jumps, the stock markets crash, oil prices spike because of conflicts in the Middle East, or more hurricanes hit the Gulf of Mexico, then we can probably add a phase change in IT to our list of predictions for the coming years.
Such a recession may never materialize, and therefore may not accelerate some of the IT phase changes that appear to be underway. Sometimes, appearances can be deceiving. A decade ago, it looked like application service providers, or ASPs, would become a dominant way of consuming software, but only now is software as a service, or SaaS, becoming a noticeable part of the overall software market. No matter what happens, the desire to virtualize servers and storage and move toward data centers that are dynamic utilities remains–perhaps mixing applications inside the company with services and applications residing outside the company into one big mashup that costs less and does more than today’s infrastructure. Such a mad dash to virtualize could breathe some life into big iron boxes, which have very sophisticated virtualization capabilities compared to the X86 and X64 iron that has sprawled all over data centers in the past decade. Or, it could kick X64 iron into high gear on the virtualization front to capitalize on the demand for cheaper platforms that a recession would bring.
It will be interesting to see what happens to IT should a recession materialize in the global economy. But just the same, I think it might be better for us all to keep our jobs and move into virtualization, SaaS, and utility computing at a slower pace. Let’s hope the oil and stock markets cooperate.