Gartner Releases IT and Business Trends Through 2010
November 8, 2004 Timothy Prickett Morgan
When the year comes to a close, it is the traditional time for pundits and IT analysts to whip out their crystal balls and prognosticate about the future of corporate computing. The analysts at Gartner, which hosted its annual Symposium/ITxpo last week, jumped the gun a bit (it is only early November) and released their report on the technology and business trends that will affect IT operations.
Many of the trends Gartner outlined will already be familiar to you, and we have covered these topics from different angles in our various newsletters in the past year or so, as well. But Gartner always brings something new to the discussion when it looks ahead each year, and the predictions it made last week bring these issues into sharper focus and show how different trends will overlap to create synergistic effects in the corporate world.
In a presentation given by Steve Prentice, chief of research for hardware and systems at Gartner, the company outlined five technology trends that will affect us between now and 2010. And just for kicks, when I talked to Prentice, he gave me a sixth trend that is not in the Gartner reports.
Trend 1: Consumerism of Technology
The relentless pace of Moore’s Law, the observation that computing power doubles roughly every 18 months as semiconductor companies squeeze more and more transistors into tighter spaces and run devices at higher frequencies, has made computing not just cheap but also ubiquitous. Forty years ago, computers were exotic, almost otherworldly things that people treated with awe and respect. Now computing is not just pervasive; it’s almost invisible. Sophisticated computers are getting embedded into just about every kind of product. Prentice says that 45 percent of the semiconductor chips made in the world today end up in a consumer device, and that, by 2013, consumer devices will account for more than half of the chips used in the world, outpacing the commercial use of chips for the first time. Gartner is predicting that a stunning 10 billion processors will ship in 2006 and that there will be an installed base of 200 billion processors running in businesses and homes around the world by 2013. This is a tremendous amount of computing power. And no one will care, at least not the way that we have for the past four decades.
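To put that pace in context, here is a back-of-the-envelope sketch in Python of how an 18-month doubling period compounds over the 2004-to-2010 window of Gartner's predictions; the doubling interval is the popular reading of Moore's Law cited above, and the rest is illustrative arithmetic:

```python
# Compound an 18-month doubling period over the horizon covered by
# Gartner's predictions. The doubling interval comes from the popular
# reading of Moore's Law cited in the article; the rest is arithmetic.

DOUBLING_MONTHS = 18

def growth_factor(years: float) -> float:
    """Relative computing power after `years`, given 18-month doubling."""
    return 2 ** (years * 12 / DOUBLING_MONTHS)

for years in (1.5, 3, 6):
    print(f"after {years:>4} years: {growth_factor(years):.0f}x")
```

Six years of that curve, the span from this report to 2010, works out to a sixteenfold jump in raw computing power, which is why chips cheap enough for game consoles and phones keep crossing thresholds once reserved for data centers.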
The prevalence of computing among consumers–the absolute “so what, no big deal” nature of computing–will create an interesting feedback loop into IT organizations and the products that vendors create for data centers. In the past, governments and their militaries, as well as businesses, consumed the vast majority of IT resources, and their needs and their history of consuming IT resources in a particular way determined the kinds of IT products that were created and sold. Prentice offered one telling milestone: the Sony PlayStation 3 will be rated at 196 gigaflops, and while it is not exactly a fair comparison, an entry-level Cray supercomputer is rated at 204 gigaflops. How long before someone lashes together PlayStations to create a massively parallel supercomputer?
As consumers themselves buy more and more of the computing power that is created each year, this will cause IT vendors to create IT products with features that appeal to consumers and move to lifecycles that are more akin to those in the consumer market. IT will be created and consumed in a manner that is less like a server farm and more like a cell phone, backed up by services and billed according to use, not on license terms. “Ultimately, we are predicting the demise of the data center as we know it,” says Prentice. “Within five to 10 years, most companies will have a hybrid data center, with companies contracting out much of their IT infrastructure as a service.” He concedes, however, that a minority of companies may want to keep their own IT infrastructure for critical applications.
Trend 2: IT Virtualization
With the exception of sophisticated and expensive proprietary midrange and mainframe systems, which have workload managers that allow them to do many things at once, the bulk of IT shops have isolated their application workloads by putting different applications on their own servers. This has been a boon for server, operating system, middleware, and application software vendors, because IT shops had to plan for peak loads and to buy lots of servers and software accordingly. However, their machines run at a fraction of their peak processing capacity most of the time.
Virtualization in server hardware and operating system software will change this, big-time. Prentice backs up something I have been saying to IT analysts (including his peers at Gartner) and server makers for years, which is that the advent of virtualization means server makers will sell fewer boxes, and it is very hard to figure out how this does not adversely affect their revenue streams. If they have been selling lots of boxes that run at 10, 15, or 20 percent of utilization, how can they continue to sell as many boxes when virtualization allows customers to run boxes at 60 or 70 percent of utilization? When you throw in the 30 percent per year improvement in price/performance, I can’t understand how vendors will generate more server revenues, except in the short term as customers dump their existing boxes and buy new virtualization-capable systems. Even Prentice, who is much less pessimistic than I am on this trend, says that companies will probably be able to cut the number of servers they need by 40 percent. That is a big whack.
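The consolidation arithmetic behind this argument can be sketched in a few lines of Python. The utilization figures below are the ones quoted above; the fleet of 100 servers is purely hypothetical:

```python
import math

def consolidated_count(servers: int, avg_util: float, target_util: float) -> int:
    """Servers needed after consolidation, assuming the total workload is
    conserved and simply repacked onto boxes run at target_util."""
    return math.ceil(servers * avg_util / target_util)

# A hypothetical fleet of 100 servers running at 15 percent utilization,
# repacked by virtualization to run at 65 percent utilization.
before = 100
after = consolidated_count(before, 0.15, 0.65)
print(f"{before} servers -> {after} servers "
      f"({100 * (before - after) / before:.0f}% fewer boxes)")
```

On those numbers, the cut is closer to 75 percent than to Prentice's 40 percent; the difference, presumably, is the headroom real shops would still keep for peak loads and failover.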
The advent of virtualization is also going to drive another trend: a change in software licensing. In a world of dynamic logical or virtual machine partitions that can change the amount of CPU, memory, and I/O resources consumed on a minute-by-minute basis, how can software vendors charge on a per-CPU or per-system basis for their software? “Virtualization will reduce the number of servers sold, and it calls into question the way software is licensed on these machines,” says Prentice.
Trend 3: A Wireless World
Because of the bandwidth that server infrastructures require, it will be quite some time before the data center goes wireless. But just about every other kind of computer is going to cut the cord, says Prentice. Home computing environments have arguably kept ahead of business computing environments since the Internet went mainstream in the mid-1990s, and consumers are once again at the forefront with wireless technology. Gartner expects hybrid networks to emerge in which short-range wireless technologies such as Bluetooth, ZigBee, and ultrawideband work side by side with WiFi and other longer-range wireless networks, both at home and in business.
Trend 4: Real-Time Infrastructure
Prentice summed up this trend with one sentence: “You do not have to own the box to be able to use it.” IT will move away from being a product that is sold on quarterly, one-, two-, or three-year cycles to a service that is consumed minute by minute.
Prentice says that the real-time infrastructure that Gartner envisions is not the same thing as the utility computing schemes that many server vendors have been peddling over the past few years. Those utility computing schemes are more like sophisticated financial deals in which a company still ends up paying for a box over the course of its life. Sun Microsystems, which launched the capacity-on-demand utility computing model simultaneously with Hewlett-Packard at the end of 1999 in the Unix server market, is perhaps on the leading edge of this true utility computing concept, as it is getting set to deliver computing capacity at a rate of $1 per CPU per hour. There are 8,766 hours in a year, so you can use one of Sun’s server processors for a year for $8,766. The question you have to ask yourself is this: does it cost you that much to acquire, maintain, and support one of your processors for a year?
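That break-even question is simple arithmetic. Here is a sketch in Python using Sun's quoted rate of $1 per CPU per hour; the ownership figures on the other side of the ledger (a $20,000 four-way server amortized over three years, plus $3,000 a year per box in power, space, and administration) are invented for illustration:

```python
HOURS_PER_YEAR = 8766          # 365.25 days x 24 hours, as in the article
UTILITY_RATE = 1.00            # Sun's quoted $1 per CPU per hour

utility_cost = HOURS_PER_YEAR * UTILITY_RATE

# Hypothetical in-house numbers for comparison: a $20,000 four-way server
# amortized over three years, plus $3,000 per year per box in power,
# space, and admin time, spread across its four processors.
server_price, cpus, years, overhead = 20_000, 4, 3, 3_000
owned_cost_per_cpu = (server_price / years + overhead) / cpus

print(f"utility: ${utility_cost:,.0f} per CPU-year, "
      f"owned: ${owned_cost_per_cpu:,.0f} per CPU-year")
```

On these made-up numbers, owning is cheaper per CPU-year, but only if the processor stays busy; a CPU you need just 20 percent of the time costs the same to own but only about $1,753 to rent.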
Trend 5: Software As Service
Just as systems will be delivered as a hodge-podge of services, software is going to have to change from being a collection of monolithic programs that are difficult to integrate, change, and reintegrate to being a set of services defined by a service-oriented architecture. Gartner predicts the emergence of service-oriented business applications and service-oriented development of applications, the latter being a framework for developing and changing these applications. What is driving this trend? Businesses need to change their software on shorter and shorter time cycles, but the way software is created today, it takes more and more time to change because of the difficulties of integrating various systems and applications. This all sounds like a great idea, but I wonder how much of this will turn out to be vaporware and who is going to pay for the transition from the way applications are written today to a future service-oriented world.
Trend 6: There Is Always a “Next Big Thing”
Because IT has undergone so much change in four decades, it is easy sometimes to think that everything that could be done with computers has been done, that the systems and software technologies we have today will be, more or less, what we’ll have in the future.
Prentice says we will have continued innovation in IT, and he argues that, whether we admit it or not, we need it. “It doesn’t matter what part of an IT cycle we are in, there is always a new next big thing,” he says. We had the mainframe, the minicomputer, Unix, client/server, and the Internet, and there will be other cycles. “There will be a never-ending stream of new technologies,” he says, “and businesses have to keep looking at these new technologies to see what strategic advantage they might bring to their businesses.”
However, Prentice warns that incumbents do not always survive major transitions, and he cautions that IT companies will increasingly try to create exclusive ecosystems that they control and that they can pitch to their customers as complete solutions. And rhetoric from the IT players aside, Prentice says that IT ecosystems probably will not overlap. “It is going to be increasingly difficult to be a supplier to everybody, because ecosystems will be restrictive,” he says. “This is where the fallout is going to come, and many hardware and software vendors around today are not going to survive these transitions.”
That’s comforting, isn’t it?