Are We There Yet? Perspectives on the Future of IT
by Dan Burger
Science fiction sometimes has a way of becoming science fact. If someone doesn't dream about it, chances are it isn't going to come to pass. Sure, there are some great accidental discoveries, but many of those were discovered because someone was on his or her way to where no one had ever gone before. John Hogan is a 60-year-old futurist with a deep understanding of information technology and business goals. His forecast for the future reflects great changes, but is tempered by a present-day realism rather than great leaps of faith.
Hogan will be one of the keynote speakers at the SHARE training and education conference, which is scheduled for August 21-26 in Boston. SHARE is an organization with membership primarily from the world's largest enterprise organizations. The event this year celebrates the 50th anniversary of SHARE. Hogan is one of the organization's past presidents and has been active in the group since 1969.
Hogan's IT career started with a technology-based Department of Defense spin-off from the Massachusetts Institute of Technology's Lincoln Labs; specifically, it was a think tank known as The Mitre Corporation, which also did research for NASA and the IRS. (Another Lincoln Labs spinoff was minicomputer innovator Digital Equipment Corporation.) In the mid-1960s, he worked on the world's first database management system. Later, his career centered primarily in the financial services sector, where he served as chief information officer at four large banks. Today, he is a one-man consulting business.
In order to be a consultant, Hogan says, it's critical to know what's going on across the board. He credits his continuing involvement in SHARE for keeping him current with the issues CIOs are facing and what is going on in the industry. "I have always needed to steer clients in the right direction," he says, "so I needed to know which directions were available."
The idea of predicting the future, especially something like 50 years into the future of technology, is very arrogant. At least that is Hogan's assessment, and I don't disagree. His approach is to try to avoid the arrogance. "Little of it can be done as an exact science, other than doing some research on what others are predicting," he says. "The very act of thinking about it and trying to understand the kinds of things that might happen and what the limitations are on the ability to predict puts things in perspective for dealing with the future. Everybody is wondering what the next big thing is going to be. Many things purport to be the next big thing, but 9 out of 10 turn out not to be."
Hogan chooses to think as if he were a CIO at a mid-size company, trying to decide which technology to invest in. With that in mind, we covered some topics that any IT professional might ponder.
Dan Burger: One technology that IT managers have to decide whether they will invest in is Linux. What do you think of the opportunity for Linux to really succeed on an enterprise level and how will this affect the future of the Unix platform as we know it today?
John Hogan: The future of the Unix platform is spelled Linux. Not because it's Linux per se, but because it is open source. There is a very real possibility that Linux will catch on at an ever-increasing rate. I take into account that IBM is supporting Linux and that a lot of people are dissatisfied with some of the other options. There are decision-makers at large, established enterprises evaluating the risk of jumping from Microsoft to this thing that looks like a radical, revolutionary movement.
I think the economics will drive Linux. The way Bill Gates is milking his market with the level of service he provides is unsustainable. I believe Microsoft has bitten off far more than it can chew. Microsoft actually has never perfected anything--and this is not a slam on them.
It took IBM from the late 1960s until the 1990s to learn how to build robust, mature operating systems. It has done this with three different architectures: mainframe, OS/400, and Unix. These are virtually bulletproof. There are other companies with bulletproof systems as well--Hewlett-Packard and Sun Microsystems, for instance. But none of the Microsoft stuff is bulletproof. That's because Bill Gates has not gone through that 20- to 25-year learning curve. You can't take a Robust Operating System 101 course in computer science.
IBM is the undisputed king of building robust operating systems. Anyone who does not acknowledge this is not telling the truth.
DB: What we have now are many flavors of Unix. How will this change in the future?
JH: This will relate to how fast the Linux revolution takes off. I would suggest it has already taken off and we will see it increasing. If that thesis is correct, the faster that goes, the sooner the marginal flavors of Unix that are supported by a variety of companies will fall off. By the way, there are a lot that are marginal right now.
It doesn't matter which ones survive, because during this same time period we will see the capability to move almost seamlessly from one flavor of Unix to another. Linux is just one of those flavors.
If Linux grabs the corporate imagination, especially the large enterprise shops, the marginal Unix flavors will begin disappearing within five years. It also depends on the vendor of the operating system. What is the vendor's position on Linux? IBM has it both ways: AIX and Linux. Same with Sun and HP. All the major vendors are set up to win either way.
In the bigger picture, there is a trend toward seamless "integratability." One of my predictions for the 50-year time frame is that the notion of plug-and-play will become real. Tools will be of an architecture that can self-configure. It's almost here now. There will be standardization.
DB: I've reviewed some of the presentation you are preparing for the SHARE conference in Boston in August. You use the term "legacy inertia." How might it play out in the future?
JH: Let me tell you a story that comes from the insurance industry. The insurance industry has undergone a huge amount of consolidation--acquisitions and mergers--in the past 20 years. This process has been handled, generally speaking, in one of two ways. One is to go in and convert all the systems to one standard system. The other allows the continuation of the existing legacy systems of the company that has been acquired.
In one instance I am familiar with, the company's main line of business applications included 137 systems, and on the life insurance side the company had 75 primary contract writing systems to produce insurance policies. The cost of maintaining those systems increases each year as new regulations and requirements from government and industry are put into place. That's the kind of inertia that, if you don't do something about it, will take you down.
DB: With factors such as seamless integration and open source playing important roles in the future of information technology, how do you see the competitive wars among the major software companies being resolved?
JH: I don't think all the major companies such as Sun, Hewlett-Packard, IBM, Microsoft, Oracle, and others will survive. Look at 10, 20, and 30 years ago and who the big players were. There are only one or two that are still around today. Back in 1985 you wouldn't have predicted that Digital Equipment Corporation would be out of business today.
The integratability issue actually might give some of these companies a bit of a reprieve. With everybody following the same standards for the componentry, you cease buying because the component supports this innovation or that innovation, because they all do. You buy because that company did a better job on that component. The quality of the component will actually be a force in the marketplace.
The other thing to consider is that no single company will be able to take on the manufacturing and distribution of all of the necessary objects. Even the Microsofts, the Oracles, and the IBMs of today will have to support some third-party widgets, because they can't possibly make all their own widgets.
DB: Is there any end to the rapidly continuing advance in computing power per dollar as defined by Moore's law?
JH: On the topic of computing power and processing power, I think it is going to hit the wall. Looking at the kinds of things you can do by extending engineering into miniaturization of chips, this will run out of gas 15 to 30 years from now. Some say we will have quantum computing, which will take up the slack and we will continue to get prodigious increases in the amount of power available and that will enable you to do more things. I dispute whether quantum computing will be here in 30 years. I don't think it will be. The engineering problems are so difficult. I don't believe anybody has a viable concept today.
DB: How do you see IT departments evolving in the future?
JH: Look at the people who use computers today. Go into any office--regardless of what kind of business it is--and you'll find people using word processing, using the Internet, and doing filing and organizing types of tasks. They are knowledge workers of one kind or another, yet all are using a lot of the same utilities, the same spreadsheets, and the same word processing programs.
All of those workers, to some extent, are doing some amount of programming. We don't call it programming, but it is. When you are setting up Microsoft Word to produce a document, you program it. That analogy gets deeper and deeper over time. So you are looking at a transformation from the IT department doing all the IT things to a point where the people in the departments that IT currently serves do everything themselves. As things get easier--like that plug-and-play thing--you need less expertise to do the configuring and the management thereof. There may continue to be a small number of IT people, but they may be called information architects or something like that. They will be professionals who have a broad understanding of how it's all done.
JH: None of these professional positions are going away overnight. If you look at the journey we have to make from today to this world of the future, there are a lot of hard decisions that have to be made. The reasons for outsourcing are varied. The people who are doing it for the wrong reasons are the ones who do it to save money. Those who do it because they find better people to do that work are doing it for the right reasons. Today, 50 percent are doing it for the wrong reasons. The really smart companies locate the right people with the right credentials. It doesn't matter where they are--in India or right down the street.
Another classic example of outsourcing for the wrong reason is when companies don't know what the hell to do with IT. So they're abdicating their responsibility. The compatibility of architecture also makes it easier to outsource. The IT department could become a bunch of migrant workers. The real issue is companies adjusting to the idea of not doing all the IT in the IT department, but teaching the business how to do its own IT. This makes it a different game than the one we are playing today.