An IT Retrospective: Forty Years in the Business
January 9, 2006 Rich Loeber
I got to thinking the other day that as of November 2005, I celebrated my fortieth year in the information technology field. That thought has prompted me to think a lot about how I got started and the way things have changed over the years. This article will try to explore these four decades of progress from just one person’s perspective.
I have absolutely no recollection of my first day on the job working in the IT field, but I clearly remember the second day. I started work on November 8, 1965, working in the computer room for the New York Central Railroad at 466 Lexington Avenue in Manhattan. November 9, my second day on the job, was the day of the Great New York Blackout of 1965. I remember having finished work and riding home on the train when it stopped about a mile short of my station in White Plains, New York, in the Westchester suburbs. In those days, it was not so unusual for the train to stop and the commuters just sat and waited for the train to start up again. Then, someone actually looked out the window and saw darkness where there should have been lights from dozens of houses. We all ended up getting off the train and hitching rides on the Bronx River Parkway after the conductor reported to us that the third rail was dead and they had no idea how long it would be out. It was quite a night and a memorable introduction to the idea of commuting and dealing with this aspect of a working life.
That first job, which I held for all of two months, was working as a clerk in the computer room. Since then, I’ve learned that we were providing an input/output control function; but everyone in the computer room referred to my work area simply as the “Idiot Table.” There were four of us plus our supervisor, a featherbed union position given to an elderly former conductor who constantly smoked a cigar and, to my recollection, never spoke a word to any of us. Our job was to look through manually prepared records of train movements that had been mailed in to the computer room, match them up with piles of computer printouts of train movements, and look for “missing trains.” Missing in the sense that the computer did not know about the train movements even though they actually occurred in the real world.
It was my first real job after graduating high school and I was eager to do it well. I jumped into it and found that I had a talent for matching things up and tracking down the trains. I was so good, in fact, that my workmates took me aside and told me to slow down because I was a) making them look bad and b) cutting into their backlog of work. If I continued on the pace I had set, we’d get all caught up and there would be no possibility of overtime. Some of us might even get laid off, especially the new guy. I slowed down a bit, but I still wanted to impress, so when a job came up as a keypunch operator, I bid on it and won.
The railroad sent me to keypunch school at IBM for a whole week to learn how to operate the machine. Part of the class involved programming the keypunch machine and I found that to be the most interesting. After a week of training, they put me on the midnight shift working with three other guys, the railroad being essentially an all-male club back then.
Our job had two facets. One was to take transaction rejections in the form of a stack of punched cards, and find/correct the errors so they could be reprocessed. The other was to listen to tape recorded telephone calls from small regional rail yards and prepare punch cards to process these train movements through the system. Again, I was good at this and my workmates were constantly after me to slow down and pace myself. During this time, I also learned how to operate the card sorter machine, the gang punch machine, and an aging (even then) IBM 407 computer that was programmed on a wiring board (an early version of RPG?). I remember that the printer on the 407 ran at the blistering speed of 50 lines per minute.
I think my work ethic impressed someone in management because they approached me about taking a Programmer’s Aptitude Test. In those early days of computing, colleges did not offer any curricula for information technology (we just called it Data Processing). Most employers who were implementing computer systems were looking to their own people to train to do the programming. I took the test and ended up with the highest score the railroad had ever seen on the test. (Two years later, after I’d been working successfully as a programmer, I took the same test again at Reader’s Digest and failed it miserably . . . so much for the PAT.) The railroad offered me a job as a programmer trainee and I accepted.
Since you could not go to college for programming, the only place to learn was from the computer manufacturer. So, the New York Central sent me off to IBM’s programming school in Manhattan, where I started out my programming career by learning “1401 Autocoder.” At the end of the class, I started programming on the railroad’s IBM 1401 computer system. This first computer was quite different from what we think of today as a computer. It was the size of two desks stacked on top of each other and came from the factory with all of 4 K of memory. On that machine, 4 K of memory meant 4,000 characters of storage, not the 4096 bytes that we think of today. The machine I worked on had a “caboose” on it that included the optional additional 4 K, so we had the huge amount of 8 K of memory to work with. That machine also did not have any operating system installed on it (they were another couple of years in the future) nor did it have any disk drives or tape drives. The input unit was a punched card reader where you loaded your programs to run along with your input data. The output units consisted of a line printer and a card punch unit that was integrated with the card reader.
The IBM 1401, first introduced in 1959, and its later cousin the IBM 1410, were variable word length machines. The size of each word, rather than today’s standard 4 bytes, was determined by the placement of a wordmark bit. Each storage position held 8 bits: a 6-bit character in the Binary Coded Decimal (BCD) coding (4 numeric bits and 2 zone bits), plus a wordmark bit and a parity check bit. The memory on these systems was actually made up of magnetized “cores,” very small rings that could be charged in one direction or the other (on or off). I even have a small sample of this core memory that I keep clipped to a cork board in my office today as a reminder of where the term “core storage” came from.
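For anyone who has never met a wordmark, here is a rough sketch in modern Python of how one of those storage positions worked. The bit positions and function names here are my own, chosen for clarity; they are not the 1401’s actual hardware bit layout.

```python
# Model of an IBM 1401 storage position: 6 BCD data bits plus a
# wordmark bit and a parity check bit. Bit positions are illustrative.

def make_position(numeric, zone, wordmark=False):
    """Pack 4 numeric bits, 2 zone bits, and a wordmark flag,
    then add a check bit so the 8 bits have odd parity overall."""
    assert 0 <= numeric < 16 and 0 <= zone < 4
    bits = numeric | (zone << 4) | (int(wordmark) << 6)
    ones = bin(bits).count("1")
    parity = 0 if ones % 2 == 1 else 1   # force odd overall parity
    return bits | (parity << 7)

def has_wordmark(position):
    return bool(position & 0b0100_0000)

def parity_ok(position):
    return bin(position).count("1") % 2 == 1  # odd parity expected

# A "word" (field) runs from the position carrying a wordmark through
# the following positions, up to but not including the next wordmark.
field = [make_position(5, 0, wordmark=True),  # start of the field
         make_position(7, 0),
         make_position(3, 0)]
```

The point of the scheme was that field lengths lived in the data itself: instructions simply processed characters until they hit the next wordmark, so a 3-character field and a 30-character field used the exact same instruction.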
After six months or so, I graduated to programming on the “big machine” in the computer room. This was an IBM 7010 (a grown up version of the IBM 1410) which had 100 K of memory, 6 tape drives, a very early disk drive unit and an interactive console which looked an awful lot like an IBM Selectric typewriter. It also had a very early form of an operating system and could actually run two programs at the same time; quite an advanced application for its day. The railroad used this system to keep track of its rolling stock in files on the disk drive. We actually worked on developing the idea of indexed-sequential files on this system and this is one of the applications I worked on.
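The core idea of an indexed-sequential file (records kept in key order, with a sparse index pointing at blocks) is simple enough to sketch in a few lines of modern Python. The names and structure below are my own illustration, not the railroad’s implementation:

```python
import bisect

class IndexedSequentialFile:
    """Toy indexed-sequential file: records stored in key order in
    fixed-size blocks, with a sparse index of each block's lowest key.
    (Illustrative only; real ISAM added overflow areas for inserts.)"""

    def __init__(self, records, block_size=4):
        records = sorted(records)                        # (key, data) pairs
        self.blocks = [records[i:i + block_size]
                       for i in range(0, len(records), block_size)]
        self.index = [blk[0][0] for blk in self.blocks]  # lowest key per block

    def get(self, key):
        # The index lookup finds the one block that could hold the key...
        i = bisect.bisect_right(self.index, key) - 1
        if i < 0:
            return None
        # ...then a short sequential scan within that block finds the record.
        for k, data in self.blocks[i]:
            if k == key:
                return data
        return None

# Hypothetical rolling-stock records keyed by car number.
cars = IndexedSequentialFile([(1407, "boxcar"), (2301, "hopper"),
                              (3118, "flatcar"), (4522, "caboose"),
                              (5009, "tanker")], block_size=2)
```

Real implementations added overflow chains so that new records could be inserted without rewriting the whole file, but this index-then-scan pattern is the same one still at the heart of keyed file access today.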
The disk drive, an IBM 2302, was huge with platters that were about 6 feet across and 8 pneumatic access arms. I remember that if it ever got turned off, like after the blackout, it took at least 30 minutes to warm up and get back up to speed before it could be used. Since there was no operating system to manage the disk contents, we had to keep the disk layout all mapped out so that every area of the disk that was used was pre-assigned a space.
This system was also the basis for the first commercial application of CRT devices. The railroad had a communications network that connected a series of CRTs made by Hazeltine. These devices could be used at various locations around the railroad system to inquire as to the exact current location of any freight car, engine or caboose on the rail line. This same communications network was used to transmit train movement information between stations using automated keypunch machines that read punch card information from one station and then duplicated those cards down the line at another station. In the process, all of this train movement information was captured in New York by a Collins communications computer and stored on tape so that our disk database could be constantly updated.
After I had been with the railroad for a little more than two years, we went through a merger with the Pennsylvania Railroad and I was put on the new Penn Central’s data integration team, based in Philadelphia. The merged data operation was going to be implemented on IBM’s brand new line of System/360 computers that came complete with an operating system and embodied many of the concepts of computers still in use today. I found that not only did I have to learn a new programming language, Basic Assembler Language (BAL), but I also had to learn to work within the confines of the operating system. Not only did this new operating system support indexed-sequential file formats, it also had partitioned data sets and program files that were independent of the physical files on the computer so that you could reference the same file layout for different actual physical files.
After working on the computer system merger project for a year, I left the railroad and embarked on a 16-year career with PepsiCo back in the New York metropolitan area. I admit it, I missed New York and wanted to get back “home.” During my entire career with PepsiCo, I never once had anything to do with soft drinks in my work assignments. Initially, I worked with their auto leasing subsidiary in Great Neck on Long Island. It was here that I finally started working in a high level language and taught myself COBOL, the language that I describe as my “native” language to this day. While learning COBOL, I spent a lot of time examining compile listings to see what assembler instructions were generated by the various COBOL code constructs and learned a lot about how high level languages get implemented at the machine level.
When the auto leasing company was sold, I transferred to a heavy equipment leasing subsidiary located in Lexington, Massachusetts. It was here that I spent a short stint working on a Honeywell mainframe. My assignment, along with a few friends from New York, was to help get this subsidiary ready so that PepsiCo could divest the company. I was there about a year, then transferred back “home” to New York and got the job of Data Processing Manager at PepsiCo Wines and Spirits, where I remained for the last nine years of my PepsiCo career. It was here that we started to finally move away from punched cards and into a more contemporary setting of CRTs on everyone’s desk and direct entry to the computer files to be processed.
It was while working here that I took my first step away from the IBM mainframe environment. In fact, I recall a decision point that I had to make that would define the rest of my career. The PepsiCo corporate data center was looking for someone to handle software support on the 370 operating system. In those days, there was a systems programmer in big shops whose responsibility was to keep the operating system running smoothly. The operating system had to be periodically updated and recompiled and this was a big task. The data center director offered the job to me and I thought about it long and hard. In the end, I decided to turn it down and stay on the application programming side of the fence. At the time, I didn’t realize the long term ramifications of this decision, but in retrospect it was a major career decision point. Had I chosen the other route, I would probably not be where I am today.
While working at PepsiCo Wines and Spirits, I got my first introduction to the IBM line of minicomputers. We implemented a distributed application running at an office in Bermuda on an IBM System/32. This system, about the size of an office desk, had a 5 MB disk drive, a built-in communications port, printer, display, and keyboard, and was an early forerunner of the PC. Unfortunately (that is very heavy sarcasm), I had to travel to Bermuda several times a year to keep it running smoothly. We followed this quickly with a System/34 for the offices in Purchase, New York, where PepsiCo had finally landed its headquarters offices. The System/34 gave the division some real autonomy in its processing needs and integrated well with both the corporate 370 environment and the System/32 in Bermuda. This soon moved to the System/36 platform as a natural upgrade.
It was shortly after the System/36 implementation that I finally decided to part ways with PepsiCo. It was time for me to get a promotion, but Pepsi was only interested in promoting people with master’s degrees in information systems and I was sitting there still with my high school diploma. So, I left PepsiCo and went out on my own as a consultant. I started Kisco Information Systems in June of 1984 and am still doing this today.
In those early years, a lot of programmers talked about becoming consultants and that was a dream for many of us. I got to realize the dream and also the responsibility of running your own company, hiring people, observing all of the government red tape associated with all this and then some. I must like it since that was 21 years ago and I’m still at it.
Shortly after going into consulting, I came to the realization that when you’re consulting, you are constrained in your ability to generate income by the number of hours you can work. While I was making a comfortable living with a growing client base, I wanted to broaden the revenue base, so I started writing general use utility software for the System/36. That platform had a loyal following and there were thousands of small software developers writing packages for it. I joined the group and soon had several software products on the market. I quickly learned that writing an application for a single user and writing for a broad range of users are two very different things. To this day, I am constantly astounded by the ways customers find to use the software that we sell; ways that we never imagined when we started out.
While the PC revolution got started in the early 1980s, I did not jump to that platform until the latter part of the decade. My first PC was a Compaq Portable I (which I still have). It had 640 KB of memory, MS-DOS and two 5.25-inch diskette drives. Even with this limited capacity, I was able to learn programming in dBase II and convert my company processing onto the PC in a custom implementation. Over time, I upgraded the Compaq to add a hard drive of 5 MB that was later updated to 10 MB and then 40 MB. I have since learned to upgrade my PC about every 5 years and buy as much PC as I can each time, then nurse it along until I find a drop-dead issue that calls for another outright replacement.
When the AS/400 was announced in 1988, I got my order in right away for one of the new B10 systems. I ended up taking delivery of the first customer AS/400 installed in our county. I then quickly moved one of our most popular System/36 applications over to run on the AS/400 and entered into the AS/400 software market as well. I soon found, however, that the AS/400 was vastly different from the System/36. IBM did a nice job of making the System/36 customers feel at home, but I found that I needed to learn a new skillset to work successfully with this system.
In all these years–as I’ve moved from the IBM 1401/1410/7010 to the IBM 360, IBM 370, IBM System/32, IBM System/34, IBM System/36, IBM AS/400/iSeries/i5–I have often thought about these early years and the excellent base of knowledge that I had to work from. A lot has changed over the years, especially in complexity. But, a lot has remained the same, too. That early application that I worked on with indexed-sequential file formats still embodies the essence of the file access method in use today. When I’m working on a coding problem today, I often try to think about what’s really going on at the machine level and find that this approach will frequently give me the solution to the apparent problem that I’m working with.
Looking at the IT industry today from this forty-year perspective, I see things that some people may miss. For starters, it seems in this business “what goes around, comes around.” Over the years, I’ve seen the pendulum swing back and forth on centralized processing versus decentralized processing. There will be a big push to centralize and control processing, followed not long after by another strong push to decentralize. Other hot issues regularly come along that fall into this category.
Another observation I have is that “programming is programming.” I’ve learned and worked in a lot of different languages. In each case, the result was normally a successful project implementation, and the choice of programming language often ended up being fairly unimportant. The problems associated with creating and maintaining programs are much the same no matter which programming environment you choose.
The biggest change from my perspective has been the advent of global networking. This change is huge in its impact and will continue to grow as more and more network connections become available. A few years ago, I decided that because of the global network, I could move my business anywhere I wanted to; and I did, moving to a remote part of upstate New York. I work in a small office, but am in constant contact with all of my customers just because of the global network.
When I started in business, most communications took place by phone and our software orders generally came in the mail. With the proliferation of fax technology in the 1980s, orders shifted to arriving by fax most of the time. When the World Wide Web became prevalent in the mid-1990s, communications shifted again to a Web site and email. Today, most of my end user communication is via email with an occasional phone call. We receive orders via a Web site and ship software via email. Now, with generally good wireless technology, I can make my office quite portable and travel. My customers don’t know that I’m anywhere but at the office slaving over their support requests.
The biggest challenge over these 40 years is to keep current. Mostly, this means staying on top of whatever is popular at the moment and understanding it. Too often, I’ve seen IT professionals get stuck at a particular point in time and not be able to adapt to the next wave of change that comes along. That is a death sentence to a career. Sometimes, all this means is that you’re able to understand the latest wave of jargon. I can’t tell you the number of times that someone has hit me with a new buzzword and, when I admit ignorance, I find out that it is just another name for something that has been around since the year of the flood. For me, I read a lot to keep current. With the availability of information via the global network these days, there is no excuse for not being able to keep current.
Rich Loeber is president of Kisco Information Systems, an iSeries-AS/400 software developer located in Saranac Lake, New York. He can be reached at email@example.com.