Mad Dog 21/21: You Can Telework, But Who Will Listen?
Published: September 22, 2008
by Hesh Wiener
As you read this on your display screen, you are a potential (if not actual) teleworker. Maybe you're in the office, but you could just as easily be far away. In principle, people in many jobs could work outside the office, and, with gasoline at a buck a quart, it's no longer hypothetical. These days, teleworking might be practical not only for employees but also for employers. It is a topic that requires fresh scrutiny, some new technology, and appreciation of its roots, which go back 2,500 years.
The possibility that people whose work is based on using a computer could do their jobs outside the office first became practical in the 1970s. That is when IBM 3270 CRT displays and a body of related technologies made interactive computing in corporate settings by ordinary clerical employees and others a central theme in IBM's marketing. IBM mainframes went virtual to better support the displays, initiating the technology transition that would reduce and ultimately eliminate the punch card as the primary input medium of the data processing world.
Aficionados of non-mainframe technologies say IBM's transition from punch cards to pixels was sparked by competition from smaller but more progressive companies. These firms included the minicomputer makers, such as Digital Equipment. A number of firms created small systems that included a cluster of terminals, one or more small printers, and a shared minicomputer (which today would be called a server), usually one that understood how to talk to an IBM mainframe; Wang Labs was one of the more prominent examples. As IBM moved ahead, so did its rivals in the large systems market. They embraced the CRT (along with associated cluster controllers) as an attractive alternative to punch cards.
Heliograph: In 1910, sunlight and mirrors provided networking technology to surveyors mapping the border between Canada and Alaska.
In the early 1970s, the focus of all these developments was the corporate office, not some far-flung community of end users. Moreover, big computers at the time lived in glass houses that were considered showcases of corporate modernity. Security issues, to say nothing of terrorism, had little or no impact on commercial computing in those days of innocence. During the punch card era, the sheer volume of cards that had to be read (and often copied to files kept on magnetic tape) became a barrier to growth. CRT terminals allowed data to be entered directly into mainframes or at least put through satellite computers and onto tapes that the mainframe could read. Overall, this proved to be a cheaper, faster process than anything involving cards. As disk drives improved and supplanted tape, data entry and retrieval via terminals became the norm. Still, the work was, for the most part, carried out in corporate settings where the CRT terminals were physically close to computers, and their role in information processing remained very much the same as that of the punch card machines they supplanted. IBM's punch card culture persisted long after the punch cards themselves became obsolete.
This situation apparently didn't impress Jack Nilles very much. In 1973, while at the University of Southern California in Los Angeles, Nilles noticed that commuting around LA was often arduous, costly, and time-consuming. It was pretty much impossible for people to live where they worked, and public transport, where it existed at all, was not very good. Backed by the National Science Foundation, Nilles started thinking about ways some people might work from home or satellite offices and how this process, which he called telecommuting, might affect these people, whom he dubbed teleworkers, and the institutions for which they worked. By 1974, Nilles had pulled together enough information to publish a study that became a pillar of a large body of subsequent work done by him, his colleagues, and others he inspired.
At the time, ARPANET, funded by the U.S. Department of Defense and the ancestor of the Internet, was taking off. ARPANET wasn't on the radar of IBM's commercial systems group even if it was well known to IBM's researchers. ARPANET was a place where Digital was the top systems vendor, notably because of its PDP-6 and successor PDP-10 computers. That role gave DEC firsthand experience among the researchers and academics who first came to see the physical distance between their desks and the host systems they used as unimportant if not irrelevant.
ARPANET users usually did not care where the central systems they used were physically located, as long as the central systems had satisfactory network connectivity. What the users needed to know was how to connect to a resource they wanted to use, which was not the same as knowing where it was in the physical world. But this was not the usual way of connecting a user with a distant computer during the 1970s. At that time, the most common way of linking to a remote system was to use a modem that created data links that rode on point-to-point voice networks.
Chappe Semaphore Tower: The French loved the semaphore system developed by the Chappe brothers and used it as part of the Empire's information processing infrastructure.
In many ways, business communications in the 1970s had not advanced very much compared to the technologies used by classical Greece.
Around 405 BC, Greek military commanders kept in touch with a system that later evolved into a gadget called a heliograph. Basically, the Greeks used sunlight reflected off highly polished shields to send serial data from one point to another. (They also used signal fires.) The data included questions and answers pertaining to battle conditions. The idea persisted after the Greeks had peaked.
There are records suggesting similar technology was used by the Romans hundreds of years later. During World War II, every ship had a signaling light, often called an Aldis lamp, that is a direct descendant of these technologies. The name comes from that of Arthur Aldis, the British inventor of the most widely used form of the device. An Aldis lamp is basically a light source and a shutter mounted so it can be pointed at the other party's corresponding signal station. Even today, flashing light signals can have an advantage over radio transmissions because they are harder to intercept. Still, to move signals around corners, light is only a viable medium if it is guided by some kind of conduit, such as a fiber optic cable.
In business, field sales personnel are sometimes compared to soldiers. If there is one role in business where telework is likely to be accepted, it is sales. So, in a sense, talking telework and sales is not talking change. For telework to be a change, something that is different and possibly better, it has to be used among workers who are not the corporate equivalent of foot soldiers.
In the 1970s, telework got a foothold in the United States. In that era, telework depended on technologies that are now obsolescent if not obsolete, or which may exist today mainly in a form that would hardly be recognizable to one of the early teleworkers.
The most widely available way to establish connectivity was with a dial-up link. Dial-up technology in the 1970s had very little capacity compared to today's wired and wireless broadband systems, which are in some cases also dial-up technologies, at least in a technical sense. The low speeds constrained the data flow between teleworkers and central computer systems, but the central computers of that era didn't have very fast I/O, and they could not have handled the faster data streams we take for granted today. Remote computing involved message compression and abbreviation, among other techniques, to make the most of scarce bandwidth.
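The payoff of compression on those narrowband links is easy to demonstrate with modern tools. The sketch below is only an illustration, not a period-accurate technique: it uses Python's standard zlib library (which did not exist in the 1970s) on an invented, 3270-style screen of repetitive fixed-format text, the kind of structured data that terminals and mainframes actually exchanged.

```python
import zlib

# Invented example of a fixed-format terminal screen: 24 identical rows
# of field labels and padding. Structured, repetitive data like this
# compresses extremely well, which is why narrowband remote computing
# leaned so heavily on compression and abbreviation.
row = "CUSTOMER NAME:            ADDRESS:                  BALANCE: 0000.00\n"
screen = (row * 24).encode("ascii")

packed = zlib.compress(screen, level=9)
ratio = len(packed) / len(screen)

print(f"raw: {len(screen)} bytes, compressed: {len(packed)} bytes ({ratio:.1%})")
```

On a 300 bit/s modem of the era, shrinking a screen by a factor of ten would cut its transmission time by the same factor, so the incentive to compress was enormous.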
The communications schemes used in the 1970s to make the most of the available links between CRT terminals and central systems borrowed ideas from other media with limited bandwidth, and the roots of this effort did not always lie in contemporary technologies.
One example of a slow medium that was refined to improve total throughput is the semaphore, an idea that was invented by the Chappe brothers, who wanted to stay in touch while attending separate but nearby schools. During the late eighteenth and early nineteenth centuries, the semaphore became the basis of communications across France and elsewhere, and also evolved into the signal systems used by railroads.
Flag semaphores, related to the mechanical devices created by the Chappe brothers, were widely used by navies and seem to have played a critical role in the Battle of Trafalgar. Because flag systems could only carry data at a rate of about 15 characters per minute, the organizations that used semaphores developed very effective systems of abbreviation.
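The arithmetic behind those abbreviation systems is simple. The sketch below works through it at the roughly 15 characters per minute cited above; the message and the short code group assigned to it are invented for the example, not taken from any actual naval codebook.

```python
# At about 15 characters per minute, verbosity was expensive, so signal
# organizations mapped whole phrases to short code groups.
RATE = 15  # characters per minute, the figure cited for flag semaphore

# Hypothetical codebook entry (invented for this illustration).
codebook = {"ENEMY FLEET SIGHTED TO WINDWARD": "EFW"}

message = "ENEMY FLEET SIGHTED TO WINDWARD"
plain_chars = len(message.replace(" ", ""))   # 27 characters to signal
coded_chars = len(codebook[message])          # 3 characters to signal

plain_minutes = plain_chars / RATE
coded_minutes = coded_chars / RATE

print(f"spelled out: {plain_minutes:.1f} min, coded: {coded_minutes:.1f} min")
```

A nine-fold saving per message is why every organization that relied on slow signaling, from navies to telegraph companies to 1970s timesharing shops, ended up inventing some form of abbreviation.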
Similar thinking was applied to telegraphic systems and also to computer communications during the narrowband era. Today, even though data compression remains an aspect of telecommunications, the entire frame of reference has shifted in ways that would be barely recognizable to users in the 1970s and 1980s, when even the best communications technologies were impossibly slow and expensive by today's standards.
Nevertheless, even with very severe limits on affordable data links, Nilles found that there was a case to be made for telecommuting. To make his case, he did not focus on field sales reps or others whose roles might correspond to that of a foot soldier. Instead, he tried to determine whether telework was a possible alternative for employees whose office jobs were right in the mainstream of central office operations.
Nilles focused on the insurance industry, which was heavily based on information processing and was also a key market for IBM's mainframe systems. He did not choose as a target the users of interactive computers like the ones that DEC sold. Nilles found that even back in the mid-1970s, and even in the insurance business, telework could provide economic advantages to employees, employers, and the surrounding society.
What helped Nilles get attention is a situation that is echoed today: an oil price crisis. In 1973 and 1974, the United States and other Western countries were hit by a powerful oil price shock as producing nations stepped on the post-industrial world's fuel line. This made more people who were involved in computing, which to a considerable extent defined post-industrial enterprise, receptive to the concept of telework.
The oil shock took years to work its way through the world's economic system, but the United States managed to adjust (not without some pain). Oil prices eventually fell, though, and the West reverted to its energy-hogging old ways, failing to make the changes in culture, economy, and infrastructure that might have reduced its thirst for oil. Telework became less important, and commuters, now driving SUVs, enjoyed a resurgence. This seemed to be perfectly fine until recently, when oil prices once again jumped.
IBM's 3270 CRT Display: This classic dumb terminal marked the transition from punched cards to purely electronic data streams.
Even though more than 30 years have elapsed since Nilles made his initial case for telework, most of the big considerations remain the same; they are pretty much the ones anybody can spot on the surface.
For the employee, telework eliminates commuting time and costs, but it's not without a different cost, the loss of the support and company of other workers. For the employer, there is an increase in productivity due to reduced office overhead costs (plus a possible gain from increased employee enthusiasm), but the loss of managerial supervision and control that an office provides could offset this gain (plus additional costs if the teleworker exaggerates the time and effort going into the job).
If the employee or employer is really interested in whether telework is going to produce good results, the best way to find out is to test the situation on a temporary basis and see what happens. Immediate results are not the only issue, and an arrangement could work out fine at first without producing good results in the medium or long term, but in business, facts are generally a better basis for planning than theories.
Sure, there are risks that don't show up right away. Over time, a teleworker can become alienated from the office and vice versa, and the consequences of that are not good. On the other hand, it is also possible for a company to create a corporate culture for teleworkers that is stimulating and productive, and which produces growing effectiveness as teleworkers learn to do their jobs more effectively and develop ways of staying in touch even when they are not in a traditional office.
Of course, in the real world things are never so simple. There is no reason for a teleworker to telework every day. In some cases the best outcome might be the result of going to the office three days a week and teleworking the other two. Telework does not have to be all or nothing.
To the extent technology can help integrate teleworkers into business processes, we seem to have quite a lot of communications and collaboration software we can draw on, and there is more of this stuff every day. IBM turns out to be one of the players with some of the right software and services, and Microsoft is another. Both find their established products overwhelmed by the things that seem to be emerging as part of the huge web services battle between Google and Microsoft. In this complicated situation, one part of Microsoft is in competition with another group in the same corporate entity. IBM could soon find itself in a similar position.
IBM's current technology for telework and collaboration in general (including collaboration among people who all work in traditional offices) is centered on its Domino server and the client suite, Symphony. Microsoft's alternative is Exchange Server plus Outlook and Live Mail. IBM has not yet been able to deliver the variety and quality of Web-based applications that Google and Microsoft are trying to develop, and it may be concerned about losing out to these technologies. Google and Microsoft are, of course, not the only outfits trying to find a way to make a business out of web-based alternatives to costly packages like Office. Still, in the end, IBM could get into the fray very quickly by buying one or more of the companies that offer Web-based productivity application services not yet in the IBM arsenal.
As it stands, Web-based productivity applications have not really had a big impact on corporate computing practices. Even with all the AJAX stuff, they just don't work right compared to desktop programs. The broadband systems that make so much of the Internet as fast as any user might wish are not up to the task of supporting Web-based productivity applications. Moreover, the cable industry, which provides broadband to many teleworkers and prospective teleworkers, is busy putting bandwidth caps and usage surcharges into its tariffs.
The carriers' pricing change is aimed at customers who are hogging capacity to share music and video files, but telework, or at least telework that requires significant bandwidth, is going to get hit. The cable companies don't appreciate the consequences of killing off future telework customers, and maybe they never will. But there is no question that the future of telework will be shaped by the cost and availability of high-speed communications.
It might turn out that bandwidth is ultimately not so important, and that the future of telework will be shaped by software that runs on client devices and that it will be anchored not by some service system in the network cloud, but rather by the very same technologies used to support end users working locally in an ordinary office. As usual, the future may not be dominated by one or a few strategic choices, but instead allow organizations to choose from a rich menu of hardware, software, and connectivity options.
Jack Nilles: Telework couldn't have been his middle name because he coined the word, using it in a study he published in the early 1970s.
This does not mean that anything goes, or that it is impossible for anyone to look ahead.
Whatever choices a business makes for its teleworkers (and in-house employees, too), smart, flexible client devices require a lot of support. For the moment, teleworkers seem to prefer the most flexible (and support-hungry) clients of all, general-purpose PCs that run the latest or nearly latest version of Windows. Even without telework as a major consideration, many corporate users have tried to defer upgrading their end users' client systems from Windows XP to Vista, not because they fail to appreciate some of Vista's advantages, but because they fear the cost of installing and supporting Vista may exceed the benefits it can provide. Moreover, once Vista's reputation began to tarnish, the case grew for a two-generation leap, from XP to Vista II or whatever Microsoft ends up calling Vista's successor.
Google is trying to address this by first coming in with a browser, later with a mobile phone platform, and then maybe later than that with its own distro of some kind of operating system. (Google's employees sit at Web screens powered by a variant of Debian Linux already.) But putting anything complicated and flexible in the hands of an end user is a business risk. So the missing piece, for Google, for Microsoft if it wants to really succeed with a telework client, and for IBM if it wants its own piece of the market, is some way to make a standard PC function as a fixed-function telework client and also work as a regular PC when the teleworker is not on the job. If the system allows for the possibility of a transition from client-based applications to Web-based applications, too, all the better.
The way to do this might involve replacing the single instance of Windows that is on most PCs with a virtual system that can boot into two or more alternatives. One could be a general purpose system, and the others could be more limited but much more stable (and easy to support) telework systems. We've looked for the seeds of this sort of thing in the announcements of Microsoft, Google, VMware, IBM, and others, and we haven't seen anything yet. So far, the industry seems to be a lot better at adding skins and features than it is at addressing some of the more substantial issues affecting the workplace and the welfare of the people and companies in it.