Mad Dog 21/21: The Mainframe Was The Message

    May 2, 2016 Hesh Wiener

    In 1964, as IBM announced the System/360, Marshall McLuhan, a professor at the University of Toronto, published a remarkable book, Understanding Media: The Extensions of Man. He said each medium, independent of its content, is a powerful social force with characteristics that reshape the way people are interconnected.

    McLuhan distilled his thesis to a single memorable phrase: the medium is the message. Like print, radio, movies, and television, computing technologies, from the punch card to the mainframe to the mobile internet, are media, too. IBM doesn’t fully understand this; consequently, it flails and struggles.

    Marshall McLuhan: Canadian academic who changed how the world looked at communications media and their impact on society.

    One of McLuhan’s observations is that media generally carry older media as their content. For example, the medium of theatre and the medium of print publishing are the content of films. Films can be the content of television. Television has become part of the content of website presentation. But as one medium uses a predecessor for content, the nature of the newer medium may differ a lot from that of the older one.

    As an example, McLuhan characterizes film as a hot medium, by which he means that film in a theatre, shown on a big screen with its rich images, floods the main sense, vision, with information. The viewer doesn’t have to do much work to catch all the details; on the contrary, the viewer may be overwhelmed. Add in surround sound and, even without 3D or VR presentation, the audience is awash in stimulus. By contrast, the same film presented on a small television screen, the kind that was the norm in McLuhan’s time, 50 years ago, requires the viewer to psychologically lean in, to do some mental work to catch the detail. McLuhan calls the low-res TV of his day a cool medium, his term for a medium that demands effort from the viewer.

    Today, TV is designed for two-meter-diagonal, high-resolution, high-definition screens. It can provide an experience that more closely resembles the one offered by a cinema than the old round-corner boob tube of the vanished past. But that is only one kind of video experience. A viewer catching a show on a tablet or a smartphone is not immersed. The video experience on a high-end handheld screen may offer a lot of pixels, but it is impossible for a viewer to enjoy all the subtleties presented on a big screen. So that TV show, whether a film or a sports event or a news report, can be hot on a home theatre screen but cooler on a Kindle Fire and very cool on a 5-inch phone.

    Punch Cards: Before the System/360 captured the corporate market, the punch card was the prevailing data medium.

    McLuhan tried to give his readers, his students, and his consulting clients (including IBM) a way to think about the varying impact of different media, and about the way a recipient might react to the medium of a presentation as well as to its content. A hot medium such as a big screen film may produce an emotional reaction in a viewer, but that viewer is sitting in one place in the theatre. That same film at home may allow some different freedoms. There may be a pause button that breaks the spell of a movie even as it allows the viewer to grab refreshments or halt to take a phone call. On a tablet or phone, incoming traffic is potentially available at all times. The viewer is constantly reminded of her connection to others, including friends, family, and robots offering traffic guidance or weather alerts or email from the bank.

    In computing, the information processing technology of McLuhan’s time was just beginning its transition from mechanical to electronic functionality. Students at McLuhan’s Toronto school who took courses dealing with computers probably submitted their homework as a deck of punch cards and had the resultant output, most likely on green-bar paper, graded by hand or evaluated by checking software; getting those results was a multi-step industrial process. The homework might first have to be compiled and then put in a queue for batch processing. By the time students who entered U of T the year Understanding Media was published completed their four-year degrees, punch cards had largely given way to terminal-based interaction.

    But the IT process remained linear and essentially mechanical. It would be several more years before university students would get their own personal computers. Networking didn’t become cheap and essentially ubiquitous until the 1980s. And the Macintosh, the first affordable personal computer with a nice graphical user interface, didn’t hit the market until 1984, 20 years after McLuhan’s book on media. By that time, McLuhan had been dead for about three years. He would not be around to observe the effect of the GUI as an influential and popular medium; he would not be around to see interactive computing in its infancy during the era of AOL and its ilk mature into an Internet with vast search facilities, huge bandwidth, and wireless connectivity capable of bringing electronic payment to rural West Africa and of summoning Uber cars to your front door.

    IBM 029 Card Punch: When the System/360 first hit the market, data entry on its systems and all their predecessors was accomplished using electromechanical card punches.

    Before 1964, IBM had built its business on technology that read a card and printed a line. Some of this technology was still largely mechanical, processing paper cards and sorting or selecting the cards using brushes that felt for punched holes and paper guides that sorted cards into banks of hoppers. The technological high end of IBM’s product line was still migrating from electronic systems based on vacuum tube triodes to circuit cards using discrete transistors. Magnetic tape was the emerging storage medium; disks were not yet sufficiently capacious or adequately affordable to displace mag tape. Tape is still a widely used archiving medium, possibly awaiting extinction by disks in the cloud but by no means assured of consignment to the dustbin of history.

    IBM’s corporate thinking, like that of the contemporary industrial empires that were its customers, mirrored the information processing machines it built. Computing, even as it went electronic, involved breaking a problem down into processing components the way an industrial assembly line was divided into tasks. The components were executed in sequence, each receiving as input the output from a prior stage of work, each yielding as output a transformed batch of data.
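    That linear pattern is easy to make concrete. Below is a minimal sketch, in Python, of read-a-batch, transform-a-batch, print-a-batch processing; the stage names and sample records are hypothetical illustrations, not anything drawn from IBM’s actual software.

        # A minimal sketch of linear batch processing: each stage consumes
        # the entire output of the previous stage, much as one card-handling
        # step fed the next. Stage names and data are hypothetical.

        def read_stage(raw_records):
            # Parse fixed-format "card images" into (account, amount) pairs.
            return [(rec[:8].strip(), int(rec[8:])) for rec in raw_records]

        def transform_stage(records):
            # Apply one business rule to every record in the batch.
            return [(account, amount * 2) for account, amount in records]

        def print_stage(records):
            # Emit one line per record, like a line printer.
            return [f"{account:<8}{amount:>12}" for account, amount in records]

        batch = ["ACCT0001     100", "ACCT0002     250"]
        # Stages run strictly in sequence; no stage starts until the prior
        # stage has finished with the whole batch.
        for line in print_stage(transform_stage(read_stage(batch))):
            print(line)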

    Until very recently, IBM personnel at work were largely shielded from the transition of computing from punch cards to richly interactive mobile multimedia activity. IBM’s System/360 was at first an electronic embodiment of punch card systems and the batch processing technology of earlier computing systems like the IBM 1401. It took IBM six years to upgrade the 360 to the 370, and even then the early 370 models didn’t feature what would quickly become their defining technological advance: virtual memory. Still, by the mid-1970s IBM was showing customers that computing via CRT terminals was a key step on the path to the future. IBM’s mainframe processor business and, in parallel, its lines of small and midrange systems were thriving. But by that time, IBM had begun to lose touch with developments in semiconductor manufacturing, communications technology, and software that would trip it up during the 1980s.

    Just as the mainframe seemed in some ways to be the cinema version of punch card apparatus, the personal computer, in a development that was for all practical purposes invisible to IBM management, was turning into the television version of the glass house system. The first personal computer that became known around the world was the MITS Altair 8800, featured in Popular Electronics magazine in 1975. In just a few years, dozens of companies were selling hundreds of thousands of small computers. These computers were truly a different medium than the glass house systems they would soon transform and, eventually, as servers developed that used the technology popularized by personal clients, largely replace.

    IBM 3270 Terminal: The punch card and related data entry equipment were eventually replaced by the CRT.

    By the 1980s, personal computers were evolving into a medium that encompassed data processing, added the potential for animated video and delivered audio content. These small computers accepted, in addition to keyboard data, tactile input from a mouse. They didn’t provide a rich and immersive multimedia experience at first, but they were clearly headed in that direction.

    IBM at the corporate level remained oblivious to the nature of personal computers even as it began to provide its own PCs to a hungry market. At the same time, IBM management didn’t pay much attention to the impact the PC business was having on the semiconductor business. Imagine if Ford never paid attention to developments in the steel business. That was IBM during the 1980s.

    Because IBM management seemed to pay attention only to PCs used in corporate offices and didn’t notice how its employees used these machines in their homes and, perhaps more importantly, how its employees’ children used the PC, it never saw that, once you took it out of a quiet office, the personal computer was a multimedia presentation device with a tremendous appetite for communications bandwidth. IBM didn’t even figure out that lighter and smaller (but ever more powerful) PCs were taking over the outside-the-office market until its rivals were regularly eating large chunks of that market.

    While IBM was putting its efforts into figuring out how to sell laptops made in China and desktop PCs made wherever the price was right, other companies were trying to pack the power of a PC into a phone. Nokia, Motorola, and others were doing a pretty good job of this.

    IBM 3033 Mainframe: IBM’s flagship glass house computer was king of the mainframe world in the late 1970s.

    Meanwhile, all during the 1980s IBM management listened to its large account sales reps, who said their customers wanted MIPS, MIPS, and more MIPS. IBM, thinking as linearly as its read-a-card, print-a-line old data processing systems behaved, interpreted this as a signal to build ever larger factories. Nobody was thinking about Moore’s Law or, even though IBM’s PCs used Intel chips, about what Moore learned in the shark-infested memory business, to say nothing of the do-or-die-but-shrink-that-die processor business.
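    For a sense of what Moore’s Law implied for those factory plans, a back-of-the-envelope calculation helps. The sketch below assumes the commonly cited doubling of transistor density roughly every two years; the doubling period is an assumption for illustration, not a figure from this article.

        # Back-of-the-envelope Moore's Law arithmetic: if transistor density
        # doubles roughly every two years (an assumed rate), the silicon --
        # and the factory floor space -- needed for a fixed amount of
        # computing shrinks exponentially.
        doubling_period_years = 2.0

        for years in (2, 6, 10, 20):
            density_gain = 2 ** (years / doubling_period_years)
            print(f"after {years:2d} years: {density_gain:8.0f}x circuitry "
                  f"per unit of chip area")

        # After a decade that is about 32x, after two decades about 1024x --
        # the opposite of a trend that calls for ever larger plants.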

    Driven more by the PC business than the glass house systems market, semiconductor companies in the US, Europe, Japan, and elsewhere in Asia were bringing down the cost and size of memory modules at a breakneck pace. In parallel, armed with ever-improving automated design systems, the processor makers were learning to put an awful lot of processing power onto a single chip, and how to mount multiple chips in very small but thermally robust packages.

    IBM was building giant factories while its scientists and engineers, to whom management apparently paid practically no attention, learned to pack orders of magnitude more MIPS and huge amounts of memory into each frame. By 1990, one of IBM’s key problems was its excessive inventory of manufacturing plants. It needed less space to build more glass house computers; it was unable to develop automation or other manufacturing technologies that would give it a leadership position in PC fabrication; and it failed to see that computing and mobile telephony were converging into a new medium that right now we call client device computing, but which might get a new name as we gain experience.

    IBM management may now realize that its processor business is on the road to oblivion, but it may not have any viable options. The company’s leaders and certainly their legal counsel ought to know that the only reason that there is a mainframe business at all is IBM’s superb skill at blocking the efforts of competitors. These rivals, left unchecked, would long since have offered X86-based mainframe emulators that put the IBM z architecture into smaller boxes and, these days, frames that can live in cloud server farms. A similar situation exists in the Power market, which persists because IBM prepaid $1.5 billion in subsidies to Globalfoundries to create the illusion of affordable technology for its next generation or two of processor chips.

    Just as the punch card medium became content for the 1401 and the 1401 became content for the mainframe, the mainframe could become content for the computing cloud, if it survives at all. But the back end system, whether mainframe or Power or IBM i or Oracle Sparc is only one aspect of future information technology, the way movies are only one aspect of the mobile tablet presentation environment.

    IBM is promoting the idea that stuff it calls cognitive computing may help it enjoy a successful future. It is making a big effort to put its impressive Watson technology to work on vexing problems, such as the selection of cancer therapies. But so far the company hasn’t found a way to ask Watson for help with some of the immediate matters that affect Big Blue’s operations, such as the edginess of customers who worry about IBM’s staying power, the effectiveness of employees who question the security of their positions, and the shareholders whose main case for investment is “well, Warren Buffett did it.”

    Share this:

    • Reddit
    • Facebook
    • LinkedIn
    • Twitter
    • Email

    Tags:

