    As I See It: The Future in Parallel

    September 14, 2009 Victor Rozek

    Ours is a mixed marriage. I am PC and my wife is Mac. We are the Yang and Yin of personal computing. She has Mac OS X, I have Windows XP. She has iPhoto, I have HP Photo. She has iChat, iMovie, iDVD, iCal, iSync, and iTunes, I have no tunes and a bunch of other stuff I seldom use. She has Steve Jobs, I have Bill Gates, which is to say she has cool and I have nerd. She has a virus-free computer; I have multiple layers of security software, and in spite of daily updates and scans, I’m still not sure my PC is intrusion-free. I console myself with the fact that I could buy two PCs for the cost of her Mac.

    The debate between PC and Mac users has been long and impassioned, with staunch supporters on both sides of the divide. Of course, if market share were the only criterion, there would be no debate. Windows holds nearly 91 percent of the market, while Mac is a distant second with a whisker over 8 percent. Linux, the Ralph Nader of operating systems, has a lot to recommend it, but can’t find traction in the marketplace, holding less than a 1 percent share.

    But Windows’ seemingly insurmountable lead has been steadily eroding. Since 2004, Windows’ market share has dropped by 6 percent, while Mac, fueled by the runaway popularity of the iPhone and iPod, has increased its share by over 5 percent. Proof that “cool” remains a powerful marketing strategy.

    In the past, even though its technology was not always superior, the PC forces nevertheless managed to paint Mac with the frivolity brush. Back in the 1980s, before it conceded the PC market to Microsoft, IBM competed with Apple for the burgeoning education market. At the time, IBM’s PC operating system was the bland DOS and Mac developers were innovating circles around the PC, offering ease of use and graphic capabilities PC users could only envy. In response, IBM published the results of a study that would define the relationship between the two products for a generation.

    Needing to re-establish its edge, and lagging behind the innovation curve, IBM decided to disparage the value of the Mac’s most popular features. It released the results of a survey that found that “serious” work, like writing term papers and theses, running spreadsheets, and developing databases, was far more often done on IBM systems, while more “frivolous” activities, such as playing games and doodling with graphics, were the province of the Mac. The none-too-subtle implication was that IBM equated to serious scholarship and scholastic success and, by extension, success in life, while Macs were a fun and pleasant distraction, but ultimately only that. Since so much of business and personal computing still runs on PCs, the strategy apparently resonated with buyers, especially when delivered in the safe, reassuring voice of IBM.

    But over time such distinctions, real or not, became irrelevant. Software providers write for both platforms, and Macs even offer the option of running Windows under OS X. (Needless to say, it runs slower, but what would you expect, optimization?) More important, both camps now buy chips from Intel. Core Duo processors (two processor cores engineered onto a single chip) are the next evolutionary step in keeping with Gordon Moore’s prediction. (At least they were when introduced in January of 2006; a quad-core version followed.) Moore, Intel’s co-founder, made the now-famous prediction that his industry could double the number of transistors placed on a chip every 12 months. He subsequently adjusted his timeline to 24 months, but regardless, the brassy prediction proved so accurate that it ascended to the mythical stature of being an industry “law.” And for more than four decades, Moore’s Law has held firm, like a dam holding back ever-increasing volumes of water.
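
    As a rough illustration of what that doubling rate implies (my sketch, not the column’s; the starting count is a made-up figure), a 24-month doubling period compounds very quickly:

        # Illustrative arithmetic for Moore's prediction: transistor counts
        # doubling every 24 months. The starting count is hypothetical.
        def projected_transistors(initial_count, years, doubling_period_years=2):
            """Projected transistor count after a span of steady doubling."""
            return initial_count * 2 ** (years / doubling_period_years)

        # From a hypothetical 1 million transistors, two decades of 24-month
        # doublings yields roughly a thousand-fold increase (2**10 = 1,024).
        print(projected_transistors(1_000_000, 20))  # about 1.02 billion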

    But managing capacity, it turns out, is a secondary problem. Speed is the real challenge to Moore’s Law; more accurately, the by-product of speed, which is heat. At speeds over 3 GHz, things begin to melt. In an effort to extend Moore’s Law, chip manufacturers found a work-around to those other annoying laws, the laws of physics, by loading multiple processor cores on each chip and dividing tasks among them. According to Justin Rattner, Intel’s chief technology officer, within a decade chips may contain 100 cores or more.

    That’s good news/bad news, since each solution creates its own set of fresh problems. Parallel processing does not occur automatically. To survive another decade, Moore’s Law will need an assist from software developers, because operating systems, programming languages, development tools, and certainly end user applications have not been designed to take advantage of parallel processing on such a grand scale.
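
    To see what that redesign means in practice, here is a minimal sketch (mine, not the column’s; the crunch function is a hypothetical stand-in for real work) of the same job written the old sequential way and then explicitly divided across cores with Python’s standard process pool:

        # A sequential loop uses one core no matter how many the chip offers;
        # the work has to be explicitly split up before extra cores help.
        from concurrent.futures import ProcessPoolExecutor

        def crunch(chunk):
            """Stand-in for CPU-bound work on one chunk of data."""
            return sum(x * x for x in chunk)

        if __name__ == "__main__":
            chunks = [range(i * 1_000_000, (i + 1) * 1_000_000) for i in range(8)]

            # Sequential: one core grinds through all eight chunks in turn.
            sequential_total = sum(crunch(c) for c in chunks)

            # Parallel: the same chunks are farmed out to a pool of worker
            # processes, one per core, and the partial results are combined.
            with ProcessPoolExecutor() as pool:
                parallel_total = sum(pool.map(crunch, chunks))

            assert sequential_total == parallel_total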

    Parallel processing is, of course, alive and well and thriving in the world of supercomputers, but there software development is slow, complex, and expensive, and requires specialized skills. The elite programmers most familiar with the demands of parallel processing are not likely to suddenly shift their attention from, say, weather modeling to creating comparatively mundane PC applications.

    For IT professionals, the new generation of processors promises to be both curse and blessing. On the one hand, they will challenge old-school programmers who cut their teeth writing applications for sequential engines. On the other, they will provide an opportunity for a new generation of programmers for whom parallel processing will become the baseline. But if Microsoft is to be believed, the transition won’t be easy. Daniel Lyons, writing for Newsweek, quotes Craig Mundie, Microsoft’s chief research and strategy officer: “For 50 years we’ve done things one way, and now we’re changing to a different model. . . . It’s the biggest single change Microsoft has ever faced.” Which means it will be a big change for the rest of us.

    Dividing larger problems into smaller ones and solving them concurrently is doubtless a powerful idea, but ironically the actual improvement in throughput will be limited by yet another “law,” this one courtesy of the legendary Gene Amdahl. Amdahl’s Law essentially says that the speedup of a parallelized application is limited by the portion that must still run sequentially. For example, if a program runs for 20 hours on a single processor, and one segment that takes an hour to complete cannot be parallelized, then no matter how many processors are devoted to the remaining 19-hour portion, total execution time can never drop below one hour, and the overall speedup can never exceed twenty-fold. It would seem, then, that one of the unintended results of parallel processing will be the creation of excess capacity.
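
    Put as arithmetic (my sketch of the standard formulation, not something from the column): if a fraction s of a job must run sequentially, the best possible speedup on n processors is 1 / (s + (1 - s) / n), which can never exceed 1/s no matter how large n grows. The column’s 20-hour example works out like this:

        # Amdahl's Law applied to the article's example: a 20-hour job with a
        # 1-hour stretch that cannot be parallelized (sequential fraction 0.05).
        def amdahl_speedup(sequential_fraction, processors):
            """Best-case speedup for a job with the given sequential fraction."""
            return 1 / (sequential_fraction + (1 - sequential_fraction) / processors)

        serial_hours, total_hours = 1, 20
        s = serial_hours / total_hours  # 0.05

        for n in (2, 16, 1024):
            speedup = amdahl_speedup(s, n)
            print(f"{n:>5} cores: {speedup:5.2f}x speedup, {total_hours / speedup:.2f} hours")

        # Even with unlimited processors the runtime floor is the serial hour,
        # so the speedup can never exceed total_hours / serial_hours = 20x.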

    Whether Macs make further inroads into the PC’s dominance of the personal computing market may, in part, be determined by how well parallel processing is implemented on each platform. But barring a disastrous outcome by one side or the other, few minds will be swayed, at least if my wife is any indication. She calls my PC “cumbersome and unfriendly.” Macs, she says, are designed by people who are intelligent; PCs are designed by people who want you to know they are intelligent.

    Picky, picky, picky.


