Mad Dog 21/21: Tycho Brahe Had No Nose. How Did He Smell? Terrible
January 7, 2013 Hesh Wiener
Tycho Brahe, the sixteenth century Danish polymath with a prosthetic nose, believed facts yield advances in science and business. Good at both, Brahe amassed a fortune and built an observatory while compiling the astronomical data he bequeathed to his German colleague Johannes Kepler. Decades later, formulating laws of gravitation, England’s Isaac Newton stood on Kepler’s shoulders. The computer industry could sure use a Brahe right now.
It is having difficulty charting a course between the Scylla of technological change and the Charybdis of fickle markets. Consequently, following the traditional leaders, as loyal, trusting customers do, is downright risky.
The roots of the troubles afflicting the computer business are very deep, as were those of Renaissance astronomy. Ancient computing platforms like the mainframe and IBM i have changed very little since their inception. They are based on a marketing system in which customers are encouraged to shape their glass house hardware installations to needs perceived as specific and unique. On top of this tailored hardware, customers run one or several operating systems, also considerably customized. Users may call a lot of this tuning and rationalize it as the best way to get the most value from their equipment, but it is also a trap they have built for themselves. The customization makes it nearly impossible, in practical terms, for the users to migrate, even when they know more cost-effective alternatives are abundant.
The world of legacy computing resembles the geocentric astronomy of Ptolemy, who lived in Egypt in the second century. Many of the oldest manuscripts attributed to Ptolemy are written in Arabic. Scholars generally believe Claudius Ptolemy was a Roman citizen, very possibly of Greek descent; his names are markers for those two classical cultures. Ptolemy’s work depicts a solar system visible to the naked eye revolving about the Earth. As this was consistent with material in the Old Testament and also because it put man on Earth at the center of things, Ptolemy’s work shaped the official doctrines of the Catholic Church until well after the death of Galileo Galilei some fifteen centuries later. Basically, long after astronomers including Brahe and Kepler, and most notably the Polish scholar Nicolaus Copernicus, knew the geocentric model of the solar system and surrounding universe was just plain wrong, it was very risky to publicly state what had become obvious, let alone publish new descriptions of the heavens.
Copernicus, whose work preceded that of Galileo, got away with discussions of a heliocentric solar system because he (and those around him who were protective) described his theories as mathematically convenient but philosophically untrue. Galileo got into political hot water for saying things Copernicus was known to have said, but Galileo was in Florence, closer to Rome, and, at the time, living in what was the commercial epicenter of Europe. In every time and place, trouble where the big money comes from will stir things up more than trouble in the hinterlands, which included Kraków at that time.
For hundreds of years, the Ptolemaic geocentric universe worked fine. Astronomers could work within its framework and, by recording their observations and noting the differences between what they saw and what various theories predicted, contribute to the development of calendars and other record-keeping mechanisms that, as civilization grew more complex, became quite important. The persistence of geocentric concepts only became a big problem later, as explorers and later navies reached out from the Old World and a better understanding of the heavens became valuable to those pursuing various goals on Earth. Just as flat Earth Euclidean trigonometry is accurate enough for road building and architecture but fails when one needs maps spanning great distances, Ptolemy was just fine for most people until well after the age of Brahe.
Similarly, the excellent bean-counting power of legacy computers remains adequate and, in the view of proficient glass house IT folk, superior to all alternatives. Backroom bookkeeping on systems like z/OS mainframes and IBM i machines is done well and cheaply. It almost doesn’t matter whether the same work done on a newer kind of computer might be a bit less expensive in some cases. Glass house computing, at its best, is very highly refined and, combined with the IT culture living in those legacy data centers, quite practical.
But as heliocentric models of the solar system and, later, even more advanced and complex models of the astronomical universe are necessary to bring cosmology to a higher level than the limited (and incorrect) geocentric models allow, computing technologies and organizational schemes that differ from legacy alternatives open up wider horizons.
The Internet depends largely on an IT industry tied to X86 hardware and the software that has been written for it. Computers are generally built to conform to certain hardware standards. This hardware architecture does continue to evolve. For example, one recent trend of considerable importance is the enhancement of features that make virtualization more efficient and secure. While virtualization was first developed for glass house systems as a way to keep them economically viable, technology has kept moving. Hardware currently being developed for X86 processors not only makes it easier to pack more virtual systems into a single server but also to link groups of servers into a virtual plane so multiple machines can work as a team to support multiple workloads. The clusters of servers that deliver the clouds built by Google, Amazon, Microsoft, Red Hat, and others have technical and economic characteristics that simply cannot be matched by any legacy systems, at least not in any practical way. (There are those, including IBM, who would argue this point.)
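The hardware assists mentioned above show up to software as CPU feature flags. A minimal sketch of checking for them might look like the following; the flag names are real x86 feature strings (vmx for Intel VT-x, svm for AMD-V, ept for Extended Page Tables), while the sample text is an invented stand-in for the contents of /proc/cpuinfo on a Linux machine:

```python
# Sketch: detect x86 hardware virtualization support from CPU feature flags.
# vmx = Intel VT-x, svm = AMD-V, ept = Extended Page Tables (second-level
# address translation, the feature that made dense virtualization efficient).
# SAMPLE_CPUINFO is invented for illustration; on Linux you would read
# the real thing from /proc/cpuinfo.

SAMPLE_CPUINFO = "flags : fpu vme msr pae vmx ept sse4_2 avx"

def virtualization_features(cpuinfo_text):
    """Return which virtualization-related flags the CPU advertises."""
    flags = set(cpuinfo_text.split(":", 1)[1].split())
    return {name: name in flags for name in ("vmx", "svm", "ept")}

features = virtualization_features(SAMPLE_CPUINFO)
print(features)  # {'vmx': True, 'svm': False, 'ept': True}
```

On a real system the same function could be fed `open("/proc/cpuinfo").read()` instead of the sample string.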
But just as Brahe’s work served as a waystation on the route from Ptolemy to Newton and beyond, so the X86 world, even though it may define today’s state of the art, is only one more step in a journey that could soon leave it behind. For as the PC market and its vast unit volumes compared to anything in computing that came before reshaped the economics of chip production, processor fabrication and peripheral device development, the smart client market, based on ARM chips and other contenders, is again redefining technological capability, price/performance and particularly energy/performance.
If X86 architecture led to a new world reckoned in dollars per unit of performance, ARM and its ilk are not only more cost-effective than X86 (though, so far at least, only in smaller units of power); they are also the basis of what has become an equally important measure of computing advancement: performance per watt.
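The shift in yardstick is easy to illustrate with a little arithmetic; all figures below are invented for illustration, not benchmarks of any real chip:

```python
# Sketch: the two yardsticks in play, raw throughput versus
# performance per watt. All figures are invented for illustration.

def perf_per_watt(ops_per_sec, watts):
    """Operations per second delivered for each watt consumed."""
    return ops_per_sec / watts

# Hypothetical parts: a big X86 server chip vs. a small ARM chip.
x86_ops, x86_watts = 200e9, 95.0   # high absolute throughput, high power
arm_ops, arm_watts = 40e9, 5.0     # lower throughput, far lower power draw

# X86 wins on raw throughput, but ARM wins where the mobile era
# keeps score: performance per watt.
print(perf_per_watt(x86_ops, x86_watts))  # ~2.1e9 ops/sec per watt
print(perf_per_watt(arm_ops, arm_watts))  # 8e9 ops/sec per watt
```

The same arithmetic explains why data center operators, whose bills are dominated by power and cooling, watch this ratio as closely as purchase price.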
Intel says its chips can match ARM in performance per watt, and perhaps so far they can. But the race is new and the outcome cannot be forecast. However, if history is any guide, ARM or more likely some future architecture that today isn’t even on most users’ radar will set new standards for performance per watt and bring to an end the formerly unlimited growth of the X86 hardware market. Basically, it seems reasonable to guess that a future architecture, maybe ARM, maybe something else, will do to X86 what X86 did to IBM’s legacy mainframe and OS/400 markets as well as to others such as DEC VAXes and HP 3000s as well as to Unix platforms from IBM, Hewlett-Packard, and the former Sun Microsystems, now part of Oracle.
Unix, because it lacks real standards that enable competition, has become just another legacy system. Linux, by contrast, may have been the bastard offspring of Unix, but it has evolved into one of the two giant server systems for X86. From a business perspective, Linux is more like Windows than Unix, and Unix is more like z/OS than anything else. The separation of Linux from Unix in a commercial sense is as important as the separation of Tycho Brahe from the goofy dueling culture of his day. As a young man, Brahe got into a spat that went from hot words to flashing steel. The outcome was disfigurement: Brahe lost the bridge of his nose. He covered his scar and shame with a gold-colored prosthesis (that may have been made of brass) and took away from that incident a keen appreciation of the value compromise can sometimes have.
When his work matured and his observations made the use of a heliocentric model of the solar system almost inescapable, Brahe devised a unique theory in which some planetary objects revolved around the Earth while others revolved around the sun. This largely saved him from potentially dangerous confrontations with Papal authorities who were desperately clinging to many traditional ideas because their power was already being challenged by the increasingly popular views of Martin Luther and his successors.
It seems likely that Brahe knew his model stunk scientifically even if it was brilliant philosophically, and that Brahe, nose or no nose, wisely fed his data to Kepler, who added lots of facts of his own and eventually managed to publish excellent studies of the skies that became academic classics. But even before Kepler expanded and extended Brahe’s observations, the Dane’s work was not only intellectually impressive but also quite a physical feat. Unlike Galileo, who created fabulous optical instruments for the study of the skies, Brahe worked without a telescope.
As the key developers of new computing platforms, such as Google’s Android team and Apple’s iOS developers, move ahead they will inevitably challenge their predecessors in the X86 and glass house worlds. Much of the prominent technology that has emerged in the world of mobile devices–touch screens, compact navigation satellite receivers, Bluetooth transceivers–consists of elements of an Internet of things. But less visible developments may ultimately reshape computing far more than some of the dramatically prominent gadgets.
One example is Google Wallet, the application of near field communications (NFC) that lets a mobile device serve as a collection of virtual credit cards. Initially tied to a single card, Wallet currently allows users to build bridges between their phones and any legacy credit account they might have. One phone can provide access to an essentially unlimited number of credit or debit cards. As Wallet catches on, Google will own the first port through which payment transactions must pass. That role may turn out to be the nose of the camel that will eventually enter the bean-counting tent where legacy systems along with X86 systems dominate.
Today, smartphones pass transactions to networks and server farms that don’t have phone-compatible processors. But many developers are trying to change that. They want to hang ARM, MIPS, or Intel Atom chips along the networks that link the devices originating transactions to the central facilities that process them for business and banking operations. They will succeed if they can absorb more of the workloads done by upstream systems and get the jobs done for less money and with less power and cooling.
Current ARM-based servers are mainly imitations of X86 servers built with different chips. That won’t do. Somebody, somewhere will have to combine the advanced capabilities of new computers with the capabilities of legacy and X86 systems. When that happens, the fastest growing branches of computing, which have long since decamped from glass houses to X86 farms, will travel once more.
Computer vendors and their customers who attempt to deny the trends that are apparent in unit sales of hardware, in technology companies’ financial results, and in the very way people conduct their business and personal lives are going to be left behind. They may have had the right ideas once, but then so did Ptolemy. This might be a good time to recognize that computing is not standing still. One might say of the locus of technology, as Galileo did the Earth and Luther his bowel, “and yet it moves.”