As I See It: The Visionary
July 9, 2012 Victor Rozek
He was a world-class marathon runner, but he is not known for that. He was a defender of human rights, but neither is he known for that. He is better remembered for having shortened a long and terrible war, and for originating the field that remade the world over the last half-century. That field is computer science, and his name is Alan Turing. Yet given his many achievements, and with everything to live for, why did he commit suicide at the age of 41?
Turing was a child of the British Empire. His father was a prominent functionary in the Indian Civil Service, and his mother, the daughter of a renowned railroad engineer. He was conceived in India, but in 1912 his parents briefly returned to England, presumably so that Alan could be born on British soil. It was an oppressive time, a period in English history when a young man’s future was rigidly determined by class, duty, and the demands of Empire. Success, in large part, depended upon admittance and advancement in the British public school system, available only to the upper classes. It was therefore decided that Alan and his older brother would be best served by staying in England. It was an unpromising start. According to biographer Andrew Hodges, Alan spent his youth “fostered in various English homes where nothing encouraged expression, originality, or discovery.”
Lacking parental support, Turing turned his attention to the only comfort available in his stodgy surroundings: books. One in particular captured his youthful imagination, Natural Wonders Every Child Should Know, and it became a preamble to his own life story, an entree into the world of science. Turing’s curious mind was coupled with uncommon determination. By chance, his first day at Sherborne School (the equivalent of high school) coincided with the General Strike of 1926. Transportation was unavailable, so he commandeered a bicycle and pedaled 60 miles in order not to miss class. Sherborne, however, was more oriented toward a classical education than a scientific one, and the headmaster was reportedly concerned that Turing was “wasting his time.”
But he managed to graduate and enrolled at King’s College, Cambridge, in 1931. There, in Hodges’ words, “he entered a world more encouraging to free-ranging thought.” That range included the work of John von Neumann on the logical foundations of quantum mechanics, and of Bertrand Russell on logic and mathematical truth. By 1934 he had earned a “distinguished degree,” followed by a Fellowship at King’s College, and a Smith’s Prize in 1936 for work on probability theory.
But the problem that fascinated Turing, and would focus his mind on how an intelligent machine could be developed, was the question of decidability (the Entscheidungsproblem) posed by German mathematician David Hilbert: Did there exist a definite method that could, in principle, be applied to any mathematical assertion, and which was guaranteed to produce a correct decision on whether that assertion was true? The ultimate answer was no; no such method exists. But proving that required a precise definition of what a “method” was, and Turing’s answer was something he called the Universal Turing Machine, a device constructed mostly in his mind.
In modern terminology, Turing saw that a mathematical assertion could be analyzed by a general-purpose machine executing an algorithm, what we would now call a computer running software. His key insight was that “symbols representing instructions are no different in kind from symbols representing numbers,” which would become the basis of computing: programs could be stored and manipulated as data. He essentially identified the means and methods of modern computing at a time when none of these things existed. It would take another nine years before electronic technology evolved enough to turn his imagination into engineering.
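That stored-program insight can be sketched in a few lines of modern code. Below is a minimal, hypothetical Turing machine simulator (the state names, symbols, and sample program are invented for illustration, not Turing’s own notation); the point is that the “program” is just a table of symbols, held in the same kind of storage as the tape it manipulates.

```python
# Minimal Turing machine simulator (illustrative sketch).
# The transition table is ordinary data: "symbols representing
# instructions" stored no differently from symbols on the tape.

def run(program, tape, state="start", blank="_", max_steps=1000):
    tape = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        # Look up (state, symbol) in the table; get what to write,
        # which way to move the head, and the next state.
        write, move, state = program[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip(blank)

# Example program: flip every bit of a binary string, then halt.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run(flip, "10110"))  # prints 01001
```

Swapping in a different transition table changes what the machine computes without touching the simulator itself, which is the essence of Turing’s universal machine.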
But Turing’s computational musings were interrupted by the Second World War. He abandoned his research and went to work for the British government at the Government Code and Cypher School headquartered at Bletchley Park. At the outbreak of hostilities, England was at a disadvantage: an isolated island dependent on shipping for its survival, while the Germans had built a formidable fleet of submarines. These U-boats hunted in groups called “wolfpacks,” sinking thousands of Allied ships. The British Isles were slowly being choked.
Key to winning the Battle of the Atlantic was decoding the German ciphers that controlled the movement of the submarine fleet. The Germans used variations of a sophisticated electro-mechanical rotor cipher machine called Enigma. Some progress had been made decoding German Luftwaffe communications, but the naval encryptions in particular were considered unbreakable. No one wanted to waste time on them, so Turing took on the challenge alone. And he succeeded. Once the codes were broken, the Allies could track German submarines, and hundreds were destroyed. The advantage in the Battle of the Atlantic shifted to the Allies. It is commonly thought that the ability to decrypt German ciphers shortened the European war by at least two years. Churchill went so far as to credit the winning of the war to the work done at Bletchley Park.
Toward the end of the war, electronic technology, according to Hodges, “made its first appearance at Bletchley Park.” Turing immediately saw the potential and planned to build “the embodiment of the Universal Turing Machine in electronic form.” In effect, he was proposing the invention of the digital computer. His innovative approach included “implementing arithmetical functions by programming rather than by building in electronic components,” a concept which at the time differed from that of American designs. He projected a computer “able to switch at will from numerical work to algebra, codebreaking, file handling, or chess-playing.” His Abbreviated Code Instructions marked the beginning of programming languages; but he also predicted that computers would expand their own programs, exhibiting “the faculties of the human mind.” As early as 1944, Turing spoke of “building a brain.” Hodges calls his 1950 paper, “Computing Machinery and Intelligence,” “a classic contribution to the philosophy and practice of Artificial Intelligence research.”
But then, just as technology was beginning to catch up to Turing’s prophetic imagination, the unthinkable happened.
The hero of Bletchley Park, the innovator, the visionary, was arrested early in 1952. The nation he had helped save now turned against him. As Hodges notes, while Nazi war criminals roamed free, Turing faced the choice of prison or poisoning. And his only crime was the unforgivable sin of being himself. Turing was gay, and he was apprehended after the police learned of his relationship with a young man. The arrest didn’t require much investigative skill because Turing never hid his homosexuality. He openly and volubly lived his life without shame or apology. Turing believed laws against homosexuals were medieval, born of ignorance and dark fears, and would surely be overturned. But this time, the man who was right about so much was wrong. In March 1952 he was convicted of “gross indecency,” and the court gave him an impossible choice: prison, or massive doses of estrogen. He chose chemical castration.
He surely must have found it ironic that, having helped win a war ostensibly fought for freedom, his own freedom should now be abridged. As a consequence of his conviction he lost his security clearance, and the British government lost a once-in-a-generation mind. Over time, the estrogen treatments caused him to grow breasts. By 1954 he had become despondent, and on June 8 he was found dead of cyanide poisoning, a half-eaten apple by his bedside.
It would take another 13 years before the law that drove Turing to suicide was repealed; homosexuality was decriminalized in Britain in 1967, and not until November 2000 was the age of consent finally equalized, despite fierce opposition from the moral descendants of Turing’s persecutors.
For its part, the British government chose to honor Turing on the anniversary of his birth with a centenary celebration. This year, dozens of events will be staged throughout the country honoring his contributions to mathematics, cryptography, computer science, artificial intelligence, philosophy, and a number of other fields touched by his genius.
In 2009, in an extraordinary departure from usual protocols, Prime Minister Gordon Brown offered a posthumous apology, saying how deeply sorry he was for what happened to Turing. He added, “The debt of gratitude he is owed makes it all the more horrifying that he was treated so inhumanely.”
Turing once observed: “We can only see a short distance ahead, but we can see plenty there that needs to be done.” His life, and his death, remain a monument to that truth.