National Review

April 16, 2012

  The Great Numbers Crunch

Turing's Cathedral: The Origins of the Digital Universe
        by George Dyson

—————————

This year marks the centenary of the birth of British mathematician Alan Turing, whose researches in the unlikely and very abstruse field of mathematical logic did much to create the world in which we now live. In 1936 Turing published a paper titled "On Computable Numbers" in the Proceedings of the London Mathematical Society. The paper received almost no attention. "Only two requests for reprints came in," George Dyson tells us. The reason for this is interesting — is, in fact, one of the main themes in Dyson's book.

It is an odd thing that in 1936, digital technologies were old hat. The coming age looked to be entirely analog. (Digital phenomena are staccato, stepping from one value direct to another: analog is legato, gliding smoothly through all intermediate points.) If you had asked a well-informed person of that date to point to some digital technologies, he would have cited the Western Union man with his green eyeshade and sleeve garters, tapping out Morse code on the telegraph key, or perhaps the Asian shopkeeper working his abacus. The spiffiest new devices were all analog: radio, movies, vinyl disks, the slide rule, and soon TV and radar.

So were the latest grand scientific theories. The spacetime of General Relativity flexed analogically to accommodate mass and charge. Quantum Mechanics contained some irritatingly digital elements — you can't have half a quantum — but the underlying equations were written in the comfortingly analogic language of traditional calculus. Only biology had been through a modest digital revolution. The notion of "blending inheritance" (a trait of the offspring falls halfway between the corresponding traits of its parents) had caused much vexation to 19th-century biologists, including Darwin, as it led logically to a population of clones; but no one could come up with an alternative. The 1900 rediscovery of Mendel's more digital theory resolved the issue.
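
A one-line calculation of my own, not anything in Dyson's book, shows where the clone problem comes from: if each offspring's trait is simply the average of two independent parental values, the spread of the trait is cut in half every generation,

$$\operatorname{Var}(X_{\text{child}}) = \operatorname{Var}\!\left(\tfrac{1}{2}\bigl(X_{\text{mother}} + X_{\text{father}}\bigr)\right) = \tfrac{1}{2}\operatorname{Var}(X_{\text{parent}}),$$

so within a few dozen generations the variation in a population is all but gone.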

Genetics aside, the late 1930s was thus a time of analog triumphalism. Across the following decades, everything changed. We now live in a thoroughly digital world. Digital gadgets twitter and beep all around us. At the deepest level, there are serious speculations that spacetime itself may be digital — Scientific American magazine recently did a cover story on the topic. Analog principles and gadgets survive only in a few pockets of the deepest reaction. My own house, for example, contains an analog TV set and — yes! — a slide rule.

George Dyson tells the story of this great conceptual and technological transformation in Turing's Cathedral, concentrating on the key years from 1936 to 1958. It was in that latter year that the computer at the Institute for Advanced Study in Princeton, New Jersey was decommissioned after seven years' operation. The IAS computer, which had no formal name (MANIAC, with which it is sometimes confused, was a clone machine at Los Alamos), was the brainchild of John von Neumann, one of the most tremendous geniuses who ever lived. He had been one of the first to notice Turing's 1936 paper — the two shared office space at Princeton. Turing studied for his Ph.D. at the university, 1936-38; von Neumann had come to the university in 1931, then been given a professorship at the new IAS in 1933.

They shared much else. Though both were brilliant pure mathematicians, neither disdained physical gadgetry. When researching a book on a famous conjecture in pure mathematics, I was surprised to learn that Turing conceived the idea of a mechanical computing device to disprove the conjecture, and had even cut some of the gear wheels himself in his college's engineering workshop.

There lay the reason for the lack of interest in Turing's 1936 paper. In it he had conceived the idea of a Universal Computing Machine: an imaginary device that could duplicate the behavior of any other computing machine you might think up. The paper was founded in the purest of pure mathematics, drawing on work by the previous generation of mathematical logicians, who themselves had built on work by David Hilbert, Whitehead and Russell, and earlier enquirers all the way back to Leibniz. Its centerpiece, though, was that machine. Dyson: "Engineers avoided Turing's paper because it appeared entirely theoretical," while "theoreticians avoided it because of the references to paper tape and machines."
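
To give a flavor of what that machine amounts to, here is a toy sketch of my own (not Turing's formalism, and the rule names are invented): a machine is nothing but a table of rules for reading and writing symbols on a tape, and since the table is itself just data, one suitably built machine can be fed another machine's table and imitate it. That is the universal machine.

```python
# A toy Turing machine, written as a plain rule table plus a loop.
# This is an illustrative sketch, not Turing's original formalism.

def run_machine(rules, tape, state="start", head=0, blank=" "):
    """Obey the rule table until the machine enters the 'halt' state."""
    cells = dict(enumerate(tape))              # sparse tape: position -> symbol
    while state != "halt":
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Example rule table: flip every bit, halting at the first blank cell.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}

print(run_machine(flip_bits, "10110"))   # prints "01001" followed by a blank
```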

John von Neumann's career at the Institute for Advanced Study hit the same fault line. The IAS had been conceived as a place where the greatest minds might think their lofty thoughts without the distraction of students, publication schedules, or academic politics — the purest of pure research institutes. Though not a tinkerer like Turing ("He would have made a lousy engineer," testified his colleague Hermann Goldstine), von Neumann was free of intellectual snobbery. He was in fact a worldly man, a bon vivant even — he never drove anything but Cadillacs — quite the opposite of the popular image of a math professor. He believed, he told J. Robert Oppenheimer, that mathematics grew best when nourished by "a certain contact with the strivings and problems of the world."

Holding such an opinion, von Neumann certainly lived at the right time. Never were there such world-wide strivings and problems as in the middle years of the 20th century; never were the contributions of mathematicians more essential. Job One was of course to win the greatest, most technologically sophisticated war ever fought. Typical problems were the accurate aiming of large artillery pieces and the understanding of the effects of powerful explosions. Both involve large arrays of complex mathematical expressions — differential equations — which must then be reduced to arithmetical algorithms to be cranked through by relays of computers.
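
To make "reduced to arithmetical algorithms" concrete, here is a small sketch of my own (the drag constant, muzzle velocity, and time step are invented for illustration): the differential equation governing a shell's flight becomes a loop of multiplications and additions, stepped forward in tiny increments of time, which is exactly the kind of drudgery the computing rooms performed.

```python
import math

# One differential equation reduced to arithmetic: a shell's flight with
# simple quadratic air drag, stepped forward by Euler's method. The drag
# constant, muzzle velocity, and time step are invented for illustration.

def shell_range(speed, angle_deg, drag=5e-5, g=9.81, dt=0.01):
    """Step the equations of motion until the shell returns to the ground."""
    angle = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = speed * math.cos(angle), speed * math.sin(angle)
    while y >= 0.0:
        v = math.hypot(vx, vy)              # current speed
        vx -= drag * v * vx * dt            # drag along the horizontal component
        vy -= (g + drag * v * vy) * dt      # gravity plus drag, vertical component
        x += vx * dt
        y += vy * dt
    return x

print(f"Estimated range at 45 degrees: {shell_range(800, 45):,.0f} meters")
```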

Hence von Neumann, writing to his wife on his arrival at Los Alamos in September 1943: "computers are, as you suspected, quite in demand here, too." This was of interest to Mrs von Neumann, as she was herself a capable computer. Until well into the 1950s, you see, the word "computer" meant "skilled human calculator." It was thus defined in the dictionaries of my own childhood. These human computers, working on a superior kind of electro-mechanical adding machine, had come into their own in support of gunnery projects in World War One, and in the interwar years had proved indispensable in other kinds of research: demography, weather forecasting, and the new science of operations research.

Turing's 1936 paper had opened up the possibility that these rooms full of human computers might be replaced by a single machine, with a single way of remembering both its data (the particular numbers to be operated on) and its algorithms (the sequences of instructions that define the operations). By the early 1940s this possibility had sunk in with researchers in Britain, Germany, and the U.S.A. Early prototypes of electronic computers were already in action, though too late for the Manhattan Project, whose numbers were crunched by human computers — initially the wives of physicists, then by draftees from the Army's Special Engineering Detachment.
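
That stored-program idea can be caricatured in a few lines (my sketch; the three-field instruction format is invented purely for illustration): instructions and data sit side by side in a single memory, and nothing but the program counter distinguishes them.

```python
# The stored-program idea in miniature: instructions and data occupy one
# memory, and only the program counter says which is which.

def run_program(memory):
    """Fetch and execute (op, a, b) triples; plain numbers are data cells."""
    pc = 0
    while True:
        op, a, b = memory[pc]
        if op == "add":                # memory[a] = memory[a] + memory[b]
            memory[a] += memory[b]
        elif op == "print":
            print(memory[a])
        elif op == "halt":
            return
        pc += 1

memory = [
    ("add", 4, 5),    # cell 0: add cell 5 into cell 4
    ("print", 4, 0),  # cell 1: print cell 4
    ("halt", 0, 0),   # cell 2: stop
    0,                # cell 3: unused
    10,               # cell 4: data
    32,               # cell 5: data
]
run_program(memory)   # prints 42
```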

With the hot war won, all this work passed into the shadow of the H-bomb. The computations here were at a yet higher level, but now true computers were coming into their own, with an assist from lots of cheap war surplus equipment. Techniques, too, had advanced. One especially dear to von Neumann's heart was the Monte Carlo Method, to which Dyson gives over a whole chapter. Suppose you want to know the likelihood of some event — say, that a stick of given length, cast down onto a planked floor whose boards are a given width, will end by lying across one of the floorboard-cracks. You could do the appropriate math — it's a straightforward problem in Measure Theory — or you could just try the thing a few thousand times, throwing the stick at random angles and speeds, and average out the results. The latter is the Monte Carlo Method, sometimes unkindly disparaged as: What to do until the mathematician arrives.
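
For the curious, the floorboard experiment is easy to replay on a modern machine. The sketch below (mine, with stick length and board width chosen arbitrarily) throws the stick a hundred thousand times and compares the tally with the exact answer, which for a stick no longer than a board's width is twice the stick's length divided by pi times the board's width.

```python
import math, random

# Monte Carlo applied to the floorboard example: throw the stick at random
# a great many times and count the crossings. Stick length and board width
# are arbitrary choices; the stick is no longer than a board is wide.

def crossing_probability(stick=1.0, board=1.5, trials=100_000):
    crossings = 0
    for _ in range(trials):
        center = random.uniform(0, board)            # midpoint's offset within a board
        angle = random.uniform(0, math.pi)           # stick's angle to the cracks
        reach = (stick / 2) * math.sin(angle)        # span across the boards
        if center - reach < 0 or center + reach > board:
            crossings += 1
    return crossings / trials

estimate = crossing_probability()
exact = 2 * 1.0 / (math.pi * 1.5)    # the answer the mathematician would give
print(f"Monte Carlo estimate: {estimate:.4f}   exact: {exact:.4f}")
```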

Not surprisingly, von Neumann was a keen gambler: he met his wife in the casino at Monte Carlo. The method had great appeal to him, in spite of some knotty conceptual issues around the definition of "random." It was also well suited to the new devices, and was used to model the paths of neutrons through fissile material. This work all came to triumph on November 1, 1952, with the successful detonation of the first H-bomb. Just six months later James Watson and Francis Crick published their landmark paper on the structure of DNA, and modern science, of both the "dry" and the "wet" variety, entered into digital adulthood.

The true hero of Dyson's book, it can be seen, is not Alan Turing, though Turing's momentous contributions are properly described and appreciated. It is John von Neumann who holds the story together.

In July of 1955, aged just 51, von Neumann suddenly collapsed, and was diagnosed with advanced cancer. Nineteen agonizing months later, this colossal intellect left the world he had done so much to transform. With his influence decisively removed — it had already been weakened in 1954, when Eisenhower appointed him to the Atomic Energy Commission — and with the pacifistic world-government flim-flam favored by Einstein and others in the ascendant, the purists at the IAS made their comeback. Said IAS physicist Freeman Dyson (the author's father): "When von Neumann tragically died, the snobs took their revenge and got rid of the computer project root and branch." It would be twenty-two years before the IAS next had a computer.