Turing's Cathedral: The Origins of the Digital Universe (original 2012; edition 2012)

by George Dyson

Members: 481 | Reviews: 15 | Popularity: 21,406 | Average rating: 3.65 | Mentions: 12
Member:pierthinker
Title:Turing's Cathedral: The Origins of the Digital Universe
Authors:George Dyson
Info:Allen Lane (2012), Hardcover, 432 pages
Collections:Your library
Rating:***
Tags:None

Work details

Turing's Cathedral: The Origins of the Digital Universe by George Dyson (2012)


Showing 1-5 of 14 English reviews (15 in all languages, including 1 in Danish)
If you are looking for information about Alan Turing, look elsewhere. The title is a metaphor.

The Nazis did the U.S. a huge favor with their boorish and stupid racial policies. Many brilliant mathematicians and physicists were Jewish, and when the Nazi “cleansing” of the universities began, people like von Neumann, Einstein, and many others fled to the United States, where they were of immense assistance in the development of the atomic bomb.

This book is about the origins and development of the digital age, and Dyson spends considerable space on the people and institutions key to that development. The Institute for Advanced Study in Princeton, for example, under Abraham Flexner and Oswald Veblen, recruited many of these refugees, who helped build the Institute into one of the premier research institutions. I suppose it all has special interest for me, as my life span parallels the development of the computer. I was born in 1947. In the 7th grade I became fascinated by ham radio and electronics and studied the intricate workings of the vacuum tube, a device for which I still have some reverence. I’m still dismantling and messing with the insides of computers.

Ironically, given the book’s title, John von Neumann takes center stage, with Turing playing only a peripheral role. Von Neumann’s interest in digital computation was apparently sparked by reading Turing’s seminal paper “On Computable Numbers,” which led him to realize the importance of stored-program processing.

What Turing did that was so crucial was to build on Gödel’s proof of the incompleteness theorem, in which numbers are made to carry two meanings at once. From that, Turing conceived the paper-tape computer, in which data and code take the same form on the same tape. That realization alone provided the basic building block for the computer.
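To make that code-as-data idea concrete, here is a minimal sketch (my own illustration, not anything from the book or the review; the function name and the toy “flip the bits” program are invented for the example) of a Turing-machine simulator in which the machine’s program is just another piece of data:

```python
# Illustrative sketch only: a minimal Turing-machine simulator. The "program"
# (the transition table) is itself just data -- a plain dictionary -- which is
# the blurring of code and data the review credits to Godel and Turing.

def run_turing_machine(program, tape, state="start", blank="_", max_steps=1000):
    """program maps (state, symbol) -> (new_state, symbol_to_write, move)."""
    tape = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, write, move = program[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# A trivial program that flips every bit and halts at the first blank cell.
flip_bits = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine(flip_bits, "10110"))  # -> "01001_"
```

The point is simply that the transition table and the tape contents are the same kind of stuff; a different dictionary is a different machine.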

The builders had conflicting views of the incredible computational power they had unleashed, which was to be used for both good and ill. Von Neumann recognized this: “A tidal wave of computational power was about to break and inundate everything in science and much elsewhere, and things would never be the same.”

It would have been impossible to develop the atomic bomb without the computational abilities of the new “computers.” So naturally, the Manhattan Project is covered, along with the influence of the evil Dr. Teller (I must remember to get his biography), often cited as a model for Dr. Strangelove, the character brilliantly played by Peter Sellers. After the war, Teller pushed very hard for the development of the “super-bomb,” even though he knew, or must have known, that his initial calculations were flawed, because he didn’t have the computational power to do them completely. One number I questioned was Dyson’s report that when the Russians exploded a three-stage hydrogen bomb in 1961, the force released was equivalent to 1% of the sun’s power. That sounds wildly improbable. Can anyone check that number?
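As a rough sanity check (my own back-of-envelope figures and assumptions, not Dyson’s numbers): the 1961 Soviet test yielded roughly 50 megatons, and a bomb’s energy is released over a few tens of nanoseconds, so the claim is plausible if it refers to instantaneous power rather than total energy:

```python
# Back-of-envelope check of the "1% of the sun's power" figure. All values are
# my assumptions (approximate yield, assumed release time), not from the book.

MEGATON_TNT_J = 4.184e15          # joules per megaton of TNT
yield_mt = 50                     # approx. yield of the 1961 Soviet test
release_time_s = 4e-8             # assumed ~40 ns energy-release time

energy_j = yield_mt * MEGATON_TNT_J          # ~2.1e17 J
power_w = energy_j / release_time_s          # ~5e24 W

SUN_LUMINOSITY_W = 3.8e26                    # total power output of the Sun

print(f"Bomb energy:     {energy_j:.2e} J")
print(f"Peak power:      {power_w:.2e} W")
print(f"Fraction of Sun: {power_w / SUN_LUMINOSITY_W:.1%}")  # ~1.4%
```

On those assumptions the peak power comes out at roughly 1–2% of the Sun’s output, so the figure is not as improbable as it sounds, provided it means power, not total energy.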

Some interesting little tidbits: one computational scientist refused to use the new VDTs, preferring to stick with punched cards (he obviously never dropped a box of them), which seemed far more tangible to him than dots on a screen. I guess fear of new technology is not reserved for non-scientists.

One of the major and very interesting questions addressed by Turing, and reported on in the book, is what we now call artificial intelligence. When we use a search engine, are we learning from the search engine, or is the search engine learning from us? It would appear that, currently, the latter may be true. Clearly, the search engines have been designed to store information and to use that information to learn things about us, both as a group and individually. I suspect that programs now make decisions based on that accumulation of knowledge. Is that not one definition of intelligence? (I will again highly recommend a book written and read quite a while ago that foresaw many of these issues: The Adolescence of P-1 by Thomas Ryan (1977).** Note that Turing talked about the adolescence of computers, likening them to children.)

Some reviewers have taken Dyson to task for emphasizing the abstract reasoning that went into the development of the computer while downplaying the role of the electrical engineers (Eckert and Mauchly) in actually building the things. I’ll leave that argument to others, not caring a whit who should get the credit and being in awe of both parties. On the other hand, the book does dwell more on the personalities than on the intricacies of computing. There are some fascinating digressions, however, such as the examination of digital vs. analog computing, and how the future of computing might have been altered had von Neumann not died so tragically young, given his great interest in biological computing and the relationship of the brain to the computer.

**For a plot summary of The Adolescence of P-1 see https://en.wikipedia.org/wiki/The_Adolescence_of_P-1
3 vote ecw0647 | Jul 28, 2015 |
This would have been better titled von Neumann's Kingdom, but I'm sure the question of whether or not to capitalize the V in 'von' would have driven them crazy. There's actually not all that much on Turing himself or his work, really. Still, it's a good book, and it should be of interest to those of you who enjoy learning more about the history of computing and the circumstances in which much of the work around it began.
  tlockney | Sep 7, 2014 |
A fascinating and illuminating book, but also a frustrating one because it should have been a lot better than it is.

The heart of the story is more or less on target – a collection of very interesting anecdotes and narratives about the personalities involved in building America's first computer, at Princeton's Institute for Advanced Study after the Second World War. Leading the team was the quite extraordinary figure of John von Neumann, about whom I knew rather little before reading this. He comes across as by far the most brilliant mind in these pages (not excluding the presence of one A. Einstein), with a near-eidetic memory, an ability to understand new concepts instantly and make staggering leaps of reasoning to advance them further. Not a very endearing character, though – a refugee from 1930s Europe, he pushed the nuclear programme hard and argued to the end of his life that the best way to create world peace was to launch a full ‘preventive’ hydrogen bomb strike against the USSR, mass civilian deaths and all.

The nuclear project was central to the invention of the computer. The machine's early incarnations (the ENIAC, and the MANIAC that followed it) were put to work specifically modelling fission reactions, which involve some rather tricky maths. But von Neumann and other thinkers realised early on that a machine capable of doing that would also be able to fulfil Alan Turing's description of a ‘universal computer’: if it could do the arithmetic, it turned out, it could do practically anything else too, provided there was a way of feeding it instructions.

‘It is an irony of fate,’ observes Françoise Ulam, ‘that much of the hi-tech world we live in today, the conquest of space, the extraordinary advances in biology and medicine, were spurred on by one man's monomania and the need to develop electronic computers to calculate whether an H-bomb could be built or not.’

What is particularly fascinating is how these studies gradually led to a blurring of the distinction between life and technology. The development of computing coincided with the discovery of the structure of DNA, which showed that life is essentially built from a digital code (Nils Barricelli described strings of DNA as ‘molecule-shaped numbers’), and early programmers soon discovered that lines of code in replicating systems would display survival-of-the-fittest phenomena. This is entering a new era with the advent of cloud-sourcing and other systems by which computing is, in effect, becoming analog and statistics-based again – search engines are a fair example.

How can this be intelligence, since we are just throwing statistical, probabilistic horsepower at the problem, and seeing what sticks, without any underlying understanding? There's no model. And how does a brain do it? With a model? These are not models of intelligent processes. They are intelligent processes.

All of this is very bracing and very interesting. Unfortunately the book is let down by a couple of major problems. The first is a compulsion to give what the author presumably thinks is ‘historical colour’ every two minutes – so the IAS is introduced by way of an entire chapter tracing the history of local Algonquin tribes, Dutch traders, William Penn and the Quakers, the War of Independence – all of which is at best totally irrelevant and at worst a fatal distraction.

The second, even more serious failing is that the technology involved remains extremely opaque. One of the most crucial parts of this story should be following the development of cathode-ray storage, early transistors, and the first moves from machine language to codes and programs. But the explanations in here are poor or non-existent. Terms like ‘shift register’, ‘stored-program computer’ and ‘pulse-frequency-coded’ are thrown around as though we should all be familiar with them.

My favourite story to do with the invention of the digital world involves Claude Shannon and his remarkable realisation – one of the most civilisation-altering ideas of our species – that electrical switching circuits could implement Boolean logic, so that switches (and later transistors) could work as logic gates. It's not even mentioned in this book. And so the crucial early building blocks of what a computer actually is – how, on a fundamental level, it really does what it does – are all missing. And that's a pretty serious omission from someone who finds it necessary to go back to the Civil War every couple of chapters.
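For what it's worth, here is a small illustrative sketch of the idea the review says is missing (my own framing and function names, not Shannon's notation or anything from the book): model a switch as a Boolean value, with series wiring as AND and parallel wiring as OR, and ordinary logic, even a half adder, falls out.

```python
# Illustrative sketch (my own, not from the book): switching circuits as logic.
# A relay/switch either passes current (True) or blocks it (False);
# series wiring is AND, parallel wiring is OR.

def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed: AND."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either is closed: OR."""
    return a or b

def nand(a: bool, b: bool) -> bool:
    """A normally-closed contact driven by a series pair gives NAND,
    which is universal: every other gate can be built from it."""
    return not series(a, b)

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """One-bit addition built purely from the switch primitives above."""
    carry = series(a, b)
    total = series(parallel(a, b), nand(a, b))   # XOR via (a OR b) AND NOT(a AND b)
    return total, carry

print(half_adder(True, True))   # (False, True): 1 + 1 = 10 in binary
```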

A lot of reviews here, especially from more technical experts, really hate this book, but on balance I'd still recommend it. It's a very important story about a very important period, and the later chapters especially have a lot of great material. But reading around the text will probably be necessary, when this book should have offered a complete package.
3 vote Widsith | Jul 7, 2014 |
In large part a nonlinear biography of John von Neumann and history of the Institute for Advanced Study in Princeton, where von Neumann in the late 1940s did his influential-ever-after "architecting" of the MANIAC computer. Woven in are discussions of many relevant topics such as Gödel/Turing metamathematics, early ways of programming, Monte Carlo approximation, the theory of self-reproducing automata, and today's accelerating trend towards a compu-singularity. (Why not mention the obliteration of privacy, Mr Dyson?) Overwhelming everything, however, is the dreary -- nay, sick and ghastly -- fact that nuclear weaponry and other military evils were the main driving force behind the building of the first electronic digital computers with Turing universality. A powerful, discerning, penetrating book.
  fpagan | Aug 29, 2013 |
Couldn't get into it - the style of writing was very irritating. Gave up after 3 chapters.
  SChant | May 3, 2013 |
Epigraph
It was not made for those who sell oil or sardines . . .
--G. W. Leibniz
First words
At 10:30 P.M. on March 3, 1953, in a one-story brick building at the end of Olden Lane in Princeton, New Jersey, Italian Norwegian mathematical biologist Nils Aall Barricelli inoculated a 5-kilobyte digital universe with random numbers generated by drawing playing cards from a shuffled deck.
Quotations
A fine line separates approximation from simulation, and developing a model is the better part of assuming control. So as not to shoot down commercial airliners, the SAGE (Semi-Automatic Ground Environment) air defense system that developed out of MIT’s Project Whirlwind in the 1950s kept track of all passenger flights, developing a real-time model that led to the SABRE (Semi-Automatic Business Related Environment) airline reservation system that still controls much of the passenger traffic today. Google sought to gauge what people were thinking, and became what people were thinking. Facebook sought to map the social graph, and became the social graph. Algorithms developed to model fluctuations in financial markets gained control of those markets, leaving human traders behind. “Toto,” said Dorothy in The Wizard of Oz. “I’ve a feeling we’re not in Kansas anymore.”

What Americans termed “artificial intelligence” the British termed “mechanical intelligence,” a designation that Alan Turing considered more precise. We began by observing intelligent behavior (such as language, vision, goal-seeking, and pattern-recognition) in organisms, and struggled to reproduce this behavior by encoding it into logically deterministic machines. We knew from the beginning that this logical, intelligent behavior evident in organisms was the result of fundamentally statistical, probabilistic processes, but we ignored that (or left the details to the biologists), while building “models” of intelligence, with mixed success.

Through large-scale statistical, probabilistic information processing, real progress is being made on some of the hard problems, such as speech recognition, language translation, protein folding, and stock market prediction – even if only for the next millisecond, now enough time to complete a trade. How can this be intelligence, since we are just throwing statistical, probabilistic horsepower at the problem and seeing what sticks, without an underlying understanding? There’s no model. And how does a brain do it? With a model? These are not models of intelligent processes. They ARE intelligent processes.

References to this work on external resources.

Wikipedia in English (3)


Amazon.com Product Description (ISBN 0375422773, Hardcover)

“It is possible to invent a single machine which can be used to compute any computable sequence,” twenty-four-year-old Alan Turing announced in 1936. In Turing’s Cathedral, George Dyson focuses on a small group of men and women, led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey, who built one of the first computers to realize Alan Turing’s vision of a Universal Machine. Their work would break the distinction between numbers that mean things and numbers that do things—and our universe would never be the same.
 
Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars.
 
Dyson’s account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It’s no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time.
 
How did code take over the world? In retracing how Alan Turing’s one-dimensional model became John von Neumann’s two-dimensional implementation, Turing’s Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.

(retrieved from Amazon Thu, 12 Mar 2015 18:21:31 -0400)


"Legendary historian and philosopher of science George Dyson vividly re-creates the scenes of focused experimentation, incredible mathematical insight, and pure creative genius that gave us computers, digital television, modern genetics, models of stellar evolution--in other words, computer code. In the 1940s and '50s, a group of eccentric geniuses--led by John von Neumann--gathered at the newly created Institute for Advanced Study in Princeton, New Jersey. Their joint project was the realization of the theoretical universal machine, an idea that had been put forth by mathematician Alan Turing. This group of brilliant engineers worked in isolation, almost entirely independent from industry and the traditional academic community. But because they relied exclusively on government funding, the government wanted its share of the results: the computer that they built also led directly to the hydrogen bomb. George Dyson has uncovered a wealth of new material about this project, and in bringing the story of these men and women and their ideas to life, he shows how the crucial advancements that dominated twentieth-century technology emerged from one computer in one laboratory, where the digital universe as we know it was born"--"Legendary historian and philosopher of science George Dyson vividly re-creates the scenes of focused experimentation, incredible mathematical insight, and pure creative genius that gave us computers, digital television, modern genetics, models of stellar evolution--in other words, computer code"--… (more)

(summary from another edition)




Rating

Average: 3.65
0.5 stars: 0
1 star: 2
1.5 stars: 0
2 stars: 9
2.5 stars: 0
3 stars: 19
3.5 stars: 5
4 stars: 24
4.5 stars: 4
5 stars: 16

Audible.com

An edition of this book was published by Audible.com.

