Turing's Cathedral: The Origins of the Digital Universe (original 2012; edition 2012)

by George Dyson

Members: 575 | Reviews: 19 | Popularity: 17,144 | Average rating: 3.66 | Mentions: 14
Member: megamorg
Title: Turing's Cathedral: The Origins of the Digital Universe
Authors: George Dyson
Info: Pantheon Books (2012), Hardcover, 432 pages
Collections: Your library
Rating:
Tags: Computers, Science, Innovation, IAS

Work details

Turing's Cathedral: The Origins of the Digital Universe by George Dyson (2012)


Reviews: English (17), Spanish (1), Danish (1), All (19)
Showing 1-5 of 17
This book is an illuminating history of how and why the first computers were built and of the people who built them. It is also a survey of the problems those machines were destined to solve. And while these short descriptions are accurate, they do little to convey how powerful this book is. It set off a chain reaction in my intellectual life.

Why? It conveys how special computer technology is, how many people struggled to make it work, and how cheaply and easily we, today, can gain access to this medium of computation and thought. It also shares a vision for computers that does not come across in the mainstream -- a vision that mathematicians dreamed up not so long ago. It intimately ties computers to our own birth/death cycles via the exploration of DNA and the development of the atom bomb (computers were funded to perfect destruction, not just to power desktop or mobile apps). These and many other ideas and events are woven together into an awe-inspiring and personal story, rooted in the author’s own childhood.

I enjoy many other things about this book. I like that you won’t find any mention of Microsoft, Apple, or other current players in computing (nor will you learn much about the title character, Turing, unfortunately). The book rightly focuses on the past, Princeton, the ENIAC, and the people who transported these ideas across the physical barrier (sometimes with the use of bicycle wheels and wire). I also like the liberties Dyson takes to create metaphors for technology; they make the subject a lot less dry and add a sense of wonder.

Dyson admirably brings in the viewpoints of women and of people who are not normally discussed in association with technology. He speaks about the Native Americans who inhabited Princeton and even writes about what happened to the natives of Enewetak before the atoll was annihilated by the first hydrogen bomb.

So I’ve convinced you that I’m a fan, but this book does deserve some criticism. Its intended audience is not clear. Dyson alludes to complex ideas in passing, and I imagine some readers would be caught off guard and discouraged from reading further. The chapter on Gödel was especially esoteric (though motivating). I also wish more references had been offered for self-study of computer engineering (this is a good one: https://www.librarything.com/work/7767819).

People have pointed to other faults of the book, and I disagree with some of them. First, this book is not misleading. Some poor reviews complain that details of the architecture and planning phases of ENIAC aren't included; there is no need for them, since the author is concentrating on the main conceptual developments. Some also say that mysticism pervades this book. I would argue that is an exaggeration, and any hint of that tone actually helps us remember how special computers are. I certainly feel some "magic" and emotion in the fact that numbers and transistors have been able to conquer so much of our world. Others say the writing is vague and the story has too many loose ends. I think Dyson leaves loose ends to pique your curiosity so you’ll find more sources; this is not an encyclopedic reference but a personal perspective. Finally, some accuse this book of being scattered and random. Know that it is organized thematically, not chronologically. This does cause the story to jump around in time, but the overall organization is great if you're more interested in making connections.

Give this book a chance. You may not like it at first, but the chapters can often be read independently, and the historical accounts are eye-opening. They might lead you down a path, as they did for me.
1 vote danrk | Nov 16, 2016 |
Tough going if you don't know much about computing. Most science frightens the hell out of me these days. This book has some interesting speculation about the march to mechanical intelligence. Completing the book induced a pervading depression for me.
1 vote ivanfranko | Jul 12, 2016 |
Excellent narrative that makes clear the link between the computing revolution and the race for better military calculations, especially the modeling of nuclear blasts.
  steve.lane | Nov 28, 2015 |
Has nothing to do with Alan Turing; it is more a high-society history of the beginnings of Princeton, plus a lengthy, overly detailed technical manual on how the first computers worked. NO narrative at all.
  Victor_A_Davis | Sep 18, 2015 |
If you are looking for information about Alan Turing, look elsewhere. The title is a metaphor.

The Nazis did the U.S. a huge favor with their boorish and stupid racial policies. Many of Europe's most brilliant mathematicians and physicists were Jewish, and when the Nazi “cleansing” of the universities began, people like von Neumann, Einstein, and many others fled to the United States, where they were of immense assistance in the development of the atomic bomb.

This book is about the origins and development of the digital age, and Dyson spends considerable space on the people and institutions key to that development. The Institute for Advanced Study in Princeton, for example, under Abraham Flexner and Oswald Veblen, recruited many of these refugees, who helped build the Institute into one of the premier research institutions. I suppose it all has special interest for me, as my life span parallels the development of the computer. I was born in 1947. In the 7th grade I became fascinated by ham radio and electronics and studied the intricate workings of the vacuum tube, a device for which I still have some reverence. I’m still dismantling and messing with the insides of computers.

Ironically, given the book’s title, John von Neumann takes center stage, with Turing playing only a peripheral role. Von Neumann’s interest in digital computation was apparently sparked by reading Turing’s seminal article, “On Computable Numbers,” which led him to realize the importance of stored-program processing.

What Turing did that was so crucial was to take Gödel’s proof of the incompleteness theorem, which showed how numbers could carry two meanings, and turn it into his paper-tape machine, on which the same tape holds both data and code. That realization alone provided the basic building block for the computer.
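
To make that concrete, here is a minimal sketch of the stored-program idea in Python rather than on paper tape (my own illustration, not anything from the book; the two opcodes and the memory layout are invented for the example). One flat memory holds both the instructions and the numbers they operate on, and the machine has no way to tell which is which:

# A toy stored-program machine. Opcodes (invented for this sketch):
#   0 = HALT
#   1 = ADD src, dst   (memory[dst] += memory[src])

def run(memory):
    """Execute instructions from memory until a HALT (0) is reached."""
    pc = 0  # program counter: an address into the same memory as the data
    while memory[pc] != 0:               # opcode 0: HALT
        if memory[pc] == 1:              # opcode 1: ADD src, dst
            src, dst = memory[pc + 1], memory[pc + 2]
            memory[dst] += memory[src]   # operands are addresses, which are
            pc += 3                      # themselves just numbers in memory
        else:
            raise ValueError(f"unknown opcode {memory[pc]}")
    return memory

# Cells 0-3 are "numbers that do things" (code); cells 4-5 are
# "numbers that mean things" (data). Nothing in the machine marks
# the difference, which is exactly the point.
memory = [1, 4, 5,   # ADD: memory[5] += memory[4]
          0,         # HALT
          2,         # data
          40]        # data -> becomes 42
print(run(memory))   # [1, 4, 5, 0, 2, 42]

Because code and data share one memory, a program can in principle rewrite its own instructions, a property the early stored-program designs exploited.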

The builders had conflicting views of the incredible computational power they had unleashed, which was to be used for both ill and good. Von Neumann recognized this: “A tidal wave of computational power was about to break and inundate everything in science and much elsewhere, and things would never be the same.”

It would have been impossible to develop the atomic bomb without the computational abilities of the new “computers.” So naturally, the Manhattan Project is covered along with the influence of the evil Dr. Teller (I must remember to get his biography,) who was the character (Dr. Strangelove) brilliantly played by Peter Sellers. After the war, Teller pushed very hard for the development of the “super-bomb” even though he knew, or must have known, that his initial calculations were flawed because he didn’t have the computational power to do them completely. One number that I questioned was the Dyson’s reporting that when the Russians exploded a three-stage hydrogen bomb in 1961, the force released was equivalent to 1% of the sun’s power. That sounds wildly improbable. Anyone able to contradict number?
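
For what it’s worth, the figure becomes plausible if it refers to power (energy per unit time) rather than total energy. Here is a rough back-of-envelope check in Python; the ~50-megaton yield and the solar output are standard public numbers, while the tens-of-nanoseconds energy-release time is my own loose assumption:

# Sanity check of the "1% of the sun's power" claim, read as
# instantaneous power rather than total energy released.

MT_TNT_J = 4.184e15          # joules per megaton of TNT
yield_mt = 50                # Tsar Bomba, October 1961, ~50 megatons
burn_s   = 4e-8              # assumed energy-release time (~40 ns)
sun_W    = 3.8e26            # solar luminosity in watts

energy_J = yield_mt * MT_TNT_J        # ~2.1e17 J in total
power_W  = energy_J / burn_s          # ~5e24 W while it lasts

print(f"peak power ~ {power_W:.1e} W")
print(f"fraction of solar output ~ {power_W / sun_W:.1%}")   # ~1.4%

So for a few tens of nanoseconds the bomb’s power output could indeed have been on the order of 1% of the sun’s; as a comparison of total energy, the claim would be absurd.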

Some interesting little tidbits: one computational scientist refused to use the new VDTs (video display terminals), preferring to stick with punched cards (he obviously never dropped a box of them), which seemed far more tangible to him than dots on a screen. I guess fear of new technology is not reserved for non-scientists.

One of the major and very interesting questions addressed by Turing and reported on in the book is what we now call artificial intelligence. When we use a search engine, are we learning from the search engine, or is the search engine learning from us? It would appear that currently the latter may be true. Clearly, search engines have been designed to store information and to use that information to learn things about us, both as a group and individually. I suspect that programs now make decisions based on that accumulation of knowledge. Is that not one definition of intelligence? (I will again highly recommend a book, written and read quite a while ago, that foresaw many of these issues: The Adolescence of P-1 by Thomas Ryan (1977).** Note that Turing talked about the adolescence of computers, likening them to children.)

Some reviewers have taken Dyson to task for emphasizing the abstract reasoning that went into the development of the computer while downplaying the role of the electrical engineers (Eckert and Mauchly) in actually building the things. I’ll leave that argument to others, not caring a whit who should get the credit and being in awe of both parties. On the other hand, the book does dwell more on the personalities than on the intricacies of computing. There are some fascinating digressions, however, such as the examination of digital vs. analog computing, and how the future of computing might have been altered had von Neumann not died so tragically young, given his great interest in biological computing and the relationship of the brain to the computer.

**For a plot summary of The Adolescence of P-1 see https://en.wikipedia.org/wiki/The_Adolescence_of_P-1
3 votes ecw0647 | Jul 28, 2015 |
Epigraph
It was not made for those who sell oil or sardines . . .
--G. W. Leibniz
First words
At 10:30 P.M. on March 3, 1953, in a one-story brick building at the end of Olden Lane in Princeton, New Jersey, Italian Norwegian mathematical biologist Nils Aall Barricelli inoculated a 5-kilobyte digital universe with random numbers generated by drawing playing cards from a shuffled deck.
Quotations
A fine line separates approximation from simulation, and developing a model is the better part of assuming control. So as not to shoot down commercial airliners, the SAGE (Semi-Automatic Ground Environment) air defense system that developed out of MIT’s Project Whirlwind in the 1950s kept track of all passenger flights, developing a real-time model that led to the SABRE (Semi-Automatic Business Related Environment) airline reservation system that still controls much of the passenger traffic today. Google sought to gauge what people were thinking, and became what people were thinking. Facebook sought to map the social graph, and became the social graph. Algorithms developed to model fluctuations in financial markets gained control of those markets, leaving human traders behind. “Toto,” said Dorothy in The Wizard of Oz. “I’ve a feeling we’re not in Kansas anymore.”

What Americans termed “artificial intelligence” the British termed “mechanical intelligence,” a designation that Alan Turing considered more precise. We began by observing intelligent behavior (such as language, vision, goal-seeking, and pattern-recognition) in organisms, and struggled to reproduce this behavior by encoding it into logically deterministic machines. We knew from the beginning that this logical, intelligent behavior evident in organisms was the result of fundamentally statistical, probabilistic processes, but we ignored that (or left the details to the biologists), while building “models” of intelligence – with mixed success.

Through large-scale statistical, probabilistic information processing, real progress is being made on some of the hard problems, such as speech recognition, language translation, protein folding, and stock market prediction – even if only for the next millisecond, now enough time to complete a trade. How can this be intelligence, since we are just throwing statistical, probabilistic horsepower at the problem and seeing what sticks, without an underlying understanding? There’s no model. And how does a brain do it? With a model? These are not models of intelligent processes. They ARE intelligent processes.

Amazon.com Product Description (ISBN 0375422773, Hardcover)

“It is possible to invent a single machine which can be used to compute any computable sequence,” twenty-four-year-old Alan Turing announced in 1936. In Turing’s Cathedral, George Dyson focuses on a small group of men and women, led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey, who built one of the first computers to realize Alan Turing’s vision of a Universal Machine. Their work would break the distinction between numbers that mean things and numbers that do things—and our universe would never be the same.
 
Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars.
 
Dyson’s account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It’s no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time.
 
How did code take over the world? In retracing how Alan Turing’s one-dimensional model became John von Neumann’s two-dimensional implementation, Turing’s Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.

(retrieved from Amazon Thu, 12 Mar 2015 18:21:31 -0400)


"Legendary historian and philosopher of science George Dyson vividly re-creates the scenes of focused experimentation, incredible mathematical insight, and pure creative genius that gave us computers, digital television, modern genetics, models of stellar evolution--in other words, computer code. In the 1940s and '50s, a group of eccentric geniuses--led by John von Neumann--gathered at the newly created Institute for Advanced Study in Princeton, New Jersey. Their joint project was the realization of the theoretical universal machine, an idea that had been put forth by mathematician Alan Turing. This group of brilliant engineers worked in isolation, almost entirely independent from industry and the traditional academic community. But because they relied exclusively on government funding, the government wanted its share of the results: the computer that they built also led directly to the hydrogen bomb. George Dyson has uncovered a wealth of new material about this project, and in bringing the story of these men and women and their ideas to life, he shows how the crucial advancements that dominated twentieth-century technology emerged from one computer in one laboratory, where the digital universe as we know it was born"--"Legendary historian and philosopher of science George Dyson vividly re-creates the scenes of focused experimentation, incredible mathematical insight, and pure creative genius that gave us computers, digital television, modern genetics, models of stellar evolution--in other words, computer code"--… (more)

(summary from another edition)


Rating

Average: 3.66
0.5: 0 | 1: 3 | 1.5: 0 | 2: 9 | 2.5: 0 | 3: 21 | 3.5: 6 | 4: 30 | 4.5: 4 | 5: 18

Audible.com

An edition of this book was published by Audible.com.

