

Turing's Cathedral: The Origins of the Digital Universe (2012)

by George Dyson

Other authors: See the other authors section.

Members: 725 | Reviews: 21 | Popularity: 19,404 | Average rating: 3.55 | Mentions: 15
Recently added by wwj, DSBarberis, vandaaway, fdmiranda, thindor, livus, private library, PJB84, hobus


No current Talk conversations about this book.


English (19)  Spanish (1)  Danish (1)  All languages (21)
Showing reviews 1-5 of 19
George Dyson's book (2012) is about the origins of the modern computer, developed in a time when a computer wore skirts. A 'computer' was a woman with an adder, a calculating machine. These women worked in teams to create firing tables for, say, grenade launchers, calculating the right trajectory for any given configuration.

The machine Dyson writes about was of a different kind: the kind with vacuum tubes and wires, a Turing machine like a walk-in closet, a cathedral of calculation. He recalls the history of computers with names like ENIAC and MANIAC, the brainchildren not only of Babbage and Turing, but most of all of John von Neumann. MANIAC made calculations about the weather, the evolution of species, and tables used by anti-aircraft gunners, but it was famous (or maybe notorious) for the calculations it did on the H-bomb: neutron behavior, shockwaves. All that in a computer which was special because it was the first to be equipped with random-access memory (RAM).

The book recalls the origin of the place where von Neumann worked on his love child, the IAS (Institute for Advanced Study) in Princeton, and tells the story of the many people involved. These included Stanislaw Ulam, who developed the famous Monte Carlo method, and engineer Julian Bigelow, who could build a space rocket from two empty jerrycans and a wooden plank.
The story shows how intertwined the scientific effort was with the war effort, which regularly led to friction. It also gives a great account of the development of the IAS, its ups as well as its downs.
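To give a feel for what Ulam's Monte Carlo method actually does, here is a minimal sketch (my own illustration, not taken from the book) that estimates pi by random sampling, assuming nothing beyond the Python standard library:

```python
import random

def estimate_pi(n_samples: int = 1_000_000) -> float:
    """Estimate pi by throwing random darts at the unit square
    and counting how many land inside the quarter circle."""
    inside = 0
    for _ in range(n_samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # The quarter circle has area pi/4, so the hit ratio approximates pi/4.
    return 4.0 * inside / n_samples

if __name__ == "__main__":
    # More samples give a better estimate; the error shrinks roughly
    # as 1/sqrt(n), which is the method's characteristic trade-off.
    print(estimate_pi())
```

The same trick, replacing a hard calculation with many repeated random trials, is what made the method so useful for the neutron calculations mentioned above.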

The book is meticulously documented; no detail is left out, and it can overwhelm you. Added to that, George Dyson jumps forwards and backwards through time; as soon as a new character is introduced he starts all over again: 'X was born in Y. His father was Z, a simple farmer from ...' It makes the account even harder to follow.
At the end of the book I had the impression that I knew a lot about everyone and everything, but not about the exact order in which it all happened. My guess is that George Dyson, son of Freeman Dyson, is too much of an insider: he knew every one of these people and can place them exactly in their respective contexts. Not so the average reader. In that respect it's not so much a book for specialists (everything is well explained) as a book for people who are willing to take the time to learn the cast of characters, even if that means stopping to revisit the whole list. Some organizational charts of the IAS through the years would have been a good addition.

Overall the subject is interesting enough and the prose captivating enough to make it a 'four out of five stars' book, even if George Dyson's story requires a lot of human brain RAM to process effectively.
  jeroenvandorp | Jan 7, 2018 |
I might have easily given this book four stars if Dyson could have stuck to history instead of indulging in inane speculation and commentary that is sadly meant to sound profound. The connections he draws between completely unrelated aspects of technology and biology are so strained that whenever I read a particularly grievous one, I'm forced to put the book down and walk around the room until the waves of stupidity subside a bit. For example, at one point Dyson asks us to consider whether digital computers might be "optimizing our genetic code ... so we can better assist them." At another he explains that the reason we can't predict the evolution of the digital universe is that algorithms that predicted airplane movements in WW2 had to be normalized to the reference frame of the target... or something? Throughout the entire book there's a complete disconnect between the technical nature of the things he describes and the vague abstractions that he twists into obscenely trite metaphors.
Dyson seems to live in some sort of science-fiction wonderland where every computer program is a kind of non-organic organism. He calls code "symbiotic associations of self-reproducing numbers" that "evolved into collector societies, bringing memory allocations and other resources back to the collective nest." They are active, autonomous entities which "learned how to divide into packets, traverse the network, correct any errors suffered along the way, and reassemble themselves at the other end." By the end of the book I'm not even sure if Dyson means this as a metaphor - he appears to genuinely believe that it's merely a matter of perspective.
The truth is, if every human died tomorrow and the internet was left to run from now to infinity, not a single advance would be made in the state of computing. The viruses would quickly burn themselves away, the servers would grind monotonously at their maintenance routines, and the Google webcrawlers would stoically trudge through every porn site on Earth, an infinite number of times.
Dyson might respond that programs integrate humans as a symbiotic part of their evolution, but in that case you could say the same thing about clothing, music, or furniture. In this light the IKEA franchise must be viewed as a great self-replicating organism, conscripting humans in the propagation of its global hegemony of coffee tables.
2 votes | the_lemur | Nov 9, 2017 |
Fell flat for me. Not what I wished the book to be.
1 vote sirk.bronstad | Feb 16, 2017 |
This book is an illuminating history of how and why the first computers were built and the people who built them. It is also a survey of the problems those computers were destined to solve. And while these short descriptions are accurate, they do little to convey how powerful this book is. It set off a chain reaction in my intellectual life.

Why? It conveys how special computer technology is, how many people struggled to make it work, and how cheaply and easily we, today, can gain access to this medium of computation and thought. It also shares a vision for computers that does not come across in the mainstream -- a vision that mathematicians dreamed not so long ago. It intimately ties computers to our own birth/death cycles via the exploration of DNA and the development of the atom bomb (computers were funded to perfect destruction, not just power desktop or mobile apps). This and many other ideas and events are woven together into an awe-inspiring and personal story of the author’s childhood.

I enjoy many other things about this book. I like that you won’t find any mention of Microsoft, Apple, or other current players in computing (nor will you learn much about the title character, Turing, unfortunately). The book rightly focuses on the past, Princeton, the EDVAC, and the people who transported these ideas across the physical barrier (sometimes with the use of bicycle wheels and wire). I also like the liberties Dyson takes to create metaphors for technology; they make the subject a lot less dry and add a sense of wonder.

Dyson admirably brings in many viewpoints from women and people who are not normally talked about in association with technology. He writes about the Native Americans who inhabited Princeton and even about what happened to the natives of Enewetak before it was annihilated by the first hydrogen bomb.

So I’ve convinced you that I’m a fan, but this book does deserve some criticism. The audience of this book is not clear. Dyson alludes to complex ideas in passing, and I imagine some readers would be caught off guard and discouraged from reading further. The chapter on Gödel was especially esoteric (though motivating). I also wish there had been more references offered for self-study of computer engineering (this is a good one: https://www.librarything.com/work/7767819).

People have pointed to other faults of the book, and I disagree with some of them. First, this book is not misleading. Some poor reviews complain that details of the architecture and planning phases of EDVAC aren't included; there is no need for this, since the author is hitting on the main conceptual developments. Some also say that mysticism pervades this book. I would argue that is an exaggeration, and any hint of that tone actually helps us remember how special computers are. I certainly feel some "magic" and emotion in the fact that numbers and transistors have been able to conquer so much of our world. Others say the writing is vague and the story has too many loose ends. I think Dyson leaves loose ends to pique your curiosity so you'll find more sources; this is not an encyclopedic reference but a personal perspective. Finally, some accuse this book of being scattered and random. Know that it is organized thematically, not chronologically. This does cause the story to jump in time, but the overall organization is great if you're more interested in making connections.

Give this book a chance. You may not like it at first, but chapters can often be read independently and the historical accounts are eye-opening. They might lead you down a path like they did for me.
2 votes | danrk | Nov 16, 2016 |
Tough going if you don't know much about computing. Most science frightens the hell out of me these days. This book has some interesting speculation about the march toward mechanical intelligence. Completing it left me with a pervading depression.
1 vote ivanfranko | Jul 12, 2016 |


Author name | Role | Type of author | Work? | Status
George Dyson | - | primary author | all editions | calculated
Mendelsund, Peter | Cover designer | secondary author | some editions | confirmed
Epigraph
It was not made for those who sell oil or sardines . . .
--G. W. Leibniz
First words
At 10:30 P.M. on March 3, 1953, in a one-story brick building at the end of Olden Lane in Princeton, New Jersey, Italian Norwegian mathematical biologist Nils Aall Barricelli inoculated a 5-kilobyte digital universe with random numbers generated by drawing playing cards from a shuffled deck.
Quotations
A fine line separates approximation from simulation, and developing a model is the better part of assuming control. So as not to shoot down commercial airliners, the SAGE (Semi-Automatic Ground Environment) air defense system that developed out of MIT’s Project Whirlwind in the 1950s kept track of all passenger flights, developing a real-time model that led to the SABRE (Semi-Automatic Business Related Environment) airline reservation system that still controls much of the passenger traffic today. Google sought to gauge what people were thinking, and became what people were thinking. Facebook sought to map the social graph, and became the social graph. Algorithms developed to model fluctuations in financial markets gained control of those markets, leaving human traders behind. “Toto,” said Dorothy in The Wizard of Oz. “I’ve a feeling we’re not in Kansas anymore.”

What Americans termed “artificial intelligence” the British termed “mechanical intelligence,” a designation that Alan Turing considered more precise. We began by observing intelligent behavior (such as language, vision, goal-seeking, and pattern-recognition) in organisms, and struggled to reproduce this behavior by encoding it into logically deterministic machines. We knew from the beginning that this logical, intelligent behavior evident in organisms was the result of fundamentally statistical, probabilistic processes, but we ignored that (or left the details to the biologists), while building “models” of intelligence, with mixed success.

Through large-scale statistical, probabilistic information processing, real progress is being made on some of the hard problems, such as speech recognition, language translation, protein folding, and stock market prediction – even if only for the next millisecond, now enough time to complete a trade. How can this be intelligence, since we are just throwing statistical, probabilistic horsepower at the problem and seeing what sticks, without an underlying understanding? There’s no model. And how does a brain do it? With a model? These are not models of intelligent processes. They ARE intelligent processes.


Amazon.com Product Description (ISBN 0375422773, Hardcover)

“It is possible to invent a single machine which can be used to compute any computable sequence,” twenty-four-year-old Alan Turing announced in 1936. In Turing’s Cathedral, George Dyson focuses on a small group of men and women, led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey, who built one of the first computers to realize Alan Turing’s vision of a Universal Machine. Their work would break the distinction between numbers that mean things and numbers that do things—and our universe would never be the same.
 
Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars.
 
Dyson’s account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It’s no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time.
 
How did code take over the world? In retracing how Alan Turing’s one-dimensional model became John von Neumann’s two-dimensional implementation, Turing’s Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.

(retrieved from Amazon Thu, 12 Mar 2015 18:21:31 -0400)


"Legendary historian and philosopher of science George Dyson vividly re-creates the scenes of focused experimentation, incredible mathematical insight, and pure creative genius that gave us computers, digital television, modern genetics, models of stellar evolution--in other words, computer code. In the 1940s and '50s, a group of eccentric geniuses--led by John von Neumann--gathered at the newly created Institute for Advanced Study in Princeton, New Jersey. Their joint project was the realization of the theoretical universal machine, an idea that had been put forth by mathematician Alan Turing. This group of brilliant engineers worked in isolation, almost entirely independent from industry and the traditional academic community. But because they relied exclusively on government funding, the government wanted its share of the results: the computer that they built also led directly to the hydrogen bomb. George Dyson has uncovered a wealth of new material about this project, and in bringing the story of these men and women and their ideas to life, he shows how the crucial advancements that dominated twentieth-century technology emerged from one computer in one laboratory, where the digital universe as we know it was born"--"Legendary historian and philosopher of science George Dyson vividly re-creates the scenes of focused experimentation, incredible mathematical insight, and pure creative genius that gave us computers, digital television, modern genetics, models of stellar evolution--in other words, computer code"--… (more)



Rating

Average: 3.55
0.5 stars: 0
1 star: 5
1.5 stars: 0
2 stars: 11
2.5 stars: 0
3 stars: 26
3.5 stars: 7
4 stars: 33
4.5 stars: 4
5 stars: 18
