Turing's Cathedral: The Origins of the Digital Universe (original 2012; edition 2012)

by George Dyson

Members: 412
Reviews: 13
Popularity: 25,790
Average rating: 3.63
Mentions: 11
Member: megamorg
Title: Turing's Cathedral: The Origins of the Digital Universe
Authors: George Dyson
Info: Pantheon Books (2012), Hardcover, 432 pages
Collections: Your library
Tags: Computers, Science, Innovation, IAS

Work details

Turing's Cathedral: The Origins of the Digital Universe by George Dyson (2012)

Showing 1-5 of 13 reviews
This would have been better titled von Neumann's Kingdom, but I'm sure the question of whether or not to capitalize the V in 'von' would have driven them crazy. There's actually not all that much on Turing himself or his work. Still, it's a good book and should be of interest to those of you who enjoy learning more about the history of computing and the circumstances in which much of the work around it began. ( )
  tlockney | Sep 7, 2014 |
A fascinating and illuminating book, but also a frustrating one because it should have been a lot better than it is.

The heart of the story is more or less on target – a collection of very interesting anecdotes and narratives about the personalities involved in building America's first computer, at Princeton's Institute for Advanced Study after the Second World War. Leading the team was the quite extraordinary figure of John von Neumann, about whom I knew rather little before reading this. He comes across as by far the most brilliant mind in these pages (not excluding the presence of one A. Einstein), with a near-eidetic memory, an ability to understand new concepts instantly and make staggering leaps of reasoning to advance them further. Not a very endearing character, though – a refugee from 1930s Europe, he pushed the nuclear programme hard and argued to the end of his life that the best way to create world peace was to launch a full ‘preventive’ hydrogen bomb strike against the USSR, mass civilian deaths and all.

The nuclear project was central to the invention of the computer. The machine's first incarnations (the ENIAC, followed by the IAS machine and its Los Alamos copy nicknamed 'MANIAC') were soon put to work modelling fission and fusion reactions, which involve some rather tricky maths. But von Neumann and other thinkers realised early on that a machine capable of doing that would also be able to fulfil Alan Turing's description of a 'universal computer': if it could do the arithmetic, it turned out, it could do practically anything else too, provided there was a way of feeding it instructions.
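
That point (arithmetic plus a way of feeding the machine its instructions) can be made concrete with a minimal sketch, purely illustrative and not taken from the book: a toy stored-program evaluator in Python, in which instructions and data share the same memory, so that a handful of arithmetic operations is already enough to compute with.

```python
# A toy stored-program machine: instructions and data share one memory.
# Everything here (opcodes, memory layout) is invented for illustration.

def run(memory):
    """Execute the program stored in `memory`, starting at address 0."""
    acc = 0   # accumulator
    pc = 0    # program counter
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":       # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":      # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":    # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program: add the numbers at addresses 6 and 7, store the result at 8.
memory = [
    ("LOAD", 6), ("ADD", 7), ("STORE", 8), ("HALT", 0),
    None, None,        # unused cells
    2, 3, 0,           # data
]
print(run(memory)[8])  # -> 5
```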

‘It is an irony of fate,’ observes Françoise Ulam, ‘that much of the hi-tech world we live in today, the conquest of space, the extraordinary advances in biology and medicine, were spurred on by one man's monomania and the need to develop electronic computers to calculate whether an H-bomb could be built or not.’

What is particularly fascinating is how these studies gradually led to a blurring of the distinction between life and technology. The development of computing coincided with the discovery of DNA, which showed that life is essentially built from a digital code (Nils Barricelli described strings of DNA as ‘molecule-shaped numbers’), and early programmers soon discovered that lines of code in replicating systems would display survival-of-the-fittest type phenomena. This is entering a new era with the advent of cloud-sourcing and other systems by which computing is, in effect, becoming analog and statistics-based again – search engines are a fair example.
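
A rough sketch of the kind of experiment being described, hypothetical code rather than Barricelli's actual rules: short strings of numbers copy themselves with occasional mutation into a memory of fixed size, and the variants that score better under a simple fitness rule crowd the others out.

```python
import random

# Toy digital evolution, loosely in the spirit of Barricelli's experiments.
# The rules below are invented for illustration, not his actual ones.
# Each "organism" is a short list of integers; fitness is just their sum.

def mutate(genome, rate=0.1):
    return [g + random.choice([-1, 1]) if random.random() < rate else g
            for g in genome]

def generation(population, capacity=50):
    # Every organism leaves two mutated copies...
    offspring = [mutate(g) for g in population for _ in range(2)]
    # ...but only the fittest fit into the limited memory.
    offspring.sort(key=sum, reverse=True)
    return offspring[:capacity]

population = [[0, 0, 0, 0] for _ in range(10)]
for _ in range(30):
    population = generation(population)

print(max(sum(g) for g in population))  # drifts upward, generation by generation
```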

How can this be intelligence, since we are just throwing statistical, probabilistic horsepower at the problem, and seeing what sticks, without any underlying understanding? There's no model. And how does a brain do it? With a model? These are not models of intelligent processes. They are intelligent processes.

All of this is very bracing and very interesting. Unfortunately the book is let down by a couple of major problems. The first is a compulsion to give what the author presumably thinks is ‘historical colour’ every two minutes – so the IAS is introduced by way of an entire chapter tracing the history of local Algonquin tribes, Dutch traders, William Penn and the Quakers, the War of Independence – all of which is at best totally irrelevant and at worst a fatal distraction.

The second, even more serious failing is that the technology involved remains extremely opaque. One of the most crucial parts of this story should be following the development of cathode-ray storage, early transistors, the first moves from machine language to codes and programs. But the explanations in here are poor or non-existent. Terms like ‘shift register’, ‘stored-program computer’ and ‘pulse-frequency-coded’ are thrown around as though we should all be familiar with them.
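
To take the first of those terms as an example of what even a short explanation might have looked like: a shift register is just a row of storage cells whose contents move along one position on each clock tick, with a new bit entering at one end. A minimal sketch follows (the five-bit width and the input stream are arbitrary assumptions, not anything from the book).

```python
# A 5-bit shift register: on each clock tick every stored bit moves one
# place to the right and a new input bit enters on the left.
# Width and input stream are arbitrary; this is an illustration only.

def shift(register, bit_in):
    bit_out = register[-1]                 # bit falling off the far end
    return [bit_in] + register[:-1], bit_out

reg = [0, 0, 0, 0, 0]
for b in [1, 0, 1, 1, 0]:                  # feed a serial bit stream in
    reg, out = shift(reg, b)
    print(reg, "out:", out)
# After five ticks the whole stream is held in parallel: [0, 1, 1, 0, 1]
```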

My favourite story to do with the invention of the digital world involves Claude Shannon and his remarkable realisation – one of the most civilisation-altering ideas of our species – that circuits of electrical switches and relays could implement Boolean logic and so act as logic gates. It's not even mentioned in this book. And so the crucial early building blocks of what a computer actually is – how, on a fundamental level, it really does what it does – all of that is missing. And it's a pretty serious omission for someone who finds it necessary to go back to the Civil War every couple of chapters.
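
Shannon's observation is simple enough to sketch in a few lines: treat a switch as carrying 0 or 1, and series wiring behaves like AND, parallel wiring like OR. The code below is a hypothetical illustration of that mapping, not anything from the book.

```python
# Shannon's insight in miniature: model a switch as 0 (open) or 1 (closed).
# Switches wired in series act as AND, in parallel as OR; a normally-closed
# contact acts as NOT. That mapping is the whole illustration.

def series(a, b):    # current flows only if both switches are closed
    return a & b     # AND

def parallel(a, b):  # current flows if either switch is closed
    return a | b     # OR

def invert(a):       # normally-closed contact: opens when driven
    return 1 - a     # NOT

# Any Boolean function can be wired up from these, e.g. exclusive-or:
def xor(a, b):
    return parallel(series(a, invert(b)), series(invert(a), b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))   # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0
```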

A lot of reviews here, especially from more technical experts, really hate this book, but on balance, I'd still recommend it. It's a very important story about a very important period, and the later chapters especially have a lot of great material. But reading around the text will probably be necessary, and this book should have offered a complete package. ( )
2 vote Widsith | Jul 7, 2014 |
In large part a nonlinear biography of John von Neumann and history of the Institute for Advanced Study in Princeton, where von Neumann in the late 1940s did his influential-ever-after "architecting" of the MANIAC computer. Woven in are discussions of many relevant topics such as Gödel/Turing metamathematics, early ways of programming, Monte Carlo approximation, the theory of self-reproducing automata, and today's accelerating trend towards a compu-singularity. (Why not mention the obliteration of privacy, Mr Dyson?) Overwhelming everything, however, is the dreary -- nay, sick and ghastly -- fact that nuclear weaponry and other military evils were the main driving force behind the building of the first electronic digital computers with Turing universality. A powerful, discerning, penetrating book.
  fpagan | Aug 29, 2013 |
Couldn't get into it - the style of writing was very irritating. Gave up after 3 chapters. ( )
  SChant | May 3, 2013 |
Meandering and portentous but very much worth reading… ( )
  Katong | Apr 14, 2013 |
Epigraph
It was not made for those who sell oil or sardines . . .
--G. W. Leibniz
First words
At 10:30 P.M. on March 3, 1953, in a one-story brick building at the end of Olden Lane in Princeton, New Jersey, Italian Norwegian mathematical biologist Nils Aall Barricelli inoculated a 5-kilobyte digital universe with random numbers generated by drawing playing cards from a shuffled deck.
Quotations
A fine line separates approximation from simulation, and developing a model is the better part of assuming control. So as not to shoot down commercial airliners, the SAGE (Semi-Automatic Ground Environment) air defense system that developed out of MIT’s Project Whirlwind in the 1950s kept track of all passenger flights, developing a real-time model that led to the SABRE (Semi-Automatic Business Research Environment) airline reservation system that still controls much of the passenger traffic today. Google sought to gauge what people were thinking, and became what people were thinking. Facebook sought to map the social graph, and became the social graph. Algorithms developed to model fluctuations in financial markets gained control of those markets, leaving human traders behind. “Toto,” said Dorothy in The Wizard of Oz. “I’ve a feeling we’re not in Kansas anymore.”

What Americans termed “artificial intelligence” the British termed “mechanical intelligence,” a designation that Alan Turing considered more precise. We began by observing intelligent behavior (such as language, vision, goal-seeking, and pattern recognition) in organisms, and struggled to reproduce this behavior by encoding it into logically deterministic machines. We knew from the beginning that this logical, intelligent behavior evident in organisms was the result of fundamentally statistical, probabilistic processes, but we ignored that (or left the details to the biologists), while building “models” of intelligence – with mixed success.

Through large-scale statistical, probabilistic information processing, real progress is being made on some of the hard problems, such as speech recognition, language translation, protein folding, and stock market prediction – even if only for the next millisecond, now enough time to complete a trade. How can this be intelligence, since we are just throwing statistical, probabilistic horsepower at the problem and seeing what sticks, without an underlying understanding? There’s no model. And how does a brain do it? With a model? These are not models of intelligent processes. They ARE intelligent processes.
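
The flavour of that last paragraph can be caught in a toy: a next-word guesser with no grammar and no model of language at all, only counted frequencies. The sample text and the whole setup below are invented for illustration, not anything described in the book.

```python
from collections import Counter, defaultdict

# "No model" prediction: count which word follows which in some text,
# then guess the most frequent successor. Pure statistics, no understanding.

text = ("the machine reads the tape and the machine writes the tape "
        "and the machine halts").split()

successors = defaultdict(Counter)
for current, following in zip(text, text[1:]):
    successors[current][following] += 1

def predict(word):
    counts = successors.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict("the"))      # -> 'machine' (its most frequent successor)
print(predict("machine"))  # -> whichever successor was counted most often
```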


Amazon.com Product Description (ISBN 0375422773, Hardcover)

“It is possible to invent a single machine which can be used to compute any computable sequence,” twenty-four-year-old Alan Turing announced in 1936. In Turing’s Cathedral, George Dyson focuses on a small group of men and women, led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey, who built one of the first computers to realize Alan Turing’s vision of a Universal Machine. Their work would break the distinction between numbers that mean things and numbers that do things—and our universe would never be the same.
 
Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars.
 
Dyson’s account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It’s no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time.
 
How did code take over the world? In retracing how Alan Turing’s one-dimensional model became John von Neumann’s two-dimensional implementation, Turing’s Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.

(retrieved from Amazon Mon, 30 Sep 2013 13:56:01 -0400)


"Legendary historian and philosopher of science George Dyson vividly re-creates the scenes of focused experimentation, incredible mathematical insight, and pure creative genius that gave us computers, digital television, modern genetics, models of stellar evolution--in other words, computer code. In the 1940s and '50s, a group of eccentric geniuses--led by John von Neumann--gathered at the newly created Institute for Advanced Study in Princeton, New Jersey. Their joint project was the realization of the theoretical universal machine, an idea that had been put forth by mathematician Alan Turing. This group of brilliant engineers worked in isolation, almost entirely independent from industry and the traditional academic community. But because they relied exclusively on government funding, the government wanted its share of the results: the computer that they built also led directly to the hydrogen bomb. George Dyson has uncovered a wealth of new material about this project, and in bringing the story of these men and women and their ideas to life, he shows how the crucial advancements that dominated twentieth-century technology emerged from one computer in one laboratory, where the digital universe as we know it was born"--"Legendary historian and philosopher of science George Dyson vividly re-creates the scenes of focused experimentation, incredible mathematical insight, and pure creative genius that gave us computers, digital television, modern genetics, models of stellar evolution--in other words, computer code"--… (more)

(summary from another edition)


Rating

Average: 3.63

0.5 stars: 0
1 star:    1
1.5 stars: 0
2 stars:   7
2.5 stars: 0
3 stars:   18
3.5 stars: 5
4 stars:   18
4.5 stars: 3
5 stars:   12

Audible.com

An edition of this book was published by Audible.com.
