Members: 393 · Reviews: 12 · Popularity: 27,240 · Average rating: 3.58 · Mentions: 11
Title: Turing's Cathedral: The Origins of the Digital Universe
Author: George Dyson
Info: Pantheon Books (2012), Hardcover, 432 pages

Work details

Turing's Cathedral: The Origins of the Digital Universe by George Dyson (2012)


Member reviews (showing 5 of 12)
A fascinating and illuminating book, but also a frustrating one because it should have been a lot better than it is.

The heart of the story is more or less on target – a collection of very interesting anecdotes and narratives about the personalities involved in building one of America's first computers, at the Institute for Advanced Study in Princeton after the Second World War. Leading the team was the quite extraordinary figure of John von Neumann, about whom I knew rather little before reading this. He comes across as by far the most brilliant mind in these pages (not excluding the presence of one A. Einstein), with a near-eidetic memory and an ability to grasp new concepts instantly and then make staggering leaps of reasoning to advance them further. Not a very endearing character, though – a refugee from 1930s Europe, he pushed the nuclear programme hard and argued to the end of his life that the best way to create world peace was to launch a full ‘preventive’ hydrogen bomb strike against the USSR, mass civilian deaths and all.

The nuclear project was central to the invention of the computer. An earlier machine, ENIAC, had been pressed into modelling fission reactions – which involve some rather tricky maths – and the IAS machine (whose Los Alamos copy was nicknamed MANIAC) was developed with those calculations squarely in mind. But von Neumann and other thinkers realised early on that a machine capable of doing that would also be able to fulfil Alan Turing's description of a ‘universal computer’: if it could do the arithmetic, it turned out, it could do practically anything else too, provided there was a way of feeding it instructions.
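That universality is easy to see in miniature. Below is a hedged sketch of my own (not the book's – the opcode numbering and memory layout are invented for illustration) of the stored-program idea in Python: instructions and data are the same plain numbers in the same memory, and arithmetic plus a conditional jump is enough to build loops, multiplication, and everything beyond.

def run(mem):
    """Interpret memory as (opcode, a, b, c) quadruples.
    0: halt; 1: mem[c] = mem[a] + mem[b]; 2: mem[c] = mem[a] - mem[b];
    3: jump to address c if mem[a] is non-zero."""
    pc = 0
    while True:
        op, a, b, c = mem[pc:pc + 4]
        if op == 0:
            return mem                  # halt: hand back the whole memory
        elif op == 1:
            mem[c] = mem[a] + mem[b]    # add
        elif op == 2:
            mem[c] = mem[a] - mem[b]    # subtract
        elif op == 3 and mem[a] != 0:
            pc = c                      # conditional jump
            continue
        pc += 4

# Multiply 6 by 7 using nothing but addition, subtraction and a jump.
# Cells 0-15 hold the program; cells 20-23 hold the data.
mem = [0] * 24
mem[0:16] = [
    1, 22, 20, 22,   # acc = acc + x
    2, 21, 23, 21,   # n = n - 1
    3, 21, 0, 0,     # if n != 0, jump back to the start
    0, 0, 0, 0,      # halt
]
mem[20:24] = [6, 7, 0, 1]    # x = 6, n = 7, acc = 0, constant 1
print(run(mem)[22])          # -> 42

Feed the same machine a different block of numbers and it computes something else entirely – that is the whole trick.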

‘It is an irony of fate,’ observes Françoise Ulam, ‘that much of the hi-tech world we live in today, the conquest of space, the extraordinary advances in biology and medicine, were spurred on by one man's monomania and the need to develop electronic computers to calculate whether an H-bomb could be built or not.’

What is particularly fascinating is how these studies gradually led to a blurring of the distinction between life and technology. The development of computing coincided with the discovery of the structure of DNA, which showed that life is essentially built from a digital code (Nils Barricelli described strings of DNA as ‘molecule-shaped numbers’), and early programmers soon discovered that lines of code in replicating systems would display survival-of-the-fittest phenomena. Computing is now entering a new era with the advent of cloud computing and other systems by which it is, in effect, becoming analog and statistics-based again – search engines are a fair example.

How can this be intelligence, since we are just throwing statistical, probabilistic horsepower at the problem, and seeing what sticks, without any underlying understanding? There's no model. And how does a brain do it? With a model? These are not models of intelligent processes. They are intelligent processes.
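As for those replicating systems: the phenomenon is simple enough to recreate in a few lines. Here is a hedged toy version of my own, in the spirit of Barricelli's numerical organisms rather than his actual rules – short bit-strings copy themselves with occasional mutation, and a fixed-size memory imposes selection, so fitter strings crowd out the rest.

import random

random.seed(0)
TARGET = "0110"          # an arbitrary 'environment' the strings adapt to

def fitness(s):
    """Count positions where the string matches the environment."""
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s, rate=0.05):
    """Copy a string, flipping each symbol with a small probability."""
    return "".join(random.choice("01") if random.random() < rate else c
                   for c in s)

# A fixed-size 'memory' of twenty random four-symbol organisms.
pop = ["".join(random.choice("01") for _ in range(4)) for _ in range(20)]

for generation in range(30):
    # Replication biased by fitness; the population size stays constant,
    # so one string's reproduction means another's elimination.
    weights = [fitness(s) + 0.01 for s in pop]
    pop = [mutate(random.choices(pop, weights=weights)[0]) for _ in pop]

best = max(pop, key=fitness)
print(best, fitness(best))   # after selection, typically '0110', 4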

All of this is very bracing and very interesting. Unfortunately the book is let down by a couple of major problems. The first is a compulsion to give what the author presumably thinks is ‘historical colour’ every two minutes – so the IAS is introduced by way of an entire chapter tracing the history of local Algonquin tribes, Dutch traders, William Penn and the Quakers, the War of Independence – all of which is at best totally irrelevant and at worst a fatal distraction.

The second, even more serious failing is that the technology involved remains extremely opaque. One of the most crucial parts of this story should be following the development of cathode-ray storage, early transistors, and the first moves from machine language to codes and programs. But the explanations here are poor or non-existent. Terms like ‘shift register’, ‘stored-program computer’ and ‘pulse-frequency-coded’ are thrown around as though we should all be familiar with them.

My favourite story to do with the invention of the digital world involves Claude Shannon and his remarkable realisation – one of the most civilisation-altering ideas of our species – that circuits of electrical switches (relays in his day, transistors later) could work as logic gates. It's not even mentioned in this book. And so the crucial early building blocks of what a computer actually is – how, on a fundamental level, it really does what it does – are all missing. And that's a pretty serious omission from someone who finds it necessary to go back to the Civil War every couple of chapters.
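To make the omission concrete, here is a hedged sketch of the idea Shannon is credited with, modelled in Python rather than in relays: once you have one universal switching element (NAND, below), every other logic gate – and from there arithmetic itself – can be composed out of it.

def nand(a, b):
    """The single switching primitive: output 0 only when both inputs are 1."""
    return 1 - (a & b)

# Every other gate composed from NAND alone.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """One-bit addition built from gates: returns (sum bit, carry bit)."""
    return xor_(a, b), and_(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "=", half_adder(a, b))
# 0 + 0 = (0, 0)   0 + 1 = (1, 0)   1 + 0 = (1, 0)   1 + 1 = (0, 1)

Chain such adders together and you have register-width arithmetic; Shannon's point was that the whole of Boolean algebra maps onto networks of switches.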

A lot of reviewers here, especially the more technical experts, really hate this book, but on balance I'd still recommend it. It's a very important story about a very important period, and the later chapters especially have a lot of great material. But reading around the text will probably be necessary, whereas this book should have offered a complete package.
Widsith | Jul 7, 2014 |
In large part a nonlinear biography of John von Neumann and history of the Institute for Advanced Study in Princeton, where von Neumann in the late 1940s did his influential-ever-after "architecting" of the MANIAC computer. Woven in are discussions of many relevant topics such as Gödel/Turing metamathematics, early ways of programming, Monte Carlo approximation, the theory of self-reproducing automata, and today's accelerating trend towards a compu-singularity. (Why not mention the obliteration of privacy, Mr Dyson?) Overwhelming everything, however, is the dreary -- nay, sick and ghastly -- fact that nuclear weaponry and other military evils were the main driving force behind the building of the first electronic digital computers with Turing universality. A powerful, discerning, penetrating book.
  fpagan | Aug 29, 2013 |
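The Monte Carlo approximation mentioned in the review above is worth a concrete illustration. A minimal sketch of my own, estimating pi by random sampling (the bomb calculations applied the same idea to neutron paths, not circles):

import random

random.seed(1)
N = 1_000_000

# Throw random points at the unit square; the fraction landing inside
# the quarter-circle x^2 + y^2 <= 1 approaches pi/4.
inside = sum(random.random() ** 2 + random.random() ** 2 <= 1.0
             for _ in range(N))
print(4 * inside / N)   # ~3.14, with error shrinking like 1/sqrt(N)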
Couldn't get into it – the style of writing was very irritating. Gave up after 3 chapters.
  SChant | May 3, 2013 |
Meandering and portentous but very much worth reading…
  Katong | Apr 14, 2013 |
Turing's Cathedral: The Origins of the Digital Universe
George Dyson
April 1, 2013

George Dyson is, interestingly, the son of Freeman Dyson, who was part of the events chronicled in the book. The author describes the creation of one of the first electronic computers, the MANIAC, at the Institute for Advanced Study in Princeton, started in 1949 and completed in 1953. The impresario of the project was John von Neumann, who gathered engineers to build the computer and helped to design the first programming language. Much of the impetus for the computer was to complete calculations for the hydrogen bomb, then in development; the mathematicians and physicists involved had mostly also worked on the atomic bomb project at Los Alamos.

The computer used vacuum tubes whose filament heaters consumed several kilowatts of power, and the air conditioning needed to keep the apparatus cool drew comparable amounts, often icing over in the humidity. The memory was about 5 kilobytes, stored in Williams cathode-ray storage tubes (the persistent phosphor glow allowed bits to be retained and read out by the electron beam). The engineers were among the first to develop a command line, and read instructions into the machine with paper tape, later punch cards. The input and output followed the same patterns as earlier special-purpose machines, like those at Bletchley Park in England during WWII and ENIAC, created for calculating artillery tables.

This is a fascinating time in engineering history, but the story is very liberally padded with irrelevant information, like the history of Princeton in Indian and Colonial times. Dyson at times speculates about the "digital universe" and its relationship to human thought: "With our cooperation, self-reproducing numbers are exercising increasingly detailed and far-reaching control over the conditions in our universe that make life more comfortable in theirs." And: "The paradox of artificial intelligence is that any system simple enough to understand is not complicated enough to behave intelligently, and any system complicated enough to behave intelligently is not simple enough to understand." It is interesting that random searches on large machines may be more efficient than encoding a solution to a problem directly, and that looking through the solutions already encoded in the digital universe may be an easier way to find answers. "In 2010 you could buy a computer with over a billion transistors for the inflation adjusted cost of a transistor radio in 1956"
  neurodrew | Apr 7, 2013 |
Epigraph
It was not made for those who sell oil or sardines . . .
--G. W. Leibniz
First words
At 10:30 P.M. on March 3, 1953, in a one-story brick building at the end of Olden Lane in Princeton, New Jersey, Italian-Norwegian mathematical biologist Nils Aall Barricelli inoculated a 5-kilobyte digital universe with random numbers generated by drawing playing cards from a shuffled deck.



Amazon.com Product Description (ISBN 0375422773, Hardcover)

“It is possible to invent a single machine which can be used to compute any computable sequence,” twenty-four-year-old Alan Turing announced in 1936. In Turing’s Cathedral, George Dyson focuses on a small group of men and women, led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey, who built one of the first computers to realize Alan Turing’s vision of a Universal Machine. Their work would break the distinction between numbers that mean things and numbers that do things—and our universe would never be the same.
 
Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars.
 
Dyson’s account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It’s no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time.
 
How did code take over the world? In retracing how Alan Turing’s one-dimensional model became John von Neumann’s two-dimensional implementation, Turing’s Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.

(retrieved from Amazon Mon, 30 Sep 2013 13:56:01 -0400)


"Legendary historian and philosopher of science George Dyson vividly re-creates the scenes of focused experimentation, incredible mathematical insight, and pure creative genius that gave us computers, digital television, modern genetics, models of stellar evolution--in other words, computer code. In the 1940s and '50s, a group of eccentric geniuses--led by John von Neumann--gathered at the newly created Institute for Advanced Study in Princeton, New Jersey. Their joint project was the realization of the theoretical universal machine, an idea that had been put forth by mathematician Alan Turing. This group of brilliant engineers worked in isolation, almost entirely independent from industry and the traditional academic community. But because they relied exclusively on government funding, the government wanted its share of the results: the computer that they built also led directly to the hydrogen bomb. George Dyson has uncovered a wealth of new material about this project, and in bringing the story of these men and women and their ideas to life, he shows how the crucial advancements that dominated twentieth-century technology emerged from one computer in one laboratory, where the digital universe as we know it was born"--"Legendary historian and philosopher of science George Dyson vividly re-creates the scenes of focused experimentation, incredible mathematical insight, and pure creative genius that gave us computers, digital television, modern genetics, models of stellar evolution--in other words, computer code"--… (more)

(summary from another edition)



Rating

Average: 3.58
Distribution: 1 star: 1 · 2 stars: 7 · 3 stars: 18 · 3.5 stars: 3 · 4 stars: 17 · 4.5 stars: 3 · 5 stars: 10 (no votes at 0.5, 1.5, or 2.5)

Audible.com

An edition of this book was published by Audible.com.
