The Innovators: How a Group of Inventors, Hackers, Geniuses and Geeks… (2014)
by Walter Isaacson
Excellent history and biography of many key players in the evolution of computer technology. It's really interesting to think about the continuing evolution, especially of technologies like ChatGPT that have exploded onto the scene in the last couple of years. It would be nice for Isaacson to write an updated edition.
I would not have guessed when this book was selected by our three-person, in person, buddy read group that it would be such a fun read. Isaacson, as always, does his homework and writes clearly. He highlights some consistent themes about the value of diversity (in terms of skills and general outlook), persistence in the face of apparently insurmountable obstacles, and vision, out of the profusion of people and ideas that formed the background leading to our current digital age.
Lots of people. Lots and lots of names, some familiar, many not. For me the two who stood out most were the visionaries, Ada Lovelace and Vannevar Bush. Although I was familiar with Lovelace's reputation as the first person to expand on the potential contained within the idea of a computing device, in about 1845, until this book I was unaware of the extent to which she predicted certain key elements of today's software, such as subroutines. And in response to her contention that computers could never go beyond what they are programmed to do, Alan Turing devised his "Imitation Game", designed to test the difference between human and computer statements.
Vannevar Bush, an engineer and administrator of many projects that were involved one way or another with the development of modern computer technology, predicted in a 1945 article the "Memex", a concept that has taken life in today's personal computers and cell phones: "A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility". He even predicted hypertext!
One last individual who deserves a call out here is poor Al Gore, so often pilloried for taking what is commonly assumed to be too much credit for the development of the internet. Turns out that as a Senator he was actually committed to the idea and directly responsible for the funding needed to bring it to fruition.
I'm sure that for many readers the book will recall their first forays into the world of personal computers. I found myself smiling as I remembered taking the case off my first PC to add memory chips. By contrast, remembering the earlier punch cards, the "Fatal Error Line 3" message, and the long waits to have the cards re-run generates a feeling that is definitely less warm. No wonder the PC and a canned spreadsheet program (Lotus!) were such a joy by comparison. The fact that it can all now be done on my phone still stuns me.
Thanks to our buddy group, and thanks to Isaacson, for a highly enjoyable read.
This was a decent overview of computing from Ada Lovelace to around 2006, but I expected something a lot better. There was a deeper than Wikipedia level of research into the specific people and projects (especially interesting were some of the parallel developments in the first computers and transistors), but nothing particularly insightful or driven by understanding of the technology or uses for it — it was more “these people built X and this other team also built something equivalent to X but without being recognized or funded”.
Things got dumber and more like a magazine article closer to the present day (Internet, Web). I have much more familiarity with these as both a participant at the time and from talking with/interacting with the principals mentioned, and it is never a good sign when a book is weakest in areas where you understand the most.
There are probably better overviews of the separate topics covered in the book; early days of the transistor and the IC are each worthwhile, and then probably Internet to modern day, including VC and startups, as a separate work.
An excellent review of the history of digital technology.
Almost everything we do these days has some link to the world wide web, or involves interacting with some sort of computer, but how did these things become so pervasive and essential? In this book Isaacson writes about the people who made the companies that made the products we all now use.
He starts with the earliest computer, the Analytical Engine conceived by Charles Babbage, on which he collaborated with Byron's daughter Ada Lovelace. It was a purely mechanical device, made at the very limits of engineering capability at the time. It took another century until the next computers surfaced. A man called Vannevar Bush was instrumental in developing a differential analyser for generating firing tables, followed in World War 2 by the Colossus at Bletchley, used for attacking German ciphers. These room-sized contraptions used vacuum tubes, consumed vast amounts of energy, and took large numbers of people to maintain and use.
For computers to reach the point where you could get more than one in a room, the technology would need to be miniaturised. The team in America that achieved this using the semiconducting properties of silicon would earn themselves a Nobel Prize. This moment was the point where the modern computer age started, especially when it was realised that a single piece of silicon could hold a variety of components, and therefore whole circuits. These new microchips were initially all taken by the US military for weapons, but as the price of manufacture fell, numerous commercial applications could be realised.
Some of the first products using microchips that the general public saw were calculators, but as engineers started to use their imaginations almost anything was possible. The coming years saw the development of the first video games, personal computers that you could fit on a desk, and the birth of the internet. Most of these innovations came out of one place in California that we now know as Silicon Valley. It formed a new way of working too, with unlikely collaborations, spin-offs, and the beginning of software and hardware companies that have now become household names.
It didn’t take too long for people to start wanting to hook computers together. The original ARPANET was a military network, but it soon had links to academia, and not long after that the geeks found it. It was still a niche way of communicating until Tim Berners-Lee invented the World Wide Web with hypertext linking, and the world was never the same again.
Isaacson has written a reasonable book on the history of computing and the internet, and the significant characters who discovered or made things, or who just happened to be in the right place at the right time. He covers all manner of noteworthy events right up to the present day. Mostly written from an American-centric point of view, it feels like a book celebrating America’s major achievements in computing. Whilst the Americans have had a major part to play, they have not had the stage entirely to themselves; there are brief sojourns to Finland for Linux and to CERN with Berners-Lee, but very little mention of other European contributions.
There are some flaws though. He doesn’t mention the dark net or any of the other less salubrious activities that happen online; ignoring them doesn’t make them go away. There is very little mention of mobile technology either. It was a book worth reading though, as he shows that some of the best innovations have come from unlikely collaborations, from those who don’t follow the herd, and from those whose quirky personalities and ways of seeing the world bring forth products that we never knew we needed.
... even at its most rushed, the book evinces a genuine affection for its subjects that makes it tough to resist. Isaacson confesses early on that he was once “an electronics geek who loved Heathkits and ham radios,” and that background seems to have given him keen insight into how youthful passion transforms into professional obsession. His book is thus most memorable not for its intricate accounts of astounding breakthroughs and the business dramas that followed, but rather for the quieter moments in which we realize that the most primal drive for innovators is a need to feel childlike joy.
"Following his blockbuster biography of Steve Jobs, The Innovators is Walter Isaacson's revealing story of the people who created the computer and the Internet. It is destined to be the standard history of the digital revolution and an indispensable guide to how innovation really happens. What were the talents that allowed certain inventors and entrepreneurs to turn their visionary ideas into disruptive realities? What led to their creative leaps? Why did some succeed and others fail? In his masterly saga, Isaacson begins with Ada Lovelace, Lord Byron's daughter, who pioneered computer programming in the 1840s. He explores the fascinating personalities that created our current digital revolution, such as Vannevar Bush, Alan Turing, John von Neumann, J.C.R. Licklider, Doug Engelbart, Robert Noyce, Bill Gates, Steve Wozniak, Steve Jobs, Tim Berners-Lee, and Larry Page. This is the story of how their minds worked and what made them so inventive. It's also a narrative of how their ability to collaborate and master the art of teamwork made them even more creative. For an era that seeks to foster innovation, creativity, and teamwork, The Innovators shows how they happen."
Melvil Decimal System (DDC): 004.092 — Computer science; history, biography