


Normal Accidents: Living with High Risk Technologies - Updated Edition (edition 2011)

by Charles Perrow (Author)

Title: Normal Accidents: Living with High Risk Technologies - Updated Edition
Authors: Charles Perrow (Author)
Info: Princeton University Press (2011), Edition: Revised ed., 456 pages

Work details

Normal Accidents: Living with High-Risk Technologies by Charles Perrow




Showing 5 of 5
Normal Accidents is a pretty fascinating look at the complex systems in our lives and how failures of these systems are inevitable given their complicated inner workings. The fact that it took me over a month to finish this book is not indicative of it being boring; on the contrary, it was a very good read.

While I was reading this book, I was easily able to come up with countless examples of system accidents that occurred since the book's manuscript was completed (1983): Chernobyl, Fukushima, Exxon Valdez, Bhopal, Boeing 737 Max-8, Challenger, Columbia, etc. The book argues that for complex systems that cannot be made more linear or less complex, and for which the possible disasters outweigh the benefits (nuclear power and nuclear weapons, for example), we should abandon the technologies. Unfortunately, this is not a stance that the global elite agrees with, and thus we are stuck with disasters such as Chernobyl and Fukushima.

The afterword, written in 1999, contains more analysis of disasters that happened in the late 20th century, and anticipates those that may be in store in the future. (Which is now our past.) I found interesting a few sentences speaking of the possibility of issues arising from the complexity of our financial system. Unfortunately, the author spent an entire extra chapter talking about Y2K, something that caused approximately zero problems, rather than the financial sector, which caused a global recession unmatched by anything since the Great Depression. (Of course, as the author mentions, hindsight is 20-20: a disaster leads us to find problems that might otherwise have remained hidden.)

I believe that even though this book is rather outdated, it is still a very useful read, and I wish that the system of corporate capitalism and short-term profits didn't reign supreme even more so today than it did in 1983. I feel even less hopeful now that anybody will take these sorts of reasoned arguments against pointlessly risky technology seriously, and I expect that we will only be exposing ourselves to more potential disasters.
  lemontwist | May 27, 2019 |
Author Charles Perrow was a Yale sociology professor when he wrote Normal Accidents; he’s now an emeritus professor at both Yale and Stanford. The book is relatively old (1984, with a 1999 postscript). My initial impression was that it was the usual diatribe against nuclear power – we can’t control it, yada, we’re all gonna die, yadayada, we’ll be radioactive for centuries, yadayadayada – written by somebody who has no idea what he’s talking about. Alas for initial impressions, Perrow really does make some insightful points which bear up well under further thought and analysis. He gives abundant evidence that he doesn’t know what he’s talking about when it comes to technology, but also that he does know what he’s talking about when it comes to organizations – and that’s what the book is about.

Perrow’s premise is that technological systems – his initial topic is the nuclear power industry, but he also covers petrochemical plants, aircraft, airways, marine accidents, dams, mines, space travel, nuclear weapons, genetic engineering, and (in the postscript) Y2K, as well as bureaucracies, although he doesn’t devote much space to them – can be divided into linear and complex, and also into loosely coupled and tightly coupled, thus making a 2x2 matrix. Things can go wrong with a component failure – basically, one thing breaks (although a human operator can be a “component”) – or a system failure. A system failure is more than just multiple simultaneous component failures; rather, an entire “system” of interacting things fails, and in the examples Perrow gives that always involves human operators.

One of my initial objections is that there is no rigor to these definitions; there’s no formula for saying how complicated something has to be to become a “complex” system or how coupled it has to be to become “tightly coupled”. However, I get his point; there is no precise definition of the start and end of the Renaissance, for example, but we can still talk about something being characteristic of Renaissance Italy and be confident we’re communicating what we mean. In addition, in the Afterword Perrow references subsequent researchers who have made contributions toward more rigorous definitions of the terms.

In Perrow’s complex systems, equipment is tightly spaced, there are many “common mode” connections, it is difficult to isolate failed components, human operators are highly specialized (which leads to limited awareness of interactions), there are limited opportunities to substitute components, and there are many control parameters. (“Common mode” connections are things like electrical power, cooling water, process water, steam, etc.) In tightly coupled systems, delays in processing are difficult or impossible, processing sequences cannot be varied, there is only one way to get to the goal, and there is little slack available for supplies or personnel. Linear systems are the opposite of complex systems, and loosely coupled systems are the opposite of tightly coupled systems.

In Perrow’s 2x2 matrix, nuclear plants are the extreme tightly coupled complex system; most manufacturing is loosely coupled and linear. R&D firms are an example of complex but loosely coupled; hydroelectric dams are tightly coupled but linear (there are a bunch of other points in the matrix; these are just examples). Perrow pays particular attention to nuclear power as an example of the bad consequences of tightly coupled complex systems, with Three Mile Island as his centerpiece (Chernobyl gets a couple of one-sentence mentions in the Afterword; Fukushima hadn’t happened yet).
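Perrow’s two axes, as the paragraph above summarizes them, amount to a small classification scheme, and his thesis is that “normal” (system) accidents are expected only where the two extremes combine. A minimal sketch of that idea – the function names and the particular system placements are my own illustration of the review’s summary, not code from the book:

```python
# Illustrative sketch of Perrow's two-axis classification.
# Axis 1: interactions are "linear" or "complex".
# Axis 2: coupling is "loose" or "tight".

def quadrant(interactions: str, coupling: str) -> str:
    """Place a system in one quadrant of the matrix."""
    assert interactions in ("linear", "complex")
    assert coupling in ("loose", "tight")
    return f"{interactions}/{coupling}"

def normal_accident_prone(interactions: str, coupling: str) -> bool:
    """Perrow's thesis, as the review states it: system accidents are
    expected ("normal") only where complexity and tight coupling combine."""
    return interactions == "complex" and coupling == "tight"

# Placements mentioned in the review:
systems = {
    "nuclear plant":      ("complex", "tight"),  # the extreme case
    "most manufacturing": ("linear",  "loose"),
    "R&D firm":           ("complex", "loose"),
    "hydroelectric dam":  ("linear",  "tight"),
}

for name, (i, c) in systems.items():
    flag = " <- normal accidents expected" if normal_accident_prone(i, c) else ""
    print(f"{name}: {quadrant(i, c)}{flag}")
```

On this sketch, only the nuclear plant lands in the complex/tight quadrant, which is exactly why Perrow makes it his centerpiece.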
His theme is that the Three Mile Island incident involved multiple components failing, the failures interacting in unexpected ways, and the operators making inappropriate responses (because the system complexity made it difficult or impossible to figure out what the appropriate response would be). This, in turn, is the kind of accident expected in a complex, tightly coupled system; and he goes on to conclude that nuclear power will always be a complex and tightly coupled system (although he gives some thought to the possibility that it may be possible to change that); that system accidents will continue to occur; that the consequences of nuclear power accidents are potentially catastrophic; and that therefore nuclear power should be eliminated.

Perrow goes on to apply these concepts to his other subjects. He’s always able to find examples where a component failure in a tightly coupled, complex system – for example, the failure of a coffee maker in an airliner – led to an accident. (Chafing instrument control wires shorted to the coffee maker; a circuit breaker failed to trip; additional wires burned and shorted; and the aircraft fuel dump, flaps, thrust reversers, and antiskid all failed. The IAI 1124 made a successful overweight night landing – successful in the sense that there were no injuries; the aircraft ran off the end of the runway after the brakes caught fire and failed.)

All of these points have been made before under more prosaic names – Murphy’s Law, the Law of Unintended Consequences. Perrow’s observations are more or less in accord with my observations of one of the “complex” systems he discusses – government bureaucracies. One of his minor points is something that I’ve observed many times in hazmat and general industry – ad hoc “safety” features, added in reaction to some incident, usually make things less safe, because they add unforeseen interactions with other components; or they make the job they are supposedly making “safer” so cumbersome and difficult that they are disconnected or removed; or both. I note with a little sardonic amusement that Perrow is just as guilty here as some of the entities he criticizes – for example, after observing that a lot of aircraft crash injuries are caused by seats detaching, he complains that all the aircraft manufacturers need to do is use stronger bolts to hold the seats down. Well, this is an example of the potential for simple engineering safety solutions to have unintended consequences – what if the stronger bolts cause other problems? Maybe they need to be a larger diameter and the larger bolt holes will weaken the cabin floor. Maybe they need to be a different material and that will cause electrolytic corrosion. You can probably think of half a dozen other possibilities.

Perrow’s arguments would be more convincing if he gave more evidence of understanding the systems he’s talking about. In his discussion of nuclear power, he uses all sorts of acronyms – LOCA, HPI, PORV, ASD – which suggests he’s trying to impress readers with his understanding of the system; he does the same thing when he talks about “petrochemical” plants, mentioning startup heaters, convection coils, reflux operation, and bypass valves. Yet the text is full of egregious little errors: Perrow thinks it would be safer for airplanes to land on icy runways if they were equipped with studded snow tires; that ammonium nitrate is highly explosive; that tankers move in “knots per hour”; and that the Bhopal chemical leak involved “methal isocyanate” (the compound was methyl isocyanate). The funniest example of this is the postscript, written in 1999 and full of doomsday predictions about Y2K – especially because Perrow mocks the predictions of optimists. Y2K will cause a global economic recession, airplanes will fall out of the sky, the electric power grid will collapse, telecommunications will fail, and, of course, unspecified but catastrophic things will happen to nuclear power plants. Perrow’s problem here seems to be that he saw Y2K as a hardware problem – he keeps talking about “embedded chips” failing, the same way he talks about valves at nuclear plants failing to close or open, as if the Y2K date rollover were going to cause some sort of physical damage to the hardware. (Perrow eventually does sort of caucus with the optimists; he says he’s going to buy an emergency generator and fuel but not stock up on groceries and ammunition.)
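The point the review is making – that Y2K was a software problem, not physical hardware damage – comes down to a date-representation bug. A minimal sketch of the classic failure mode and the standard “windowing” fix; the function names here are my own illustration, not anything from the book:

```python
# The classic Y2K bug: legacy software stored years as two digits,
# so arithmetic across the 1999 -> 2000 boundary went wrong.

def age_in_years_2digit(birth_yy: int, now_yy: int) -> int:
    """Naive age calculation on two-digit years, as much legacy
    software did. Works within a century, breaks across 1999 -> 2000."""
    return now_yy - birth_yy

print(age_in_years_2digit(60, 99))  # born "60", computed in "99": 39, fine
print(age_in_years_2digit(60, 0))   # same person in "00": -60, the failure

# The common remediation ("windowing"): pivot two-digit years into a
# plausible century before doing any arithmetic.
def expand_year(yy: int, pivot: int = 50) -> int:
    """Interpret yy >= pivot as 19yy and yy < pivot as 20yy."""
    return 1900 + yy if yy >= pivot else 2000 + yy

print(expand_year(99) - expand_year(60))  # 39, still fine
print(expand_year(0) - expand_year(60))   # 40, now correct across the rollover
```

Nothing here damages a chip; the value simply comes out wrong, which is why the fix was a software remediation effort rather than hardware replacement.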

To be fair to Perrow, although he’s probably of a more or less liberal bent – he uses the Reagan administration as a convenient whipping boy for things like air traffic control problems and marine accidents (the latter supposedly because Reagan defunded the USCG) – he usually doesn’t blame any of the problems he cites on “capitalism”; rather than citing pressure for “profits” he mentions pressure for “production” and notes that planned economies are just as likely to provide pressures for production as market ones. He does call for more regulation in some areas: marine traffic and “recombinant DNA” (“genetic engineering” hadn’t yet become the term of choice when he was writing). His arguments for increased marine traffic regulation may have some merit; his arguments about genetic engineering have the same flavor as the ones about Y2K, believing all the most pessimistic predictions and failing to understand that genetic engineering is primarily a software process rather than a hardware one.

I hope I don’t seem overly critical here; although frustrating to read sometimes, especially with the benefit of hindsight, this is a worthwhile and enlightening book. I’d like to see more mathematical rigor, but that seems to have been done by later workers. There are some charts, tables, and graphs; a simplified diagram of the Three Mile Island plant; and some plots of ship tracks to illustrate Perrow’s contention that marine collision accidents are sometimes head-slappingly inexplicable. Lots of references; the index seems sparse, and I had a hard time finding some things I wanted to look up.
  setnahkt | Dec 16, 2017 |
8/14/2011 I keep recommending this book and with the BP disaster, it continues to be very, very timely. One of the points made by Perrow is that when complex technology "meets large corporate and government hierarchies, lack of accountability will lead inexorably to destructive failures of systems that might have operated safely." (From a review of A Sea in Flames: The Deepwater Horizon Oil Blowout by Carl Safina, written by Gregg Easterbrook in the NY Times, April 23, 2011.)

Note added 3/2/09: Perrow's discussion of the problems inherent in tightly coupled systems is certainly timely given the intricacies of the recent financial disaster – certainly an example of a tightly coupled system in which the failure of a single component can cause the entire system to collapse.

This is a totally mesmerizing book. Perrow explains how human reliance on technology and over-design will inevitably lead to failure, precisely because of the safety features designed into complex systems. Good companion book for those who enjoy Henry Petroski.

Some quotes: "Above all, I will argue, sensible living with risky systems means keeping the controversies alive, listening to the public, and recognizing the essentially political nature of risk assessment. Unfortunately, the issue is not risk, but power; the power to impose risks on the many for the benefit of the few" (p. 306); and further on, "Risks from risky technologies are not borne equally by the different social classes [and, I would add, countries]; risk assessments ignore the social class distribution of risk" (p. 310); and "The risks that made our country great were not industrial risks such as unsafe coal mines or chemical pollution, but social and political risks associated with democratic institutions, decentralized political structures, religious freedom and plurality, and universal suffrage" (p. 311).

  ecw0647 | Sep 30, 2013 |
This book was recently reviewed positively in the Economist, which is usually a fairly good tip, and as I am interested in systems and complexity, I picked it up. Having now finished its 400-odd pages, I am not entirely convinced that the Economist reviewer actually ploughed through the whole book.
It starts well, with a convincing development of a theory of the inevitability of accidents in systems which are both complex and tightly-coupled. These concepts are explored in some detail, with numerous interesting case studies (of varying quality). This first section, in which the core thesis is developed, is excellent, and a useful contribution to the study of complex systems, risk, and failure.
However, Perrow then goes on to draw conclusions such as "nuclear power should be abandoned because it can never be made safe". In attempting to shore up this rather shaky position, some quite dubious analyses are put forward, including the arguments that space missions can never cause catastrophes for third parties (there is a village in China which would disagree), that aircraft carriers are "self-correcting" systems, and that chemical accidents are inherently small-scale (by contrast to the nuclear industry). The events at Bhopal would tend to destroy this argument: they are explicitly addressed in a postscript, where they are termed a "non-systemic" accident by reason of drawing the boundaries of the event in a rather arbitrary manner (a passingly acknowledged weakness of "Normal Accident Theory").
The final postscript on Y2K issues is also somewhat spurious.
Overall, worth reading – but if you feel like stopping half-way through, you're not missing much.
  gbsallery | Jan 18, 2012 |
I enjoyed this book very much. As an engineer, I found that it provided me with some insights that were new to me. I really appreciated how Perrow showed that sociology has an important and understandable role to play in safety.
  jbheffernan | Nov 12, 2007 |

Amazon.com Review (ISBN 0691004129, Paperback)

Hang a curtain too close to a fireplace and you run the risk of setting your house ablaze. Drive a car on a pitch-black night without headlights, and you dramatically increase the odds of smacking into a tree.

These are matters of common sense, applied to simple questions of cause and effect. But what happens, asks systems-behavior expert Charles Perrow, when common sense runs up against the complex systems, electrical and mechanical, with which we have surrounded ourselves? Plenty of mayhem can ensue, he replies. The Chernobyl nuclear accident, to name one recent disaster, was partially brought about by the failure of a safety system that was being brought on line, a failure that touched off an unforeseeable and irreversible chain of disruptions; the less severe but still frightening accident at Three Mile Island, similarly, came about as the result of small errors that, taken by themselves, were insignificant, but that snowballed to near-catastrophic result.

Only through such failures, Perrow suggests, can designers improve the safety of complex systems. But, he adds, those improvements may introduce new opportunities for disaster. Looking at an array of real and potential technological mishaps--including the Bhopal chemical-plant accident of 1984, the Challenger explosion of 1986, and the possible disruptions of Y2K and genetic engineering--Perrow concludes that as our technologies become more complex, the odds of tragic results increase. His treatise makes for sobering and provocative reading. --Gregory McNamee

(retrieved from Amazon Thu, 12 Mar 2015 17:59:00 -0400)

