Fukushima: a "normal accident".

By Jane Anne Morris

In retrospect (that is, after an accident has occurred), it is often easy to describe a situation as an accident waiting to happen. Too often we just leave it at that, perhaps hoping that someone else will figure out how we could have foreseen it.

Somebody has. Charles Perrow has reviewed a range of technologies--including petrochemical and nuclear facilities, dams, mines, weapons and space research, aircraft and airways, DNA research, and shipping--and written an insightful and persuasive analysis of the kinds of systems in which accidents are inevitable, or "normal" in his terminology. [1] Nuclear power plants are excellent examples of such systems.

According to Perrow, normal accidents occur in systems that share a few key characteristics. First, the system has many components (parts, procedures, and operators) arranged in a complex way. (That is, it is not one long linear process in which A simply leads to B leads to C, and which can be stopped easily at any point.)

In such a system, with many components in a complex arrangement, many small failures--faulty switches, burned-out light bulbs, minor operator errors--are bound to occur. Such failures are not expected to be catastrophic, because numerous back-up and emergency-response systems--themselves complex--are in place.

The second characteristic of a system that will, in Perrow's analysis, experience normal accidents is that two or more failures (of parts, procedures, or operator judgment)--failures that may be trivial in themselves--can interact in unexpected ways. For example, part P (a light bulb on a gauge) might fail at the same time that part Q (part of a back-up system) is off-line for maintenance. The failure of part P might leave operators unaware that a problem is developing; the inactive status of part Q might deactivate the emergency system that would probably have either alerted operators to the problem or shut down the now-dangerous components.

But the problem is just beginning. For one thing, the operators may not know that anything unusual is happening. In a system with literally billions of components, there is so much going on that they may not realize that part Q is off-line. They have no way of knowing that the light bulb in a particular gauge should be blinking "danger." The complex system, with all of its gauges, back-up systems, and interdependent processes (for example, certain pumps automatically go...
