Homeland Security Watch

News and analysis of critical issues in homeland security

July 21, 2010

Learning from Failure or Failing to Learn

Filed under: Risk Assessment — by Mark Chubb on July 21, 2010

A column by William J. Broad in The New York Times on Tuesday argues that engineering failures like the one that produced the Deepwater Horizon catastrophe inevitably lead to new insights that prevent future calamities. In other words, every cloud has a silver lining. This may be so, but the evidence is equally compelling that those responsible for Deepwater Horizon failed to learn the right lessons from past failures. And, by the way, it is still raining.

Deepwater Horizon is only the latest marine catastrophe resulting from our unrelenting thirst for petroleum. If engineers have their way, it seems certain that it will not be the last.

Those concerned with reliability, human error and decision-making have long studied past disasters associated with the oil and gas industry for lessons we can apply to preventing future crises. For many years, the 1988 Piper Alpha disaster in the North Sea served as a convenient metaphor for what Charles Perrow termed normal accidents (see Perrow 1999). This theory holds that accidents involving complex technologies are inevitable largely because the complex interactions among their various elements (including the environment) cannot be fully appreciated much less predicted in advance.

James Reason (1990) has pointed out that disasters do not result from a single failure, but rather from a concatenation of small failures, or holes in the system, that eventually line up, allowing errors to slip through undetected until their consequences become evident. These errors often involve a combination of active and latent failures, including design defects and operational or maintenance errors, compounded by the failure to recognize that something unexpected was occurring before it was too late.
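Reason's image of holes lining up can be made concrete with a toy simulation. The layer counts and failure probabilities below are invented purely for illustration; the point is only that an accident requires every defensive layer to fail at once, and that a latent defect which quietly disables one layer multiplies the accident rate:

```python
import random

def accident_occurs(layer_fail_probs, rng):
    """An error propagates only if every defensive layer's 'hole'
    lines up, i.e., every layer fails simultaneously."""
    return all(rng.random() < p for p in layer_fail_probs)

def simulate(layer_fail_probs, trials=100_000, seed=42):
    """Estimate the accident rate over many independent trials."""
    rng = random.Random(seed)
    hits = sum(accident_occurs(layer_fail_probs, rng) for _ in range(trials))
    return hits / trials

# Four independent defenses, each failing 10% of the time:
# errors slip through on the order of 0.1**4 = 0.01% of trials.
print(simulate([0.1, 0.1, 0.1, 0.1]))

# A latent defect that silently disables one layer (p = 1.0)
# raises the accident rate roughly a hundredfold.
print(simulate([1.0, 0.1, 0.1, 0.1]))
```

The second run is the instructive one: nothing visible changes day to day, yet the system is dramatically less safe, which is precisely why latent failures go unrecognized until the remaining holes align.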

The complexity of highly technological systems is among the reasons human beings still play critical and intimate roles in their design, operation, and maintenance. Humans have a far greater capacity than machines to deal with ambiguity, and under both ideal and less-than-ideal circumstances they can often compensate for our inability to predict in advance what will happen when things go wrong.

But human beings have their weaknesses too. Experiments by the noted German psychologist Dietrich Dörner (1996) have demonstrated three particular shortcomings of people making decisions under conditions of complexity:

  1. We tend to think in terms of simple cause-effect relationships,
  2. We too easily underestimate the influence of exponential changes in time-space relationships, and
  3. We overlook network effects by either diving too deeply into the details or skipping fleetingly from one thing to another.
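Dörner's second point can be made concrete with a toy calculation. The numbers below are invented for illustration, not drawn from the spill; they show how far a naive straight-line extrapolation falls behind a process that actually grows by a constant ratio:

```python
def linear_forecast(first, second, step):
    """Naive straight-line extrapolation from two early observations."""
    return first + (second - first) * step

def exponential_actual(first, ratio, step):
    """What a process with a constant per-step growth ratio actually does."""
    return first * ratio ** step

day0, day1 = 1.0, 2.0   # two early observations (illustrative units)
ratio = day1 / day0     # the true growth factor per day

for day in (5, 10, 20):
    guess = linear_forecast(day0, day1, day)
    actual = exponential_actual(day0, ratio, day)
    print(f"day {day:2d}: linear guess {guess:6.0f}  actual {actual:10.0f}")
```

By day 20 the linear guess is 21 while the doubling process has reached over a million, a five-order-of-magnitude underestimate from a forecast that looked perfectly sensible on day 1.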

All three of these problems afflict public policy on off-shore drilling in the deep sea. Suggesting that we can overcome these problems with better engineering is simply hubris.

Assuming this disaster occurred due to a design flaw or mechanical failure of the blowout preventer overlooks evidence that operators had detected but failed to respond to operational anomalies over the course of several weeks before a methane blowout destroyed the platform. Each day that passed without incident reinforced expectations that the detected problems were not as serious or urgent as they seemed, when in reality the platform was lurching toward disaster with increasing speed. These problems, however, pale in comparison to the way many observers and commentators are treating the aftermath.

On one hand, people are focusing too narrowly on the destruction of the platform itself and the environmental catastrophe caused by the long-unimpeded flow of oil into the waters of the Gulf of Mexico. Impatience at the pace of efforts to staunch the flow of oil has now given way to a desire to see the well remain sealed permanently so it no longer occupies a prominent place in the public’s attention. On the other hand, those concerned with the impacts of these events seem to skip from one hardship faced by Gulf Coast residents and businesses to another without ever focusing on the root cause of their current situation.

It may be reasonable to ask what we are willing to do to restore ecosystems, protect jobs, and preserve the unique culture of Gulf Coast communities, but dealing with these problems without addressing the systemic problems arising from the dependence of our economy and way of life on petroleum will only delay the inevitable adaptations necessary to restore balance in the fragile Gulf Coast ecosystem.

Engineering as a discipline concerns itself with how we as a society apply science to solve specific problems. Unfortunately, engineers often mistake the appearance of solutions for evidence that a problem they can solve should be solved as they define it. Efforts to improve deepwater drilling technology may prove to be just such a situation.

Putting science back at the fore of policy discussions (if not the decisions themselves) does not mean giving technologists the keys to the kingdom. Likewise, worry about the inevitable effects of what we do not know or cannot predict need not be seen as a sign of weakness. Benefit-cost analysis and risk management can only help us answer questions that present clear choices between better and worse alternatives. When we have only bad options available, as we do now when it comes to meeting short-term energy needs on the way to energy independence and greater availability of clean, renewable sources of supply, we only fool ourselves to think the costs can be managed when the externalities have such devastating long-term effects.

Whether engineers can improve the safety and reliability of drilling rigs and the associated extraction operations is a completely separate and distinct question from whether or not such activities should be occurring at all. Engineers have just as much right as anyone else to participate in such important public policy decisions, but they should not expect their opinions to hold any more sway than those of any other interested party.

The bigger question is not whether engineers will learn anything from Deepwater Horizon, but whether we can trust them to apply those lessons without creating even bigger problems than the ones already confronting us.

Further reading:

Schrage, M. (2010). The Failure of Failure, Harvard Business Review Blogs, posted March 3, 2010.

Senge, P., et al. (2008). The Necessary Revolution. New York: Doubleday.


3 Comments »

Comment by William R. Cumming

July 21, 2010 @ 5:57 am

Great post. I was once treated to expert testimony on Charles Perrow’s theories for several days at the Shoreham Station Nuclear Plant licensing hearing before the NRC Atomic Safety and Licensing Board. It has always fascinated me that humans overriding safety systems, because of their disbelief in the accuracy of instruments, has been documented many times in aircraft accidents. John Kennedy, Jr. may be a leading example. Also, Perrow’s discourse on human intervention and the belief that humans know better than machines is clearly a WESTERN concept. When he deals with civil defense and its history in his books, however, he is way off base.

That said, it now appears that METHANE, not oil, may be the big issue coming out of the BP catastrophe in the GOM, and it may have been responsible for the initial blowup of the rig. I suggest we all follow this issue closely. Also, Congress is about to launch hearings on the NCP and its effectiveness. That could be of interest depending on the witnesses. My bottom line [hate that term] is: can we save the 5 Gulf States impacted? Time will tell. And by the way, unless you can show tax returns with income, it is NO SHOW NO PAY by BP and Feinberg. Hey, this may be the largest tax enforcement effort ever in the US, even as tax enforcement is now pinpointed as the major failure in the Euro zone and much of the world.

Comment by Mark Chubb

July 21, 2010 @ 9:46 am

Despite some minor disagreements among the principal theorists on human-machine interactions and human error, the one thing I think they all agree on is that any complex system involves intimate interactions between the two elements that we simply cannot afford to overlook. Engineering humans out or relying too heavily on them, either one, can produce disastrous results. Even fully automated systems still retain the flaws and shortcomings of their human designers and those who monitor or maintain them.

The methane issue has received some interesting treatment outside the MSM in recent days. I am not too sure what to make of these discussions beyond what I already surmised: We know a lot less than we need to know and only a fraction of what we think we know about the risks and consequences of this enterprise.

All science requires us to accept the falsifiability of our hypotheses and theories. We would do well to remember that most of the theories we have about the most complex systems are little more than untested hypotheses that have yet to reveal themselves (or our understandings of them) as flawed.

Comment by 66

July 22, 2010 @ 11:02 am

C.F. Kettering summarized in Catching Up With Nature, “When a scientist conquers something, he abides by the fundamental laws and does so with Nature’s permission. He has learned that conquering is submission.” Submissions can be long to me. Succession planning is important when cleaning up and in business you want to clean up. At the rate we are going, the next generation is going to be cleaning up. We should support them in that effort.
