A column by William J. Broad in The New York Times on Tuesday makes the argument that engineering failures like the one that produced the Deepwater Horizon catastrophe inevitably lead to new insights that prevent future calamities. In other words, every cloud has a silver lining. This may be so, but the evidence is equally compelling that those responsible for Deepwater Horizon failed to learn the right lessons from past failures, and by the way it is still raining.
Deepwater Horizon is only the latest marine catastrophe resulting from our unrelenting thirst for petroleum. If engineers have their way, it seems certain that it will not be the last.
Those concerned with reliability, human error and decision-making have long studied past disasters associated with the oil and gas industry for lessons we can apply to preventing future crises. For many years, the 1988 Piper Alpha disaster in the North Sea served as a convenient metaphor for what Charles Perrow termed normal accidents (see Perrow 1999). This theory holds that accidents involving complex technologies are inevitable, largely because the complex interactions among their various elements (including the environment) cannot be fully appreciated, much less predicted, in advance.
James Reason (1990) has pointed out that disasters do not result from a single failure, but rather from a concatenation of small failures, or holes in the system, that eventually line up, allowing errors to slip through undetected until their consequences become evident. These errors often involve a combination of active and latent failures, including design defects and operational or maintenance errors, compounded by a failure to recognize that something unexpected was occurring until it was too late.
The complexity of highly technological systems is among the reasons human beings still play critical and intimate roles in their design, operation and maintenance. Humans have a far greater capacity to deal with ambiguity than machines, and under both ideal and less-than-ideal circumstances they can often make up for our inability to predict in advance what will happen when things go wrong.
But human beings have their weaknesses too. Experiments by the noted German psychologist Dietrich Dörner (1996) have demonstrated three particular shortcomings of people making decisions under conditions of complexity:
- We tend to think in terms of simple cause-effect relationships,
- We too easily underestimate the influence of exponential changes in time-space relationships, and
- We overlook network effects by either diving too deeply into the details or skipping fleetingly from one thing to another.
All three of these problems afflict public policy on off-shore drilling in the deep sea. Suggesting that we can overcome these problems with better engineering is simply hubris.
Attributing this disaster to a design flaw or mechanical failure of the blowout preventer overlooks evidence that operators had detected, but failed to respond to, operational anomalies over the course of several weeks before a methane blowout destroyed the platform. Each day that passed without incident reinforced expectations that the detected problems were not as serious or urgent as they seemed, when in reality the platform was lurching toward disaster with increasing speed. These problems, however, pale in comparison to the way many observers and commentators are treating the aftermath.
On one hand, people are focusing too narrowly on the destruction of the platform itself and the environmental catastrophe caused by the long, unimpeded flow of oil into the waters of the Gulf of Mexico. Impatience at the pace of efforts to stanch the flow of oil has now been replaced by a desire to see the well remain sealed permanently so it no longer occupies a prominent place in the public’s attention. On the other hand, those concerned with the impacts of these events seem to skip from one hardship faced by Gulf Coast residents and businesses to another without ever focusing on the root cause of their current situation.
It may be reasonable to ask what we are willing to do to restore ecosystems, protect jobs, and preserve the unique culture of Gulf Coast communities, but dealing with these problems without addressing the systemic problems arising from the dependence of our economy and way of life on petroleum will only delay the inevitable adaptations necessary to restore balance in the fragile Gulf Coast ecosystem.
Engineering as a discipline concerns itself with how we as a society apply science to solve specific problems. Unfortunately, engineers often mistake the appearance of solutions for evidence that a problem they can solve should be solved as they define it. Efforts to improve deepwater drilling technology may prove to be just such a situation.
Putting science back at the fore of policy discussions (if not the decisions themselves) does not mean giving technologists the keys to the kingdom. Likewise, worry about the inevitable effects of what we do not know or cannot predict need not be seen as a sign of weakness. Benefit-cost analysis and risk management can only help us answer questions that present us with clear choices that involve better and worse alternatives. When we only have bad options available, as we do now when it comes to meeting short-term energy needs on the way to energy independence and greater availability of clean, renewable sources of supply, we only fool ourselves to think the costs can be managed when the externalities have such devastating long-term effects.
Whether engineers can improve the safety and reliability of drilling rigs and the associated extraction operations is a separate question from whether such activities should be occurring at all. Engineers have just as much right as anyone else to participate in such important public policy decisions, but they should not expect their opinions to hold any more sway than those of any other interested party.
The bigger question is not whether engineers will learn anything from Deepwater Horizon, but whether we can trust them not to apply those lessons in ways that create even bigger problems than the ones already confronting us.
References

Dörner, D. (1996). The Logic of Failure: Recognizing and Avoiding Error in Complex Situations. New York: Metropolitan Books.

Perrow, C. (1999). Normal Accidents: Living with High-Risk Technologies. Princeton, NJ: Princeton University Press.

Reason, J. (1990). Human Error. New York: Cambridge University Press.