Homeland Security Watch

News and analysis of critical issues in homeland security

July 13, 2012

Can you envision a “successful failure”?

Filed under: Catastrophes, Risk Assessment, Strategy — by Philip J. Palin on July 13, 2012

In the movie Apollo 13 — recounting the near-fatal 1970 moon mission — the heroic NASA flight director says, “Failure is not an option.”

The real hero — Gene Kranz — never said this.   It’s a scriptwriter’s creation.   After the movie’s success, Mr. Kranz did use the phrase as the title of his memoir.

Failure is always an option.  We recently received several reminders of this reality:

The final report on Air France Flight 447 found that “the crew was in a state of near-total loss of control” after iced-over airspeed sensors began feeding the pilots inconsistent data.

A Japanese parliamentary commission found the Fukushima nuclear emergency was a “profoundly man-made disaster.” (See a good summary from the BBC.)

Last week, from Columbus, Ohio, to Charleston, West Virginia, to Washington, DC, the best-laid plans of intelligent people and competent organizations unraveled before an unexpectedly strong storm.

There was failure.   There was passivity, fear, denial, selfishness and greed.

At Fukushima and in response to the derecho there was also creativity, courage, patience, generosity, self-sacrifice and resilience. We don’t know enough about what happened over the South Atlantic to be sure, but I expect even in those horrific 3 minutes, 30 seconds the full range of humanity could be found.

Across all these situations there was uncertainty. Some uncertainty is innate to nearly every context. But we are increasingly adept at creating even more of it ourselves.

Responding to the Air France final report, William Voss, President of the Flight Safety Foundation, told The Guardian: “Pilots a generation ago would have… understood what was going on, but [the AF447 pilots] were so conditioned to rely on the automation that they were unable to do this. This is a problem not just limited to Air France or Airbus, it’s a problem we’re seeing around the world because pilots are being conditioned to treat automated processed data as truth, and not compare it with the raw information that lies underneath.”

It’s a problem well beyond commercial aviation. We organize much of our lives around the assumption that automated processes will persist and critical information will be available. We expect to be warned of a threat, to learn the location and condition of our family and friends, and to know when a crisis will be over. We expect to be able to access our credit and cash accounts. We expect to be able to travel from here to there to purchase what we need and reunite with those we love. If necessary, we expect to be able to call 911 and quickly get professional help. Over the last two or three generations everyday life has — increasingly — demonstrated these are reasonable expectations.

We are habituated to success.

But like the Air France pilots, when our information habit is not being fed, our response can be self-destructive. In the absence of information we tend to continue as usual or to focus on restoring access to information. Both behaviors can significantly increase our risk by ignoring rapidly changing conditions and/or delaying thoughtful engagement with changed conditions.

The Apollo 13 Review Board found that the accident “…resulted from an unusual combination of mistakes, coupled with a somewhat deficient and unforgiving design.”

The deficient and unforgiving design that many of us — private citizens as well as public safety agencies — have adopted is dependence on just-in-time information.

My twenty-something children seldom pre-plan in any significant way. They expect cell phones, text messaging, Facebook, and email to allow them to seize the best opportunities as they unfold. It works, and I envy them. Except when it does not work. Except when these digital networks fail.

Much of our consumer culture is built around the same approach. We have become an economy, a society optimized for just-in-time. It can be a beautiful dance of wonderful possibilities emerging in a moment and rapidly synchronized across time and space. Until the music stops.

In the three examples above (not all catastrophic) there is a shared over-confidence in the fail-safe capabilities of protective design and effective communications. In each case the design bias increased risk exposure, communications were confusing or worse, and both the design and the communications protocols complicated effective human response once the risk was experienced.

There are several contending definitions of resilience. Something all the definitions I have encountered share is an expectation of failure. Resilience is, in many cases, the learned response to failure. If it doesn’t kill you, you can learn from it. The good news — and the bad news — is that catastrophes are sufficiently rare that we don’t get many opportunities to learn about catastrophic resilience. What is a “forgiving design” for encountering catastrophe?

In April 2010 Jim Lovell, the commander of Apollo 13, called the mission a “successful failure.” Lovell explained that while Apollo 13 never reached the moon, there was “a great success in the ability of people to take an almost certain catastrophe and turn it into a successful recovery.”

Envision a complete blackout of telecommunications (voice and data) across a region extending, say, from the mouth of the Susquehanna River south to the Potomac River, and from about the Bull Run Mountains in the west to the Chesapeake Bay in the east. This area encompasses roughly 5 million residents.

Such a blackout for any sustained period is “an almost certain catastrophe.” Can we envision how to “turn it into a successful recovery”? What could be done? What should be done? What does the mental exercise (or more?) tell us about our dependencies, our operational options, mitigation opportunities, and creativity?

I know, I know… such an event is wildly unlikely… nearly unimaginable.  Just about as silly as a bad thermostat undoing a mission to the moon.

–+–

This is part of a series examining potential relationships between catastrophe, resilience, and civil liberties. We have spent the last several Fridays looking mostly at catastrophe. With this post we are pivoting toward resilience. There have been a couple of great conversations. Please contribute by selecting the comment function immediately below.


3 Comments

Comment by Bruce Martin

July 13, 2012 @ 4:38 pm

I personally favor a “hi-tech/low-tech” approach. In other words, be able to use all the tools available and work around or through the challenges confronting you. Know your options when your primary tools fail. This is as much learned and cultural behavior as a doctrine for resilience, and it could be applied societally as well as individually.

Certain entities espouse resilient behavior, beginning with the Boy Scouts (“Be Prepared”). NGOs, many long adapted to making do with much less, have been actively collaborating, with and without government support (see CARD at http://sfcard.org/wp/). I believe CARD’s purpose is to help its members be as resilient as possible and able to continue their respective missions. Rural dwellers must often, by circumstance, be more resilient than city dwellers, who (in a normal state of affairs) have quicker access to goods and services.

The Boy Scouts teach individual resilience, and encourage contributing to community resilience (at least that’s what I remember). Military, aviation (!) and first responder groups teach skills and mind-sets that build resilience as well, couched as human factors and crew resource management: Royal Marines (http://blogs.hbr.org/cs/2012/06/keeping_things_hot_in_a_risky.html?awid=7067298757216513442-3271); wildland firefighters (www.fireleadership.gov); aviation (http://en.wikipedia.org/wiki/Crew_resource_management).

Interestingly enough, on a deployment to a rural county post-Hurricane Irene last year, I observed some emergency management professionals somewhat bothered that more citizens weren’t using the shelters we had arranged. The citizens were proving resilient, taking care of themselves and each other, and elements of the response/recovery system didn’t know how to process that. My conclusion is that not only has our society become vulnerable in the way you describe, but the expectations of those who respond and aid recovery do not factor in much resilience either.

I have always been interested by Geert Hofstede’s assessment that US society displays “low uncertainty avoidance,” that is, we take things as they come rather than building in certainty via preparedness. Hofstede’s work has been challenged, but my professional observations are that for whatever reason, preparedness and resilience are not prioritized in many Americans’ lives.

I attribute that lack of prioritization to many citizens’ incomplete knowledge of how things work, and to the habits you describe. We trust that the critical infrastructure and key resources (CIKR) system will work. We are surprised when it doesn’t. The system’s success reinforces its weakness, in that we are not motivated to figure out “Plan B” while Plan A just keeps working. If you’ve ever lived in a rural area with not-infrequent power outages and you have a water well, you may understand. In that situation, a back-up or alternative to grid power is desirable. If water always comes out of the tap, a back-up does not seem as urgently needed.

Getting our citizens such knowledge isn’t often high on government’s priority list, either. I believe it should be. As has been mentioned in other threads, informed and educated citizens are better able to contribute and are likely to be more resilient.

Comment by Philip J. Palin

July 14, 2012 @ 5:07 am

Bruce, Your allusions to a rural-urban divide may point to a whole range of other issues. Even today, rural residents do — and often must — give more attention to personal preparedness. They (we, actually) can call 911, but the response times are not reassuring. My wife and I have extensive redundancies for water, heat, shelter, food, medical care, etc. We depend on these back-ups often enough that the investment is regularly rewarded.

In an urban-suburban context the pay-off comes less often and the investment costs are higher. So fewer choose to make the investment. But in the case of a potentially catastrophic event, this preparedness is, I suggest, at least as essential as in a rural context.

In regard to uncertainty avoidance, while this cultural trait causes us to be less explicitly prepared, does it also facilitate more creativity (resilience?) in response to the unfolding of uncertainty?

Comment by A Riley Eller

August 15, 2012 @ 2:26 pm

The 9/11 Commission Report says all of these things did happen and that lives were lost because of it. Every natural event of sufficient magnitude causes this scenario to unfold. Because of these experiences, there are literally continuous conferences, field trials, practice events, software development activities, and books published on the topic. Even before this century there were active naturalist, survivalist, and scavenging populations practicing the skills that are needed when infrastructure fails.

If you would like to be brought up to speed on solving infrastructure dependence, I would be happy to spend some time introducing you to the luminaries of the field. I will be in Washington, D.C., September 13-14 at the Woodrow Wilson International Center for Scholars discussing the issues of trust in crowd-sourced data. Many of those people will be in attendance.
