Homeland Security Watch

News and analysis of critical issues in homeland security

December 30, 2009

“The operation was a failure, but the patient lived.”

Filed under: Aviation Security, Events, General Homeland Security, Intelligence and Info-Sharing, Strategy — by Christopher Bellavita on December 30, 2009

“Politically correct” means constraining the way one behaves or uses language because one is afraid to violate powerful orthodoxies.

President Obama has officially declared that “a systemic failure has occurred,” and he considers it to be “…totally unacceptable.”

Obviously, when a system fails in a technologically advanced society, the only politically correct thing to do is fix it.

One fixes system failures by identifying the offending elements and replacing them with elements that are not going to fail.

It is irrational to do anything other than that.

But what if this was not (except with hindsight) a preventable systemic failure?  What if it is in the nature of complex systems to “self organize” and every now and then  just fail?

On this point, see “Complexity, contingency, and criticality,” by Bak and Paczuski (originators of the sandpile avalanche metaphor — i.e., “for a wide variety of phenomena, there are no deep underlying causes, just an accumulation of tiny accidents.”).

Less technical treatments of the idea can be found in Charles Perrow’s Normal Accidents: Living with High-Risk Technologies; Mark Buchanan’s Ubiquity: Why Catastrophes Happen; or Joshua Cooper Ramo’s The Age of the Unthinkable: Why the New World Disorder Constantly Surprises Us and What We Can Do About It.

If you have time for only one of these, I’d recommend Buchanan’s book, Ubiquity.  Here is an excerpt from Edward Skidelsky’s review of Ubiquity:

Applied to history, this theory suggests that … [significant events] demand no explanation beyond a narration of the precise chain of events that compose them. In the sand pile, it is impossible to specify the cause of a huge avalanche other than by tracing its exact progress right back to the original grain that triggered it all off. There are no “laws of avalanches” distinct from the laws governing the movement of the individual grains. And any grain … can, if it falls at the right time and place, start an avalanche. The only way to understand the history of the sand pile is to recount it; old-fashioned narrative history turns out to be the most scientific of all.

The vision of history that emerges from Ubiquity is tragic. It is the vision of the Iliad. History stands permanently poised on the brink of catastrophe; the abduction of one woman can lead to the destruction of cities. Instability is an inalienable feature of human life. We flatter ourselves that we have overcome it through the development of rules and institutions, not realising that those very rules and institutions are equally subject to its depredations…. [my emphasis]
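The sandpile dynamic Skidelsky describes is easy to make concrete. Below is a minimal illustrative sketch, in Python, of the Bak–Tang–Wiesenfeld sandpile model that the metaphor comes from (the grid size, drop count, and random seed are arbitrary choices for illustration, not parameters from the original papers): grains are dropped one at a time, any cell holding four or more grains topples one grain onto each neighbor, and a single grain can set off a cascade of any size.

```python
import random

def simulate(size=20, drops=5000, seed=1):
    """Drop grains one at a time onto a size x size grid. A cell with
    4+ grains topples, giving one grain to each of its four neighbors;
    grains toppled off the edge are lost. Returns the avalanche size
    (number of topplings) triggered by each individual drop."""
    random.seed(seed)
    grid = [[0] * size for _ in range(size)]
    avalanche_sizes = []
    for _ in range(drops):
        r, c = random.randrange(size), random.randrange(size)
        grid[r][c] += 1
        topplings = 0
        unstable = [(r, c)] if grid[r][c] >= 4 else []
        while unstable:
            i, j = unstable.pop()
            if grid[i][j] < 4:  # may have been relaxed already
                continue
            grid[i][j] -= 4
            topplings += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < size and 0 <= nj < size:
                    grid[ni][nj] += 1
                    if grid[ni][nj] >= 4:
                        unstable.append((ni, nj))
        avalanche_sizes.append(topplings)
    return avalanche_sizes

sizes = simulate()
# Many drops cause no avalanche at all; a few set off very large cascades.
print("max avalanche:", max(sizes))
print("drops causing no avalanche:", sizes.count(0))
```

The point of the exercise is that every grain follows the same trivial rule, yet nothing about an individual grain predicts whether it will do nothing or trigger a system-wide avalanche — the only “explanation” of a big one is the narrative of which cells toppled in what order.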

———————–

From the perspective of “self-organized criticality,” what has been termed “system failure” is not always a problem that can be fixed.  Sometimes it’s a terrain feature one has to adapt to.

It may be politically correct to use the “fix it and move on” language.  But defaulting to such correctness may constrain useful thinking about alternatives.

[Mark Chubb's very thoughtful piece earlier today illustrates such alternative thinking.]

Resilience is premised on the idea that sometimes bad stuff happens.  And when it does, you get back up.

One does not encourage resilience by placing blind faith in the perfectibility of complex systems — particularly systems whose complexity is generated by people and technology.  Faith is better placed in the knowledge that complex systems will fail, and in asking what happens when they do.

Questions like that outline a path toward resilience.

———————–

Here’s an image of the TSA system emerging orthodoxy says “failed”:

[Image: TSA’s layers of aviation security]

Maybe political correctness demands there should be more or better pieces, or sub-pieces, or links, or procedures added to the complexity of the 20 layers and the unfathomable environment that surrounds those layers.

But you will note that “Passengers” are part of the current system.

As Mark notes, Flight 253 did land safely. Abdulmutallab failed.

Some element in the homeland security enterprise ought to get credit for the success.  The passengers did not sit quietly and wait for the bomber to try again amidst the smoke and smell.  They acted.

It is trite to say, but homeland security, including aviation security, is not simply the government’s job.  It is everyone’s responsibility — not in theory, but in fact.

It is politically incorrect to think otherwise.





8 Comments »

Comment by Gen Gadsden

December 30, 2009 @ 7:09 pm

The reason this is all moot is because deep inside we all realize this is a False Flag event handled by some governmental secret service. You need to look at the history of False Flag events and their benefits for the governments that use them. Just google “False Flag” to get an overview.

Undoubtedly you heard about the sharp-dressed man who got “the terrorist” through the gate. Surely you also heard about the suit who calmly filmed the terrorist for the entire flight. Lastly, you may not have heard about the dog that sensed explosives in a carry-on after the flight and the man who was handcuffed and led away while the FBI said there were no other detentions.

Your gut tells you something.. believe it. If you need a little more help, look here:
http://www.infowars.com/bombshell-eyewitness-revelations-confirmed-fbi-cover-up-of-flight-253-attack/

Comment by William R. Cumming

December 30, 2009 @ 7:14 pm

Whatever the risk assessment of various terrorist threats, the risk to survival of the herd does not exist for loss of a single aircraft no matter what that loss encompasses.
What is clear is that low probability high consequence events still face critical failures in preparedness, planning, and prevention.

Note that none of the 15 scenarios used as the planning basis under HSPD-8 [now being rewritten, if the info I have is correct] involve regulated industries, which IMO are the best targets, and this includes airlines. A revised target capabilities list was issued in September 2007, and it is interesting that this single event has drawn more attention from the Obama Administration than many more significant risks, again IMO. So thanks Chris for adding some perspective on this event!

Comment by Mark Chubb

December 30, 2009 @ 9:58 pm

I’d like to add James Reason’s classic text, Human Error, to the list of excellent books on the subject of complex accidents.

Like most events of this sort, neither the term “accident” nor “human error” seems to apply in any strict sense. I accept that the “failures” that allowed Umar Farouk Abdulmutallab to board this aircraft were predictable and, therefore, preventable, but we must ask ourselves, “At what cost?”

Obviously, I agree that the outcome of this event suggests something far short of failure in fact occurred. I also agree that we can and should do more, but I would prefer to see effort invested in something we might call open source intelligence, which could rightly be considered an extension of what we used to call common sense.

Had we viewed this situation in the context suggested by Chris, a young man with no baggage and a questionable travel history who paid cash at the ticket counter for a Transatlantic flight probably would have been denied boarding. The best available technology for detecting this suspicious pattern is the human being processing the transaction, not a full-body scanner.

Comment by Clinton J. Andersen

December 31, 2009 @ 1:40 am

That may be true, Mark: “The best available technology for detecting this suspicious pattern is the human being processing the transaction, not a full-body scanner.” The problem we are faced with now, however, is the fear of being called out for profiling. I have no problem with pointing out patterns. But for some, patterns are profiling. How do we get over that hump besides waiting for a person in the right position to say, “Deal with it”?

Comment by christopher tingus

December 31, 2009 @ 7:07 am

Happy New Year 2010 to all!

From this, we shall again learn and the system will be again improved….all are doing a good job and again Kudos to the bravery of passengers! As far as profiling…contrary to most these days, I do believe a go ahead though with professionalism….we are at war which should also mean we use every ounce of our power to make sure all not underestimate our commitment to Republic!

I also believe strategy dictates that that every action should receive retaliation and apparently at least in Yemen, a message of clarity has been appropriately forwarded as news reports talk of strikes in western Yemen to let the fellas know we have no intention to let this one go unnoticed….or for that matter, any such dastardly deeds!

Christopher Tingus
Main Street USA
chris.tingus@gmail.com

Comment by Philip J. Palin

December 31, 2009 @ 7:27 am

A front page story in the December 31 Washington Post is sub-titled “Viewed in retrospect, Nidal Hasan’s life becomes a trail of evidence leading to an inevitable end.” We will soon be able to say the same about Umar Farouk Abdulmutallab. But what is perfectly clear in retrospect is very seldom as clear in advance. Certainly we should learn what we can from this event. It would also be profoundly helpful — politically and practically — to find the humility or common sense to accept we will not predict or prevent such events… even with the best trail of retrospective evidence. Thank you, Chris, for explaining why this is the case.

Comment by Nick Catrantzos

January 2, 2010 @ 2:49 pm

Chris,

I find it fascinating to see that passengers are at the bottom of the diagram you offered on security layers yet they are the final and, arguably, best defense. One of Peter Drucker’s oft repeated mantras was that it is the executive’s responsibility to play to people’s strengths and make their weaknesses irrelevant. Does it not seem that all the talk of systemic failures is doing quite the opposite? Meanwhile, we are missing the opportunity to take advantage of and foster passenger responders.

More on this at my January 2, 2010 post on http://all-secure.blogspot.com/

Nick

Comment by Christopher Bellavita

January 5, 2010 @ 7:51 pm

Worth reading, Nick. Great image: The Passenger Responder – Fly together or die together.
