“Politically correct” means constraining the way one behaves or uses language because one is afraid to violate powerful orthodoxies.
President Obama has officially declared that “a systemic failure has occurred,” and he considers it to be “…totally unacceptable.”
Obviously, when a system fails in a technologically advanced society, the only politically correct thing to do is fix it.
One fixes system failures by identifying the offending elements and replacing them with elements that are not going to fail.
It is irrational to do anything other than that.
But what if this was not (except with hindsight) a preventable systemic failure? What if it is in the nature of complex systems to “self-organize” and every now and then just fail?
On this point, see “Complexity, contingency, and criticality,” by Bak and Paczuski (originators of the sandpile avalanche metaphor — i.e., “for a wide variety of phenomena, there are no deep underlying causes, just an accumulation of tiny accidents.”).
Less technical treatments of the idea can be found in Charles Perrow’s “Normal Accidents: Living with High-Risk Technologies,” Mark Buchanan’s “Ubiquity: Why Catastrophes Happen,” or Joshua Cooper Ramo’s “The Age of the Unthinkable: Why the New World Disorder Constantly Surprises Us and What We Can Do About It.”
Applied to history, this theory suggests that … [significant events] demand no explanation beyond a narration of the precise chain of events that compose them. In the sand pile, it is impossible to specify the cause of a huge avalanche other than by tracing its exact progress right back to the original grain that triggered it all off. There are no “laws of avalanches” distinct from the laws governing the movement of the individual grains. And any grain … can, if it falls at the right time and place, start an avalanche. The only way to understand the history of the sand pile is to recount it; old-fashioned narrative history turns out to be the most scientific of all.
The vision of history that emerges from Ubiquity is tragic. It is the vision of the Iliad. History stands permanently poised on the brink of catastrophe; the abduction of one woman can lead to the destruction of cities. Instability is an inalienable feature of human life. We flatter ourselves that we have overcome it through the development of rules and institutions, not realising that those very rules and institutions are equally subject to its depredations…. [my emphasis]
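To make the sandpile metaphor concrete, here is a minimal sketch, in Python, of the Bak-Tang-Wiesenfeld sandpile rule that Bak and Paczuski build on. The grid size, toppling threshold, and number of dropped grains are illustrative choices of mine, not parameters taken from any of the works cited above.

```python
# A minimal sketch of the Bak-Tang-Wiesenfeld sandpile model (the metaphor
# behind "self-organized criticality"). SIZE, THRESHOLD, and DROPS are
# illustrative assumptions, not values from the cited sources.
import random
from collections import Counter

SIZE = 50        # grid is SIZE x SIZE
THRESHOLD = 4    # a site topples when it holds 4 or more grains
DROPS = 100_000  # grains dropped one at a time

grid = [[0] * SIZE for _ in range(SIZE)]
avalanche_sizes = Counter()

def topple(x, y):
    """Relax the pile after a grain lands; return the avalanche size."""
    size = 0
    unstable = [(x, y)] if grid[x][y] >= THRESHOLD else []
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < THRESHOLD:
            continue
        grid[i][j] -= THRESHOLD
        size += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < SIZE and 0 <= nj < SIZE:  # grains at the edge fall off
                grid[ni][nj] += 1
                if grid[ni][nj] >= THRESHOLD:
                    unstable.append((ni, nj))
    return size

for _ in range(DROPS):
    x, y = random.randrange(SIZE), random.randrange(SIZE)
    grid[x][y] += 1
    s = topple(x, y)
    if s:
        avalanche_sizes[s] += 1

# Most avalanches are tiny; a few are enormous. The same local rule
# produces both outcomes, and no grain is "special."
for s in sorted(avalanche_sizes)[:5]:
    print(f"avalanches of size {s:>4}: {avalanche_sizes[s]}")
print("largest avalanche:", max(avalanche_sizes))
```

Run long enough, most drops cause no avalanche at all, a few cascade across much of the grid, and the size distribution falls off as a power law. Nothing about the triggering grain distinguishes the big events from the small ones; the “cause” of the catastrophe is simply the accumulated history of the pile.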
From the perspective of “self-organized criticality,” what has been termed “system failure” is not always a problem that can be fixed. Sometimes it’s a terrain feature one has to adapt to.
It may be politically correct to use the “fix it and move on” language. But defaulting to such correctness may constrain useful thinking about alternatives.
[Mark Chubb’s very thoughtful piece earlier today illustrates such alternative thinking.]
Resilience is premised on the idea that sometimes bad stuff happens. And when it does, you get back up.
One does not encourage resilience by placing blind faith in the perfectibility of complex systems, particularly systems whose complexity is generated by people and technology. Faith is better placed in the knowledge that complex systems will fail, and in asking what happens when they do.
Questions like that outline a path toward resilience.
Here’s an image of the TSA system that emerging orthodoxy says “failed”:
Maybe political correctness demands that more or better pieces, sub-pieces, links, or procedures be added to the complexity of the 20 layers and the unfathomable environment that surrounds them.
But you will note that “Passengers” are part of the current system.
As Mark notes, Flight 253 did land safely. Abdulmutallab failed.
Some element in the homeland security enterprise ought to get credit for the success. The passengers did not sit quietly and wait for the bomber to try again amidst the smoke and smell. They acted.
It is trite to say, but homeland security, including aviation security, is not simply the government’s job. It is everyone’s responsibility — not in theory, but in fact.
It is politically incorrect to think otherwise.