Homeland Security Watch

News and analysis of critical issues in homeland security

July 19, 2012

Security and resilience, knowledge and uncertainty

Filed under: General Homeland Security — by Philip J. Palin on July 19, 2012

Wednesday I spent the day at a meeting focused on an aspect of critical infrastructure and key resources (CIKR).  The sixty-some participants included invited experts from the federal government, academe, and the private sector.  The private sector was mostly represented by consulting firms and associations, but a few crucial owners/operators were also in the room.

It was a room with a history.  Most of those in the room knew each other from several previous meetings.  I was a newbie.

The purpose was to respond to, inform, and influence emerging policy at both the national and international level.  I am being vague on purpose.  I don’t think the specifics are needed to explore the issue of this post and I want to keep faith with what needs to be a discreet process… especially as a newbie.

Security and resilience were explicitly on the agenda.  I was there as an expert on resilience.  (I have a strong urge to write “expert”, but that was the label applied to get me in the room.)  Most of the others in the room were security experts.  Further, most of these others demonstrated a very real quotation-mark-free expertise.

It was a bit unnerving.  There was a great deal of knowledge in the room.  There was also an extraordinary faith in the power of knowledge and the possibility of true wisdom.  Some quick, perhaps overly personalized definitions:  Data are discrete facts and observations, information is organized data, knowledge is information-in-context, wisdom is knowledge effectively applied to solve problems.

Early in the meeting one of the veterans said that in regard to our specific CIKR focus, “all the information exists.” I initially took this as a form of understated humor.  Then it became clear he was serious.  Then I realized no one was going to challenge his claim.    My own silence emerged from both real surprise and the social caution of being a newbie.

As the morning progressed it seemed that most of the veterans agreed, “all the information exists.”  Moreover, there was an often unstated, occasionally explicit assumption that all the information exists necessary for us to make well-informed, validated, expert decisions that will secure and ensure the resilience of the focus-of-our-concern.

Or per my definitions: there was broad consensus that all the data exists necessary for this purpose.  The problem-at-hand is the lack of processes and technologies to transform this data into information from which knowledge and wisdom will be derived.  So most of the discussion focused on how we ought to use cooperation or incentives or regulations or some other tool to gather up all the data and appropriately organize it.

Late in the morning I made a brief intervention.  Who knows what I really said (writing is much more trustworthy), but my intention was to argue:

1) We have seemingly agreed our focus-of-concern is complex.

2) If you mean complex as I mean complex, then there are by definition many aspects of our focus-of-concern that are unknowable.

3) While we should seek to know what we can know, isn’t it also important to our goals that we recognize there are unknowns, some of which we may sense and many that will remain hidden until they emerge?

4)  What can we do to anticipate and mitigate unknowns?

In making this argument I pointed to the Cynefin framework that our meeting’s facilitator had previously introduced.

The Cynefin Framework, David Snowden et al.

Perhaps I had something green caught between my teeth.  Perhaps my zipper was open and a white shirt tail had escaped. Perhaps in my newbie-caution — or lack of insider-language — I was speaking gibberish.   Perhaps in the afterglow of confirming the Higgs boson all things seem knowable.  In any case, nothing came of this intervention.

Yet as morning unfolded into a long afternoon the prospect of capturing the needed knowledge seemed — not just to me — to recede before our eyes.   Data and information that at 9AM were clear had by 4PM become profoundly suspect.

By this morning-after most of my colleagues have probably regained confidence in the power of knowledge and the possibility of wisdom. I hope so.  For many, many aspects of reality this is an appropriate, important, and productive faith.

I also want knowledge and seek wisdom.  I want to be an expert too.  But I have found myself knowing mostly about aspects of homeland security (life?) that are — so far — beyond fully knowing.    It is tempting, at least for me, to descend into a deeper consideration of how we know what we know.

Instead of epistemology I will offer this observation: security and resilience are two distinct operational — and even psychological — responses to distinct categories of reality.   When reality is simple or complicated we can know and predict and control a great deal.  When this is the context, security is our appropriate focus.

But when reality is complex or chaotic there is much we cannot know, cannot predict, and cannot control… and if we delude ourselves into thinking otherwise we only increase our risks.  When this is the context resilience is the preferred strategic stance.
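This pairing of context and stance can be sketched as a small lookup.  The sketch below is my own illustration, not anything from the Cynefin literature itself; the domain names are Snowden's, the mapping to "security" or "resilience" is simply this post's argument restated as code:

```python
# A minimal sketch (an illustration, not part of the Cynefin framework
# itself): map each Cynefin domain to the strategic stance argued for
# in the post above.
CYNEFIN_STANCE = {
    "simple":      "security",    # knowable and predictable
    "complicated": "security",    # knowable with expert analysis
    "complex":     "resilience",  # patterns visible only in retrospect
    "chaotic":     "resilience",  # no discernible cause and effect
}

def preferred_stance(domain: str) -> str:
    """Return 'security' or 'resilience' for a given Cynefin domain."""
    try:
        return CYNEFIN_STANCE[domain.lower()]
    except KeyError:
        # Snowden also names a fifth state, disorder, for when we do not
        # yet know which domain we are in; treat that as unresolved.
        return "undetermined"
```

The point of the try/except is the post's point: when you cannot tell which domain you are in, the honest answer is "undetermined," not a confident default to control.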

So says the “expert”.



Comment by Michael Brady

July 19, 2012 @ 10:41 am


Data are discrete facts and observations, information is organized data, knowledge is information-in-context, wisdom is knowledge effectively applied to solve problems.

That is as succinct a working definition of these important terms as I’ve ever seen.

And thanks for introducing me to Snowden’s Cynefin Framework. Very useful. His blog looks interesting as well http://cognitive-edge.com/blog/author/all

I say it’s time to take the scare quotes off your expert status.

Comment by josephdietrich

July 19, 2012 @ 2:47 pm

That wisdom is merely defined as “knowledge effectively applied to solve problems” speaks volumes.

Comment by Philip J. Palin

July 19, 2012 @ 3:58 pm

Mr. Dietrich, Yours is a fair concern. Any reductionist summary invites such concerns.

Comment by John Plodinec

July 21, 2012 @ 11:26 am

I, too, appreciate your bringing the diagram to our attention. I think there is a big error on the left hand side, tho. For both the complex, and esp. the chaotic, there should be a fourth action added – “Repeat.” In fact, for both, the old military “OODA” loop might be more appropriate:

Comment by Philip J. Palin

July 21, 2012 @ 3:25 pm

Dr. Plodinec, David Snowden, who conceived the Cynefin Framework, would agree with your prescription to repeat. That’s a good idea to add the word to the illustration.

I think OODA and Cynefin are especially complementary. With this encouragement from you I will (pending real world events) give some further attention to both — and their implications for resilience — next Friday.

Pingback by Security and resilience, knowledge and uncertainty | #UASI

July 22, 2012 @ 2:07 am

[…] Read more @ hlswatch.com […]

Comment by Benjamin Berg

July 23, 2012 @ 2:07 pm


I couldn’t help but see parallels between your discussion and an article I read recently from Infinity Journal regarding complexity and systems theory. The author, Lt Gen Paul Van Riper, contends that Clausewitz, in his book On War, was describing the near impossible challenge of understanding systems that are non-linear. (Van Riper, Paul, “The Foundation of Strategic Thinking,” Infinity Journal, Volume 2, Issue No. 3, Summer 2012). The article points out that linear systems are understandable and can be ‘known’ whereas non-linear systems, with their multitude of interconnected variables and cascading effects become the ‘unknown’ that we attempt to model, but often do disservice in the attempt. He writes, “We benefit little when we separate the parts of an interactively complex system and study the parts in isolation, because in the act of separation the system loses its coherence and the parts lose their meaning.” I imagine that resilience becomes that complex system that is nearly impossible to model with any accuracy (though we did give it a solid try in the Critical Infrastructure course at CHDS).

Greatly enjoyed seeing the Cynefin framework again! Will keep reading your blog posts with high interest…



Comment by Philip J. Palin

July 24, 2012 @ 4:12 am

Ben, Thanks. Fantastic to know a US Marine Corps LT GEN has also seen the edge where command-and-control may falter.

We want to predict. We want — or think we want — to precisely measure the future. We want a deus ex machina, an algorithm, in which each part is isolated, its influence mapped, and all available as variables for our purposeful manipulation. I do too.

Prediction is a secular culture’s rebranding of predestination and it flatters the Calvinist pride still lurking in most of us. Complex Adaptive Theory, resilience, and related may be our secular attempt to retrieve concepts of grace.

Pingback by Homeland Security Watch » SnOODAn: Boyd, Snowden, and Resilience

July 26, 2012 @ 12:12 am

[…] Last Thursday I posted a bit on Cynefin. Developed by David Snowden and others, the Cynefin Framework can be a helpful tool for engaging reality’s varied flows, especially the flows from known to knowable to complex to chaotic and betwixt and between. […]
