Homeland Security Watch

News and analysis of critical issues in homeland security

December 14, 2009

Redaction: A Metaphor of Security

Filed under: General Homeland Security — by Jessica Herrera-Flanigan on December 14, 2009

Co-authored by Colin Bortner

The House Homeland Security Committee’s Subcommittee on Transportation Security and Infrastructure Protection will hold a hearing Wednesday, following the revelation last week that the Transportation Security Administration (TSA) posted the agency’s Screening Management Standard Operating Procedure manual as an amendment to a contract notice on the Federal Business Opportunities website. A blogger discovered the PDF document and found that the redaction of sensitive information could be undone: users of Adobe Acrobat publishing software were able to remove the blacked-out paragraphs and read the text beneath.

At the hearing this week, Gale Rossides, the acting TSA administrator, will testify about the incident, including the actions taken and what the agency is doing to prevent future disclosures. It has been widely reported that five employees have been put on leave. TSA has also indicated that the document was outdated and not the one currently in use at airport checkpoints.

Putting aside the details of the document’s release and the findings from this week’s hearing, the ease with which the blacked-out text was recovered is a simple illustration of one of the peculiar challenges of information security: metaphors.

The linguist George Lakoff has argued that metaphors are the foundation of human thinking. For him, the development of thought has been the process of developing better metaphors: applying one domain of knowledge to another offers new perceptions and understandings. In a naïve sense, we understand new things in terms of old things. In this case, we understand PDFs in terms of paper.

Portable Document Format (PDF) files encapsulate a collection of elements, including text, fonts, and images, into a fixed composition. Adding a new element (such as a black rectangle) to the composition does not remove the material that the element may obscure. In the redacted document, all of the sensitive text remained behind the inserted black rectangles.
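
To make that concrete, here is a minimal sketch of how the hidden text comes back out. It assumes the open-source pypdf library and a hypothetical file named redacted.pdf (not TSA’s actual document); the point is only that a text extractor reads the page’s content streams directly, so an overlaid rectangle hides text from the eye but not from the software.

    # Minimal sketch: assumes the pypdf library and a hypothetical
    # "redacted.pdf" whose sensitive text is merely covered by black
    # rectangles rather than removed from the file.
    from pypdf import PdfReader

    reader = PdfReader("redacted.pdf")
    for page in reader.pages:
        # extract_text() walks the page's content stream and returns every
        # text object it finds; a rectangle drawn over the text later in
        # the stream obscures it visually but does not delete it.
        print(page.extract_text())

Purpose-built redaction tools avoid this by rewriting the content stream so that the covered text is actually removed before the black box is drawn.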

That this happened is not hugely surprising. PDFs are a very, very good metaphor for paper, and that is not by accident. They come in paper sizes, they’re broken into pages, and we rarely edit them. Adobe and other developers have capitalized on the strength of that metaphor and delivered on users’ expectations that PDFs should behave like paper in many ways. Most people, probably including those at TSA responsible for the document, applied the paper metaphor, saw the black rectangles, and expected that they effectively destroyed whatever text was underneath, as though the document were a photocopy.

This is just one illustrative case. We use metaphors in information security all the time. Some examples are very familiar:

* locks and keys (physical),

* viruses and infection (medical),

* victims and theft (crime),

* externalities and costs (economic), and

* fronts and response (warfare).

At the 2007 Workshop on the Economics of Information Security (WEIS), a group of researchers from Indiana University’s School of Informatics presented a paper on “Mental Models of Computer Security Risks” showing empirically that security experts and non-experts have different mental models relating to metaphors like those above.

As one of the authors of the WEIS paper wrote in an earlier article:

“The different examples and metaphors suggest different responses. Crime suggests investigation of every virus and worm. Crime also suggests minimal citizen responsibility with the possibility of neighborhood watch. The public health metaphor requires coordinated public action with a fundamental requirement for retaining individual autonomy and civil rights. The criminal metaphor requires tracking and prosecution. The concept of warfare requires tight constraints on the network, with limited autonomy and top-down controls. Non-technical individuals will take all the implications of the metaphors.”

The author continues:

“Therefore when communicating with policy makers, media, and non-technical users the computer security expert should consider which metaphor correctly communicates user expectations.”

The differing expectations of experts and non-experts in information security are the foundation for events like last week’s leak of the TSA manual.


Comments »

Comment by William R. Cumming

December 14, 2009 @ 6:18 pm

I guess this furthers the notion of “thinking” starting with the development of language as opposed to instinct. It would be hard to convince me that any bureaucracy, no matter how oriented to the public, is willing to disclose its methods and processes. What is interesting about this “accidental” release is how it fits with the various screening posts over the years on this blog. Somewhat primitive historian that I am, I find over and over that earlier work, not available virtually, often addresses many of the same issues and policy questions now confronting DHS. Some new certainly, but much studied and analyzed long before, just lost in the caverns of the bureaucracy. Ever wondered what the Israeli screening manual looks like and reads?

