From James Clapper’s Tuesday testimony to the Senate Select Committee on Intelligence:
We are in a major transformation because our critical infrastructures, economy, personal lives, and even basic understanding of—and interaction with—the world are becoming more intertwined with digital technologies and the Internet. In some cases, the world is applying digital technologies faster than our ability to understand the security implications and mitigate potential risks.
State and nonstate actors increasingly exploit the Internet to achieve strategic objectives, while many governments—shaken by the role the Internet has played in political instability and regime change—seek to increase their control over content in cyberspace. The growing use of cyber capabilities to achieve strategic goals is also outpacing the development of a shared understanding of norms of behavior, increasing the chances for miscalculations and misunderstandings that could lead to unintended escalation.
Compounding these developments are uncertainty and doubt as we face new and unpredictable cyber threats. In response to the trends and events that happen in cyberspace, the choices we and other actors make in coming years will shape cyberspace for decades to come, with potentially profound implications for US economic and national security.
A major hospital system has delayed deploying an extensive (expensive) digital patient record system. Everyone agrees the new system will produce significant financial and clinical benefits. But no one has figured out how to ensure an effective non-digital capability persists. Preserving that capability was not a design specification.
There are multiple digital redundancies. But what if electric power is lost beyond the capacity of backup generators? How can patient records and status be accessed and updated if the digital system is dead for days?
This is more than a technical problem. Many of the efficiencies generated by the ready-to-go system depend on collecting digital signals from various diagnostic tools and displaying integrated clinical outcomes. Today the sub-systems feeding these displays — and their strengths and weaknesses — are understood by clinical staff. Today it is not uncommon for an experienced nurse or lab tech to recognize that a specific data source can be “screwy” and should be rechecked. The new system will sufficiently obscure data sources to make this nearly impossible.
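The experienced nurse's instinct that a reading is "screwy" is tacit knowledge the new system would obscure. As a purely hypothetical illustration of what making that judgment an explicit design specification might look like, here is a minimal sketch of an automated plausibility check; the source names and ranges are invented for the example, not drawn from any real clinical system.

```python
# Hypothetical sketch: an explicit plausibility check standing in for the
# nurse's judgment that a data source "can be screwy and should be rechecked".
# Source names and ranges are illustrative assumptions, not clinical guidance.

PLAUSIBLE_RANGES = {
    "heart_rate_bpm": (20, 250),
    "temp_celsius": (30.0, 43.0),
    "spo2_percent": (50, 100),
}

def flag_for_recheck(source: str, value: float) -> bool:
    """Return True when a reading falls outside its plausible range,
    signaling that the feeding sub-system should be manually rechecked."""
    low, high = PLAUSIBLE_RANGES[source]
    return not (low <= value <= high)

# A reading far outside the plausible band gets flagged rather than
# silently folded into the integrated display.
print(flag_for_recheck("heart_rate_bpm", 300))  # flagged
print(flag_for_recheck("spo2_percent", 97))     # accepted
```

The point of the anecdote stands either way: such rules capture only the crudest part of what an experienced clinician notices, and the rest disappears when the data sources are obscured.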
One hospital administrator comments, “As long as we have clinical staff who remember how to use pre-digital systems, we can probably recover capabilities.” But given staff turnover, this sort of human redundancy is expected to disappear within seven years.
My auto mechanic recently said, “When computer diagnostics first came out it was a big help, but I could still do most of my work without it, just not as quick. Now if the computer is on the fritz I can’t do anything.” He suggests younger mechanics are just “playing electronic games with your car,” and don’t understand any of the underlying systems. The hospital is trying to avoid this outcome.
I was talking to the manager of a large municipal water system. “Actually I feel pretty good about our resilience,” he said. “We’re a collection of several largely separate legacy systems built over the last century-plus: lots of innate redundancy, mostly gravity fed, almost all of it requires a human to turn a valve somewhere. Not nearly as efficient as the newest systems, but take out one piece and the rest just keeps on flowing. Bad planning has had some unintentionally good results.”
Meanwhile, without digital scanning and communications, most retail, wholesale, and shipping would suddenly stop. This includes food and pharmaceuticals. When the March 11, 2011 earthquake and tsunami hit northeastern Japan, the digital voices of those inside the impact zone went silent. The voices of hoarders hundreds of miles away became a shout. The supply chain responded to expressed want, not silent need.
The digital world has become the frame and filter on which many of us depend to engage the real world. Humans have long depended on frames and filters to simplify what would otherwise be too complex. Mathematics, religion, law, and more are all toolsets for framing and filtering.
There is often a temptation to mistake form for function. Framing reality has always included the risk of warping reality. We have experienced the consequences of these risks. (I seem to experience them daily.)
But never before has access to water, food, and other essentials for such large populations been so dependent on the quality and survivability of our frames.
“Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”
Daniel Kahneman, Thinking, Fast and Slow