Thursday, September 10, 2015

Deadly Cultural Bias in Technical Decision Cycles


Another mental journey into "bet your life" Epistemology -- how we know what we think we know.  Or in this case -- how to act -- or not.  There is a "cultural bias" coloring how pilots -- and sometimes regulators & BigGov actors -- bet your life in a decision cycle.  A specific historical example below.

Setting, 2002 Europe:  The Überlingen mid-air collision over Germany between DHL Flight 611 (DHL-611), a cargo flight, and Russian Bashkirian Airlines Flight 2937 (BTC-2937).  This collision is an excellent illustration of how culture plays into life-and-death decisions.

SHORT VERSION?  Mixed cultural reactions in a decision cycle: (a) Folks from "open" societies tend to make "more local" decisions.  (b) Folks from authoritarian or totalitarian societies tend to discount their local understanding in favor of the "all knowing" authorities.  When these two different cultural and social responses to a decision cycle are mixed -- it can end very poorly.

VIDEO INTRO: For a well developed background and narrative unfolding of the event -- enjoy this 45 min YouTube video:


45 Min Video: Überlingen Mid-Air Collision

Analysis of Cultural Bias Upon Pilot Actions and Situation Awareness:

Both the DHL-611 and BTC-2937 pilots had fully functional aircraft and instruments -- both had access to the same electronic collision avoidance “advice” from the onboard Traffic Collision Avoidance System (TCAS).


Cockpit TCAS Panel Indicator (from Wikipedia)

Yet the BTC-2937 Russian pilots had been given conflicting instructions and "advice" from the human Air Traffic Control (ATC) -- a human with access to ground radar.  In theory, pilots in both aircraft had "Pilot in Command" discretion to over-ride the erroneous ATC call -- yet the erroneous ATC command was (unfortunately) directed at only the Russian pilots.  And here is where a cultural bias afflicts -- and infects -- the "pilot in command" life & death decision cycle.

How so?  First, a little background to help understand how this cultural bias is inculcated and conditioned into pilot decision makers -- and "plays" against real-life decisions.

Back in the mid-1990s, this author had a USAF fighter pilot describe his experiences in the 1991 Iraq war.  This gentleman shared a bit about how the Iraqi "command & control" (C&C) system was specifically knocked out -- early in the war -- to deprive the Iraqis of their ability to consult with -- and receive orders from -- the Iraqi central command.  Why?  Apparently, at the US military academies, decision cycles -- and the field execution of "orders" -- are carefully studied by the officer corps.

More than just the USA command / execution decision cycle is studied -- so are the decision cycles of many potential adversaries and foreign governments -- per the old Sun Tzu adage: "Know thyself, know thy enemy."

The Iraqi military were inculcated and conditioned to respond only to the central command.  Knocking out Iraqi communications effectively shut down the Iraqi military's ability to respond.  The local commanders were cut off -- and mentally "lost" -- without guidance from their central command.

For follow-up, I asked if the US military had a similar C&C knock-out vulnerability.  My USAF fighter pilot had a surprising -- and detailed -- answer -- an answer that says much about HOW decision cycles are afflicted -- and infected -- with cultural bias.  And how local decision cycles can be made more robust by the human element -- guys "on the ground."

The USA military learned -- and apparently this lesson is propagated in the US military academies -- that "global planning, local execution" is key to a military campaign.  This lesson was learned especially during the 1846-1848 Mexican-American War -- as field commanders were often weeks to months out of contact with the central authority.


   1847: 2nd Battle of Tabasco (from Wikipedia)

This "global planning, local execution" lesson was driven home by experiences during the 1861-1865 US Civil War -- and even with the advent of the electric telegraph -- communications were slow and intermittent -- and subject to enemy interruption / interception / spoofing.   The effectiveness of "global planning, local execution" was reinforced in the US military sub-culture -- as one of the “best” ways to manage and prosecute a war.  

From cursory historical research -- apparently beginning in the 20th century -- the US Military began to collect lessons learned from "global planning, local execution" -- under the nomenclature of "Centralized Planning, Decentralized Execution" (CPDE).

This systemic decision orientation was formalized in the 1986 Goldwater-Nichols Act.
For an interesting 2009 summary history by Lt. Colonel Hinote, see "Centralized Control and Decentralized Execution: A Catchphrase in Crisis?"

In simple summary words -- the US military expects the central planners & authority to set the overall "big picture" goals -- and leave the local execution details to local commanders.  These local commanders often have a more detailed and refined "Fingerspitzengefühl" -- a "finger tip feel" for the local situational awareness "picture."

If local commanders are "cut off" from their command & control, they are trained to proceed on their own initiative and local situational understanding -- to effect the "big picture" goals.  Apparently -- today in 2015 -- US Military policy tends to default to this "global planning, local execution" mode -- via training and propagation of the military sub-culture -- unless "over-ridden" by political micro-management.

On the flip side, the US Military apparently takes active advantage of many dictatorships and their "top down" government systems.  Most of these have "grown up" with fair-to-excellent command & control technology and communications options -- and sometimes lack a long-term military academic history.  These top-down military systems only "know" how to run their combat & war machinery by removing local commander discretion and execution -- in favor of "wisdom" from the Jefe Máximo (big boss).  Thus the emphasis -- in 1991, and possibly in other conflicts -- of the US military on knocking out the C&C structure of adversaries.

Back to the original topic: How does this "global planning, local execution" mental orientation play into the Überlingen mid-air collision?

Consider the Soviet top-down pilot & military training "atmosphere" -- say from 1918-1992.  Please consider that in 2002 the Russian pilots were mid-to-late career.  This was just 10 years after the breakup of the USSR-Soviet Union -- which means that much of the Russian pilots' early training was inculcated and colored by the Soviet orientation -- a centralized, top-down command & control culture.  In this top-down culture, a pilot's career would only progress by demonstrating a strong proclivity to respond to orders from the central authority -- and pilots would not be trusted with expensive aircraft and significant Soviet state property if they failed to instantly respond to this central authority.  In simple words, the 2002 pilots' unconscious mental bias was roughly: Central authority knows best for the mission; local details, not so much.

TECHNICAL vs CULTURAL DECISION CONFLICT:  The DHL-611 pilots responded correctly -- and as per training -- to the “descend” command from the onboard electronic Traffic Collision Avoidance System (TCAS).  


Panel Display: TCAS EU Flysafe (from Wikipedia)

However, the Russian BTC-2937 pilots had a cockpit decision conflict:  The onboard TCAS system correctly commanded "climb" -- to avoid the DHL-611 aircraft -- yet the Russian pilots had been commanded by the human ATC controller to "descend" -- resulting in both aircraft descending into each other in a mid-air collision.

For a few seconds, the Russian pilots responded as per the ATC instruction -- demonstrating their first training instinct and allegiance -- and desire -- to be responsive to the top-down, central authority.  As characterized in the video, the Russian pilots were mentally torn and confused by the conflicting ATC vs TCAS collision avoidance commands.  These few seconds of delay were probably due to the Russian pilots' training and cultural bias:  Equipment could be ignored with less peril to career, life and liberty than ignoring the human command & control.
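The trained priority rule in this conflict can be sketched in a few lines of code -- a hypothetical illustration only (the function name and values are invented, and this is nothing like real avionics software): when an active TCAS Resolution Advisory (RA) conflicts with an ATC instruction, crews are trained to fly the RA.

```python
# Hypothetical sketch of the trained priority rule: an active TCAS
# Resolution Advisory (RA) over-rides a conflicting ATC instruction.
# Illustration only -- not actual avionics or crew-procedure logic.

def resolve_maneuver(tcas_ra, atc_instruction):
    """Return the maneuver a crew is trained to fly.

    tcas_ra: 'climb', 'descend', or None (no active RA)
    atc_instruction: 'climb', 'descend', or None
    """
    if tcas_ra is not None:
        return tcas_ra          # an active RA wins over ATC
    return atc_instruction      # otherwise comply with ATC

# The Überlingen conflict, as described above:
# DHL-611:  RA said descend, no conflicting ATC call -> descend
# BTC-2937: RA said climb, ATC said descend -> trained rule says climb
assert resolve_maneuver('descend', None) == 'descend'
assert resolve_maneuver('climb', 'descend') == 'climb'
```

The cultural bias described above amounts to the Russian pilots' conditioning inverting that `if` statement -- human central authority first, equipment second.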

UPSHOT: For pilot decision makers under top-down, centralized systems -- like the former Soviet USSR -- ignoring the human chain-of-command was heretical to past training and mental conditioning.  This probably contributed much to the Überlingen mid-air collision.  The odds are good that there are many other life and death decision cycles -- decision cycles that involve endangerment of people and expensive equipment -- from the size of aircraft -- to whole nation states -- that are colored by past and ongoing top-down training and cultural bias (think and compare with the nuclear Chernobyl Disaster).

SUMMARY Quote by Stephen R. Covey -- "We see the world, not as it is -- but as we are" (or as we are conditioned to see the world).  And "big" decision makers see the world "as they are" -- just like pilots.

Tuesday, September 8, 2015

Big Decision Makers Can Be Dead Wrong


The "Regulatory Mentality" and "Mastermind" orientation like to flatter themselves into believing that they know what is best for you and your situation.  These regulators and masterminds often live in a world of bulk statistics and "big picture" numbers -- polls, census, economic trends, map databases and related measurements -- that give them apparent "god like" views of the regulated landscape. 

Implicit in your interactions with the regulatory mastermind mentality:  "What could you and your little perspective possibly know?  We see the big picture."

Unfortunately -- in the routine daily / monthly / yearly work-cycle of the regulatory estate -- all too often "full loop" feedback is missing -- epistemologically and/or politically "broken."


Broken Feedback Loop (modified from Wikipedia)

Big Mastermind Decision Makers are very rarely required to face the risk and consequences of their interpretations and "understanding" of the observed numbers -- and of the effected regulatory takings and mandated regulatory "law" (regulations that have the effect and weight of statutory law -- until tested in court).

Big Decision Makers are running "open loop" and not "closed loop" in the short term -- and often in the long term as well -- for both feedback scope and effect.  These masterminds are rarely required to "eat their own dog food" -- and live fat and happy off your taxes and fees -- and your "chasing-your-own-tail" efforts at "regulatory compliance."
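The open-loop vs closed-loop distinction above comes straight from control theory, and a toy simulation makes the difference concrete.  This is an invented illustration (the process model, gains, and numbers are all assumptions, not a model of any real regulatory system): an open-loop decision maker acts once and never observes the outcome; a closed-loop one corrects against observed error.

```python
# Toy first-order process illustrating open-loop vs closed-loop
# decision making.  All numbers are invented for illustration.

def run(steps, gain):
    """Drive a process toward target=10 with proportional feedback.

    gain=0.0 models the 'open loop' mastermind: act on the initial
    guess and never observe the result.  gain>0 closes the loop.
    """
    target, value, action = 10.0, 0.0, 5.0   # initial guess at the action
    for _ in range(steps):
        value += 0.5 * (action - value)      # process responds to action
        error = target - value
        action += gain * error               # feedback correction (if any)
    return value

open_loop = run(50, gain=0.0)    # settles near the initial guess (5)
closed_loop = run(50, gain=0.5)  # feedback drives it to the target (10)
assert abs(open_loop - 5.0) < 0.1
assert abs(closed_loop - 10.0) < 0.1
```

The open-loop run lands wherever the initial guess puts it -- which is the point: without feedback, being wrong costs the decision maker nothing and gets corrected never.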

EXAMPLE FAILURE?  Perhaps a "fully closed loop" failure example will illustrate how Big, Mastermind Decision Makers infesting the Regulatory Estate and Plantation can be DEAD WRONG -- even with "simple" numerical choices -- like a compass heading.

PARALLEL EXAMPLE: Airlines -- and especially pilots -- must live and work in a world of atmospheric environmental measurements and navigational observations -- where the quite literally "on the fly" real-time interpretation of the numbers can make the difference between life and death.

VIDEO: To help illustrate and see a "Real Life" example, where key decision makers bet their lives -- and the lives of others -- a video treat.  This video explores the key questions: “What do we know – and how do we know it?” -- and "Oh, No! What now?"


Varig Boeing 737-241 (from Wikipedia)

BRAIN JOURNEY: When you have about an hour -- get a coffee or a beer -- sit down in front of the computer or big-screen media center -- and watch this YouTube video: Varig Flight 254 -- and the air crash investigation circa 1989-1990.  Please note how both pilots blew a "simple" instrument setting -- and failed to use 2nd & 3rd verification navigational data.  The pilots apparently so trusted the electronic in-flight navigational instruments that they ignored visual navigational evidence of the sun, the moon and the stars -- just out the window.



44 Min VIDEO: Varig Flight 254

SMARTER DECISION MAKER SHOPPING: Let your mind “float” with the video narrative -- as presented -- and ask yourself this question: “When should the key decision makers have known they were in trouble?”

Internally watch how both your objective and subjective mind respond to the narrative and flow of facts.

Afterwards, please reflect on how million- and billion-dollar decision makers operate relative to this example.  This is a fun -- and astounding -- cross-training comparison for those engaged in "regulatory compliance."  Just like pilots who set a simple compass number wrong, regulators make "simple" yet fatal mistakes.

BACKGROUND: Why these air crash investigations?   What relevance do they have to the regulatory compliance "process"?  Three reasons: 

(a) In high school and college, this author worked on fixed- and rotary-wing aircraft for four years, picked up several wrecks -- a couple with fatalities -- and wondered why people did such dumb things with their lives and expensive equipment,

(b) Aircraft instruments are much like a scientific experiment -- providing repeatable measurements of the atmospheric environment -- and of the navigational observational evidence external to the aircraft.  Short of a triple instrument error -- and a complete "brain fart" breakdown in "seat-of-the-pants" mental monitoring and the pilot's flight "Situation Awareness" (SitRep) -- the only wild card in understanding and correctly betting your life is the pilots' INTERPRETATION of the instrument readings,

(c) Back in the late 1990s -- both a key mentor scientist -- and an equipment sales rep -- introduced this author to the formal background of "Epistemology" -- how we know what we think we know.  Or as the book title sez: "How We Know What Isn't So."



These air crash investigations are fast-feedback, risk-consequence mini brain-processing laboratory "experiments" -- where "correct" and "whole cloth" SitRep interpretation of the instrument data results in a safe arrival -- and "incorrect" interpretations result in painful outcomes.  A knife-edge test and result missing from most "science"-based hypothesis testing -- and from regulatory / political wishful thinking.
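The "2nd & 3rd verification" habit described above -- never trusting a single instrument when redundant readings exist -- can be sketched as a simple 2-out-of-3 vote.  This is a hypothetical illustration (the function, tolerance, and heading numbers are invented; real avionics voting logic is far subtler): compare redundant readings and flag the odd one out.

```python
# Hypothetical 2-out-of-3 cross-check on redundant readings -- the
# habit the Varig 254 pilots skipped.  Illustration only; invented
# numbers, not real avionics voting logic.
from statistics import median

def vote(readings, tolerance):
    """Return (consensus, outliers) for three redundant readings."""
    m = median(readings)
    outliers = [r for r in readings if abs(r - m) > tolerance]
    good = [r for r in readings if abs(r - m) <= tolerance]
    return sum(good) / len(good), outliers

# Two heading sources agree; one was mis-set by the crew:
consensus, bad = vote([270.0, 272.0, 27.0], tolerance=10.0)
assert bad == [27.0]                  # the mis-set instrument stands out
assert 270.0 <= consensus <= 272.0    # consensus of the agreeing pair
```

The point of the sketch: a single mis-set number is only fatal when nobody votes it against the other available evidence -- including the sun out the window.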

OVER-RIDING REPTILIAN BRAIN FUNCTION: There is another Mayday-Air Crash Investigation video that brings in an "extra" decision cycle biological factor:  The highly biased data of the human reptilian brain complex -- and how the inner ear and vertigo information, swirling & mixing in the brain, dominate and overwhelm the mind's interpretation of the instrument data.  A "ghost" in the organic "machinery" of the brain.


China Airlines 006 (from Wikipedia)

After you watch the above video -- search on YouTube for "China Air 006" and "panic over the pacific" -- and note how the reptilian brains of the pilots mix with the instrument data in a less than useful manner.

UPSHOT:  Near real-time feedback is critical for "enforcing" good decision cycles.  The Mayday-Air Crash Investigation video series -- and how pilots make life-and-death decisions via hard instrument observations -- and human brain SitRep interpretations -- illustrates a "full loop" risk-consequence decision cycle -- with real, short-term harm and consequences.  This "full loop" feedback is missing in the "regulatory compliance" working world.  The Mastermind Decision Makers are not betting their lives and careers, as the pilots of aircraft are.  TAKE AWAY?  To obtain a "better" and "more sane" regulatory landscape -- the goal of every "compliance" expert should be to close the loop -- and bring the consequences of Mastermind decisions and choices home -- and onto the heads of the Masterminds.

To quote from Shakespeare's Henry V -- "All things are ready, if our minds be so."