The Fallacy of the UNpossible

Guest authors: Thomas P. Seager, Lucien Hollins, and Marcus Snell

In the popular US cartoon television series The Simpsons, there is a brief scene in which one of the child characters is placed on academic probation.  His response is, “Me fail English?  That’s unpossible!”  As viewers, it strikes us as ridiculous to witness a character so unaware of his own cognitive limitations when they are so obvious to us.  Yet, we sense that his lack of awareness ensures that he’ll never overcome the deficit.  On a cartoon series, the stupidity of a cherubic elementary school student is comedic.  In infrastructure management, such a lack of self-awareness can be tragic.

Sometimes these failures can be attributed to distortions of perception called cognitive biases.  Common human biases, fallacies, and systematic errors are now so well documented that Wikipedia has catalogued an extensive list.  For example, the Normalcy Bias describes the common misconception that conflates unprecedented with impossible.  We often hear the argument that because something has never happened before, it could never happen.  While the Normalcy Bias correctly captures this distortion of perception, the reasoning is so dangerous, and we see it so often in infrastructure systems, that it deserves a more memorable name.  Let us call it The Fallacy of the UNpossible.

The Fallacy of the UNpossible describes the phenomenon we observe when otherwise reasonable people conflate rare or highly unlikely events with impossible ones.  Some things we can imagine really are impossible, such as traveling faster than the speed of light or violating the second law of thermodynamics.  But unprecedented floods, earthquakes, stock market crashes, and even election outcomes are not subject to such physical constraints.
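
To see how quickly “rare” departs from “impossible,” consider a back-of-the-envelope calculation.  The sketch below (in Python, with purely illustrative numbers describing no particular facility) computes the chance that a rare annual event occurs at least once over a design life:

```python
# Chance that a "rare" annual event occurs at least once over a design life.
# Illustrative numbers only; p is an assumed annual exceedance probability.

def prob_at_least_once(p: float, years: int) -> float:
    """P(event occurs at least once in `years` independent years)."""
    return 1.0 - (1.0 - p) ** years

# A "100-year" event (p = 0.01) over a 50-year design life: ~39.5%
print(f"{prob_at_least_once(0.01, 50):.1%}")

# Even a "1000-year" event is far from impossible over a century: ~9.5%
print(f"{prob_at_least_once(0.001, 100):.1%}")
```

Over the lifetime of a typical piece of infrastructure, an event dismissed as “once in a hundred years” is closer to a coin flip than to an impossibility.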

For example, consider the crisis at California’s Oroville Dam.  Oroville remains the tallest dam in the United States, and officials recently ordered a mandatory evacuation of communities downstream for fear that the 770-foot edifice would collapse due to the simultaneous failure of the main and emergency spillways, an unlikely failure scenario that environmental groups had predicted back in 2005.  In a public hearing held as part of a relicensing approval process, these groups argued that the emergency spillway was unsafe.  According to them, should the spillway ever be activated, the flow of water over the top would erode the supporting embankment and eventually lead to collapse.

But the dam, built in the late 1960s to provide flood control and water diversion from northern to southern California, had never before reached water levels that necessitated use of the emergency spillway.  Over three decades of safe operation at the dam convinced officials that the “facilities, including the spillway, are safe during any conceivable flood event.”

Herein lies the fallacy.  Dam officials were convinced their facilities were safe because they could not conceive of any scenario in which they weren’t.  In the face of criticism of the spillway design, they held to the view that it was safe despite previous surprises, including a 1997 evacuation in response to the failure of some downstream levees.

Six years later, in 2011, water levels climbed to within 11 inches of the lip of the Oroville emergency spillway.  Such a near-miss might have driven officials to rethink the safety of the spillway, or to test its performance.  However, in cases such as this, officials sometimes interpret near-misses as proof of the graceful extensibility of their system, rather than as a warning that failure is more likely than previously thought.
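
One way to treat a near-miss as evidence rather than reassurance is a simple Bayesian update.  The sketch below is our illustration only; the uniform prior, the 43-year record, and the definition of a “close approach” are assumptions made for the sake of the example, not figures from the Oroville record:

```python
# A near-miss should raise, not lower, the estimated frequency of water
# approaching the emergency spillway. Illustrative Beta-Binomial update;
# all numbers are assumptions, not Oroville data.

# Uniform Beta(1, 1) prior on the annual probability of a "close approach".
alpha, beta = 1.0, 1.0

years_observed = 43   # assumed length of the operating record
close_approaches = 1  # e.g., the 2011 near-miss

# Posterior after observing 1 close approach in 43 years:
alpha += close_approaches
beta += years_observed - close_approaches
annual_p = alpha / (alpha + beta)
print(f"estimated annual probability: {annual_p:.1%}")     # ~4.4%

# Implied chance of at least one more close approach within 20 years:
print(f"within 20 years: {1 - (1 - annual_p) ** 20:.0%}")  # ~60%
```

Read this way, the 2011 near-miss argues for testing the spillway, not for trusting it.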

The lessons of the Oroville Dam and recognition of the Fallacy of the UNpossible might inform other potential catastrophe scenarios.  One may be found in extreme heat.  Temperatures in Phoenix, Arizona, in the United States recently neared 49 degrees Celsius.  While such extreme readings were previously unheard of for a major US population center, a combination of climate change and urbanization might push future temperatures even higher, especially in the desert southwestern US, where Phoenix is a major transportation, economic, and cultural hub.

At the component level, the physical response of infrastructure to such extremes is predictable.  For example, record high temperatures cause record high demand for electric power (for air conditioning) at exactly the time when power plants and transmission lines are operating at lower efficiencies (due to the higher ambient temperatures).  However, at the scale of the larger complex system, the consequences become unpredictable, and potentially catastrophic.  Should a cascading power failure occur during an extreme heat event, individuals would no doubt seek refuge in the mountains or towns surrounding Phoenix.  Yet gasoline stations, dependent upon the city’s electrical grid, would be unable to refuel private cars.  For those with fuel, traffic signal and rail crossing outages might result in gridlock, overwhelming already taxed emergency response crews.  Water pumps would fail and, when elevated storage reserves became depleted, there would be no pressure in the distribution system to supply water for drinking, firefighting, evaporative cooling, or irrigation.  Sewage pump stations and treatment plants without backup power sources would overflow, and those with backup generation would be difficult to resupply.  Airborne evacuation would be complicated by the fact that helicopters and jets are not rated for flight in temperatures nearing 50 degrees Celsius.
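
The interdependencies in this scenario can be made concrete with a toy model.  The dependency graph below is hypothetical, a sketch we constructed to illustrate the mechanism rather than a model of any real utility network; each edge means that failure of the upstream component disables the downstream one:

```python
# Toy cascade over a hypothetical infrastructure dependency graph.
# An edge X -> Y means "failure of X disables Y". Purely illustrative.

from collections import deque

DOWNSTREAM = {
    "power": ["fuel_stations", "water_pumps", "traffic_signals",
              "sewage_pumps"],
    "fuel_stations": ["private_evacuation"],
    "water_pumps": ["drinking_water", "firefighting", "evaporative_cooling"],
    "traffic_signals": ["road_evacuation"],
    "sewage_pumps": ["sanitation"],
}

def cascade(initial_failure: str) -> set:
    """Breadth-first propagation of a single component failure."""
    failed = {initial_failure}
    frontier = deque([initial_failure])
    while frontier:
        node = frontier.popleft()
        for dependent in DOWNSTREAM.get(node, []):
            if dependent not in failed:
                failed.add(dependent)
                frontier.append(dependent)
    return failed

# One failed node takes out every service that transitively depends on it:
print(sorted(cascade("power")))
```

Each edge in isolation is predictable; it is the transitive reach of the cascade, interacting with heat, timing, and human behavior, that defies forecasting.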

A 2011 power outage in neighboring Mesa, Arizona that left over 100,000 people without power for 11 hours gave citizens and officials a brief glimpse of what such a catastrophe might look like.  However, the fact that the potential catastrophe was contained may result in overconfidence rather than additional precaution.  When subject to the Fallacy of the UNpossible, people sometimes confuse the worst historical case with the worst possible case, and thus fail to prepare for the unprecedented.  This type of thinking places emphasis on ensuring that the UNpossible event could never take place, i.e., on reducing the probability of the already unlikely.  But recognition of the fallacy demands anticipation of the event rather than forecasting of it: “What do we do if… ?” rather than “How can we be sure that… ?”  This reframing avoids the Fallacy by applying resources to minimizing the consequences of the event, rather than its probability.
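
The gap between the worst historical case and the worst possible case can even be quantified.  For independent, identically distributed annual extremes, the probability that the next n years produce a new record, given m years on the books, is n/(m+n), whatever the underlying distribution.  The Monte Carlo check below is our illustration, using an arbitrarily chosen heavy-tailed distribution:

```python
# How often does the "worst on record" get beaten? For i.i.d. annual maxima,
# P(new record in next n years | m years of record) = n / (m + n),
# regardless of the distribution. Monte Carlo check with an arbitrary
# heavy-tailed choice (lognormal); any continuous distribution agrees.

import random

def new_record_freq(m: int, n: int, trials: int = 20_000) -> float:
    hits = 0
    for _ in range(trials):
        history = [random.lognormvariate(0, 1) for _ in range(m)]
        future = [random.lognormvariate(0, 1) for _ in range(n)]
        if max(future) > max(history):
            hits += 1
    return hits / trials

# 50 years of record, next 50 years: theory says 50/100 = 50%.
print(f"simulated: {new_record_freq(50, 50):.1%} (theory: 50.0%)")
```

Even in a perfectly stationary world, a 50-year record is broken in the following 50 years about half the time; under a changing climate the odds are worse.  The worst we have seen is a floor on the worst we should expect, never a ceiling.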

As climate, technology, and social systems evolve, we have every reason to believe that the future will be, in some characteristic and important ways, a significant departure from the past.  Thus, extrapolation from historical datasets is likely to be less reliable than ever.  Wherever we encounter designers, operators, policy-makers, or others arguing that historical experience has given them confidence in bounding the future, we must recognize the Fallacy of the UNpossible.  Here, we use the ridiculous prefix “UN-” rather than “im-” to remind us that tragi-comedy only exists in cartoon worlds.  Our real experiences will not confine themselves to the failures our imaginations can conjure.


Biographies:

Dr. Thomas P. Seager is an Associate Professor in the School of Sustainable Engineering and the Built Environment at Arizona State University and leads a research group of scientists, engineers and students dedicated to creating new knowledge to make infrastructure safer and more resilient.

Lucien Hollins is an undergraduate student in Astronautical Engineering in the School for Engineering of Matter, Transport & Energy at Arizona State University and a resilience researcher under the direction of Dr. Seager. He previously served in the US Navy, operating and maintaining machinery for nuclear power and propulsion.

Marcus Snell is a Resilience Engineer who received his MS in Civil Engineering from the School of Sustainable Engineering and the Built Environment at Arizona State University. He models resilience as a process within the built environment.

Further References:

Redesigning Resilient Infrastructure Research Webinar: https://www.youtube.com/watch?v=632yydH-7pw

Acknowledgements: We are grateful to Jonathan Gao, Safety Science Innovation Lab at Griffith University, for first drawing our attention to the normalcy bias, and to Laura Maguire, Cognitive Systems Engineering at The Ohio State University, who reviewed an early draft of this essay and provided comments that helped improve it. We are also grateful to URNet editors Lorenzo Chelleri and Daniel Eisenberg for their enthusiasm regarding the progress of our work.