I was personally saddened to mark the recent passing of a towering intellectual figure who fundamentally changed my approach to clinical (and administrative) decision-making. Daniel Kahneman, who died on March 27 at the age of 90, won the Nobel Prize in Economics in 2002 for psychological research into how individuals make financial choices. His research was conducted with his longtime friend and colleague Amos Tversky, who died before the prize was awarded. Their prospect theory, and their studies showing the dominance of loss aversion, the finding that people weigh potential losses more heavily than equivalent gains and will act more readily to prevent losses than to achieve gains, contradicted more than 200 years of acceptance of Adam Smith’s “invisible hand,” in which markets were supposedly guided by collections of individuals making rational economic choices. Kahneman and Tversky’s research gave birth to the field of behavioral economics.
But it is Kahneman’s best-selling 2011 book Thinking, Fast and Slow that has the most relevance for health care. In it, he describes the two systems of thinking humans use in everyday life. The first, System 1, is our default: it produces quick decisions through simple pattern recognition and mental shortcuts – heuristics – and relies heavily on emotional responses and intuition. The second, System 2, is used less frequently and is slow, deliberate, analytical, logical, and unemotional. It is precisely the kind of thinking that gave rise to the Scientific Revolution.
For those in health care, System 1 thinking allows us to get through a busy clinic or round on a busy ward, but it can give rise to diagnostic and therapeutic errors, since such thinking is replete with availability, anchoring, and confirmation biases. These cognitive biases prevent the careful interpretation of multiple, nuanced, occasionally contradictory facts and the consideration of uncommon diagnoses and complex treatment choices. Heuristics lead to acceptance of erroneous but “easy-to-reach” conclusions, which are then held on to tenaciously despite mounting contrary evidence. For physicians, nurses, pharmacists, and other providers, such thinking can lead to medical errors and patient harm and can add to wasteful spending. It also erodes public trust and fuels frivolous malpractice lawsuits.
For public health experts, System 1 thinking leads to premature acceptance of, and a refusal to give up on, ultimately unwise policies. Sandro Galea, dean of the Boston University School of Public Health, notes that such thinking conflates truth and belief. He argues in his book Within Reason that when this happens, one no longer challenges tenuously supported new theories but instead adopts them as personal beliefs and views any contrary opinions or data as a personal attack and a violation of one’s deepest principles.
We saw the conflation of belief and truth multiple times during the height of the Covid pandemic, when critics of masking, social distancing, lockdowns, and school closures were viewed as ignorant, reactionary Luddites. But in retrospect masks were, in fact, relatively ineffective except among those properly using fitted N95 versions. Moreover, the precise social distance that dramatically reduced viral transmission was never firmly established, yet this did not stop experts from admonishing those not observing the various prescribed distances, even at outdoor events, where such distancing was not needed. Lockdowns continued for months in some cities and for years in some countries, yet durations of more than a few weeks produced economic and psychological harms in excess of the short-term benefits of avoiding exhaustion of PPE and hospital acute care resources. Perhaps the worst abuse, as Galea points out, was school closures. In some locales, these lasted two years, causing developmental and educational harm to an entire generation of affected children for de minimis health benefits. Covid also unmasked the growing demand for a society with zero risk, leading to global restrictions on individual freedom when judicious protection of the most at-risk individuals would have sufficed.
We still see evidence of such belief-driven, zero-risk thinking in some public health officials’ dismissal of the benefits of natural immunity in favor of global use of an endless series of progressively less effective mRNA vaccines, instead of their selective use in high-risk individuals. There is also evidence that some public health officials may have overestimated the benefits and downplayed the risks of these vaccines in certain populations.
But perhaps the greatest casualty of System 1 thinking by public health policy makers and practitioners has been a global loss of trust in their pronouncements – even when those pronouncements are sound and evidence-based, as recent CDC recommendations have been. We see its sequelae on the right, in the generation of often wild conspiracy theories, and on the left, in the stubborn refusal to abandon masking and in periodic warnings that future lockdowns or school closures may be needed.
Restoration of trust will take time and require faithful adherence to the basic principles of the scientific method. It will also require an admission by public health policy makers that mistakes were made, and a full exposition of the lessons learned during the pandemic. As a start, I would strongly urge everyone at USF Health to read Kahneman’s and Galea’s terrific books and never conflate personal beliefs with the truth.