by Allen Weiss, MD, MBA, FACP, FACR
President and CEO, NCH Healthcare System
We all like to believe that whatever we believe is true. That way, our knowledge leads us to behaviors that are helpful and positive for ourselves and the societies around us.
But sometimes we have the disquieting experience of objectively examining our knowledge and discovering that our preconceived notions encouraged us to identify “facts” to support our conclusion—even though, on closer examination, we find we have erred in many ways.
A recent book popular in the business community examines why we make decisions with real consequences for the future and, most importantly, what thought process lies behind these “rational” actions.
The book is titled Superforecasting by Philip Tetlock, a University of Pennsylvania and Wharton School Professor of Psychology and Political Science, and Dan Gardner, a journalist and author of Risk: The Science and Politics of Fear and Future Babble: Why Pundits Are Hedgehogs and Foxes Know Best.
Let’s start with a historical medical example of “confirmation bias.” Did blood-letting help or harm President George Washington?
Sadly, when George Washington fell ill in 1799, his esteemed physicians bled him relentlessly, dosed him with mercury to cause diarrhea, induced vomiting, and raised blood-filled blisters, according to historical accounts. At the time, I’m sure the conclusion was that the President was simply too sick for the treatment to overcome the illness.
The second-century physician Galen influenced the development of various scientific disciplines, including anatomy, physiology, pathology, pharmacology, and neurology, as well as philosophy and logic. He influenced generations of physicians and was an indisputable source of authority. “It is I, and I alone, who has revealed the true path of medicine,”
Galen wrote, with his characteristic lack of modesty. Obviously, he was untroubled by doubt.
Each medical outcome confirmed that Galen was right, no matter how equivocal the evidence might look to someone more objective than the master. “All who drink of this treatment recover in a short time, except those whom it does not help, who all die. It is obvious, therefore, that it only fails in incurable cases.”
Fast forward to today’s evidence-based medicine, focused on what is important to the patient. Today’s physicians and non-physician providers ideally listen more than they talk, use proven therapies based on extensive clinical studies, and offer comfort when cure is not possible.
We are often tempted to make the facts fit our conclusions. But with so much more transparency in the digital age, everyone has access to better information that can improve health.
Confirmation bias extends across many disciplines. Financial markets, with their booms and busts, are perfect examples of smart folks with a selfish point of view misrepresenting “facts” to further their own interests.
The December 2016 Harvard Business Review carries an article entitled “The Overvaluation Trap.” It tells the story of the $571 billion that big oil has poured into prospecting for new reserves.
Current proven reserves contain almost two trillion barrels’ worth of oil, a 53-year supply ready to tap. The upsetting aspect of this continued search for and use of oil: if companies are to sell what they already have, the world will have to burn carbon at a rate that will damage our planet’s ecosystem so much that economic growth itself (and therefore the future need for oil) will be compromised.
And yet, oil executives believe they are sensibly pursuing a sound asset-acquisition strategy. Oil companies have profited in the past by finding new reserves; that much is true. But the newer fact, that we already have more than enough oil, goes unused because it runs counter to the executives’ goals.
Again, there is a confirmation bias: the belief that finding reserves creates value ignores the fact that we have too many reserves already and need new forms of energy production.
Let’s return to the world of medicine. Even when cures are found, changing behavior is difficult.
Consider the discovery that Vitamin C is the treatment for scurvy. Calling Englishmen “limeys” has its root in the days when Great Britain had the greatest navy in the world. In the late 1700s, however, the British navy had its own plague.
After sailors had been on board for months, their cuts would not heal, their teeth loosened, they bruised easily, and eventually mortality was high. They suffered from scurvy.
Interestingly, as soon as they hit land and had a regular diet, the signs of disease quickly cleared.
Dr. James Lind, a British ship’s doctor, took twelve sailors suffering from scurvy and divided them into pairs, giving each pair a different treatment: vinegar, cider, sulfuric acid, seawater, bark paste, or citrus fruit.
You know the results. The citrus fruit containing Vitamin C was the cure. Limes (which travel well) were added to the diet of the seamen. Lager and lime became a popular drink. But it took another few decades to get the entire navy on board. So even when you have the right answer, making change can be difficult.
Believing that things will get better while ignoring reality is the essence of confirmation bias. Certainly we desire optimistic forecasts and enjoy being around positive, engaged, and satisfied friends, colleagues, and family members.
But we need to be realistic in our assessments and avoid confirmation bias if we want to succeed as individuals or as a society.