Improving patient safety: Overcoming clinical biases and misperceptions

by Nancy Fliesler on November 3, 2011

Context can create bias: Squares A and B are the same shade of gray (created by Edward Adelson, professor of vision science, MIT)

Before you read this post, look at squares A and B in the image to the left. Which is darker?

Next, answer the following questions:

  1. A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost? 
  2. If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets? 
  3. In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake? 

Did your mind leap to these quick answers — 10 cents, 100 minutes, 24 days?

Such errors on this Cognitive Reflection Test are quite common, and not so different from the lapses in thinking that underlie medical errors. The mind leaps to the quick intuitive answer, and rarely goes back to check its work.

“In medicine, most errors occur in the intuitive mode of thinking,” said experimental psychologist and emergency medicine physician Patrick Croskerry earlier this week.

Croskerry, a professor at Dalhousie University in Nova Scotia, spoke at the Risky Business conference, presented by the Program for Patient Safety and Quality at Children’s Hospital Boston.

Doctors pride themselves on their medical intuitions – they’re fast, compelling and mostly serve them well. But they can occasionally be catastrophic. A patient with a history of constipation goes to an emergency department with abdominal pain, and is given a new kind of laxative and sent home. Later, he dies from an abdominal aortic aneurysm. A 19-year-old woman is treated for pneumonia, when she really has a pulmonary embolism. A man in his 40s with heart failure is diagnosed with asthma.

It’s partly the mind being “comfortably numb,” Croskerry said: it grabs fragments of information and pieces them together into a quick, “obvious” conclusion, thinking it sees a familiar pattern even when the facts, closely examined, don’t really fit that pattern.

And it’s partly bias – bias toward events that are more easily brought to mind, perhaps based on experience with a previous case, and frank stereotyping of patients. David Ropeik, a consultant in risk perception and risk communication who also spoke at Risky Business, calls these kinds of biases “mental shortcuts for decision-making.” They let us make sense of partial information quickly and turn it into the pattern we call “judgment.”

Rational thinking takes much more effort, and biases can flourish under pressure, as in a busy clinic or overworked ED.  “Cognitive biases are invisible – people identify them after the fact,” Croskerry said. “The challenge of de-biasing is a big one, but it’s got to be done. It’s an absolute imperative in medicine.”

Interestingly, clinical biases can be overcome with training. Intelligent tutoring systems, software systems with built-in artificial intelligence, can give clinicians feedback about cognitive errors. As clinicians interact with the system, making decisions about hypothetical cases, it identifies biases such as overconfidence (a miscalibration of one’s own sense of accuracy) and anchoring (a tendency to fixate on the patient’s initial presentation too early and not adjust that initial impression in light of new information) and points them out to the clinician.
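As a rough illustration (ours, not a description of any actual tutoring system), an anchoring check inside such software might look something like this sketch in Python:

```python
# Toy illustration only -- not any real tutoring system. Flag possible
# anchoring when the working diagnosis never changes even though later
# findings contradict the initial impression.

def flags_anchoring(working_diagnoses, contradictory_findings):
    """working_diagnoses: the clinician's diagnosis after each new piece of
    information in a hypothetical case.
    contradictory_findings: how many of those findings were inconsistent
    with the first diagnosis."""
    never_revised = len(set(working_diagnoses)) == 1
    return never_revised and contradictory_findings >= 2

# Hypothetical case echoing the example above: the diagnosis stays "asthma"
# despite two findings that point toward heart failure instead.
print(flags_anchoring(["asthma", "asthma", "asthma"],
                      contradictory_findings=2))  # True -> prompt the clinician
```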

Simulator programs for medical education can also include bias in their drills – giving clinicians and teams a chance to practice not just technical skills, but their ability to think clearly under pressure and speak up when superiors give directives that don’t seem to make sense. When real crises or near-misses occur at Children’s, for example, they’re often modeled in the Simulator so that others can learn from them.

Or maybe bias can be overcome by harnessing a concept in risk perception: Appeal to the brain’s evolutionary first responder, the amygdala, which sets off the fight-or-flight response. “Fear factors” cause us to perceive a risk to be greater than it really is.

So to make a concept stick, Ropeik suggested during the Q&A, we may need to speak to the amygdala, presenting scary case studies that make doctors realize “you could make a mistake and kill somebody.”  In other words, scare the bejeebers out of them. “We are hard-wired to give more emphasis to how things feel than the facts,” said Ropeik.

Click here for more on Risky Business and to view videos of previous years’ presentations. Oh, and by the way, the correct answers to the above questions are: 5 cents, 5 minutes and 47 days.
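If your intuition balked at those answers, it takes only a few lines to check the arithmetic. Here is a quick sketch in Python (purely illustrative, not part of the original puzzles):

```python
# 1. Bat and ball: the ball costs x, the bat costs x + 1.00, together 1.10.
#    x + (x + 1.00) = 1.10  ->  2x = 0.10  ->  x = 0.05
ball = (1.10 - 1.00) / 2
print(f"ball costs ${ball:.2f}")   # $0.05, not $0.10

# 2. Widgets: 5 machines make 5 widgets in 5 minutes, so each machine makes
#    one widget every 5 minutes; 100 machines making 100 widgets still take 5.
print(5)                           # 5 minutes, not 100

# 3. Lily pads: the patch doubles every day, so it covered half the lake
#    one day before it covered all of it.
print(48 - 1)                      # day 47, not 24
```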

1 comment

  • Cara Gillotti

    Here’s a link to the proof for the Adelson image: http://persci.mit.edu/gallery/checkershadow/proof
