Getting it right

How doctors in training can learn to avoid making the wrong diagnosis
By Greg Breining

Diagnosing illness is one of the most important things doctors do. Yet they get it wrong a small but significant part of the time.

“We are not nearly as good at diagnosis as we thought we were,” says Andrew Olson, M.D., assistant professor of medicine and pediatrics in the University of Minnesota Medical School. “We produce people who are right 85 to 90 percent of the time in the hospital and 95 percent of the time in outpatient settings. That’s not good enough.”

To improve doctors’ skills, Olson has been spearheading efforts in the Medical School’s undergraduate and graduate programs to teach diagnosis more systematically and comprehensively.

“In my opinion, the most promising strategy is to improve education,” he says. “If I teach a student who is then going to practice for 40 years and see thousands and thousands of patients, the impact potentially is huge.”

Though fundamental, diagnosis is one of the most complex things a doctor does; it’s based on a combination of subjective information, objective tests, rational thought, and no small amount of intuition that defies easy explanation. 

“Until recently, we have not understood terribly well the processes we use to make diagnoses,” says Olson. “When we don’t understand the process very well, how do we teach it?”

As a result, diagnosis has been taught more implicitly than explicitly. Young doctors learn to take medical histories and then follow more experienced mentors who impart — clearly or not — their own diagnostic methods. That sort of learning leads to uneven results. And, research shows, reliance on intuition and mental shortcuts at the wrong times leads to errors. 

Cognitive bias

Medicine has treated knowledge as the basis of good diagnosis. But, as Olson and colleagues recently noted in the journal Diagnosis, “knowledge is necessary but not alone sufficient.”

“The reasons I err aren’t necessarily that I didn’t know a fact,” says Olson. “If you look at what gets missed in the hospital, it’s things like sepsis, strokes, heart attacks. What gets missed in outpatients? Cancer, most commonly. Why is that? We know a lot about all those things. It’s not a knowledge problem alone. It’s that I put the pieces together the wrong way.”

That happens because doctors rely on intuition when circumstances require more deliberation, explains Pat Croskerry, M.D., who leads the Critical Thinking Program in the Division of Medical Education at Dalhousie University in Halifax, Nova Scotia. 

“In an emergency setting, I can see one problem after another, and I can function at an almost intuitive level and say this almost always turns out to be that, and that’s how I’m going to treat it,” says Croskerry, a coauthor of the Diagnosis paper. “A patient may come in who is constipated. Constipation is pretty straightforward in 99 percent of cases.” A laxative solves the problem. But infrequently, constipation may be caused by a growing tumor or neurological problem. 

Doctors must “be alert to the possibility that what you’re seeing may not be routine,” says Croskerry. “That’s the real trick in being a good diagnostician — keeping an eye on what your intuitive system is doing.”

Writing in the New England Journal of Medicine, Croskerry described a patient who came to the hospital with stab wounds to his arm, back, and chest. After ascertaining that the wound to his back did not penetrate the chest cavity, doctors patched him up and sent him home. Five days later, the man went to another hospital in the area complaining of vomiting and blurred vision. A CT scan revealed a knife wound several inches into his brain. 

“Anchoring bias” led the resident who examined him to focus on the most obvious injuries, to the exclusion of the head wound. “Even though you think you may be being vigilant and careful, you can still get caught on these biases,” says Croskerry.

Scientists have written about cognitive biases for 40 years, but medicine has largely overlooked the research because bias is so intangible (see sidebar).

“It’s something that goes on between the physician’s ears,” says Croskerry. “Anybody observing my behavior from the outside doesn’t really know what my brain was going through when I made that decision. But cognitive scientists do. And medicine really hasn’t embraced cognitive science and asked questions about how we think.”

Better strategies

Olson agrees that cognitive shortcuts lead to errors, but he questions whether awareness of bias alone will lead to dramatic improvements.

“Maybe that is the problem, but addressing it might not be the solution,” he says. “I think what we’re all working on now is what strategies may be helpful to avoid some of those cognitive errors? We all know the worst strategy in the whole world is to ‘try harder.’”

What might work?

Developing skills in “reflective practice” is one possible strategy, says Olson. Making a grid of possible diagnoses — “What’s for it? What’s against it?” — can force a physician to reason more deliberately. For a patient with chest pain, for instance, such a grid might weigh the evidence for and against a heart attack, a pulmonary embolism, and acid reflux before settling on any of them.

Another is feedback. “The way training is now, a student or resident admits a patient at night, they go home, they don’t find out what happened to that patient. So they think they’re right. The human brain is wired to believe we’re correct. Being right and being wrong feel exactly the same until someone tells you you’re wrong,” says Olson. The Medical School participated in a recent study with six institutions to bring feedback into training programs. “We found that the diagnosis changed about 40 percent of the time — a lot.”

Doctors should also be willing to admit they don’t know. Pressure to diagnose leads to “premature closure” and can stymie further inquiry, says Olson. “I think a lot of the reason we label stuff early on is because we are uncomfortable saying, ‘I don’t know, but I’ll be with you.’ That’s probably one of the most important things that we can do as clinicians.”

Teamwork can improve diagnosis as well. Medical teams, which are often rigidly hierarchical, can learn from aviation, where everyone from pilots to copilots to flight attendants is expected to speak up if something doesn’t seem right.

Woven into the curriculum

A more systematic approach to teaching diagnosis has been creeping into medical school curricula as practitioners and faculty realize the shortcomings of learning diagnosis strictly by doing. 

“The notion even that there was a science around clinical diagnosis is a relatively new concept,” says Robert Englander, M.D., associate dean for undergraduate medical education. “It’s moving from an implicit approach to an explicit approach around clinical reasoning. Only in the last 10 to 15 years has there been an exploding focus on patient safety and quality. The attention to clinical diagnostic reasoning has sprung out of that movement.”

To help students develop these skills, the Medical School weaves in lessons in diagnostic thinking right from the start and throughout the undergraduate and graduate experiences, says Englander. 

A course called Foundations in Clinical Thinking spans the first two years of medical school and has students work through clinical problems and generate differential diagnoses. At the Medical School’s Duluth campus, a problem-based learning thread during the first two years helps students think about diagnosis.

There’s also a greater emphasis on teamwork, Englander says. Medical students collaborate with students in nursing, pharmacy, dentistry, and public health.

In Olson’s Advanced Physical Diagnosis course, fourth-year students study research literature to evaluate which aspects of the physical examination are most useful in making diagnoses. Erica Levine, M.D., an internal medicine resident, says the course developed her physical examination skills and her ability to critically evaluate medical evidence.

“We definitely looked a lot at uncovering some myths the medical community holds true,” says Levine. “We focused a lot on the data behind different findings.”

“I think the main takeaway is to think critically, because that separates doctors from computers,” adds Gretchen Colbenson, M.D., who also took the class. “One of my biggest takeaways is that diagnoses are much grayer than anyone would tend to believe before any medical training.”

The Medical School is looking for better tests to gauge how well students and residents are learning. Multiple-choice exams are “a pretty bad indicator of diagnostic ability,” Olson says. “We need to move to something called a workplace-based assessment, which is our ability to actually assess a learner’s ability to do something in a real-world environment with real-world patients. A lot of that is focused on the process they use.”

Olson is particularly proud of the pediatrics residency program, which pairs resident doctors with role-playing patients in ambiguous situations. Residents explain the diagnostic process to patients and discuss why a diagnosis may have to wait. Additionally, faculty discuss errors they have made in diagnosis, “which is really a powerful thing to hear,” he says.

To minimize such errors, the Medical School will continue to hone its curriculum aimed at strengthening diagnostic skills, develop assessment tools to gauge students’ abilities, and conduct research to determine how well students are learning. 

“There is no one answer and there are no easy answers,” says Olson. “If there was an easy answer, we would have found it already.”

Published on April 3, 2019

Lead illustration: Dan Woychick

What is bias?

Bias influences how and why we make decisions — and it’s no different for doctors. According to a report by Pat Croskerry, M.D., in Academic Medicine, common cognitive biases that affect physicians’ diagnoses include:

  •  Availability bias, which occurs when recent similar examples (immediately available in the physician’s memory) unduly influence a decision about a current patient’s diagnosis.
  •  Gambler’s fallacy, in a sense the opposite of availability bias, which occurs when a physician has seen a run of patients with the same diagnosis and doubts that the current patient fits the pattern, gambling, in effect, that the streak will not continue.
  •  Confirmation bias, which occurs when a physician forms an early opinion about a patient’s diagnosis and consequently looks for and favors evidence that confirms that diagnosis.
  •  Anchoring bias, which occurs when a physician locks on to particular symptoms and signs in a patient’s initial presentation and fails to adjust that initial impression as new data become available.