
Practice: Diagnosis in General Practice

Iterative diagnosis

BMJ 2009; 339 doi: https://doi.org/10.1136/bmj.b3490 (Published 22 September 2009) Cite this as: BMJ 2009;339:b3490
  1. Geoff Norman, assistant dean, programme for educational research and development1,
  2. Kevin Barraclough, general practitioner2,
  3. Lisa Dolovich, research director, department of family medicine1,
  4. David Price, chair, department of family medicine1
  1. 1Faculty of Health Sciences, McMaster University, 1200 Main St W, Hamilton, ON, Canada L8N 3Z5
  2. 2Painswick GL6 6TY
  1. Correspondence to: G Norman, Program for Educational Research and Development, Room 3510, MDCL, Faculty of Health Sciences, McMaster University, 1200 Main St W, Hamilton, ON, Canada L8N 3Z5 norman@mcmaster.ca

Strategies for improving the pattern recognition involved in making a correct diagnosis amount to forcing yourself to use analytical reasoning; the companion article on the diagnosis of vertigo (doi:10.1136/bmj.b3493) provides an example.

What is iterative diagnosis?

The traditional model of diagnosis is one of initial collection of information in the history and examination, followed by deductive steps to reach a diagnosis. We suspect that most clinicians do not recognise or use this process.

A more realistic model was formulated by Elstein and Schwarz 25 years ago.1 It was called the hypothetico-deductive model, but we will call it the process of iterative diagnosis. This model recognises that clinical reasoning usually involves the clinician generating one or more possible hypotheses early on in the consultation (often, but not always, by pattern recognition2) and then recurrently—iteratively—testing these. Clinicians use many such shortcuts (heuristics) in clinical reasoning. This is not a fault: the shortcuts are typically correct and allow them to arrive at a working diagnosis with the minimum of delay, while avoiding excessive testing and anxiety. Exhaustive data collection without hypotheses—the medical student’s history and examination—usually does not improve diagnostic accuracy and may make it worse.

The initial steps in the process of making a diagnosis are therefore often non-analytical or intuitive.2 The initial hypothesis (the limited list of possible diagnoses) is often formulated before much data collection has occurred—from the “eyeball” impression as the patient walks in or as he or she is speaking.3 The hypothesis is then tested by careful and systematic gathering of data, weighing the elicited information against mental rules, the process referred to in the literature as analytical reasoning.

Research in clinical reasoning is moving to a consensus that both analytical and non-analytical processes operate simultaneously in problem solving and that the clinician relies to a greater or lesser degree on one or the other, depending on experience, familiarity with the problem, and the stage of the diagnostic process.4 It is clearly not always a case of pattern recognition and, faced with difficult problems, clinicians may revert to “basic principles,” where they reason the problem out from a mechanistic, physiological model.4

When is iterative diagnosis used?

General practitioners will recognise that they often formulate one or more presumptive hypotheses as the patient walks into the room or starts speaking: a “hangdog” demeanour suggests depression; a unilateral stiff-arm gait suggests parkinsonism; the acute onset of vertigo when rolling over in bed suggests benign positional vertigo. The general practitioner then listens to the history through the “filter” of the initial hypothesis: does the patient describe low mood, agitation, sleep disturbance? Does he or she get stuck when rolling over in bed; has he or she noticed that the fingers of the non-swinging arm are clumsy; does the vertigo come on with head movement and become less severe over a minute or so?

The general practitioner’s examination will usually be directed towards supporting or refuting a hypothesis: on direct questioning does the patient admit to anhedonia and pessimism? Does he have lead pipe rigidity of the arms? Does the Hallpike test have positive results?

Sometimes intuition applies a brake to the reasoning in the recurrent (iterative) testing: despite the coherent illness narrative, something doesn’t fit. Although the doctor has seen a hundred febrile children with sore throat, something “just doesn’t feel right” about this one. This may prompt referral or early review.

How does iterative diagnosis go wrong?

This process (of simultaneous intuitive and analytical clinical reasoning) is usually invisible. It is so inherent to the clinician that he or she will scarcely be aware of using it. That it is occurring at all is usually apparent only when it fails.

When diagnoses are missed it is usually assumed that they have been missed because of inadequate data collection. No doubt, some errors are a consequence of poor data gathering. One study in particular found that the dominant cognitive bias that resulted in diagnostic error was premature closure.5 However, the missed diagnosis may not be a consequence of sloppiness or inadequate attention to detail; instead, the critical data are often missed simply because the clinician was not thinking of the correct diagnosis. Although clinicians gather less data as they gain experience, this does not seem to have a negative effect on diagnostic accuracy.6

The “error” may more often be due to one or more common, recurrent cognitive biases. We clinicians, like other people, use cognitive heuristics that may occasionally lead us astray by biasing our weighting of evidence; these include confirmation bias—gathering information that will confirm rather than refute the diagnosis; availability bias—relating the case to easily recalled examples; premature closure—arriving at a conclusion before gathering the critical data and not revisiting it. Other examples of cognitive biases are the framing bias—being swayed by the way in which the problem is phrased, and base rate neglect—forgetting that common diseases are common.7 Many of these biases may affect the hypotheses that are initially considered, or the data gathered to support these hypotheses.

Further, there is some evidence that age, independent of experience, leads to increasing use of non-analytical reasoning, which increases reliance on early data and reduces willingness to re-examine the diagnosis in the light of new, conflicting data.8

How can we improve?

Awareness of these common pitfalls in our heuristic reasoning may help us to avoid them. Although some familiarity with the nature of these biases is probably useful, specific and simple strategies can lead to real improvement.9 10 The common denominator in these strategies is encouraging the clinician to re-examine the data and reconsider the formulation, a process we call iterative diagnosis. It amounts to deliberately forcing yourself to use analytical reasoning. Six strategies are useful.

Routinely second guess

We can remind ourselves to routinely consider alternatives to our initial diagnosis: “What can I not afford to miss? Am I sure that this person’s red painful foot is cellulitis rather than critical ischaemia?”

Seek data that would not fit with the hypothesis

We can specifically go after signs or symptoms that would be inconsistent with the diagnosis and suggest alternatives—such as facial weakness in benign positional vertigo, or an explosive onset of headache in migraine.

Reframe when recording

We can consciously re-examine the history as we write the notes. Do not merely record a history that fits in with the hypothesis—“pulsating unilateral headache with nausea and visual aura”. Consider that the “framing” may be misleading—the history of nausea was elicited in response to a direct question (“yes, a bit”); the term pulsating was never used; the “aura” was momentary visual “greying.”

Reconsider dissonant facts

We can review and re-examine facts that don’t quite fit—perhaps this headache is far worse than any migraine the patient has ever had and she reports slight neck stiffness.

Know about test accuracy

We can become more familiar with test accuracy: an earlier, normal cervical smear does not exclude cervical cancer; a raised serum urate concentration does not mean arthralgia is due to gout. This may also involve being aware of pretest and post-test probabilities, the subject of an article later in the series. Negative tests (for example, a D dimer blood test or an exercise electrocardiogram) often do not adequately rule out disease in patients with high pretest probabilities, whereas they may rule out disease in those with low pretest probabilities.
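To illustrate this point, the sketch below works through the arithmetic using the odds form of Bayes’ theorem (post-test odds equal pretest odds multiplied by the likelihood ratio). The negative likelihood ratio of 0.1 and the pretest probabilities used here are illustrative assumptions chosen for the example, not figures taken from this article or from any particular study.

```python
# Illustrative sketch: why a negative test rules out disease only when the
# pretest probability is already low. Uses the odds form of Bayes' theorem:
#   post-test odds = pretest odds * likelihood ratio
# The negative likelihood ratio of 0.1 is an assumed, illustrative figure.

def post_test_probability(pretest_probability, likelihood_ratio):
    """Convert a pretest probability to a post-test probability via odds."""
    pretest_odds = pretest_probability / (1 - pretest_probability)
    post_test_odds = pretest_odds * likelihood_ratio
    return post_test_odds / (1 + post_test_odds)

NEGATIVE_TEST_LR = 0.1  # assumed likelihood ratio of a negative result

for pretest in (0.10, 0.40, 0.80):  # low, intermediate, and high pretest probability
    post = post_test_probability(pretest, NEGATIVE_TEST_LR)
    print(f"pretest {pretest:.0%} -> post-test {post:.1%} after a negative test")

# Approximate output:
#   pretest 10% -> post-test 1.1% after a negative test   (disease effectively excluded)
#   pretest 40% -> post-test 6.2% after a negative test
#   pretest 80% -> post-test 28.6% after a negative test  (far too high to exclude)
```

With an identical negative result, the patient whose pretest probability is low ends up with disease effectively excluded, whereas the patient whose pretest probability is high retains a residual probability far too large to rely on the test alone.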

Use time as a diagnostic test

Appropriately timed follow-up (as occurred in the companion article on vertigo11) may also allow the general practitioner to review the diagnosis and separate minor and time limited conditions from potentially more serious problems.12

Conclusion

Iterative diagnosis is an essential component of medical expertise. It involves the rapid, simultaneous generation and testing of hypotheses. It is usually fast, efficient, and accurate. When errors in diagnosis occur they are often due to one or more of a set of predictable cognitive errors, rather than to carelessness or lack of knowledge. Simple strategies can increase awareness of potential pitfalls and reduce errors.

Key points

  • Clinical diagnosis uses two kinds of thinking processes: analytical (logical, intensive, careful) and non-analytical (unconscious, rapid, contextual)

  • Errors can result from both pathways

  • Diagnostic errors tend to be a consequence of cognitive biases, rather than deliberate sloppiness or omissions

  • To reduce errors, simple strategies may encourage clinicians to “iterate” or reconsider the diagnosis—for example, by routinely second guessing (“This unilateral headache in a 62 year old woman may be migraine, as she thinks, but what can I not afford to miss?”)

Notes

Cite this as: BMJ 2009;339:b3490

Footnotes

  • This series aims to set out a diagnostic strategy and illustrate its application with a case. The series advisers are Kevin Barraclough, general practitioner, Painswick, and research fellow in community based medicine, University of Bristol; Paul Glasziou, professor of evidence based medicine, Department of Primary Health Care, University of Oxford; and Peter Rose, university lecturer, Department of Primary Health Care, University of Oxford.

  • Contributors: GN wrote the paper, incorporating comments and suggestions. KB contributed clinical material and took lead on a major revision. LD reviewed drafts and made suggestions. DP reviewed drafts, made suggestions, and offered clinical perspective. PR made suggestions for improvement. PG helped in initial formulation and made suggestions for further revisions. GN is guarantor.

  • Competing interests: None declared.

  • Provenance and peer review: Commissioned; externally peer reviewed.

  • Patient consent not required (patient anonymised, dead, or hypothetical).

References