ABSTRACT
Clinical reasoning and decision-making are essential to medical practice, where poor clinical judgement can lead to serious diagnostic errors, suboptimal use of diagnostic tests and investigations, flawed management strategies and, ultimately, adverse patient outcomes. In real-world clinical environments, clinicians often rely on intuitive judgements for decision-making, owing to natural proclivities for pattern recognition and retrieval of pre-existing illness scripts, as well as the practical necessity of working efficiently in fast-paced environments with high patient volume and cognitive load. Yet an over-reliance on intuitive judgements can result in cognitive errors. While the biases associated with heuristic or intuitive thought processes are widely discussed in the literature, there is also a lesser-known entity, noise, that affects the consistency and reliability of clinical decision-making. In this article, we highlight the importance of teaching foundational principles of clinical reasoning, and of understanding how cognitive errors happen, during medical training, and suggest a series of educational pedagogies and workplace-based interventions that could help to cultivate and optimise medical decision-making in real-world practice.
In 1999, the US Institute of Medicine released a landmark report, “To Err is Human: Building a Safer Health System”, which highlighted the high prevalence of preventable medical errors1 and led to concrete measures that significantly improved patient safety standards. Two decades later, there has been a call to address the epidemic of diagnostic errors,2,3 with reports that nearly 800,000 Americans suffer disability or death annually due to misdiagnoses.2 Apart from diagnostic errors, clinicians also often make poor decisions in other areas of medical practice, such as ordering inappropriate investigations,4 making unnecessary subspecialty referrals,5 and lacking person-centred clinical management,6 all attributable to suboptimal clinical reasoning and decision-making skills in real-world practice environments.7 As internal medicine physicians from different subspecialties who are heavily involved in both undergraduate and postgraduate medical education, we have realised that the present focus on developing foundational content knowledge, clinical skills and expertise within the confines of a competency-based medical education system is, by itself, inadequate in helping trainees navigate the complexities of real-world clinical decision-making. Therefore, in this article, we discuss the importance of teaching and learning foundational principles of clinical reasoning, and of understanding how cognitive errors happen. We also propose educational pedagogies and workplace-based interventions that could help to cultivate and optimise physicians’ decision-making in real-world practice.
Clinical reasoning and sources of error in medical decision-making
Clinical reasoning can simply be defined as the thinking and decision-making processes associated with clinical practice,8 which involves systematically observing, collecting, analysing and interpreting patient information to arrive at a diagnosis and management plan.9 Yet this process is highly influenced by one’s cognitive capacity, capabilities, habits, biases and limitations, emotional/affective overlay, and other external influences,7 thereby resulting in heterogeneous standards of clinical decision-making in real-world practice.
Conventionally, clinical decision-making can be understood through the traditional 2-system model of cognition, which comprises a fast-paced, unconscious, pattern recognition-based intuitive mode of thinking (system 1) and a slower, deliberate, more analytical form of cognition (system 2).3,10,11 In practice, intuitive thinking remains, by far, the predominant mode of clinical reasoning because it is more naturalistic (clinicians are trained diagnosticians), practical (in busy, high-acuity clinical environments) and potentially more accurate in complex or uncertain clinical situations (i.e. less-is-more by excluding irrelevant cues).12 In fact, physicians are fundamentally intuitive Bayesian diagnosticians who subconsciously form probabilistic estimates of the likelihood of disease and treatment success in every patient encounter, based on clinical and investigative cues, and constantly refine and recalibrate these estimates with new incoming environmental signals.13
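The Bayesian updating described above can be made concrete with the odds form of Bayes’ theorem, in which post-test odds equal pre-test odds multiplied by the likelihood ratio of a clinical finding or test result. The following sketch is illustrative only; the probability and likelihood-ratio values are hypothetical and not drawn from the cited literature.

```python
def prob_to_odds(p: float) -> float:
    """Convert a probability to odds."""
    return p / (1 - p)

def odds_to_prob(odds: float) -> float:
    """Convert odds back to a probability."""
    return odds / (1 + odds)

def update_probability(pretest_prob: float, likelihood_ratio: float) -> float:
    """Bayes' theorem in odds form: post-test odds = pre-test odds x LR."""
    return odds_to_prob(prob_to_odds(pretest_prob) * likelihood_ratio)

# Hypothetical example: 10% pre-test probability, positive test with LR+ of 9.
# Pre-test odds = 0.1/0.9; multiplied by 9 gives odds of 1, i.e. 50% probability.
posttest = update_probability(0.10, 9)
print(round(posttest, 2))
```

A negative result works the same way with the negative likelihood ratio (LR−), shrinking rather than growing the odds; this mirrors how clinicians intuitively revise their suspicion up or down as cues accumulate.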
Nonetheless, intuitive clinical judgements can often lead to decisional errors, which are attributable to the presence of cognitive bias and noise.14
Cognitive bias in clinical judgements refers to uniformly skewed patterns of systematic error arising from common cognitive dispositions to respond.14,16 To date, more than 100 cognitive biases have been described in clinical practice.17 For example, anchoring bias occurs when physicians quickly latch on to key features early in the encounter and focus further evaluation purely on proving or disproving the generated hypotheses, which can lead to premature diagnostic closure.11 Similarly, confirmation bias happens when physicians pay selective attention to clinical cues that support their provisional diagnosis while neglecting or minimising conflicting evidence.11 In availability bias, physicians may overestimate the likelihood of diseases that are easily recalled, such as conditions they have recently treated, whereas in representativeness bias, atypical disease manifestations may be neglected due to a tendency to look for prototypical illness patterns.11 In recent years, implicit biases, which are inherent stereotypes against certain patient populations (such as marginalised persons, psychiatric patients, drug or alcohol abusers, and those with health-seeking behaviours), have been increasingly recognised as a root cause of inappropriate dismissal of valid clinical concerns (i.e. medical gaslighting).11,18 Similarly, visceral bias occurs when physicians’ judgements are clouded or influenced by their emotional responses to the patient encounter, such as the presence of counter-transference or a specific liking or dislike for the patient.11
In contrast, noise refers to an unacceptable degree of practice variability that occurs when different individuals, or the same individual under different circumstances, make significantly different decisions on a similar problem.15 In clinical practice, this not only leads to decisional errors but also causes inconsistent, unreliable medical decision-making that erodes patient confidence.19 Causes of inter-clinician variance include differences in knowledge, expertise, resources, medical training, past/recent experiences, unique clinical consultation or management styles, the subjectivity of the clinical entity being evaluated, and the different weightage given to cue relevance.12,15,19 For example, a previous cohort study of Medicare beneficiaries in the US found significant regional differences in Medicare spending and health services rendered (e.g. inpatient consultations, major/minor procedures, patient disposition) for similar medical conditions.20 Interestingly, there is also intra-clinician variability, where the same physician makes different judgements on the exact same clinical scenario under different circumstances.19 This can happen due to changes in the individual’s physical and mental state, such as fatigue or prevailing emotions,19 as well as other situational/environmental factors.21 For example, a previous study found that decision fatigue led primary care physicians to prescribe more antibiotics for acute respiratory illness towards the end of clinic sessions.22 Overall, in a healthcare system with excessive, unregulated noise, patients will face unacceptably high degrees of variability and inconsistency in clinical practices, management standards and eventual outcomes, depending on where and with whom they seek medical consultation.
Pedagogical and workplace-based interventions to reduce bias and noise
We herein propose pedagogical and workplace-based interventions that may help to mitigate the problems of bias and noise in real-world clinical practice.
First, foundational teaching on Bayesian rules in clinical decision-making14 could potentially mitigate both bias and noise, as it addresses base-rate neglect and inter-clinician variability in decision-making through the use of common first principles. To standardise intuitive thinking when faced with clinical problems, clinicians may make use of simple Bayesian rules of thumb such as testing and treatment thresholds.14 For example, patients with clinical presentations that give rise to a low pre-test probability below the testing threshold would be worked up for alternative diagnoses, whereas those with a high pre-test likelihood of disease crossing the treatment threshold may be empirically started on treatment. For patients who fall between the testing and treatment thresholds, rule-in and rule-out tests may be selected depending on the likelihood of disease and the management intention.
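The threshold logic above can be sketched as a simple decision rule. The numeric thresholds in the usage example are hypothetical; in practice they depend on the risks, costs and benefits of the specific test and treatment under consideration.

```python
def threshold_decision(pretest_prob: float,
                       test_threshold: float,
                       treat_threshold: float) -> str:
    """Classify the next step using the testing/treatment threshold model."""
    if pretest_prob < test_threshold:
        return "below testing threshold: work up alternative diagnoses"
    if pretest_prob > treat_threshold:
        return "above treatment threshold: consider empirical treatment"
    return "between thresholds: select a rule-in or rule-out test"

# Hypothetical thresholds of 5% (testing) and 60% (treatment):
print(threshold_decision(0.02, 0.05, 0.60))  # below testing threshold
print(threshold_decision(0.80, 0.05, 0.60))  # above treatment threshold
print(threshold_decision(0.30, 0.05, 0.60))  # between thresholds
```

The value of such a rule is not the code itself but the shared first principle: two clinicians who agree on the pre-test probability and the thresholds should reach the same decision, which is precisely how this approach reduces inter-clinician noise.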
To improve the accuracy of clinical predictions in an intuitive Bayesian model, medical educational pedagogies can emphasise illness script development through problem-based learning (ideally with real cases contextualised to a local setting), self-explanation, concept mapping and script concordance tests.26 Moreover, in constructing illness scripts, trainees must be specifically taught the base-rate prevalence of disease in specific patient populations, as well as how the presence or absence of clinical risk factors and/or symptoms/signs affects pre-test probabilities.27 Then, to cultivate self-regulated learning and recalibration of internal clinical judgements/predictions, metacognitive practice should be encouraged through think-aloud exercises,28 verbalisation of the propositional confidence of one’s clinical assessments,29 and retrospective chart-stimulated recall sessions30 as part of workplace-based pedagogy.
Over the years, many cognitive debiasing or forcing strategies have been proposed to deal with specific, identified biases in decision-making.16 However, in busy clinical environments, we contend that a heuristics-based debiasing approach is likely more efficient and effective than having to perform deliberate analyses of all possible options. For example, to counter anchoring bias and premature diagnostic closure, simple techniques such as ruling out the worst-case scenario (e.g. dangerous diagnoses, clinical red flags)10 or simple aetiological mnemonics (e.g. Vascular, Infection/Inflammatory, Traumatic, Autoimmune, Metabolic, Iatrogenic/Idiopathic, Neoplastic, Congenital, Degenerative, Endocrine)11 can be adopted. In 2012, Shimizu and Tokuda described a novel pivot-and-cluster debiasing strategy, where physicians come up with an initial intuitive (pivotal) diagnosis based on the clinical presentation and surround this diagnosis with a cluster of differentials with similar manifestations.31 They further advocate the contemplation of diagnostic confidence, where a greater degree of diagnostic uncertainty would lead to expansion of the cluster of differential diagnoses, so that those with lesser diagnostic similarity are also considered.31 To avoid the confirmation bias that leads to tunnel vision, actively framing differentials using factors for and against each diagnosis is a useful countermeasure.11 Last, implicit bias can be mitigated through improved awareness of its existence, reflective practice and even undergoing implicit association tests, which are psychological assessments developed to reveal the subconscious associations and biases that individuals hold towards other people or entities.32
To counter the more pervasive noise (i.e. inconsistencies) in clinical practice, a multi-pronged approach is needed through both pedagogical and workplace-based interventions. First, medical trainees must learn to craft accurate problem statements/representations with the right semantic qualifiers to actively distil pertinent information from irrelevant details (background noise).33 Second, institutional/consensus guidelines, protocols and decision pathways19 are needed to improve the standardisation of clinical decision-making at the local practice level. Third, decision hygiene can be optimised through aggregate judgements,15 such as round-table discussions, departmental grand rounds or meetings to evaluate complex clinical cases. However, the effectiveness of aggregate judgements in optimising decision-making is contingent on the avoidance of groupthink, and requires an open culture of holding objective and frank clinical discussions without fear of reprisal.19 Fourth, to mitigate intra-clinician variability in decision-making due to situational or emotional influences on clinical judgement, cognitive-behavioural self-regulation techniques and external environmental modifications are important. To avoid letting emotional encounters cloud clinical judgement, physicians may practise mindfulness (acceptance of current emotions without pre-judgement) and adopt cognitive-behavioural techniques aimed at reframing negative thoughts and avoiding maladaptive actions. For example, instead of engaging in conflict or avoidance behaviours in a difficult patient encounter, the physician may attempt to directly address the ongoing tension and identify the underlying reason(s) for the patient’s behaviour. Finally, workplace-based interventions such as mandated rest or mental breaks, reduction in work intensity/interruptions, adequate clinical consultation time, and better clinical support systems are important to reduce errors from excessive cognitive load and decisional fatigue.
Behavioural nudges, such as pre-set default options to optimise medication prescriptions,34 can also be useful to facilitate improved decision-making under rushed or stressful situations.
CONCLUSION
In summary, we highlighted the importance of understanding foundational Bayesian principles of clinical reasoning, and sources of cognitive errors in the form of bias and noise. We further discussed a series of practical pedagogical and workplace-based interventions that could strengthen clinical reasoning and fidelity of decisional processes in real-world clinical practice.
REFERENCES
1. Kohn LT, Corrigan JM, Donaldson MS, editors. To Err Is Human: Building a Safer Health System. Washington DC: National Academies Press; 2000.
2. Newman-Toker DE, Nassery N, Schaffer AC, et al. Burden of serious harms from diagnostic error in the USA. BMJ Qual Saf 2024;33:109-20.
3. Phua DH, Tan NCK. Cognitive aspect of diagnostic errors. Ann Acad Med Singap 2013;42:33-41.
4. Scott IA, Crock C, Twining M. Too much versus too little: looking for the “sweet spot” in optimal use of diagnostic investigations. Med J Aust 2024;220:67-70.
5. Ng IKS, Lim SL, Teo KSH, et al. Referring wisely: knowing when and how to make subspecialty consultations in hospital medicine. Postgrad Med J 2024;101:76-83.
6. Moore L, Britten N, Lydahl D, et al. Barriers and facilitators to the implementation of person-centred care in different healthcare contexts. Scand J Caring Sci 2017;31:662-73.
7. Scott IA. Errors in clinical reasoning: causes and remedial strategies. BMJ 2009;338:b1860.
8. Cooper N, Frain J. ABC of Clinical Reasoning. 1st ed. London: BMJ Books; 2016.
9. Hawks MK, Maciuba JM, Merkebu J, et al. Clinical Reasoning Curricula in Preclinical Undergraduate Medical Education: A Scoping Review. Acad Med 2023;98:958-65.
10. Kahneman D. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux; 2011.
11. Ng IKS, Goh WGW, Teo DB, et al. Clinical reasoning in real-world practice: a primer for medical trainees and practitioners. Postgrad Med J 2024;101:68-75.
12. Gigerenzer G. Why Heuristics Work. Perspect Psychol Sci 2008;3:20-9.
13. Lim TK. The predictive brain model in diagnostic reasoning. The Asia Pacific Scholar 2021;6:1-8.
14. Ng IKS, Goh WGW, Lim TK. Beyond thinking fast and slow: a Bayesian intuitionist model of clinical reasoning in real-world practice. Diagnosis (Berl) 2024.
15. Kahneman D, Sibony O, Sunstein CR. Noise: A Flaw in Human Judgment. New York: Little, Brown Book Group; 2021.
16. Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med 2002;9:1184-204.
17. Croskerry P. From mindless to mindful practice—cognitive bias and clinical decision making. N Engl J Med 2013;368:2445-8.
18. Ng IKS, Tham SZ, Singh GD, et al. Medical Gaslighting: A New Colloquialism. Am J Med 2024;137:920-2.
19. Mullins CF, Coughlan JJ. Noise in medical decision making: a silent epidemic? Postgrad Med J 2023;99:96-100.
20. Fisher ES, Wennberg DE, Stukel TA, et al. The implications of regional variations in Medicare spending. Part 1: the content, quality, and accessibility of care. Ann Intern Med 2003;138:273-87.
21. Penner JC, Schuwirth L, Durning SJ. From Noise to Music: Reframing the Role of Context in Clinical Reasoning. J Gen Intern Med 2024;39:851-7.
22. Linder JA, Doctor JN, Friedberg MW, et al. Time of day and the decision to prescribe antibiotics. JAMA Intern Med 2014;174:2029-31.
23. Morgan DJ, Pineles L, Owczarzak J, et al. Accuracy of Practitioner Estimates of Probability of Diagnosis Before and After Testing. JAMA Intern Med 2021;181:747-55.
24. Zwaan L, Monteiro S, Sherbino J, et al. Is bias in the eye of the beholder? A vignette study to assess recognition of cognitive biases in clinical case workups. BMJ Qual Saf 2017;26:104-10.
25. Pauker SG, Kassirer JP. The threshold approach to clinical decision making. N Engl J Med 1980;302:1109-17.
26. Lubarsky S, Dory V, Audétat M-C, et al. Using script theory to cultivate illness script formation and clinical reasoning in health professions education. Can Med Educ J 2015;6:e61-70.
27. Stern SDC, Cifu AS, Altkorn D, et al. Symptom to Diagnosis: An Evidence-Based Guide. 4th ed. New York: McGraw Hill Professional; 2019.
28. Jagannath AD, Dreicer JJ, Penner JC, et al. The cognitive apprenticeship: advancing reasoning education by thinking aloud. Diagnosis (Berl) 2022;10:9-12.
29. Fleming SM. Metacognition and Confidence: A Review and Synthesis. Annu Rev Psychol 2024;75:241-68.
30. Philibert I. Using Chart Review and Chart-Stimulated Recall for Resident Assessment. J Grad Med Educ 2018;10:95-6.
31. Shimizu T, Tokuda Y. Pivot and cluster strategy: a preventive measure against diagnostic errors. Int J Gen Med 2012;5:917-21.
32. Sabin JA. Tackling Implicit Bias in Health Care. N Engl J Med 2022;387:105-7.
33. McQuade CN, Simonson MG, Lister J, et al. Characteristics differentiating problem representation synthesis between novices and experts. J Hosp Med 2024;19:468-74.
34. Talat U, Schmidtke KA, Khanal S, et al. A Systematic Review of Nudge Interventions to Optimize Medication Prescribing. Front Pharmacol 2022;13:798916.
The authors declare they have no affiliations or financial involvement with any commercial organisation with a direct financial interest in the subject or materials discussed in the manuscript. This research did not receive any specific grant from funding agencies in the public, commercial or not-for-profit sectors.
Professor Lim Tow Keang, Division of Respiratory & Critical Care Medicine, Department of Medicine, National University Hospital, 1E Kent Ridge Road, Singapore 119228. Email: [email protected]