[2019] Assessment of Cognitive Bias as a Root Cause of Diagnostic Interpretation Error in Pathology Residents

Maxwell L Smith, Suzanne Dintzis, Stephen S Raab. Mayo Clinic Arizona, Scottsdale, AZ; University of Washington, Seattle, WA; Eastern Health and Memorial University of Newfoundland, St. John's, NL, Canada

Background: The cognitive root causes of diagnostic interpretation error in surgical pathology are poorly understood. We developed a simulation-based medical education (SBME) model to measure the biases associated with resident diagnostic interpretation error and developed methods of reference range forecasting for residents and pathologists to reduce error frequency.
Design: We measured diagnostic interpretation errors in a cohort of 8 residents participating in SBME, designed incrementally to challenge residents with progressively more difficult and rarer cases based on previous performance. For each case, residents completed criterion and disease pattern checklists and provided final diagnoses. For cases in which the resident made a diagnostic interpretation error, we evaluated the accuracy of checklist components, bias type (n=35), and level of training. We associated specific case types (i.e., rare or common) and training levels with specific biases and knowledge gaps and designed a forecasting checklist to assist residents and pathologists in practice.
Results: Regardless of experience level, residents quickly learned to complete criterion and pattern checklists accurately but struggled to link histologic patterns to specific disease types (75% of failures). For inexperienced residents, 60% of diagnostic errors were associated with knowledge gaps and anchoring, recency, or attention biases. In more experienced residents, 50% of errors were associated with expectation, overconfidence, or clustering-illusion bias. Errors involving special stains were associated with confirmation or gaze bias. For straightforward cases, 55% of errors were associated with anchoring, observer-expectation, or recency bias. For complex cases, 45% of errors were associated with do-no-harm, confirmation, or framing bias. Using forecasting, residents were competent at recognizing knowledge gaps and difficult-case biases. Residents often did not use bias checklists in easier cases.
Conclusions: We hypothesize that a considerable proportion of diagnostic errors are linked to bias and that individual characteristics (e.g., experience) and specific scenarios are associated with specific bias types. We further hypothesize that SBME and the use of forecasting with bias checklists have the potential to decrease diagnostic error by altering traditional cognitive patterns.
Category: Quality Assurance

Tuesday, March 5, 2013 9:30 AM

Poster Session III # 262, Tuesday Morning
