Implementation of a Digital Slide System for Quality Assurance
B Dangott, R Silowash, L Anthony, R Wilson, J Duboy, L Drogowski, AV Parwani, J Ho, D Jukic. University of Pittsburgh Medical Center, Pittsburgh, PA
Background: In many pathology departments, a subset of cases is routinely reviewed by a second pathologist as a method of quality assurance (QA). Studies have shown that disagreement rates range from 0.26% to 1.2% for global in-house prospective review and reach 4.0% for retrospective blinded review. Disagreements may be classified as major, minor, or clerical. Traditionally, QA evaluation has been done by physically transferring the case to another pathologist within the department. This approach is limited in geographic reach, and the working relationship between the pathologists may introduce an element of bias. By implementing a digital slide (DS) approach to QA, the geographic limitations can be eliminated and the evaluation can be performed anonymously, reducing the potential for bias. This study was performed to evaluate DS as a method of performing QA and to compare DS QA results with the original glass slide QA results.
Design: A CoPath (Cerner Corporation, Kansas City, MO) tool was used to electronically query the surgical pathology database for cases that had previously undergone QA review. For simplicity, cases with more than five blocks were excluded. Thirty accessions containing a total of sixty-six cases were then randomly selected from this subset, and the 202 glass slides from these cases were scanned on an Aperio T2 scanner (Aperio Technologies, Vista, CA).
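The case-selection steps above (exclude accessions with more than five blocks, then draw a random sample of thirty) can be sketched as follows. This is a minimal illustration, not the actual CoPath tool; the data structure, function name, and toy accession pool are all assumptions introduced for the example.

```python
import random

def select_qa_accessions(accessions, max_blocks=5, sample_size=30, seed=0):
    """Keep only accessions at or under the block limit, then draw a
    random sample of the requested size (mirrors the exclusion and
    random-selection steps described in the Design section)."""
    eligible = [a for a in accessions if a["blocks"] <= max_blocks]
    rng = random.Random(seed)  # fixed seed so the selection is reproducible
    return rng.sample(eligible, sample_size)

# Toy pool of 100 accessions with 1-10 blocks each (illustrative data only)
pool = [{"id": i, "blocks": 1 + i % 10} for i in range(100)]
selected = select_qa_accessions(pool)
```

Sampling without replacement (`random.sample`) ensures no accession is selected twice, matching the study's selection of thirty distinct accessions.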
Results: Pathologists viewed the digital slides through network connections on remote workstations via a downloadable database. A quality assurance survey recorded the level of agreement of each participating pathologist at case completion. A total of 240 responses were received from the 6 participating pathologists. There were 11 moderate and 3 major disagreements recorded across 12 case parts (in 2 cases, 2 pathologists disagreed with the original diagnosis). This represents an overall disagreement rate of 5.0%, comparable to the 4.0% average reported for blinded retrospective reviews.
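The 5.0% figure follows directly from the counts reported above; a minimal check of the arithmetic (variable names are illustrative):

```python
# Counts reported in the Results section
responses = 240          # QA survey responses from the 6 pathologists
disagreeing_parts = 12   # case parts with a recorded disagreement

rate = disagreeing_parts / responses * 100
# 12 / 240 * 100 = 5.0, matching the reported 5.0% disagreement rate
```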
Conclusions: Using DS for quality assurance requires pathologists to render a diagnosis on the digital slide, a significant step toward implementing the technology in a clinical setting. In addition, the results of the QA study demonstrate levels of agreement comparable to those of traditional QA models. DS has the added advantages of overcoming geographic boundaries, eliminating potential biases, and improving quality assurance practices.
Category: Quality Assurance
Monday, March 22, 2010 1:00 PM
Poster Session II # 204, Monday Afternoon