Image Analysis of the HER2 Immunohistochemical (IHC) Stain – A Study Comparing a Lab-Validated Scoring Method and the Manual Assessment Method.
Douglas M Minot, Jesse S Voss, Susan B Rademacher, Toe Y Lwin, Jessica L Orsulak, Aziza Nassar, Beiyun Chen, Amy C Clayton. Mayo Clinic, Rochester, MN
Background: The HER2 IHC stain can be assessed quantitatively using digital image analysis (IA) to identify HER2-positive tumors. The goals of this study were to examine the concordance of HER2 IHC with FISH and the reproducibility of HER2 IHC interpretation by manual and automated methodologies.
Design: This retrospective study included 154 tumors analyzed by both HER2 IHC and FISH. Manual assessment of HER2 IHC was performed by 3 pathologists (PA) and 3 cytotechnologists (CT). The 3 CTs also performed IA using a laboratory-validated scoring method on the ACIS III (Dako, Carpinteria, CA), which computes a mean score from 2 areas each of tumor with high-, moderate-, and low-intensity staining (6 areas total). Concordance with FISH results was determined for each method. Intraobserver variability was tested by having all observers reanalyze 20 cases by all methods.
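The laboratory-validated scoring scheme described above reduces to a simple average over six tumor regions. The sketch below illustrates that arithmetic; the function name and the region values are illustrative assumptions, not taken from the study or the ACIS III software.

```python
def acis_composite_score(high, moderate, low):
    """Mean HER2 IHC score over 2 high-, 2 moderate-, and 2 low-intensity
    tumor regions (6 regions total), per the scheme described in Design.
    Each argument is a pair of per-region ACIS readings."""
    regions = list(high) + list(moderate) + list(low)
    if len(regions) != 6:
        raise ValueError("expected 2 regions per intensity level (6 total)")
    return sum(regions) / len(regions)

# Hypothetical per-region readings for one tumor:
score = acis_composite_score(high=(2.8, 2.6), moderate=(2.1, 1.9), low=(1.2, 1.0))
print(round(score, 2))  # 1.93
```

Averaging across deliberately heterogeneous regions (high, moderate, low staining) is what distinguishes this composite from scoring only the strongest-staining field.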
Results: Concordance with FISH was very good for IHC-negative (0, 1+) and IHC-positive (3+) tumors by all methods (Table 1). CT manual and CT ACIS each undercalled two cases (IHC-/FISH+), while PA manual and CT ACIS each had one overcall (IHC+/FISH-). The ACIS method yielded fewer 2+ results overall (n=16) than either CT manual (n=23) or PA manual (n=25) and had a higher concordance with FISH (31% vs. 26% for CT manual and 20% for PA manual). CTs had higher interobserver reproducibility by both manual (0.747) and ACIS (0.779) methods than PAs (0.697) (Table 2). CTs had better intraobserver reproducibility (0.882) using the ACIS method than manual assessment by either CT (0.828) or PA (0.766).
Table 1: HER2 IHC score distribution by method, n (% FISH amplified)

| Method | IHC 0 | IHC 1+ | IHC 2+ | IHC 3+ | Total |
|---|---|---|---|---|---|
| PA Manual | 29 (0) | 88 (1) | 25 (20) | 12 (92) | 154 |
| CT Manual | 21 (0) | 101 (2) | 23 (26) | 9 (100) | 154 |
| CT ACIS | 45 (0) | 82 (2) | 16 (31) | 11 (91) | 154 |
Table 2: Reproducibility by method (average kappa)

| Comparison | Interobserver (Avg. Kappa) | Intraobserver (Avg. Kappa) |
|---|---|---|
| PA Manual | 0.697 | 0.766 |
| CT Manual | 0.747 | 0.828 |
| CT ACIS | 0.779 | 0.882 |
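The reproducibility figures above are average kappa statistics. As a minimal sketch of the underlying calculation, the function below computes Cohen's kappa for a single pair of raters scoring the same cases; the abstract reports averages across observer pairs and reads, and the HER2 score sequences here are toy data, not study results.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters corrected for the
    agreement expected by chance from each rater's category frequencies."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                   for c in set(freq_a) | set(freq_b))
    return (observed - expected) / (1 - expected)

# Toy HER2 IHC scores (0, 1+, 2+, 3+) from two hypothetical observers:
a = [0, 1, 1, 2, 3, 3, 2, 1, 0, 3]
b = [0, 1, 2, 2, 3, 3, 2, 1, 1, 3]
print(round(cohens_kappa(a, b), 2))  # 0.73
```

Kappa near 0.7 is conventionally read as substantial agreement, which is why the CT ACIS intraobserver value of 0.882 represents a meaningful gain over manual assessment.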