[564] Development and Validation of a Tool To Evaluate the Quality of Medical Education Websites in Pathology

Raja Alyusuf, Kameshwar Prasad, Ali Abdul Satir, Ali Abalkhail, Roopa Arora. Salmaniya Medical Complex, Manama, Bahrain; All India Institute of Medical Sciences, New Delhi, India; Nile College, Khartoum, Sudan; Arabian Gulf University, Manama, Bahrain

Background: The exponential growth in the use of the internet as a learning resource, coupled with the variable quality of many websites, has led to a need to identify suitable websites for teaching purposes.
Aim: To develop and validate a tool that evaluates the quality of undergraduate medical education websites, and to apply it to the field of Pathology.
Design: A tool was devised through several steps: item generation, item reduction, weighting, and pilot testing. After a draft tool encompassing the criteria for evaluating medical education websites was developed, it was pilot tested, modified post-pilot, and validated. Validation included measurement of inter-observer reliability; criterion-related validity, assessed by testing the tool against a gold standard; construct-related validity, assessed by relating the gold standard consensus both to the actual tool score and to general website rating tools; and content-related validity, assessed by comparing the tool with general website rating tools and obtaining the subsequent gold standard rating of the tool. The validated tool was then applied to a population of pathology websites.
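For readers unfamiliar with the reliability statistics named here, the sketch below illustrates how Cronbach's alpha, Pearson's correlation, and Cohen's kappa can be computed from rater data. It is not the authors' actual analysis; the website scores, ratings, and item matrix are hypothetical, and standard open-source libraries (numpy, scipy, scikit-learn) stand in for whatever software the authors used.

```python
# Illustrative sketch only: reliability statistics of the kind reported in
# this abstract, computed from hypothetical observer data.
import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

def cronbach_alpha(item_scores):
    """Internal consistency; item_scores is an (n_websites, n_items) array."""
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical criterion scores (websites x items) for one observer.
items = np.array([[4, 5, 4], [2, 3, 2], [5, 5, 4], [1, 2, 1], [3, 4, 3]])
print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")

# Hypothetical total scores and categorical ratings from two independent observers.
obs1_scores = np.array([78, 65, 90, 42, 55, 83, 70, 61])
obs2_scores = np.array([75, 60, 88, 45, 58, 80, 72, 65])
obs1_rating = ["rec", "caution", "rec", "not_rec", "caution", "rec", "caution", "caution"]
obs2_rating = ["rec", "caution", "rec", "not_rec", "caution", "rec", "rec", "caution"]

r, _ = pearsonr(obs1_scores, obs2_scores)            # inter-observer correlation
kappa = cohen_kappa_score(obs1_rating, obs2_rating)  # chance-corrected agreement
print(f"Pearson r = {r:.2f}, Cohen's kappa = {kappa:.2f}")
```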
Results: Reliability testing showed high internal consistency (Cronbach's alpha = 0.92) and high inter-observer reliability (Pearson's r = 0.88; intraclass correlation coefficient = 0.85; kappa = 0.75). The tool also showed high criterion-related, construct-related, and content-related validity, with moderately high concordance with the gold standard (kappa = 0.61), 92.2% sensitivity, 67.8% specificity, 75.6% positive predictive value, and 88.9% negative predictive value. The validated tool was applied to 278 websites: 29.9% were rated as recommended, 41.0% as recommended with caution, and 29.1% as not recommended.
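As a quick aid to interpreting the predictive-value figures, the sketch below shows how sensitivity, specificity, PPV, and NPV follow from a 2x2 table of tool verdict versus gold-standard verdict. The abstract does not report the underlying counts, so the numbers used here are purely hypothetical.

```python
# Illustrative sketch only: deriving sensitivity, specificity, PPV, and NPV
# from a 2x2 cross-tabulation of tool verdict vs gold-standard verdict.
def diagnostic_metrics(tp, fp, fn, tn):
    """tp/fp/fn/tn: website counts classified against the gold standard."""
    sensitivity = tp / (tp + fn)   # gold-standard-suitable sites the tool recommends
    specificity = tn / (tn + fp)   # gold-standard-unsuitable sites the tool rejects
    ppv = tp / (tp + fp)           # recommended sites that are truly suitable
    npv = tn / (tn + fn)           # rejected sites that are truly unsuitable
    return sensitivity, specificity, ppv, npv

# Hypothetical counts, chosen only to illustrate the arithmetic.
sens, spec, ppv, npv = diagnostic_metrics(tp=47, fp=15, fn=4, tn=32)
print(f"Sensitivity {sens:.1%}, Specificity {spec:.1%}, PPV {ppv:.1%}, NPV {npv:.1%}")
```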
Conclusions: A systematic tool was devised to evaluate the quality of websites for medical education purposes. The tool was shown to yield reliable and valid inferences through its application to pathology websites.
Category: Education

Tuesday, March 20, 2012 9:30 AM

Poster Session III # 128, Tuesday Morning

 
