The Effect of Using Two Duplicated Examination Sites to Simulate the Same Cases on the OSCE Reliability.
- Author:
Hoonki PARK(1);
Jungkwon LEE;
Seungryong KIM;
Kyoungtai KIM;
Haeyoung PARK
Author Information
1. Department of Family Medicine, Hanyang University College of Medicine.
- Publication Type:Original Article
- Keywords:
Evaluation;
Reliability;
OSCE;
Duplication
- MeSH:
Clinical Clerkship;
Humans;
Korea;
Physical Examination;
Primary Health Care
- From:Korean Journal of Medical Education
1999;11(1):37-52
- Country:Republic of Korea
- Language:Korean
- Abstract:
When OSCEs are used in large-scale testing programs, stations may be duplicated across two or more examination sites. There have been only a few studies in Korea on the reliability of OSCEs with duplicated stations. The purpose of this study was to investigate the effect of duplication on OSCE reliability. At Hanyang University College of Medicine, an OSCE is given to all senior medical students (91 per class) upon completion of all clinical clerkship rotations. The examination consisted of twenty-one stations and eighteen cases representing commonly encountered problems in primary care. Each station required seven minutes to administer: 6 to 6.5 minutes for the student-SP or model encounter, during which the student performs a focused history, physical examination, procedure, and/or management, and another 0.5 to 1 minute for the evaluator to give case-related feedback. We analysed the reliability of duplication by comparing total OSCE scores and case scores between the two examination sites. We also evaluated the reliability of the duplicated stations from students' and professors' subjective responses to the OSCE. All 91 fourth-year students took the OSCE. The standardized Cronbach's alpha coefficient of the OSCE was 0.67. Station scores and total OSCE scores differed between the two duplicated sites; the total OSCE score at one site was slightly higher than at the other (p=0.03). Of the 19 stations in which students were evaluated by a staff evaluator, six stations favored one site over its counterpart, and another six favored the opposite site. CONCLUSIONS: OSCE reliability can be affected by duplication of examination sites, and inter-rater reliability is the most important determining factor. The results demonstrate a need for caution in interpreting scores obtained from an OSCE with duplicated stations.
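Note: the standardized Cronbach's alpha cited above is conventionally computed from the number of stations and the mean inter-station correlation; the following is the standard textbook form of that coefficient, given here for reference and not taken from the article itself:

\[
\alpha_{\text{standardized}} = \frac{k\,\bar{r}}{1 + (k-1)\,\bar{r}}
\]

where \(k\) is the number of stations (items) contributing to the total score and \(\bar{r}\) is the mean of the pairwise inter-station correlations.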