Council on Medical Student Education in Pediatrics

COMSEP 2008 Atlanta Meeting

Poster Presentation:


Using trained lay observers and faculty physicians to assess the clinical skills of medical students working with real patients in an outpatient clinic

Authors:
Woodhead J, Bergus G, Kreiter C.
The University of Iowa

Background: Observations of clinical skills in an objective structured clinical examination (OSCE) may not reflect performance in real patient encounters. Structured observations of students during real patient encounters can provide reliable measures of clinical skills, but using physicians as observers is time-consuming and costly.
Objectives: To employ trained lay observers to rate students’ communication skills during patient encounters and faculty physicians to rate students’ data collection and clinical reasoning skills after the encounters.
Methods: Third-year medical students on the required Pediatrics Clerkship were accompanied into clinic exam rooms by trained lay observers, who completed a 15-item communication checklist for each encounter. Afterward, students presented their findings to faculty physicians, who scored data collection and clinical reasoning skills using a six-item global rating instrument.
Results: A total of 200 patient visits, involving 52 different medical students, were observed by trained observers. Students presented their clinical findings to 15 different pediatrics faculty members. The alpha reliability across cases for the faculty ratings was 0.62, but this figure included multiple ratings of some individual students by the same faculty member. When a single rating per faculty-student dyad was used to calculate alpha, the reliability fell to 0.44. Combining 3 unique faculty ratings with 3 communication scores collected by lay observers resulted in an overall reliability of 0.50. There was a moderate correlation between the communication scores awarded by the lay observers and the communication scores the same students obtained on the Pediatric OSCE (r = 0.53, p < 0.001).
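
For readers who want to reproduce this kind of analysis, the sketch below shows the standard Cronbach's alpha and Pearson correlation computations the abstract refers to. It is a minimal illustration in Python: the scores in it are made-up examples, not the study's data, and the function name and matrix layout are our own assumptions.

    import numpy as np

    def cronbach_alpha(scores):
        # scores: one row per student, one column per rating occasion
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]                          # ratings per student
        item_var = scores.var(axis=0, ddof=1).sum()  # sum of per-rating variances
        total_var = scores.sum(axis=1).var(ddof=1)   # variance of summed scores
        return (k / (k - 1)) * (1 - item_var / total_var)

    # Hypothetical data: 5 students, 3 unique faculty ratings each
    faculty = [[6, 7, 5], [8, 8, 9], [4, 5, 5], [7, 6, 7], [5, 6, 4]]
    print(cronbach_alpha(faculty))

    # Pearson correlation between lay-observer communication scores and
    # OSCE communication scores (again, made-up numbers)
    lay = [78, 92, 65, 84, 70]
    osce = [74, 88, 70, 90, 66]
    print(np.corrcoef(lay, osce)[0, 1])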
Conclusions: Trained lay observers can reliably evaluate communication skills in real patient encounters. Providing faculty physicians with a structured rating format supports their evaluation of data collection and clinical reasoning. The combination may yield scores with a reliability approaching that of the OSCE, depending on the number of patient encounters and the number of unique faculty raters employed per student.
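
The dependence on the number of raters can be made concrete with the Spearman-Brown prophecy formula, which projects the reliability of the mean of k parallel ratings from a single-rating reliability r. The sketch below applies it to the single-dyad faculty reliability of 0.44 reported above. Note this assumes interchangeable (parallel) ratings, so it is an idealized projection rather than the study's own analysis; the observed composite of 0.50 combined two different instruments and so falls below it.

    def spearman_brown(r_single, k):
        # Projected reliability of the mean of k parallel ratings,
        # given the reliability r_single of a single rating
        return k * r_single / (1 + (k - 1) * r_single)

    for k in (1, 3, 6):
        print(k, f"{spearman_brown(0.44, k):.3f}")
    # k=1 -> 0.440, k=3 -> 0.702, k=6 -> 0.825
    # (upper bounds under the parallel-ratings assumption)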