Council on Medical Student Education in Pediatrics

Journal Club

Schwartz L.M., Fernandez R., Kouyoumijan S.R., Jones K.A. A Randomized Comparison Trial of Case-based Learning versus Human Patient Simulation in Medical Student Education. Academic Emergency Medicine 2007;14:130-137. Reviewed by Katinka Kersten, University of Missouri.



Mannequin-based human patient simulation (HPS) is gaining popularity in medical education. There are sufficient data showing that participants respond favorably to HPS-based training. However, there is little evidence that HPS is superior to more traditional teaching formats for the acquisition of knowledge and skills.

This study evaluated the efficacy of simulation training versus case-based learning (CBL) among medical students, as measured by observable behavioral actions after the educational intervention.

Fourth-year medical students who were enrolled in a mandatory, month-long emergency medicine (EM) clerkship were studied. In week one the students were given a lecture on EM management of acute coronary syndrome (ACS) and received the core objectives. In week two students consented to participate in the study and were randomized to a one-hour HPS-based instruction session or a one-hour CBL session. In the CBL session the students worked through a vignette of a patient with ACS with the help of a facilitator and reviewed the management of ventricular tachycardia and ventricular fibrillation. During the HPS session participants individually assessed and managed a simulated patient with ACS and subsequent cardiac arrest, with guidance and feedback from an instructor. At the end of the clerkship all students participated in an ACS OSCE similar to the case presented earlier. A trained evaluator who was blinded to group assignment scored the students' performance using a 43-point checklist of required actions. All sessions were recorded, and a subset of performances was scored again by two physicians who were also blinded.

A total of 102 students participated in the study (n = 52 in the CBL group and n = 50 in the HPS group). Student performance on the OSCE was similar between the two groups for the majority of items. Mean scores did not differ between groups on the overall checklist (43 items), the history category (22 items), acute MI evaluation and management (13 items), or cardiac arrest management (8 items). Demographics and subspecialty interest at the time of the study were well balanced between the groups. Overall percent agreement between the physicians' and the trained evaluator's scores was 89%.

The study was fairly small, and randomization could not account for academic achievement or prior patient care experience, leaving the potential for baseline group differences. One of the strengths of HPS is the unlimited opportunity for repetition of skills with assessment and feedback, which has been shown to improve the acquisition of expertise in medicine. Repetition of skills with potential outcome changes was not part of this study. In addition, HPS-based training appears to be particularly effective for training cognitive strategies and situational awareness; these qualities were not evaluated by the OSCE.

The use of patient simulation training in medical schools is growing rapidly. This prospective randomized study showed that performance on a clinical OSCE was no different whether students participated in a one-hour CBL session or a one-hour HPS session. Clearly, more outcome-based research is needed in the field of simulation. Don't discard those problem-based learning cases and CBL vignettes just yet!

There's a repetitive process that plays out as new teaching methods are developed. The early literature on PBL was much like this, and we've seen the same thing with computer-aided instruction. Should we be surprised to see such papers on simulation? Eventually, this technology will find its place; my guess is team training and procedure training. I have my doubts about diagnostic skills and clinical reasoning, at least with 2007 technology. Bruce Morgenstern.

Return to Journal Club