Council on Medical Student Education in Pediatrics


Journal Club



Pusic MV, Pachev GS, MacDonald WA. Embedding Medical Student Computer Tutorials into a Busy Emergency Department. Acad Emerg Med 2007; 14:138-148.

Reviewed by Chris White, Medical College of Georgia.

Overview:
The authors wanted to determine whether brief, focused computer tutorials could supplement medical student learning when placed directly in the clinical learning environment. They created 6 short computer tutorials (each designed to be completed in 10-15 minutes) and placed them on a computer at the nursing station in the pediatric emergency department.

The study involved senior medical students during their required 2-week pediatric emergency medicine rotation. Students were asked (but not required) to complete 3 of the tutorials whenever they could find a convenient time during the rotation, and each student was randomized to 3 of the 6 cases. Outcome measures included: 1) statistics on student usage of the tutorials (how many cases were done, how long each case took, time of day cases were done, etc.); 2) student performance on a 6-item, short-answer written examination, given as a pre-test at the start of the rotation and again as a post-test at the end, which did not count toward the final grade; 3) surveys of student computer experience, attitudes toward the tutorials, and clinical experience during the rotation; and 4) a faculty survey regarding attitudes toward the intervention. The written exams were graded by 3 reviewers, two of whom were blinded to the identity of the student and to whether the exam was the pre-test or the post-test. Because each student completed only 3 tutorials but was tested on all 6, students served as their own controls for the tutorials they were not assigned.

The computer cases were designed using a program called Toolbook II Instructor, version 5.5 (SumTotal Corp., Mountain View, CA). The topics of the 6 cases were: cervical spine x-rays, febrile seizures, fever without source, growth plate fractures, oral rehydration solutions and tissue adhesives.
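For readers who like to see a design in code, the randomization described above can be sketched as a toy example. This is purely illustrative: the topic list comes from the paper, but the function, the seeding, and the data structure are invented for the sketch and are not the authors' actual procedure.

```python
import random

# The 6 tutorial topics listed in the paper.
TUTORIALS = [
    "cervical spine x-rays", "febrile seizures", "fever without source",
    "growth plate fractures", "oral rehydration solutions", "tissue adhesives",
]

def assign_tutorials(student_ids, seed=0):
    """Randomly assign each student 3 of the 6 tutorials; the remaining 3
    act as that student's own control topics on the written exam."""
    rng = random.Random(seed)  # fixed seed only so the sketch is reproducible
    assignments = {}
    for sid in student_ids:
        assigned = rng.sample(TUTORIALS, 3)
        control = [t for t in TUTORIALS if t not in assigned]
        assignments[sid] = {"assigned": assigned, "control": control}
    return assignments

if __name__ == "__main__":
    for sid, groups in assign_tutorials(["student-01", "student-02"]).items():
        print(sid, "->", groups["assigned"])
```

Because every student is tested on all 6 topics but tutored on only 3, each untutored topic provides a within-student comparison, which is the "control group" discussed in the limitations below.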

Results:
74 students took both the pre-test and post-test, and 73% of the students completed all three cases they were assigned. Mean test scores improved from 2.9 (±1.9) out of 10 on the pre-test to 4.9 (±2.4) on the post-test, a large and statistically significant effect. For 5 of the 6 tutorials there was at least a moderate, statistically significant improvement in test scores among the students who completed that tutorial. Interestingly, the one tutorial for which no effect was noted involved growth plate fractures; the authors postulate that this was due to the teaching faculty's dedication to teaching this concept. Most students found the tutorials helpful, and the location in the middle of the nursing station was not a problem.
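As a rough check on the reported magnitude, a standardized mean difference can be computed from the reported means and standard deviations. This is only an illustration using a simple pooled-SD Cohen's d; the paper's exact effect size statistic may differ.

```python
import math

# Pre- and post-test written exam scores reported in the paper (mean, SD, out of 10).
pre_mean, pre_sd = 2.9, 1.9
post_mean, post_sd = 4.9, 2.4

# Illustrative Cohen's d with a pooled SD (the authors' exact formula is not
# stated in this review and may have accounted for the paired design).
pooled_sd = math.sqrt((pre_sd**2 + post_sd**2) / 2)
cohens_d = (post_mean - pre_mean) / pooled_sd

print(f"pooled SD = {pooled_sd:.2f}")   # ~2.17
print(f"Cohen's d = {cohens_d:.2f}")    # ~0.92
```

A value above 0.8 is conventionally labeled "large," which is consistent with the authors' description of the pre- to post-test improvement.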

Limitations:
Some students found ways to complete the 3 cases they were not assigned by signing in under the "residents" or "other" category (these groups had access to all 6 tutorials). This "contamination" of the control group might have lessened the apparent impact of the intervention.

Comments:
The authors felt "the single most important finding of the study was that medical students on rotation in a busy clinical setting could and would do the computer tutorials." The introduction of the paper contains an excellent discussion of multimedia learning strategies and of using computer-assisted instruction (CAI) for situational or "just-in-time" learning. The authors chose these 6 topics because they were important for students to learn, lent themselves readily to a computer-based teaching format, and involved the kinds of patients most students would see in the emergency department. They hypothesized that students who seek knowledge when they need it most will have the greatest motivation to learn; a short, focused learning module on a specific topic can thus reinforce a concept the student has just seen in an actual patient. Many of us encourage students to read about the patients they see because they remember it better. The use of CAI in this study is very analogous to that concept, and is quite different from the use of CLIPP, which attempts to create a simulated patient with embedded learning issues. These short computer tutorials lend themselves well to placement in the clinical environment, as they can be completed in 10-15 minutes. This type of CAI might also convert well to a podcast, which would be readily available and could be reviewed by the student as often as needed.

Editorial Comment:
I can't help but think the newest film in a well-known series will be called "Fast and Furious: Medical School Edition." We are all looking for highly interactive educational sessions that minimize faculty-student time but maximize adult learning. "Just-in-time" learning is hot, and many of the tools for creating CAI modules (see http://www.toolbook.com/ or articulate.com) seem easy to use.
