Council on Medical Student Education in Pediatrics


COMSEP 2005 North Carolina Meeting

Poster Presentation:

Development of a Validated, Curriculum-based Exam for the Pediatric Clerkship


Venus Wilke, University of Utah; Norman Berman, Dartmouth Medical School; Leslie Fall, Dartmouth Medical School; David Levine, Morehouse School of Medicine; Chris Maloney, University of Utah; Mike Potts, University of Illinois at Rockford; Ben Siegel, Boston University; Sherilyn Smith, University of Washington

Objective: The Computer Learning in Pediatrics Project (CLIPP) comprehensively covers the COMSEP curriculum and includes an exam based on the content of its simulated cases. We sought to validate the CLIPP exam as a tool for assessing medical student performance, using item analysis and comparison with the National Board of Medical Examiners (NBME) Pediatrics Subject exam.

Methods: After pilot testing of the CLIPP exam, performance on individual exam items was reviewed and problematic items were revised. Individual questions on a 100-item CLIPP exam taken by students from four schools were analyzed using a commercial software package. Students from two schools were administered both the NBME and CLIPP exams, and performance on the two exams was compared using Pearson correlation and chi-square analysis.
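
As an illustration only (the study itself used a commercial software package), the two comparisons described above, a Pearson correlation of paired exam scores and a chi-square test on a high/low performance table, might be sketched in Python with SciPy as follows; the scores shown are hypothetical placeholders, not study data.

    import numpy as np
    from scipy.stats import pearsonr, chi2_contingency

    # Hypothetical paired scores for students who took both exams.
    clipp = [76, 81, 70, 88, 74, 79, 65, 84]
    nbme = [72, 78, 66, 85, 70, 75, 60, 80]

    # Pearson correlation between the two exams.
    r, p = pearsonr(clipp, nbme)
    print(f"Pearson R = {r:.2f}, p = {p:.4f}")

    # 2x2 table of high/low performance, splitting each exam at its
    # median, to test whether CLIPP performance predicts NBME performance.
    clipp_high = np.array(clipp) >= np.median(clipp)
    nbme_high = np.array(nbme) >= np.median(nbme)
    table = [
        [np.sum(clipp_high & nbme_high), np.sum(clipp_high & ~nbme_high)],
        [np.sum(~clipp_high & nbme_high), np.sum(~clipp_high & ~nbme_high)],
    ]
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, p = {p:.4f}")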

Results: Analysis of the 100 individual exam items for 148 students shows that the average item R value is 0.196 (a positive correlation). A total of 128 students took both exams; the average score was 72.4 on the NBME exam and 76.7 on the CLIPP exam. The Pearson correlation coefficient between the two exams is R = 0.46 (p < 0.0001). Chi-square analysis shows that high or low performance on the CLIPP exam is predictive of similar performance on the NBME exam.
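
Assuming the item R value reported above is the usual point-biserial discrimination index (the correlation between answering an item correctly and the student's total score), one such per-item calculation might look like the following sketch, again with hypothetical data rather than study data.

    from scipy.stats import pointbiserialr

    # 1 = item answered correctly, 0 = answered incorrectly.
    item_correct = [1, 0, 1, 1, 0, 1, 1, 0]
    # Each student's total exam score (hypothetical).
    total_scores = [82, 61, 75, 90, 58, 70, 88, 64]

    # Point-biserial correlation between item success and total score;
    # a positive value means stronger students tend to answer correctly.
    r, p = pointbiserialr(item_correct, total_scores)
    print(f"item discrimination R = {r:.3f}")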

Conclusions: The individual CLIPP exam questions discriminate well between high- and low-performing students on the CLIPP exam. Student performance on the CLIPP exam correlates moderately well with performance on the NBME exam. The CLIPP exam is a feasible and valid alternative to the NBME shelf exam.