Julie Story Byerley, MD, MPH; David Hollar, PhD; Kelly Lear, BA; Kenya McNeal-Trice, MD; Suresh Nagappan, MD University of North Carolina, Chapel Hill, North Carolina
Objectives: To develop an additional assessment tool that would better evaluate clinical reasoning and important pediatric patient care skills, and provide additional distinguishing data for assessing students' core pediatric knowledge.
Methods: We developed a brief examination that contains four written questions (dosing a medication for a small child using provided resources, writing a safe fluid order in a given clinical scenario, plotting a growth curve, and appropriately ordering immunizations) and a ten-minute one-on-one case discussion of a common pediatric condition with a course director. It was implemented in 2006-2007. We compared assessment data from this new tool to the four previously used data points: clinical evaluations, small group participation, write-ups, and shelf examination performance.
Results: Statistical analysis indicates that the new examination is a distinct instrument for measuring student performance. There were no significant correlations between the new examination scores and scores from the other evaluation measures, and regression analyses showed no significant prediction of new examination scores by the pre-existing data points.
Conclusion: The distribution of performance on this new assessment tool differs from that of previously used measures. Informal feedback from course directors and students suggests that the stated objectives are now better emphasized. The addition of this assessment tool broadens our evaluation data and emphasizes important pediatric knowledge and clinical skills.