Council on Medical Student Education in Pediatrics


COMSEP Meeting in Ottawa, ON

Poster Presentation:


Development and Multicenter Validation of a Written Pediatric History and Physical Exam Evaluation (P-HAPEE) Rubric

Authors:
Marta A. King, Saint Louis University School of Medicine, St. Louis, MO; Carrie A. Phillipi, Oregon Health & Science University, Portland, OR; Paula M. Buchanan, Saint Louis University, St. Louis, MO; Linda O. Lewin, University of Maryland School of Medicine, Baltimore, MD

Background

The written history and physical exam (H&P) encompasses numerous core competencies and entrustable professional activities, yet it remains an underutilized source of medical student assessment data. No H&P scoring tools have been validated with senior resident raters or with electronic health record (EHR)-generated notes, and no pediatric tools have been tested at multiple institutions.

Objective

To describe the development and multicenter validation of a novel written Pediatric History and Physical Exam Evaluation (P-HAPEE) rubric.

Methods

Through an iterative process, the authors drafted, revised, and implemented the 10-item rubric at three academic institutions. Eighteen attending physicians and five senior residents each scored ten third-year medical student H&Ps. Inter-rater reliability (IRR) was determined by intra-class correlation coefficients. Cronbach's alpha was used to assess internal consistency, and Spearman rank-order correlations were used to determine relationships between rubric items. Raters also provided a global assessment, recorded the time needed to review and score each H&P, and completed a rubric utility survey.
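As background to the internal-consistency analysis above, Cronbach's alpha for a k-item rubric is k/(k-1) × (1 − Σ item variances / total-score variance). The sketch below is illustrative only; it is not the study's analysis code, and the function name and sample data are hypothetical.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a ratings matrix.

    scores: 2-D array-like, rows = scored H&Ps, columns = rubric items.
    Uses the standard formula: (k / (k - 1)) * (1 - sum(item variances) / variance of total scores).
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                         # number of rubric items
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item across H&Ps
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of the total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: three H&Ps rated on two perfectly consistent items.
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
print(round(alpha, 2))  # perfectly consistent items yield alpha = 1.0
```

An alpha near the study's reported 0.92 would indicate that the ten rubric items hang together as a measure of a single underlying construct.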

Results

The rubric’s overall IRR was 0.85; global assessment IRR was 0.90. IRR for low- and high-quality H&Ps was significantly greater than for medium-quality ones but did not differ by rater category (attending physician vs. senior resident), note format (EHR vs. non-EHR), or student diagnostic accuracy. Cronbach’s alpha was 0.92. The highest correlation between an individual item and the total score was for assessment (0.83); the highest inter-item correlation was between assessment and differential diagnosis (0.78). Mean time to review and score an H&P was 16.6 minutes; residents took significantly longer than attending physicians (19.3 min vs. 13.1 min; p < 0.01). All raters described rubric utility as “good” or “very good” and endorsed interest in continued use.


Discussion

The P-HAPEE rubric offers a novel, practical, reliable, and valid method for supervising physicians to assess medical students’ pediatric written H&Ps. By demonstrating desired outcomes, the rubric could also supply a framework for developing curricula in written documentation and clinical reasoning, and it could be used to track skill progression across the medical education spectrum.