Council on Medical Student Education in Pediatrics

Journal Club

Assessing the Reliability and Validity of the Mini-Clinical Evaluation Exercise for Internal Medicine Residency Training. Durning SJ, Cation LJ, Markert RJ, Pangaro LN. Acad Med 2002;77:900-904.

Reviewed by Sherilyn Smith, University of Washington.

Purpose: This study examines the validity of the mini-clinical evaluation exercise (mCEX) among PGY-1 internal medicine residents at a single institution. The authors compare the mCEX with the American Board of Internal Medicine monthly evaluation form (MEF) and the American College of Physicians/American Society of Internal Medicine In-Training Examination (ITE).

Methods: Scores from the different sections of the mCEX, MEF, and ITE for three groups of residents (n=23) were reviewed and compared. The sections of the mCEX during the study period were: clinical skills history, clinical skills physical examination, clinical judgment, humanistic attributes, and overall clinical competence. The MEF contains sections on: clinical skills history, clinical skills physical examination, clinical judgment, humanistic attributes, medical knowledge, medical care, and overall clinical competence.

Results: Each resident had an average of 7 mCEXs, 12 MEFs, and 1 ITE completed during the study period. The mean scores on the mCEX and MEF were 7.5 (of a possible 9) and 7.67 (rating scale not stated), respectively. The authors found good correlation between the mCEX and MEF in clinical skills, clinical judgment, humanistic attributes, and clinical competence (correlation coefficients ranging from 0.59 to 0.81). The mean scores on the mCEX did not change throughout the study period.

Comments: The strengths of the study are the methods used to evaluate the validity and reliability of the mCEX. Additionally, the study compared tools commonly used to assess internal medicine residents' skills beyond medical knowledge. The good correlations are reassuring and indicate that we may be measuring the same skills with different tools. The limitations of the study stem from: the small study size (8 residents/year, mostly male residents, and a single institution); the lack of validation of the MEF as a sensitive evaluation tool; the large number of attending physicians completing the mCEX (46); and the absence of any description of faculty development for the use of the two clinical evaluation tools. The fact that the mean scores on the mCEX were relatively high (7.5 of 9) and did not change over time suggests that the power of the mCEX to finely discriminate levels of clinical skill is limited. Overall, the mCEX is probably a useful, time-efficient method for evaluating clinical skills and providing feedback, but it needs continued refinement and study to maximize its utility. Pediatric educators should consider developing a similar tool or explore applying the current mCEX to residents and students.

(It seems we all wish we could find a tool that would encourage evaluation of real bedside practice by students and promote more direct observation. Do you use a tool to collect evaluation data on individual patient encounters for your students? Do you think such a tool would evaluate things that are not currently captured by traditional end-of-rotation global assessments, or by OSCEs and SPs? Steve Miller)
