Section D

Performance-Based Evaluation of Clinical Competence:
The Objective Structured Clinical Examination (OSCE)

Linda Shaw, M.D.

Overview
The Objective Structured Clinical Examination (OSCE) format has students rotate through a series of stations where clinical skills are assessed. Interest in its use evolved from the realization that oral clinical examinations were too subjective, while written examinations tested only knowledge. Acceptance of its use, in the conservative tradition of medical education, may rest on the requirement of the Liaison Committee on Medical Education (LCME) that faculty must directly observe and assess core clinical skills and behaviors. Roadblocks to the OSCE's widespread use have been the large expenditures of faculty time and resources required for its development and implementation. However, its potential for stimulating faculty interest in the curriculum makes it an exciting format.

Description and Rationale for Use
The OSCE is a format for evaluation. The methods of assessment in an OSCE can include written questions, oral questions, clinical observation, and the use of standardized patients. An OSCE uses multiple stations, each with a specific time limit, that every student passes through. At the stations, basic clinical skills, including procedural, problem-solving, and counseling skills, are evaluated. Examples of stations include: 1) taking a history or performing a physical examination on a simulated patient; 2) interpreting x-rays, microscopic slides, or EKGs; and 3) analyzing diagnostic or management data. Stations might require answering a set of written or oral questions. A station may use a trained examiner scoring against a standardized checklist. Actual or simulated patients can serve as trained examiners, or a separate examiner can be used.
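
To make the station-rotation mechanics concrete, the sketch below models a small OSCE in Python. It is illustrative only: the station tasks, time limits, and examiner assignments are hypothetical, not drawn from the text.

```python
from dataclasses import dataclass

@dataclass
class Station:
    task: str      # the clinical skill assessed at this station
    minutes: int   # the station's fixed time limit
    examiner: str  # who scores the student

# Hypothetical stations, echoing the examples in the paragraph above.
stations = [
    Station("Take a focused history from a simulated patient", 10, "standardized patient"),
    Station("Interpret a chest x-ray", 5, "faculty with standardized checklist"),
    Station("Counsel a parent about fever management", 10, "faculty with standardized checklist"),
]

students = ["Student A", "Student B", "Student C"]

# Classic OSCE rotation: in round r, student i sits at station (i + r) mod n,
# so every student passes through every station and no station is double-booked.
for r in range(len(stations)):
    print(f"Round {r + 1}:")
    for i, student in enumerate(students):
        st = stations[(i + r) % len(stations)]
        print(f"  {student} -> {st.task} ({st.minutes} min, scored by {st.examiner})")
```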

In February 1991, the Liaison Committee on Medical Education published changes to its accreditation standards for medical schools. These changes included the requirements that "the evaluation of student achievement must employ a variety of measures of knowledge, competence and performance, systematically and sequentially applied throughout medical school" and that "institutions must develop a system of assessment which assures that students have acquired and can demonstrate on direct observation the core clinical skills and behaviors needed in subsequent medical training" (Liaison Committee on Medical Education, 1991).

The LCME changes seem to reflect an evolving recognition that medical schools need more appropriate techniques for evaluating the clinical (performance) skills of their students. As early as 1885, Dr. Osler wrote that we need to pay more attention to the "practical portions" of examinations of students of medicine. Multiple-station laboratory examinations, for example in anatomy and pathology, were familiar models used in medical schools. Dr. Ronald Harden of the University of Dundee in Scotland used this model in the mid-1970s to develop a clinical medicine version of such examinations, i.e., the OSCE. By adding the OSCE to his "examiners' toolbox," he and his colleagues hoped to create a more valuable examination that was practical, reliable, and valid. Over the past 20 years, performance-based examinations have continued to be developed to enhance our ability to assess psychomotor, problem-solving, attitudinal, and communication skills. These skills are not well assessed on written examinations, which test cognitive skills.

The evolution of evaluation methods was encouraged by a gradual recognition of the need to change traditional medical school curricula to better teach the skills required of physicians. Some educators realized that perhaps the most effective way to change the curriculum was to change the methods of assessment. Over the last 15 years, several medical education conferences have focused on teaching clinical competency. With the recognition that evaluation drives learning, teaching physical diagnosis, interviewing, and problem solving became important curricular issues. In 1984, the AAMC recognized that the medical curriculum must specifically include the synthesis and application of knowledge in clinical settings. It acknowledged that the curriculum must teach students to interact effectively with patients. Currently, the AAMC is still concerned that mandating clinical assessment examinations may be too costly and too complicated for all medical schools, but the organization heartily supports requiring some kind of performance-based clinical examination.

Strengths and Weaknesses

Strengths: An OSCE focuses on the ability to synthesize and apply knowledge in clinical settings, as well as to interact effectively with patients. Presumably, students will then learn these skills because the examination influences what is to be learned on the clerkship. That is, the student will focus on the ability to gather data, analyze it, and make justifiable conclusions. A strength of OSCEs is in testing motor, interpretive, and clinical integration skills. Importantly, students' test performances can point out flaws in the curriculum and lead to changes in teaching. Ideally, faculty interest in medical education and methods of teaching will be stimulated by OSCE feedback from students and medical educators.

Traditional methods of clinical observation by faculty and residents are subject to poor inter-rater reliability, the influence of irrelevant attributes of the student, the halo effect, etc. The OSCE attempts to overcome the low reliability and poor validity of evaluation by direct observation. It controls for variables that are otherwise uncontrolled: 1) case difficulty, 2) the differing range of focus and standards of evaluators, 3) lack of agreement on acceptable performance, and 4) the collective knowledge (of peers, residents, and faculty) contaminating the student's own knowledge. In reality, evaluators rarely observe students with real patients, and it can be difficult to find appropriate patient problems to assess in clinic or inpatient settings. During rotations, interpersonal skills are rarely formally assessed, and no performance criteria are typically used to assess them in the clinical setting. So the OSCE is an effort to make evaluation "more authentic," that is, to examine those behaviors that are important.
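
The inter-rater reliability problem can be made concrete with a standard agreement statistic. The sketch below is illustrative only: the checklist scores are hypothetical, and Cohen's kappa is one common way to quantify agreement, not a method prescribed by this text.

```python
def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters on binary checklist items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected if each rater marked items independently
    # at their own base rates.
    p_a, p_b = sum(rater_a) / n, sum(rater_b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

# Two faculty score the same student on a 10-item "done / not done" checklist.
faculty_1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
faculty_2 = [1, 0, 0, 1, 1, 1, 1, 0, 1, 0]
print(round(cohens_kappa(faculty_1, faculty_2), 2))  # 0.35: only modest agreement
```

Raw agreement here is 70 percent, yet kappa is only 0.35 once chance agreement is removed; this is the kind of gap that standardized checklists and trained examiners are meant to narrow.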

Weaknesses: Development, implementation, and ongoing use of the OSCE take considerable resources. There is a large time commitment to develop and run OSCEs; therefore, a busy faculty must embrace the project. Typically, both individual and committee work is required to make sure each station is consistent with the objectives of the curriculum. Not only is the development phase time- and labor-intensive, but the implementation of the exam requires significant organization and manpower. (Some believe that in order for the exam to be reliable and valid, it must last 2.5 to 4 hours.) Careful sampling techniques must be employed to decide on the examination content. There must be a balance between broad sampling of skills and practicality. The stations must not focus on inappropriately minor aspects of a clinical skill. And it does not necessarily follow that because a student can pass a skill at a station, the student can appropriately respond to related patient problems. Also, the issues of subjectivity of scoring and inter-rater variability are never completely overcome by this format. Problems with equipment, e.g., failures of microscopes, blood pressure cuffs, lighting, etc., can occur. Test security, that is, the impact of repeated use of a particular station, may present a problem. To overcome this, the use of repeat stations in any one OSCE administration should be limited.
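
A back-of-the-envelope calculation suggests the scale such an examination implies. All numbers below are assumptions for illustration; only the 2.5-to-4-hour figure comes from the paragraph above.

```python
station_minutes = 10  # assumed fixed time limit per station
exam_hours = 3        # within the quoted 2.5-4 hour range

# One student occupies one station per time slot, so a full rotation needs
# as many stations as there are time slots in the session.
n_stations = (exam_hours * 60) // station_minutes
print(f"{n_stations} stations of {station_minutes} minutes each")  # 18 stations

# With one trained examiner staffing each station for the whole session:
examiner_hours = n_stations * exam_hours
print(f"{examiner_hours} examiner-hours per administration")       # 54 hours
```

Even under these modest assumptions, a single administration consumes dozens of examiner-hours, which is the "significant organization and manpower" referred to above.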

Appropriate Applicability – Implementation Strategy

There is little documentation of the applicability of the OSCE as an evaluation tool in pediatric clerkships. Its use as part of a summative evaluation requires further investigation. Nonetheless, the OSCE has been used for a range of purposes. It is being used as a teaching tool and for formative evaluation in introduction-to-clinical-medicine courses. There are anecdotal examples of its use for skills assessment at the end of clerkships. It is being developed as an evaluation of minimal clinical competency for graduation at many medical schools. Performance-based components are being developed as part of licensing examination procedures nationally and internationally.

Programs interested in implementing OSCEs have several avenues to explore. In 1981, the Clinical Skills Assessment Alliance (CSAA) was established. This alliance, made up of eight leading medical professional organizations, has elaborated a plan to comprehensively develop and promote reforms in evaluating the competency of physicians. The Special Interest Group on Standardized Patients of the AAMC's Group on Educational Affairs has also been active in promoting the evaluation of clinical competencies. The Macy Foundation has financed five regional consortia, covering 20 medical schools, which share resources in developing and implementing performance-based clinical examinations.

In addition to consulting these groups, there are several publications that describe the development and implementation of OSCEs (see References). Clerkship directors may want to engage a consultant from one of the programs that have experience employing an OSCE. For example, educators from Southern Illinois University, the University of Texas Medical Branch, the University of North Carolina, the Henry Ford Health System, the University of Virginia, and the University of West Virginia have published reports on their use of OSCEs. At the 1994 Spring APA meeting, a workshop was offered on the anatomy of an OSCE. This workshop on the planning, production, and administration of an OSCE was presented by faculty from the Department of Pediatrics, Henry Ford Health System and St. Joseph's Mercy Hospital in Michigan.

Anecdotal Examples

In response to a Pediatric Clerkship Evaluation Survey done by COMSEP members in the fall of 1993, about seven respondents indicated that the OSCE is being used in their institutions. A set of questions about their use of the OSCE was later sent to each of those programs. At the University of Alberta, Canada, an OSCE is used at the end of the pediatric clerkship. Each OSCE lasts three hours and is repeated every eight weeks. Staff developed the OSCE after instructional seminars; they believe it is a valuable tool which has replaced the traditional oral examination. The University of Massachusetts uses a two-station OSCE in the evaluation of its pediatric students. It uses the OSCE because the format correlates with its clerkship goals and objectives; per 100 students, the cost is estimated at $3,000. The Mayo Clinic Department of Pediatrics is using an OSCE, but its 3-4 pediatric stations are incorporated into the medical school's Year-3 and Year-4 clinical skills examination. Mayo faculty describe the process as "time-consuming" to develop and administer, and they believe it is an expensive modality for testing; however, they feel it meets the LCME requirement for testing integrated skills. Other COMSEP members indicating they use an OSCE for clerkship evaluation were Dartmouth, Hahnemann, McGill, the University of Illinois at Peoria, and the University of Manitoba at Sherbrooke.

Cost/Resources Required

Estimates of the costs to develop, implement, and maintain an OSCE in a clerkship program are difficult to make. They have been quoted as ranging from $60 to $900 per examinee. The estimates may vary because the full range of activities needed to develop and implement an OSCE, e.g., faculty time to determine objectives, recruitment and training of patients, recruitment and training of evaluators, and logistical arrangements (equipment, space, etc.), may not be consistently included. Supporters argue that when compared to the true cost of using actual patients and increasing faculty involvement in realistic observation of students, an OSCE is cost effective. Expense estimates must include secretarial and administrative support, payment to simulated patients, space for the examination, etc. The most significant resource appears to be the faculty time commitment to develop and maintain the OSCE. An educational consultant knowledgeable about validity, reliability, sensitivity, and specificity may be a necessary resource. Reportedly, finding children for use as patients or simulated patients can be difficult. Station examiners are often less problematic to engage: residents, faculty, other students, allied health professionals, etc. can be trained as evaluators.
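
The spread in the quoted estimates is easy to reproduce with a simple calculation. In the sketch below, every dollar figure and line item is an assumption for illustration; only the $60-to-$900 range comes from the text.

```python
n_examinees = 50  # assumed size of one clerkship cohort

# Hypothetical line items; the first two are the kind of costs the text
# notes are not consistently included in published estimates.
line_items = {
    "faculty time (objectives, station writing)": 6000,  # often omitted
    "examiner recruitment and training":          1500,  # often omitted
    "standardized patient payments":              2500,
    "secretarial/administrative support":         1200,
    "space and equipment":                         800,
}

direct_only = (line_items["standardized patient payments"]
               + line_items["space and equipment"])
full_cost = sum(line_items.values())

print(f"direct costs only: ${direct_only / n_examinees:.0f} per examinee")  # $66
print(f"all costs counted: ${full_cost / n_examinees:.0f} per examinee")    # $240
```

Counting or omitting faculty time alone moves the per-examinee figure several-fold, which is consistent with the wide range reported.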


REFERENCES

  1. Liaison Committee on Medical Education. Functions and Structure of a Medical School. Washington, DC: AAMC and AMA, 1991.
  2. Joorabchi B. Objective structured clinical examination in the pediatric residency program. Am. J. Dis. Child. 145:757-762, 1991.

Other References

  1. Department of Pediatrics, Henry Ford Health System and St. Joseph Mercy Hospital. APA Workshop. Spring 1994.
  2. Reznick RK, et al. Guidelines for estimating the real cost of an objective structured clinical examination. Acad. Med. 68:513-517, 1993.
  3. Barrows HS. An overview of the uses of standardized patients for teaching and evaluating clinical skills. Acad. Med. 68:443-453, 1993.
  4. Mast TA, Barrows HS, eds. Special section: Annex to the proceedings of the AAMC Consensus Conference on the Use of Standardized Patients in the Teaching and Evaluation of Clinical Skills. Teach. Learn. Med. 6(1), 1994.
  5. Otten AL. Bermuda revisited: Impact of a conference, "Clinical education and the doctor of tomorrow," five years later. Josiah Macy, Jr. Foundation, 1994.
  6. Miller GE. The assessment of clinical skills/competence/performance. Acad. Med. 65:S63-S67, 1990.