COMSEP Journal Club
April 2021
Editors: Karen Forbes, Jon Gold and Randy Rockney

 

Working on Feedback

An Educational Intervention to Increase Student Engagement in Feedback. McGinness HT, Caldwell P, Gunasekera H, Scott KM. Medical Teacher 2020; 42(11): 1289-1297.

https://doi.org/10.1080/0142159X.2020.1804055

 

Reviewed by Stephen Weimer

 

What was the study question?

Does a brief feedback workshop given to medical students before their pediatric rotations improve student self-directed feedback behaviors and satisfaction?

 

How was it done?

The authors designed and implemented a feedback workshop for three student cohorts at the start of a pediatric clinical experience at a single institution. The workshop addressed the students’ role in feedback and the challenges of feedback in the clinical setting, along with coping strategies. The workshop was piloted on a volunteer basis in Cohort 1; post-workshop surveys were used to improve the workshop and shorten it from 60 to 30 minutes. The workshop was then given to all students in Cohorts 2 and 3. Pre- and post-intervention surveys were administered to all participants and included quantitative and qualitative data.

 

What were the results?

The proportion of students who reported seeking feedback weekly or more often increased after the workshop (Cohort 2: 50% pre-intervention vs. 82% post-intervention, P=0.004; Cohort 3: 44% vs. 65%, P=0.031). The percentage of students who reported satisfaction with feedback also increased (Cohort 2: 23% to 65%, P=0.002; Cohort 3: 40% to 70%, P=0.003). Qualitative data suggested that the workshop made students more aware of their role in seeking feedback but also reflected challenges to the feedback process in the learning environment.

 

What are the implications?

A brief feedback workshop resulted in more students seeking and receiving feedback weekly or more often and improved their satisfaction with feedback.  Since feedback is an essential component of medical student education, medical schools should consider a feedback workshop for all medical students at the start of clinical training.

 

Editor’s Note: This brief intervention was based on the recent understanding of feedback in the education literature as an equal partnership between teacher and learner. It clearly improved the perceived quality and quantity of feedback, at least in the short term. It is a great example of how educators can take a scholarly approach to their work, as long as they plan and consider the meaning of their work in advance. Unfortunately, it did not seem to affect the utilization of feedback; perhaps longer or repeated interventions would help with that. (JG)

 

 

 

Verbal versus written feedback in a Malaysian medical school.

 

Feedback after OSCE: A comparison of face to face versus an enhanced written feedback. Ngim CF, Fullerton PD, Ratnasingam V, Arasoo VJT, Dominic NA, Niap CPS, Thurairajasingam S. BMC Med Educ. 2021 Mar 24;21(1):180. https://doi.org/10.1186/s12909-021-02585-z

 

Reviewed by Sanghamitra Misra

 

What was the study question?

After an OSCE, is immediate face-to-face feedback or individualized enhanced written feedback more beneficial for students?

 

How was it done?

For one cohort of medical students in Malaysia, immediate face-to-face feedback was provided after a formative OSCE in semester 1, while individualized enhanced written feedback (including a detailed breakdown of their grade and free-text comments along with a grading rubric) was provided in semester 2. Students and staff evaluated the feedback provided in both semesters, and student impressions were compared to their overall performance on the OSCE.

 

What were the results?

One hundred sixteen students completed both OSCEs; 96 completed the survey on face-to-face (FTF) feedback in semester 1 (response rate 82.8%), and 100 responded to the survey on enhanced written (EW) feedback in semester 2 (response rate 86.2%). Of the students, 75% preferred EW feedback and 23% preferred FTF. Students who preferred EW feedback had significantly lower OSCE scores than those who preferred FTF feedback (P=0.049). Fourteen examiners delivered both types of feedback; 8 preferred the EW format and 6 preferred FTF feedback.

 

What are the implications?

Providing useful and meaningful feedback is crucial to improving the clinical skills of medical students. Considering how feedback is delivered after exams such as OSCEs can help institutions create processes that fit their own educators and students.

 

Editor’s Note: The authors argue that one possible explanation for why students in this Malaysian medical school preferred the written feedback is that Asian students culturally prefer ‘teacher-centered feedback.’ It is interesting to consider this statement alongside the view of feedback as an ‘educational alliance,’ which is the focus of the article reviewed above. (JG)

 

 

Thinking about learning, learning about thinking

 

What Were You Thinking? Medical Students’ Metacognition and Perceptions of Self-regulated Learning. Versteeg M, Bressers G, Wijnen-Meijer M, et al. Teaching and Learning in Medicine 2021. doi: 10.1080/10401334.2021.1889559

 

Reviewed by Melissa Held

 

What was the study question?

What type of metacognitive competencies do final year pre-clinical medical students display during a conceptual learning task, and how do they perceive self-regulated learning in their curriculum?

 

How was it done?

This qualitative study used a think-aloud assignment followed by a semi-structured interview. Participants were recruited through purposeful sampling and included eleven final-year pre-clinical students. Students were asked to think aloud while solving four multiple-choice exercises on medical physiology. After each exercise, students were prompted to evaluate their conclusions and certainty. The follow-up interview explored how participants experienced the session, whether the questions were challenging, and how they went about solving the problems. A coding template was developed by the authors and refined during the interviews until consensus was reached.

 

What were the results?

Using their template, the authors identified whether difficulties occurred during the planning, monitoring, or evaluation phases of the problem-solving process. Students spent relatively little time on planning and evaluation, focusing instead on monitoring. Monitoring strategies included rereading, goal-checking, visualizing the situation, and eliminating answer options to reach the correct solution. Many students felt that learning facts alone was adequate for success in the pre-clinical training phase and used superficial cues to get to answers rather than metacognitive approaches.

 

What are the implications?

Students may need specific metacognitive training in how to plan, evaluate, and adjust their problem-solving practices. This reminded me of the Master Adaptive Learner cycle (Planning → Learning → Assessing → Adjusting). As students transition into clinical training, teaching and assessment of metacognitive and self-regulated learning may contribute to improved “clinical reasoning” and help students succeed as lifelong learners.

 

Editor’s comments: This study confirms the commonly held notion that “assessment drives learning” as students describe the value they assign to assessment outcomes, particularly as related to their need to learn factual knowledge. It would be interesting to brainstorm ways to integrate metacognition both in the pre-clinical and clinical learning environments, where students discuss their own thinking and learning. (KFo)

 

 

Life is a Highway

 

Traveling by winding roads or highways: Stability of medical students’ specialty preferences over time.  Querido  SJ, Wigersma L & ten Cate O. Medical Teacher. 2020; 42(11): 1298-1300. DOI: 10.1080/0142159X.2020.1804056

 

Reviewed by Kathryn Eckert

 

What was the study question?

How consistent is specialty choice through medical school and what influences the decision?

 

How was it done?

Medical students were interviewed four times over a three-and-a-half-year period, with 20 of 24 students completing all four interviews. The first interview took place at the beginning of the final study year, the second at the end of the final study year (i.e., around graduation), and the third and fourth at one and one-and-a-half years after graduation. At each interview, first and second career preferences were recorded, and a stability score was calculated using multiple data points over time. The scores were stratified into low stability (winding road), medium stability (country road), and high stability (highway) based on how career choice changed over time. It should be noted that this study took place in the Netherlands, where undergraduate medical education is a combined college/medical school experience over six years, with clinical exposure primarily in the last year of study. Additionally, it is common for students there to delay entering residency after medical school, with the opportunity to experience many specialties after graduation.

 

What were the results?

Students on the ‘highway’ path did not seriously consider many other options and often had clinical experience in their chosen specialty during undergraduate medical education. Most students in the ‘country road’ and ‘winding road’ categories chose a specialty they had not experienced during undergraduate medical education; they likely used the time between graduation and residency to explore a variety of specialty options. The predominance of female study subjects may have influenced specialty choices as well.

 

What are the implications?

While the medical education system in the Netherlands differs from the US system, the study suggests that exposure to specialties early in the medical school curriculum influences decision making for residency and future career paths. Therefore, preceptorships, pre-clerkship clinical curricula, and other clinical and research opportunities have significant influence and should be designed to ensure broad exposure for students.

 

Editor’s comments: While the authors describe three categories of student preferences, their results seem a bit more dichotomous to me: those who know what they want to do (highway path) and those whose preferences change over time. It is interesting to note that more than half of the interviewed students fit the latter category, suggesting that career exploration is indeed a valuable and needed aspect of undergraduate medical training. (KFo)