April 2024

Hello COMSEP!

Many of us are now facing the ‘post-conference blues’, that inevitable period in which energy, enthusiasm, and a basketful of new ideas run headlong into the quicksand of our obligations and daily work.

Don’t lose that momentum!  Take out your notes from the annual meeting and remind yourself of what you wanted to accomplish.  Pick one small project and take the first step to making it happen.

Read this edition of the COMSEP Journal Club to see what other medical educators are up to. And, if you are curious and motivated, volunteer to write a review for a future edition. We are now looking for reviewers for summer and fall. Reach out to COMSEP to volunteer or if you have any questions.

Enjoy!

Amit, Jon and Karen

Choosing chief residents

Mirabal, S.C., Wright, S.M. & O’Rourke, P. The selection of chief residents across residency programs at a large academic medical center. BMC Med Educ 23, 931 (2023). https://doi.org/10.1186/s12909-023-04896-9

Reviewed by Juliana Fan

What was the study question?

How are chief residents (CRs) evaluated and chosen by residency program leadership across graduate medical education (GME) programs?

How was it done?

This single-center study used 1:1 semi-structured interviews with the leadership of all 21 GME programs that have chief resident roles. Deidentified transcripts were coded by two investigators to identify and categorize meaningful themes.

What were the results?

The study identified four salient themes.

  1. The first was identification of candidates, including the timing of candidate interest; nomination by peers or others; and clinical, teaching, or personal qualities. Program directors felt that assessment of teaching and clinical skills was mostly subjective, since these are gauged through evaluations. Desirable personal qualities included the ability to communicate and to manage conflict.
  2. The second was the expression of intent or interest on the part of the residents and the evaluation of candidates; many programs asked for applications and conducted interviews.
  3. The third was the selection of candidates, which included parameters such as voting, discussion among leadership, and program director preferences. Some programs asked all residents to vote, whereas others involved only the leadership team.
  4. The final theme that emerged was an overall sense of high confidence in the CR selection process and outcomes.

How can this be applied to my work in education?

This study encourages program leadership at other institutions to reflect on their selection processes, noting especially that most measures used were subjective, with only some objective criteria. Since CRs often go on to remain involved in academic medicine, the study emphasizes the importance of being cognizant of how biases may affect the selection process and encourages prioritizing diversity and inclusivity. It would be interesting to see this work replicated at other institutions to examine similarities across GME programs.

Editor’s Note: This study really makes us all reflect on how many of our selection processes may have little objectivity. I know that for any selection process I am involved in going forward, I will try to gather as much objective evidence as possible beforehand. (AKP)

Growth mindset: true and false

Memari M, Gavinski K, Norman MK. Beware False Growth Mindset: Building Growth Mindset in Medical Education Is Essential but Complicated. Acad Med. 2024 Mar 1;99(3):261-265. doi: 10.1097/ACM.0000000000005448.

Reviewed by: Chas Hannum & Antonia Kopp

What is mindset theory?

Based on the work of Carol Dweck, mindset theory explores how learners’ beliefs about intelligence affect their behavior in learning environments. It distinguishes between a growth mindset and a fixed mindset, and Dweck believes that every individual is a blend of the two. In the educational setting, those who apply growth mindset theory draw attention to the process of improvement and how it can drive performance.

What was the main question this article was addressing?

What is the current application of mindset theory within medical education? What are common misunderstandings about growth mindset that would interfere with its application?

What is the current application?

Growth mindset theory can be used to reduce stressors often experienced by students - imposter syndrome, burnout, and struggles with learning. Coaching frameworks and normalizing a culture of feedback can strengthen students’ commitment to lifelong learning. Additionally, growth mindset theory can help align performance within competency-based assessment.

What is false growth mindset? 

A term that Dweck coined referring to common misunderstandings of her work, including:

  • Simply referring to positive personal traits (e.g. flexibility) as growth mindset: it is not just the trait or characteristic but its application to a challenge that represents growth mindset.
  • Believing that growth mindset is only about effort: seeking feedback and guidance are equally important.
  • Telling learners that they “can accomplish anything”: goal setting should be realistic.
  • Believing that it is up to learners alone to correct a fixed mindset: students depend on the learning environment, educators, and organizational supports for guidance on developing and applying growth mindset thinking.

How can I apply this to my work in education?

Educators can work to foster a growth mindset both in their own individual teaching as well as in the larger environment in which they work. It is paramount to be cognizant of false growth mindset misunderstandings. Normalizing struggles and growth for physicians, challenging assessment strategies to focus on process in addition to outcomes, and mindset theory training for faculty can promote engaged, resilient and growth-oriented learners.

Editor’s Comments: This thought-provoking article is a must-read for all medical educators. Key messages that I took from it included (1) that mindsets are domain-specific and may change over time and in different circumstances; and (2) that it can be challenging as an educator to foster a growth-mindset culture when our institutional structures and environments are founded in a fixed mindset. If you’ve read this far, I would encourage you to read the full article! (KFo)

Testing a clinical reasoning tool

Hornos E, Pleguezuelos E, Bala L, Collares CF, Freeman A, van der Vleuten C, Murphy KG, Sam AH. Reliability, validity and acceptability of an online clinical reasoning simulator for medical students: An international pilot. Med Teach. 2024 Mar 15:1-8. https://dx.doi.org/10.1080/0142159X.2024.2308082

Reviewed and edited by Karen Forbes

What was the study question?

What are the reliability, validity, and acceptability of Practicum Script, an online simulation-based program for developing medical students’ clinical reasoning skills using real-life cases?

How was it done?

A multi-centre international pilot study was conducted with final-year medical students (n=2457) from 21 schools worldwide. Twenty clinical cases were used in a formative manner; students were asked to complete cases independently on their own time and could repeat a case if they failed to formulate 2 or more hypotheses or chose non-valid answer options in more than 2 of a case’s 5 clinical scenarios. Using psychometric analyses, reliability was calculated for three test domains: hypothesis generation, hypothesis argumentation, and knowledge application. Confirmatory factor analysis and measurement alignment were then performed to obtain validity evidence. At the end of the pilot period, an anonymous survey was sent to students who had completed at least 3 clinical cases.

What were the results?

1502 registered students (61.13%) answered at least 80% of the cases, and 1430 (58.20%) completed all 20 cases, with an average completion time of 24 minutes per case including review of feedback. The mean number of plausible diagnostic hypotheses given by participants was 1.84 (+/- 0.63), with 80.3% matching those of experts. Reliability estimates for the three test domains were high, ranging from 0.78 to 0.93. Validity evidence revealed acceptable goodness-of-fit indices for the three-factor model, with moderate to high significant correlations between constructs. Of the 1952 students who completed at least 3 clinical cases, 380 (19.47%) responded to the anonymous survey; 89.80% rated the experience with the cases as excellent or good, and 82% recommended incorporating Practicum Script into the UME curriculum.

How can this be applied to my work in education? [including Editor’s comments (KFo)]

This educational tool allows students to practice clinical reasoning with expert-developed cases based on real patients, with their associated uncertainty, and to receive in-the-moment corrective feedback. Analysis of reliability, validity, and acceptability based on a large number of student participants revealed strong measures. I went to the online information site https://www.practicumscript.education/en_en/how-to-train-the-reasoning-clinic to learn more about this simulation. I was quite impressed with how the case simulations progress (no conflicts of interest to disclose here), employing both heuristic and hypothetico-deductive reasoning in different aspects of the case, and incorporating degree of certainty as well as comparison with and feedback from experts. In terms of applicability to pediatrics, the website shows that pediatric cases/courses are available, although currently only in Spanish. It appears that one can purchase access to the cases, but I could not determine the actual cost (and it was not actually available for me to purchase). The authors note that Practicum Foundation is a non-profit developer of Practicum Script. So, while this online simulation tool appears to have great potential for helping students develop clinical reasoning, it may not be ready for widespread use with our students just yet.