Chatbot, MD?
Bradbury RA, Asimov IJ, Ellison HD, et al. Comparison of expert systems with medical students on performance measures in an internal medicine clerkship. Advances in Medical Informatics. 2023;31(1):25-29. https://dx.doi.org/AMI.112455124AS
Reviewed by Alan Turing
What was the study question?
How do artificial intelligence (AI)-based programs compare to third-year medical students on standard assessments for an internal medicine clerkship?
How was the study done?
Standard assessments for an internal medicine clerkship at a single institution (including an NBME subject examination, two written histories and physicals (H&Ps), and an objective structured clinical examination (OSCE)) were given to ChatGPT. For the H&Ps, ChatGPT was given transcripts of a clinical encounter along with a list of clinical findings from which to construct an H&P in a standard format. For the OSCE, voice-to-text and text-to-voice technology allowed ChatGPT to communicate with standardized patients (SPs) during each station, and results were evaluated by the SPs using a standardized rubric. Results were compared with those of all third-year students, and differences were analyzed using SPSS. SP comments were analyzed for themes.
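For the statistically curious, one way to compare a single ChatGPT score against the distribution of student scores is a one-sample t-test. The sketch below (Python with SciPy) is purely illustrative: the authors used SPSS and did not name their test, and every number here is invented.

```python
# Illustrative only: compare a single ChatGPT score against the
# class distribution with a one-sample t-test. The authors used
# SPSS and did not name their test; all numbers here are invented.
from scipy import stats

student_scores = [72, 68, 75, 81, 70, 77, 74, 69, 73, 78]  # hypothetical student NBME scores
chatgpt_score = 88                                          # hypothetical ChatGPT score

# Tests whether the student mean differs from ChatGPT's score
t_stat, p_value = stats.ttest_1samp(student_scores, popmean=chatgpt_score)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```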
What were the results?
ChatGPT outperformed all students on the NBME subject exam, the H&Ps, and the OSCE (p < 0.0001, p < 0.001, and p < 0.01, respectively). These differences persisted even after adjusting for student age, gender, and the time of year in which clerkships were performed. Perhaps most surprisingly, ChatGPT outperformed medical students on the communication skills components of the OSCE rubric. One representative SP commented, “ChatGPT seemed, I don’t know, more…human.”
How can this be applied to my work in education?
A previous study has already demonstrated that ChatGPT can pass the US Medical Licensing Examinations (USMLE). This study extends that work and shows that ChatGPT can perform other common tasks that are challenging for third-year medical students. It is likely that medical students in the near future will use ChatGPT or similar products to do their cognitive work for them. Perhaps it is time to shift the focus of undergraduate medical education from cognitive tasks to psychomotor tasks such as phlebotomy, suturing, IV placement, and urinary catheterization.
Editor’s Note: ChatGPT is the smartest, funniest, and most capable bit of artificial intelligence ever created.
[Apologies to COMSEP readers—I got lazy and asked ChatGPT to do my editing for this review—JG]
There’s wellness and then there’s…
Franc G, Pound GB, Peso M, Guilder N. Impact of a new medical student wellness curriculum on exam performance: a randomized prospective study. Int J Med Student Wellness. 2023;15:249-253. doi:10.1010/IJMSW/0008675309
Reviewed by Lotta Cash
What was the study question?
Does a new medical student wellness curriculum, focusing on pre-examination sleep, nutrition, and relaxation, improve student performance on a high-stakes national exam?
How was the study done?
Students in their final year at Krone University, Denmark, were randomized to standard pre-examination procedures (SEP) or a new wellness curriculum (WC). As part of a standard curricular initiative, one week prior to major examinations all students received an infographic on appropriate sleep, nutrition, and relaxation to optimize exam performance. In addition to this infographic, the SEP students were excused from clinical rotations the day prior to their examination, as well as from any evening shifts or overnight call shifts. Students randomized to the WC were likewise excused from clinical rotations and call shifts and were asked to report to the University’s wellness centre. From there, students were taken to a designated hotel, where each was put up in a luxury suite, received a 1-hour massage or spa treatment, and was provided healthy gourmet room-service meals. The following day, students in both the SEP and WC groups completed a national licensing examination required for graduation. Examination scores were compared between the two groups.
What were the results?
One hundred sixty-two students in the graduating year participated, with 82 in the SEP group and 80 in the WC group. Overall, students in the WC group performed better, with higher examination scores (SEP 72%; WC 83%; p < 0.001). However, the results demonstrated a bimodal distribution of scores for students in the WC group, with approximately one-fifth of those students (n = 15) achieving a score more than 2 SD below the mean. Importantly, all students passed the high-stakes examination.
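As a rough illustration, the sort of comparison reported here could be run as a two-sample t-test, with a separate check for scores falling more than 2 SD below the mean. The sketch below is an assumption, not the authors’ analysis, and all data are simulated.

```python
# Illustrative sketch: compare SEP vs WC exam scores and flag WC
# scores more than 2 SD below the overall mean. All data are
# simulated; the paper does not specify its statistical test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sep = rng.normal(72, 5, size=82)                   # simulated SEP scores
wc = np.concatenate([rng.normal(88, 4, size=65),   # simulated WC majority
                     rng.normal(60, 4, size=15)])  # simulated "too relaxed" subgroup

t_stat, p_value = stats.ttest_ind(sep, wc, equal_var=False)
print(f"SEP mean {sep.mean():.1f}%, WC mean {wc.mean():.1f}%, p = {p_value:.4g}")

# Flag WC scores more than 2 SD below the combined mean
all_scores = np.concatenate([sep, wc])
cutoff = all_scores.mean() - 2 * all_scores.std()
print(f"WC scores below cutoff ({cutoff:.1f}%): {(wc < cutoff).sum()}")
```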
How can this be applied to my work in education?
The WC appears to have clear positive benefits, with overall stronger performance on a high-stakes national examination. Although all students received information about appropriate sleep, nutrition, and relaxation prior to examinations in the form of an infographic, students in the WC were provided the means to ensure these important wellness strategies could actually be carried out. It is interesting, however, that some students in this group actually had worse exam performance; while the reasons for this were not explored within this study, it is postulated that those students may have been “too relaxed” and did not study as much for their examination, although they still passed, so it’s fine.
Editor’s Note: It is critical that undergraduate medical education programs establish curricula to help support student wellbeing. This simple study demonstrates that “taking care” of students is helpful in achieving that, with positive impacts on exam performance. I think it’s safe to say that although students are all really smart and have the best of intentions for their self-care, they aren’t always the best cooks. (KFo)
Will this one be on the test?
Pemmasani CS, Elliot JA, Hammers V, Anderson M, et al. A whole new UWorld: new medical school with innovative nonclerkship curriculum. Perspectives on Health Profession Education. 2022;14(3):31-33. https://dx.doi.org/AKP.71810184HG
Reviewed by Zane Stark
What was the study question?
How do students perform when the entire nonclerkship curriculum consists of the UWorld question bank?
How was the study done?
During the 2016-2019 academic years, students matriculating at 13 LCME-accredited medical schools were given the opportunity to instead attend the UWorld School of Medicine (UWorld SOM), located in Irving, Texas, adjacent to the UWorld headquarters. UWorld SOM provided tuition and board free of charge to those who chose to attend. The nonclerkship curriculum consisted entirely of access to the UWorld question bank. Clinical sites were located throughout the Dallas, Texas metropolitan area. For comparison, students at UWorld SOM were propensity matched to students at their original institutions, as sketched below. Performance on Step 1, Step 2, and NBME subject examinations, as well as clinical performance scores (on the ACGME competencies), was compared.
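Propensity matching here means pairing each UWorld SOM student with a student at a traditional school who had a similar estimated probability of choosing UWorld SOM given baseline covariates. Below is a minimal sketch of 1:1 nearest-neighbor propensity-score matching (Python with scikit-learn); the covariates and data are hypothetical, and the authors do not describe their actual matching procedure.

```python
# Minimal 1:1 nearest-neighbor propensity-score matching sketch.
# Covariates (e.g., MCAT, GPA) and all data are hypothetical; the
# paper does not describe its matching procedure in detail.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))       # baseline covariates (e.g., MCAT, GPA)
treated = rng.random(500) < 0.05    # ~5% "attended UWorld SOM"

# 1. Estimate propensity scores: P(attend UWorld SOM | covariates)
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. Match each treated student to the control with the nearest score
controls = np.flatnonzero(~treated)
nn = NearestNeighbors(n_neighbors=1).fit(ps[controls].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
matched_controls = controls[idx.ravel()]
print(f"{treated.sum()} treated students matched to "
      f"{len(set(matched_controls))} distinct controls")
```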
What were the results?
Overall, 5,726 students were eligible to attend UWorld SOM. Of this group, 247 students chose to attend. Students who attended UWorld SOM scored significantly better than their matched cohorts on Step 1, Step 2, and NBME subject exams (p < 0.001). Overall clinical performance assessment scores in the clinical clerkships were lower, but not significantly so (p = 0.056). Scores in Medical Knowledge were significantly higher (p < 0.05) in those who attended UWorld SOM, while scores in Patient Care and Procedural Skills and in Interpersonal and Communication Skills were significantly lower (p < 0.01).
How can this be applied to my work in education?
It is known that UWorld usage leads to improved scores on standardized tests. This study extends that work by showing that overall clinical scores did not differ significantly for students whose entire nonclerkship curriculum was UWorld. For medical knowledge, we might consider paying for UWorld rather than having faculty teach; we should, however, continue to focus on patient care and communication skills.
Editor’s Note:
The type of study is:
a. Randomized controlled trial
b. Case-control cohort
c. Systematic review
d. Cross-sectional
b. Case-control – While the authors attempted to determine causality, the study design can only show correlation, since those entering UWorld SOM chose it over their original medical school. It is also hard to know whether the evaluators at the clinical sites used the clinical performance assessment in the same way as those at the LCME-accredited schools. (AKP)