Cook DA, Stephenson CR. Validation of the Learner Engagement Instrument for continuing professional development. Acad Med. 2024;99(9):1024-1031. doi: 10.1097/ACM.0000000000005749.

Abstract

Purpose: Learner engagement is the energy learners exert to remain focused and motivated to learn. The Learner Engagement Instrument (LEI) was developed to measure learner engagement in a short continuing professional development (CPD) activity. The authors validated LEI scores using validity evidence of internal structure and relationships with other variables.

Method: Participants attended 1 of 4 CPD courses (1 in-person, 2 online livestreamed, and 1 either in-person or livestreamed) in 2018, 2020, 2021, and 2022. Confirmatory factor analysis was used to examine model fit for several alternative structural models, separately for each course. The authors also conducted a generalizability study to estimate score reliability. Associations were evaluated between LEI scores and Continuing Medical Education Teaching Effectiveness (CMETE) scores and participant demographics. Statistical methods accounted for repeated measures by participants.

Results: Four hundred fifteen unique participants attended 203 different CPD presentations and completed the LEI 11,567 times. The originally hypothesized 4-domain model of learner engagement (domains: emotional, behavioral, cognitive in-class, cognitive out-of-class) demonstrated the best model fit in all 4 courses, with comparative fit index ≥ 0.99, standardized root mean square residual ≤ 0.031, and root mean square error of approximation ≤ 0.047. Reliability for overall scores and domain scores was acceptable (50-rater G-coefficient ≥ 0.74) except for the cognitive in-class domain (50-rater G-coefficient of 0.55 to 0.66). Findings were similar for both in-person and online delivery modalities. Correlation of LEI scores with teaching effectiveness was confirmed (rho = 0.58), and a small correlation was found with participant age (rho = 0.19); other associations were small and not statistically significant. Using these findings, the authors generated a shortened 4-item instrument, the LEI Short Form.

Conclusions: This study confirms a 4-domain model of learner engagement and provides validity evidence that supports using LEI scores to measure learner engagement in both in-person and livestreamed CPD activities.
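For readers less familiar with the generalizability (G) coefficients reported above, the sketch below illustrates, under a deliberately simplified fully crossed presentations × raters design with simulated ratings, how variance components can be estimated and projected to a 50-rater G coefficient like those in the Results (≥ 0.74 for most domains). This is a minimal illustration only, not the authors' analysis, which additionally accounted for repeated measures by participants; the function name g_coefficient and the example data are assumptions for demonstration.

```python
import numpy as np

def g_coefficient(scores: np.ndarray, n_raters_projected: int = 50) -> float:
    """Estimate a relative G coefficient for a fully crossed
    presentations x raters design, projected to n_raters_projected raters.

    scores: 2-D array, rows = presentations (objects of measurement),
            columns = raters (participants). Fully crossed, no missing data.
    """
    n_p, n_r = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)          # presentation means
    col_means = scores.mean(axis=0)          # rater means

    # Two-way ANOVA sums of squares (one observation per cell)
    ss_p = n_r * ((row_means - grand) ** 2).sum()
    ss_r = n_p * ((col_means - grand) ** 2).sum()
    ss_total = ((scores - grand) ** 2).sum()
    ss_res = ss_total - ss_p - ss_r          # interaction + error (confounded)

    ms_p = ss_p / (n_p - 1)
    ms_res = ss_res / ((n_p - 1) * (n_r - 1))

    # Expected-mean-square estimates of the variance components
    var_res = ms_res                          # sigma^2_{pr,e}
    var_p = max((ms_p - ms_res) / n_r, 0.0)   # sigma^2_p, truncated at 0

    # Relative G coefficient for a design with n_raters_projected raters
    return var_p / (var_p + var_res / n_raters_projected)


# Hypothetical example: 20 presentations rated by 8 raters on a 1-5 item
rng = np.random.default_rng(0)
true_quality = rng.normal(4.0, 0.4, size=(20, 1))
ratings = np.clip(true_quality + rng.normal(0, 0.6, size=(20, 8)), 1, 5)
print(f"Projected 50-rater G coefficient: {g_coefficient(ratings, 50):.2f}")
```

Projecting the coefficient to a specified number of raters is why the Results report a "50-rater" value: averaging over more raters shrinks the rater-by-presentation error term, so the same variance components yield higher reliability at 50 raters than at 8.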

Questions

  1. Why is measuring learner engagement in in-person and livestreamed educational activities important to medical educators? How would you define an engaged learner in the activities you facilitate?
  2. The article described several tools, including the Learner Engagement Instrument (LEI), the Continuing Medical Education Teaching Effectiveness (CMETE) tool, a 2-item scale from the Maslach Burnout Inventory, and a 2-item digital savviness scale. Have you used any of these tools in your educational or research practice? What insights did this article provide about the LEI?
  3. How relevant do you find the authors' proposed 4-item LEI for your clinical teaching practice?

Comments

Measuring engagement across in-person and livestreamed CPD matters because we keep assuming attention equals learning, and the modality can hide disengagement in different ways. An engaged learner in my sessions is asking targeted questions, attempting the practice tasks, and circling back to apply the content rather than just rating the speaker highly. I haven’t used the LEI before, but I have used brief burnout items and I appreciated the validity work here, especially the consistent 4-domain fit and the weaker cognitive in-class reliability. The 4-item short form feels practical for routine feedback, though I’d still want a fuller instrument when evaluating major curriculum changes.
