Theme 2 | Assessment technologies and methods
Chair: José Miguel Pêgo, Co-chair: Carlos Collares | 15th October 2020, 12:30 - 14:30 (UK time)

Using Computer Based Virtual Scripts to Assess Clinical Capabilities in Athletic Training Students

* Paul Geisler, Patrick McKeon
* Corresponding author: Paul Geisler, Ithaca College (United States), pgeisler@ithaca.edu

Background
Medical and health professions educators have long documented student competency or capability in specific domains of knowledge and skill germane to their fields. Increasingly, educational outcomes are mandated by a combination of internal and external forces, and curriculum designers and educators are responsible for developing, implementing, and disseminating effective methods for documenting student progress towards autonomous, entry-level practice. Athletic training (US) and athletic therapy (Canada and Ireland) are no exception regarding these essential responsibilities. Documentation of competency in the myriad domains relevant to professional practice is required for accreditation, institutional and program compliance, and profession-specific ethical and civic responsibilities. As part of our assessment-as-learning approach to programmatic assessment, we have implemented a series of no-stakes, computer-based, virtual illness and injury script modules to assess our students' clinical capabilities in nine specific domains of practice. Using Qualtrics™, we have designed integrated virtual case studies to assess the "shows how" level of Miller's competency pyramid. Each module is designed to challenge and assess basic and clinical science knowledge, clinical reasoning, and management of common conditions in athletic healthcare.

Summary of Results
Over the last 4 years of cycled administration, 100% of our students were capable of recognizing and managing commotio cordis, 94% were capable of prescribing an evidence-informed therapeutic plan of care for patellofemoral pain syndrome, 90% were capable of assessing and managing a case of exercise-induced laryngeal obstruction, 85% were graded as capable of recognizing and managing exertional heat illness, 80% were capable of evaluating and treating influenza, and 70% were capable of using evidence to make clinical decisions and formulate policy. Over the last 9 years, 100% of our students have passed their national board examinations on the first attempt, allowing them to gain state licensure and qualifying them for clinical practice.

Discussion
We use nine macro clinical capabilities as our ultimate program outcome measures, all representative of integrated clinical skills, knowledge, and attitudes from our professional scope of practice (athletic training). Annual sampling with our integrated virtual capability modules has helped identify deficiencies in student learning, allowing us to provide deliberate feedback to students before they transition into autonomous practice and to identify gaps in curricular content and/or delivery that need attention.

Conclusion(s)
Virtual platforms allow for realistic and progressive script presentations that can closely mimic clinical thinking and decision making, can incorporate elements of evidence-based practice, and can include pictures and/or videos, tables, and other data presentation formats. They also prevent students from looking forward or backward to glean more information or change initial answers, and they improve data management.

Take-Home Message
Virtual, computer-based platforms can be used to present injury/illness scripts in authentic ways and are acceptable indirect measures of the "shows how" level of clinical competence. They also provide valuable feedback to both students and educators, improving student learning and helping to identify necessary programmatic adjustments and curricular interventions.

References
Charlin B, van der Vleuten C. Standardized assessment of reasoning in contexts of uncertainty: The script concordance approach. Eval Health Prof. 2004;27:304-319.
Charlin B, Roy L, Brailovsky C, Goulet F, van der Vleuten C. The script concordance test: A tool to assess the reflective clinician. Teach Learn Med. 2000;12:189-195.
Hayward J, Cheung A, Velji A, Altarejos J, Gill P, Scarfe A, Lewis M. Script-theory virtual case: A novel tool for education and research. Med Teach. 2016;38(11):1130-1138.
Huwendiek S, Reichert F, Duncker C, de Leng BA, van der Vleuten C, Muijtjens AMM, et al. Electronic assessment of clinical reasoning in clerkships: A mixed-methods comparison of long-menu key-feature problems with context-rich single best answer questions. Med Teach. 2017;39(5):476-485.
Lubarsky S, Charlin B, Cook DA, Chalk C, van der Vleuten CP. Script concordance testing: A review of published validity evidence. Med Educ. 2011;45:329-338.
Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 Suppl):S63-S67.
McNeil HP, Hughes CS, Toohey SM, Dowton SB. An innovative outcomes-based medical education program built on adult learning principles. Med Teach. 2006;28:527-534.
Neve H, Hanks S. When I say…capability. Med Educ. 2016;50:610-611.
Sam AH, Hameed S, Harris J, Meeran K. Validity of very short answer versus single best answer questions for undergraduate assessment. BMC Med Educ. 2016;16(266):1-4.
See KC, Tan KL, Lim TK. The script concordance test for clinical reasoning: Re-examining its utility and potential weakness. Med Educ. 2014;48:1069-1077.

© Copyright 2020 The European Board of Medical Assessors (EBMA)