Friday 8 November - Short Presentations
10:50-12:30, Room: 1.20, 2nd floor 

Reasoning Theme

Chair: Associate Professor José Miguel Pêgo, University of Minho 

On the face of it: Exploring the validity of spatial reasoning in the assessment of problem solving skills in the Biomedical Admissions Test (BMAT)

* Aaron Mortlock, Tania Clarke, Dr Sarah McElwee, Paul Crump, Alex Naughton, Thomas Iveson-Malcolm
* Corresponding author: Aaron Mortlock, Cambridge Assessment Admissions Testing, United Kingdom, mortlock.a@cambridgeassessment.org.uk 

Background 
Spatial reasoning is an essential aspect of human cognition, pertaining to one's ability to visualise, manipulate, maintain and retrieve visual-spatial information, including 2D or 3D shapes or structures. The BioMedical Admissions Test (BMAT) currently uses question types assessing candidates' spatial reasoning in Section 1 as part of the wider examination of candidates' problem-solving abilities. Section 1 elicits thinking skills students are expected to utilise during undergraduate medical and biomedical degrees, providing universities with supporting evidence to assist them with selection decisions.

Summary of work 
A review of the construct validity of the problem-solving skills elicited in BMAT (Cheung & McElwee, 2017) required stakeholder consultation on spatial reasoning to inform future test development. Spatial reasoning is often conceptualised as an aspect of fluid intelligence and frequently features in IQ tests. BMAT does not purport to test IQ; the inclusion of spatial reasoning therefore requires investigation of its construct and face validity. Quantitative analysis revealed that spatial reasoning questions display measurement characteristics similar to those of the other problem-solving sub-skills assessed, but the cognitive validity of these question types for biomedical study is not well understood. Semi-structured interviews used existing spatial reasoning questions to elicit admissions tutors' and test-takers' views and sought to understand how stakeholders conceptualise the construct.

Summary of results 
Stakeholders held different views regarding the relevance of spatial reasoning and its usefulness in undergraduate biomedical study. Test-takers who valued spatial reasoning in BMAT commented on specific curriculum areas where it was perceived as particularly useful.

Discussion and Conclusions 
Previous research supports some stakeholders' perceptions that spatial reasoning is relevant, suggesting that spatial ability predicts success in studying and working in STEM. Research also indicates that spatial and numerical thinking are related. Given that the construct assessed in Section 1 pertains to problem solving in numerical contexts, this suggests that spatial thinking should be an integral sub-construct. Other test providers claim spatial thinking is relevant to developing skills such as reading x-rays, implying cognitive validity. Further research is needed to validate such claims.

Take-home Messages 
• BMAT users have different perceptions regarding the relevance of spatial reasoning for medical and biomedical study. 
• Research suggests spatial reasoning is important to STEM study and professions. 

References 
Ardila, A., & Rosselli, M. (1994). Spatial Acalculia. International Journal of Neuroscience, 78(3–4), 177–184. https://doi.org/10.3109/00207459408986056 
Cheung, K. Y. F., & McElwee, S. (2017). What skills are we assessing? Cognitive validity in BMAT. In Applying the socio-cognitive framework to the Biomedical Admissions Test. Cambridge University Press. 
Dolores de Hevia, M., Vallar, G., & Girelli, L. (2008). Visualizing numbers in the mind’s eye: The role of visuo-spatial processes in numerical abilities. Neuroscience and Biobehavioral Reviews, 32, 1361–1372. 
Kawamichi, H., Kikuchi, Y., Noriuchi, M., Senoo, A., & Ueno, S. (2007). Distinct neural correlates underlying two- and three-dimensional mental rotations using three-dimensional objects. Brain Research, 1144, 117–126. https://doi.org/10.1016/j.brainres.2007.01.082
Lowrie, T., Logan, T., & Ramful, A. (2017). Visuospatial training improves elementary students' mathematics performance. British Journal of Educational Psychology, 87, 170–186.
Newcombe, N. S. (2010). Picture This: Increasing Math and Science Learning by Improving Spatial Thinking. American Educator.
Wai, J., Lubinski, D., & Benbow, C. P. (2009). Spatial ability for STEM domains: Aligning over 50 years of cumulative psychological knowledge solidifies its importance. Journal of Educational Psychology, 101(4), 817–835.

Script concordance test: perceptions of staff and residents towards the SCT as a tool for clinical reasoning in the orthopedic department, Alexandria Faculty of Medicine, Egypt

* Ayat Eltayar, Ibrahim Eldaly, Doha Mohamed, Noha Mahmoud, Hoda Khalifa
* Corresponding author: Ayat Eltayar, Alexandria Faculty of Medicine, Egypt, dr.ayateltayar@yahoo.com

Introduction 
The script concordance test (SCT) remains a relatively innovative written assessment format for assessing higher-order clinical reasoning skills in medicine (Charlin et al. 2000). Recently, medical colleges have implemented the SCT for undergraduate and postgraduate formative and summative assessment of clinical reasoning in numerous medical disciplines (Meterissian et al. 2007; Lubarsky et al. 2009). Unlike classic written assessment methods, which assess the 'knows' level, SCT questions can be used to assess higher thinking skills at the 'knows how' level (Miller 1990). The SCT is therefore considered a unique and reliable tool for assessing essential clinical reasoning and data interpretation skills in authentic clinical scenarios that mirror the ambiguity and grey zones of medical practice (Mehay 2017).

Justification of the study 
Graduate medical education programs at Alexandria Faculty of Medicine (AFM) still use traditional assessment methods, including written assessments to evaluate residents' knowledge and clinical exams to evaluate their clinical competence. These methods, however, do not evaluate residents' systematic thinking or clinical reasoning skills.

Methods 
This was a descriptive correlational study in which a formative assessment tool for clinical reasoning, an SCT composed of 10 items (30 questions) in orthopedics, was developed based on the outcomes of the orthopedic residency training program at AFM. Two expert staff members of the orthopedic department were involved in developing the SCT questions, and the face and content validity of the questions were checked by both orthopedic staff members and members of the medical education department. The SCT was administered to the 25 orthopedic residents at AFM who were 2011-2014 graduates and to a panel of 10 orthopedic surgeons (staff and assisting staff). Since this was the first use of the SCT as an assessment method, test administration was preceded by an orientation session providing instructions on how to answer the test and on the grading system, and trial questions were given to candidates to familiarise them with the new assessment method. The test was anonymous, but respondents were required to state their number of residency years (PG level). It was administered in four face-to-face sessions of 30 minutes each, held at the Al Hadarah orthopedic hospital of AFM to fit residents' work schedules. The experts' answers were used as the reference for scoring the residents' answers. Permission to implement the assessment in the orthopedic department was obtained from the faculty dean and the head of the orthopedic department.
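
The abstract states that the experts' answers served as the scoring reference but does not spell out the scoring rule. As an illustration only, the sketch below implements the aggregate scoring method commonly associated with the SCT (cf. Charlin et al. 2000), in which the modal panel answer earns full credit and other answers earn partial credit in proportion to the number of panellists who chose them; all data and names here are hypothetical, not the study's.

from collections import Counter

def sct_item_credits(panel_answers):
    # Aggregate scoring: the modal panel answer earns full credit (1.0);
    # any other answer earns credit proportional to the number of
    # panellists who chose it; answers no panellist chose earn 0.
    counts = Counter(panel_answers)
    modal_count = max(counts.values())
    return {answer: n / modal_count for answer, n in counts.items()}

def score_resident(resident_answers, panel_key):
    # Sum per-question credits and express the total as a percentage
    # of the maximum attainable score.
    credits = [sct_item_credits(panel).get(answer, 0.0)
               for answer, panel in zip(resident_answers, panel_key)]
    return 100.0 * sum(credits) / len(credits)

# Hypothetical 10-member panel key for three questions, answers on a
# -2..+2 Likert scale, and one resident's responses.
panel_key = [
    [1, 1, 1, 0, 0, 2, 1, 0, 1, -1],
    [-2, -2, -1, -2, -2, -1, -2, -2, 0, -2],
    [0, 0, 1, 0, -1, 0, 0, 1, 0, 0],
]
resident_answers = [1, -1, 1]
print(f"SCT score: {score_resident(resident_answers, panel_key):.1f}%")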

Results 
The number of residency years showed a strong, statistically significant positive correlation with SCT score percentage (p<0.001).
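
The abstract does not state which correlation coefficient was used. Since PG level is ordinal, a Spearman rank correlation is one plausible choice; the sketch below illustrates the computation on hypothetical data (the figures shown are invented for illustration).

from scipy.stats import spearmanr

# Hypothetical data: postgraduate level (residency years) and the
# corresponding SCT score percentages for ten residents.
pg_level = [1, 1, 2, 2, 2, 3, 3, 4, 4, 4]
sct_percent = [48.2, 51.0, 55.4, 53.9, 58.1, 62.5, 60.3, 66.7, 64.9, 69.2]

rho, p_value = spearmanr(pg_level, sct_percent)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")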

Discussion 
The orthopedic department was the first at AFM to use the SCT as an assessment tool in its residency training program. The study results showed that the number of residency years in the orthopedic training program was positively correlated with the SCT score percentage. These results support the assertion of Ruiz et al. (2010) that the SCT, as an assessment tool, discriminates between physicians according to their level of clinical experience (seniority). Similarly, the SCT reflected clinical experience in postgraduate medical groups, especially during residency training (Iravani et al. 2016). Likewise, the SCT was able to differentiate junior from senior general surgery residents (Nouh et al. 2012).

Conclusions 
The SCT could be used as an instructional and assessment method to improve the clinical reasoning abilities of residents at different graduate (PG) levels at AFM. Additionally, faculty training in developing and administering such a tool is recommended before the SCT is implemented as one of the assessment methods used in the orthopedic residency program.

References 
1. Charlin B, Roy L, Brailovsky C, Goulet F, van der Vleuten C. 2000. The Script Concordance test: a tool to assess the reflective clinician. Teach Learn Med. 12:189-95. 
2. Meterissian S, Zabolotny B, Gagnon R, Charlin B. 2007. Is the script concordance test a valid instrument for assessment of intraoperative decision-making skills? Am J Surg. 193:248-51. 
3. Lubarsky S, Chalk C, Kazitani D, Gagnon R, Charlin B. 2009. The Script Concordance Test: a new tool assessing clinical judgement in neurology. Can J Neurol Sci. 36:326-31. 
4. Miller GE. 1990. The assessment of clinical skills/competence/ performance. Acad Med. 65:63-7. 
5. Mehay R. The essential handbook for GP training and education. Assessment and competence. Available from: www.essentialgptrainingbook.com chapter-29 (last accessed October 2017).
6. Ruiz J, Tunuguntla R, Charlin B, Ouslander J, Symes S, Gagnon R, Phancao F, Roos B. 2010. The Script Concordance Test as a Measure of Clinical Reasoning Skills in Geriatric Urinary Incontinence. J Am Geriatr Soc. 58:2178–84. 
7. Iravani K, Amini M, Doostkam A, Dehbozorgian M. 2016. The validity and reliability of script concordance test in otolaryngology residency training. J Adv Med Educ Prof. 4:93-96. 
8. Nouh T, Boutros M, Reid S, Pace D, Walker R, Maclean A, Hameed M, Charlin B, Meterissian S. 2012. The script concordance test as a measure of clinical reasoning: a national validation study. Am J Surg. 203:530–4.


Survey of Teaching Methods, Integration and Assessment of Clinical Reasoning in Undergraduate MD Curricula of Georgian HEIs

* Paata Tsagareishvili, Tamar Talakvadze, Nino Tabagari, Sergo Tabagari
* Corresponding author: Paata Tsagareishvili, David Tvildiani Medical University, Georgia, vice_dean@aieti.edu.ge

Clinical reasoning is an important outcome of healthcare education that should be consistently taught and assessed. The practice of teaching and assessing clinical reasoning in the MD curricula of Georgian HEIs is poorly investigated. This study explored how clinical reasoning is taught and assessed in undergraduate Medical Doctor (MD) programs in Georgia. A descriptive, cross-sectional survey was administered to representatives of 19 Georgian HEIs, which together deliver 32 undergraduate MD programs. The electronic 18-question survey was distributed to the deans of all 32 MD programs accredited by the National Center for Educational Quality Enhancement of Georgia. Descriptive statistical analysis was performed.

A response rate of 50% (n=16) was achieved for MD programs and 63.2% (n=12) for HEIs. All respondents reported that clinical reasoning is incorporated into their MD curricula. Most respondents (88%) reported that clinical reasoning is integrated across various courses; 69% reported incorporation during clinical rotations, and only 12% indicated that other practical settings are also used. The research showed that having a common definition of clinical reasoning does not correlate with the "diversity" of its incorporation into the MD curricula (educational environment, diversity of teaching and assessment of clinical reasoning). Of the modern tools for teaching clinical reasoning, those most used in MD curricula are Case-Based Discussions (93.75%), Real Patient Examples (87.50%), Diagnostic Algorithms (75%) and Problem-Based Learning (75%); the tools most used for assessing clinical reasoning are Direct Observation of Clinical/Procedural Skills (93.75%), Multiple Choice Questions (87.5%), Short Answers (87.5%), Oral Case Presentation (87.5%) and Oral Exam (81.25%).

Respondents also reported tools for teaching and assessing clinical reasoning that they did not know. For teaching clinical reasoning, the least-known tools included Mnemonics (50%), Low-fidelity Simulations (38%), High-fidelity Simulations (38%), Concept Maps (31.25%), Morning Reports (31.25%) and One-Minute Preceptors (31.25%); for assessment, the least-known tools included Comprehensive Integrative Puzzles (37.50%), Concept Maps (37.50%), Chart-Stimulated Recall/Review (37.50%) and Self-Regulated Learning Microanalysis (37.50%). Clinical reasoning is recognized as a significant and obligatory component of MD curricula, although teaching and assessment practices vary between programs, and within programs there are inconsistencies between the tools declared in the curricula and those actually incorporated for teaching and assessing clinical reasoning. These findings emphasize the need for further faculty development in clinical reasoning teaching and assessment tools and their appropriate incorporation into MD curricula.

The declared acknowledgement of clinical reasoning as an important outcome in healthcare education may differ from the actual practice of its teaching and assessment in the MD curricula of Georgian HEIs.

Use of Practicum Script to enhance medical students’ clinical reasoning skills and ability to manage uncertainty

* Amir H. Sam, Eduardo M. Pleguezuelos, Carlos F. Collares, Adrian Freeman, Eduardo Hornos, Cees van der Vleuten
* Corresponding author: Amir Sam, Imperial College London, United Kingdom, a.sam@imperial.ac.uk

Background  
There is a growing demand for better assessment of medical students' clinical reasoning skills and their ability to manage uncertainty in clinical practice. Practicum Script (http://www.practicumscript.education) is an online simulation-based program designed to enhance clinical reasoning skills and to introduce the concept of uncertainty in clinical decision-making.

Summary of work  
This multicentre pilot study, coordinated by the European Board of Medical Assessors (EBMA), will examine the utility of Practicum Script as a clinical reasoning training tool in undergraduate assessment. It is envisaged that twenty medical schools from across Europe, the USA and South America will participate. The assessment material will consist of 20 internal medicine cases reviewed by an international reference panel. For each clinical scenario, final year medical students will need to generate ‘free-text’ hypotheses and provide justifications by identifying pertinent positive and/or negative findings in the case. Students will be asked to report how new information affects their original hypotheses. We will perform psychometric analyses of the students’ responses to the items for each case. 

Summary of results 
The level of agreement between experts will be analysed. Validity based on internal structure will be analysed using a hierarchical polytomous item response theory model. Cognitive diagnostic modelling will be used to explore the possibility of determining profiles of the specific clinical reasoning skills needed to solve each item. Student satisfaction and perceptions will also be evaluated.
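
The abstract does not specify which agreement statistic will be applied. Purely as an illustration, one common measure of agreement among multiple raters assigning categorical (e.g. Likert-type) responses is Fleiss' kappa; the self-contained sketch below computes it on hypothetical panel ratings.

import numpy as np

def fleiss_kappa(ratings):
    # ratings[i, j] = number of raters assigning item i to category j;
    # every row must sum to the same number of raters.
    ratings = np.asarray(ratings, dtype=float)
    n_raters = ratings[0].sum()
    # Observed agreement per item, averaged over items.
    p_item = (np.square(ratings).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_observed = p_item.mean()
    # Agreement expected by chance from the marginal category proportions.
    p_cat = ratings.sum(axis=0) / ratings.sum()
    p_chance = np.square(p_cat).sum()
    return (p_observed - p_chance) / (1.0 - p_chance)

# Hypothetical example: 4 case items, each rated by a 10-member
# reference panel on a 5-point response scale.
ratings = np.array([
    [0, 1, 6, 2, 1],
    [2, 5, 2, 1, 0],
    [0, 0, 1, 7, 2],
    [1, 2, 4, 2, 1],
])
print(f"Fleiss' kappa = {fleiss_kappa(ratings):.2f}")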

Conclusion 
This multicentre study will investigate the utility of Practicum Script in formative assessment of medical students’ clinical reasoning skills and their ability to manage uncertainty in clinical practice. 

Take-home message 
Practicum Script offers an innovative approach to developing medical students’ clinical reasoning skills and aims to introduce the concept of managing uncertainty at the undergraduate level.
