Saturday 9 November - Academic Short Presentations (EBMA)
10:50-12:30, Room: 1.18, 2nd floor
Simulation Theme
Chair: Professor Debbie Jaarsma, Groningen University
Standardized measurement of the efficiency of diagnostic skills: InSimu Diagnostic Competition
Gábor Tóth, *Andrea Herdon, InSimu, Debrecen, Hungary
* Corresponding author: Andrea Herdon, InSimu, Hungary
andrea.herdon@insimu.com
The InSimu Patient is a ground-breaking educational program developed for physicians and medical students to practise diagnostic thinking in the safety of a virtual environment. InSimu brings the interactive learning of clinical reasoning and decision making to a high level while remaining easy to use on a smartphone or tablet. The app evaluates the user's diagnostic performance and compares it to current guidelines for cost- and time-efficient, evidence-based diagnosis. Key features:
• Completely free gameplay: ask for any diagnostic test, at any time, for any patient
• Developed using U.S. structures and report systems (e.g. history taking, RadReport templates)
• Based on international peer-reviewed literature and developed by physicians to ensure professional pertinence
• Realistic time and cost associated with each test (Medicare pricing)
The InSimu Patient has been used by 48,000 medical students, residents and physicians worldwide. A development partner, the University of Debrecen, already uses the InSimu app in the education of medical students. The InSimu Diagnostic Competition is built on the InSimu Patient app, which makes it possible for conference attendees to compare and analyse their clinical knowledge and diagnostic skills in real time, in an entertaining way, by solving the same diagnostic cases within a given timeframe.
At the end of the InSimu Diagnostic Competition workshop, participants will be able to:
• Organise an engaging, interactive workshop of their own for teaching differential diagnostic skills while encouraging students or colleagues to work in teams.
• Gain diagnostic experience with five virtual simulated patients. Participants will have the opportunity to test and compare their own diagnostic skills on five simulated patients in an objective way; their performance will be automatically scored and analysed on factors such as the correctness of the diagnosis, time and cost-effectiveness, missed tests, and the correctness of the ordered tests.
At this workshop, we will give participants hands-on experience of how they can improve their diagnostic skills every day, and of how they can implement a patient simulator platform in undergraduate or postgraduate training to take the pulse of teaching and to measure objective progress in students' diagnostic skill development.
OSCE station with a standardised patient helps for the first time to identify Military Medical Faculty students with communication problems at the Medical University of Lodz
* Malgorzata Skibinska, Tomasz Sikorski, Joanna Narbutt
* Corresponding author: Malgorzata Skibinska, Medical University, Poland
malgorzata.skibinska@umed.lodz.pl
Background
Using standardised patients has been a staple in designing OSCE stations at medical universities all over the world.
Summary of Work
A short OSCE was introduced for the first time to fourth-year Military Medical Faculty students at the Medical University of Lodz (UMED) during the 2018/2019 academic year. One of the stations involved a short consultation with a patient in a general practice setting regarding lifestyle advice or a planned vaccination. The examiner assessed the students, but the patient also had the opportunity to approve or disapprove of how the consultation was conducted (though not its content). Students who did not receive the patient's approval were identified and given the option to meet a senior clinician to discuss their performance.
Summary of results
In total, 198 students took part in the short OSCE. Sixteen students (8%) did not receive the patient's approval. A separate email was sent to those students, requesting that they contact a senior clinician who was also one of the OSCE coordinators and an experienced OSCE examiner (University College London). Each meeting took between 30 and 45 minutes. In the opinion of the students and the clinician, the main issues were identified and a plan of action was drawn up. The main problem appeared to be the new format of the exam and the time pressure associated with short OSCE stations.
Discussion & Conclusion
It is well known that observing medical students performing tasks with patients helps to identify those with potential communication problems. This was the first time such an evaluation took place during an objective structured clinical examination in this group of students. We are currently discussing the most efficient methods of giving students more opportunities to assess their communication skills.
Take-home Message
The first ever OSCE organised for the Military Medical Faculty students at the Medical University of Lodz helped to identify students with potential communication problems.
How to run a successful short OSCE organised for the first time for the Military Medical Faculty students at the Medical University of Lodz? Evaluation of the exam and students’ feedback.
Michal Niedzwiedz, *Malgorzata Skibinska, Joanna Narbutt
* Corresponding author: Malgorzata Skibinska, Medical University, Poland
malgorzata.skibinska@umed.lodz.pl
Background
A short OSCE was introduced to fourth-year Military Medical Faculty students at the Medical University of Lodz (UMED) during the 2018/2019 academic year. A pathway needed to be established to identify and solve emerging problems for both students and examiners.
Summary of Work
Preparation for the exam started with a series of meetings with students and their representatives to introduce the rationale behind the exam. A mock OSCE was organised to familiarise students with the format, and a tutorial was introduced to help them prepare. Examiners and standardised patients were trained in how to perform their roles. After the exam, all students were asked to complete a survey. Apart from answering questions about the examiners and the organisation of the exam, the students provided very detailed feedback on their experience.
Summary of Results
The exam for 198 students was organised over a period of two days in the Medical Simulation Centre in June 2019. All students achieved a positive result. No major interruptions took place, and both examiners and students gave overwhelmingly positive feedback regarding their experience.
Discussion & Conclusion
The main problems associated with introducing the OSCE were the entirely new format of the exam and the lack of practical experience among students and examiners. These were overcome by a series of planned activities for both groups. The survey, to which 86 of 198 students responded, revealed overall support for how the exam was organised (91.9%) and assessed (67.4%). Moreover, 60% of students indicated in their exam feedback that more stations involving standardised patients should be included in the OSCE. Detailed comments revealed students' difficulties in performing consultations with a standardised patient and at simulated-procedure stations.
Take-home Message
Introducing new methods of assessment, especially one as significant as the OSCE, requires full collaboration between the Faculty and the students.
Patients as summative assessors of communication skills - what difference do they make?
Andrew Kelly, *Jo Cockerill, Martin Roberts, Tom Gale
* Corresponding author: Jo Cockerill, Plymouth University Faculty of Medicine and Dentistry, United Kingdom
jo.cockerill@plymouth.ac.uk
Background
The University of Plymouth, Peninsula School of Medicine uses Integrated Structured Clinical Examinations (ISCEs) to assess a range of clinical consultation skills at the end of Years 2 and 4 of the five-year BMBS programme. The assessment is run in two phases with six or eight stations in each phase; students passing Phase 1 are not required to take Phase 2. Summative assessments of performance are completed by clinicians (one per station) who are trained as ISCE assessors. Each station has a real or simulated patient who is asked to complete a formative feedback form, which is made available to the students alongside their summative results. This study examines the difference that using simulated patient scores summatively would make to the final ISCE outcomes for students.
Summary of Work
Data from the Phase 1 ISCEs in 2016-17 for students in Years 2 and 4 were compiled to include all summative (assessor) and formative (simulated patient) data. Where there was a common domain assessed by both the assessor and patient (e.g. application of communication skills: student-patient interaction) the patient grade was converted to a score. The assessor score was then replaced by (i) the patient score, or (ii) the mean of the patient and assessor scores. For each method the grade boundaries were recalculated and applied to the revised student scores, with comparisons made to the original summative scores and grades. Final ISCE grades are also dependent on students achieving a specified number of passing global grades; for the modelling only the assessor global grades were used.
Summary of Results
The modelling showed strong positive correlations, for the common domains, between the original scores and those derived by replacing the assessor scores with the patient scores (Pearson correlation coefficients 0.946 and 0.974 for Years 2 and 4 respectively), and with the mean of the assessor and patient scores (0.987 and 0.994). Using the patient scores, 92.3% of students received the same grade, 5.3% an improved grade and 2.4% a poorer grade; using the mean of the patient and assessor scores, 95.9% received the same grade, 3.5% an improved grade and 0.6% a poorer grade. With either method no additional students would have failed the assessment; the poorer grades were Satisfactory instead of Excellent.
Discussion & Conclusion
Acknowledging the importance and value of patient feedback, this modelling allowed us to determine that there would be no significant change to student outcomes. Internally, this was well received and supported a move towards using patient feedback summatively, with steps to be taken to ensure the process is valid and fair for students, and acceptable to patient assessors.
Take-home Message
Patient feedback is a valuable resource. Whilst some resistance to incorporating it into summative assessment exists, doing so can reinforce the importance of patient-centred care to our students and help fulfil the General Medical Council (UK) requirement for medical schools to support innovative approaches to patient involvement in teaching, feedback and assessment.
Challenges in effective assessment of Interprofessional Education
*Ain Satar, Jonathan Fisher, Mine Orlu, Catherine Tuleu, Paul Winyard, Caroline Fertleman
* Corresponding author: Ain Satar, Institute of Child Health, United Kingdom
n.satar@ucl.ac.uk
Background
Interprofessional education (IPE) encourages collaboration between healthcare students, aiming to nurture effective members of the healthcare system. Although approaches to IPE have expanded, assessing the knowledge and skills required for successful collaboration requires continued development. A multidisciplinary faculty designed and conducted an interactive, flipped-classroom paediatric oncology session for University College London (UCL) pharmacy and medical students.
Summary of Work
Both paediatricians and pharmacists ran the session, using various teaching modalities: short didactic presentations, an interactive game and case discussions in small interprofessional groups. Role modelling of interaction by a multidisciplinary faculty helps students appreciate its importance in a healthcare setting. Participants completed anonymised, matched pre- and post-session questionnaires. The pre-session questionnaire explored preconceived ideas of each profession; the post-session questionnaire explored the session's impact on learners' impressions of each profession, communication and appreciation of professional collaboration. Additional feedback was collected from medical students five months after the session.
Results
Forty (93%) matched responses were received from 43 participants: 21 (53%) pharmacy and 19 (47%) medical students. Of the 21 students (53% of the total) previously involved in IPE sessions, 18 (86%) were pharmacy students. Impressions of each profession were positive, although some acknowledged potential for conflict due to differing perspectives. Thirty-six (90%) students said the session enhanced their communication skills. They reported increased confidence, empowerment to collaborate and encouragement to seek advice from other professionals. Students appreciated a broader understanding of each profession's roles and perspectives, the experience of working as part of a multidisciplinary team and the opportunity to share skills and knowledge. They valued the importance of a holistic approach to patient care and the need for interprofessional collaboration: 'We have a common goal in prioritising patient care'; 'Interprofessional collaboration is essential to create a good relationship within the healthcare team'. However, medical students' feedback five months later questioned the session's worth, as it did not form part of their assessment (55% rated it 2 or 3 out of 5).
Discussion
The high response rate and positive comments reflect the level of student participation and enthusiasm during the session. IPE exposes healthcare students to a real working environment and allows them to appreciate the joint clinical decision-making and collaboration necessary to achieve holistic care. Although this appeared to be achieved, measuring the long-term impact of cross-disciplinary sessions on wider team interaction and, ultimately, on patient care is challenging. Some students could not appreciate the session's relevance, as it did not form part of their assessment; this highlights the importance of assessment in helping students internalise lessons. Although competency frameworks have been suggested for IPE, assessment tools vary and best practices have not yet been identified. Both summative and formative assessments designed to evaluate behavioural change, delivery of care, patient satisfaction and outcomes have been recommended. Further development of comprehensive assessment tools incorporating the competencies identified for multidisciplinary education is critical to gaining the full benefit of this educational model. Nevertheless, this study has informed future study designs and module assessment.
Take home messages
IPE sessions enrich health professional learning. Robust, standardised and longitudinal assessment methods, using tools that measure competencies, are needed.
A multi-disciplinary approach to simulation in undergraduate medical education
*Priyanka Patel, Joely Leeder
* Corresponding author: Dr Priyanka Patel, Frimley Park NHS Foundation Trust, United Kingdom
riyanka_08@hotmail.co.uk
Background
Traditionally, simulation in undergraduate medical education has been delivered with minimal consideration of the multi-disciplinary approach to patient care, with students from different healthcare backgrounds participating in separate simulation sessions. Multi-disciplinary simulation at the postgraduate level, however, is well established, with an abundance of supporting research. In recent years, the multidisciplinary team in the UK has grown further with the incorporation of physician associates.
Summary of work
Medical, nursing and physician associate students participated in acute medical or surgical high-fidelity simulation related to the clinical areas where they were on placement. Each student received a candidate briefing relevant to their role and was expected to work within the remit of their professional role. Students from each healthcare discipline worked together to assess, investigate and treat an acutely unwell patient. Three students, one from each discipline, participated in each simulated scenario together, starting at different times in the initial assessment of the patient.
Summary of results
Four medical students, two physician associate students and two nursing students attended the pilot session. Students provided qualitative feedback on their learning outcomes and on experiences that would change their practice, and gave additional free-text feedback.
They were also asked to rate the following categories:
• the usefulness of the simulation training
• whether their understanding of the topic had increased
• whether their confidence to deal with the acute problem had increased
• if the materials covered were relevant
• if the topics covered would change and enhance practice
Ratings used a five-point scale: 5 (strongly agree), 4 (agree), 3 (neither agree nor disagree), 2 (disagree), 1 (strongly disagree).
All students strongly agreed that the session was useful, that the materials covered were relevant to them, and that the topics covered would change and enhance their practice. All students either strongly agreed or agreed that their confidence in dealing with the acute problem and their understanding of the topic had increased. Furthermore, students commented that they had a greater understanding of each other's roles. Multidisciplinary simulation also allowed them to develop their communication and teamwork skills.
Discussion
Multidisciplinary simulation has been shown to enhance patient safety through improved communication, a more unified team-based approach to care and an appreciation of the competencies of different healthcare professionals. Despite recognition of these benefits in postgraduate medicine, which could be transferable to healthcare students, multidisciplinary simulation is yet to be fully implemented in training at the undergraduate level.