Friday 8 November - Poster Presentations
10:50-12:30, Room: Foyer Area, 1st floor

Theme: Assessment

Chair: Dr Johanna Louhimo, University of Helsinki

Smart mobile application-based portfolio: Is it feasible and convenient for assessment?

* Duangmontree Rojdamrongratana, Navapol Karnjanaranya
* Corresponding author: Duangmontree Rojdamrongratana, Thammasat University Hospital, Thailand
douang007@hotmail.com 

Background 
Portfolios are an important and widely used tool for evaluating medical students’ progression in clinical competence and professionalism. A smart mobile application-based portfolio was developed on the premise that technology now plays a crucial part in today’s learning styles and aids both students and instructors in the processes of learning, revision, assessment and reflection. The beta-test application is available on both iOS and Android phones for use by medical students rotating through ophthalmology departments. 

Methods 
Seven attending ophthalmologists and 110 medical students rotating in the ophthalmology departments of Thammasat University, Chulalongkorn University and Chumpon Hospital were enrolled in this study. All participants used the application from the first to the final day of their ophthalmology clinical rotation. On the final day, all participants completed an evaluation form giving their feedback on the application. 

Results 
Most of the students and attending ophthalmologists completely agreed that: the essential skills the students need to learn are listed in the application (54%); the medical skills listed were actually trained (59%); the attending doctor’s advice is helpful for personal development (62%); the application is easy to use (60%); attending doctors and students can record what they have learned (47%); and the application touchscreen is easy to use (58%). Most somewhat agreed that: the application is flexible for the students (40%); the students and attending doctors will use the application regularly (43%); and the medical skills guide and recommendations are useful in practice (42%). 

Conclusion 
The smart mobile application-based portfolio is feasible and convenient for assessment. However, the application still needs further development to meet users’ expectations.

Creating a system for Quality and Progress - The Purposeful Need for Technology?

* Christina Gummesson, Jakob Donnér, Luke Dawson, Johan Agardh, Peter Åsman
* Corresponding author: Christina Gummesson, Lund University, Sweden
Christina.gummesson@med.lu.se 

Background 
The use of learning management/assessment systems has become common in higher education during the past 20 years. However, such systems are frequently not grounded in approaches that support longitudinal educational needs, such as promoting professional development. One reason may be that digital support is designed to assist with isolated tasks, e.g. running MCQ tests, and therefore cannot support integrated educational needs such as longitudinal learner development. For assessment and feedback to be utilized both synergistically and longitudinally, efficient holistic support systems are required. Our aim was to use self-determination theory to define and implement a shared, purposeful digitalization plan to support longitudinal learner development in the medical and health science educational programs at our university. 

Summary of work 
In an action-driven process during 2016-2017, teachers from thirteen educational programs and student representatives were invited to participate in workshops, discussions and surveys. Concepts of competence (developing mastery as a student and teacher), relatedness (developing a sense of connectedness to other students and faculty members), and autonomy (developing a sense of control over your own learning behavior based on systematic feedback systems, both as a student and as a teacher) underpinned our design philosophy and subsequent activities. A procurement process then followed during 2018 to identify a digital supplier able to meet our needs. 

Summary of results 
There was agreement on the need for a new digital system that would deliver the following key aspects from our needs-assessment: 
• the ability to visualize and analyse complex information for both quality and progress purposes;  
• the ability to augment learning activities and assessment results with a system in which common entities are searchable, traceable, and triangulated to add longitudinal meaning;  
• the ability to deliver personalized learning by enabling individuals to see the outcomes of various triangulated assessment sources, creating opportunities for longitudinal follow-up, feedback, and feed-forward;  
• the ability to share entities across multiple health educational programs for the benefit of both students and teachers. 

Discussion & Conclusion 
Several frameworks in contemporary medical education, such as student-centred personalized learning, programmatic assessment, entrustable professional activities, interprofessional education, and the development of autonomy, relatedness and competence, all call for purposeful infrastructural support if they are to be fully utilized. A challenge is the lack of digital systems meeting those needs. We believe that teams of educational developers, together with staff and students, have an important role to play in future digital learning space development. Links to medical education theories as well as to health care needs should be considered. If systems are designed for collaboration and collective development, the transition to collaboration in health care and life-long learning may be better facilitated. 

Take-home message 
There is a need for development of digital systems based on educational research and contemporary health care strategies to augment future education. This need can only be fully realized by cooperative working between educators and developers.

Standard setting the PLAB 1 exam: using historical or diet-specific Angoff scores

* Zoe Makin, Richard Hankins, Julian Hancock
* Corresponding author: Zoe Makin, General Medical Council, United Kingdom
zoe.makin@gmc-uk.org 

Background  
The PLAB test is the main route by which International Medical Graduates (IMGs) with acceptable primary medical qualifications demonstrate they have the necessary skills and knowledge to practise medicine in the UK. The test is set at the level of a doctor successfully completing Foundation Year 1 and is taken in two parts. PLAB 1 has an established method of setting the passmark using ‘banked’ Angoff scores. These scores are set by the Part 1 panel upon the first use of an item using the modified Angoff method, and are then applied to calculate the passmark on all subsequent uses of that item.
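In the modified Angoff method described above, each judge estimates the probability that a just-passing candidate answers an item correctly; the banked item score is the panel mean, and a paper's passmark is the sum of its item scores. A minimal sketch of that arithmetic, with entirely hypothetical panel estimates (not PLAB data):

```python
# Sketch of banked-Angoff passmark arithmetic (hypothetical data).

def angoff_item_score(panel_estimates):
    """Banked score for one item: mean of the judges' probability estimates
    that a just-passing candidate answers the item correctly."""
    return sum(panel_estimates) / len(panel_estimates)

def paper_passmark(bank):
    """Passmark for a paper: sum of its banked item scores, in whole marks."""
    return round(sum(angoff_item_score(item) for item in bank))

# Hypothetical three-item paper, four judges per item.
bank = [
    [0.60, 0.70, 0.65, 0.55],
    [0.80, 0.85, 0.75, 0.80],
    [0.40, 0.50, 0.45, 0.45],
]
print(paper_passmark(bank))
```

On a re-standard-set diet, the per-item estimates would simply be replaced by the panel's new in-context judgements and the same sum recomputed, which is the comparison the study describes.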

Summary of Work  
Concerns were raised that such a method does not sufficiently consider factors that may vary the difficulty of an item in use, for example prompts contained within other questions or changes in the external environment. To test the effect of moving to a model where the standard is set by diet, the Panel agreed to completely re-standard-set three papers over a period of nine months. In doing so, the panel was able to evaluate the difficulty of the items in context, and we could then compare the passmarks produced by the banked and the diet-specific methodologies. 

Summary of Results  
We compared the effects of standard setting each diet in context versus utilising banked Angoff scores. At whole-paper level, the move to standard setting a whole diet in context made no difference to the final cut score on two out of three occasions, and a difference of just over 1% of the total pass score on one occasion. When only those items for which new Angoff scores were set are considered, setting a complete diet also made minimal difference: on one occasion it made no difference, on another it increased the Angoff score by 2, and on the final occasion it decreased the score by 1, out of 180 marks. 

Discussion & Conclusion  
The differences in outcome between the two methodologies were minimal and well within a reasonable estimate of error. Setting each paper as a whole rather than utilising banked scores thus proved expensive and administratively burdensome whilst adding little in terms of outcome. The Panel decided to revert to the historical method of setting the standard using banked scores.

Take-home Message
This exercise has not demonstrated that setting the standard of each paper independently makes a significant difference to the cut score versus utilising banked data.

Workplace Based Assessment - Translation, adaptation and implementation of the Mini-Clinical Evaluation Exercise (Mini-CEX) and the Direct Observation Procedural Skills (DOPS) scales

* Rita Sousa, André Santa Cruz, António Oliveira e Silva, José Miguel Pêgo, João Cerqueira, Nuno Sousa, Vítor Hugo Pereira
* Corresponding author: Rita Sousa, School of Medicine, University of Minho, Portugal
rita.msousa5@gmail.com 

Background 
Workplace-based assessment (WBA) refers to the assessment of clinical skills, from information gathering and physical examination to performing procedures, in a professional rather than simulated environment. To conduct WBA, scales have been developed to ensure correct and fair assessment of students. Among the best-known scales are the Mini-Clinical Evaluation Exercise (Mini-CEX) and the Direct Observation of Procedural Skills (DOPS). In the light of a curricular reform at the School of Medicine of the University of Minho, Portugal, we performed the validation of these tools in our context. 

Summary of Work 
The translation and validation of the English versions of the Mini-CEX and DOPS was conducted by bilingual individuals, namely four physicians specialized in medical education and a medical student from the School of Medicine - University of Minho. After the translation process, faculty development was initiated in the Internal Medicine Department of the Hospital of Braga, comprising a theoretical session followed by real evaluations with students from the 3rd, 4th, 5th and 6th years. The students answered a questionnaire of four 5-point Likert-scale questions regarding their satisfaction with the scales. The faculty also answered a 9-point Likert-scale question regarding their overall satisfaction with the scales. 

Summary of Results 
From a sample of 31 Mini-CEX evaluations and 16 DOPS evaluations, the average questionnaire score was 4.77 for the Mini-CEX and 4.92 for the DOPS. The evaluators’ overall satisfaction was 7.88 on the 9-point Likert scale for the Mini-CEX and 8.00 for the DOPS. 

Discussion and Conclusions 
In summary, this work implements two WBA scales, translated, adapted and validated for Portuguese, together with a faculty development program. Given the high satisfaction marks for both scales amongst students and evaluators, supporting their feasibility and applicability, we will continue the validation process by increasing the sample size as well as providing faculty development opportunities. 

Take-home message 
The Mini-CEX and the DOPS are excellent assessment scales that increase satisfaction with the workplace-based assessment process for both students and faculty.

Differential Attainment - Early Cohort Study from a New Medical School

* Emanuele Fino, Helen Cameron
* Corresponding author: Emanuele Fino, Aston University, Aston Medical School, United Kingdom
e.fino@aston.ac.uk 

Background 
The UK General Medical Council (GMC) requires undergraduate medical education to be fair for every student; educators must attempt to eliminate all types of discrimination and inequality of opportunity, and support all students' development and learning. Variations in attainment exist at the undergraduate level, defined by the GMC as ‘systematic differences in outcomes when grouping cohorts by protected characteristics and socio-economic background’. In particular, they can be found in groups identified by characteristics and factors such as age, gender, socio-economic background, nationality, and previous academic experience. Monitoring and tackling such differences is key to assuring fairness and supporting students in their journey to learning and licensing. We do not yet know the extent of the issue for those medical schools committed to widening access and welcoming students from diverse backgrounds. 

Summary of Work 
In this poster we will present the results of our systematic analysis of differential attainment among students in Year 1 of a newly established UK medical school. One of the peculiarities of this medical school lies in the characteristics of its cohort. In the first year of the programme, approximately one third of students are from a less financially advantaged background, many having attended the school’s Widening Participation programme aimed at supporting such individuals and their families; two thirds of the first intake are international students of many nationalities. The work will provide the reader with a comprehensive view of the use of formative assessments and post-exam psychometric analysis to shed light on differences across groups of students identified by specific characteristics, and to inform the design, delivery, and quality assurance of curricula and assessments. 

Summary of Results 
The poster will present the results of differential attainment analyses within a UK undergraduate cohort sitting two formative Applied Knowledge Tests, two summative Applied Knowledge Tests, and one formative Objective Structured Clinical Examination in Year 1 of a 5-year medical programme. Attainment amongst groups of students identified by gender, nationality, and previous academic experience will be analysed and discussed in detail, aiming to identify and discuss (1) evidence of any differential attainment; (2) contributing factors; and (3) strategies and methods for intervention and change.  

Discussion & Conclusion 
Timely monitoring and analysis of differential attainment in the early stages of undergraduate medical programmes represent key processes in the effective establishment and management of curricula and their assessments, supporting medical educators in identifying possible contributing factors, designing and implementing effective intervention to tackle any possible differences in attainment, and ultimately assuring fairness and equality of learning opportunities.

The Effectiveness of Applying a Formative Assessment Objective Structured Clinical Examination (OSCE) for Undergraduate Students in a Faculty of Medicine

* Corresponding author: Rajaa Allhiani, King AbdulAziz University & University of Jeddah, Kingdom of Saudi Arabia
rallhiani@gmail.com

Background 
This study monitored the effectiveness of the formative OSCE over five years, from 2014 until 2018, and monitoring continues in 2019. Its purpose is to determine the effectiveness of applying a formative assessment Objective Structured Clinical Examination (OSCE) for undergraduate 5th-year medical students in the clinical skills module, and to measure the effectiveness of Self-Directed Learning (SDL) sessions. 

Methods 
A checklist survey of 10 questions was distributed to all undergraduate medical students who had participated in the formative OSCE, in order to measure their competency and proficiency. The formative OSCE is conducted before the summative OSCE. This method was applied in 2014 with residents as the examiners. In 2018 the method was amended to take on the further challenge of inviting house officers (interns) to be examiners, since most of the standardized patients (SPs) participating in the formative OSCE were interns who had finished their rotations in all the clinical departments and were placed in the inactive stations. 

Results 
The analyses of all the surveys were done through cross-tabulation using NCSS (similar to SPSS version 20), and the data were summarized in the following table (Fig. 1);
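A cross-tabulation of this kind simply counts each response category within each cohort. As an illustration only (the records below are hypothetical, not the study's data, and NCSS itself is a dedicated statistics package), the idea can be sketched in a few lines of Python:

```python
from collections import Counter

# Hypothetical (cohort, response) survey records.
records = [
    ("2014", "agree"), ("2014", "disagree"), ("2014", "agree"),
    ("2018", "agree"), ("2018", "agree"), ("2018", "disagree"),
]

# Count every (cohort, response) pair, then lay the counts out as a table.
table = Counter(records)
cohorts = sorted({c for c, _ in records})
responses = sorted({r for _, r in records})

print("cohort  " + "  ".join(f"{r:>8}" for r in responses))
for c in cohorts:
    print(f"{c:<8}" + "  ".join(f"{table[(c, r)]:>8}" for r in responses))
```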

In 2014, 208 students (95 males and 113 females) participated in the formative assessment. They reported that the allotted time of 5 minutes was not enough and should be more than 10 minutes. About 8% of questionnaires were incomplete, with nothing to report; 43% reported that the exercise highlighted areas of weakness in their clinical skills, and 48% felt safe to make errors without accountability and with confidentiality. However, students reported difficulty with the residents, who were inexperienced in providing feedback since they were not involved in the students’ teaching plan.

In 2015 and 2016, about 50% of the students did not participate in the formative assessment OSCE, since it was not compulsory. In 2018, we took on the further challenge of inviting house officers (interns) to be examiners. All incomplete surveys were discarded and omitted from the analysis.

Conclusion 
Overall, 5th-year medical students' feedback was in favour of having the formative OSCE more often in the clinical skills module. The formative assessment is a powerful tool that assisted them in learning, helped them overcome their weaknesses, gave them the opportunity to improve, and prepared them for the summative OSCE; as a result, repetition of this module was reduced. Students also suggested that residents should be involved in teaching the curriculum so as to be familiar with the students’ syllabus. Faculty held a diversity of opinions about the formative OSCE, reporting that it is time-consuming, budget-consuming and lacking incentives; on the other hand, students expressed high satisfaction with this type of examination, and their outcomes after participating were remarkable. 

Development of a short-form educational environment test for medical students

* Corresponding author: Anupong Kantiwong, Pharmongkutklao College of Medicine, Thailand
k22k_art@hotmail.com 

Background 
The educational environment affects medical students through both intra- and extra-curricular activities during undergraduate study. The DREEM (Dundee Ready Educational Environment Measure) questionnaire is a general tool for assessing the educational environment, but it contains many questions, which makes it difficult for students with less experience. 

Objective 
To develop and validate a short-form educational environment test for medical students by modifying the DREEM questionnaire on the basis of literature review.

Summary of Work 
216 students were sampled. The developed test comprised 18 questions divided into five factors: perception of learning, perception of course organization, academic self-perception, perception of atmosphere, and social self-perception. Content validity was determined by three experts in order to select the criterion-tested questionnaire and examine the structural validity of the educational environment measurement model. 

Summary of Results 
All items had IOC values between 0.67 and 1.00, and discriminative power was in accordance with the criteria. Cronbach’s alpha coefficient was 0.89. Confirmatory factor analysis found that the five-factor measurement model fitted the empirical data (Chi-square = 267.51, df = 125, P = 0.001, RMSEA = 0.073). Item analysis using a graded-response model found high marginal reliability for response-pattern scores. 
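Cronbach's alpha, the internal-consistency statistic reported above, compares the sum of the item variances with the variance of respondents' total scores: alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A minimal sketch with hypothetical 5-point item scores (not the study's data):

```python
# Cronbach's alpha from scratch (hypothetical data).

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))

# Hypothetical 3-item questionnaire answered by four respondents.
items = [
    [4, 3, 5, 2],
    [4, 2, 5, 3],
    [3, 3, 4, 2],
]
print(round(cronbach_alpha(items), 2))
```

Values near 0.9, as here and in the study, indicate that the items covary strongly with the total score, i.e. high internal consistency.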

Discussion & Conclusion 
The developed short-form educational environment test is of high quality and can be used for accurate and appropriate measurement.

A qualitative and psychometric analysis of the Portuguese National Selection Exams for access to residency programs

* Rui Jorge Silva, José Miguel Pêgo
* Corresponding author: Rui Jorge Silva, School of Medicine - University of Minho, Portugal  
ruijorgessilva@gmail.com

Background 
In Portugal, up to 2018, access to medical residency programs for clinical specialization was through a public tender whose selection method was a knowledge test designated the “National Selection Exam”. Medical students, graduates, clinicians and professors criticized the exam, citing its focus on the recall of medical factoids and the presence of item flaws. In 2019, a new National Access Exam is being implemented, which focuses mostly on the application of knowledge and on testing reasoning and clinical thinking about presented clinical problems. 

Methods 
In this study, we performed a qualitative and psychometric analysis of 10 National Selection Exams (2009-18; 1000 items), to evaluate and identify the most common item flaws, according to NBME® Gold Book and the studies by Rush (2016) and Jozefowicz (2002), and understand the basis of the criticism of the former exam. 

Results 
Our data show that almost all items were based on the recall of medical factoids and that a considerable number had at least one flaw. The three most prevalent item flaws were the use of negative stems, unfocused questions, and the presence of vague or generalizing terms. Conversely, use of “all of the above”, use of “none of the above” and use of complex or K-type questions were among the least frequent item flaws. The psychometric analysis is still in progress, awaiting final results. 

Conclusion 
A very significant number of items on the former exam had flaws, which presents a problem for the process of selecting graduates. This study creates a basis of comparison for the new exam, allowing future studies in this area and the direct comparison of the exams with the methods considered. 

Take-Home Messages 
The former Portuguese National Selection Exam was characterized by items focused on the recall of medical factoids, and a considerable number of items had at least one flaw. This study provides a basis for future comparison of the two exam models. 

References
1) Diário da República n.º 40/2018, Série I de 2018-02-26. Decreto-Lei n.º 13/2018 [cited 3 July 2019]; available at https://dre.pt/web/guest/pesquisa/-/search/114766032/details/normal?l=1;  
2) Diário da República n.º 167/2018, 2.º Suplemento, Série II de 2018-08-30. Aviso n.º 12497-B/2018 [cited 3 July 2019]; available at https://dre.pt/home/-/dre/116272154/details/maximized;  
3) Administração Central do Sistema de Saúde. Prova Nacional de Acesso (PNA) [cited 3 July 2019]; available at www.acss.min-saude.pt/2018/09/05/prova-nacional-de-acesso-pna;  
4) Bonnie R. Rush, David C. Rankin and Brad J. White. (2016). The impact of item-writing flaws and item complexity on examination item difficulty and discrimination value. BMC Medical Education, 16:250;  
5) Miguel A. Paniagua and Kimberly A. Swygert. (2016). Constructing Written Test Questions for the Basic and Clinical Sciences. 4th Edition, National Board of Medical Examiners, Philadelphia;  
6) Jozefowicz RF, Koeppen BM, Case S, Galbraith R, Swanson D, Glew RH. (2002). The quality of in-house medical school examinations. Acad Med., 77(2):156-61.

The clinical pathway as an alternative method in the education of endodontics

* Karolina Osica, Marek Szelągowski, Aleksandra Palatyńska-Ulatowska, Barbara Łapińska, Monika Łukomska-Szymańska
* Corresponding author: Karolina Osica, Department of General Dentistry, Medical University of Lodz, Poland
janeczekarolina@gmail.com  

Background 
The search for rational methods of dental education is essential for the effective treatment of patients. Using multiple teaching and assessment methods should ensure a meaningful learning experience and help with accurately tracking trainees’ progress. The proposal to introduce a teaching and assessment process based on appropriate decision diagrams (clinical pathways, CPs) can serve as a complement to the accepted methods used in the assessment of clinical competency: the Objective Structured Clinical Examination (OSCE), Short Answer Questions, the Mini Clinical Evaluation Exercise (mini-CEX), Directly Observed Procedural Skills (DOPS) and Clinical Work Sampling (CWS). The clinical pathway is a system of actions to support medical professionals, including medical students, at the stages of diagnosis, treatment planning and patient care. CPs are written in a specific notation, and their creation is based on a logical relation of mutually interacting activities, performed in a specific order and based on current medical knowledge. The concept of introducing clinical pathways into the education of dentistry students consists in optimizing the processes of therapeutic problem solving, developing a predictable clinical course, and achieving satisfying educational outcomes. Implementing CPs in dental education would support the integration of learning and improve students’ recognition of evidence-based dentistry. 

Summary of Work 
Modelling of the dynamic clinical pathway of endodontic treatment was performed in ADONIS using BPMN 2.0 notation. In order to obtain the final version of the clinical pathway of primary endodontic treatment, eight draft versions were created and progressively improved to comply with the latest medical knowledge in the field of endodontics and the principles of CP modelling. 

Summary of Results 
The final version of the CP of primary endodontic treatment was created. It comprises one path start, 2 path ends, 27 activities, 7 decision gates, 45 connectors, 8 control points, 11 own links and 9 notes (3 types of icons). The CP will be implemented in teaching preclinical and clinical endodontics to 3rd- to 5th-year dentistry students of the Medical University of Lodz. 

Discussion & Conclusion 
The development of a CP presenting the treatment pattern of a given disease entity should be carried out by a team of professionals. It demands reaching a consensus about the course and stages of treatment, taking into account all the variables. The implemented CP may be a useful tool in under- and postgraduate education and assessment. CPs may serve as guidelines for the treatment of simple cases, and as indications in more complex cases demanding non-standard procedures. They might also be used for patient education and motivation, before and during treatment. 

Take-home Message 
The clinical pathway of primary endodontic treatment is an excellent compendium of knowledge useful for clinical decision making. It has the potential to become an element of dentistry education, along with case-study teaching, and could supplement programmatic assessment.

© Copyright 2019 The European Board of Medical Assessors (EBMA)