Faisal Waseem Ismail ( Department of Medicine, Aga Khan University, Karachi, Pakistan. )
Rashida Ahmed ( Department of Pathology and Laboratory Medicine, Aga Khan University, Karachi, Pakistan. )
Sadaf Khan ( Department of Surgery, Aga Khan University, Karachi, Pakistan. )
Sara Shakil ( Department of Educational Development and Medicine, Aga Khan University, Karachi, Pakistan. )
March 2023, Volume 73, Issue 3
Research Article
Abstract
Objective: To ensure the competence of final year medical students in essential clinical skills by identifying those skills and having the students revisit and practise them before the clinical examination.
Method: The cross-sectional study was conducted at the Aga Khan University, Karachi, from February to November, 2019, and comprised final year medical students and internal examiners from various academic disciplines. An overview of the organisational context, exam structure and process was noted.
Results: There were 96 medical students. The four key areas highlighted were the development of the list of essential skills across the five years of the undergraduate medical curriculum with consensus from all disciplines, student motivation to attend the practice sessions, unfamiliarity of examiners with the assessment tool, and the need for capacity-building. The key areas were based on the feedback received from all the stakeholders and post-hoc analysis.
Conclusion: This form of assessment would enable a thorough analysis of the preparedness of the students to function as independent, undifferentiated doctors at the start of their careers as interns, and would improve the quality of subsequent examinations based on the feedback and suggestions of faculty and students.
Key Words: Clinical skills, Competency-based medical education, Undergraduate medical education, Final year medical students.
(JPMA 73: 520; 2023) DOI: 10.47391/JPMA.6336
Submission completion date: 25-02-2022 — Acceptance date: 24-09-2022
Introduction
Clinical competence is defined as the attribute of a person and their capability to make correct decisions while keeping patient care and confidentiality as the prime aim1. With the evolution of teaching, learning and assessment practices in medical education, nationally and worldwide, there is a need to understand the transition from a discipline-based curriculum to more robust and valid competency-based medical education (CBME). The development and assessment of clinical competence in undergraduate medical students on a continuum is therefore a fundamental task to make them efficient physicians as they leave medical school2. This is not an easy journey, as the nature of CBME requires the demonstration of multiple essential core competencies in the busy clinical workplace environment, which starts immediately at internship level and continues throughout postgraduate medical education programmes3. The importance of this transition has been explained with the introduction of a new type of assessment scale to measure the gradual progression of clinical performance in Internal Medicine residents4.
The undergraduate competencies at Aga Khan University, Medical College (AKU-MC), Karachi, include critical thinking/problem-solving skills, clinical/technical skills, teamwork and collaborative learning skills, communication skills, professionalism, social accountability, advocacy, and scholarship, including research and ethical leadership. At the AKU-MC, different formats are used throughout the programme to assess the various components of these competencies: for knowledge, written examinations, including multiple choice questions (MCQs), extended matching questions (EMQs) and integrated short answer questions (ISAQs); and for technical skills, performance-based exams, including the mini-clinical evaluation exercise (Mini-CEX), objective structured clinical examinations (OSCE) and bedside clinical examination (BCE).
Undergraduate medical students are often found to be deficient in essential clinical skills, possibly due to the underutilisation of various bedside clinical teaching strategies that increase student motivation and provide a platform for the demonstration and practice of core skills, like history-taking and physical examination5. Although clinical skills are learnt and assessed in all the years of medical education, it is recognised that they may decay over time, and students may still require time and opportunity to master these skills before they graduate and enter residency programmes6. This can impede their professional progress as they move into their postgraduate years of clinical practice. Intensive training in practical clinical skills during the initial years of undergraduate medical education is required to prepare students for their future role as physicians. A study pointed out common issues, like the inability to self-reflect, misconceptions and gaps in medical knowledge, and poorly developed clinical and professional skills, especially during the fourth year of medical school. In light of these issues, there have been calls for curricular modifications in the final year of medical school to help students exit medical school smoothly7. A recent study emphasised the need to integrate simulation opportunities into the existing medical curriculum; simulation allows avenues of deliberate practice in a conducive learning environment where students use simulation models and high- to low-fidelity simulators to learn new clinical presentations without the stress of possible mistakes8. The literature also indicates that students' retention of core essential competencies is increased by introducing simulation-based courses in the final undergraduate year, with both asynchronous (self-directed) and synchronous (facilitator-led) learning9.
Hence, it is important that skills, like history-taking, physical examination and procedures learnt throughout undergraduate education, are not only revisited but also practised and perfected in the final year, as these will form the bedrock of the students' future careers.
Based on faculty concerns and observation of final year students, the undergraduate curriculum committee and educational experts in the Bachelor of Medicine, Bachelor of Surgery (MBBS) programme at AKU-MC identified the need for an exit-level assessment of technical skills across all medical disciplines, one that would enable a thorough assessment of the preparedness of the students to function as independent, undifferentiated doctors at the start of their careers as interns.
The AKU Hospital (AKUH) is a 750-bed teaching hospital with an undergraduate medical programme and various postgraduate residency and fellowship programmes. The undergraduate medical programme is a five-year programme leading to the degree of MBBS. The first two years of the course provide a systems/process-based introduction to the foundations of medicine, while the next three years are clerkship-based and use an experiential clinical problem-solving approach as the major learning strategy. There is progressively increasing clinical exposure in outpatient, inpatient and procedural settings in all the major disciplines.
The current study was planned to ensure competence of essential skills (ECOES) of final year medical students by identifying the essential skills and having the students revisit and practise them before the clinical examination.
Subjects and Methods
The cross-sectional study was conducted at AKU-MC, Karachi, from February to November 2019. Essential skills for the graduating batch were identified, and the ECOES exam, verbal feedback from all the stakeholders, and the overall exam process were documented. The coordinators for undergraduate medical education in the specialties of Internal Medicine, Surgery, Psychiatry, Paediatrics, Obstetrics and Gynaecology, Otolaryngology, Ophthalmology, Family Medicine, Emergency Medicine, and the longitudinal themes, including communication skills and bioethics, were invited to identify a list of essential skills in their respective disciplines required of a fresh graduate.
Hands-on practice sessions for the assessment were planned three days before the actual examination, using simulation models and simulators for most of the clinical skills. The list of all the essential skills was shared with the students, and they were encouraged to sign up for as many of the sessions as they wanted to practise. The learning resources for the practice of these essential skills, including relevant text readings, guidelines and videos, were identified by the discipline coordinators and circulated to all the students and facilitators well in advance for standardisation. The sessions were conducted for all the essential skills by a faculty member of instructor level or above. The skill was first demonstrated by the trainer, after which the students were given opportunities to practise the skill under observation until they felt they were competent.
All students of final year MBBS were included, while those who had failed their professional exams were excluded.
The students appearing in the exam were divided into two groups. Three identical, independent circuits were designed, each comprising 17 OSCE stations: 14 active and 3 rest stations. The duration at each station was 7 minutes (Figure).
The ECOES exam was based on student performance assessed through a global rating scale. All stations were pass-fail stations, with a pass graded as either clear or borderline. Any student who performed poorly on a station was declared to have failed it and required remediation, whereas students who passed a station were declared either a borderline or a clear pass. The failed students were called to a remedial session, facilitated by a clinical faculty member who demonstrated the skill in a simulated environment for the student to observe. Once the facilitator completed the demonstration, the student was instructed to perform the same skill under observation. The student was provided feedback on his/her performance until he/she performed well enough to pass the station. Since this exam was a sign-off on the clinical skills learnt during five years of medical school, all students ultimately passed the exam.
All the stations were manned and interactive, and utilised simulated patients or mannequins, as required.
Once the main examination was over, the results were tabulated and a list of students who had failed any station was drawn up. They were called back the next day, given feedback, and the skill was re-demonstrated to them by an instructor in a remedial session.
This was an extremely labour- and time-intensive exercise. A total of 51 faculty members and 10 administrative staff participated, and it took over 20 hours spread over 4 days to complete the practice sessions, the main ECOES exam and the remediation sessions. Verbal feedback was taken from both the students and the faculty.
Results
There were 96 final year medical students. The initial list had duplication amongst specialties. In addition, most specialties felt that virtually all the skills in their specialty were essential. It was through a process of understanding and discussion that a core list of 36 absolute ‘must-know’ essential skills was finalised with the agreement of all the stakeholders (Table).
All 96(100%) students ultimately passed ECOES. Student reflection on these skills and motivation to practise them, examiner understanding of the scoring tool, and examiner capacity-building were identified as crucial elements during the evaluation process. Verbal feedback from both the students and the faculty was positive, and there was agreement that the exercise was needed and relevant, and should be continued.
Discussion
The development of the list of essential skills was an arduous process, which required extensive discussion with the relevant departments. Most departments felt that all skills pertaining to their respective specialty were essential and all needed to be tested. The identification of the most important procedural skills by physician educators has previously been done using a survey10, and a similar approach in the current study would have eased the listing of clinical/technical skills. Another study11 described the importance of involving different stakeholders while identifying essential skills and establishing clear objectives for a module or clerkship.
It is suggested that this list be a living document, revised every year by the relevant department faculty to assess whether it requires any amendment. This should be based on feedback received from students during their clerkships, which would aid in improving the planning of clinical teaching sessions during clerkships and the subsequent assessment of student performance12. Using competency-based medical education guidelines to generate a list of frequently performed skills would aid the process further10.
Reflection plays a crucial role in modifying student behaviours and increasing their motivation for learning essential skills in successive years of medical school. This is clearly evident in a review article that explored the various educational interventions used to develop reflection in undergraduate medical students13. While identifying conceptual perspectives on the transition of students from preclinical to clinical years, a study also elaborated the significance of reflection in learning14. Therefore, students should be encouraged to reflect, and then sign up for the practice sessions they wish to attend.
Lack of proper understanding of the assessment tool and the clinical experience of examiners have been identified as major factors affecting the rating of student performance. A study that explored the effect of clinical experience on assessment scores in an OSCE proposed the use of videos for the training of examiners15. Our current OSCE rating scale was a simplified numerical scale from 1 to 6, indicating progressively increasing competence for an attribute, together with a global rating for the task. It was developed by our medical education experts in the Department for Educational Development (DED). The skill being tested was broken into its various components, which were marked as either ‘done’ or ‘not done’. The ‘done’ category was further elaborated as partially done, satisfactory and excellent. To assess overall competence in the skill, a global rating scale with four attributes, namely clear fail, borderline fail, borderline pass and clear pass, was used, and the examiner had to indicate whether the student had failed or passed the station. There is a considerable body of literature indicating that the correlation of checklist scores with global ratings depicts a clear picture of the quality of performance16; hence, the global rating was included to add more value to the examiner ratings. The introduction of an e-learning training package using videos for examiner calibration on rating scales has also been stressed17.
One key element that dictates the success of any assessment process is capacity-building for standardisation. It is imperative that the examiners understand what is being tested in a particular OSCE and confine their opinion only to that skill. A recent study emphasised the importance of faculty development and training for the standardisation of the exam18.
The teaching resources provided to the students by the faculty were ensured to be referenced from standard and accepted sources of knowledge in the particular specialty. These resources were also shared with the examiners who scored the ECOES assessment, so that the students were assessed only on the skills they had been taught.
Finally, remediation of skills on the same day was a source of great stress for all stakeholders. According to Chou et al., remediation is a steady process of correcting errors in competence and allowing trainees to practise until they become confident in the demonstration of clinical skills19. The chances of marking errors, and student and examiner fatigue, were prominent in the feedback received from students, examiners and administrative staff. All were in favour of remediation on the next day, to enable non-pressurised and effective sessions to take place.
Examiner training is vital, and rating scales should be simple, allowing a clear assessment of the student’s skill. Remediation on the following day allows the examiners and students to approach the task afresh and achieve competence to the satisfaction of both the trainer and the student19. There is also a need to develop a list of essential skills and to update it periodically as a living document. Similar studies may be recommended in other medical colleges to explore essential skills in the undergraduate medical curriculum. An effectively designed assessment, and the provision of opportunities for deliberate practice during remediation sessions, are appreciated by both students and examiners, and allow the students to graduate with the best skill-set possible20.
Conclusion
The ECOES examination allowed a robust assessment of the essential ‘must-know’ skills of final year MBBS students just before their graduation.
Disclaimer: None.
Conflicts of Interest: None.
Source of Funding: None.
References
1. Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach. 2007; 29:642-7. doi: 10.1080/01421590701746983.
2. Lockyer J, Carraccio C, Chan MK, Hart D, Smee S, Touchie C, et al. Core principles of assessment in competency-based medical education. Med Teach. 2017; 39:609-16. doi: 10.1080/0142159X.2017.1315082.
3. Van Melle E, Hall AK, Schumacher DJ, Kinnear B, Gruppen L, Thoma B, et al. Capturing outcomes of competency-based medical education: The call and the challenge. Med Teach. 2021; 43:794-800. doi: 10.1080/0142159X.2021.1925640.
4. Halman S, Fu AY, Pugh D. Entrustment within an objective structured clinical examination (OSCE) progress test: Bridging the gap towards competency-based medical education. Med Teach. 2020; 42:1283-8. doi: 10.1080/0142159X.2020.1803251.
5. Sultan AS. Bedside teaching: an indispensible tool for enhancing the clinical skills of undergraduate medical students. J Pak Med Assoc. 2019; 69:235-40.
6. Radabaugh CL, Hawkins RE, Welcher CM, Mejicano GC, Aparicio A, Kirk LM, et al. Beyond the United States medical licensing examination score: assessing competence for entering residency. Acad Med. 2019; 94: 983-9. doi: 10.1097/ACM.0000000000002728.
7. Teo AR, Harleman E, O’Sullivan PS, Maa J. The key role of a transition course in preparing medical students for internship. Acad Med. 2011; 86:860-5. doi: 10.1097/ACM.0b013e31821d6ae2.
8. Ayaz O, Ismail FW. Healthcare Simulation: A Key to the Future of Medical Education–A Review. Adv Med Educ Pract. 2022; 13:301-8. doi: 10.2147/AMEP.S353777.
9. Offiah G, Ekpotu LP, Murphy S, Kane D, Gordon A, O’Sullivan M, et al. Evaluation of medical student retention of clinical skills following simulation training. BMC Med Educ. 2019; 19:1-7. doi: 10.1186/s12909-019-1663-2.
10. Battaglia F, Sayed C, Merlano M, McConnell M, Ramnanan C, Rowe J, et al. Identifying essential procedural skills in Canadian undergraduate medical education. Can Med Educ J. 2020; 11:e17-23. doi: 10.36834/cmej.68494.
11. Ahmed R, Naqvi Z, Wolfhagen I. Psychomotor skills for the undergraduate medical curriculum in a developing country--Pakistan. Educ Health. 2005; 18:5-13. doi: 10.1080/13576280500042473.
12. Russel SM, Geraghty JR, Kobayashi KR, Patel S, Stringham R, Hyderi A, et al. Evaluating Core Clerkships: Lessons Learned From Implementing a Student-Driven Feedback System for Clinical Curricula. Acad Med. 2021; 96:232-5. doi: 10.1097/ACM.0000000000003760.
13. Uygur J, Stuart E, De Paor M, Wallace E, Duffy S, O’Shea M, et al. A Best Evidence in Medical Education systematic review to determine the most effective teaching methods that develop reflection in medical students: BEME Guide No. 51. Med Teach. 2019; 41:3-16. doi: 10.1080/0142159X.2018.1505037.
14. Atherley A, Dolmans D, Hu W, Hegazi I, Alexander S, Teunissen PW. Beyond the struggles: a scoping review on the transition to undergraduate clinical training. Med Educ. 2019; 53:559-70.
15. Donohoe CL, Reilly F, Donnelly S, Cahill RA. Is there variability in scoring of student surgical OSCE performance based on examiner experience and expertise? J Surg Educ. 2020; 77:1202-10. doi: 10.1016/j.jsurg.2020.03.009.
16. Norcini J. Workplace Assessment. In: Swanwick T, Norcini J, eds. Understanding Medical Education: Evidence, Theory and Practice. London: Wiley-Blackwell, 2010; pp 232-45.
17. Moreno‐López R, Sinclair S. Evaluation of a new e‐learning resource for calibrating OSCE examiners on the use of rating scales. Eur J Dent Educ. 2020; 24:276-81. doi: 10.1111/eje.12495.
18. Yeates P, Moult A, Cope N, McCray G, Xilas E, Lovelock T, et al. Measuring the Effect of Examiner Variability in a Multiple-Circuit Objective Structured Clinical Examination (OSCE). Acad Med. 2021; 96:1189-96. doi: 10.1097/ACM.0000000000004028.
19. Chou CL, Kalet A, Costa MJ, Cleland J, Winston K. Guidelines: The dos, don’ts and don’t knows of remediation in medical education. Perspect Med Educ. 2019; 8:322-38. doi: 10.1007/s40037-019-00544-5.
20. Moulaert V, Verwijnen MG, Rikers R, Scherpbier AJ. The effects of deliberate practice in undergraduate medical education. Med Educ. 2004; 38:1044-52. doi: 10.1111/j.1365-2929.2004.01954.x.