Objectives: To build a consensus on portfolio framework for master's in health professional education students and document programme learning outcomes, tasks for students related to each outcome, and the pieces of evidence regarding the completion of each task.
Method: The modified Delphi study was conducted from February to July 2020 at Riphah International University, Islamabad, Pakistan, and comprised a three-round electronic-based survey of faculty members associated with the master's in health professional education programme, alumni, and current students as well as portfolio experts. The panellists had to choose from 10 programme learning outcomes, 75 tasks for students to achieve those outcomes, and 510 pieces of evidence to confirm that the tasks had been done to achieve the outcomes. A consensus cut-off of ≥80% was decided to select the item.
Results: Of the 45 stakeholders approached, 41(91.5%) responded in round 1. Of them, 31(75.6%) responded in round 2, while round 3 comprised responses from 30(96.7%) subjects. The draft template was originally derived from the master's in health professional education programme guide, expert opinions, and a systematic literature review of portfolios for other higher education degrees. The list of items was refined through a pilot study. The final template was approved by the expert panel after 3 iterations. The final list of items included 59(78.7%) tasks and 105(21%) pieces of evidence related to all the 10 programme learning outcomes.
Conclusion: The important programme learning outcomes, their related tasks, and the required pieces of evidence to be added in the e-portfolio of master's in health professional education programme students were identified, and recommendations for the format of implementation and assessment were given.
Keywords: e-Portfolio, MHPE, Delphi, Medical education, Programme learning outcomes, Evidence for e-portfolio. (JPMA 72: 1106; 2022)
Students of master's in health professional education (MHPE) come from very diverse backgrounds, ranging from fresh graduates to senior professors. The majority of them are accustomed to teacher-oriented learning and are usually reluctant to set their own learning goals, identify their own learning needs, and perform self-assessment of their learning activities by keeping a reflective log.1 For the faculty, keeping track of individual student learning and inculcating the reflective practices required to make them life-long learners is difficult. Moreover, measuring the impact of MHPE in terms of the translation of this learning into useful practices at the workplace seems an unapproachable task.
In a rapidly progressing society, where knowledge is created and disseminated fast via electronic tools and the internet, and where the educational paradigm is shifting to a post-positivist approach focused on real, student-centred and autonomous learning, the use of new educational tools, like the e-portfolio, has become necessary.2 A portfolio has been defined as a purposeful collection of a person's work, achievements or learning activity, depending upon the type and purpose of the portfolio, along with reflections on one's effort during the process. It has been identified in the medical education literature as an important tool to stimulate and monitor self-directed learning (SDL).3 The electronic modification of the portfolio has gained more popularity than paper-based portfolios because it provides a private webspace and the opportunity to collect and organise artefacts, called pieces of evidence (PoEs), in many formats, like audios, videos, hyperlinks, graphics and text. Other advantages include easy editing and accessibility at different places.4 Alongside the traditional methods of assessment, the use of the e-portfolio for the evaluation and training of MHPE graduates will empower MHPE students to achieve personalised competencies by fostering critical thinking and reflective practices. Reflection has been emphasised over artefacts so that the e-portfolio may not become an electronic scrapbook.5,6
The advantages of a learning e-portfolio are often overshadowed by the challenges of deciding its purpose (one or many); choosing the tool (generic or specific); selecting the evidence (its number, level and quality); improving acceptance and motivation among the users; and defining the skills students require for creating the e-portfolio.7
Creating and maintaining e-portfolios have different meanings for deans, faculty, students and accrediting bodies, so when the implementation of e-portfolio is considered, one should not only consider the decisions of the faculty, but also the versatility and choices of the users. Data on students' perspectives of e-portfolio and its effect on successful outcomes suggests that the successful implementation of e-portfolio requires informed practices involving communication and engagement of all the stakeholders, especially the students.8,9 So far, no e-portfolio has been designed for MHPE and its introduction requires consensus on its content and format structure. Therefore, the current study was planned to reach a consensus on the desired content and format requirements of MHPE students' e-portfolio in congruence with content standards for the MHPE programme.
Materials and Methods
The modified Delphi study was conducted from February to July 2020 at Riphah International University, Islamabad, Pakistan, after approval from the ethics review committee of the Islamic International Medical College, Islamabad, and comprised national and international experts to achieve consensus on the content and format of the e-portfolio of MHPE students. The opinions and consensus of the expert panel were sought about the content and format of the e-portfolio for MHPE. The panellists were free to give their opinion anonymously. Iteration allowed the panellists to change their responses and resolve disagreements. Controlled feedback about the group responses was sent to the experts after each round. The Conducting and Reporting Delphi Studies (CREDES) guidelines were followed throughout the process.10,11
The Delphi information sheet along with the informed consent form was sent to all the participants, assuring them of confidentiality and anonymity.
Using the hierarchical stopping criteria for Delphi studies given by Heiko A,12 all items were categorised as 'included', 'excluded' or 'non-consensus'. A consensus cut-off of ≥80% was decided to select an item (Table-1). The end-point was a maximum of three rounds, as this enables adequate reflection on group responses, is considered optimal to reach consensus,13 and also limits responder attrition and fatigue.
The panel had a maximum variation of experts, including the implementers, the assessors, and the users.
The items for the survey questionnaire were based on the MHPE programme guide, the opinions of 6 experts and a systematic literature search.14 Two experts were specialists in the field, while 4 were MHPE students. A three-point Likert scale was used owing to its practical convenience, applicability and adequacy.15 Experts were asked questions at three levels.
The first question was: which of the following domains / programme learning outcomes (PLOs) would you like to include in the e-portfolio of MHPE? For each PLO selected, a list of items was provided enlisting the tasks to be performed by the students to achieve the selected PLO.
The second question was: which of the following tasks from the above domain would you like to include in the e-portfolio of MHPE? For each task selected, the experts had to choose some PoEs / artefacts confirming that the task had been completed towards achieving the selected PLO.
The third question was: which of the following PoEs / artefacts would you like to include in the e-portfolio of MHPE for the said task?
A pilot study on 17 participants was conducted to assess the procedures for usability and analysis of survey questionnaires.
In round 1, a Google Form was used to collect data from the participants. The questionnaire included 10 PLOs, 75 tasks across the PLOs, and 5-7 PoEs for each task. Six additional questions were asked about the implementation and assessment format of the e-portfolio. Participants' demographics were also collected in this round. The survey remained open for 4 weeks, and comment boxes were provided. The participants were required to choose one or more PoEs for each task they selected from the given set of PoEs. The approximate time to complete the survey was 40-45 minutes.
In round 2, all the variables from round 1, along with each participant's own response and a group summary, were sent to all the participants using the same scale. The participants were given 2 weeks to respond. Reminders were sent weekly via email and WhatsApp groups.
In round 3, a close-ended questionnaire, based on merged and rephrased items on which consensus or stability had not been achieved in the first two rounds, was sent via Google Form to the same panellists, accompanied by a summary of group responses plus their own responses from the previous round.
Results
Of the 45 stakeholders approached, 41(91.5%) responded in round 1. Of them, 31(75.6%) responded in round 2, while round 3 comprised responses from 30(96.7%) subjects (Table-2).
Consensus was reached on all the 10 PLOs (Table-3).
The 10 PLOs were teaching and learning (T&L), curriculum development (CD), programme evaluation (PE), research and scholarship (R&S), assessment planning and design (AP&D), educational leadership (EL), communication skill (CS), SDL, educational psychology (EP) and total quality management (TQM) (Table-4).
Of the 75 tasks presented to the panel, 59(78.7%) were approved, and, of the 507 PoEs presented, 105(21%) were approved with consensus. The most selected PoE was evidence from workplace (EfW) plus group project 41(39%), followed by assignment with score (AwS) 28(26.7%).
The final list contained 3 types of PoEs for the selected 10(100%) PLOs and 59(78.7%) tasks: PoEs that met stable agreement for inclusion 58(11%); PoEs that met stable agreement for exclusion 401(79%); and PoEs that did not meet the stability criteria, but which a majority agreed to include in the final e-portfolio 47(9%). The consensus was built gradually over the 3 rounds (Figure-2).
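The reported proportions follow directly from the stated counts; as a quick arithmetic check (using the 507 PoEs presented to the panel, as reported above):

```python
# Counts as reported in the results
tasks_presented, tasks_approved = 75, 59
poes_presented, poes_approved = 507, 105

# Approved tasks: 59 of 75
print(f"Tasks approved: {tasks_approved / tasks_presented:.1%}")  # 78.7%

# Approved PoEs: 105 of 507 (58 stably included + 47 included by majority)
print(f"PoEs approved: {poes_approved / poes_presented:.0%}")     # 21%
```

Note that the abstract gives 510 PoEs while the results give 507; the ~21% approval figure holds under either denominator.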
A majority, 28(70%), 22(73%) and 27(87%) in the 3 rounds respectively, opined that the e-portfolio should be mandatory for students. The panellists agreed that the assessment should be both formative and summative 27(87%) as well as qualitative and quantitative 27(87%). The most agreed-upon implementation format was to give the task to students immediately after admission, with the completed e-portfolio to be submitted before the last contact session 20(64.5%). Fewer panellists 18(58%) agreed to add the scores of each assignment, ongoing assessments, and marks sheets for each semester to the e-portfolio. Consensus was achieved 27(87%) that formal student-teacher meetings with duly signed proformas should be a part of e-portfolio implementation. Regarding the application/software choice for making an e-portfolio, Google Sites was the favourite choice in the first 2 rounds 26(70%) and 21(70%), but it did not reach the consensus cut-off. In the third round, students' own choice 27(87%) reached the consensus value.
Discussion
As more and more health professionals are getting enrolled in MHPE programmes, concerns about the assessment of their competency are also being raised. E-portfolios structured around learning outcomes are used worldwide to improve the learning and assessment of many skills, like critical thinking, application of knowledge, and reflective thinking.16,17 For MHPE students, the e-portfolio was arranged in the spinal column model18 to increase the reliability of assessment, to maintain the uniformity of content and its alignment with the course outcomes, and to help the mentor and the student stay focussed.19
For the 10 main PLOs, 59 tasks and 105 PoEs were approved, which makes it flexible for MHPE students to choose, in consensus with their mentors, any PoE they feel comfortable with. At the same time, it provides space for all the students to further their competencies and to display their strengths in line with their background knowledge and experience at the workplace, i.e., an individualised learning experience in which students compare their learning with themselves, and not with others.20-22
The e-portfolios have also faced criticism, as some view them as burdensome and an add-on requirement23 rather than as a crucial learning strategy. To keep the students motivated, the PoEs chosen were very feasible to collect: AwS, which is the "work done"24 most routinely during the MHPE programme, and EfW, which comprises the teaching assignments they are doing routinely at their workplace. Some experts chose both to provide triangulation of evidence from all sources.25,26
Through reflection on these assignments, students can easily judge how far they have come in terms of achieving their learning outcomes. However, reflective writing could not attain stability for most of the tasks in the current study. This was somewhat surprising, as many of the panellists were doing reflective writing as an essential course element. This non-selection of reflection shows that many MHPE students did not understand its real potential as a learning tool,8 or that reflective writing and critical thinking skills are challenging, especially for busy clinicians-cum-academicians.23,27 It may also be due to a lack of confidence in how this information will be used or displayed. This is an important finding which points towards not only the need for training of faculty and students in this area, but also the introduction of a strong mentoring and feedback system.28
Assessment at a certain level requires longitudinal observation, but it might be possible for some MHPE students to produce evidence from the workplace for some tasks, like applying meta-cognitive strategies, Gagne's 9 events and microteaching principles, in their routine teaching activities. In coherence with the above, feedback from students or peers may work as the PoE for 12 such tasks. Some students might have the responsibilities of curricular and assessment alignment, standard setting and item analysis, curricular integration, and planning and designing various levels of assessment at their workplace in groups, so group projects were chosen as the PoE for such tasks. It was suggested to carefully individualise the PoEs according to the students' contexts. As the learning curves are different for novice, experienced and expert learners, determining the level and quality of the pieces of evidence needs more deliberation, and maybe another consensus study, to avoid a potential pitfall of portfolio implementation by making it too complex.24
Formal mentor-mentee meetings were rightly emphasised by all the experts, without which they thought the students might not be able to identify their learning needs and make a correct personal development plan (PDP). Thus, healthy relationships, fruitful conversations and prolific mentor-mentee interactions are necessary to take full advantage of e-portfolios, where in-depth feedback from teachers can help students grow, improve and mature as learners and, thus, as teachers in the future.28,29
Regarding the platform to be used for the MHPE e-portfolio, the majority of the panel chose MS Word or Google Docs and Google Sites because these apps are easily available, user-friendly and equipped with functions for attaching the required documents, videos, audios or images. These platforms are being used successfully for various other disciplines and were preferred by students even when other platforms were provided.30,31
The current study may serve as a stepping stone towards a qualitative inquiry, via focussed group discussions (FGDs) or cognitive interviews with selected experts, to further explore the tasks and PoEs which gained the agreement of the majority (>50%) of the group but failed to reach the cut-off value of ≥80%, and to finalise the quality and quantity of each PoE keeping in mind the diversity of the students' backgrounds. The current study may also be followed up by an experimental research project in which this prototype e-portfolio might be implemented and checked for the feasibility of implementation.
The current study is the first Delphi undertaking to identify each item of the e-portfolio to be included in or excluded from the final product, validated by expert medical educationists.
Conclusion
The expert panel reached consensus over 10 PLOs, 59 tasks and 105 PoEs for an MHPE e-portfolio. The experts emphasised asking for evidence in accordance with the context of the learner, assessing the e-portfolio both formatively and qualitatively as well as summatively and quantitatively, and conducting formal meetings between the mentor and the mentee.
Disclaimer: The text is based on a thesis related to the master's in health professional education programme (MHPE).
Conflict of Interest: None.
Source of Funding: None.
References
1. Zafar Z, Ali S. Education System of Pakistan: Social Functions and Challenges. J Indian Stud. 2018; 4:31-51.
2. O'Keeffe M, Donnelly R. Exploration of ePortfolios for Adding Value and Deepening Student Learning in Contemporary Higher Education. Int J ePortfolio. 2013; 3:1-1.
3. Song BK. E-portfolio implementation: Examining learners’ perception of usefulness, self-directed learning process and value of learning. Aus J Educ Technol. 2021; 37:68-81.
4. Altynbilek I. The Advantage Of Electronic Portfolio In Assessing Student Learning. Alatoo Acad Stud. 2020; 4:56-63.
5. Van Wyk MM. An e-portfolio as empowering tool to enhance students’ self-directed learning in a teacher education course: A case of a South African University. South Afr J Higher Educ. 2017; 31:274-91.
6. Barrett H. Balancing the two faces of ePortfolios. Educação Formação Tecnol. 2010; 3:6-14.
7. Paulson EN, Campbell N. Collective Approaches to ePortfolio Adoption: Barriers and Opportunities in a Large Canadian University. Can J Scholar Teach Learn. 2018; 9:3-4.
8. Scully D, O’Leary M, Brown M. The learning portfolio in Higher Education: a game of snakes and ladders. In: Scully D, O’Leary M, Brown M, eds. Policy & Practice in Education (CARPE) and National Institute for Digital Learning (NIDL). Dublin: Dublin City University, Centre for Assessment Research, 2018.
9. Lambert S, Corrin L. Moving towards a university wide implementation of an ePortfolio tool. Aus J Educ Techno. 2007; 23:43-57.
10. Jünger S, Payne SA, Brine J, Radbruch L, Brearley SG. Guidance on Conducting and REporting DElphi Studies (CREDES) in palliative care: Recommendations based on a methodological systematic review. Palliat Med. 2017; 31:684-706.
11. Veugelers R, Gaakeer MI, Patka P, Huijsman R. Improving design choices in Delphi studies in medicine: the case of an exemplary physician multi-round panel study with 100% response. BMC Med Res Methodol. 2020; 20:156.
12. Heiko AV. Consensus measurement in Delphi studies: review and implications for future quality assurance. Techno Forecast Soc Change. 2012; 79:1525-36.
13. Graham B, Regehr G, Wright JG. Delphi as a method to establish consensus for diagnostic criteria. J Clin Epidemiol. 2003; 56:1150-6.
14. Trakman GL, Forsyth A, Hoye R, Belski R. Developing and validating a nutrition knowledge questionnaire: key methods and considerations. Public Health Nutr. 2017; 20:2670-9.
15. Lehmann DR, Hulbert J. Are three-point scales always good enough? J Market Res. 1972; 9:444-6.
16. De Swardt M, Jenkins LS, Von Pressentin KB, Mash R. Implementing and evaluating an e-portfolio for postgraduate family medicine training in the Western Cape, South Africa. BMC Med Educ. 2019; 19:1-3.
17. Morreale C, Zile-Tamsen V. Thinking Skills by Design: Using a Capstone ePortfolio to Promote Reflection, Critical Thinking, and Curriculum Integration. Int J ePortfolio. 2017; 7:13-28.
18. Bentaib M, Aït Daouad M, Touri B, Namir A, Labriji L, El Kouali M, et al. Implementation of a Computing Device to Quality Service in Learning in Higher Education to Enhance Student's Quality of Life. IOSR J Res Method Educ. 2014; 4:1-7.
19. Haldane T. “Portfolios” as a method of assessment in medical education. Gastroenterol Hepatol Bed Bench. 2014; 7:89-93.
20. Lewis KO, Baker RC. The development of an electronic educational portfolio: an outline for medical education professionals. Teach Learn Med. 2007; 19:139-47.
21. Datta R, Datta K, Routh D, Bhatia JK, Yadav AK, Singhal A, et al. Development of a portfolio framework for implementation of an outcomes-based healthcare professional education curriculum using a modified e-Delphi method. Med J Armed Forces India. 2021; 77:S49-56.
22. Alzouebi K. Electronic portfolio development and narrative reflections in higher education: Part and parcel of the culture? Educ Inf Technol. 2020; 25:997-1011.
23. Heeneman S, Driessen EW. The use of a portfolio in postgraduate medical education–reflect, assess and account, one for each or all in one?. GMS J Med Educ. 2017; 34: Doc57.
24. Van Tartwijk J, Driessen EW. Portfolios for assessment and learning: AMEE Guide no. 45. Med Teach. 2009; 31:790-801.
25. Sim JH. Moving Towards a Mixed-Method Approach to Educational Assessments. Academic Medicine. London: Lippincott Williams and Wilkins, 2017; pp-726.
26. Schuwirth LWT, van der Vleuten CPM. A history of assessment in medical education. Adv Heal Sci Educ. 2020; 25:1045-56.
27. Rich AJ, Holm SE, Feltman C, Thomas S. A Journey from Patient Care to Jesuit Higher Education: How a Small Group of Healthcare Professionals Navigated the Transition into Academia. Jesuit Higher Educ. 2019; 8:36-44.
28. Leering MM. Perils, pitfalls and possibilities: Introducing reflective practice effectively in legal education. Law Teach. 2019; 53:431-45.
29. Heeneman S, de Grave W. Tensions in mentoring medical students toward self-directed and reflective learning in a longitudinal portfolio-based mentoring system–an activity theory analysis. Med Teach. 2017; 39:368-76.
30. Gutiérrez-Morales G, Lara-Gutiérrez Y, Alpuche-Hernández A, Trejo-Mejía A, Sánchez-Mendiola M. Assessment of competencies in pediatric pneumology residents: Use of an electronic portfolio. NCT Neumología y Cirugía de Tórax. 2019; 78:4-9.
31. Roberts P. Developing reflection through an ePortfolio-based learning environment: design principles for further implementation. Tech Pedago Educ. 2018; 27:313-26.