Original research
Developing an integrated national simulation-based educational programme for Scottish junior doctors through structured, multistep action research cycles
Neil Malcolm Harrison1, Ashley Dennis2
1 School of Medicine, University of Dundee, Dundee, UK
2 Office of Medical Education, Billings Clinic, Billings, Montana, USA
Correspondence to Dr Neil Malcolm Harrison; n.x.harrison@dundee.ac.uk

Abstract

Objectives Simulation is widely employed to teach a range of skills across healthcare professions and is most effective when embedded within a standardised curriculum. Although recommended by many governing bodies, establishing a national programme of simulation presents many challenges. Successful implementation requires a clear understanding of the priorities and needs of those it seeks to serve, yet there are limited examples of how best to do this. This study aimed to develop an integrated national simulation-based educational programme for junior doctors in Scotland through a structured, multistep prioritisation process.

Design A series of action research cycles were undertaken to develop and evaluate a national simulation programme. This paper describes cycle 1, which employed a six-step structured approach to understand and prioritise learner needs.

Setting The study considered the educational needs of Scottish junior doctors in the UK Foundation Programme (UKFP).

Participants Multiple stakeholder groups were involved in each stage of the process including recent Scottish UKFP graduates, clinical educators, UKFP programme directors and postgraduate deans.

Results Key stakeholders reviewed the 370 competencies in the UKFP curriculum and identified 18 initial competency areas. These 18 areas were subsequently prioritised through the analytical hierarchy process, resulting in a carefully ordered list of 12 competencies from which a targeted simulation-based educational programme could be developed.

Conclusions To our knowledge, this is the first study to outline the methods of competency prioritisation to create a simulation curriculum that is integrated within a national curriculum in the medical education context. As well as demonstrating the practical steps of such a process, key implications for practice are identified. This robust approach to educational design also resulted in unexpected benefits, including educator and clinician acceptance and programme funding sustainability.

  • MEDICAL EDUCATION & TRAINING
  • EDUCATION & TRAINING (see Medical Education & Training)
  • GENERAL MEDICINE (see Internal Medicine)


This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.


STRENGTHS AND LIMITATIONS OF THIS STUDY

  • The involvement of representative stakeholders across roles and locations allowed a national picture of needs and priorities representing multiple perspectives.

  • The analytic hierarchy process allowed the combination of multiple criteria and stakeholders to provide diverse perspectives on curricular priorities in a quantifiable way.

  • The balance of rigour and feasibility meant that only two clinical specialties were represented in the expert clinician group.

  • The ability of stakeholders to self-assess their needs is variable and challenging to account for.

Introduction

Simulation is a powerful method of medical education.1 2 Simulation-based education (SBE) is an effective means of learning clinical skills, and data indicate that these skills can be transferred to clinical practice.3 4 Simulation is employed to teach a range of skills across healthcare professions, and the evidence base has grown in both technical skills training and non-technical skills development.4–9 Simulation is also used to assess new clinical environments and workflows.10

SBE is most effective when exercises are embedded within a standardised curriculum.1 11 12 As a result, governing bodies increasingly recommend the integration of simulation-based training into postgraduate specialist training programmes.13 Without consideration of its place in a curriculum, simulation risks being poorly targeted and poorly integrated, hampering sustainability and quality.12 14 This is particularly pertinent for large national training programmes, where variation in clinical placements and multiple modes of curriculum delivery exacerbate the challenges of aligning simulation with the wider curriculum.

Establishing a national programme of education presents many challenges, and successful implementation requires a clear understanding of the priorities and needs of those it seeks to serve. This is vital because any educational intervention is limited by the available resources and funding and by competing demands on a learner's time, such as the need to gain sufficient clinical experience and the demands of clinical service delivery. Furthermore, identifying which aspects of a curriculum are best delivered through simulation results in a more goal-directed and, in turn, sustained use of simulation.12 Because of the diverse range of factors affecting clinical education, learning needs are likely to be widely dispersed across the curriculum and varied in nature. Needs analysis is a key component of curriculum development and a key starting point for the application of simulation.15–18 Although recognised as the first step in the design of any educational intervention in medical education, it is a step that is rarely described clearly and is at times overlooked or approached unsystematically.19

There are limited examples in the literature of national healthcare curricula effectively integrating simulation.20 National programmes of simulation training for postgraduate medical education have been shown to be effective in some specialist training, for example, in surgical training and paediatric surgical training.20 21 Current examples focus on evaluating a programme's success in terms of impact and feasibility but give limited information on the process of developing the programme. For example, Breaud et al explain the involvement of a number of experts nationally, supervised by the national college, but give no practical description of how this was done.20

Identifying and prioritising learning needs relevant to a large body of junior doctors across a country requires a thorough yet manageable process. Due to the gap in the healthcare simulation literature around national educational programmes, no established approaches exist. An initial prioritisation method was needed to answer the question: what competencies should be prioritised to develop a targeted Scottish simulation curriculum aligned with the UK Foundation Programme (UKFP)? This paper aims to demonstrate how competencies were identified and prioritised for this programme through a structured approach.

Methods and results

Context

The introduction of the UKFP was proposed by the Chief Medical Officer for the National Health Service (NHS) in 2002 and implemented in 2005 to address concerns about training in the previous junior doctor grades in the UK.22 The UKFP comprises the 2 years that directly follow graduation from medical school, similar to the internship year in countries such as the USA. It provides newly qualified doctors with 2 years of basic clinical training before they transition to specialist training.23 All doctors entering specialist training in the UK must be able to demonstrate that they have attained the UKFP competencies.

The UKFP has a set curriculum with a comprehensive list of competencies that all Foundation Year (FY) doctors are expected to achieve by the end of their 2-year training period.24 Each doctor undertakes a series of clinical posts, including medical and surgical posts in the first year, followed by a series of additional posts within specialties.25 The specialties covered, and the clinical work to which doctors are exposed, vary. Their experience also depends on geographical factors, differences between healthcare trusts, colleagues, local educational contexts, and supervision and provision. Varying shift patterns and pressure to fulfil clinical duties can also create barriers to trainees attending organised learning events.26 Coverage of the curriculum competencies, in what was designed to be a standardised generalist training, therefore has significant elements of variability.27

Since the institution of the UKFP and its curriculum, there have been concerns that clinical experience and opportunities for training are not consistently meeting all of the requirements for the junior doctors.28 29 Two reviews concluded that although delivery of service is itself a major part of training, this competing demand on time significantly complicates delivering a consistent and comprehensive curriculum to the FY doctors.27 30 Moreover, service demands have increased since these reports.31

In 2014, the Foundation Programme directorate in Scotland and NHS Education for Scotland sought to support clinical training with a new national programme of focused education for all trainees, integrated into their working shift pattern. The working group also identified a future need to develop an integrated simulation-based programme of education, given its strong evidence base in support of clinical learning experiences. The Foundation Programme in Scotland sits as an individually funded area within the wider UKFP, which provided a clear boundary for the purposes of this work. The Scottish Foundation Programme is divided into areas, or deaneries, to allow doctors to undertake their clinical placements within a reasonably confined geographical area. Each deanery, led by a postgraduate dean, is responsible for overseeing the educational experiences of trainees in its jurisdiction.

Patient and public involvement

There was no direct patient involvement in the design of this study.

Methodology

Reflecting a pragmatist stance, in which research aims to enable problem solving, the overarching study methodology was action research.32 33 Action research seeks understanding of a situation and identifies needs in order to drive the development of an intervention. In action research, data are collected from people's experiences and insights, and knowledge creation comes from how these are interpreted.34 This study sought to create an intervention to enhance the UKFP through a simulation programme. In keeping with this theoretical basis, the study used a mixed-methods action research approach to provide a thorough understanding of the situation and to drive the development of an educational programme.35 Carr and Kemmis developed a reflective spiral model for taking action to improve educational activities: plan, act, observe and reflect.36 In alignment with this model, the overarching body of work involved three cycles: cycle 1, needs analysis and prioritisation; cycle 2, simulation curriculum development; and cycle 3, simulation curriculum analysis. This paper focuses on cycle 1.

Cycle design

The lack of existing approaches to prioritising learning needs for an integrated national healthcare simulation curriculum meant that principles were drawn from curriculum design approaches in other disciplines. Hoadley-Maidment developed the concept of the needs analysis triangle: student-perceived needs, teacher-perceived needs and company-perceived needs.37 Additionally, Golden and McGaghie argued that healthcare education design needs to be directly linked to healthcare needs.38 39 In alignment with these views, the researchers structured the analysis around three stakeholder groups: FY doctors, clinical educators and foundation programme directors. Furthermore, clinical significance was included as a key indicator. One of the most significant challenges in designing this needs analysis was the magnitude of the UKFP curriculum, which contains 370 competencies. From a practical perspective, it was unrealistic to have stakeholders, especially trainees, meaningfully engage with all 370 competencies. Moreover, the group considered the limitations of what was possible from a resource and logistical perspective. These limitations dictated that the simulation programme would comprise no more than two half-day exercises. Finding a way to narrow the competencies so they could be evaluated, with the most relevant included in the curriculum, was key.

Using these considerations to frame the design, a multistep process involving cycles of work, typified in action research, was incorporated: (1) develop exclusion criteria; (2) review competencies for inclusion/exclusion; (3) assess clinical significance of competencies; (4) evaluate perceived usefulness of training; (5) conduct the analytic hierarchy process (AHP) and (6) select final competencies for inclusion. Across the various steps, multiple stakeholder groups were included: recent Scottish UKFP graduates, clinical educators, UKFP directors and postgraduate deans. In all steps, participants were fully informed about the nature of the project, including its potential outcomes, and voluntarily consented to participate. The methods and results of each step are presented in turn below.

Step 1: development of exclusion criteria

A group of experts was recruited from the very small number of individuals in Scotland who had all of the experience and understanding needed: highly experienced clinical educators with a broad understanding of the UKFP and of the context of any potential educational intervention, which would enable them to set meaningful criteria. The researcher met individually with each of seven potential experts to outline the study and its purpose and to clarify any questions they had prior to participation. Two experts were unable to participate due to clinical commitments. Five clinical education experts (the Scottish UKFP programme director, three associate postgraduate deans and a clinical skills director) agreed to participate and met to develop and agree exclusion criteria to distill the list of Foundation competencies.

The group agreed the following exclusion criteria against which to consider the 370 competencies. Competencies would be excluded if they fell into any one of the following categories:

  1. Already covered by an established educational intervention.

  2. Inappropriate for SBE.

  3. Consistently covered by compulsory clinical placements.

Step 2: review of competencies for inclusion/exclusion

The five experts from step 1 were invited to review the competencies for inclusion/exclusion; the attributes described in step 1 also made them the ideal group for this task. Three experts from step 1 (two were unable to participate because of other commitments) filtered the UKFP curriculum's 370 competencies using an electronic Delphi approach. The process involved a series of rounds of categorising the competencies via email. First, each individual highlighted the areas of the curriculum they felt should be included after considering the exclusion criteria. These responses were compiled, and competencies deemed to meet the exclusion criteria were removed. The condensed list was resent for a second round of review. Finally, the group met to discuss the reasons for and against including each competency and settle on a final list.

The final list of competencies for inclusion, developed from the three rounds of the Delphi exercise, included 48 curriculum competencies from the original 370. It was noted during this process that a number of the competencies were related. In response to this reflection, the group themed the resulting 48 competencies into 18 related competency areas. The final identified competency areas are detailed in table 1.

Table 1

Competency areas identified in step 2 of the process

Step 3: clinical significance of competencies

A recognised approach to curriculum design is to prioritise the potential content and address the areas felt by experts to be of most importance.40 41 Three senior educational leads were asked to score the competency areas for their perceived clinical significance in the training of an FY doctor. The clinical educators came from two specialty areas: anaesthetics (two scorers) and family medicine (one scorer). These educators were chosen because they had not been involved in the previous steps, were senior clinicians who worked closely with FY doctors in different clinical contexts and had significant experience in curriculum design. Each was asked to give each competency area a score out of 30, with 30 indicating the greatest clinical significance to the work of a doctor at the FY2 stage. A maximum score of 30 was chosen as it was sufficiently large to allow discrimination but not so large as to be unmanageable.

Scores are presented in table 2 as a proportion of 1, as is needed for the analytic hierarchy process described in step 5.
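As a worked illustration of this conversion (a sketch, assuming the standard AHP normalisation in which each raw score is divided by the total across all competency areas; the same conversion is applied to the trainee-derived scores in step 5):

$$c_i = \frac{\bar{x}_i}{\sum_{j=1}^{18} \bar{x}_j}, \qquad \text{so that} \quad \sum_{i=1}^{18} c_i = 1,$$

where $\bar{x}_i$ is the mean raw score (out of 30) given to competency area $i$ and $c_i$ is its clinical significance expressed as a proportion of 1.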

Table 2

Competency area scores and outcome from steps 3, 4, 5 and 6

Step 4: perceived usefulness of training

Doctors in training were consulted about their perceptions of the usefulness of additional training in each competency area. Because of the potential variation across programmes and areas of Scotland, and the intention that the simulation programme have national relevance, it was felt important to gather a large sample of opinions from recent Foundation Programme graduates across the country, representing multiple deaneries. A total of 814 doctors who completed the UKFP in Scotland during 2015 were invited to participate in an electronic survey hosted on the Bristol Online Survey platform. A link to the survey was disseminated via email on behalf of the researcher by the postgraduate deanery, along with a participant information booklet. The online survey also signposted this information booklet and confirmed that, by submitting the survey, respondents were consenting to participate. The survey was open for 4 weeks, with a reminder email sent at the end of the second week.

The survey collected basic professional and demographic data (eg, clinical rotations completed) in order to understand the sample within the wider population. Respondents were then asked to consider the usefulness of further teaching on the 18 competency areas, rating each from 1 (not useful) to 5 (very useful). It was also recognised that some competencies from the initial list of 370 could have been wrongly excluded during step 2. To ensure that important competencies were not missed, participants were given a free text option to highlight additional areas for inclusion. The draft survey was piloted with 10 doctors of varying seniority, including FY doctors, to enhance the clarity and relevance of the findings.41

In total, 132 recent graduates completed the survey (16.2% response rate), 37.2% male and 67.8% female. Identifying the training deanery of the respondents was important in describing the sample population. Response rates by deanery ranged from 14% to 19% of the doctors working in that deanery, indicating that the experiences and opinions captured were representative of the programme nationally across Scotland. It was also important to consider the clinical experience of the respondents by looking at the specialties in which they had worked during the Foundation Programme. As expected, almost all respondents had experience in general medicine and general surgery. There was no consistency across the other specialties, with experience in individual specialties ranging from 2% (dermatology) to 38% (general practice) of respondents (see figure 1).

Figure 1

The range of respondents' experience through clinical posts. ED, emergency department; ENT, ear, nose and throat; GP, general practitioner.

The average scores generated by the questionnaire sent to FY doctors are detailed in the first column of table 2. Most competencies had the support of around two-thirds of respondents. One competency had clear support for additional training (emergency detention), and one had very little support for additional training (challenging others on infection control).

In the free text area of the questionnaire, FY doctors identified a number of additional learning needs. The free text data from the questionnaires were analysed using a thematic framework to look systematically for patterns pointing to additional areas of learning need.42 Following analysis, three themes were identified. Sample quotes illustrating each theme are given in table 3.

Table 3

Themes and quotations from open-ended trainee comments

Step 5: AHP

In order to review the data received from the clinical educators and the UKFP graduates in a structured way, an AHP was employed.43 44 This process used a numerical system that allowed the two criteria, trainee-perceived usefulness (gathered via the junior doctors' questionnaire) and clinical significance (gathered through the clinical educators' survey), to be combined into a final score.

Trainee-perceived usefulness

The quantitative data captured by the aforementioned questionnaire were used to rank the degree of training priority that FY doctors placed on each competency. If a doctor scored a competency a 4 or 5 out of 5, it was considered that they would find additional training in this competency useful. The trainee-perceived usefulness score for each competency was therefore the percentage of doctors who scored it a 4 or 5. For each competency, this usefulness score was converted to a proportion of 1, as is required for comparison in an analytic hierarchy process.

Clinical significance of the competency

The scores generated by the clinical educators through the survey were averaged across the three educators, and the average scores were converted to a proportion of 1 to generate a clinical significance score for each competency area.

In order to form accurate conclusions from the criteria, the AHP attributes a weighting to each criterion. This accounts for the fact that some criteria may be more significant than others in answering the proposed question. A score can then be calculated that combines the criteria in appropriate proportion. It was felt that both criteria represented equally important factors influencing the focus of the intervention, and each was therefore given a 50% influence on the combined score. A visual representation of this process is provided (figure 2).
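Expressed as a formula (a sketch of the combination just described, with symbols introduced here for illustration), the combined score for competency area $i$ is

$$s_i = w_u\,u_i + w_c\,c_i, \qquad w_u = w_c = 0.5,$$

where $u_i$ is the trainee-perceived usefulness and $c_i$ the clinical significance, each expressed as a proportion of 1.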

Figure 2

The visual representation of the analytic hierarchy process. ICU, intensive care unit.

The AHP combined the scores from steps 3 and 4, with equal weighting, to provide a combined score for prioritisation, detailed in table 2.
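For readers who wish to reproduce this style of calculation, a minimal sketch in Python is given below. The competency names and scores are illustrative placeholders rather than the study data, and the normalisation of each criterion by its total (so that scores sum to 1) is an assumption based on standard AHP practice.

```python
# Sketch of an equal-weight analytic hierarchy process (AHP) combination.
# All names and numbers below are hypothetical, for illustration only.

RAW_SIGNIFICANCE = {  # mean clinician scores out of 30 (hypothetical)
    "Assessment of capacity": 27.0,
    "Emergency detention": 24.0,
    "Challenging others on infection control": 21.0,
}

RAW_USEFULNESS = {  # % of trainees rating further training 4 or 5 (hypothetical)
    "Assessment of capacity": 71.0,
    "Emergency detention": 83.0,
    "Challenging others on infection control": 34.0,
}


def normalise(scores: dict[str, float]) -> dict[str, float]:
    """Convert raw scores to proportions of 1, as AHP comparison requires."""
    total = sum(scores.values())
    return {area: score / total for area, score in scores.items()}


def ahp_combine(significance: dict[str, float],
                usefulness: dict[str, float],
                w_sig: float = 0.5,
                w_use: float = 0.5) -> dict[str, float]:
    """Combine the two criteria using the study's equal 50% weighting."""
    sig = normalise(significance)
    use = normalise(usefulness)
    return {area: w_sig * sig[area] + w_use * use[area] for area in sig}


if __name__ == "__main__":
    combined = ahp_combine(RAW_SIGNIFICANCE, RAW_USEFULNESS)
    # Rank competency areas by combined score, highest priority first.
    for area, score in sorted(combined.items(), key=lambda kv: -kv[1]):
        print(f"{score:.3f}  {area}")
```

As in the study, a low combined score alone would not exclude a competency; the individual criterion scores behind any large discrepancy still need qualitative review, as described in step 6.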

Step 6: final competency selection

Three senior simulation experts considered the data generated by the earlier five steps. A combined ranking score was calculated as a mathematical method of prioritising the competency areas for the intervention. Competencies with a high combined ranking score were automatically prioritised (eg, assessment of capacity). In reviewing the data for lower-ranked competencies, it was clear that for some there was agreement across both criteria. However, some competencies showed a significant discrepancy between the scores for the two criteria (eg, challenging others on infection control). For these items, the individual scores for each criterion and potential reasons for the discrepancy were considered.

The themes that came out of the open-ended comments from trainees in step 4 were considered against the original exclusion criteria and compared with the existing 18 competency areas. This process identified additional needs in the area of procedural skills. Acute care was excluded because all FY doctors in Scotland already undertake a simulation-based course in this area. Interestingly, the theme of 'dealing with difficult colleagues' was felt to support the inclusion of the 'challenging others on infection control' competency area.

Given the resource and logistical constraint of providing no more than two half-day exercises, the group considered what was feasible to include. Fortunately, this still allowed coverage of 12 of the competency areas. With the competency areas identified, it was agreed that a simulated ward round exercise and a simulated clinic exercise would allow efficient use of resources. Considering all of these points, the group made the decisions about each competency area detailed in table 2.

Discussion

To our knowledge, this is the first study to outline the methods of competency prioritisation to create a simulation curriculum that is integrated within a national curriculum in the medical education context. Through the methods identified in this paper, the first cycle in the action research process was completed.

Summary of key findings

From an initial list of 370 competencies in the national UKFP curriculum, 12 competency areas were identified that provide the basis for the national simulation programme. In step 1, the identified exclusion criteria (already covered by an established educational intervention; inappropriate for SBE; consistently covered by compulsory clinical placements) were instrumental in narrowing down the competencies. In step 2, through consideration of the exclusion criteria, a list of 48 competencies was created, which were further themed into 18 overarching competency areas. Finally, through steps 3–6, the list was further condensed and prioritised. The final list included: assessment of capacity, dealing with angry patients, safeguarding patients, end-of-life care, unsuccessful treatment, emergency detention, palliative care, advanced care planning, consultation via interpreter, three-way consultation, challenging others on infection control and medical evidence. This list formed the groundwork moving into the second cycle of the action research process.

Although the perceived usefulness and the perceived clinical significance, as rated by the Foundation Programme graduates and the senior clinical leads respectively, were in broad alignment, there were some items where this was clearly not the case. This led the simulation experts to consider further data in their decision making. For example, challenging others on infection control was felt to be clinically significant by clinical leads, but FY doctors did not feel that training in this area would be useful. As mentioned above, one group of stakeholders can bring insight into strong reasons to include a need that the other group would neglect. In this example, FY doctors may lack insight into the need for assertiveness in team working, may feel insufficiently prepared or anxious, or may lack the ability to self-assess in this area. The expert group may better understand the strong, patient safety-related need for this skill to be addressed, and it may therefore be important to include it in the intervention despite its lower combined score. The accuracy with which learners can self-assess their needs is debated in the literature, but there is evidence that the least competent and most junior are also the least able to self-assess.45 The free text data from FY doctors conversely suggested that further practice in difficult communications with colleagues would be of value. Considering the data at this level of granularity was important when finally choosing what to include in the programme.

Comparisons with existing literature

A limited number of examples exist in the literature of national curricula effectively integrating simulation. National programmes of simulation training have been shown to be effective in some specialist training, for example, in surgical training.20 21 However, as in this example, the literature focuses on evaluating a programme's success rather than describing the process by which it was designed. It is therefore challenging to compare and contrast processes and to evaluate how they may ultimately affect programme outcomes.

Governing bodies are increasingly recommending the integration of simulation-based training into training programmes. In the UK, the General Medical Council and the Nursing and Midwifery Council both advocate this in undergraduate training, and the Royal College of Surgeons of England emphasised this for postgraduate training in their publication ‘Improving surgical training’.13 46 47 The move towards national simulation programmes integrated with clinical learning requires a practical understanding of how best to do this. Examples of successful approaches and the lessons learnt in the process need to be shared to ensure this is done effectively.

Methodological strengths and limitations

Developing unambiguous criteria was a critical methodological strength, and this was relevant across multiple stages. In steps 1 and 2, it enabled the reviewers to exclude competencies fairly and consistently. As national curricula are enormous, this was essential to identifying a manageable number of relevant competencies while still maintaining a robust process. The chosen criteria were specific enough to bring a shared understanding of purpose yet allowed individuals to bring their expertise and opinion to the process. An important example of this was the 'clinical significance' criterion. A specific yet common understanding of the nature of an FY doctor's role and requirements helped focus participants on the process and enabled them to give more meaningful and reliable input.

Furthermore, across cycle 1, multiple stakeholders with multiple perspectives participated in the multistage process. Careful consideration of stakeholders enhanced the relevance and supported the validity of the process. Including representative stakeholders across roles and locations was critical to creating a national picture of needs representing multiple perspectives.37 That said, the choice of stakeholders was challenging. This prioritisation process involved a number of busy people and required a balance of rigour against feasibility. For example, a limitation of the experts in step 6 was that they represented only two clinical specialties: general practice and anaesthesia. Although it would have been helpful to involve many more experts from a wider range of specialties, representing the various posts through which FY doctors rotate, this was unrealistic. Ensuring the individuals had sufficient educational expertise was the key priority. Furthermore, the UKFP is designed for very junior doctors developing baseline competencies, so the variety of specialists was less important than it might have been at later stages of training.

Next, the AHP in step 5 allowed the combination of multiple criteria and stakeholders to provide diverse perspectives on curricular priorities. This approach could be applied to other national curricula. In this AHP, equal weighting was placed on the different stakeholder groups, whereas most needs analyses place a higher emphasis on the learner. This weighting could be argued a number of ways, but equal weighting was chosen because the FY doctors were very junior and individual learners would not have oversight of the whole FY programme and its national variations.45 Another important consideration arising from this work was that it is important not only to consider the final output of the AHP but also to analyse the individual criterion results meticulously, particularly where there is a large discrepancy between the scores for the criteria. In this situation, the individual scores for each criterion and the potential reasons for the discrepancy needed further consideration, and the free text comments provided helpful additional data in the decision-making process.

Implications for educational practice

Simulation remains a useful but resource-intensive educational tool, and it is vital to share knowledge of how to optimise its use. Simulation is best embedded within a curriculum, and the increasing drive by national governing bodies to include simulation in healthcare curricula means that sharing experience of how to do this effectively is important. Examples of the practical steps taken may be helpful to others trying to develop integrated curricula, especially at a national level.

As well as the clear curricular benefits of a structured prioritisation, this process has brought about additional gains. The approach highlighted areas of the curriculum that needed further attention through methods other than simulation. This informed the foundation directors in planning other educational interventions. Crucially, the involvement in the prioritisation process of doctors nationally, at both learner and educator levels, meant that roll-out of the programme was relatively simple. Additionally, the robust process meant that the programme was viewed as legitimate and enhanced buy-in by clinicians. Senior clinicians not only supported the programme delivery through agreeing to become teachers, but also by enabling junior doctors to attend. This has ultimately supported long-term programme sustainability.

Finally, the thoroughness of the method was critical in demonstrating the value of the simulation programme and created further unexpected outcomes. In addition to enhancing support from clinicians, the broader programme has been better accepted and supported by organisations, and this support has led to further programme funding. The resulting UKFP Simulation Programme has now been running across Scotland for 3 years; it is funded and built into the training of all FY doctors in Scotland. Future work on integrated national simulation programme development should not only include a robust process for competency prioritisation but also report on this process to enable further comparison between methods and programme outcomes.

In conclusion, this initial work in developing a robust, yet practical prioritisation method was critical in developing a targeted, relevant and integrated national simulation curriculum.

Data availability statement

Data may be obtained from a third party and are not publicly available.

Ethics statements

Patient consent for publication

Ethics approval

Ethical approval for this study was gained through the University of Dundee ethics committee (UREC ref. 15125). Participants gave informed consent to participate in the study before taking part.

References

Footnotes

  • Contributors The two authors were equally involved in the writing of this manuscript. NMH carried out the original research under the supervision of AD. NMH is the guarantor for the content of this manuscript.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Patient and public involvement Patients and/or the public were not involved in the design, or conduct, or reporting, or dissemination plans of this research.

  • Provenance and peer review Not commissioned; externally peer reviewed.