
Simulation-based procedure training (SBPT) in rarely performed procedures: a blueprint for theory-informed design considerations

Abstract

Simulation-based procedure training is now integrated within health professions education, with literature demonstrating improved performance and translational patient-level outcomes. Published work has centred on describing such procedural training and on creating realistic part-task models; little attention has been paid to design considerations, specifically how simulation and educational theory should directly inform programme creation. Using a case-based approach in cardiology as an example, we present a blueprint for theory-informed simulation-based procedure training, linking learning needs analysis and the definition of suitable objectives to matched fidelity. We stress the importance of understanding how to implement and utilise task competence benchmarking in practice, and the role of feedback and debriefing in cycles of repeated practice. We conclude with evaluation and argue why this should be considered part of the initial design process rather than an afterthought following education delivery.

Introduction

Simulation-based education as a modality is used in various guises: team-based training; human factors specific education; familiarisation of new environments, protocols and procedures; and most recently, to combat rare events such as the coronavirus pandemic [1,2,3]. Evidence for effectiveness, measured by improvements to patient safety and cost-effectiveness in health systems continues to emerge [4]. One area which has an expanding research base is the use of simulation in the training of specific procedures [5, 6].

Evidence to practice gap: implementation challenges for educators

Simulation-based procedure training (SBPT) is now firmly integrated into health professions curricula [6,7,8], with evidence suggesting it can be used either alongside or in place of traditional clinical experience for both low-stakes routine procedures [9, 10] and high-stakes emergency procedures [11,12,13,14]. There is heterogeneity in reported outcomes of SBPT, with some studies reporting improvement in high-level translational outcomes [15,16,17,18], such as reduced intensive care costs and infection rates, and others focusing on lower-level outcomes such as time taken to perform a procedure [19, 20]. In such research, publication centres on data analysis and often lacks sufficient detail regarding educational design, theoretical considerations and implementation. This is problematic for the healthcare community, which is then unable to learn from the work undertaken in order to replicate and adapt design principles for other procedural-based simulations [21] in its own context, particularly for procedures which are rare and infrequently described in the simulation literature.

A gap thus exists for educators around how high-stakes and rare procedural training should be optimally designed and conducted. The design of SBPT must reflect local contextual factors such as learners, faculty, cost and resources, which can impact design, immediate measurable outcomes, skill decay and patient safety. In this paper, we provide a blueprint for SBPT intended to function as a guide for educators wishing to design and implement procedural training. Using pericardiocentesis as a 'high-stakes rare-procedure' case example, we stress the importance of focusing on the underlying theoretical rationale for design decisions to optimise participant learning.

Current approaches to procedural training and curriculum design

Traditional approaches to procedural training follow a 'see-one, do-one, teach-one' methodology which assumes that competence immediately follows observation [22], failing to recognise the risk to patient safety: few people are competent to perform a procedure independently after a single observation. More contemporary structured approaches (Fig. 1) include Peyton's four-step approach, which asks educators to deconstruct the activity and scaffold learning, and frameworks such as Miller's pyramid, which guide the educator in thinking about the level of performance learners should attain [23, 24]. Theory-informed design in SBPT goes beyond these frameworks and is multi-faceted, encompassing discrete, sequential items for educators to consider when trying to maximise learning yield.

Fig. 1

Contemporary approaches to procedural training. We have highlighted two traditional theoretical approaches to delivering procedural training. Both of these models focus on deconstruction of the task either via small steps and subordinate tasks (Peyton’s four-step approach) [23] or by distinguishing cognitive knowledge and behavioural knowledge (Miller’s Pyramid) [24]

Several authors have published curriculum development tools with explicitly defined steps; Kern et al. [25] and Sawyer et al. [26] both provide comprehensive approaches (Table 1). Other sources focus more on the pure delivery of teaching psychomotor skills, highlighting how sessions can be stratified based on the pre-procedural skill of the learner and the desired learning outcomes [27].

Table 1 Established curriculum development frameworks. The proposed SBPT Blueprint incorporates parts of both frameworks alongside additional design considerations centred around education theory

Our approach to curriculum design for SBPT is particularly suited to rarely performed procedures; it draws on concepts from a variety of these approaches and includes additional elements such as fidelity, skills decay, and debriefing and feedback considerations. This curriculum design blueprint, presented in Fig. 2, outlines the sequential theory-informed design elements which educators should consider when designing SBPT in their own context.

Fig. 2

SBPT Blueprint. This is an example of how the SBPT blueprint functions in practice. Each element follows from the previous element and considers relevant contextual factors. The content of each element directly influences the overall design and should be expanded. For example, under mastery learning, a pre-requisite would be for all participants to be familiar with the equipment; where applicable, time for this must be integrated into the programme or provided in an alternative fashion (e.g. online education). Factors such as resources and evaluation, which are often ill-considered, can be discussed at the outset by mapping out a comprehensive blueprint

SBPT Blueprint

Stage 1: Design initiation

This is perhaps the most critical stage in the design process, where contextual factors are identified and specified, such as target participants, available resources and the key stakeholders to be involved. It is important that technical and logistical issues are addressed in parallel with SBPT content design, as successful implementation often depends upon both. We refer the reader to the guidelines by Khan et al. for an outline of these elements in more detail [28].

Learning needs analysis

A learning needs analysis (LNA) should be carried out beforehand to determine which areas of the curriculum would benefit from simulation training. The LNA is not limited to individual skills but can encompass communication, teamworking and other non-technical skills [29,30,31,32,33,34]. Many different tools can be used for the LNA, with the exact tool chosen to suit the scope of the analysis and the context of the teaching (Table 2). Care needs to be taken to include all relevant stakeholders, and enough time needs to be allocated for the assessment to be completed [35, 36]. It is important to note that an LNA can be generic, such as identifying pericardiocentesis amongst many different procedures, or more specific, identifying elements of a procedure which require particular attention, such as calculating blood flow velocity or obtaining particular imaging views when undertaking echocardiography; different approaches to LNA provide different information.

Table 2 Mechanisms to undertake a learning needs analysis

Case study: undertaking a learning needs assessment using a participant survey.

A recent electronic survey of European cardiology trainees found high levels of reported self-confidence in performing temporary pacing wire and central venous cannula insertion compared with pericardiocentesis or transoesophageal echocardiography [37]. This result was driven by the rarity of certain procedures and variable exposure to acute cardiac pathology, coupled with scarce simulation-based training [37], mirroring other specialities. Interestingly, pericardiocentesis was ranked lowest by trainees in terms of comfort, yet it is also the procedure in which trainees report having the least simulation training [37]. SBPT is being used within cardiology, although these endeavours lack information around theory-informed design [38], negatively impacting empirical research in cardiology-specific SBPT [6]. The curriculum gap and lack of theory-informed design make pericardiocentesis a good choice for a worked case example in SBPT (Box 1).
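In practical terms, survey data of this kind can be aggregated to rank candidate procedures for training. The sketch below is a minimal illustration of this step; the procedures, ratings and five-point confidence scale are hypothetical placeholders, not data from the survey cited above.

```python
# Minimal sketch: ranking procedures for a learning needs analysis from
# self-reported confidence ratings (1 = not at all confident, 5 = fully
# confident). All data below are invented for illustration.
from statistics import mean

survey_responses = {
    "central venous cannulation": [4, 5, 4, 3, 4],
    "temporary pacing wire insertion": [4, 3, 4, 4, 3],
    "transoesophageal echocardiography": [2, 3, 2, 2, 3],
    "pericardiocentesis": [1, 2, 2, 1, 2],
}

# Lowest mean confidence first: the strongest candidates for targeted SBPT.
for procedure, ratings in sorted(survey_responses.items(), key=lambda kv: mean(kv[1])):
    print(f"{procedure}: mean confidence {mean(ratings):.1f}")
```

A more specific LNA would apply the same aggregation at the level of individual procedural steps rather than whole procedures.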

Stage 2: Theory-informed design

Mastery learning as an overarching pedagogical framework

The initial steps within the SBPT Blueprint design process are to determine whether the SBPT is constructively aligned to curriculum objectives, to define learning outcomes based upon the LNA, and to decide which overarching pedagogical framework will allow these to be delivered [39]. Appropriate curriculum integration enhances learning compared with ad hoc simulation and is facilitated by ensuring that the chosen learning objectives are Specific, Measurable, Achievable, Realistic and Time-related [40, 41]. For a rarely performed lifesaving intervention like pericardiocentesis, each specialty registrar is required to reach a universally high level of competence in the task, which is achievable through a mastery learning (ML) approach [5, 42,43,44].

It is important to note that we are not exclusively suggesting an ML approach for all procedures; an appropriate pedagogical framework should be linked to the underlying objectives. There is, however, emerging evidence that clinical outcomes do not differ significantly between SBPT delivered through traditional instruction and through demonstration followed by unsupervised independent practice, suggesting that an ML approach, which keeps learning in line with explicit standards, would suit most procedure-based training [45]. Instructional design and deliberate practice (DP) are core components of ML that aid the practical implementation of the framework [46, 47]; both centre on upskilling individuals to a desired standard with task-specific feedback, facilitated by use of a checklist. Table 3 outlines these theories and explains how they inform SBPT design for our case study. These theories do not operate as 'stand-alone' but should be seen as synergistic explanatory frameworks for understanding how participants learn, enabling educators to design interventions that optimise the desired outcomes.
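To make the ML logic concrete, the sketch below shows the basic control flow of a mastery cycle: practice and task-specific feedback repeat until the learner meets a pre-set minimum passing standard. The passing score, attempt limit and function names are illustrative assumptions, not prescriptions from the ML literature.

```python
# Minimal sketch of a mastery learning cycle with deliberate practice.
# The standard and attempt cap below are invented planning assumptions.
MINIMUM_PASSING_SCORE = 0.9  # e.g. 90% of checklist items, agreed a priori
MAX_ATTEMPTS_PER_SESSION = 5

def run_mastery_session(participant, assess, give_feedback):
    """Repeat practice until the participant meets the passing standard.

    assess(participant) returns the fraction of checklist items performed
    correctly; give_feedback(participant, score) delivers task-specific
    feedback targeted at the items that were missed.
    """
    score = 0.0
    for attempt in range(1, MAX_ATTEMPTS_PER_SESSION + 1):
        score = assess(participant)
        if score >= MINIMUM_PASSING_SCORE:
            return attempt, score           # standard met: learner progresses
        give_feedback(participant, score)   # deliberate practice on the gaps
    return None, score                      # standard not met: plan further training
```

The defining feature is that time, not the standard, is the variable: every learner reaches the same benchmark, with the number of cycles differing between individuals.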

Generating a checklist

Implementing the instructional design model requires a reference teaching and assessment checklist which outlines the core technical elements. To save time at this stage, it is important to identify whether suitable validated teaching checklists already exist [12]. Even a pre-existing checklist that has not been validated can form the basis for a new one, preventing duplication of effort.

If no checklist exists, a functional task alignment analysis [48] can identify the technical steps required to complete a skill and give an insight into what tasks and processes the simulator will be required to perform. Checklist items can be further informed by curriculum documents, published procedural guides [48] and expert panel consensus statements [48].

Ideally, checklist items need to be independently validated and passing standards agreed in line with the intended learning outcomes. The Delphi methodology uses iterative rounds of checklist review by experts to mould the teaching tool and is context dependent [30,31,32]. It may require input from a range of stakeholders (expert and non-expert) and is time-consuming, with fatigue occurring in later rounds [29, 36]. Designers must judge which portions of the checklist are essential for task completion, and traditional standard-setting approaches such as the Angoff method may be useful here [49]. The process of checklist generation and functional task alignment, dependent upon local context and learners, is critical as it determines what degree of realism is required to achieve the checklist items. We have provided an example checklist for pericardiocentesis as Supplementary Table 1.
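As a concrete illustration of standard setting, the sketch below computes a modified Angoff cut score: each judge estimates, for every checklist item, the probability that a minimally competent trainee would perform it correctly, and the cut score is the mean across items and judges. The panel size and ratings are invented for illustration.

```python
# Hedged sketch of a modified Angoff calculation. Ratings are hypothetical:
# one row per judge, one probability per checklist item, each the judge's
# estimate that a borderline (minimally competent) trainee passes the item.
judge_ratings = [
    [0.90, 0.80, 0.70, 0.95, 0.60],
    [0.85, 0.90, 0.60, 0.90, 0.70],
    [0.95, 0.85, 0.65, 0.90, 0.65],
]

n_items = len(judge_ratings[0])
judge_means = [sum(row) / n_items for row in judge_ratings]  # per-judge expected score
cut_score = sum(judge_means) / len(judge_means)              # panel average
print(f"Angoff cut score: {cut_score:.0%} of checklist items")
```

Items judged essential for safety can additionally be flagged as mandatory regardless of the overall cut score, in line with the patient safety approach described by Yudkowsky et al. [49].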

Fidelity and learning

Traditionally, fidelity was viewed as a unidimensional concept, with increasingly complex technology taken to be synonymous with a closer representation of reality. Subsequent theories have split fidelity into four main components: environmental, mechanical (or engineering), psychological and sociological [50,51,52]; other domains also exist. As an example, a difficult communication scenario would require high sociological fidelity (e.g. with actors or standardised patients), whereas for suturing, banana skin may suffice for novice learners and animal tissue for advanced learners, representing different degrees of tactile feedback (mechanical fidelity) adjusted according to the learner. The key, then, is selecting the most appropriate design features to match the intended learning objectives, which are themselves tailored to the learners and the context.

Choosing the level of fidelity is important as it can impact both the cost of delivering the teaching and learner engagement. A systematic review using clinical performance as an outcome demonstrated that higher-fidelity simulation is associated with a small but non-significant gain in performance compared with lower-fidelity simulation, a gain offset by increased cost [52]. The lack of difference is thought to be multifactorial: context may not be as important as assumed; high psychological fidelity may be generated even with low mechanical fidelity design; and lower-fidelity simulation may reduce cognitive load (Table 3). Paradoxically, high-fidelity simulation may even be detrimental when teaching novices, as it may promote unsafe over-confidence [53, 54]. Despite this, higher-fidelity simulation may be more appropriate for advanced learners to provide an adequate level of psychological fidelity. Whilst there are many commercially available simulators, high- and low-fidelity simulators, particularly for procedural skills, can be created at low cost from readily accessible materials and techniques such as gelatine casting and 3D printing [11, 13, 19, 55, 56].

Determining the simulation design, which may consider multiple fidelity domains, is further influenced by cognitive load. In our case study, participants are novices at performing pericardiocentesis and, according to cognitive load theory, would benefit from a simulation design with high intrinsic and germane load and reduced extraneous load so that they are not overwhelmed during the task (Table 3) [57]. Approaches such as virtual, augmented and mixed reality may provide realistic training experiences, especially when combined with tactile feedback in procedural skills. However, they are associated with a significant increase in cognitive load and may be best reserved for advanced trainees who require more complex simulator feedback for engagement [58, 59]. These design choices, and the trade-off between fidelity and cognitive load, are rooted in the initial needs analysis and functional task alignment [62]; attempting to reconstruct the entirety of clinical reality for all learners is inappropriate. Rather, we strive for constructive alignment between learning needs, learning objectives and fidelity matched to benchmarked standards in the form of a checklist.

Table 3 The key educational frameworks which influence simulation-based procedural training and their relationship to our simulation design

Skills decay: contextualising programme delivery

An area of contention when designing simulation courses is when to repeat the intervention. There is no consensus on the durability of acquired skills or the retention interval, and studies looking at skills decay have conflicting results. Some report skills being retained for 14 months, while others find evidence of decreased performance as little as 6 months after simulation training [12, 63,64,65]. When simulation re-test has been stratified into domains (affective, psychomotor or cognitive), no decrease in any specific domain has been found [66]. These studies are complicated by small sample sizes, heterogeneous designs and various confounding factors. A recent scoping review by Donoghue et al. confirmed that these issues are consistent across the literature, making aggregation of results difficult [67]. The authors did conclude, however, that studies informed by theory, specifically DP and ML, improved educational outcomes with less skill decay compared with other education delivery methods. This underpins the advocacy for DP and ML in the 2020 American Heart Association guidelines for resuscitation training [68].

Potential methods to augment skill retention include distributing the teaching over more sessions than originally planned [69] and giving students access to simulators for dedicated unsupervised training time after the first supervised session [70]. The consistent messages from various studies are perhaps unsurprising: there is a signal towards increased proficiency and skill retention with increasing seniority of the learners, and students who have repeated training sessions show increased proficiency in task performance [9, 20, 71].

In summary, skills decay is an evolving area within SBPT that requires a greater evidence base to provide firm guidance for curriculum design. It is a complex topic dependent on multiple factors, including those related to the task (its complexity and the frequency with which the procedure will be performed in practice), the learner (novice vs. advanced) and the healthcare setting (available resources and curriculum integration within the healthcare system). Until more research emerges on the optimal retention interval, this design consideration should be determined at the curriculum development and learning needs analysis stages.
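Given the absence of an agreed retention interval, one pragmatic planning device is to assume a simple decay model and schedule refresher training before competence is projected to fall below the passing standard. The sketch below is purely illustrative: the exponential form, half-life and threshold are invented planning assumptions, not empirical findings from the studies cited above.

```python
# Illustrative refresher-interval planning under an assumed exponential
# decay of competence. All constants are made-up planning assumptions.
import math

INITIAL_COMPETENCE = 0.95  # checklist score immediately after training
HALF_LIFE_MONTHS = 24.0    # assumed: competence halves in 2 years without practice
RETRAIN_THRESHOLD = 0.80   # retrain before projected competence falls below this

def projected_competence(months: float) -> float:
    return INITIAL_COMPETENCE * math.exp(-math.log(2) * months / HALF_LIFE_MONTHS)

# Latest whole month at which projected competence still meets the threshold.
month = 0
while projected_competence(month + 1) >= RETRAIN_THRESHOLD:
    month += 1
print(f"Schedule refresher training within {month} months")  # prints 5 here
```

In practice, parameters like these would be revisited as local re-test data accumulate, shortening or lengthening the interval accordingly.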

Appropriate feedback and debriefing

Feedback and debriefing are crucial steps for skill acquisition. They traditionally occur at the end of a simulation exercise but can also occur during it. Feedback is information about the comparison between observed performance and the desired outcome; debriefing is an interactive discussion that facilitates reflection on the performance [72, 73].

There are various feedback and debriefing strategies available. In the context of ML, there is evidence that micro-debriefing improves performance by facilitating DP and attainment of the minimum passing standard [63, 74]. There was initial concern that continued feedback during performance could lead to cognitive overload [75], but repeated feedback has been shown to improve the efficiency of SBPT as assessed by procedural outcomes [76]. Micro-debriefs can be employed either whilst the simulation is running (in-action, e.g. 'which organs have you identified to avoid puncturing with the needle?'), providing direct feedback on tasks being performed, or following a brief pause in the scenario (on-action, e.g. 'the angle of needle entry was too deep'), providing feedback on tasks just performed [73].

Micro-debriefing using reflection on-action is a core component of Rapid Cycle Deliberate Practice (RCDP) [74, 77] and, alongside reflection in-action, can be a useful strategy for teaching novice learners who lack a frame of reference. RCDP is a relatively novel approach that involves immediate feedback on actions in a coaching style, increasing the amount of time spent on DP. For more advanced trainees, an approach using reflective pauses may be appropriate as it allows an exploration of the learners' frame of reference, increasing their engagement and allowing them to bring their own experience into the teaching session [78]. This facilitates discussion rather than unidirectional feedback and promotes double-loop learning (providing the underlying rationale for an action), which is more effective than single-loop learning (simply correcting an action) [79, 80].

Regardless of which feedback and debriefing strategy is chosen, we need to create a learning environment in which learners can take risks, are open to feedback and engage in debriefing [81]. Psychological safety and mutual respect can be generated by highlighting the importance of specific feedback, stating which feedback strategy will be used during the SBPT and making it clear that perfection is not expected from the start [82]. There is evidence of benefit from peer-based dyadic teaching in SBPT; this is a cost-effective strategy associated with a significant increase in the efficiency of SBPT [83, 84]. If this is the chosen strategy, learners should be made aware of the value of peer evaluation at the beginning of the SBPT [10]. These strategies complement established frameworks for fostering psychological safety in simulation, for example those outlined by Rudolph et al. [81] and Kolbe et al. [85].

Overall, it is important to think about the type, source and timing of feedback when designing the SBPT in order to plan when and how feedback and debriefing will occur [60]. Traditional end-of-task feedback and debriefing still have their uses, such as generating post-task discussion in scenarios with a specific ethical dilemma or with multiple potential outcomes.

Online learning

There is growing interest in the use of online videos to aid clinical skills teaching, supported by high-quality evidence [86]. Online videos can play a role in multiple parts of SBPT: allowing students to review a recording of the procedure being performed prior to attending training, creating more time for DP; providing a source for feedback and discussion; and allowing the student to review the steps of a procedure once the SBPT session is complete. Despite the advantages of online learning, there are potential issues such as variation in trainees' ability to access the content, patient confidentiality considerations if real patients are involved, and variation in the quality of online learning videos. Use of video will thus depend upon local resources and on recognising how and why recorded material is to be used; as with other design components, it should not be assumed that simply inserting an additional pedagogical modality results in increased learning.

Stage 3: Evaluation considerations

When evaluating individual SBPT, it can be difficult to demonstrate that ML in simulation leads to high-level translational outcomes, such as improved patient outcomes (T3) or collateral system-level effects (T4), if the focus of the intervention is a rarely performed clinical procedure. An alternative may be to demonstrate a reduction in healthcare costs, which would be a strong argument in a taxpayer-funded healthcare system such as the National Health Service (NHS), although understandably this is seldom the main goal of an ML programme.

The need for simulation courses for rare procedures is driven by the infrequency with which these procedures are encountered in clinical practice. Unfortunately, this limits the amount of data that can be collected to demonstrate higher-level translational outcomes unless follow-up is extensive. Because of these restrictions, the goal of many ML programmes for rare procedures will be to produce improvements in competence within the simulation setting on re-testing (T1). This is not a failure of evaluation yielding weak evidence but a pragmatic approach to evaluating a rare procedure, and we encourage similarly rational approaches to evaluation. In comparison, for procedures frequently undertaken in clinical practice, such as lumbar puncture, intubation and catheter insertion, evaluation may look at improvements in specific domains, including patient-level outcomes and hospital-wide benefits such as reduced costs.
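For a T1 evaluation of this kind, the analysis can be as simple as a paired comparison of checklist scores at baseline and at re-test. The sketch below assumes hypothetical scores and the availability of SciPy; it is one reasonable analysis, not a prescribed method.

```python
# Minimal sketch of a T1 evaluation: paired pre/post comparison of
# checklist scores (fraction of items correct). Scores are invented.
from scipy import stats

pre_scores = [0.45, 0.50, 0.40, 0.55, 0.60, 0.35, 0.50, 0.45]
post_scores = [0.90, 0.95, 0.85, 1.00, 0.90, 0.80, 0.95, 0.90]

improvements = [post - pre for pre, post in zip(pre_scores, post_scores)]

# Wilcoxon signed-rank test: suits small paired samples without assuming normality.
statistic, p_value = stats.wilcoxon(pre_scores, post_scores)
print(f"Mean improvement: {sum(improvements) / len(improvements):.2f}")
print(f"Wilcoxon signed-rank p-value: {p_value:.3f}")
```

In an ML design, the more informative summary is often the proportion of learners reaching the minimum passing standard at re-test, since that is the benchmark the programme commits to.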

Whilst data collected at evaluation will be constructively aligned to learning outcomes and can be considered robust, evaluation of SBPT should not be limited to positivist metrics and can include qualitative approaches which may provide insights regarding impact on clinical practice and patient safety. The key is to match available resources with intended aims of evaluation, and here, toolkits such as the King’s College London Evaluation Practice Toolkit [87] provide useful direction.

Despite evidence showing that ML leads to improved patient care and clinical outcomes, the quality of reporting from studies implementing mastery-based simulation programmes is not uniform [5, 88]. To address this, Cohen et al. outlined the 38-item Reporting Mastery Education Research in Medicine (ReMERM) guideline to provide educators, authors and journal editors with a gold-standard framework for reporting mastery interventions [89]. Reporting findings against this framework permits detailed comparison and aggregation of studies in systematic reviews and, because it calls for a detailed description of the simulation intervention in the methods section, enables other authors to replicate design aspects of the intervention.

Conclusions: sharing our journeys

We have provided a comprehensive blueprint for simulation designers engaging in SBPT, focusing on theoretical issues whilst attending to contextual and practical influences (Fig. 2). Our core steps for curriculum design relevant to SBPT encompass three main phases. The design initiation phase consists of defining the problem and understanding local contextual factors, including determining available resources and identifying stakeholders. This, coupled with a detailed LNA, will determine the course objectives, specific to the learners. Logistical elements need to be considered from the beginning; they will often span the entire design process and be influenced by decisions around checklist formation, fidelity and training interval.

In the second phase, an appropriate pedagogical framework should form the scaffold for design. The learners and learning objectives will define the expected standards and the development of a checklist in a mastery learning approach. The key next step is to determine how the required knowledge can be achieved; specifically, what simulation setup, or fidelity, is required to simultaneously meet the standards, engage the learners and avoid cognitive overload. The next step, dependent upon factors such as the number of faculty and the group size, is how cycles of practice can be delivered with corresponding debriefing and feedback. Consideration then needs to be given to training intervals to determine how frequently learners will repeat the training over the course of a year or training programme.

Alongside designing the intervention, realistic goals for evaluation need to be set. This again will be contextual and based upon resources, time and what type of outcome measures are feasible. Approaches such as focus groups, interviews and surveys may be a cost-effective option, especially if the aim of evaluation is to refine the course in the early stages before attempting to measure patient-level outcomes.

It is through rigorous adherence to design principles that we, as simulation educators, do justice to our learners and ultimately to patients. Too often, the simulation community reports success in terms of varying translational outcomes or publishes descriptive pieces outlining novel simulation interventions. We call for detailed, theory-informed SBPT design to be made available so that others can replicate, adapt, contextualise and share in the success.

Availability of data and materials

The following additional material is available:

Supplementary file one: Pericardiocentesis example checklist.

Abbreviations

SBPT: Simulation-based procedure training

LNA: Learning needs analysis

ML: Mastery learning

DP: Deliberate practice

References

  1. Brazil V, Lowe B, Ryan L, Bourke R, Scott C, Myers S, et al. Translational simulation for rapid transformation of health services, using the example of the COVID-19 pandemic preparation. Adv Simul. 2020;5(1):9. https://doi.org/10.1186/s41077-020-00127-z.

  2. Dieckmann P, Torgeirsen K, Qvindesland SA, Thomas L, Bushell V, Langli Ersdal H. The use of simulation to prepare and improve responses to infectious disease outbreaks like COVID-19: practical tips and resources from Norway, Denmark, and the UK. Adv Simul. 2020;5(1):3. https://doi.org/10.1186/s41077-020-00121-5.

  3. Brown A, Schofield L, Walker J, et al. 0166 'Ebola sim' – an in-situ simulation to test standard operating procedures (SOPs) for a high risk patient pathway. BMJ Simul Technol Enhanced Learn. 2015;1(Suppl 2):A61. https://doi.org/10.1136/bmjstel-2015-000075.150.

  4. Fung L, Boet S, Bould MD, et al. Impact of crisis resource management simulation-based training for interprofessional and interdisciplinary teams: a systematic review. J Interprof Care. 2015;29(5):433–44. https://doi.org/10.3109/13561820.2015.1017555.

  5. McGaghie WC, Issenberg SB, Barsuk JH, et al. A critical review of simulation-based mastery learning with translational outcomes. Med Educ. 2014;48(4):375–85. https://doi.org/10.1111/medu.12391.

  6. Purva M, Fent G, Prakash A. Enhancing UK core medical training through simulation-based education (SBE): an evidence-based approach: a report from the joint JRCPTB/HEE Expert Group on Simulation in Core Medical Training. Armstrong M, editor. Health Education England; 2016. https://www.jrcptb.org.uk/news/enhancing-uk-core-medical-training-through-simulation-based-education-sbe-evidence-based.

  7. Lee R, Raison N, Lau WY, Aydin A, Dasgupta P, Ahmed K, et al. A systematic review of simulation-based training tools for technical and non-technical skills in ophthalmology. Eye. 2020;34(10):1737–59. https://doi.org/10.1038/s41433-020-0832-1.

  8. Rewers M, Østergaard D. The evolution of a national, advanced airway management simulation-based course for anaesthesia trainees. Eur J Anaesthesiol. 2021;38(2):138–45. https://doi.org/10.1097/eja.0000000000001268.

  9. Kattan E, De la Fuente R, Putz F, et al. Simulation-based mastery learning of bronchoscopy-guided percutaneous dilatational tracheostomy: competency acquisition and skills transfer to a cadaveric model. Simul Healthc. Published ahead of print. https://doi.org/10.1097/sih.0000000000000491.

  10. Cason ML, Gilbert GE, Schmoll HH, et al. Cooperative learning using simulation to achieve mastery of nasogastric tube insertion. J Nurs Educ. 2015;54(3 Suppl):S47–51. https://doi.org/10.3928/01484834-20150218-09.

  11. Ghazali A, Breque C, Léger A, Scépi M, Oriot D. Testing of a complete training model for chest tube insertion in traumatic pneumothorax. Simul Healthc. 2015;10(4):239–44. https://doi.org/10.1097/sih.0000000000000071.

  12. Boet S, Borges BCR, Naik VN, et al. Complex procedural skills are retained for a minimum of 1 yr after a single high-fidelity simulation training session. Br J Anaesth. 2011;107(4):533–9. https://doi.org/10.1093/bja/aer160.

  13. Ruest AS, Getto LP, Fredette JM, Cherico A, Papas MA, Nomura JT. A novel task trainer for penile corpus cavernosa aspiration. Simul Healthc. 2017;12(6):407–13. https://doi.org/10.1097/sih.0000000000000262.

  14. Strøm M, Rasmussen JL, Nayahangan LJ, de la Motte L, Vogt K, Konge L, et al. Learn EVAR sizing from scratch: the results of a one-day intensive course in EVAR sizing and stent graft selection for vascular trainees. Vascular. 2020;28(4):342–7. https://doi.org/10.1177/1708538120913719.

  15. Barsuk JH, McGaghie WC, Cohen ER, et al. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37(10):2697–701.

  16. Barsuk JH, Cohen ER, Vozenilek JA, O'Connor LM, McGaghie WC, Wayne DB. Simulation-based education with mastery learning improves paracentesis skills. J Grad Med Educ. 2012;4(1):23–7. https://doi.org/10.4300/JGME-D-11-00161.1.

  17. Cohen ER, Feinglass J, Barsuk JH, et al. Cost savings from reduced catheter-related bloodstream infection after simulation-based education for residents in a medical intensive care unit. Simul Healthc. 2010;5(2):98–102. https://doi.org/10.1097/SIH.0b013e3181bc8304.

  18. Duncan DR, Morgenthaler TI, Ryu JH, Daniels CE. Reducing iatrogenic risk in thoracentesis: establishing best practice via experiential training in a zero-risk environment. Chest. 2009;135(5):1315–20. https://doi.org/10.1378/chest.08-1227.

  19. Hauglum SD, Crenshaw NA, Gattamorta KA, Mitzova-Vladinov G. Evaluation of a low-cost, high-fidelity animal model to train graduate advanced practice nursing students in the performance of ultrasound-guided central line catheter insertion. Simul Healthc. 2018;13(5):341–7. https://doi.org/10.1097/sih.0000000000000337.

  20. Zaika O, Boulton M, Eagleson R, de Ribaupierre S. Simulation reduces navigational errors in cerebral angiography training. Adv Simul. 2020;5(1):10. https://doi.org/10.1186/s41077-020-00125-1.

  21. Lawaetz J, Skovbo Kristensen JS, Nayahangan LJ, et al. Simulation based training and assessment in open vascular surgery: a systematic review. Eur J Vasc Endovasc Surg. 2021;61(3):502–9. https://doi.org/10.1016/j.ejvs.2020.11.003.

  22. Kotsis SV, Chung KC. Application of the "see one, do one, teach one" concept in surgical training. Plast Reconstr Surg. 2013;131(5):1194–201. https://doi.org/10.1097/PRS.0b013e318287a0b3.

  23. Peyton JWR. Teaching in the theatre. In: Teaching and learning in medical practice. Rickmansworth: Manticore Publishers Europe Ltd; 1998. p. 171–80.

  24. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 Suppl):S63–7. https://doi.org/10.1097/00001888-199009000-00045.

  25. Kern DE, Thomas PA, Hughes MT. Curriculum development for medical education: a six-step approach. JHU Press; 2009.

  26. Sawyer T, White M, Zaveri P, et al. Learn, see, practice, prove, do, maintain: an evidence-based pedagogical framework for procedural skill training in medicine. Acad Med. 2015;90(8):1025–33. https://doi.org/10.1097/acm.0000000000000734.

  27. Andreatta P, Dougherty P. Supporting the development of psychomotor skills. In: Nestel D, Dalrymple K, Paige JT, et al., editors. Advancing surgical education: theory, evidence and practice. Springer; 2019. p. 183–96. https://doi.org/10.1007/978-981-13-3128-2_17.

  28. Khan K, et al. Simulation in healthcare education. Building a simulation programme: a practical guide. Dundee: Association for Medical Education in Europe (AMEE); 2011.

  29. de Villiers MR, de Villiers PJ, Kent AP. The Delphi technique in health sciences education research. Med Teach. 2005;27(7):639–43. https://doi.org/10.1080/13611260500069947.

  30. Løkkegaard T, Todsen T, Nayahangan LJ, et al. Point-of-care ultrasound for general practitioners: a systematic needs assessment. Scand J Prim Health Care. 2020;38(1):3–11. https://doi.org/10.1080/02813432.2020.1711572.

  31. Thim S, Nayahangan LJ, Paltved C, Jensen RD, Konge L, Hertel NT, et al. Identifying and prioritising technical procedures for simulation-based curriculum in paediatrics: a Delphi-based general needs assessment. BMJ Paediatr Open. 2020;4(1):e000697. https://doi.org/10.1136/bmjpo-2020-000697.

  32. Bessmann EL, Østergaard HT, Nielsen BU, et al. Consensus on technical procedures for simulation-based training in anaesthesiology: a Delphi-based general needs assessment. Acta Anaesthesiol Scand. 2019;63(6):720–9. https://doi.org/10.1111/aas.13344.

  33. Grant J. Learning needs assessment: assessing the need. BMJ. 2002;324(7330):156–9. https://doi.org/10.1136/bmj.324.7330.156.

  34. Zern SC, Marshall WJ, Shewokis PA, Vest MT. Use of simulation as a needs assessment to develop a focused team leader training curriculum for resuscitation teams. Adv Simul. 2020;5(1):6. https://doi.org/10.1186/s41077-020-00124-2.

  35. Gustavsen PH, Nielsen DG, Paltved C, Konge L, Nayahangan LJ. A national needs assessment study to determine procedures for simulation-based training in cardiology in Denmark. Scand Cardiovasc J. 2019;53(1):35–41. https://doi.org/10.1080/14017431.2019.1569716.

  36. Keeney S, Hasson F, McKenna HP. A critical review of the Delphi technique as a research methodology for nursing. Int J Nurs Stud. 2001;38(2):195–200. https://doi.org/10.1016/s0020-7489(00)00044-4.

  37. Czerwińska-Jelonkiewicz K, Montero S, Bañeras J. The voice of young cardiologists. Eur Heart J. 2020;41(29):2723–5. https://doi.org/10.1093/eurheartj/ehaa432.

  38. Spurr L, Barron A, Butcher C, et al. P14 Part-task training with low-fidelity simulation is an effective method of pericardiocentesis training. BMJ Simul Technol Enhanced Learn. 2017;3(Suppl 2):A50. https://doi.org/10.1136/bmjstel-2017-aspihconf.102.

  39. Cowan J. Review of: Biggs J. Teaching for quality learning at university: what the student does (1999). High Educ. 2000;40(3):374–6. https://doi.org/10.1023/A:1004049006757.

  40. Issenberg SB, McGaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10–28. https://doi.org/10.1080/01421590500046924.

  41. Locke EA, Latham GP. Building a practically useful theory of goal setting and task motivation: a 35-year odyssey. Am Psychol. 2002;57(9):705–17. https://doi.org/10.1037//0003-066x.57.9.705.

  42. McGaghie WC, Issenberg SB, Petrusa ER, et al. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44(1):50–63. https://doi.org/10.1111/j.1365-2923.2009.03547.x.

  43. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in healthcare education: a best evidence practical guide. AMEE Guide No. 82. Med Teach. 2013;35(10):e1511–30. https://doi.org/10.3109/0142159X.2013.818632.

  44. Bloom BS. Learning for mastery. Evaluation Comment (Center for the Study of Evaluation of Instructional Programs). 1968;1(2):1–12.

  45. Bube S, Dagnaes-Hansen J, Mahmood O, et al. Simulation-based training for flexible cystoscopy – a randomized trial comparing two approaches. Heliyon. 2020;6(1):e03086. https://doi.org/10.1016/j.heliyon.2019.e03086.

  46. Gagné RM. Military training and principles of learning. Am Psychol. 1962;17(2):83–91. https://doi.org/10.1037/h0048613.

  47. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 Suppl):S70–81. https://doi.org/10.1097/00001888-200410001-00022.

  48. Adams RA, Gilbert GE, Buckley LA, Nino Fong R, Fuentealba IC, Little EL. A method for functional task alignment analysis of an arthrocentesis simulator. Simul Healthc. 2018;13(4):289–94. https://doi.org/10.1097/sih.0000000000000313.

  49. Yudkowsky R, Tumuluru S, Casey P, Herlich N, Ledonne C. A patient safety approach to setting pass/fail standards for basic procedural skills checklists. Simul Healthc. 2014;9(5):277–82. https://doi.org/10.1097/sih.0000000000000044.

  50. Rehmann AJ. A handbook of flight simulation fidelity requirements for human factors research. 1995. https://doi.org/10.21949/1403228.

  51. Sharma S, Boet S, Kitto S, et al. Interprofessional simulated learning: the need for 'sociological fidelity'. J Interprof Care. 2011;25(2):81–3. https://doi.org/10.3109/13561820.2011.556514.

  52. Norman G, Dore K, Grierson L. The minimal relationship between simulation fidelity and transfer of learning. Med Educ. 2012;46(7):636–47. https://doi.org/10.1111/j.1365-2923.2012.04243.x.

  53. Nimbalkar A, Patel D, Kungwani A, Phatak A, Vasa R, Nimbalkar S. Randomized control trial of high fidelity vs low fidelity simulation for training undergraduate students in neonatal resuscitation. BMC Res Notes. 2015;8(1):636. https://doi.org/10.1186/s13104-015-1623-9.

  54. Massoth C, Röder H, Ohlenburg H, Hessler M, Zarbock A, Pöpping DM, et al. High-fidelity is not superior to low-fidelity simulation but leads to overconfidence in medical students. BMC Med Educ. 2019;19(1):29. https://doi.org/10.1186/s12909-019-1464-7.

  55. Morrow DS, Cupp JA, Broder JS. Versatile, reusable, and inexpensive ultrasound phantom procedural trainers. J Ultrasound Med. 2016;35(4):831–41. https://doi.org/10.7863/ultra.15.04085.

  56. Todnem N, Nguyen KD, Reddy V, Grogan D, Waitt T, Alleyne CH. A simple and cost-effective model for ventricular catheter placement training: technical note. J Neurosurg. 2021;134(5):1640–3. https://doi.org/10.3171/2020.2.jns19161.

  57. Reedy GB. Using cognitive load theory to inform simulation design and practice. Clin Simul Nurs. 2015;11(8):355–60. https://doi.org/10.1016/j.ecns.2015.05.004.

  58. Frederiksen JG, Sørensen SMD, Konge L, et al. Cognitive load and performance in immersive virtual reality versus conventional virtual reality simulation training of laparoscopic surgery: a randomized trial. Surg Endosc. 2020;34(3):1244–52. https://doi.org/10.1007/s00464-019-06887-8.

  59. Frithioff A, Frendø M, Mikkelsen PT, et al. Ultra-high-fidelity virtual reality mastoidectomy simulation training: a randomized, controlled trial. Eur Arch Otorhinolaryngol. 2020;277(5):1335–41. https://doi.org/10.1007/s00405-020-05858-3.

  60. Chiniara G, Cole G, Brisbin K, et al. Simulation in healthcare: a taxonomy and a conceptual framework for instructional design and media selection. Med Teach. 2013;35(8):e1380–95. https://doi.org/10.3109/0142159x.2012.733451.

  61. Fraser KL, Ayres P, Sweller J. Cognitive load theory for the design of medical simulations. Simul Healthc. 2015;10(5):295–307. https://doi.org/10.1097/sih.0000000000000097.

  62. Hamstra SJ, Brydges R, Hatala R, Zendejas B, Cook DA. Reconsidering fidelity in simulation-based training. Acad Med. 2014;89(3):387–92. https://doi.org/10.1097/acm.0000000000000130.

  63. Wayne DB, Siddall VJ, Butter J, et al. A longitudinal study of internal medicine residents' retention of advanced cardiac life support skills. Acad Med. 2006;81(10 Suppl):S9–12. https://doi.org/10.1097/00001888-200610001-00004.

  64. Moazed F, Cohen ER, Furiasse N, Singer B, Corbridge TC, McGaghie WC, et al. Retention of critical care skills after simulation-based mastery learning. J Grad Med Educ. 2013;5(3):458–63. https://doi.org/10.4300/jgme-d-13-00033.1.

  65. Ansquer R, Mesnier T, Farampour F, Oriot D, Ghazali DA. Long-term retention assessment after simulation-based-training of pediatric procedural skills among adult emergency physicians: a multicenter observational study. BMC Med Educ. 2019;19(1):348. https://doi.org/10.1186/s12909-019-1793-6.

  66. Offiah G, Ekpotu LP, Murphy S, Kane D, Gordon A, O'Sullivan M, et al. Evaluation of medical student retention of clinical skills following simulation training. BMC Med Educ. 2019;19(1):263. https://doi.org/10.1186/s12909-019-1663-2.

  67. Donoghue A, Navarro K, Diederich E, et al. Deliberate practice and mastery learning in resuscitation education: a scoping review. Resusc Plus. 2021;6:100137. https://doi.org/10.1016/j.resplu.2021.100137.

  68. Cheng A, Magid DJ, Auerbach M, et al. Part 6: Resuscitation education science: 2020 American Heart Association guidelines for cardiopulmonary resuscitation and emergency cardiovascular care. Circulation. 2020;142(16_suppl_2). https://doi.org/10.1161/cir.0000000000000903.

  69. Andersen SA, Konge L, Cayé-Thomasen P, et al. Retention of mastoidectomy skills after virtual reality simulation training. JAMA Otolaryngol Head Neck Surg. 2016;142(7):635–40. https://doi.org/10.1001/jamaoto.2016.0454.

  70. Cold KM, Konge L, Clementsen PF, Nayahangan LJ. Simulation-based mastery learning of flexible bronchoscopy: deciding factors for completion. Respiration. 2019;97(2):160–7. https://doi.org/10.1159/000493431.

  71. Taras J, Everett T. Rapid cycle deliberate practice in medical education – a systematic review. Cureus. 2017;9(4):e1180. https://doi.org/10.7759/cureus.1180.

  72. Cheng A, Eppich W, Grant V, et al. Debriefing for technology-enhanced simulation: a systematic review and meta-analysis. Med Educ. 2014;48(7):657–66. https://doi.org/10.1111/medu.12432.

  73. Eppich WJ, Hunt EA, Duval-Arnould JM, et al. Structuring feedback and debriefing to achieve mastery learning goals. Acad Med. 2015;90(11):1501–8. https://doi.org/10.1097/acm.0000000000000934.

  74. Hunt EA, Duval-Arnould JM, Nelson-McMillan KL, et al. Pediatric resident resuscitation skills improve after "rapid cycle deliberate practice" training. Resuscitation. 2014;85(7):945–51. https://doi.org/10.1016/j.resuscitation.2014.02.025.

  75. Nicholls D, Sweet L, Muller A, et al. Teaching psychomotor skills in the twenty-first century: revisiting and reviewing instructional approaches through the lens of contemporary literature. Med Teach. 2016;38(10):1056–63. https://doi.org/10.3109/0142159x.2016.1150984.

  76. Strandbygaard J, Bjerrum F, Maagaard M, et al. Instructor feedback versus no instructor feedback on performance in a laparoscopic virtual reality simulator: a randomized trial. Ann Surg. 2013;257(5):839–44. https://doi.org/10.1097/SLA.0b013e31827eee6e.

  77. Perretta JS, Duval-Arnould J, Poling S, et al. Best practices and theoretical foundations for simulation instruction using rapid-cycle deliberate practice. Simul Healthc. 2020;15(5):356–62. https://doi.org/10.1097/sih.0000000000000433.

  78. Clapper TC, Leighton K. Incorporating the reflective pause in simulation: a practical guide. J Contin Educ Nurs. 2020;51(1):32–8. https://doi.org/10.3928/00220124-20191217-07.

  79. Argyris C. Double-loop learning, teaching, and research. Acad Manag Learn Educ. 2002;1(2):206–18. https://doi.org/10.5465/amle.2002.8509400.

  80. Rudolph JW, Simon R, Rivard P, Dufresne RL, Raemer DB. Debriefing with good judgment: combining rigorous feedback with genuine inquiry. Anesthesiol Clin. 2007;25(2):361–76. https://doi.org/10.1016/j.anclin.2007.03.007.

  81. Rudolph JW, Raemer DB, Simon R. Establishing a safe container for learning in simulation: the role of the presimulation briefing. Simul Healthc. 2014;9(6):339–49. https://doi.org/10.1097/sih.0000000000000047.

  82. Molloy E, Boud D. Changing conceptions of feedback. In: Boud D, Molloy E, editors. Feedback in higher and professional education: understanding it and doing it well. Routledge; 2013. p. 11–33.

  83. Tolsgaard MG, Madsen ME, Ringsted C, et al. The effect of dyad versus individual simulation-based ultrasound training on skills transfer. Med Educ. 2015;49(3):286–95. https://doi.org/10.1111/medu.12624.

  84. Tolsgaard MG, Rasmussen MB, Bjørck S, et al. Medical students' perception of dyad practice. Perspect Med Educ. 2014;3(6):500–7. https://doi.org/10.1007/s40037-014-0138-8.

  85. Kolbe M, Eppich W, Rudolph J, et al. Managing psychological safety in debriefings: a dynamic balancing act. BMJ Simul Technol Enhanced Learn. 2019. https://doi.org/10.1136/bmjstel-2019-000470.

  86. Srinivasa K, Chen Y, Henning MA. The role of online videos in teaching procedural skills to post-graduate medical learners: a systematic narrative review. Med Teach. 2020;42(6):689–97. https://doi.org/10.1080/0142159X.2020.1733507.

  87. Simpson T, Kitchen S, Lavelle M, et al. Evaluation Practice Toolkit. King's College London; 2017. https://www.kcl.ac.uk/study/learningteaching/kli/research/ceps-research-group/evaluation-toolkit-1-developing-your-evaluation-strategy.

  88. Cook DA, Brydges R, Zendejas B, et al. Mastery learning for health professionals using technology-enhanced simulation: a systematic review and meta-analysis. Acad Med. 2013;88(8):1178–86. https://doi.org/10.1097/ACM.0b013e31829a365d.

  89. Cohen ER, McGaghie WC, Wayne DB, et al. Recommendations for reporting mastery education research in medicine (ReMERM). Acad Med. 2015;90(11):1509–14. https://doi.org/10.1097/acm.0000000000000933.

  90. Yardley S, Teunissen PW, Dornan T. Experiential learning: AMEE Guide No. 63. Med Teach. 2012;34(2):e102–15. https://doi.org/10.3109/0142159x.2012.650741.

  91. Gaba D, Howard SK, Fish K, et al. Simulation-based training in anesthesia crisis resource management (ACRM): a decade of experience. Simul Gaming. 2001;32(2):175–93. https://doi.org/10.1177/104687810103200206.

  92. Zerth H, Harwood R, Tommaso L, et al. An inexpensive, easily constructed, reusable task trainer for simulating ultrasound-guided pericardiocentesis. J Emerg Med. 2012;43(6):1066–9. https://doi.org/10.1016/j.jemermed.2011.05.066.


Acknowledgements

We are grateful for the help and support of Gabriel Reedy, Anna Jones and Libby Thomas from King’s College London.

Funding

No external funding received.

Author information


Contributions

DG and RK both contributed equally to the overall outline of the paper. DG wrote sections contained within the case study section. RK wrote sections under the introduction. Both authors contributed equally to the SBPT blueprint and overall editing of all sections. The paper was reviewed and agreed by both authors prior to submission.

Authors’ information

The concept of this paper was developed during the King’s College London post-graduate certificate in Clinical Education undertaken by DG. RK teaches on this programme.

Corresponding author

Correspondence to David Gent.

Ethics declarations

Ethics approval and consent to participate

Ethics approval not applicable.

Consent for publication

Consent for publication not applicable.

Competing interests

There are no competing interests.




About this article


Cite this article

Gent, D., Kainth, R. Simulation-based procedure training (SBPT) in rarely performed procedures: a blueprint for theory-informed design considerations. Adv Simul 7, 13 (2022). https://doi.org/10.1186/s41077-022-00205-4
