
Development and implementation of a novel, mandatory competency-based medical education simulation program for pediatric emergency medicine faculty

Abstract

Background

Maintaining acute care physician competence is critically important. Current maintenance of certification (MOC) programs have started to incorporate simulation-based education (SBE). However, competency expectations have not been defined. This article describes the development of a mandatory, annual, competency-based SBE program for technical and resuscitation skills for pediatric emergency medicine (PEM) physicians.

Methods

The competency-based medical education (CBME) program was introduced in 2016. Procedural skill requirements were based on a needs assessment derived from Royal College PEM training guidelines. Resuscitation scenarios were modified versions of pre-existing in-situ mock codes or critical incident cases. All full-time faculty were required to participate annually in both sessions. Delivery of educational content included a flipped classroom website, deliberate practice, and stop-pause debriefing. All stations required competency checklists and global rating scales.

Results

Between 2016 and 2018, 40 physicians and 48 registered nurses attended these courses. Overall course evaluations in 2018 were 4.92/5 and 4.93/5. Barriers to implementation include the need for many simulation education experts, time commitment, and clinical scheduling during course events.

Conclusion

We have developed a mandatory simulation-based technical and resuscitation CBME program for PEM faculty. This simulation-based CBME program could be adapted to other acute care disciplines. Further research is required to determine whether these skills are enhanced in both simulated and real environments and whether there is an impact on patient outcomes.

Background

Maintaining physician competence is critically important in acute care settings in order to deliver high-quality, evidence-based care. Current maintenance of certification (MOC) programs rely mostly on passive learning strategies. Simulation-based education (SBE), often in the form of in situ mock codes, has been widely adopted for post-graduate training. Efforts to incorporate simulation into MOC for practicing physicians have recently been introduced in some disciplines; however, performance is not linked to competency expectations [1, 2]. Annual simulation-based procedural and resuscitation competency requirements would help ensure that physicians in acute care settings maintain critical lifesaving skills.

Physician knowledge decay is a well-known phenomenon after post-graduate training. In fact, skill decay has been demonstrated in numerous cardiopulmonary resuscitation (CPR)-based courses [3,4,5,6,7,8]. To maintain competence, emergency physicians need to participate in continuing medical education to ensure updated medical knowledge and skill acquisition, especially for critical procedures [9]. The American Society of Anesthesiologists (ASA) introduced simulation-based education into MOC for Anesthesia in 2010 [1]. Upon completion of the simulation scenarios, participants are required to identify 3 areas for practice improvement and incorporate them into their future daily practice [2]. Participants receive Maintenance of Certification in Anesthesiology (MOCA) credits after completing the program; however, they are not required to pass the simulation cases in order to complete the process.

Some studies have shown that participant satisfaction is greater with simulation-based workshops and courses compared to traditional lecture-based courses [10,11,12,13]. Simulation has the advantage of being utilized not only as a training tool, but also as a framework to assess teamwork principles, leadership, and communication skills [14,15,16,17]. Skills developed during simulation training are transferable to patient care, resulting in significant improvements in patient outcomes [18,19,20,21,22]. Despite these benefits, Pirie et al. demonstrated that PEM physicians participating in weekly divisional interprofessional in situ simulations averaged only 1.25 sessions per year and team skills plateaued with time [23].

Competency-based medical education (CBME) has attracted the attention of educators and accreditation bodies [24,25,26] as it allows competency measurement for specific skills by being outcome-based and promotes learner-driven skills acquisition [25,26,27]. Frank et al. define CBME as “an approach to preparing physicians for practice that is fundamentally oriented to graduate outcome abilities and organized around competencies” [28]. CBME is currently being implemented primarily in post-graduate training programs but not with practicing clinicians.

Although it is well known that simulation-based education addresses many educational and competence assessment needs for physicians, its utilization in a competency format for faculty members in pediatric emergency medicine (PEM) has not, to our knowledge, been previously studied. This paper describes the development and implementation of a mandatory simulation-based CBME program for faculty in PEM.

Rationale

We developed a mandatory simulation-based competency program for procedural and resuscitation skills in pediatric emergency medicine. In this report we describe the program, the curriculum from 2016 to 2018, and its applicability to other acute care settings aiming to adopt similar programs.

Methods

This program was implemented in 2016 in the emergency department of a tertiary care pediatric hospital. At the time of implementation, there were 28–30 full-time staff MDs, 6–8 half-time or greater contract staff MDs, 16 PEM fellows, 6–7 advanced training fellows (e.g., simulation, POCUS), and over 100 RNs. In 2018, the ED had 80,555 patient visits of which the Canadian Triage and Acuity Scores (CTAS) were CTAS 1 (1.1%), CTAS 2 (19.7%), CTAS 3 (45.3%), CTAS 4 (27.1%), and CTAS 5 (5.8%).
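The triage-acuity breakdown above can be turned into approximate visit counts. The following is an illustrative sketch only, using the totals and percentages reported in this paragraph; the rounding is ours, and the reported percentages sum to 99% in the source:

```python
# Approximate 2018 ED visits per Canadian Triage and Acuity Scale (CTAS)
# level, derived from the figures reported above (illustrative only).
TOTAL_VISITS_2018 = 80_555
CTAS_FRACTIONS = {1: 0.011, 2: 0.197, 3: 0.453, 4: 0.271, 5: 0.058}

visits_by_ctas = {
    level: round(TOTAL_VISITS_2018 * frac)  # round to whole visits
    for level, frac in CTAS_FRACTIONS.items()
}
# Note: the published percentages sum to 0.99, so the counts fall
# slightly short of the total due to rounding in the source.
```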

Course development

The CBME program for PEM faculty was introduced in 2016 and initially included training and assessment of both procedural and resuscitation skills. Although not done a priori, our curriculum development included the following stages of Kern's 6-step approach to curricular development [29].

  1. Problem identification—skills gaps identified, frequency of individual in situ simulations insufficient, recurring morbidity cases

  2. Needs assessment—Royal College Training Objectives

  3. Goals and objectives—competency in core technical and resuscitation skills

  4. Educational strategies—asynchronous website modules, annual simulation-based training, competency testing

  5. Implementation—leadership support, simulation centre resources, PEM education and clinical expertise, scheduling

  6. Evaluation—course evaluations, faculty feedback (future study), effect on in situ simulation performance (future study), mastery testing (future study)

Procedural skills content

Our division undertook a physician skill needs assessment based on the existing Royal College of Physicians and Surgeons of Canada Objectives of Training in the Subspecialty of Pediatric Emergency Medicine and found that many physicians had never performed, or had only infrequently performed, many critical procedural skills. The top 4 ranked procedural skills from this needs assessment were chosen for the first course (see Table 1).

Table 1 Procedural skills stations

The pre-existing in situ mock code (ISMC) committee, which consisted of 5 PEM simulation education faculty, 1 interprofessional education nursing specialist, 1 clinical support nurse, and 1 respiratory therapy education specialist, was responsible for case selection and development.

Resuscitation case content

Resuscitation scenarios were developed from pre-existing in situ mock code (ISMC) cases, which incorporated both Pediatric Advanced Life Support (PALS) algorithms and cases that challenge participants’ team or crisis resource management (CRM) skills (see Table 2). Resuscitation station content was initially developed primarily by the primary author (JP) and members of the PEM in situ team training committee. It was decided that one case per session would include a PALS algorithm sequence. A second case was selected based on the need to order multiple medications, testing the leader’s ability to prioritize medication orders and the team’s ability to deliver the medications in a timely manner using effective closed-loop communication, as this was identified as the most common skills gap in our in situ team training program [23]. Finally, the third case was selected at the committee’s discretion based on either newly updated guidelines (sepsis, trauma—massive hemorrhage) or rare but plausible cases (sedation with laryngospasm). Changes to the program were decided by the PEM simulation committee based on feedback from evaluations as well as morbidity case reviews.

Table 2 Resuscitation skills stations

Because the CBME program was developed as an adjunct to the existing ISMC team training program, Institutional Ethics Review was not required.

Course delivery

The CBME program initially consisted of 2 half-day procedural courses and 2 half-day resuscitation courses per year. For ease of administration, the half-days were combined into 2 full-day courses after the first year. Each faculty MD is required to complete one procedural and one resuscitation simulation course per year. In 2018, point-of-care ultrasound (POCUS) was added to the procedural half-day component of the program. Nurses are traditionally allotted one full education day per year, so those assigned to the CBME day participated actively.

PEM RNs completed an RN-focused procedural skills education half-day separately from the MD participants and then joined the half-day resuscitation team-based competency portion of the course. Due to RN staffing shortages, a maximum of 8 RNs could attend any given CBME session, resulting in 2 RNs per group. RNs unable to attend the CBME course were scheduled into the monthly interprofessional in situ mock trauma simulations; during non-CBME months, the nurses continued to participate in the existing in situ mock traumas.

Each resuscitation group consisted of 2–4 staff physicians and 2 RN participants, which enhanced the interprofessional teamwork of the sessions. Participants were expected to play the role they would normally fill in a real scenario. A debriefing session was held after each resuscitation scenario to clarify medical issues arising in the case and to discuss crisis resource management aspects, including interprofessional teamwork.

The number of MD participants per course ranged from 10 to 19, and the number of instructors ranged from 8 for a half-day course to 16 for a full-day course, averaging approximately 1 instructor per POCUS/technical station and 2 instructors per resuscitation station. In total, 40 PEM physicians and 48 PEM nurses participated in the program from 2016 to 2018. The mean percentage of MDs participating per year was 85.4%, and the mean percentage instructing per year was 42.8%.

Website content

An asynchronous flipped classroom approach was utilized. The RN-specific procedures eLearning was available on the SickKids ED intranet education page. A separate website with MD-specific procedure modules and interprofessional (MD and RN) resuscitation case modules was created. Each learning module, consisting of online videos and content-specific reading material, was made available for participants to review prior to the course. Website material was prepared by PEM and simulation experts as well as our interprofessional nurse education specialist (CG), based on RCPSC core knowledge requirements for PEM-trained physicians as well as divisional clinical pathways, order sets, and procedural guidelines. Website material was password protected for participants. The competency checklists for each station (discussed below) were also posted on the website so that participants could familiarize themselves with them beforehand. All MD and RN participants were expected to review the content material before taking the course. Station content included the following:

  1. Station objectives

  2. Video instruction

  3. Reading material: e.g., guidelines, journal articles, textbook chapters

  4. Checklist

  5. Additional resources or links

Competency testing

Checklists and global rating scales (GRS), specifically designed for each individual procedure or resuscitation station, were used to assess competency throughout the full-day course. Checklists were designed separately for each station; some were modified from the previously validated Objective Structured Assessment of Technical Skill (OSATS) [30], while others were designed by PEM faculty and PEM educational experts with expertise in those skills (procedural) or content areas (resuscitation) (see Additional file 1 MD for an example of a procedural checklist). Nursing used locally derived checklists for procedural skills (see Additional file 1 RN for an example of a procedural checklist).

For resuscitation scenarios, checklists included crisis resource management components to highlight the importance of team functioning during resuscitations. While checklists listed every step in performing a procedural skill or accurately running a resuscitation scenario, the most important steps were highlighted in bold. Several studies have assessed the validity of GRS in the emergency setting [31, 32], and a systematic review has demonstrated some of the advantages of GRS over checklists [33]. Therefore, checklists were used formatively and the GRS was used summatively to determine competence (see Additional file 2). Participants were required to achieve all bolded checklist items as a minimum passing standard (MPS) in order to achieve overall competence on the GRS.
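The two-stage pass logic described above can be sketched in code. This is a minimal illustration with hypothetical data structures, not the authors' actual assessment forms, and the GRS cut score shown is an assumption (the paper does not report one):

```python
# Sketch of the formative-checklist / summative-GRS pass logic:
# every critical (bolded) checklist item is a minimum passing
# standard (MPS); only then does the GRS determine competence.
from dataclasses import dataclass

@dataclass
class ChecklistItem:
    description: str
    critical: bool   # bolded on the paper checklist
    achieved: bool

def meets_mps(checklist: list[ChecklistItem]) -> bool:
    """All critical (bolded) items must be achieved."""
    return all(item.achieved for item in checklist if item.critical)

def is_competent(checklist: list[ChecklistItem], grs_score: int,
                 grs_pass_threshold: int = 4) -> bool:
    # The threshold of 4 is purely illustrative; the source does not
    # specify the GRS cut score.
    return meets_mps(checklist) and grs_score >= grs_pass_threshold
```

Non-critical items can be missed without failing the station; they serve only as formative feedback, which mirrors the formative/summative split the authors describe.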

Instructors with expertise within PEM education were identified and recruited to teach and evaluate each station. Instructors were briefed on the components of the checklists and GRS and asked to familiarize themselves with the website course material. Guidance on using the checklists and GRS to assess competency was also provided. A core group of instructors emerged as the course progressed, although instructors themselves needed to rotate through competency days as participants.

For procedural competence testing, all participants engaged in repeated deliberate practice, an educational methodology of repeated skills and resuscitation training with feedback, and then completed final competency testing [34,35,36,37]. Unsuccessful participants repeated the testing until competency was met. Failure to meet competency by the end of the course resulted in failing the station.
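The repeat-until-competent cycle can be expressed as a simple loop. This is an illustrative sketch, not the authors' software; the attempt cap stands in for the hard limit imposed by the end of the course, and `attempt_skill` is a hypothetical callable representing one practice-plus-feedback cycle:

```python
# Illustrative sketch of the repeated deliberate practice loop:
# practice with instructor feedback, then retest until competency
# is met or the course (here, an attempt cap) ends.
def run_procedural_station(attempt_skill, max_attempts=5):
    """attempt_skill() -> (competent: bool, feedback: str); hypothetical."""
    for attempt in range(1, max_attempts + 1):
        competent, feedback = attempt_skill()
        if competent:
            return {"passed": True, "attempts": attempt}
        # Instructor feedback informs the next practice cycle.
    return {"passed": False, "attempts": max_attempts}  # station failed
```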

For resuscitation competence testing, stop-pause debriefing [38] was utilized to reinforce learning and key scenario competencies followed by a complete scenario for GRS competency. Competency was defined a priori as team competence rather than individual competence, as the performance of the team ultimately determines outcomes in real-life cases (see Additional files 3 and 4 for examples of resuscitation station checklists and GRS). Unsuccessful team performance would result in teams needing to repeat the scenario until competency was achieved.

Although individuals and teams were infrequently unsuccessful, performance data will be analyzed in a separate study.

No formal rater training was provided for the checklists and global rating scores. The majority of raters had used the checklists in other courses and in our in situ mock code program, so consistency of scoring was likely high.

Cost

Costs were estimated (see Table 3) and include (1) faculty time, both teachers and learners; (2) equipment, including models for procedures; (3) room rental (covered by the institutional simulation program); and (4) supplies.

Table 3 Average financing for the CBME program (projected and actual $CAN) per year

Course evaluations

Evaluations of both instructors and the course were initiated with program implementation. From 2016 to 2018, the average instructor evaluation was 4.86 for POCUS (range 4.63–5.0), 4.95 for procedural stations (range 4.5–5.0), and 4.81 for resuscitation stations (range 4.63–5.0). Overall course evaluations began in 2018, with mean scores of 4.92 and 4.93 for the two courses.

Overall comments for the course were very favorable. Technical skills comments included “great stations,” “friendly and positive learning environment,” “enjoyed viewing uncommon but potential complications to common procedures in the ED,” and “deliberate practice awesome”. Comments from resuscitation stations included “great for nursing to participate,” “hands on and interactive with constructive feedback in real time and conductive to my learning during scenario,” “makes people feel good even when feedback is constructive/negative”, “never felt judged or criticized”, and “love that it was a group scenario and focus was on team and communication.” Some barriers were also identified: “more facilitators to speed up assessments,” “long day, resuscitation sessions shorter,” “more nurses per group.”

Discussion

The authors report the development, implementation, and participant evaluations of an innovative multimodal continuing education course for faculty competency maintenance and assessment. Online learning material included key articles, clinical guidelines, videos, checklists, and online self-assessment tools. The hands-on procedures incorporated deliberate practice and resuscitations were debriefed using stop-pause methodology. A synthesis of systematic reviews showed that CME activities that were more interactive, used more methods, and involved multiple exposures were more likely to lead to improved physician performance and patient outcomes [39].

Challenges

Scheduling both participants and instructors was, and continues to be, a challenge. Given the need to cover the clinical workload on CBME course days and individuals’ academic responsibilities, developing a balanced schedule was difficult. On average, 1–3 staff would “drop out” in the week leading up to the course. As these sessions are mandatory, most of these individuals would then request to participate in the following session, leading to larger group sizes, which ultimately impacted flow, timing, and instructor scheduling. The number of MD participants ranged from 9 to 17 per session. Additionally, the division continues to add new staff as the clinical and academic load has increased significantly each year.

The number of instructors with either simulation or technical expertise could be a challenge for smaller programs. We have five staff with simulation fellowship training or equivalent, as well as numerous faculty who participate in the simulation instruction of post-graduate trainees from junior residents through PEM fellows, most of whom have taken a simulation instructor workshop. Additionally, many staff have clinical expertise that was utilized for technical or scenario case development and instruction. Despite this broad educational expertise, approximately 12–15 MD staff educators and 4–5 RN educators are required per session, meaning that many of the simulation “experts” were required to teach multiple courses in a row. As a result, these faculty have not been able to take the course as participants on an annual basis, as mandated by the program. For smaller programs with fewer simulation educators, it may be difficult to run a program of this size.

Nursing participation was more challenging than physician participation for several reasons. First, there are over 100 nurses in our division, so by sheer numbers it would be very difficult to complete the CBME course in a given year. Additionally, the funding model for nurses permits only a limited number of paid education days per year, allowing just under one-third of the RN group to participate each year. The remaining nurses continued to participate in the in situ program plus the pre-existing annual procedural training.

Next steps

Station and content development are important components of the program. Subsequent changes were made iteratively based on feedback from faculty evaluations and the simulation/resuscitation expert panel. Ultimately, it will be important to define a set curriculum that can be rotated over subsequent years and that represents both common skills and infrequent but high-risk critical skills. Potential solutions include repeating the needs assessment as well as continuing to utilize quality reviews as a source for new case development.

Although procedural and POCUS skills were easily evaluated individually, the resuscitation stations were evaluated based on team competence. Ideally, our competency evaluations should also include leader competency. Although most physicians led at least one case, courses with larger participant numbers prevented all MDs from taking the lead position. Moreover, competence in one case is not necessarily generalizable to other station content. Individually testing all MDs across all cases would require repeating the cases 3 or 4 more times for each group, which is not feasible in a half-day format. In the future, we hope to review completion rates and incorporate strategies to move from a competency model to a true mastery model.

Future research is required in order to evaluate the impact of this innovative program. Following Kirkpatrick’s hierarchy, evaluation of education programs happens at four levels: reactions, learning (knowledge, skills, attitudes), behavior (simulated or clinical), results (patient outcomes) [40]. As this is a new program, evaluation across all four levels is recommended. Currently, we are evaluating our program at the first three levels and hope to report on these findings in the near future. Additionally, feasibility is an extremely important consideration, and the ability of other acute care disciplines to adopt this program will depend on resources, finances, and leadership buy-in.

Conclusion

We have developed an annual mandatory simulation-based technical, POCUS, and resuscitation CBME program for PEM faculty. Although scheduling challenges exist, the course was extremely well received by participants, with excellent participation rates. Ensuring lifelong competence in acute care skills is essential for PEM physicians and nurses. This program addresses gaps in traditional models of MOC and the skills decay associated with life support courses. Our simulation-based CBME program could be adapted and generalized to other acute care disciplines. Further research is required to determine whether these skills are enhanced in both simulated and real environments and whether there is an impact on patient outcomes.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

References

  1. McIvor W, Burden A, Weinger MB, Steadman R. Simulation for maintenance of certification in anesthesiology: the first two years. J Contin Educ Health Prof. 2012;32(4):236–42. https://doi.org/10.1002/chp.21151.

  2. Steadman RH, Burden AR, Huang YM, Gaba DM, Cooper JB. Practice improvements based on participation in simulation for the maintenance of certification in anesthesiology program. Anesthesiology. 2015;122(5):1154–69. https://doi.org/10.1097/ALN.0000000000000613.

  3. Yang CW, Yen ZS, McGowan JE, et al. A systematic review of retention of adult advanced life support knowledge and skills in healthcare providers. Resuscitation. 2012;83(9):1055–60. https://doi.org/10.1016/j.resuscitation.2012.02.027.

  4. Einspruch EL, Lynch B, Aufderheide TP, Nichol G, Becker L. Retention of CPR skills learned in a traditional AHA Heartsaver course versus 30-min video self-training: a controlled randomized study. Resuscitation. 2007;74(3):476–86. https://doi.org/10.1016/j.resuscitation.2007.01.030.

  5. Wik L, Myklebust H, Auestad BH, Steen PA. Retention of basic life support skills 6 months after training with an automated voice advisory manikin system without instructor involvement. Resuscitation. 2002;52(3):273–9. https://doi.org/10.1016/S0300-9572(01)00476-2.

  6. Wik L, Myklebust H, Auestad BH, Steen PA. Twelve-month retention of CPR skills with automatic correcting verbal feedback. Resuscitation. 2005;66(1):27–30. https://doi.org/10.1016/j.resuscitation.2004.12.022.

  7. Smith KK, Gilcreast D, Pierce K. Evaluation of staff’s retention of ACLS and BLS skills. Resuscitation. 2008;78(1):59–65. https://doi.org/10.1016/j.resuscitation.2008.02.007.

  8. Meaney PA, Sutton RM, Tsima B, Steenhoff AP, Shilkofski N, Boulet JR, et al. Training hospital providers in basic CPR skills in Botswana: acquisition, retention and impact of novel training techniques. Resuscitation. 2012;83(12):1484–90. https://doi.org/10.1016/j.resuscitation.2012.04.014.

  9. Ross BK, Metzner J. Simulation for maintenance of certification. Surg Clin North Am. 2015;95(4):893–905. https://doi.org/10.1016/j.suc.2015.04.010.

  10. Warren JN, Luctkar-Flude M, Godfrey C, Lukewich J. A systematic review of the effectiveness of simulation-based education on satisfaction and learning outcomes in nurse practitioner programs. Nurse Educ Today. 2016;46:99–108. https://doi.org/10.1016/j.nedt.2016.08.023.

  11. Solymos O, O'Kelly P, Walshe CM. Pilot study comparing simulation-based and didactic lecture-based critical care teaching for final-year medical students. BMC Anesthesiol. 2015;15(1):153. https://doi.org/10.1186/s12871-015-0109-6.

  12. Curran V, Fleet L, White S, Bessell C, Deshpandey A, Drover A, et al. A randomized controlled study of manikin simulator fidelity on neonatal resuscitation program learning outcomes. Adv Health Sci Educ Theory Pract. 2015;20(1):205–18. https://doi.org/10.1007/s10459-014-9522-8.

  13. Corbridge SJ, Robinson FP, Tiffen J, Corbridge TC. Online learning versus simulation for teaching principles of mechanical ventilation to nurse practitioner students. Int J Nurs Educ Scholarsh. 2010;7:Article12.

  14. Wong AH, Gang M, Szyld D, Mahoney H. Making an “attitude adjustment”: using a simulation-enhanced interprofessional education strategy to improve attitudes toward teamwork and communication. Simul Healthc. 2016;11(2):117–25. https://doi.org/10.1097/SIH.0000000000000133.

  15. McEwan D, Ruissen GR, Eys MA, Zumbo BD, Beauchamp MR. The effectiveness of teamwork training on teamwork behaviors and team performance: a systematic review and meta-analysis of controlled interventions. PLoS One. 2017;12(1):e0169604. https://doi.org/10.1371/journal.pone.0169604.

  16. Miller D, Crandall C, Washington C 3rd, McLaughlin S. Improving teamwork and communication in trauma care through in situ simulations. Acad Emerg Med. 2012;19(5):608–12. https://doi.org/10.1111/j.1553-2712.2012.01354.x.

  17. Rosenman ED, Shandro JR, Ilgen JS, Harper AL, Fernandez R. Leadership training in health care action teams: a systematic review. Acad Med. 2014;89(9):1295–306. https://doi.org/10.1097/ACM.0000000000000413.

  18. Barsuk JH, Cohen ER, Potts S, Demo H, Gupta S, Feinglass J, et al. Dissemination of a simulation-based mastery learning intervention reduces central line-associated bloodstream infections. BMJ Qual Saf. 2014;23(9):749–56. https://doi.org/10.1136/bmjqs-2013-002665.

  19. Barsuk JH, Cohen ER, Williams MV, Scher J, Jones SF, Feinglass J, et al. Simulation-based mastery learning for thoracentesis skills improves patient outcomes: a randomized trial. Acad Med. 2018;93(5):729–35. https://doi.org/10.1097/ACM.0000000000001965.

  20. Knight LJ, Gabhart JM, Earnest KS, Leong KM, Anglemyer A, Franzon D. Improving code team performance and survival outcomes: implementation of pediatric resuscitation team training. Crit Care Med. 2014;42(2):243–51. https://doi.org/10.1097/CCM.0b013e3182a6439d.

  21. Andreatta P, Saxton E, Thompson M, Annich G. Simulation-based mock codes significantly correlate with improved pediatric patient cardiopulmonary arrest survival rates. Pediatr Crit Care Med. 2011;12(1):33–8. https://doi.org/10.1097/PCC.0b013e3181e89270.

  22. Josey K, Smith ML, Kayani AS, Young G, Kasperski MD, Farrer P, et al. Hospitals with more-active participation in conducting standardized in-situ mock codes have improved survival after in-hospital cardiopulmonary arrest. Resuscitation. 2018;133:47–52. https://doi.org/10.1016/j.resuscitation.2018.09.020.

  23. Pirie J, Cardenas S, Seleem W, Kljujic D, Schneeweiss S, Glanfield C, et al. The use of statistical process control charts to evaluate interprofessional education sessions embedded into a pediatric emergency in situ resuscitation program. Simul Healthc. 2019;14(2):121–8. https://doi.org/10.1097/SIH.0000000000000336.

  24. Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach. 2010;32(8):676–82. https://doi.org/10.3109/0142159X.2010.500704.

  25. Frank JR, Snell LS, Cate OT, Holmboe ES, Carraccio C, Swing SR, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638–45. https://doi.org/10.3109/0142159X.2010.501190.

  26. Ten Cate O, Billett S. Competency-based medical education: origins, perspectives and potentialities. Med Educ. 2014;48(3):325–32. https://doi.org/10.1111/medu.12355.

  27. Dath D, Iobst W. The importance of faculty development in the transition to competency-based medical education. Med Teach. 2010;32(8):683–6. https://doi.org/10.3109/0142159X.2010.500710.

  28. Frank JR, Mungroo R, Ahmad Y, Wang M, De Rossi S, Horsley T. Toward a definition of competency-based education in medicine: a systematic review of published definitions. Med Teach. 2010;32(8):631–7. https://doi.org/10.3109/0142159X.2010.500898.

  29. Thomas PA, Kern DE, Hughes MT, Chen BY. Curriculum development for medical education: a six-step approach. Johns Hopkins University Press; 2015. p. 300.

  30. Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchison C, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84(2):273–8. https://doi.org/10.1046/j.1365-2168.1997.02502.x.

  31. Adler MD, Vozenilek JA, Trainor JL, Eppich WJ, Wang EE, Beaumont JL, et al. Comparison of checklist and anchored global rating instruments for performance rating of simulated pediatric emergencies. Simul Healthc. 2011;6(1):18–24. https://doi.org/10.1097/SIH.0b013e318201aa90.

  32. Hall AK, Dagnone JD, Lacroix L, Pickett W, Klinger DA. Queen’s simulation assessment tool: development and validation of an assessment tool for resuscitation objective structured clinical examination stations in emergency medicine. Simul Healthc. 2015;10(2):98–105. https://doi.org/10.1097/SIH.0000000000000076.

  33. Ilgen JS, Ma IW, Hatala R, Cook DA. A systematic review of validity evidence for checklists versus global rating scales in simulation-based assessment. Med Educ. 2015;49(2):161–73. https://doi.org/10.1111/medu.12621.

  34. Ericsson KA. Deliberate practice and acquisition of expert performance: a general overview. Acad Emerg Med. 2008;15(11):988–94. https://doi.org/10.1111/j.1553-2712.2008.00227.x.

  35. Hayward M, Chan T, Healey A. Dedicated time for deliberate practice: one emergency medicine program’s approach to point-of-care ultrasound (PoCUS) training. CJEM. 2015;17(5):558–61. https://doi.org/10.1017/cem.2015.24.

  36. Hunt EA, Duval-Arnould JM, Nelson-McMillan KL, et al. Pediatric resident resuscitation skills improve after “rapid cycle deliberate practice” training. Resuscitation. 2014;85(7):945–51. https://doi.org/10.1016/j.resuscitation.2014.02.025.

  37. Taras J, Everett T. Rapid cycle deliberate practice in medical education - a systematic review. Cureus. 2017;9(4):e1180. https://doi.org/10.7759/cureus.1180.

  38.

    McMullen M, Wilson R, Fleming M, Mark D, Sydor D, Wang L, et al. “Debriefing-on-demand”: a pilot assessment of using a “pause button” in medical simulation. Simul Healthc. 2016;11(3):157–63. https://doi.org/10.1097/SIH.0000000000000140.

    Article  PubMed  Google Scholar 

  39. 39.

    Cervero RM, Gaines JK. The impact of CME on physician performance and patient health outcomes: an updated synthesis of systematic reviews. J Contin Educ Health Prof. 2015;35(2):131–8. https://doi.org/10.1002/chp.21290.

    Article  PubMed  Google Scholar 

  40. 40.

    Kirkpatrick DL, Kirkpatrick JD. Evaluating training programs : the four levels; 2006.

    Google Scholar 


Acknowledgements

I would like to acknowledge Dr. Savithiri Ratnapalan for her contributions to the program and for her careful review of the manuscript.

Funding

No funding was obtained for this study.

Author information

Affiliations

Authors

Contributions

JP designed the program, designed the website material, and was the major contributor in writing the manuscript. JF helped with data collection and writing of the introduction. MG helped with the design of the reporting of the program. LS helped with the design of the program. CG contributed to nursing data and the design of the resuscitation checklists. AK helped design the procedural component of the program and had a large role in the discussion. All authors read and approved the final version of the manuscript.

Authors’ information

Dr. Jonathan Pirie is the Director for Simulation for the Division of Pediatric Emergency Medicine, Hospital for Sick Children, Toronto, Canada.

Corresponding author

Correspondence to Jonathan Pirie.

Ethics declarations

Ethics approval and consent to participate

This was obtained from the Hospital for Sick Children Ethics Review Board (REB # 1000064640).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

MD: Procedural Checklist. RN: Procedural Checklist.

Additional file 2.

Procedural Global Rating Scale (GRS).

Additional file 3.

Resuscitation Checklist.

Additional file 4.

Resuscitation Global Rating Scale (GRS).

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Pirie, J., Fayyaz, J., Gharib, M. et al. Development and implementation of a novel, mandatory competency-based medical education simulation program for pediatric emergency medicine faculty. Adv Simul 6, 17 (2021). https://doi.org/10.1186/s41077-021-00170-4


Keywords

  • Simulation
  • Competency-based medical education
  • Procedures
  • Resuscitation
  • Continuing professional development