This study compares Traditional education with RCDP education with respect to anesthesiology residents’ application of ECC and communication in a simulated environment. For the purposes of this paper and to create a shared mental model of the instruction, each learner group received either “Traditional” or “RCDP” instruction, broadly defined as follows: the “Traditional” instruction consisted of a 90-min didactic (teaching), followed by a 60-min group, immersive simulation and debriefing (application), and survey completion (assessment); the “RCDP” instruction group completed the same elements, with teaching and application occurring concurrently during a 2.5-h session, as dictated by the iterative cycles used in RCDP. Additional curricular details are provided in the instructional design section.
An expert trained in curriculum development (LAR) and three experts in simulation education (EB, TC, LB) designed and implemented the curriculum. Resident ECC skills and communication were assessed by a trained simulation expert who was blinded to group allocation. Other measures were identified during a review of the literature and included previous experience with ECC [15], satisfaction/self-confidence [5], and learner’s experience ranking surveys [16].
Study design and participants
After obtaining ethics approval from the University of Alabama at Birmingham (UAB) Institutional Review Board, members of the postgraduate year (PGY) 2 anesthesiology residency class (n = 21) were randomly assigned to one of two bootcamp groups: Traditional or RCDP education. All residents took part in the activities, which occurred in July 2019. A power calculation was not used to determine the sample size, as all 21 PGY2 residents were to be included in the study. Residents provided electronic consent for research, allowing analysis of their survey results and simulation recordings. All participants were previously certified in advanced cardiac life support. Each group participated in 2 days of bootcamp, with the 2 days occurring 2 weeks apart.
Study setting
These activities took place at the University of Alabama at Birmingham’s Office of Interprofessional Simulation for Innovative Clinical Practice. This simulation center, accredited by the Society for Simulation in Healthcare (SSH), is equipped with multiple simulation, debriefing, and control rooms, in addition to audiovisual capabilities. The PGY2 anesthesiology bootcamp is held annually, with all PGY2 anesthesiology residents receiving simulation and didactic content designed to help them successfully transition to anesthesiology clinical practice. Per the Accreditation Council for Graduate Medical Education [17], all PGY2 anesthesiology residents must have completed a minimum of 6 months of education that includes “experience in caring for inpatients in family medicine, internal medicine, neurology, obstetrics and gynecology, pediatrics, surgery or any of the surgical specialties, or any combination of these,” as well as 1–2 months’ experience in “critical care and emergency medicine.” Additionally, all PGY2 residents are required to have Basic and Advanced Cardiac Life Support certifications, with the latter including instruction and application of ECC elements.
Instructional methods
Three educational elements were incorporated into each educational group: teaching, application, and assessment. The order of these elements differed between the groups, as described below. The study design is represented in Fig. 1.
Bootcamp day 1
At the beginning of their first day of bootcamp, all participants completed the “Experience with ECC survey,” adapted from a previous article [11], evaluating baseline experience levels both in simulation and in clinical practice. The Traditional education group and the RCDP education group then received instruction as described below.
RCDP education group
On their first day of bootcamp, the RCDP education group (n = 11) received education through a 2.5-h RCDP simulation session (elements: teaching and application), in which components of ECC, role assignment, closed-loop communication, and defibrillator operation were emphasized. Two facilitators, extensively trained in RCDP, immersive simulation, and debriefing with good judgment, led the session. At the completion of the session, the participants completed the survey (element: assessment).
Traditional education group
On their first day of bootcamp, the Traditional education group (n = 10) received education through a 90-min lecture delivered by the same 2 previously mentioned facilitators (element: teaching). This lecture covered ECC interventions, closed-loop communication, assigning roles, and operating a defibrillator. Afterwards, residents took part in a group, immersive simulation (element: application), in which they had the opportunity to apply the skills covered in the preceding lecture to two patients requiring ECC. Following the simulation, a reflective debriefing session was conducted, exploring residents’ motivations and actions and coaching them on any identified gaps. The simulation and debriefing sessions lasted 60 min and were led by the same trained facilitators who conducted the RCDP simulation. At the completion of the session, the participants completed the survey (element: assessment).
Bootcamp day 2
On their second day of bootcamp, after all subjects had received either Traditional or RCDP education on day 1, all participants, regardless of education group, engaged in an individual, immersive simulation in which the patient required ECC interventions. A reflective debriefing, led by the same facilitators mentioned previously, followed the immersive simulation, after which residents were asked to complete the “Satisfaction and Self-Confidence in Learning” survey [18]. The individual, immersive simulation sessions were video recorded to enable retrospective review.
After completing the individual simulation, members of the Traditional education group received the same RCDP session that the RCDP group had completed on their first day of bootcamp, taught by the same instructors. At the end of their second day of bootcamp, members of both groups completed a ranking survey, adapted from a previously published article [16], asking them to rank the bootcamp activities in order of usefulness, with 7 being the most useful and 1 being the least useful.
Simulation cases and performance assessment
In the individual simulation scenario, residents received a case stem stating that a patient was decompensating and needed their attention. Upon entering the room, residents were presented with a pulseless patient showing ventricular fibrillation on the vital sign monitor. Residents were expected to initiate ACLS, delegate roles to trained embedded participants portraying nursing staff, and successfully defibrillate the patient. Upon defibrillation, the patient transitioned to unstable supraventricular tachycardia, requiring two rounds of synchronized cardioversion before converting to normal sinus rhythm. The scenario ended when the patient converted to normal sinus rhythm or at 10 min, whichever came first. The participant then took part in a debriefing, followed by completion of the “Satisfaction and Self-Confidence in Learning” survey.
Through retrospective video review, a trained simulation educator blinded to participant identity and group scored residents’ performances using a 24-item instrument, 8 items of which were adapted from the American Heart Association’s “Megacode Testing Checklist” [19]. This instrument, detailed in an additional file (see Additional file 1), was used to assess each learner’s performance related to defibrillation, synchronized cardioversion, Basic Life Support measures, and communication.
Data management
All instruments were electronic and administered using the Research Electronic Data Capture (REDCap) system. Each participant was assigned a random participant code at the outset of the bootcamp. Each participant’s video recording was tagged with the participant’s code by a simulation expert (EB) other than the one who ultimately reviewed the recordings (LB). This ensured that the expert who analyzed each resident’s performance was blinded to the resident’s name and group. The participant codes also allowed for anonymous comparison of survey results and performance metrics.
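For readers who want a concrete picture of this de-identification step, the following Python sketch illustrates the general idea only; the study itself used REDCap, and the participant names, file names, and code-generation scheme below are hypothetical.

# Illustrative sketch only: the study used REDCap; names, files, and the code
# format here are hypothetical and simply demonstrate the blinding workflow.
import csv
import secrets

participants = ["Resident A", "Resident B", "Resident C"]  # placeholder identities

# Assign each participant a random, non-identifying code
code_map = {name: f"P{secrets.token_hex(3).upper()}" for name in participants}

# The linkage file (name -> code) stays with the person tagging the videos (EB);
# the blinded reviewer (LB) only ever sees the codes.
with open("code_linkage_restricted.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "participant_code"])
    writer.writerows(code_map.items())

# De-identified list shared with the blinded reviewer
with open("blinded_review_list.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["participant_code"])
    writer.writerows([code] for code in code_map.values())

Keeping the linkage file separate from the materials given to the reviewer is what preserves blinding while still allowing survey results and performance scores to be matched by code afterwards.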
Statistical analysis
Descriptive data were summarized using frequencies (percentages); continuous data were analyzed using t-tests. Because of the small group sizes, Fisher’s exact test was used for the analysis of dichotomous data. Each group’s mean rank order on the “Learners’ Experience Qualitative Tool” was analyzed using the Mann-Whitney U test. All data were exported and analyzed using SPSS version 24, with two-sided p values < 0.05 considered statistically significant. Corrections were not made for multiple comparisons.
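For illustration, the equivalent tests can be expressed in a short Python/SciPy sketch; the study’s analyses were performed in SPSS version 24, and all values below are made-up placeholders, not study data.

# Illustrative only: the study used SPSS v24. This sketch runs the same tests
# (t-test, Fisher's exact test, Mann-Whitney U) on placeholder data.
from scipy import stats

# Continuous outcome (e.g., a performance score) for two groups -- placeholder values
traditional_scores = [18, 20, 17, 19, 21, 16, 18, 20, 19, 17]
rcdp_scores = [19, 21, 20, 22, 18, 21, 20, 19, 22, 20, 21]

# Independent-samples t-test for continuous data (two-sided by default)
t_stat, t_p = stats.ttest_ind(traditional_scores, rcdp_scores)

# Fisher's exact test for a dichotomous outcome, given the small group sizes
# 2x2 table: rows = groups, columns = outcome achieved yes/no (placeholder counts)
odds_ratio, fisher_p = stats.fisher_exact([[7, 3], [9, 2]])

# Mann-Whitney U test for mean rank order on the ranking survey (placeholder ranks, 1-7 scale)
traditional_ranks = [5, 6, 4, 7, 5, 6, 4, 5, 6, 7]
rcdp_ranks = [6, 7, 5, 6, 7, 5, 6, 7, 6, 5, 7]
u_stat, mw_p = stats.mannwhitneyu(traditional_ranks, rcdp_ranks, alternative="two-sided")

print(f"t-test p={t_p:.3f}, Fisher p={fisher_p:.3f}, Mann-Whitney p={mw_p:.3f}")

Each p value would be interpreted against the two-sided 0.05 threshold described above, with no adjustment for multiple comparisons.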