
Online-synchronized clinical simulation: an efficient teaching-learning option for the COVID-19 pandemic time and beyond

Abstract

Face-to-face clinical simulation is a powerful methodology for teaching, learning, and research that has earned an important place in health science education. However, due to the COVID-19 pandemic, social distancing has forced universities to leave their simulation centers and turn to alternatives, such as distance simulation, that allow educational programs to continue safely for students and teachers through virtual environments. In Latin America, before the pandemic, the use of non-presential simulation was very limited and anecdotal. This article has three main objectives: to establish the efficacy of online-synchronized clinical simulation in the learning and performance of medical students in the management of patients with COVID-19 in simulation centers of three Latin American countries, to determine the quality of the online debriefing from the students’ perspective, and to deepen the understanding of how learning is generated with this methodology.

Introduction

Clinical simulation is a teaching, learning, evaluation, and research strategy that has achieved an important place in health science education [1, 2]. This educational methodology attempts to represent reality without putting patients at risk. It is constantly developed by working with learning theories, didactics, cognitive psychology, industrial engineering, technology, and human resources [3, 4].

During the coronavirus disease 2019 (COVID-19) pandemic [5], social distancing forced universities and training centers to close their classrooms and migrate to virtual environments [6]. An estimated 1.3 billion students have withdrawn from their daily academic routines in 186 countries, including all of Latin America [7]. According to UNESCO, this represents 70% of the worldwide student population [8]. However, the pandemic has accelerated the digital transformation of medical education, allowing students to review concepts and build knowledge through webinars and other virtual strategies without having to suspend classes or expose themselves to the risk of contagion. Still, this approach has limitations that are perceived by students, mainly those who should already be in clinical practice, possibly affecting their motivation to learn [9].

Non-presential simulation has been developed in the last decade with terms such as remote simulation [10, 11], online simulation, which can be synchronous or asynchronous [12], telesimulation [13, 14], among others. These have shown promising results in student satisfaction, concept learning, and psychomotor skill development when there is task trainer availability; nonetheless, there are still doubts regarding the technical feasibility, the logistical aspects and the way in which learning is generated with this methodology [15].

During the COVID-19 pandemic, non-presential simulation began to be used more frequently in Latin America in order to maintain the teaching-learning processes in medical schools; nevertheless, the limitations of virtual environments can have a negative impact on the learning of medical students in low-and middle-income countries, where the technological and connectivity resources available are possibly fewer due to the existing inequality [16].

In this article, we describe how we transformed the activities that we used to perform in face-to-face simulation in three Latin American simulation centers (briefing, simulated cases, and debriefing) into a synchronized online environment, and how we studied it through educational research.

The main objectives of the study were to evaluate the participants’ learning and performance in the diagnosis, treatment, and non-technical skills for the management of patients with COVID-19 during online simulation in real time. The secondary objectives were to determine the level of satisfaction of medical students and residents with the webinar-based education they received during the pandemic and their perception of learning with an online-synchronized simulation strategy. In addition, we evaluated the quality of the structured debriefing from the students’ perspective, as well as the simulation and debriefing times, the relationships between variables, and the comparison of results among the three participating countries.

Methods

Study design

We conducted a comparative before-and-after study with a mixed design, between 13 and 25 May 2020, in three Latin American clinical simulation centers (Colombia, Ecuador, and Mexico). A simulation-based educational intervention with cases related to COVID-19 was proposed in both the emergency room (ER) and the operating room (OR).

Sample and ethics

The sample consisted of 4th, 5th, and 6th year medical students as well as anesthesiology residents who had practiced at the participating simulation centers before the COVID-19 pandemic. The students who participated in this study came from the medical school of Tecnologico de Monterrey (Mexico), Alexander Von Humboldt University (Colombia), and Pontificia Universidad Catolica del Ecuador (Ecuador). From Colombia 55 students were invited and 49 attended (89%), from Mexico 33 students were invited and 33 attended (100%), and from Ecuador 38 were invited and 24 students attended (63%).

This research was approved by the research ethics committee of the VitalCare Clinical Simulation Center with registration # CEIC-005-05-2020.

Settings

The study was carried out in three Latin American simulation centers; in Mexico the Tecnologico de Monterrey’s center, in Ecuador the center of the Pontificia Universidad Catolica del Ecuador, and in Colombia the VitalCare Simulation Center.

Interventions

We summarized the simulated clinical cases and the structure of the activity in Table 1.

Table 1 Key elements of simulation-based research [17]

Cases

Six simulation cases related to COVID-19 were designed, two in each country:

Colombia

Case 1 (T1): Young woman with upper gastrointestinal bleeding due to NSAIDs. History of mild cough, headache, unquantified fever, and contact with a patient with severe respiratory symptoms. Case 2 (T2): Elderly male patient with respiratory distress, cough, fever, and anosmia, admitted to the ER in shock and acute respiratory failure.

Mexico

Case 1 (T1): 76-year-old man with heart disease and a two-week-old hip fracture, on treatment with ketorolac. He was admitted to the ER for abdominal pain, upper gastrointestinal bleeding, and unquantified fever. Case 2 (T2): 68-year-old man, diabetic, smoker, with multiple allergies. He was admitted to the ER in acute respiratory failure with high fever.

Ecuador

Case 1 (T1): 32-year-old woman, 40 weeks pregnant, admitted to the operating room for respiratory failure and loss of fetal well-being. Case 2 (T2): A 39-year-old woman, 36 weeks pregnant, admitted to the operating room with placental abruption and respiratory failure.

The connection was made through the Zoom® video-meeting platform (Zoom Video Communications, Inc., USA). Engineers operated the monitors of the Laerdal ALS® and SimMom® simulators and transmitted vital signs and images. The teachers were in charge of the briefing, conducting the debriefing, and evaluating the students. The confederates were responsible for staying in contact with the students during the simulations. Each case had a stage director who communicated with the patient and the nurse through the private Zoom chat (Fig. 1). The simulated patient monitors already present in our simulation centers before the pandemic were used.

Fig. 1 Example of elements of online-synchronized simulation

To avoid losing the session in the event of an internet connection fluctuation, several co-hosts were assigned so that if the host left the meeting, one of the participants could take over as the new meeting administrator.

Quantitative measures

Performance and learning behaviors

For the students’ performance evaluation during the simulated cases, we used a 9-point performance scale (1 to 9), where the minimum rating of 1 means the student shows very, very poor performance and the maximum rating of 9 means the student shows very, very good performance. This scale was designed by our group and validated by experts to evaluate the performance of clinical teams in previous research [19].

For the evaluation of cognitive engagement, we used the Interactive, Constructive, Active, and Passive (ICAP) framework, which is based on individuals’ behaviors toward the learning activity. A participant with passive behavior receives information without processing it; a participant with active behavior asks for information and shows interest in the task; a constructive participant reflects on the situation and contrasts the information; and an interactive participant shows high interest in the task, reflects and proposes solutions, interacts with peers, and explains the situation [20].

As previously stated, the teachers were in charge of evaluating the students’ performance and behavior. All of them were informed of the nature and objectives of the study, as well as of the tools used for evaluation. They held synchronous meetings through the Zoom® video conferencing platform to train in the use of the performance and learning-behavior assessment tools.

Participants’ satisfaction with the online simulation

This instrument consisted of two five-point satisfaction scales for participants to rate both the online activities based on conferences (webinars) and the online-synchronized simulation received.

Perception of learning

We developed a Likert-type survey of 20 statements with five options (1: totally disagree; 5: totally agree). The items were constructed from the learning objectives of the course and the expected behaviors for the activity. This instrument covered the perception of learning, teamwork, communication, and realism. We conducted a pilot of the cases and the scale; the scale was internally consistent, with a Cronbach’s alpha of 0.73.
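The internal-consistency check reported above can be reproduced with a few lines of code. Below is a minimal sketch of Cronbach's alpha (which compares the sum of per-item variances with the variance of the total scores); the 5-respondent, 4-item Likert matrix is fabricated for illustration and is not the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                               # number of items
    item_variances = items.var(axis=0, ddof=1)       # sample variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Fabricated 5-point Likert responses (5 respondents x 4 items)
responses = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
])

alpha = cronbach_alpha(responses)
print(f"Cronbach's alpha = {alpha:.2f}")
```

Values above roughly 0.7, as obtained in the pilot, are conventionally taken as acceptable internal consistency.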

Debriefing Assessment for Simulation in Healthcare (DASH)

The Debriefing Assessment for Simulation in Healthcare (DASH)® student short-form scale was used to assess the quality of the debriefing. This scale contains six elements that encompass the instructor’s behaviors: introduction to the simulation environment, an engaging context for learning, an organized debriefing structure, provoking reflection on performance, and identifying what was done well and poorly, which helps determine how to improve or sustain good performance. Elements are rated on a 7-point effectiveness scale (1: extremely ineffective/detrimental; 7: extremely effective/outstanding) [21]. The students were instructed in its use.

Qualitative measures

Two open-ended questions were asked for participants to express their views on the strengths (question A) and the weaknesses (question B) of the online-synchronized simulation.

Data collection

Performance evaluations and assessments of learning behaviors were collected in a Google Sheets® spreadsheet. The instruments on satisfaction, learning perception, quality of debriefing, and strengths and weaknesses of the synchronous online simulation were sent to the participants via Google Forms® (Google LLC, USA). The information was collected between May 13 and 25, 2020.

Statistics

Statistical analysis was performed in SPSS 26® (IBM, USA). The normality of the data distribution was evaluated with the Kolmogorov-Smirnov test; qualitative variables were summarized with proportions, and quantitative variables with measures of central tendency and dispersion. Qualitative variables were compared with the chi-square test. Pretest and posttest scores were compared with the Wilcoxon test. Relationships between variables were calculated using Spearman’s rho correlation coefficient. Statistical significance was set at p < 0.05.
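The analysis pipeline above can be sketched as follows. This is an illustration only: the study used SPSS, and the paired pre/post scores below are fabricated values on the 1-9 performance scale, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Fabricated paired pre/post performance scores (1-9 scale) for 30 participants;
# posttest scores tend to be higher, mimicking a before-and-after improvement.
pre = rng.integers(2, 6, size=30).astype(float)
post = np.clip(pre + rng.integers(0, 4, size=30), 1, 9)

# Normality check: Kolmogorov-Smirnov of the standardized pretest vs a normal
ks_stat, ks_p = stats.kstest((pre - pre.mean()) / pre.std(ddof=1), "norm")

# Paired nonparametric comparison of pretest vs posttest: Wilcoxon signed-rank
w_stat, w_p = stats.wilcoxon(pre, post)

# Association between two ordinal measures: Spearman's rho
rho, rho_p = stats.spearmanr(pre, post)

print(f"KS p={ks_p:.3f}, Wilcoxon p={w_p:.4f}, Spearman rho={rho:.2f}")
```

With a consistent pre-to-post improvement, the Wilcoxon test yields p < 0.05, matching the significance threshold used in the study.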

Qualitative analysis

Atlas.ti v8.1 (Scientific Software Development GmbH, Germany) was used for the qualitative analysis of two open-ended questions on the strengths and weaknesses of the online-synchronized simulation. We carried out a thematic analysis [22]. Two authors (DAD-G, AR-Z) first read the texts several times; we then performed a general coding of possible themes, labeling the initial themes with colors corresponding to known categories (e.g., teamwork, communication, learning, and realism). In the final phase, we reviewed the codes and reorganized the themes: some codes were discarded and the themes regrouped until the final ones were selected, which were then exported to a spreadsheet to be summarized as proportions.

Results

Sample

One hundred and six medical students participated in the study: 49 from Colombia (46.2%), 33 from Mexico (31.1%), and the remaining 24 (22.6%) from Ecuador. The median age was 23 years (IQR: 22–26), and 51.9% were men. Regarding academic level, 34.9% were fourth-year medical students, 38% fifth-year, and 4.7% sixth-year; 21.7% were anesthesiology residents.

Times

Fourteen online-synchronized simulation (OSSim) sessions were performed, with a total duration of 25.1 h and a mean of 102.7 min per session. In each session, two clinical cases were executed with their respective structured briefing and debriefing. The ratio of debriefing time to simulation time (D/S index) was 1.33. Table 2 presents the times of the educational activities.
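The D/S index is simply the ratio of total debriefing time to total simulation time. A small worked example with hypothetical per-session minutes (assumed values, not the study's logs):

```python
# Hypothetical minutes per phase for one OSSim session with two cases
sessions = [
    {"simulation": 18, "debriefing": 25},  # case 1
    {"simulation": 22, "debriefing": 28},  # case 2
]

total_sim = sum(s["simulation"] for s in sessions)
total_deb = sum(s["debriefing"] for s in sessions)
ds_index = total_deb / total_sim  # debriefing-to-simulation time ratio

print(f"D/S index = {ds_index:.2f}")
```

An index above 1, as found in this study, means more time was spent debriefing than running the simulated cases.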

Table 2 Educational activities time (minutes)

Participants’ satisfaction with the online simulation

The satisfaction score for online education (webinars) during the COVID-19 pandemic was lower than that for online-synchronized simulation: 3 (IQR: 3–4) vs 5 (IQR: 4–5). Differences by country were found: satisfaction with online education was lower in Colombia (p < 0.001), and satisfaction with the online simulation was lower in Mexico (p = 0.021).

Performance and learning behaviors

No difference in performance was found by sex; however, a statistically significant difference was found by educational level, with anesthesiology residents performing better both before and after the intervention (p < 0.05). The before-and-after performance comparison is summarized in Table 3. Cognitive engagement was passive (10.4%), active (11.3%), constructive (34.9%), and interactive (43.4%).

Table 3 Performance in COVID-19 simulated cases 1 and 2 (N: 106)

Perception of learning

Out of the 106 participants, 100 answered the survey (94.3%). A high agreement level was found with the OSSim inventory in all of its items (Table 4). This instrument showed a good internal consistency with Cronbach’s alpha of 0.87.

Table 4 Agreement proportion to online-synchronized simulation (N: 100)

The questions were grouped into four categories: Realism, Learning, Non-technical Skills Training (NTS), and Active Learning Strategy (ALS). The answers were grouped into three agreement levels: low, middle, and high. The level of agreement was mainly high: Realism (88%), Learning (89%), NTS training (94%), ALS (95%).

No statistically significant difference was found for age, sex, or educational level. Differences were found by country in the perception of realism (p = 0.030) and of learning obtained (p = 0.037), both lower in Mexico, but not in the perception of non-technical skills training (p = 0.12) or active learning (p = 0.8).

Debriefing assessment

The evaluation of the debriefing’s quality was high (Table 5). No significant differences by sex or age were found. Fourth-year students and resident physicians rated element 1 higher than fifth- and sixth-year students (p = 0.023). In the analysis by country, the scores in Colombia were higher for element 1 (p < 0.001), element 2 (p = 0.04) and element 3 (p = 0.033). No statistically significant difference was found for the other elements.

Table 5 Debriefing Assessment for Simulation in Healthcare (DASH) (N: 100)

Bivariate analysis

In the bivariate analysis, a strong positive correlation was found between cognitive engagement and the categories related to simulation-based learning, strongest with realism (p < 0.001). Another correlation was found between cognitive engagement and performance, strongest with communication (p < 0.001), and an intermediate positive correlation was found between cognitive engagement and improvement in situational awareness and treatment. Table 6 summarizes the correlations (Spearman’s rho).

Table 6 Bivariate Analysis (N: 100)

Qualitative analysis

Open and selective coding of the texts written by participants (N: 100) was carried out. Regarding strengths, 12 codes representing the students’ thinking were found, with 256 quotations. Forty-one percent of the students highlighted realism, and 36% social interaction. The most frequent code co-occurrences were between perception of realism and real-time interaction, realism and theory-practice integration, and realism and the opportunity to carry out social practice (Table 7).

Table 7 Proportions of online-synchronized clinical simulation strengths (N: 100)

Regarding the weaknesses of online-synchronized clinical simulation, 36.4% of students noted intermittent communication due to platform saturation when several people spoke at the same time, 35% described dependence on internet speed, and 32.3% considered the lack of practice of motor skills (such as orotracheal intubation and donning and doffing of personal protective equipment) a limitation (Table 8). No strong co-occurrences were found.

Table 8 Proportions of online-synchronized clinical simulation weaknesses (N: 100)

Discussion

In the current study, the results demonstrated a low level of satisfaction among medical students with education in virtual settings based exclusively on webinars through conference platforms. This was the dominant strategy at the beginning of the COVID-19 pandemic to maintain teaching-learning processes and knowledge construction despite social distancing [8, 23]. In contrast, a high level of satisfaction with learning was found with the online-synchronized clinical simulation. The latter can be explained by the qualitative analysis of the students’ discourse, since this strategy allowed them to interact, engage in social practice, make decisions, and integrate theory with practice. Online-synchronized simulation has characteristics that make it a social practice [24].

An interesting finding with this interactive methodology was the time needed to achieve the learning objectives. Online-synchronized simulation requires more time for case development than we used in face-to-face simulation. The debriefing time was similar to that used in the simulation center; however, the debriefing-to-simulation ratio was lower than that found in other studies [25]. This could be because participants described what they were doing during the online simulation and took turns to speak, which lengthened the simulated cases.

It is possible that clinical simulation is superior to traditional passive educational practices for developing skills and integrating learning [26,27,28], as it has had an essential technological advance to emulate clinical environments [29]. Nonetheless, the evidence is not conclusive that with more fidelity of the simulators, more learning is achieved [30]. Similar results were found in other studies with online simulation [12, 13].

In our study, the perception of realism was high; we think that this was due to the great social interaction in real-time with peers, standardized patients, staff, the immediate feedback shown on the hemodynamic monitoring, and the complementary diagnostic aids, which were favored by the briefing and structured debriefing.

The posttest learning levels were high, which corresponded to the perception of learning, this is perhaps more related to constructive and interactive cognitive engagement, social interaction, and the environment created by the instructors during the briefing and debriefing [31]. In a similar manner, we think that despite the fact that the second case was different from the first, maintaining the same theme and structure of the simulation (briefing ➔ simulated case ➔ debriefing) allowed greater comfort for students, which might be involved in a better performance.

Patel et al. conducted a similar study with 53 anesthesiology residents. Knowledge was evaluated with pre- and post-tests, and satisfaction with the activity through a survey. They found improvements in learning and satisfaction with simulated online activities, with the biggest downside of telesimulation being audio quality. All of the above is consistent with our findings [32].

During this pandemic, human factors have been shown to either endanger or protect health care workers [33,34,35]. Mastery of non-technical skills such as communication, situational awareness, leadership, and teamwork, as well as management of cognitive load, are determining factors for success [19]. The level of performance in the initial evaluation (T1) was low, which is largely explained by the novelty of the disease, the lack of knowledge of the safe technique for using personal protective equipment (PPE), and the scarce formal curricular insertion of non-technical skills at the undergraduate level.

The declarative components of knowledge regarding safe airway management and correct PPE usage, along with mastery of communication strategies and situational awareness, improved significantly in the second simulated case. We attribute this to changes in the participants’ conceptual model of biosafety: using checklists for donning and doffing, distributing attention, applying the “pause and think” strategy, calling for the sterile cockpit, and improving the closed communication loop.

The online debriefing (teledebriefing) in this work received a very good rating from the students. We believe this result is mainly due to two factors: first, the debriefing experience of the leaders of each simulation center, and second, the structure of the activity, since we decided to use the same dynamics that already existed and had succeeded in face-to-face simulation before the COVID-19 pandemic. We managed to build a safe learning environment for the participants and kept their interest during the course. It is possible that the participants’ perception of their own learning and performance also influenced their rating of the debriefing. This result is consistent with the study by Ahmed et al. [36], who carried out teledebriefing evaluated with the student version of the DASH scale, with satisfactory results.

Limitations and strengths

This study has some limitations. In the design, the lack of a control group and of randomization may decrease internal validity. From a technical standpoint, dependence on internet quality could be involved in the low cognitive engagement of some participants; even so, the evaluation of the activity was high. A fundamental limitation was that only the declarative aspect of the procedures could be worked on. Regarding strengths, this was a multicenter, multinational study; its sample was larger than that of similar studies; and the internal consistency of the instruments used to collect the information was high.

The limitations of this work can be addressed in future studies with a multidisciplinary sample, with more countries participating, and with a performance evaluation after the online simulation is done in the simulation centers.

Conclusion

Although the COVID-19 pandemic has imposed social distancing and online conference-based education, students’ satisfaction with the latter tends to decrease. Online-synchronized simulation is an active, social learning activity that enables the training and development of non-technical skills and improves the declarative knowledge of medical students without increasing costs or sacrificing the learners’ perception of realism; it is an efficient alternative for teaching and learning in the health sciences in the new normalcy. For this, it is essential to perform an adequate briefing, allocate more time for the cases, and carry out a structured debriefing. That said, it is advisable to complement the procedural aspects in a face-to-face modality in the simulation centers, with appropriate biosafety protocols.

Availability of data and materials

The anonymized data used for the analysis of the present study are available from the corresponding author on reasonable request.

Abbreviations

ALS:

Active learning strategy

COVID-19:

Coronavirus disease 2019

DASH:

Debriefing assessment for simulation in healthcare

ER:

Emergency room

ICAP:

Interactive, constructive, active, passive framework

NTS:

Non-technical skills

OR:

Operating room

OSSim:

Online-synchronized simulation

PPE:

Personal protective equipment

References

  1. 1.

    La Cerra C, Dante A, Caponnetto V, et al. Effects of high-fidelity simulation based on life-threatening clinical condition scenarios on learning outcomes of undergraduate and postgraduate nursing students: a systematic review and meta-analysis. BMJ Open 2019;9:e025306. https://doi.org/10.1136/bmjopen-2018-025306.

  2. 2.

    Beal MD, Kinnear J, Anderson CR, Martin TD, Wamboldt R, Hooper L. The effectiveness of medical simulation in teaching medical students critical care medicine. Simul Healthc. 2017;12(2):104–16. https://doi.org/10.1097/SIH.0000000000000189.

    Article  PubMed  Google Scholar 

  3. 3.

    Díaz-Guio DA, Ruiz-Ortega FJ. Relationship among mental models , theories of change , and metacognition : structured clinical simulation. Colomb J Anesthesiol 2019;47(14):113–116. Available from: http://dx.doi.org/https://doi.org/10.1097/CJ9.0000000000000107

  4. 4.

    Nestel D, Bearman M. Theory and Simulation-Based Education: Definitions, Worldviews and Applications. Clin Simul Nurs. 2015;11(8):349–354. Available from: http://dx.doi.org/https://doi.org/10.1016/j.ecns.2015.05.013

  5. 5.

    Zhu N, Zhang D, Wang W, Li X, Yang B, Song J, et al. A Novel Coronavirus from Patients with Pneumonia in China, 2019. N Engl J Med. 2020;382(8):727–33. https://doi.org/10.1056/NEJMoa2001017.

    CAS  Article  PubMed  PubMed Central  Google Scholar 

  6. 6.

    Gottlieb M, Landry A, Egan DJ, Shappell E, Bailitz J, Horowitz R, et al. Rethinking Residency Conferences in the Era of COVID-19. AEM Educ Train. 2020;4(3):313–7. https://doi.org/10.1002/aet2.10449.

    Article  PubMed  PubMed Central  Google Scholar 

  7. 7.

    Li, C; Lalani F. The COVID-19 pandemic has changed education forever. This is how [Internet]. World Economic Forum. 2020 [cited 2020 Aug 11]. Available from: https://www.weforum.org/agenda/2020/04/coronavirus-education-global-covid19-online-digital-learning/

  8. 8.

    UNESCO. COVID-19 Educational Disruption and Response [Internet]. 2020 [cited 2020 Aug 11]. Available from: https://en.unesco.org/covid19/educationresponse

  9. 9.

    Abedi M, Abedi D. A letter to the editor: the impact of COVID-19 on intercalating and non-clinical medical students in the UK, Med Educ Online. 2020;25:1:1771245. https://doi.org/10.1080/10872981.2020.1771245.

  10. 10.

    Laurent DA, Niazi A, Cunningham M, Jaeger M, Abbas S, Mcvicar J, et al. A valid and reliable assessment tool for remote simulation-based ultrasound-guided regional anesthesia. Reg Anesth Pain Med. 2014;39(6):496–501. https://doi.org/10.1097/AAP.0000000000000165.

    CAS  Article  Google Scholar 

  11. 11.

    LeFlore JL, Sansoucie DA, Cason CL, Aaron A, Thomas PE, Anderson M. Remote-controlled distance simulation assessing neonatal provider competence: A feasibility testing. Clin Simul Nurs 2014;10(8):419–424. Available from: http://dx.doi.org/https://doi.org/10.1016/j.ecns.2014.04.004

  12. 12.

    Cant RP, Cooper SJ. Simulation in the Internet age: The place of Web-based simulation in nursing education: An integrative review. Nurse Educ Today 2014;34(12):1435–1442. Available from: http://dx.doi.org/https://doi.org/10.1016/j.nedt.2014.08.001

  13. 13.

    McCoy CE, Sayegh J, Rahman A, Landgorf M, Anderson C, Lotfipour S. Prospective Randomized Crossover Study of Telesimulation Versus Standard Simulation for Teaching Medical Students the Management of Critically Ill Patients. AEM Educ Train. 2017;1(4):287–92. https://doi.org/10.1002/aet2.10047.

    Article  PubMed  PubMed Central  Google Scholar 

  14. 14.

    Jewer J, Parsons MH, Dunne C, Smith A, Dubrowski A. Evaluation of a mobile telesimulation unit to train rural and remote practitioners on high-acuity low-occurrence procedures: Pilot randomized controlled trial. J Med Internet Res. 2019;21(8):1–17.

    Article  Google Scholar 

  15. 15.

    Hayden EM, Khatri A, Kelly HR, Yager PH, Salazar GM. Mannequin-based Telesimulation: Increasing Access to Simulation-based Education. Acad Emerg Med. 2018;25(2):144–7. https://doi.org/10.1111/acem.13299.

    Article  PubMed  Google Scholar 

  16. 16.

    Alves Bastos E Castro M, Lucchetti G. Simulation in Healthcare Education During and After the COVID-19 Pandemic. Simul Healthc. 2020;15(4):298–9. https://doi.org/10.1097/SIH.0000000000000492.

    Article  PubMed  Google Scholar 

  17. 17.

    Cheng A, Kessler D, Mackinnon R, Chang TP, Nadkarni VM, Hunt EA, et al. Reporting guidelines for health care simulation research: extensions to the CONSORT and STROBE statements. Adv Simul. 2016;1(1):1–13.

    Article  Google Scholar 

  18. 18.

    Rudolph JW, Simon R, Dufresne RL, Raemer DB. There’s no such thing as “nonjudgmental” debriefing: a theory and method for debriefing with good judgment. Simul Healthc. 2006;1(1):49–55. https://doi.org/10.1097/01266021-200600110-00006.

    Article  PubMed  Google Scholar 

  19. 19.

    Díaz-Guio DA, Ricardo-Zapata AJ, Ospina-Velez J, Gómez-Candamil G, Mora-Martinez S, Rodriguez-Morales A. Cognitive load and performance of health care professionals in donning and doffing PPE before and after a simulation-based educational intervention and its implications during the COVID-19 pandemic for biosafety. Infez Med. 2020;28(Suppl 1):111–7 Available from: https://pubmed.ncbi.nlm.nih.gov/32532947/.

    PubMed  Google Scholar 

  20. 20.

    Chi MTH, Wylie R. The ICAP Framework: Linking Cognitive Engagement to Active Learning Outcomes. Educ Psychol. 2014;49(4):219–43. https://doi.org/10.1080/00461520.2014.965823.

    Article  Google Scholar 

  21. 21.

    Simon R, Raemer DBRJ. Debriefing Assessment for Simulation in Healthcare (DASH)© – Student Version, Short Form - Spanish [Internet]. Boston, Massachusetts: Center for Medical Simulation; 2010. Available from: https://harvardmedsim.org/wp-content/uploads/2017/01/DASH.SV.Short.2010.Final.pdf

    Google Scholar 

  22. 22.

    Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101. https://doi.org/10.1191/1478088706qp063oa.

    Article  Google Scholar 

  23. 23.

    Almarzooq ZI, Lopes M, Kochar A. Virtual Learning During the COVID-19 Pandemic: A Disruptive Technology in Graduate Medical Education. J Am Coll Cardiol. 2020;75(20):2635–8. https://doi.org/10.1016/j.jacc.2020.04.015.

    CAS  Article  PubMed  PubMed Central  Google Scholar 

  24. 24.

    Dieckmann P, Gaba D, Rall M. Deepening the theoretical foundations of patient simulation as social practice. Simul Healthc. 2007;2(3):183–93. https://doi.org/10.1097/SIH.0b013e3180f637f5.

  25.

    Aghera A, Emery M, Bounds R, Bush C, Stansfield B, Gillett B, et al. A Randomized Trial of SMART Goal Enhanced Debriefing after Simulation to Promote Educational Actions. West J Emerg Med. 2018;19(1):112–20. https://doi.org/10.5811/westjem.2017.11.36524.

  26.

    Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. BEME: Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10–28. https://doi.org/10.1080/01421590500046924.

  27.

    Warren JN, Luctkar-Flude M, Godfrey C, Lukewich J. A systematic review of the effectiveness of simulation-based education on satisfaction and learning outcomes in nurse practitioner programs. Nurse Educ Today. 2016;46(c):99–108. https://doi.org/10.1016/j.nedt.2016.08.023.

  28.

    Cifuentes-Gaitán MJ, González-Rojas D, Ricardo-Zapata A, Díaz-Guio DA. Transferencia del aprendizaje de emergencias y cuidado crítico desde la simulación de alta fidelidad a la práctica clínica [Transfer of emergency and critical care learning from high-fidelity simulation to clinical practice]. Acta Colomb Cuid Intensivo. 2020;21(1):17–21. https://doi.org/10.1016/j.acci.2020.06.001.

  29.

    Stokes-Parish JB, Duvivier R, Jolly B. Investigating the impact of moulage on simulation engagement — A systematic review. Nurse Educ Today. 2018;64(January):49–55. https://doi.org/10.1016/j.nedt.2018.01.003.

  30.

    Sherwood RJ, Francis G. The effect of mannequin fidelity on the achievement of learning outcomes for nursing, midwifery and allied healthcare practitioners: Systematic review and meta-analysis. Nurse Educ Today. 2018;69:81–94. https://doi.org/10.1016/j.nedt.2018.06.025.

  31.

    Rudolph JW, Raemer DB, Simon R. Establishing a safe container for learning in simulation: the role of the presimulation briefing. Simul Healthc. 2014;9(6):339–49. https://doi.org/10.1097/SIH.0000000000000047.

  32.

    Patel SM, Miller CR, Schiavi A, Toy S, Schwengel DA. The sim must go on: adapting resident education to the COVID-19 pandemic using telesimulation. Adv Simul. 2020;5(1):1–11.

  33.

    Cook TM. Personal protective equipment during the COVID-19 pandemic - a narrative review. Anaesthesia. 2020;75(7):920–7. https://doi.org/10.1111/anae.15071.

  34.

    Díaz-Guio DA, Díaz-Guio Y, Pinzón-Rodas V, Díaz-Gomez AS, Guarín-Medina JA, Chaparro-Zúñiga Y, et al. COVID-19: Biosafety in the Intensive Care Unit. Curr Trop Med Rep. 2020;7:104–111. https://doi.org/10.1007/s40475-020-00208-z.

  35.

    Sorbello M, El-Boghdadly K, Di Giacinto I, Cataldo R, Esposito C, Falcetta S, et al. The Italian COVID-19 outbreak: experiences and recommendations from clinical practice. Anaesthesia. 2020;75(6):724–32. https://doi.org/10.1111/anae.15049.

  36.

    Ahmed RA, Atkinson SS, Gable B, Yee J, Gardner AK. Coaching from the Sidelines: Examining the Impact of Teledebriefing in Simulation-Based Training. Simul Healthc. 2016;11(5):334–9. https://doi.org/10.1097/SIH.0000000000000177.

Acknowledgements

This work was possible thanks to the voluntary collaboration of undergraduate and graduate medical students, and the hard work of the technical and teaching staff of the participating centers. Thank you very much to all.

Funding

This research did not receive any funding.

Author information

Contributions

All authors participated in the design of the study, discussed the results, and contributed to the writing of the manuscript. All authors read and approved the final manuscript.

Authors’ information

DAD-G: He is a professor of intensive care medicine, director of the VitalCare simulation center, and vice president of the Latin American Federation of Clinical Simulation (FLASIC).

ER-B: She is the national director of simulation centers at Tecnológico de Monterrey, and secretary of the Latin American Federation of Clinical Simulation (FLASIC).

PAS-R: He is a professor of anesthesiology at the Catholic University of Ecuador.

SMM: He is an intern at the VitalCare simulation center.

ASD-G: She is an intern at the VitalCare simulation center.

JAM-E: He is the operations coordinator of the simulation center at Tecnológico de Monterrey.

ABA: He is a professor of Internal Medicine at Tecnológico de Monterrey.

MNA: Resident of anesthesiology at the Catholic University of Ecuador.

ARZ: She is a professor at VitalCare simulation center.

AJR-M: He is a professor and senior lecturer of Medicine at the Fundación Universitaria de las Américas and the VitalCare simulation center, and director of the Latin American Network of COVID-19 (LANCOVID).

Corresponding author

Correspondence to Diego Andrés Díaz-Guio.

Ethics declarations

Ethics approval and consent to participate

This research was approved by the research ethics committee of the VitalCare Clinical Simulation Center with registration # CEIC-005-05-2020. All participants gave their informed consent to participate in the study.

Consent for publication

The image included in this work does not allow the identification of individuals; nevertheless, written authorization for its publication was obtained.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Díaz-Guio, D.A., Ríos-Barrientos, E., Santillán-Roldan, P.A. et al. Online-synchronized clinical simulation: an efficient teaching-learning option for the COVID-19 pandemic time and beyond. Adv Simul 6, 30 (2021). https://doi.org/10.1186/s41077-021-00183-z

Keywords

  • COVID-19
  • SARS-CoV-2
  • Learning
  • Clinical simulation
  • Telesimulation
  • Teledebriefing
  • Human factors
  • Latin America