
Characterization of simulation centers and programs in Latin America according to the ASPIRE and SSH quality criteria

Abstract

Background

Clinical simulation in Latin America has developed considerably; however, no studies have characterized simulation centers and programs across the entire region. The aims of this work are to characterize the current state of simulation-based education in the health sciences, to determine the structure of Latin American simulation centers in terms of teaching, research, and continuing medical education (CME), and to determine the perception of quality, based on international standards of simulation practice, among the directors of Latin American centers.

Methods

A quantitative, descriptive, cross-sectional study was conducted, in which a demographic questionnaire and a Likert-type survey were administered to the directors of the simulation centers identified in Latin America.

Results

Four hundred eight simulation centers were documented; the survey was answered by 240 directors, and 149 provided complete responses to the 42-item quality self-perception scale that were considered valid for the further analyses related to program quality. Most of the responding centers are in Chile, Brazil, and Mexico (37.5%, 18.1%, and 12.7%, respectively). Of the centers, 84% are university-based, 71% are medium-sized, and 54% have fewer than 10 instructors. The directors are mostly women (61.7%), medical doctors (50%), and nurses (40%), with clinical specialization (37%), master’s degrees (53%), and doctorates (13%); 75% have completed a simulation instructor course, and 6% have completed a fellowship. Most consider the maintenance of international quality standards to be relevant in their centers, mainly regarding reflective training techniques, ethical aspects, and adequate learning environments.

Conclusions

Simulation-based education in the health sciences has grown steadily in Latin America, largely within university settings and through a notable process of academic specialization that seeks to adhere to high-quality standards to improve training and the development of clinical skills, human factors, and critical thinking. We recommend starting accreditation processes in Latin America, together with studies that measure the quality of simulation-based education in the region based on objective observation rather than self-reporting.

Background

Latin America is a region of the American continent whose languages, derived from Latin, are mainly Spanish and Portuguese. This territory is made up of approximately 20 nations covering about 22,000,000 km², where about 626 million people live, with great ethnic, cultural [1], economic, and public education financing diversity [2].

Latin America is different from regions in which clinical simulation training and research criteria or recommendations are available for simulation-based education such as Europe, the UK, the USA, and Canada [3,4,5].

Simulation has been reported in Latin America as a teaching tool in prelicensure [6, 7], postgraduate [8], and cardiopulmonary resuscitation programs [9], and as an assessment tool inside OSCEs (Objective and Structured Clinical Examination) [10,11,12]. Information regarding simulation centers in Latin America is scarce [13, 14]. There are no studies that characterize simulation centers and their programs or quality.

The quality of clinical simulation occupies an important part of the agenda of scientific societies in Europe and North America, focusing both on standards and recommendations of good practice [15] and on accreditation criteria to measure quality. The accreditation criteria for simulation centers of the Society for Simulation in Healthcare (SSH), which consider elements of systems integration [3], the criteria of the Association for Medical Education in Europe (AMEE) for accreditation of programs [4], the standards for simulation training developed by the International Nursing Association for Clinical Simulation and Learning (INACSL) [5], and those of the Association of Standardized Patient Educators (ASPE) [16] are examples of quality standards that do not exist for Latin America.

Currently, in Latin America, there is a need to work on standardized aspects of the quality of simulation-based education; nonetheless, to date there has been no consensus instrument to measure the quality of our centers and programs.

The aims of this work are to characterize the current state of simulation-based education in the health sciences, to determine the structure of Latin American simulation centers in terms of teaching, research, and continuing medical education (CME), and to determine the perception of quality, based on international standards of simulation practice, among the directors of Latin American centers.

Methods

Study design

A quantitative, descriptive, cross-sectional study using an online self-report instrument was conducted.

Setting and participants

For this study, we considered a simulation center or simulation program to be an organization with dedicated resources and a mission targeted to the use of simulation for education, assessment, or research, and that uses a substantial component of simulation as a technique [3].

Directors of simulation centers in Spanish- and Portuguese-speaking Latin American countries were asked to respond on behalf of their centers.

A purposive sample was selected from the defined population. A database was created with the contact emails that appeared on institutional websites and was complemented by a snowball sampling technique to cover as much of the target population as possible in cases where a contact email was not available on the institutional website. To include centers without a website, we used information available from congress contacts. Once the database was constructed with the information known to the authors, we expanded it by identifying simulation center directors in each country, from whom we requested contact information for centers not yet included in the database. No databases of national societies or the Latin American federation were used.

We also used available data on population and on the percentage of Gross Domestic Product (GDP) spent on education and health in Latin American countries, obtained from public information on the CEPAL website for 2018 [17].

Instrument development

A group of researchers (two nurses and five medical doctors) trained in simulation, with experience in administering simulation centers and in research, belonging to the Federación Latinoamericana de Simulación Clínica (FLASIC, www.flasic.org), constituted a committee to design the research protocol and develop the instruments.

Based on a literature review that included the accreditation criteria of SSH and ASPIRE, surveys used to characterize simulation centers worldwide [18,19,20], surveys of centers in single European [21, 22] and Latin American [14] countries, and definitions for reporting simulation center resources and activities [23], a two-part bilingual instrument (Spanish and Portuguese) was designed [24].

The SSH criteria were selected because they include the systems integration criteria needed to characterize simulation centers in clinical institutions, and the ASPIRE criteria because they are based on elements of medical education. In addition, both SSH and ASPIRE have center and program accreditation processes. The survey includes the main criteria of both SSH and ASPIRE.

The first part is a characterization questionnaire with 19 questions focusing on center information (country, type of institution, infrastructure, activity metrics, human resources and director profile, simulation resources) and program orientations. Regarding center size, centers were classified into groups according to the floor area reported (small < 122 m², medium 122–1500 m², large 1500–2900 m², very large > 2900 m²). For the activity metrics used in the questionnaire (number of participants per year, hours per participant, number of activities, and number of hours of room use), an explanation of how to calculate them was included [23].
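To make the classification rule concrete, the following minimal sketch (our own illustration, not code from the study; the function name, example value, and handling of the boundary values are assumptions) assigns a size category from a center's reported floor area using the cut-offs above.

```python
# Illustrative sketch of the size classification described above.
# Boundary handling at exactly 1500 and 2900 m2 is our assumption,
# since the reported ranges overlap at those values.
def size_category(area_m2: float) -> str:
    """Classify a simulation center by its reported floor area in square meters."""
    if area_m2 < 122:
        return "small"
    elif area_m2 <= 1500:
        return "medium"
    elif area_m2 <= 2900:
        return "large"
    else:
        return "very large"

print(size_category(350))  # a center reporting 350 m2 -> "medium"
```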

The second part was developed through a modified e-Delphi method based on the opinions of six Latin American experts to assess the self-perception of quality. The expert profile was a professional with more than 7 years of experience and postgraduate training in educational sciences and clinical simulation who had experience in administering simulation centers and training instructors.

A first draft of the second part of the instrument was created in Spanish with six dimensions and 42 items based on the accreditation criteria for SSH [3] and ASPIRE [4] simulation centers. This version passed through a three-step iterative creation process (e-Delphi) until it reached a complete consensus. The semantics, wording, and spelling were adjusted at the first Delphi step. The following stages did not generate changes in the instrument. As a result of that process, a preliminary Spanish version of this questionnaire was obtained [24]. The revised questionnaire maintained the initial number of dimensions and items.

The instrument was translated into Portuguese by a researcher who is a native Portuguese speaker proficient in Spanish. An independent backward translation from Portuguese to Spanish was performed to corroborate the first. Finally, to ensure semantic and cultural equivalence for the study, the Portuguese version was reviewed by two simulation instructors whose native language is Portuguese [26, 27].

The final bilingual instrument was composed of a 19-item demographic questionnaire and a 42-item Likert-type quality self-perception questionnaire (1 = totally disagree to 5 = totally agree).

The average Likert rating per item and per dimension was calculated. A score was also calculated for each dimension and for the total scale by adding the scores of the corresponding items.
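A minimal sketch of this scoring scheme is shown below (our own illustration, not the authors' code). The respondent data and the mapping of items to dimensions are hypothetical; the actual instrument groups its 42 items into six dimensions.

```python
import pandas as pd

# Hypothetical responses: rows = respondents, columns = items q1..q42,
# values = Likert ratings from 1 (totally disagree) to 5 (totally agree).
responses = pd.DataFrame({f"q{i}": [4, 5, 3] for i in range(1, 43)})

# Hypothetical item-to-dimension mapping (simplified to three groups here;
# the real instrument has six dimensions).
dimensions = {
    "ASPIRE": [f"q{i}" for i in range(1, 15)],
    "SSH_core": [f"q{i}" for i in range(15, 29)],
    "SSH_teaching_learning": [f"q{i}" for i in range(29, 43)],
}

item_means = responses.mean()  # average Likert rating per item

# Average Likert rating and summed score per dimension
dimension_means = {d: responses[cols].values.mean() for d, cols in dimensions.items()}
dimension_scores = {d: responses[cols].sum(axis=1) for d, cols in dimensions.items()}

total_score = responses.sum(axis=1)  # total scale score per respondent
```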

Data collection

This study was carried out between January and May 2019. The survey was hosted on the Survey Monkey® platform (https://es.surveymonkey.com/) and sent by email to the directors of the simulation centers in Latin America, with monthly reminders (four email reminders). There were no incentives for participation or completion of the survey.

Statistical analysis

Descriptive statistics were used to characterize the sample. Cronbach’s alpha was calculated as a measure of the internal consistency of the quality self-perception questionnaire (acceptable threshold: alpha > 0.70) [25, 28]. The 42-item self-perception questionnaire showed good internal consistency, with a Cronbach’s alpha of 0.95.
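As an illustration of this internal consistency measure, the sketch below (our own, not the study's analysis code, with made-up ratings) computes Cronbach's alpha for a respondents-by-items matrix of Likert scores.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a 2-D array: rows = respondents, columns = items."""
    k = items.shape[1]                               # number of items
    item_variances = items.var(axis=0, ddof=1)       # sample variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical data: 5 respondents answering 4 Likert items.
ratings = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
print(round(cronbach_alpha(ratings), 2))
```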

The statistical package IBM SPSS Statistics for Windows, version 23.0 (IBM Corp., Armonk, NY; released 2015), was used. The correlation analysis with demographic and economic data was performed with Microsoft Excel (version 16.52).

Ethical approval

Participants’ informed consent was obtained. The research design was approved by the ethics committees of the Universidad del Desarrollo, Chile (CEI 46/2018) and the Federal University of Santa Catarina, Brazil (Parecer do Comitê de Ética N° 3.206.561).

Results

Characteristics of Latin American simulation centers

Four hundred eight directors of simulation centers in Latin America were contacted. The number of centers per country ranged from 1 to 136.

Using CEPAL information about countries’ population, education GDP, and health GDP, we graphed their relationship with the number of simulation centers. There is a positive linear correlation between the number of centers and population (correlation coefficient 0.922) (Fig. 1).

Fig. 1 Number of centers versus population
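For illustration, the following sketch (ours, with made-up numbers rather than the study's CEPAL data) computes a Pearson correlation between population and number of centers in the way described above.

```python
import numpy as np

# Hypothetical values: population (millions) and number of simulation
# centers for five fictitious countries; the study used CEPAL figures
# and its own database of centers.
population_millions = np.array([10, 30, 45, 100, 210])
number_of_centers = np.array([8, 25, 30, 70, 140])

r = np.corrcoef(population_millions, number_of_centers)[0, 1]
print(f"Pearson correlation coefficient: {r:.3f}")
```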

One hundred forty-nine directors (36.5%) sent complete responses to the 42-item quality self-perception scale that were considered valid for the further analyses related to program quality. Valid responses were obtained from 14 countries. The countries with the highest response rates were Chile (37.5%), Brazil (18.1%), and Mexico (12.7%) (Fig. 2).

Fig. 2 Number of centers contacted by country and complete responses to the quality self-perception survey

Most simulation centers were university-linked (84%), and only 12 centers linked to health institutions were reported, of which 50% were located in Chile.

Centers responding to the survey had existed for an average of 7 years, with a standard deviation of 6.833 years and a range of 0 to 58 years. The first simulation center in Latin America was created in 1961 in Peru and is still open to date. It is a surgical simulation center that reports that its resources are mainly biological models and surgical trainers. New simulation centers were created every year from 1998 to 2019 (Fig. 3). Between 2008 and 2016, about ten centers per year were created, and in 2017 and 2018, we observed the creation of about 20 centers per year. During the first half of 2019, 8 new simulation centers were created in the region.

Fig. 3 Year of creation of new simulation centers in Latin America

Infrastructure and metric of activities

The data reported on logistics are heterogeneous; the declared area of the centers ranges between 8 and 4307 m². The information was organized into quartiles, and the centers were categorized as small (< 122 m²), medium (122–1500 m²), large (1500–2900 m²), and very large (> 2900 m²). Twenty-five percent of simulation centers are small, and 71% are medium-sized. Heterogeneity was found in the number of participants per year, hours per participant, number of activities, and number of hours of room use (Table 1).

Table 1 Characteristics of Latin American simulation centers

Human resources and directors’ profile

Most simulation centers (54%) have fewer than ten instructors, and 8% have more than 50. Regarding the profile of the simulation center directors, 61.7% are women, 50% are medical doctors (MD), 40% are nurses, and 5% are engineers. Thirty-seven percent have a clinical specialty, 53% have a master’s degree, and 13% have doctoral training. Regarding specific training in simulation, 75% have completed an instructor course, 6% have completed a fellowship in simulation, 5.4% report holding the Certified Healthcare Simulation Educator (CHSE) credential, and 17% report having no specific training in simulation-based education.

Simulation resources and programs’ orientation

Regarding the types of simulation resources used in the centers, the most commonly used are high-cost simulators (81%) and procedural simulators (79%). Simulated patients rank third among the resources used. The least used resources are biological samples and animal models, which are used in centers dedicated to surgical training (Table 2).

Table 2 Distribution of centers by simulation resources used and declared orientation of simulation programs (n = 149)

Regarding the intention to acquire new simulators in the next year, 65% of respondents report no plans to do so.

When asked about program orientations, most centers report that their programs intend to improve the practice and development of clinical skills (94%), critical thinking (93%), and human factors (84%) (Table 2).

Self-perception of quality

The research dimension had the lowest average Likert score (m = 3.3), and the SSH teaching/learning dimension had the highest (m = 4.1) among the global scores by dimension (Table 3).

Table 3 Mean Likert score for each item of the quality self-perception instrument (ASPIRE and SSH criteria) (n = 149)

The individual descriptors with the highest Likert averages were, in decreasing order, those related to adequate learning environments (item 29, SSH Teaching/Learning Criteria, m = 4.4), ethical aspects (item 20, SSH CORE Criteria, m = 4.4), and reflective training techniques (item 7, ASPIRE Criteria, m = 4.3), as seen in Table 3.

The lowest average Likert score among individual descriptors was 3.0, found in item 12, corresponding to the ASPIRE Criteria: “The faculty of the institution (or its students) conducts research related to simulation-based education,” and in item 38, corresponding to the SSH Research Criteria: “There is a designated individual who is responsible for administering research programs” (Table 3).

The highest score by dimension was obtained for the ASPIRE criteria, and the lowest for the SSH Research dimension (Table 4).

Table 4 Total score (sum of the total scores of the quality self-perception instrument) and total dimension score (sum of the scores for each dimension of the quality self-perception instrument)

Discussion

Few studies describe the specific characteristics of simulation centers [18, 19], and they do not consider the orientation of the programs carried out in them. Our work is the first on a large scale in Latin America, and we found an acceptable response rate compared with single-country studies in the region [14].

In a recent Italian survey of simulation-based pediatric training, nearly 15% of those surveyed answered [21]; in Switzerland, Stocker obtained responses from 96% of hospital centers offering training in pediatrics, of which 66.6% used clinical simulation in their teaching practice [22]. Sixty-six percent of residents surveyed and 100% of program managers responded to the Canadian emergency medicine training survey [20].

Our response rate is lower than that reported in other regions and contexts. This may be explained by the cultural diversity of Latin America, by differences in the development of simulation across the countries of the region, or by the fact that this is the first time a survey of this type has been carried out.

The number of simulation centers in Latin American countries shows large differences. Since these are countries with different economic and population sizes, it is convenient to compare not only in absolute terms but also in relation to the level of wealth and the number of inhabitants. We found a positive linear correlation between the number of centers and population, with one country deviating from the trend with a higher proportion of centers per number of inhabitants. Moreover, it is precisely this country that reports the most simulation centers associated with clinical institutions, an area into which the rest of Latin America may expand in the future given worldwide trends toward the systems integration of clinical simulation [20,21,22]. We do not believe we can recommend a standard number of centers per inhabitants at the present time or estimate when this growth will reach a plateau.

We did not find a correlation between the percentage of GDP spent on education or health and the number of simulation centers. One particularity of Latin American countries is the differences in education funding: in some countries, expenditure on education depends mostly on private funding, in contrast with the rest of Latin America [17].

The size and activities of Latin American simulation centers were heterogeneous. This may be explained by the fact that some universities have several campuses in different regions of the same country and report their total activities as a single center owing to the administrative organization of their programs. Another explanation is that some centers conduct a significant number of OSCE evaluations for their own students and for processes of revalidation of international professional degrees, which inflates their indicators, or that some centers are dedicated to a single profession while others serve multiple degree programs with large groups of students.

At the time of data collection, most of the simulation centers in Latin America were linked to university institutions. It is important to consider that centers linked to clinical institutions may have forms of organization and focuses of action different from those of university institutions, and that the number of centers or simulation activities focused on clinical teams may change given the need for training of specific clinical competencies related to the current pandemic context.

It was reported that a simulation center using surgical simulation resources was created in 1961. Given the methodology defined in this study, based on self-reporting, no actions were taken to verify this statement. The literature describes the use of frozen biological material for surgical procedural training in 1986 [29], and the first recommendations for surgical training with simulation appeared in the early 1990s [30].

In our work, the professional profile of those who run the centers is heterogeneous in terms of profession, clinical specialty, academic degrees, and simulation training. The vast majority have received instruction through short courses, and only a few have fellowships or international certifications. It is noteworthy that almost one sixth of them have no specific training in the area.

Observing quality criteria is necessary for human activities that aim for excellence; this includes clinical simulation. In this study, the quality standards used were recognized as highly relevant by the directors of the simulation centers, mainly regarding teaching, learning environments, and ethical criteria. The research criteria were considered less relevant; this is consistent with the region’s low research visibility in world rankings over the last decade [31].

Most of the simulation centers in the region were reported to be linked to universities. This may be related to the fact that the best rated dimensions globally correspond to those based on the ASPIRE criteria and the SSH teaching and learning criteria.

In 2013, Arthur et al. conducted a Delphi study that highlighted the importance of maintaining standards in nursing simulation-based education [32]. Although the daily activity of the Latin American centers (participants, activities, etc.) was heterogeneous and relatively low, most of them showed a high number of activities relative to the modalities of simulation resources used.

This study has some limitations, such as its dependence on self-report and on the sincerity of the respondents [33], and the fact that the majority of responses come from three countries.

Another important consideration is that snowball sampling makes it challenging to determine the sampling error or make inferences about populations based on the obtained sample.

However, the internal consistency of the survey was high, and the responses were similar across country, center size, and director profile, which gives greater validity to the results [24]. Other research attempting to characterize centers worldwide achieved a lower response rate than ours [18].

The information obtained here is important because this is the first attempt to characterize the whole region. We consider this work a basis for better understanding how simulation centers operate in Latin America, and it opens opportunities for new research in this field.

Some of these research areas are the differences between simulation centers based in university versus clinical institutions, and the relevance of center directors’ training to the development of educational programs. It is also important to inquire into potential differences between self-reporting and independent observer evaluation.

Conclusions

Simulation-based education in the health sciences has grown steadily in Latin America. Growth alone is not a goal, however, and quality merits attention.

The characterized centers are predominantly medium-sized and university-based, using mannequins and standardized patients to train clinical skills and procedures.

Agreement with the importance of quality and continuous improvement is high; it is lower concerning research criteria and adherence to program evaluation mechanisms.

We recommend starting accreditation processes in Latin America, together with studies that measure the quality of simulation-based education in the region based on objective observation rather than self-reporting.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

OSCE:

Objective and Structured Clinical Examination

SSH:

Society for Simulation in Healthcare

AMEE:

Association for Medical Education in Europe

INACSL:

International Nursing Association for Clinical Simulation and Learning

ASPE:

Association of Standardized Patient Educators

FLASIC:

Federación Latinoamericana de Simulación Clínica - Latin American Federation of Clinical Simulation

CEPAL:

Comisión Económica para América Latina y el Caribe - Economic Commission for Latin America and the Caribbean

References

  1. Economic Commission for Latin America and the Caribbean (ECLAC). Demographic observatory, 2014 [Internet]. Santiago, Chile: United Nations publication; [cited 2020 Apr 25]. Available from: https://repositorio.cepal.org/bitstream/handle/11362/39228/1/S1500643_mu.pdf

  2. OECD, Economic Commission for Latin America and the Caribbean, CAF Development Bank of Latin America and European Commission. Perspectivas económicas de América Latina 2019: Desarrollo en transición. 2019 Dec [cited 2020 Apr 25]; Available from: https://www.oecd-ilibrary.org/development/perspectivas-economicas-de-america-latina-2019_g2g9ff1a-es

  3. Society for Simulation in Healthcare, Council for Accreditation of Healthcare Simulation Programs. SSH Accreditation Process: Informational Guide for the Accreditation Process from the SSH Council for Accreditation of Healthcare Simulation Programs [Internet]. Society for Simulation in Healthcare; 2017 [cited 2019 Sep 18]. Available from: https://www.ssih.org/Portals/48/Accreditation/SSH%20Accreditation%20Informational%20Guide.pdf?ver=2017-03-09-133118-517

  4. ASPIRE International Recognition of Excellence in Education. Areas of excellence to be recognised [Internet]. Aspire-to-excellence.org. 2019 [cited 2019 Sep 18]. Available from: http://www.aspire-to-excellence.org/Areas+of+Excellence/

  5. Sittner BJ, Aebersold ML, Paige JB, Graham LLM, Schram AP, Decker SI, et al. INACSL Standards of best practice for simulation: past, present, and future. Nurs Educ Perspect [Internet]. September/October 2015;36(5):294–8. Available from: https://doi.org/10.5480/15-1670

  6. Martínez G, Guarda E, Baeza R, Garayar B, Chamorro G, Casanegra P. Enseñanza de auscultación cardiaca a estudiantes y residentes de medicina mediante el uso de un simulador de ruidos cardiacos. Rev Esp Cardiol (Engl Ed) [Internet]. 2012;65(12):1135–6. Available from: https://doi.org/10.1016/j.recesp.2012.03.022

  7. Armijo S. Consider this: how to integrate simulation into pre-licensure programs. In: Palaganas JC, Maxworthy JC, Epps CA, Mancini ME, editors. Defining Excellence in Simulation Programs. Lippincott Williams & Wilkins (LWW); 2014.


  8. Boza C, León F, Buckel E, Riquelme A, Crovari F, Martínez J, Aggarwal R., Grantcharov T., Jarufe N., Varas J. Simulation-trained junior residents perform better than general surgeons on advanced laparoscopic cases. Surg Endosc [Internet]. 2017;31(1):135–41. Available from: https://doi.org/10.1007/s00464-016-4942-6

  9. López-Herce J, Red de Estudio Iberoamericano de estudio de la parada cardiorrespiratoria en la infancia (RIBEPCI), Matamoros MM, Moya L, Almonte E, Coronel D, et al. Paediatric cardiopulmonary resuscitation training program in Latin-America: the RIBEPCI experience. BMC Med Educ [Internet]. 2017;17(1). Available from: https://doi.org/10.1186/s12909-017-1005-1

  10. Triviño BX, Vásquez MA, Mena MA, López TA, Aldunate RM, Varas PM, et al. Aplicación del Examen Clínico Objetivo Estructurado (OSCE) en la evaluación final del internado de pediatría en dos escuelas de medicina. Rev Med Chil [Internet]. 2000;128(9). Available from: https://doi.org/10.4067/s0034-98872000000900013.

  11. Bustamante ZM, Carvajal HC, Gottlieb BB, Contreras PJE, Uribe MM, Melkonian TE, et al. Hacia un nuevo instrumento de evaluación en la carrera de Medicina: Uso del método OSCE. Rev Med Chil. 2000;128(9):1039–44. https://doi.org/10.4067/S0034-98872000000900013.


  12. Trejo-Mejía JA, Sánchez-Mendiola M, Méndez-Ramírez I, Martínez-González A. Reliability analysis of the objective structured clinical examination using generalizability theory. Med Educ Online [Internet]. 2016;21(1):31650. Available from: http://dx.doi.org/10.3402/meo.v21.31650

  13. Matiz CH. La Práctica de la Simulación Clínica en las Ciencias de la Salud. Rev Col Cardiología. 2011;18(6):297–306.


  14. Escudero Z. EX, Fuentes CM, Gonzalez V. MJO, Corvetto A. MA. Simulación en educación para ciencias de la Salud: ¿Qué calidad hemos alcanzado en Chile? ARS MEDICA Revista de Ciencias Médicas [Internet]. 2017 Jan 26 [cited 2019 Sep 19];26(2):16–20. Available from: http://www.arsmedica.cl/index.php/MED/article/view/394

  15. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in healthcare education: a best evidence practical guide. AMEE Guide No. 82. Med Teach 2013;35(10):e1511-e1530, DOI: 10.3109/0142159X.2013.818632.

  16. Lewis KL, Bohnert CA, Gammon WL, Hölzer H, Lyman L, Smith C, et al. The association of standardized patient educators (ASPE) standards of best practice (SOBP). Adv Simul [Internet]. 2017;2(1). Available from: https://doi.org/10.1186/s41077-017-0043-4

  17. CEPAL. Base de datos de inversión social en América Latina y el Caribe Available from: https://observatoriosocial.cepal.org/inversion/es/paises

  18. Qayumi K, Pachev G, Zheng B, Ziv A, Koval V, Badiei S, et al. Status of simulation in health care education: an international survey. Adv Med Educ Pract. 2014;5:457–67. https://doi.org/10.2147/AMEP.S65451.


  19. Zhao Z, Niu P, Ji X, Sweet RM. State of simulation in healthcare education: an initial survey in Beijing. JSLS. 2017;21(1):e2016.00090. Available from: https://doi.org/10.4293/JSLS.2016.00090

  20. Russell E, Hall AK, Hagel C, Petrosoniak A, Dagnone JD, Howes D. Simulation in Canadian postgraduate emergency medicine training – a national survey. CJEM. 2018;20(1):132–41. https://doi.org/10.1017/cem.2017.24.


  21. Binotti M, Genoni G, Rizzollo S, De Luca M, Carenzo L, Monzani A, et al. Simulation-based medical training for paediatric residents in Italy: a nationwide survey. BMC Med Educ [Internet]. 2019;19(1):161. Available from: https://doi.org/10.1186/s12909-019-1581-328

  22. Stocker M, Laine K, Ulmer F. Use of simulation-based medical training in Swiss pediatric hospitals: a national survey. BMC Med Educ [Internet]. 2017;17(1). Available from: https://doi.org/10.1186/s12909-017-0940-1

  23. Feaster S, Lutz J, Reihsen T, Leland F, Shatzer J. Simulation center program metrics. In: Palaganas J, Maxworthy J, Epps C, Mancini M, editors. Defining Excellence in Simulation Programs. First Ed. United States: Wolters Kluwer; 2015. p. 25–38.


  24. Machuca-Contreras F, Armijo-Rivera S, Díaz-Guio A, Nunes-de Oliveira S, Shibao-Miyasato H, Raúl N, Ballesteros-Mendoza I. Creación y propiedades psicométricas de un instrumento de autopercepción de calidad de programas y centros de simulación de Latinoamérica. Revista Latinoamericana de Simulación Clínica [Internet]. 2021; 3 (1):7-14. Available from: https://doi.org/10.35366/99863.

  25. English T, Keeley JW. Internal consistency approach to test construction. In: The Encyclopedia of Clinical Psychology. Hoboken, NJ, USA: John Wiley & Sons, Inc.; 2015. p. 1–3. https://doi.org/10.1002/9781118625392.wbecp156.


  26. de Souza AC, Alexandre NMC, Guirardello, E.D.B. Psychometric properties in instruments evaluation of reliability and validity. Propriedades psicométricas na avaliação de instrumentos: avaliação da confiabilidade e da validade. Epidemiol Serv Saude [Internet]. 2017;26(3):649–59. Available from: https://doi.org/10.5123/S1679-49742017000300022

  27. Mokkink LB, Prinsen CA, Bouter LM, de Vet HCW, Terwee CB. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) and how to select an outcome measurement instrument. Braz J Phys Ther. [Internet]. 2016;20(2):105–13. Available from: https://doi.org/10.1590/bjpt-rbf.2014.0143

  28. Price LR. Test development. In Methodology in the Social Sciences. Psychometric Methods: Theory into Practice. New York, USA: The Guilford Press; 2017. p. 165–202.


  29. Stotter AT, Becket AJ, Hansen JP, Capperauld I, Dudley HA. Simulation in surgical training using freeze dried material. Br J Surg. 1986 Jan;73(1):52–4. doi: 10.1002/bjs.1800730122. PMID: 3512022.

  30. Capperauld I, Hargraves J. Surgical simulation for general practitioners. Ann R Coll Surg Engl. 1991 Sep;73(5):273-275. PMID: 1929124; PMCID: PMC2499499.

  31. Scimago Journal and Country Rank. World Report, Latin America, Health Professions 1996-2019; [Internet]. Scimagojr.com. [cited 2020 Jun 7]. Available from:https://www.scimagojr.com/worldreport.php?w=Latin%20America&area=3600

  32. Arthur C, Levett-Jones T, Kable A. Quality indicators for the design and implementation of simulation experiences: a Delphi study. Nurse Educ Today [Internet]. 2013;33(11):1357–61. Available from: https://doi.org/10.1016/j.nedt.2012.07.012

  33. Burmeister LF. Principles of successful sample surveys. Anesthesiology. 2003;99(6):1251–2. https://doi.org/10.1097/00000542-200312000-00003.



Acknowledgements

The research group acknowledges the Federación Latinoamericana de Simulación Clínica (FLASIC) for its support. We also thank Jorge Bustos, José Luis García, Cinda Pérez, Fanny Solorzano, César Ruiz, Alejandro Sencion, and Nelson López.

Funding

No funding was received.

Author information

Authors and Affiliations

Authors

Contributions

SA, FM, AD, IB, and SN made substantial contributions to the conception and design of the work; all the authors made a substantial contribution to the data acquisition; FM and SA contributed significantly to the analysis and interpretation of the data; FM, SA, and AD drafted the work or substantively revised it. All authors revised, read, and approved the final manuscript.

Corresponding author

Correspondence to Soledad Armijo-Rivera.

Ethics declarations

Ethics approval and consent to participate

This article was approved by the Ethics Committee of Universidad del Desarrollo (CEI 46/2018) and the Ethics Committee of Universidade Federal de Santa Catarina (Parecer do Comitê de ética n° 3.206.561). Informed consent was obtained from all participants.

Consent for publication

Not applicable

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Armijo-Rivera, S., Machuca-Contreras, F., Raul, N. et al. Characterization of simulation centers and programs in Latin America according to the ASPIRE and SSH quality criteria. Adv Simul 6, 41 (2021). https://doi.org/10.1186/s41077-021-00188-8

