
Interprofessional staff perspectives on the adoption of OR black box technology and simulations to improve patient safety: a multi-methods survey



Medical errors still plague healthcare. The Operating Room Black Box (ORBB) and ORBB-simulation (ORBBSIM) are innovative emerging technologies that continuously capture and categorize intraoperative data, team information, and audio-visual files in an effort to improve objective quality measures. ORBB and ORBBSIM offer an opportunity to improve patient safety, yet a paucity of implementation literature exists, and overcoming implementation barriers is critical. This study sought to obtain rich insights while identifying facilitators of and barriers to the adoption of ORBB and ORBBSIM, in alignment with Donabedian’s model of health services and healthcare quality. Enrichment themes included translational performance improvement and real-world examples to develop sessions.


Interprofessional OR staff were invited to complete two surveys assessing staff perceptions via TeamSTEPPS’s validated Teamwork Perceptions Questionnaire (T-TPQ) and open-ended questions. Descriptive statistics were calculated for quantitative variables, and inductive phenomenological content analysis was used for qualitative data.


Survey 1 captured 71 responses from 334 invited (RR 21%) while survey 2 captured 47 responses from 157 (RR 29.9%). The T-TPQ score was 65.2, with Communication (70.4) the highest construct and Leadership (58.0) the lowest. Quality Improvement (QI), Patient Safety, and Objective Case Review were the most common perceived ORBB benefits. Trends suggested a reciprocal benefit of dual ORBB and ORBBSIM adoption. Trends also suggested that dual implementation can promote Psychological Safety, culture, trust, and technology comfort. The need for an implementation plan built on change management principles and a constructive culture were key findings.


Findings supported ORBB implementation themes from previous literature and deepened our understanding through the exploration of team culture. This blueprint provides a model to help organizations adopt ORBB and ORBBSIM. Outcomes can establish an empirical paradigm for future studies.


Background

Medical errors (MEs) in hospitals contribute to over 200,000 preventable deaths annually, resulting in significant costs associated with re-admissions, longer hospital stays, and malpractice lawsuits [1,2,3]. Despite efforts to improve quality and develop new techniques and technologies, errors continue to plague the healthcare sector [3,4,5]. Surgical errors (SEs) are particularly prevalent (44.9% of MEs) and pose significant harm to patients due to the invasive nature of surgery [1, 3, 4, 6,7,8]. Various factors contribute to the risk of SEs, including communication breakdowns, non-adherence to safety protocols, lack of standardization, performance deviations, environmental factors, time constraints, and ineffective utilization of technology [9].

Efforts to mitigate MEs have been multi-dimensional, including data collection, analysis, and dissemination at state and national levels, policy changes, and targeted initiatives [2, 9,10,11,12]. At the micro-level, efforts have focused on overcoming site-specific challenges, including performance goals, Universal Protocols, safety checklists, and in-depth systematic review of cases and errors [4, 9]. However, the outcomes of these efforts have been mixed, indicating the need for a critical evaluation of why errors persist in healthcare, concerted efforts to find new ME mitigators, and a better understanding of which mechanisms are most effective [9, 11, 12].

Growing literature on the use of audio-visual (AV) recordings in clinical environments has shown promise in improving ME identification, standardization, reporting, and mitigation through enhanced quality of objective data [13,14,15]. In an effort to expand the promising findings of AV recording, emerging technology has sought to enhance clinical AV recordings by harnessing big data through the capture of additional data streams [15]. Analogous to flight deck recorders, Operating Room (OR) Black Box (ORBB)(Surgical Safety Technologies Inc., Toronto, ON, Canada) continuously captures and categorizes additional sources of intraoperative data, such as patient records, physiological capture, environment data (noise, interruptions, people present), and team information alongside AV files [15]. ORBB then transforms this information into detailed reports as well as real-time benchmarks via artificial intelligence [15].

Based on ORBB's capability to enhance reporting, identify contributing factors to MEs, and improve coaching metrics, the integration of ORBB into simulation (ORBBSIM) presents a valuable opportunity to advance ME mitigation efforts. In this pioneering paper introducing ORBBSIM, we present the following examples to provide contextual understanding. ORBBSIM represents an innovative approach that harnesses multiple streams of data to augment simulation effectiveness, enabling the design of tailored development programs and facilitating more accurate assessments. ORBB-installed systems capture data streams from a wide range of devices, such as AV data from procedural cameras (e.g., a laparoscope), AV room data from ceiling-mounted cameras, simulated patient records, physiological measurements (manikin vitals), environmental factors (number of people present), and behavioral observations (e.g., teamwork, communication, or eye-tracking). By amalgamating these disparate data streams, ORBBSIM has a significant opportunity to enhance learning and patient outcomes. This personalized approach can maximize the effectiveness of simulation-based training and promote competency development in a targeted and efficient manner.

ORBB literature is expanding and can largely be categorized into the following groups. First, ORBB has demonstrated the ability to improve real-time monitoring and enhance reporting capabilities [16,17,18,19,20,21,22,23,24,25]. Second, the literature has highlighted ORBB’s effectiveness in capturing and organizing multiple data sets across different domains, surpassing the capabilities of legacy AV systems [16,17,18,19,20,21,22,23,24,25,26,27,28,29]. Third, studies have shown that ORBB's expanded scope in collecting multiple data streams has significantly improved the identification of contributing factors to MEs and the effectiveness of targeted mitigation efforts [16, 17, 20, 22, 25]. Finally, the improved data capture and accuracy of ORBB have yielded enhanced metrics for coaching programs, outperforming the outdated apprenticeship model and accentuating the value of ORBBSIM [14, 15, 18].

Lastly, only two studies have examined ORBB implementation [26, 27]. Findings revealed a range of OR teams’ attitudes towards the technology, with some expressing mixed feelings [26]. Team members who were supportive of the technology emphasized that their support was contingent upon its implementation from the perspective of promoting a patient safety culture, enhancing care processes, and improving patient outcomes [26]. This underscores the significance of adopting ORBB through an effective implementation strategy that avoids any perception of it being used for punitive purposes but rather as a learning tool [26, 27]. Stakeholders believed all staff should be included to foster buy-in while underscoring the importance of transparency [26, 27]. Those who were opposed to ORBB cited concerns about feeling threatened, fear of a punitive culture, downstream legal challenges, and the breach of confidentiality [26]. These findings corroborate the clinical recording implementation literature, in which stakeholders believed recordings could improve patient care, but only with careful implementation [26, 27, 30,31,32,33,34,35,36]. Similarly, the broader clinical recording implementation literature has found that those opposed to this technology cited concerns about feeling threatened, a punitive culture, legal challenges, and privacy [26, 37]. These single-site studies focused on clinical adoption, with no known literature on ORBB adoption in simulation [26, 34,35,36].

To expand the scope and maximize the impact of ORBB and ORBBSIM across the industry, effective adoption across all disciplines is critical [21, 26,27,28,29,30]. Endemically, healthcare has struggled with poor adoption of technology, as seen in the cases of pre-COVID telehealth and system-wide simulation [31, 32]. Implementation inhibitors encompass complex infrastructures (structure), diverse stakeholder perspectives (process), and a legacy punitive culture (outcomes) (Fig. 1) [31,32,33, 37,38,39]. Individual-level barriers include ignorance, poor communication, and apathy, while system-level barriers encompass a lack of critical information, implementation resources, and feedback mechanisms [35, 39,40,41,42,43,44]. An implementation plan that harnesses a systems approach, like Donabedian’s model, can ensure all factors that impact adoption are examined and considered [21, 26,27,28,29,30, 40,41,42,43,44].

Fig. 1

Donabedian’s quality framework. Figure 1 outlines the application of Donabedian’s quality framework to the project, highlighting conceptually the relationships between different quality improvement factors


To address these gaps, this study sought to obtain rich insights while identifying facilitators of and barriers to the adoption of ORBB and ORBBSIM, aligning with Donabedian's model of health services and healthcare quality (Fig. 1). Donabedian’s Process factors were measured via AHRQ’s Team Strategies and Tools to Enhance Performance and Patient Safety (TeamSTEPPS) Teamwork Perceptions Questionnaire (T-TPQ) [45,46,47]. Outcomes were measured through qualitative questions exploring staff perspectives [42, 48]. Findings will enable teams to better adopt ORBB technology, enhancing error mitigation and improving patient outcomes. The study employs a multi-methods approach to provide critical descriptive data for the effective adoption of ORBB and ORBBSIM.


Methods

This was a prospective convergent multi-methods study using descriptive and qualitative data solicited via electronic surveys [47, 49]. A multi-method approach enabled a rich understanding of underlying staff perceptions, beliefs through lived experiences, and culture through methodological triangulation [49]. Surveys provided a method to capture a diverse range of perspectives from a larger audience in an efficient manner while ensuring confidentiality and encouraging non-biased responses, thus promoting honest and representative data collection [49]. The study was designed based on the Standards for Reporting Qualitative Research checklist [50]. The UTSW Institutional Review Board approved an exemption based on Human Research Subject Regulations.

Two sequential surveys were administered to interprofessional OR team members working at UT Southwestern (UTSW) Medical Center’s Clements University Hospital (CUH). CUH is a large 751-bed academic hospital and was selected because it recently installed ORBBs in five out of 59 (8.5%) ORs; all five ORBBs were in robotic ORs. Additionally, UTSW is the first US center to install ORBBs in simulated ORs. Leadership’s buy-in was garnered through the ORBB Executive Committee. UTSW policies around sensitive data required the first survey (S1) to be administered through the organizational survey system, GLINT (Sunnyvale, CA). Open-ended qualitative questions were administered via a subsequent secondary survey (S2) through REDCap electronic data capture tools.

Survey development

S1’s quantitative questions captured staff’s attitudes towards five core T-TPQ team culture constructs [47, 49, 51, 52]. The T-TPQ examined micro- and macro-level factors via five core subscales: Team Structure, Leadership, Mutual Support, Situation Monitoring, and Communication [47]. Each of the five subscales comprised seven Likert-scale questions, coded so that 100 = strongly agree, 50 = neutral, and 0 = strongly disagree (see Supplemental Digital Content 1 for a copy of survey 1) [47]. S2 contained qualitative questions that elicited naturally occurring phenomena around interprofessional teams’ perspectives, beliefs, and feelings about ORBB, ORBBSIM, perceived benefits, and barriers (see Supplemental Digital Content 2 for a copy of survey 2) [40, 42, 47, 49, 51]. Five subject matter experts reviewed the surveys’ instrument wording, flow, design, and collection format [49].
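As a minimal sketch of the coding scheme described above, the anchors 100, 50, and 0 are stated in the text; the intermediate values (75 for agree, 25 for disagree) are an assumption for a standard 5-point Likert scale, and the answer labels are illustrative:

```python
# Hypothetical mapping of 5-point Likert answers to the 0-100 coding
# described above. Only the anchors (100, 50, 0) come from the text;
# the intermediate values 75 and 25 are assumed.
LIKERT_SCORES = {
    "Strongly agree": 100,
    "Agree": 75,
    "Neutral": 50,
    "Disagree": 25,
    "Strongly disagree": 0,
}

def code_response(answer: str) -> int:
    """Map a Likert answer to its numeric score on the 0-100 scale."""
    return LIKERT_SCORES[answer]
```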

Survey administration

S1 was administered to the full population of 334 CUH OR members working in robotic ORs. The email invited the following team members (% of 334 invited): medical doctors (surgeons and anesthesiologists) (32%), nurses (35%), technicians (13%), APPs (19%), and other OR staff (1%). S1 was open from November 29, 2021, through December 21, 2021, and took an average of 30–40 min to complete. S2 was administered to a subgroup of S1, focusing on a core group of 157 ORBB interprofessional team members who were identified by CUH OR leadership as those who work primarily in rooms outfitted with ORBB. This included medical doctors (surgeons and anesthesiologists) (39%), nurses (31%), technicians (9%), APPs (20%), and other OR staff (1%). S2 was administered from December 22, 2021, through January 22, 2022, and took an average of 10–20 min to complete.

Participants for both surveys were invited via email. The email included an introduction, the aim, completion time, the duration surveys would be open, a reminder that the survey was voluntary, a statement assuring invitees that answers would not impact the respondent’s job, and contact information. The email for S1 highlighted that only aggregate data would be shared; S2’s email highlighted that identifiable information would be seen only by study-team members, with all subsequently reported data de-identified. Weekly reminder emails were sent to those who had not completed the surveys.

Data analysis

Descriptive statistics were computed for the quantitative variables using the validated T-TPQ survey tool's five constructs, following AHRQ’s defined methodology for calculating aggregate means. Aggregate means were determined by summing all the responses within a construct and dividing the sum by the number of items in the construct (AHRQ, 2021). To explore variations in perceptions among different healthcare professions, subgroup analysis was conducted (AHRQ, 2021). An inductive phenomenological content analysis was used to describe responses to the qualitative open-ended questions [53]. Pre-analysis, exploration, and treatment of the data were performed in Microsoft Excel (2022). The data and codes were then imported into QSR NVivo software for the interpretation phases (V.11, QSR International, Doncaster, Australia). Three coders established intercoder reliability (ICR) by independently coding seven practice responses, comparing themes, and discussing areas of non-alignment. Two coders then independently coded the remaining responses and compiled a codebook. A third coder reviewed the codebook to ensure all viewpoints were considered and settled areas of non-alignment [54]. The refinement process was performed by the PI using the newly developed codebook, refining codes until thematic saturation was reached while comparing against themes from other literature [54, 55]. A summary of common emerging themes by profession, derived using thematic content analysis, was presented to the coders to ensure consideration of interprofessional perspectives, mitigate personal biases, and ensure credibility [55]. Data were believed to be missing completely at random, based on variability and no observed missing-data trends. Missing data were handled through listwise deletion.
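One reading of the aggregate-mean and listwise-deletion steps above can be sketched as follows; the item names and response values are illustrative, not the actual T-TPQ wording, and responses are assumed to be pre-coded on the 0–100 scale:

```python
# Sketch of the aggregate-mean calculation with listwise deletion
# described above. Construct/item names are hypothetical.
def listwise_delete(responses):
    """Drop any respondent with one or more missing (None) items."""
    return [r for r in responses if all(v is not None for v in r.values())]

def construct_mean(responses, items):
    """Sum all retained responses across a construct's items and divide
    by the total number of item-responses (the aggregate mean)."""
    values = [r[item] for r in responses for item in items]
    return sum(values) / len(values)

# Illustrative data: two respondents, a two-item construct; the second
# respondent left an item blank and is removed by listwise deletion.
complete = listwise_delete([
    {"common_terminology": 75, "available_sources": 50},
    {"common_terminology": 100, "available_sources": None},  # removed
])
print(construct_mean(complete, ["common_terminology", "available_sources"]))  # → 62.5
```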


Results

Baseline demographics

Out of the 334 surveys administered for S1, 87 responses were collected; after removing incomplete responses through listwise deletion, 76 completed responses were retained (RR 23%). Out of the 157 surveys administered for S2, 54 responses were collected; after listwise deletion, 47 responses remained (RR 29.9%). The most common role represented in S1 was OR Registered Nurses (RN) (36, 50.7%); S2’s most common profession was Medical Doctors (MD) (30, 63.8%). S1 had an even split between respondents who had worked at UTSW for over 5 years versus under 5 years (38, 50%), and S2 was nearly even, with a slight majority working over 5 years (25, 53.2%) versus under 5 years (22, 46.8%). For the S2-specific demographic questions, Anesthesiology and Surgical Services had the highest, and equal, numbers of respondents (14, 29.8%). Most respondents had simulation experience (33, 70.2%) versus no experience (14, 29.8%). The majority of respondents had worked in an OR with ORBB (37, 78.7%), and all respondents knew what ORBB was (47, 100%). Complete demographic information can be found in Table 1.
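The response-rate arithmetic above reduces to a one-line calculation; the sketch below uses the completed-response counts after listwise deletion (note 76/334 is 22.8%, reported above rounded to 23%):

```python
# Response-rate arithmetic behind the figures reported above.
def response_rate(completed: int, invited: int) -> float:
    """Completed responses as a percentage of invitations, to one decimal."""
    return round(100 * completed / invited, 1)

print(response_rate(76, 334))  # → 22.8 (reported as 23%)
print(response_rate(47, 157))  # → 29.9
```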

Table 1 Demographics of respondents

S1: T-TPQ results

The overall T-TPQ aggregate score was 65.2. The highest AHRQ construct was Communication (70.4), followed by Situation Monitoring (68.4), Mutual Support (65.3), Team Structure (64.0), and Leadership with the lowest aggregate mean (58.0). Analyzing aggregate means by profession, Medical Technicians (MedTech) had the highest average aggregate mean (71.6) across all constructs, followed by MDs (69.7), APPs (64.1), and OR RNs (62.3), while the Other category had the lowest (53.1).

Considering constructs by profession, MedTechs’ highest-rated construct was Situation Monitoring (75.4), while their lowest-rated construct was Leadership (62.3). MDs’ highest-rated construct was Leadership (75.6), which was also the highest construct aggregate mean across all professions. MDs’ lowest construct was Team Structure (65.4). APPs’ highest aggregate mean was Leadership (69.0), while their lowest was Team Structure (59.7). OR RNs’ highest-scored construct was Communication (69.7), while their lowest was Leadership (46.4). Lastly, for those in the Other category, the highest score was Communication (62.1), while the lowest, also the lowest construct aggregate mean across all professions, was Leadership (42.1).

Team structure

Examining aggregate means for specific questions under each construct, the highest mean question under Team Structure was Responsibilities (72.0), while the lowest was Resource Efficiency (59.0). The profession with the highest mean rating for Team Structure was MedTechs (72.7), while the profession with the lowest mean was those assigned to the Other category (50.0).


Leadership

The highest mean question was Manager & Change (66.0), while the lowest was Manager Decision Making (52.0). The profession with the highest mean for Leadership was MDs (75.6), while the profession with the lowest mean was those assigned to Other (40.0).

Situation monitoring

The highest mean question under Situation Monitoring was Correct Mistakes (72.0) while the lowest mean was Staff Anticipate Needs (64.0). The group with the highest mean rating was MedTech (75.4) while the lowest were those assigned to the Other category (60.7).

Mutual support

The highest mean question was Caution Awareness (73.0) while the lowest mean was for Staff Conflict Resolution (54.0). The group with the highest mean under Mutual Support was MedTech (74.3) while the lowest mean was from the group Other (50.7).


Communication

The highest-scored question under Communication was Common Terminology (77.0), while the lowest-scored question was Available Sources (64.0). The group with the highest mean was MedTechs (73.1), while the group with the lowest was Other (62.1). Complete T-TPQ results can be found in Table 2.

Table 2 T-TPQ survey construct aggregate means by profession

S2: Qualitative emerging themes

Across all team members, Quality Improvement (QI), Patient Safety, and Objective Case Review were the most frequently mentioned benefits for ORBB. Objective Case Review was viewed as the top benefit among APPs and MDs, with one respondent stating, “objective documentation of exactly what was and was not said as well as documentation of events.” QI and Patient Safety were the most common themes for RNs, with one respondent noting “learning from emergency situations and ability to see how we can improve especially in terms of communication” and another respondent noting, “it (ORBB) enhances patient safety measures.”

The most common concerns regarding ORBB integration were Psychological Safety (PsySaf), Privacy, and Loss of Trust. APPs’ and RNs’ most common concern was Privacy, with one respondent noting, “there is a significant concern for privacy, both from the perspective of constantly being video recorded and from the perspective of the safety of recorded data in the age of constant malware attacks on healthcare institutions and their data centers,” while the most common concern for MDs and Other was PsySaf. Complete ORBB themes based on OR team members’ perspectives can be found in Table 3.

Table 3 Perspectives around benefits and/or concerns with ORBB

Considering ORBBSIM, the most common perceived benefits were QI, Development of Real-world Objective Scenarios, and Education. MDs and APPs felt that QI was ORBBSIM’s top benefit. RNs viewed ORBBSIM as most beneficial for Education, noting “(ORBB) provides learning opportunities to see examples of both correct and incorrect behaviors and provide meaningful debriefing sessions after simulations”. When examining team members’ concerns with ORBBSIM, the most common across all professions were PsySaf, Resources, and Timing. Interestingly, PsySaf was viewed differently across the disciplines, with five respondents viewing ORBBSIM as an impediment to PsySaf. In contrast, two respondents with different simulation experience felt there was less concern for PsySaf when using ORBBSIM than when ORBB is integrated clinically, with one respondent noting, “within the simulation realm the fears should be less substantial as sim sessions are already under video surveillance.” Complete ORBBSIM themes can be found in Table 4.

Table 4 Perspectives around benefits and/or concerns with ORBBSIM

When asked about ORBBSIM implementation, the enablers cited across all professions were Engaging All Stakeholders, Education/Communication, and the creation of a Culture of Safety. Engagement of All Stakeholders was the most common enabler expressed by APPs, MDs, and RNs, with one respondent highlighting the importance of a “committee with a good representation of OR members to develop accurate simulation scenarios: OR nurse, scrub tech, anesthesia tech, surgeon, CRNA/resident, anesthesiologist, medical students, and many more (radiology, vendors…).” The Other category viewed Education/Communication as the most important enabler, with one respondent noting, “will need to provide education to familiarize participants with the technology”. When asked about barriers to ORBBSIM adoption, the most common were Resources, Schedule, and Time. MDs and RNs were aligned that the biggest barrier was Resources, with one respondent stating, “these rooms are used for patient care and not frequently available for simulation.” APPs had a tie between their two most common barriers, Resources and Schedule. The Other group felt that Education and Awareness would be barriers to adoption. Lastly, when respondents were asked for additional feedback on the ORBB project, the most common response was to ensure ORBB had Inclusivity, with one respondent highlighting, “as much inclusivity across as many specialty areas as possible.” Additionally, respondents across all disciplines felt Education, Legal, Logistics, Schedule, and QI were important components of the success of ORBB and ORBBSIM. Complete ORBB and ORBBSIM implementation themes can be found in Table 5.

Table 5 Perspectives around ORBB integration with simulation implementation


Discussion

This is the first known study to examine staff perceptions of ORBB and ORBBSIM and to identify interprofessional teams’ self-assessment of culture via the T-TPQ (process factors) in conjunction with OR team perspectives (outcome factors) [40, 42, 45, 47]. The following paragraphs highlight key areas that can promote adoption and be used as a blueprint for other organizations looking to adopt ORBB and ORBBSIM, arranged by the five AHRQ TeamSTEPPS constructs [40, 48].

Looking first at culture, the highest-scoring T-TPQ constructs were Communication, Situation Monitoring, and Mutual Support. Top-rated questions included common terminology, supporting one another to correct mistakes, and cautioning one another about dangerous situations, which highlight the existence of a supportive and collaborative team [42, 47]. Leaders can maximize these team strengths during implementation by engaging champions early [40]. In contrast, Leadership was the lowest construct in the overall team’s scores, but when inspecting scores by profession, MDs and APPs rated Leadership high, while MedTechs, RNs, and Other rated it low. The dissonance between professions’ perceptions of leaders’ effectiveness demonstrates the importance of change management principles that tailor initiatives to individuals’ or teams’ unique perspectives, beliefs, motivators, and norms (structure and process factors), promoting a leader’s effectiveness in fostering change (outcome) [40]. The lowest-scored Leadership question, leaders not considering staff’s input when making decisions, was noteworthy when triangulated with the downstream outcomes, where a common theme was the need to engage all stakeholders; the qualitative data help illuminate why this construct was rated lowest [40, 42, 46]. These findings further emphasize the significance of empowering diverse champions as a crucial element of effective change management. Champions play a vital role in ensuring that staff perspectives (motivators) are considered during the change process, promoting inclusivity and buy-in from all stakeholders [40, 42].

Additionally, the findings shed light on the evolving landscape of healthcare, highlighting the growing need for leaders who can foster constructive cultures. Cultivating a constructive culture creates an environment that embraces change as an opportunity for growth and improvement, enabling organizations to adapt more effectively [42]. Constructive cultures empower individuals and foster collaboration, creating a shared vision and stakeholder engagement. This collaborative approach allows organizations to navigate and adapt to change more successfully [42]. Leaders who embrace a constructive culture mindset are better equipped to execute strategic plans, generate buy-in, implement a shared vision, utilize agile change frameworks, and inspire innovation [42]. In summary, these findings not only underscore the importance of change management principles and the involvement of diverse champions in adopting ORBB and ORBBSIM, but they also highlight the essential role of visionary and adaptable healthcare leaders in establishing constructive cultures and shaping the future of healthcare [42]. It is important for leaders to use these findings as a blueprint to build infrastructures that promote constructive cultures, emphasizing the importance of seeking all stakeholders' perspectives and incorporating their input into the change process.

Next, considering that Team Structure was the second-lowest construct, movement towards a constructive culture would be beneficial to create a shared vision that maximizes resources, increases staff accountability, and improves efficiency, addressing the lowest-scored questions [42]. A constructive culture will also promote intrinsic and extrinsic motivators, encourage a Culture of Safety, and help teams withstand future challenges [42]. Adoption can be impeded without these focused efforts, as unwanted behaviors can perpetuate a dysfunctional culture, disrupt strategic plan execution, and inhibit innovation [42].

Similar to previous literature, all team members held common beliefs that ORBB will enhance QI, improve Patient Safety, and provide opportunities for Objective Case Review [26]. It will be important to find mechanisms that overcome concerns around the impact of ORBB on PsySaf, Privacy, and Loss of Trust (process factors). RNs seemed less concerned about trust than MDs and APPs, while instead showing more concern for privacy, similar to prior literature [26]. These findings emphasize the importance of engaging all professions and strong leadership which establishes a constructive culture in order to promote process factors of trust, PsySaf, a Culture of Safety, effective communication, as well as transparency [40, 42, 56].

All professions were aligned that ORBBSIM could enrich QI initiatives and education using objective scenarios. Enrichment themes included translational performance improvement and real-world examples to develop sessions. These are noteworthy themes, as simulation has made important advances since being highlighted as a key mechanism to improve patient safety in To Err is Human, but has been challenged to achieve full system-wide adoption due to a disconnect with safety systems’ operations, a lack of real-time data, and a lack of involvement of key stakeholders [2, 32]. Based on staff’s perceptions, ORBBSIM may be a bridge that spans the chasm between the current state of simulation and system-wide adoption [32]. Respondents also recognized that ORBBSIM could increase fidelity and enhance debriefing, but stressed that doing so will require the right resources, like ensuring that all teams can participate, providing protected time, considering different schedules, and using well-trained simulation experts as facilitators to preserve PsySaf. Leaders should involve stakeholders in the implementation plan to overcome these structural concerns [40]. Through triangulation, it is noteworthy that many of the lowest-scored T-TPQ questions regarded inefficient use of resources, suggesting opportunities exist to improve resource allocation [42]. Concerns about ORBBSIM, in comparison to ORBB, were less about intrinsic (process) motivators (i.e., privacy or loss of trust) and more about extrinsic (structure) barriers (i.e., resources, environment, or timing) [42]. Some respondents highlighted that simulation has established the norm of recording, thus there was less concern about ORBBSIM recordings. Indeed, simulation best practices have established safe learning environments and PsySaf as foundational norms, through mechanisms like a prebrief and expertly trained facilitators [57]. This suggests there might be a reciprocal benefit for ORBBSIM to also enhance the team’s PsySaf, increase ORBB comfort, and overcome privacy as well as trust concerns. Similar to other implementation studies, this study found that additional factors to expand adoption included increased education, communication, and awareness [26]. Empowering diverse champions can help establish a shared vision, strategic plan, and communication plan to enhance education, communication, and awareness, while also ensuring ongoing two-way communication to promote transparency [40].

Study limitations and strengths

This is the first known single-site study to explore teams’ perceptions of ORBBSIM and the second known study exploring teams’ perceptions of ORBB. Additionally, as ORBB is a growing technology, though the majority of respondents knew about it, a subset of respondents had yet to work clinically in a room with this technology. Thus, the degree of generalizability of our findings to all professions and other institutions is unknown. Sampling error, non-response error, varying response rates, and recall biases were limitations the study team sought to control for by ensuring correct staff solicitation, leadership buy-in, frequent follow-up, and thematic saturation [49]. This includes the fact that S2 was administered to a smaller OR population based on work location; however, the authors believe that the themes which emerged still provided rich insights regarding ORBB and ORBBSIM. Perceptions of how the data would be used might have impacted responses, but this was explicitly addressed in the surveys’ accompanying emails [40]. Lastly, the authors sought to methodologically triangulate the data to build a deeper understanding of adoption factors. Organizations embarking on implementation of ORBB or ORBBSIM should use this study as a blueprint for adoption strategies, first assessing interprofessional staff perceptions, culture, and other factors within their own environments; future studies will be needed to understand the generalizability of findings.

Key takeaways

Our findings supported themes from previous literature and enriched our understanding through team culture constructs, strengthening our appreciation of disruptive innovation’s impact on QI system factors [44]. As the first US organization to implement ORBBSIM, we observed emerging trends on the reciprocal benefit of adopting ORBB and ORBBSIM together. These included overwhelmingly positive beliefs that ORBB and ORBBSIM could enhance QI efforts through objective data and improve patient safety initiatives. Emerging trends also suggested that dual implementation can improve adoption by promoting PsySaf, team culture, trust, and technology comfort (process factors), although future studies are needed in these areas. Findings also underscored the importance of exploring a team’s culture and adopting a systematic implementation plan built on change management principles, so that site-specific gaps (structure and process) can be overcome through involvement of all stakeholders and heightened two-way communication [39, 41]. Lastly, our findings highlighted the importance of strong leaders who foster constructive cultures that promote trust, inclusivity, and PsySaf.


ORBB and ORBBSIM are promising emerging technologies for improving patient care. Our findings supported ORBB implementation themes from previous literature while enriching our understanding through exploration of team culture constructs. Although further research is needed, the blueprint outlined in this study provides an initial paradigm to help organizations adopt ORBB and ORBBSIM. The data and themes can be used to monitor how such factors influence ORBB and ORBBSIM outcomes, providing an empirical paradigm for future studies so that consensus on effective implementation is achieved using best-practice change management and constructive culture strategies for emerging technology in healthcare [40, 42, 43].

Availability of data and materials

The datasets analyzed during the current study are not publicly available, in order to preserve psychological safety and protect sensitive data disclosed by individual team members; however, de-identified, aggregate data are available from the corresponding author on reasonable request.



Abbreviations

APP: Advanced Practice Providers

AV: Audio Visual

MD: Medical Doctor

ME: Medical Errors

RN: Registered Nurse

MT: Medical Technicians

ORBB: Operating Room Black Box

ORBBSIM: Operating Room Black Box Simulations

ORVR: Operating Room Video Recorder

PsySaf: Psychological Safety

SE: Surgical errors

SSC: Surgical Safety Checklist

S1: Survey 1

S2: Survey 2

TeamSTEPPS: Team Strategies and Tools to Enhance Performance and Patient Safety

T-TPQ: TeamSTEPPS Teamwork Perceptions Questionnaire

UP: Universal Protocol

QI: Quality Improvement


  1. Makary MA, Daniel M. Medical error—the third leading cause of death in the US. BMJ. 2016;353:i2139.

  2. Kohn LT, Corrigan JM, Donaldson MS, McKay T, Pike KC. To err is human: building a safer health system. Washington, DC: National Academies Press; 2000.

  3. Longest BB. Health policymaking in the United States. 6th ed. Chicago, IL: Health Administration Press; 2016.

  4. Cohen AJ, Lui H, Zheng M, et al. Rates of serious surgical errors in California and plans to prevent recurrence. JAMA Netw Open. 2021;4(5):e217058.

  5. Bates DW, Singh H. Two decades since To Err Is Human: an assessment of progress and emerging priorities in patient safety. Health Aff. 2018;37(11):1736–43.

  6. Ricciardi R, Baxter NN, Read TE, et al. Surgeon involvement in the care of patients deemed to have “preventable” conditions. J Am Coll Surg. 2009;209(6):707–11.

  7. Thomas EJ, Studdert DM, Burstin HR, et al. Incidence and types of adverse events and negligent care in Utah and Colorado. Med Care. 2000;38(3):261–71.

  8. Brennan TA, Leape LL, Laird NM, et al. Incidence of adverse events and negligence in hospitalized patients: results of the Harvard Medical Practice Study I. N Engl J Med. 1991;324(6):370–6.

  9. Hempel S, Maggard-Gibbons M, Nguyen DK, et al. Wrong-site surgery, retained surgical items, and surgical fires: a systematic review of surgical never events. JAMA Surg. 2015;150(8):796–805.

  10. Centers for Medicare and Medicaid Services. Hospital Compare. Available from: [cited 2023 July 13].

  11. Rowland HT. When never happens: implications of Medicare’s never-event policy. Marq Elder’s Advisor. 2008;10:341.

  12. Urbach DR, Govindarajan A, Saskin R, et al. Introduction of surgical safety checklists in Ontario, Canada. N Engl J Med. 2014;370(11):1029–38.

  13. Augestad KM, Butt K, Ignjatovic D, et al. Video-based coaching in surgical education: a systematic review and meta-analysis. Surg Endosc. 2020;34(2):521–35.

  14. Birkmeyer JD, Finks JF, O’Reilly A, et al. Surgical skill and complication rates after bariatric surgery. N Engl J Med. 2013;369(15):1434–42.

  15. Goldenberg MG, Jung J, Grantcharov TP. Using data to enhance performance and improve quality and safety in surgery. JAMA Surg. 2017;152(10):972–3.

  16. Jung JJ, Adams-McGavin RC, Grantcharov TP. Underreporting of Veress needle injuries: comparing direct observation and chart review methods. J Surg Res. 2019;236:266–70.

  17. Bergström H, Larsson LG, Stenberg E. Audio-video recording during laparoscopic surgery reduces irrelevant conversation between surgeons: a cohort study. BMC Surg. 2018;18(1):1–5.

  18. Bonrath EM, Dedy NJ, Gordon LE, et al. Comprehensive surgical coaching enhances surgical skill in the operating room. Ann Surg. 2015;262(2):205–12.

  19. Jue J, Shah NA, Mackey TK. An interdisciplinary review of surgical data recording technology features and legal considerations. Surg Innov. 2020;27(2):220–8.

  20. Jung JJ, Elfassy J, Jüni P, et al. Adverse events in the operating room: definitions, prevalence, and characteristics. A systematic review. World J Surg. 2019;43(10):2379–92.

  21. Jung JJ, Jüni P, Lebovic G, et al. First-year analysis of the operating room black box study. Ann Surg. 2020;271(1):122–7.

  22. Jung JJ, Elfassy J, Grantcharov T. Factors associated with surgeon’s perception of distraction in the operating room. Surg Endosc. 2020;34(7):3169–75.

  23. Jung JJ, Kashfi A, Sharma S, et al. Characterization of device-related interruptions in minimally invasive surgery: need for intraoperative data and effective mitigation strategies. Surg Endosc. 2019;33(3):717–23.

  24. Salgado D, Barber KR, Danic M. Objective assessment of checklist fidelity using digital audio recording and a standardized scoring system audit. J Patient Saf. 2019;15(3):260.

  25. Jung JJ, Grantcharov TP. The operating room black box: a prospective observation study of the operating room. J Am Coll Surg. 2017;225(4):S127–8.

  26. Etherington N, Usama A, Patey AM, et al. Exploring stakeholder perceptions around implementation of the Operating Room Black Box for patient safety research: a qualitative study using the theoretical domains framework. BMJ Open Qual. 2019;8(3):e000686.

  27. Doyen B, Gordon L, Soenens G, et al. Introduction of a surgical black box system in a hybrid angiosuite: challenges and opportunities. Phys Med. 2020;76:77–84.

  28. Levin M, McKechnie T, Kruse CC, et al. Surgical data recording in the operating room: a systematic review of modalities and metrics. Br J Surg. 2021;108(6):613–21.

  29. van Dalen AS, Jansen M, van Haperen M, et al. Implementing structured team debriefing using a Black Box in the operating room: surveying team satisfaction. Surg Endosc. 2021;35(3):1406–19.

  30. van de Graaf FW, Eryigit Ö, Lange JF. Current perspectives on video and audio recording inside the surgical operating room: results of a cross-disciplinary survey. Updates Surg. 2021;73(5):2001–7.

  31. MacRae CA, Deo RC, Shaw SY. Ecosystem barriers to innovation adoption in clinical practice. Trends Mol Med. 2021;27(1):5–7.

  32. Phrampus PE. Simulation and integration into patient safety systems. Simul Healthc. 2018;13(4):225–6.

  33. Hakimzada M, O’Brien A, Wigglesworth H. Exploring the attitudes of the nursing staff towards the use of body-worn cameras in psychiatric inpatient wards. J Intensive Care Med. 2020;16(2):75–84.

  34. Gallant JN, Brelsford K, Sharma S, et al. Patient perceptions of audio and video recording in the operating room. Ann Surg. 2021 Jan 15.

  35. Erichsen Andersson A, Frödin M, Dellenborg L, et al. Iterative co-creation for improved hand hygiene and aseptic techniques in the operating room: experiences from the safe hands study. BMC Health Serv Res. 2018;18(1):1–2.

  36. Armstrong DG, Rankin TM, Giovinco NA, Mills JL, Matsuoka Y. A heads-up display for diabetic limb salvage surgery: a view through the Google looking glass. J Diabetes Sci Technol. 2014;8(5):951–6.

  37. Yarborough AK, Smith TB. Technology acceptance amongst physicians. Med Care Res Rev. 2007;64:650–72.

  38. Grzybicki DM. Barriers to the implementation of patient safety initiatives. Clin Lab Med. 2004;24(4):901–11.

  39. Kennedy E, Lingard L, Watling CJ, et al. Understanding helping behaviors in an interprofessional surgical team: how do members engage? Am J Surg. 2020;219(2):372–8.

  40. Kotter J. Leading change. Primento Publishing; 2011.

  41. Pimentel MP, Choi S, Fiumara K, et al. Safety culture in the operating room: variability among perioperative healthcare workers. J Patient Saf. 2021;17(6):412–6.

  42. Szumal J, Cooke R. Creating constructive cultures: leading people and organizations to effectively solve problems and achieve goals. Human Synergistics International; 2019.

  43. Kilbridge PM, Classen DC. The informatics opportunities at the intersection of patient safety and clinical informatics. J Am Med Inform Assoc. 2008;15(4):397–407.

  44. Kunkel S, Rosenqvist U, Westerling R. The structure of quality systems is important to the process and outcome, an empirical study of 386 hospital departments in Sweden. BMC Health Serv Res. 2007;7(1):1–8.

  45. Donabedian A. The quality of care: how can it be assessed? JAMA. 1988;260(12):1743–8.

  46. Donabedian A. The quality of care. Arch Pathol Lab Med. 1997;121:11.

  47. TeamSTEPPS Teamwork Perceptions Questionnaire manual. AHRQ. [cited 2022 Feb 21]. Available from:

  48. LoPorto J. Application of the Donabedian quality-of-care model to New York state direct support professional core competencies: how structure, process, and outcomes impacts disability services. Soc Change. 2020;12(1):5.

  49. Shi L. Health services research methods. 3rd ed. Cengage Learning; 2019.

  50. O’Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research: a synthesis of recommendations. Acad Med. 2014;89(9):1245–51.

  51. Salas E, Sims DE, Burke CS. Is there a “big five” in teamwork? Small Group Res. 2005;36(5):555–99.

  52. Keebler JR, Dietz AS, Lazzara EH, et al. Validation of a teamwork perceptions measure to increase patient safety. BMJ Qual Saf. 2014;23(9):718–26.

  53. Patton MQ. Qualitative evaluation and research methods. SAGE Publications; 1990.

  54. Agee J. Developing qualitative research questions: a reflective process. Int J Qual Stud Educ. 2009;22(4):431–47.

  55. Thomas DR. A general inductive approach for analyzing qualitative evaluation data. Am J Eval. 2006;27(2):237–46.

  56. Brown B. Dare to lead: brave work, tough conversations, whole hearts. Random House; 2018.

  57. Roh YS, Ahn JW, Kim E, et al. Effects of prebriefing on psychological safety and learning outcomes. Clin Simul Nurs. 2018;25:12–9.



Acknowledgements

The authors gratefully acknowledge support provided by the UT Southwestern Simulation Center. They also gratefully acknowledge Suzanne Farmer, PhD, and Sonja Bartolome, MD.



Author information

Authors and Affiliations



KC made substantial contributions to the conception; design of the work; acquisition, analysis, and interpretation of data; drafted the work; and substantively revised it. AG made substantial contributions to the conception of the study; design of the work; and substantively revised it. JJ made substantial contributions to the design of the work; analysis and interpretation of data. DS made substantial contributions to the conception of the study; and substantively revised it. JH made substantial contributions to the conception of the study; design of the work; analysis and interpretation of data; and substantively revised it. AK made substantial contributions to the conception of the study; design of the work; analysis and interpretation of data; and substantively revised it.

Corresponding author

Correspondence to Krystle Campbell.

Ethics declarations

Ethics approval and consent to participate

The Institutional Review Board at UT Southwestern and the Medical University of South Carolina approved an exemption based on Human Research Subject Regulations # STU-2020–0427.

Consent for publication

All authors have agreed to the contents of this manuscript’s submission and have no conflicts of interest or financial obligations.

Competing interests

All authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Campbell, K., Gardner, A., Scott, D.J. et al. Interprofessional staff perspectives on the adoption of or black box technology and simulations to improve patient safety: a multi-methods survey. Adv Simul 8, 24 (2023).

