Debriefing strategies for interprofessional simulation—a qualitative study
Advances in Simulation volume 7, Article number: 18 (2022)
Interprofessional education is becoming more common worldwide. Simulation is one format in which this can take place effectively. The debriefing after the simulation is a critical part of the simulation process, as it allows reflection on and discussion of concepts that arose during the simulation. The literature notes that debriefing is challenging, and debriefing after interprofessional simulation (IPS) is likely to pose even more challenges, many related to the different backgrounds (profession, specialty) of the learners. This study was designed to investigate: ‘How do differing learner professions impact on the delivery of post-simulation debriefing after team-based interprofessional simulation—what are the challenges and what strategies can be used to overcome them?’
An initial review of the literature was used to identify current understanding and potential themes requiring further exploration. Using the results from the literature as a starting point for topics and questions to be asked, semi-structured interviews were planned, with those who are experienced in debriefing after IPS. The interviews were transcribed then analysed using a framework analysis.
The literature search resulted in twenty relevant papers. Four dimensions directly related to debriefing after IPS were drawn out from these papers: ‘the debriefer’, ‘method of debriefing’, ‘the learner’ and ‘psychological safety’. Sixteen interviews took place between June and August 2020. Ten themes were extracted from the analysis of the interview transcripts: number and specialty of debriefers, credibility, assumptions/preconceptions, nurses vs doctors, method of debriefing, the learner, hierarchy, safe learning environment, inclusion of all learners, and number of debriefers. These themes were mapped onto the four dimensions identified in the literature search and are discussed accordingly.
Several challenges and strategies were identified during this study. ‘It depends’ was a common answer in the interviews, suggesting that very little advice can be given that applies to every situation. The main recommendation from this study is support for an interprofessional group of debriefers in IPS, although this introduces its own challenges. Further research is suggested around the hierarchy found in IPS debriefing and how this translates to and from clinical practice.
Interprofessional education (IPE) is increasing in popularity worldwide. The World Health Organisation (WHO) suggests that ‘interprofessional education occurs when students from two or more professions learn about, from and with each other to enable effective collaboration and improve health outcomes’. Publications and meta-analyses from organisations such as the WHO, Cochrane and Best Evidence Medical Education [1,2,3] have explored this topic at length. These demonstrate that IPE can improve patient outcomes and should therefore be used in healthcare settings across the world, but they also identify many research gaps that must be filled to show that IPE is more beneficial than single-profession education. Simulation is one of many methods within which IPE can occur and has become increasingly commonplace in medical education [4, 5].
Following a simulated scenario, the debriefing is recognised as a critical but challenging aspect of the simulation process. There is a wealth of literature investigating the challenges and strategies of debriefing in general, some of which could be applied to debriefing after interprofessional simulation (IPS) but none of which is specifically designed for it. Examples are: ‘debriefing with good judgement’, a specific debriefing technique; co-debriefing, i.e. having more than one debriefer; and ‘learner-centred’ debriefing.
Debriefing after IPS is becoming more common because of the increase in simulation and IPE, but there seem to be few studies looking into IPS debriefing specifically. Debriefing after simulation in general is well known to be challenging at times; this often relates to specific characteristics of individual learners. In debriefing after IPS, having an interprofessional cohort of learners, all with different learning needs, suggests that these challenges may be multiplied, particularly when balancing the interprofessional learning objectives with individual learner needs.
This study has therefore been designed to answer:
‘How do differing learner professions impact on the delivery of post-simulation debriefing after team-based interprofessional simulation—what are the challenges and what strategies can be used to overcome them?’
Specifically, the aims are to:
Explore what the additional challenges are when debriefing an interprofessional group of learners after a simulation.
Investigate any strategies recommended to overcome these challenges when debriefing an interprofessional group of learners.
The design of this study had two approaches, a review of the literature followed by semi-structured interviews. The results from the literature were used to guide the interview discussion topics and influence the subsequent analysis of the data.
Review of the literature
The aim of this literature search was to examine the existing literature on debriefing after IPS, identify gaps in research and, as per the question and aims for this study above, explore any challenges or strategies to debriefing after IPS. This was not a formal systematic or scoping literature review, but an informal process to inform the next stage of the study.
The main concept explored in this research study was debriefing after simulation, specifically in IPE. The search terms used for the literature search in November 2019 were therefore: ‘debrief* AND (interprofessional OR multidisciplinary OR interdisciplinary) AND simulation’. The ‘PubMed’ database was used, along with a further search on the ‘Google Scholar’ search engine. Any paper that referred to debriefing after IPS, even briefly, was included; all other papers were excluded. See Fig. 1 for the flowchart of the literature search process. The included papers were read and analysed by one of the authors (CH).
The question posed in this study clearly seeks opinion and personal experience rather than fact, in the hope that sharing and collating such opinions will help others to continue constructing their own. There is no single right or wrong answer. The methodology for this study has therefore been considered from a constructivist viewpoint.
Qualitative approach and framework analysis
Ng et al. suggest that experiential phenomena (as in this study) would be best researched qualitatively. Thematic analysis is one of the most commonly used approaches to managing data in a qualitative study. From a constructivist viewpoint, the interviews, data analysis and writing up of the results are all part of the process of qualitative research, as they promote further reflection and analysis.
The authors are both active and regular facilitators and debriefers of both interprofessional and single profession simulations but relatively novice researchers. Both authors have undertaken training that included qualitative methodology and have some experience of this process.
All of the ‘Standards for Reporting Qualitative Research’ have been met.
A framework method of thematic analysis was used in this study, as described by Gale et al. This 7-step approach was followed:
‘Transcription’—this was done manually by one of the authors (CH).
‘Familiarisation with the interview’—as the same author performed the interviews and then transcribed them, familiarisation was achieved.
‘Coding’—interesting themes were coded manually on the transcript using a colour and a descriptive term. This was done by CH with input and discussion throughout with the second author (EM).
‘Developing a working analytical framework’—after the first 5 transcripts were coded inductively, it became clear that there were approximately 10 themes.
‘Applying the analytical framework’—the rest of the transcripts were then deductively coded using these themes.
‘Charting the data into the framework matrix’—a spreadsheet was created where the summary of each of the themes from each transcript was manually inputted.
‘Interpreting the data’—the data under each theme were interpreted and analysed. It was noted that each topic fitted under, or was itself one of, the four main themes noted in the literature. Both authors reviewed and discussed this interpretation.
Choice of method
Semi-structured interviews were chosen so that specific questions could be asked while leaving space for the interviewee to interpret them and answer based on their own experience. The literature [15,16,17] supports interviews as an appropriate method for a qualitative study. The interview is intended to be a conversation between two people with similar interests and experience, drawing out information to address the aims of the study. Such a ‘conversation’ is appropriate from a constructivist view.
Recruitment and interview process
Sampling of participants
Purposeful sampling was used to include debriefers from a variety of professions with experience of IPS in different contexts and locations. Participants were identified through direct working relationships or through meeting at simulation conferences or courses. This ensured a cohort of information-rich participants. From a positivist standpoint, this form of sampling introduces bias. From a constructivist point of view, however, this is not a limitation for this study, as the aim is not to prove specific facts but to gather opinions and experience and identify potential areas for future research. All participants had regular experience of debriefing after IPS, although their settings varied, e.g. simulation lab, in situ simulation and life support courses. A face-to-face request was followed up with an email formally inviting them to take part, with further information attached.
Interviewing to saturation was considered—this entails stopping interviewing once all themes become recurrent and no new themes emerge. This was monitored as the data were analysed after each interview. Saturation of themes occurred around interview ten; however, the interviews continued to sixteen to try to ensure a more interprofessional representation of participants.
A written information sheet was provided to the participants (Appendix 3). Written consent was gained from each participant prior to initiating the interview (Appendix 4). The signed consent forms were uploaded to a secure online data storage area and any paper copies were destroyed.
The interview schedule was prepared beforehand. The questions for the first interview were written based on the experiences of the author, the information found in the literature search and the gaps identified there, all with the aim of answering the research question. The research question was read out to each participant first, followed by an initial open question and then further potential questions to narrow the focus depending on the topics the participants themselves raised. The interviews were semi-structured, i.e. there was scope to explore new ideas suggested by participants rather than going through every suggested question. The questions used as an interview guide are in Appendix 5—while this was an iterative process with some changes in focus between interviews, new topics and questions were not formally written, to avoid focusing too much on specific questions and too little on the opinions of the participants.
A trial face-to-face interview took place in January 2020 which was included in the analysis with the consent of the participant. The rest of the interviews took place between June and August 2020.
The interviews were initially planned to take place face-to-face; however, when the COVID-19 pandemic occurred, most were done via Microsoft Teams to comply with social distancing measures—this was not considered to affect the aims or objectives of the study.
The interviews were audio-recorded via dictaphone, and the audio data were stored securely in an online data storage area under an identifying number rather than a name.
The collected interview data were transcribed non-verbatim (i.e. omitting fillers such as ‘erm’, ‘ok’ and ‘yeah’ when not used in a meaningful way). The study aimed to pick up themes of discussion and opinion rather than to closely analyse speech and conversation style, so verbatim transcription was deemed unnecessary. Transcribing and analysing the data aligns with a constructivist viewpoint—it allows reflection on and continuous development of the information received from the interviews, with the aim of creating a summary in a format that can be shared.
Once transcribed, the audio files were deleted, leaving the transcripts anonymous. The transcripts were visible only to the research team (or to a participant, if requested).
Approval from the University of Edinburgh Medical Education Ethics committee was sought and received in January 2020 (see Appendix 1).
Approval from Health Research Authority (HRA) was sought and was received in March 2020 (see Appendix 2). Each NHS trust which has potential participants working for them received the appropriate documents to approve and sign as per the HRA process prior to initiating formal contact with the participants.
Review of the literature
Twenty papers were included, all from 2013 to 2019. Each paper was read, and the main topics discussed relating specifically to interprofessional debriefing, as per the aims of this study, were noted. These data are included in Appendix 6, along with a summary of the aim, design and methodology of each paper. The topics were collated under four main theme headings: ‘the debriefer’, ‘the method of debriefing’, ‘the learner’ and ‘psychological safety’.
Twenty-two participants were invited; sixteen agreed and were included. All reasons for not participating were clinical or annual leave commitments. Participants were drawn from three National Health Service (NHS) trusts in England. See Table 2 for the breakdown of participants’ backgrounds.
The 16 interviews were completed between June and August 2020. The interviews lasted 30–70 min. Following the analysis process, four themes with nine subthemes were identified. It should be noted that ‘number of debriefers’ was relevant to both ‘the debriefer’ and ‘psychological safety’ so has been included twice.
These themes and subheadings are: ‘the debriefer’ (number and specialty of debriefers, credibility, assumptions/preconceptions, nurses vs doctors), ‘method of debriefing’, ‘the learner’ and ‘psychological safety’ (hierarchy, safe learning environment, inclusion of all learners, number of debriefers).
It is important to point out that, for the great majority of the opinions shared, each participant added a caveat of ‘it depends’ (e.g. on the learning outcome, the learners’ backgrounds, or the debriefers’ style) or similar. So, while a discussion around these collective opinions now follows, no ‘correct answer’ is advised for any particular situation. One participant particularly emphasised this, giving ‘it depends’ as an initial answer to every question before qualifying further.
Interestingly, many of the participants reflected on their own debriefing practices throughout the interview, using examples and stories from previous experience to illustrate their meaning. Several took aspects discussed in the interviews back to their educational practice to develop further, demonstrating the importance of reflection and continuous professional development even for experienced educators and debriefers.
A discussion of the main themes and subheadings and how they relate to the literature will now follow.
Number and specialty of debriefers
The interviews overwhelmingly suggested that having more than one debriefer from differing professions was perceived to have benefits in debriefing after IPS—all participants stated this. Having multiple debriefers in debriefing after simulation is not a new concept. Cheng et al. highlight the benefits and challenges of co-debriefing. In their conclusion, they write ‘whether and how co-debriefing strategies need to be adapted in an interprofessional context is an important area of study’.
Collectively, the interview participants in this study suggested that between two and four debriefers was optimal, and that having a debriefer with a nursing background alongside one with a medical background is beneficial for IPS involving both professions. This mimics the interprofessional working expected of the participants in the simulations (and in clinical practice) and encourages learners of each profession to engage. This is supported by the literature, with Stockert et al. stating that an interprofessional debriefing team can ‘model interprofessional behaviours and collaborative practice for the learners’.
However, this can become excessive in large multispecialty simulation debriefing sessions involving more than four professions/specialties, e.g. a trauma team. ‘Too many cooks’ was a direct quote from more than one participant. The participants also suggested that debriefing with others can be challenging, particularly with those you do not know well or who have limited training or experience in debriefing. It can lead to difficult debriefing sessions with poor flow and interruptions from other debriefers, because ‘everybody debriefs slightly differently, everybody has a slightly different focus, different speeds and different process for how you do it’. Strategies suggested to get around this included:
Having a lead or ‘chair’ debriefer who brings in others at appropriate times or for specific sections—this ensures that the debriefing flows well and there is no competition amongst the debriefers to get their opinion in. This is backed up by Hull et al.
Planning how each debriefer will be involved in the debriefing helps to ensure that all areas are covered in an effective way and to mitigate debriefers having ‘completely different styles’.
Credibility of the debriefer, or the perception of it, was a theme that featured in multiple interviews. It was summarised nicely by one participant: ‘The biggest sort of headline challenge would be credibility and natural ability to link to the different professional people within the team’. Interestingly, this was not particularly noted in the brief literature search conducted for this study, suggesting that the topic needs further investigation in IPS. Clinical credibility is a complex topic, meaning different things to different people; a systematic review into clinical credibility in nursing confirmed this, being unable to find one clear definition. In this study, a specific level of seniority was not suggested by most, with one participant stating ‘it's not about level it's almost about what background experience you have’, although one participant felt it important that, for clinical topics, the debriefer was more senior than the learners, or at least supervised by someone more senior, to ensure an element of clinical credibility. Some participants lacked access to clinically credible people with appropriate debriefing experience or training, which they found challenging.

This ‘debriefer credibility’ was perceived to be as important as clinical credibility, as demonstrated by ‘I don’t think it needs to be the most senior person in the room, it needs somebody with debriefing skills.’ However, many experienced debriefers have never attended a debriefing course (including 6 out of 15 in this study), and it is difficult to define what is needed for debriefer credibility—something this study did not clearly answer. When discussing debriefer training or credibility, two things were mentioned by multiple participants, including all six who had not had formal debriefing training: a debriefing course as a starting point and, probably more importantly, watching and practising debriefing.
Nearly all of the participants had debriefing coaching or ‘debriefing the debriefer’ built into their simulation programmes. Several suggested that debriefing training, particularly for those with the required clinical credibility, was key to improving debriefing after IPS. One participant mentioned that ‘building a community of practice’ or a ‘gang of debriefers’ in the trust should be the aim. As a solution to having both clinical and debriefing credibility present, one participant suggested: ‘I think having a lead debriefer with a support structure of debriefers from different professions can add credibility’. This links back to the discussion of lead debriefers and interprofessional debriefing teams in the section above.
A further debriefer-oriented point that came up repeatedly concerned assumptions or preconceptions about learners’ behaviour. Several discussions centred on behaviour assumed from a learner’s specialty alone (particularly when different from the debriefer’s), i.e. how an individual learner would engage in the debriefing based purely on their profession. Suggestions were that surgeons were less likely to engage or to receive feedback well. Most participants recognised these preconceptions as such, but there was a divide: some believed they are usually accurate based on experience, while others acknowledged that learners should be treated as individuals and that keeping an open mind as the debriefer is key. This all suggests that many (if not all) debriefers hold preconceived ideas and judge people on many factors, including their specialty—something that should be actively avoided once awareness of it is raised.
Nurses vs doctors
Comparison of doctors and nurses—their background training, experience of simulation and behaviours in the debriefing—was discussed. Nearly all the participants commented on the differences between these two professions, many in their first response to an open question, suggesting that it is a key issue. Several participants who are doctors stated that they felt they did not have the appropriate background, experience or understanding of what background knowledge nursing staff should have or what level they should be working at, i.e. that they were not credible to debrief nurses. One participant stated that it is ‘arrogant’ of doctors to believe they have that credibility, and another acknowledged that it is easy to ‘be a bit medic-centric’ (i.e. more focussed on medical than nursing staff) when debriefing. All of the nurses interviewed initially found it challenging to debrief groups that included doctors, feeling that they did not have the credibility to do so. As their debriefing experience increased, this became less of a problem, with the realisation that their knowledge and experience of non-technical skills (NTS) and debriefing outweighs that of most doctors.
On the other hand, some participants suggested that it was more a matter of trust in and respect for whoever is debriefing, regardless of profession, and that debriefing skill and experience outweigh clinical background. It was suggested that doctors and nurses have different experiences of simulation in their education—‘the nurses tend to have done less simulation so are more nervous and anxious of the debrief and so you kind of have to be a little bit gentler’. Interestingly, one debriefer with a nursing background also thought that doctors are ‘too nice’ to junior nurses and ‘tiptoe around them’, which leads to a poorer learning experience for them. These perceptions seem to stem from assumptions about background knowledge and feelings, and need to be addressed. Several participants (from both doctor and nursing backgrounds) suggested that having a nurse debriefing alongside the doctor(s) (i.e. an interprofessional debriefing team) would help with these perceptions and ensure an appropriate level of learning. However, there were difficulties in finding available and trained debriefers amongst the nursing staff. Barriers to nursing staff participating in simulation included difficulties securing protected education time for nurses, both as learners and as debriefers; doctors tended to have protected time and were more able to participate. One participant suggested that there was no recognition from ‘the system’ that interprofessional education is important and requires time, money and people.
It is important to mention that, while the difficulties around nursing involvement have been discussed, other professions face similar barriers—five papers [22, 24, 33, 34, 36] from the brief literature search that mentioned or discussed debriefing after IPS included participants and/or debriefers from other professions, e.g. physical therapy and pharmacy backgrounds. None of these was from the UK, and none of these other professions was mentioned in the interviews in this study at all (although it should be noted that the participants in this study all had either a doctor or a nursing background). This suggests that there is scope to investigate and involve other professions in IPS.
Table 3 summarises all of the challenges and strategies discussed around the topic of ‘the debriefer’.
Having learners in IPS with different backgrounds was perceived to introduce challenges to debriefing an interprofessional group. These backgrounds may be different professions, e.g. nursing or medical training, or different grades, from students up to consultant or senior nurse level. ‘Everyone is at a different level’ was one answer when asked generically about the challenges of debriefing after IPS, and many of the participants concurred. The differing learner backgrounds can cause complexities around what the learning outcomes for the session should be. For any IPS, most (if not all) of the learning outcomes are team based (often based around NTS, according to the interview participants). Despite this, the learners still all have individual learning needs.

Park and Holtschneider write that aiming debriefing questions at team behaviours and actions, rather than individual clinical behaviours, ensures that interprofessional learning objectives are met. On a similar note, when asked specifically about tactics to address this, the consensus from the interviews in this study supported leaving very individualised questions and learning until after the debriefing, either in a one-on-one conversation or via email. However, several participants suggested that most learning in these debriefing sessions should be applicable to all, even if related to the tasks of only one profession. One participant advised: ‘you’ve just got to pause for a minute before you debrief and make sure that you are going to talk about stuff that’s relevant to everybody or an equal balance’. Another said: ‘because we are a complete team, we need to know about all the different cogs in it’. This is likely to be situation dependent, however (along with most of the concepts discussed here)—ensuring that we are not devaluing or excluding learners by discussing things irrelevant to them is important.
Interestingly, there was no mention of interprofessional learning outcomes vs individual learner needs in the brief literature search (see Table 1)—potentially a topic that requires further exploration in IPS.
The interview participants acknowledged that most interprofessional debriefing sessions have a larger number of learners than single-profession debriefings. This leads to increased challenges with every aspect of debriefing, in particular having more individual agendas to manage. It can lead to the debriefer talking too much or becoming more ‘feedback’ focussed rather than debriefing—i.e. moving away from the learner-centred learning advocated for debriefing in general by Cheng et al. One participant phrased this as: ‘I would try and pull out what their learning is as opposed to try and inflict my teaching’.
Method of debriefing
Based on the previous paragraph, the debriefing session should be adjusted depending on several factors—learning outcomes, individual learner background and debriefer experience, amongst other things. This highlights the difficulty of providing a clear structure or debriefing framework applicable to all interprofessional debriefing, and suggests that this is not what we should be looking for. ‘Debriefing with good judgement’ is a strategy (rather than a framework) mentioned in the literature around debriefing after IPS [20,21,22, 26, 31, 33], suggesting that it is a useful tool. However, most interview participants stated that they had their own way of doing things, often based on frameworks (those mentioned include the Resuscitation Council’s ‘learning conversation’, the ‘Diamond’ model, and a local framework based on the ‘PEARLS’ framework) with small amendments developed over time.

One technique described was ‘pre-briefing’ before the simulation or debriefing—explaining why the simulation is happening, that it is not a test, and that it is about the team and system rather than individuals. This can avoid the perception that the simulation and debriefing are about a specific individual or group, with others there ‘to help’ rather than to learn. ‘Self-debriefing’ was another technique described by several participants: the learners have a 5–10-min chat amongst themselves, with no debriefer, to write down some of the learning points they want to discuss in the debriefing session. This ensured the debriefing was learner-centred and allowed the learners to relax and become familiar with talking in an unfamiliar group. Most participants felt that giving the learners time to relax and calm down was beneficial; however, one participant was keen to get the learners into the main debriefing session as soon as possible, talking while they were still ‘hot’, using this to get them to open up.
This suggests that several different techniques may work for different debriefers and learners, further reinforcing the point that there is no right or wrong answer. There was also agreement that keeping the learning points discussed to an achievable number (examples given ranged from three to six) avoids overloading the learners and gives them the chance to explore some key points meaningfully.
Table 4 summarises the challenges (mostly found under ‘the learner’) and strategies found around both the ‘method of debriefing’ and ‘the learner’.
The introduction of potential hierarchy gradients in an interprofessional group has been highlighted by several participants as a key difference in debriefing after IPS compared to a single-profession group of learners.
Interestingly, very few participants had ever noticed any direct problems with hierarchy in their own debriefing sessions, but all acknowledged the potential for it. One participant more clearly acknowledged the existing hierarchy, stating: ‘I still think there is a greater importance put on to the medical professionals as opposed to the nursing profession’. Multiple participants mentioned the term ‘flat hierarchy’ when referring to their area of specialty, particularly the ED and theatres, as a possible reason they had not had any direct issues with hierarchy. This is a commonly used term which ‘acknowledges that the contributions and opinions of all team members are crucial’. Several participants suggested that hierarchy may be more of a problem in other specialties, particularly surgery. This links back to assumed perceptions of specialties, which may or may not be true; it is difficult to comment further here, as there were no representatives from surgical specialties in this study. This may be an interesting avenue to explore further.
As all the interview participants are experienced debriefers and senior clinicians in their fields, their perceptions of hierarchy may not be the same as those of their learners, many of whom are likely to be more junior. To explore this further, information would need to be sought from these junior learners. Van Schaik et al. interviewed the learners in their debriefing sessions and found that hierarchy limited the discussion; however, this is a single study from the USA, which has a different set-up to the UK, and further exploration of this in the UK would be interesting.
Safe learning environment
Creating a ‘safe learning environment’ was discussed in every interview as the primary technique used to acknowledge and ensure the psychological safety of the learners. Many participants referred to a ‘spiel’ that they run through at the start of the debriefing to ensure that all learners feel safe and included. This was usually part of the ‘pre-brief’ discussed in the ‘Method of debriefing’ section. It was not specific to IPS, but several participants suggested that it helps to overcome some of the potential hierarchy issues, which may be more challenging in IPS. Things frequently mentioned as being covered in this ‘spiel’ included: ensuring that all learners are aware not to discuss the simulation or the debriefing session ‘outside these four walls’; ensuring that all learners are aware that it is a team simulation and debriefing, not aimed at any individual; encouraging all learners to participate, regardless of grade or profession; and making it clear that it is a learning experience not expected to go perfectly, with honest feedback and discussion occurring with the aim of learning, not as an attack on anyone. In their paper, van Schaik et al. did suggest that having this ‘safe environment’ was not enough to get around issues of hierarchy, but it is unclear what else may be required.
The only other strategy suggested by the interview participants was the physical set-up of the debriefing session: ensuring it took place in a quiet, private space with room for all the participants to sit, and making sure that they were not standing or sitting in their profession groups during the debriefing, to avoid a physical emphasis of this hierarchy. While there is little new in this strategy, it does emphasise the need for further research around hierarchy and safe learning environments in IPS and debriefing.
Inclusion of all learners
Another challenge noted was around including all learners. Several of the interview participants perceived that the quieter participants were quiet because they were worried about speaking in front of their colleagues or in a group, and that this anxiety was likely to be heightened in more junior members of staff, particularly when more senior staff were present. It was also acknowledged by a few that this can be a personality trait of the learner and does not necessarily mean that they do not feel included or are not learning.
Asking more junior members of the team to provide a summary at the start of the debriefing session was one way around this suggested by many of the participants. Asking someone more junior to do this ensured that they felt involved and that their contributions were valued and expected, as well as ensuring that the focus of the simulation and debriefing was not on one particular individual (e.g. the team leader) but on the team as a whole.
There were some opposing opinions about asking questions directly to individual learners to draw them into the discussion. Several participants suggested that this was a good method to ensure that everyone was involved, but others advised caution, concerned that it could cause discomfort and leave some learners feeling put on the spot. It seems this is something the debriefer must decide at the time, judging whether it will cause more harm than good based on their perception of the learner from the simulation.
Number of debriefers
The number of debriefers was a key discussion point in debriefing IPS. It has been included here again because there was a clear sub-theme, in both the literature search and the interviews, relating to the effect of more than one debriefer on the psychological safety of the learners. There were suggestions that an increased number of debriefers can have both positive and negative effects on the psychological safety of the learners. Paige et al. suggested that increasing the number of debriefers increased the psychological safety of participants because it reduced the time taken to ‘establish a learning environment’, which was noted to be associated with the psychological safety of the learners. In one of the interviews of this current study, having more than one debriefer was also suggested to be of benefit if a particularly distressed learner requires one-to-one support, allowing the debriefers to split up and continue the main debriefing session while supporting the struggling learner.
However, several of our interview participants had concerns or experiences that increasing the number of debriefers could cause difficulties for the learners. This was related to the number of debriefers potentially outnumbering or appearing to overpower the learners. Another concern was that an increased number of debriefers from different specialties may become too focussed on their own specific clinical areas which may cause confusion amongst the learners and poor flow in the debriefing—linking back to the discussion on interprofessional versus individual learning outcomes.
Strategies noted by the interview participants to gain the positive aspects of multiple debriefers without the negative effects included having a lead debriefer, planning the debriefing amongst all the debriefers, and keeping the number of learning points down (covered in ‘The debriefer’ section above). It was also noted that most of the participants advocated a maximum of two to four debriefers; so while having more than one debriefer has benefits, limiting their number appears to be another strategy to optimise the psychological safety of the participants and the flow of the debriefing.
Table 5 summarises the findings linked to the topic of psychological safety.
The primary limitation of this study is that, while attempts were made to recruit an interprofessional group of participants, this was only partially successful, with many of the participants being senior doctors, mostly in emergency medicine. Several findings from these and the other participants would be helpful to explore with a wider interprofessional group. As discussed in the ‘Nurses vs. doctors’ section, one reason it was difficult to recruit nurses and other professions to this study may be the potential barriers to their involvement in simulation and debriefing.
Many of the participants, regardless of profession, have mostly been involved in interprofessional debriefing after in-situ simulation in particular. Several have also been involved in life support courses and other sim-lab-based sessions; however, given the participants’ experience, many of the suggestions and conclusions are perhaps more applicable to the in-situ environment, although some do appear transferable.
An informal literature search was performed for background information. Both PubMed and Google Scholar were used, but other databases could also have been searched to ensure a more complete review of the literature. Other improvements could include having more than one author review the full literature and performing a more formal scoping or systematic review.
A single author analysed and discussed the data found in the literature and interviews, with review from and discussion with a second author. The views and opinions presented may therefore be individual-specific. Additional viewpoints would probably have created a richer discussion; having more than one person analyse the data was suggested by Gale et al., whose framework was followed in this study. However, from a constructivist viewpoint, this study does provide options and opinions that are valid additions to the current literature.
Multiple potential challenges, as well as some possible strategies to overcome them, have been identified in this study. Few of these answers are new information; most have been discussed in previous literature in other contexts. They have, however, been discussed here specifically in relation to IPS.
While there are several suggestions based on the research and opinions of the participants, the primary answer to many of the challenges and questions is, as stated multiple times by one of the interview participants, ‘it depends’. Many of the challenges and suggestions noted in this study are individual-dependent (based on the learner or the debriefer) or situation-dependent, and it is important to remain flexible and acknowledge that there is no correct answer for every situation, and there never will be. Personality and perceptions played a large part in many of the discussions, suggesting that individuality and our behaviour as people is a key factor in all of this, something that cannot easily be ‘fixed’ or changed. Many of the behaviours and preconceptions involved in IPS mirror cultural issues from clinical practice which are commonly overlooked. Debriefing after IPS could provide a forum for discussion of these issues and potentially trigger change from the ‘top down’ and the ‘bottom up’ with a group of interprofessional individuals from a spectrum of grades.
One particular recommendation based on this study would be the use of an interprofessional debriefing faculty for IPS. This could help provide credibility, from both a clinical and a debriefing aspect, across the debriefers. It would also ‘role model’ interprofessional working that can hopefully be transferred back into clinical practice. There is a danger in faculty from the wrong specialty leading a debriefing session: if they demonstrate that they have not considered or understood the experience and backgrounds of the different professions, learners could feel less valued, worsening hierarchy issues that likely already exist. There are also suggestions for adjustments to the debriefing structure or framework to ensure that the interprofessional team is considered. Although there is no specific debriefing framework to use, many individuals or departments have developed their own structure, although these often require deviation depending on the learners and the situation. This confirms that a structure should be followed but that there is no single ‘correct’ option.
Areas in which further research would be of particular interest include studies looking into hierarchy in debriefing after IPS from a debriefer and learner viewpoint. Investigating how to break these barriers down and how this hierarchy translates out of simulation to the workplace would be fascinating.
Availability of data and materials
The datasets generated and/or analysed during the current study are not publicly available due to requirement to delete audio data after transcription as it would not be anonymous. Written transcripts are available from the corresponding author on reasonable request.
Abbreviations
HRA: Health Research Authority
NHS: National Health Service
WHO: World Health Organisation
Reeves S, Fletcher S, Barr H, Birch I, Boet S, Davies N, et al. A BEME systematic review of the effects of interprofessional education: BEME Guide No. 39. Med Teach. 2016;38(7):656–68.
Gilbert JHV, Yan J, Hoffman SJ. A WHO report: Framework for action on interprofessional education and collaborative practice. J Allied Health. 2010;39(SUPPL. 1):196–7.
Reeves S, Perrier L, Goldman J, Freeth D, Zwarenstein M. Interprofessional education: effects on professional practice and healthcare outcomes. Cochrane Database of Syst Rev. 2013;(3). Art. No.: CD002213. https://doi.org/10.1002/14651858.CD002213.pub3.
Ker J, Bradley P. Simulation in medical education. In: Understanding Medical Education: Evidence, Theory and Practice. 2nd ed; 2013. p. 175–92.
McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003-2009. Med Educ. 2010;44(1):50–63 [cited 2017 Nov 15]. Available from: http://doi.wiley.com/10.1111/j.1365-2923.2009.03547.x.
Rudolph JW, Simon R, Dufresne RL, Raemer DB. There’s no such thing as “nonjudgmental” debriefing: a theory and method for debriefing with good judgment. Simul Healthc. 2006;1(1):49–55.
Cheng A, Palaganas J, Eppich W, Rudolph J, Robinson T, Grant V. Co-debriefing for simulation-based education: a primer for facilitators. Simul Healthc. 2015;10(2):69–75.
Cheng A, Morse KJ, Rudolph J, Arab AA, Runnacles J, Eppich W. Learner-centered debriefing for health care simulation education: lessons for faculty development. Simul Healthc. 2016;11(1):32–40.
Poore JA, Dawson JC, Dunbar DM, Parrish K. Debriefing interprofessionally: a tool for recognition and reflection. Nurse Educ. 2019;44(1):25–8.
Grant VJ, Robinson T, Catena H, Eppich W, Cheng A. Difficult debriefing situations: A toolbox for simulation educators. Med Teach. 2018;40(7):703–12.
Illing J. Thinking about research: theoretical perspectives, ethics and scholarship. In: Understanding Medical Education: Evidence, Theory and Practice. 2nd ed; 2014. p. 331–47.
Ng S, Lingard L, Kennedy TJ. Qualitative research in medical education: methodologies and methods. In: Understanding Medical Education: Evidence, Theory and Practice. 2nd ed; 2013. p. 371–84.
O’Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research: a synthesis of recommendations. Acad Med. 2014;89(9):1245–51.
Gale NK, Heath G, Cameron E, et al. Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med Res Methodol. 2013;13:117.
Edwards R, Holland J. What is qualitative interviewing? 2013.
Neergaard MA, Olesen F, Andersen RS, Sondergaard J. Qualitative description-the poor cousin of health research? BMC Med Res Methodol. 2009;9(1):1–5.
Sandelowski M, Caelli K, Ray L, Mill J. Focus on research method: whatever happened to qualitative description. Res Nurs Health. 2000;2(23):1–13.
Guest G, Namey E, Mitchell M. Sampling in qualitative research. In: Collecting Qualitative Data: A Field Manual for Applied Research; 2017. p. 41–74.
Boet S, Bould MD, Layat Burn C, Reeves S. Twelve tips for a successful interprofessional team-based high-fidelity simulation education session. Med Teach. 2014;36(10):853–7 [cited 2018 Jan 18]. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4245993/pdf/MTE-36-853.pdf.
Boet S, Dylan Bould M, Sharma B, Revees S, Naik VN, Triby E, et al. Within-team debriefing versus instructor-led debriefing for simulation-based education: a randomized controlled trial. Ann Surg. 2013;258(1):53–8.
Boet S, Pigford AA, Fitzsimmons A, Reeves S, Triby E, Bould MD. Interprofessional team debriefings with or without an instructor after a simulated crisis scenario: an exploratory case study. J Interprof Care. 2016;30(6):717–25. Available from: https://doi.org/10.1080/13561820.2016.1181616.
Brown DK, Wong AH, Ahmed RA. Evaluation of simulation debriefing methods with interprofessional learning. J Interprof Care. 2018;00(00):1–3. Available from: https://doi.org/10.1080/13561820.2018.1500451.
Hull L, Russ S, Ahmed M, Sevdalis N, Birnbach DJ. Quality of interdisciplinary postsimulation debriefing: 360° evaluation. BMJ Simul Technol Enhanc Learn. 2017;3(1):9–16.
Stockert B, Ohtake PJ. A national survey on the use of immersive simulation for interprofessional education in physical therapist education programs. Simul Healthc. 2017;12(5):298–303.
Paige JT, Zamjahn JB, Carvalho RB, Yang S, Yu Q, Garbee DD, et al. Quality with quantity? Evaluating interprofessional faculty prebriefs and debriefs for simulation training using video. Surgery. 2019;165(6):1069-1074.
Cheng A, Hunt EA, Donoghue A, Nelson-McMillan K, Nishisaki A, LeFlore J, et al. Examining pediatric resuscitation education using simulation and scripted debriefing: A multicenter randomized trial. JAMA Pediatr. 2013;167(6):528–36.
Endacott R, Gale T, O’Connor A, Dix S. Frameworks and quality measures used for debriefing in team-based simulation: a systematic review. BMJ Simul Technol Enhanc Learn. 2019;5(2):61–72.
Nyström S, Dahlberg J, Edelbring S, Hult H, Abrandt DM. Debriefing practices in interprofessional simulation with students: a sociomaterial perspective. BMC Med Educ. 2016;16(1):1–8. Available from: https://doi.org/10.1186/s12909-016-0666-5.
Sullivan S, Campbell K, Ross J, Thompson R, Underwood A, LeGare A, et al. Journal of Surgical Education. J Surg Educ. 2018;75(4):978–83.
Scherer K, Winokur RS. Multidisciplinary team training simulation in interventional radiology. Tech Vasc Interv Radiol. 2019;22(1):32–4. Available from: https://doi.org/10.1053/j.tvir.2018.10.007.
Kolbe M, Weiss M, Grote G, Knauth A, Dambach M, Spahn DR, et al. TeamGAINS: a tool for structured debriefings for simulation-based team trainings. BMJ Qual Saf. 2013;22(7):541–53.
Yang YY, Yang LY, Lee FY, Hwang SJ. DAA-based IIT simulation model enhances the interprofessional collaboration and team efficiency competency of health professionals. J Chin Med Assoc. 2019;82(3):169–71.
Meny LM, de Voest MC, Salvati LA. Assessment of student pharmacist learning within an interprofessional simulation: A comparison of small group vs. large group debriefing. Curr Pharm Teach Learn. 2019;11(5):533-537.
Richmond A, Burgner A, Green J, Young G, Gelber J, Bills J, et al. Discharging Mrs. Fox: a team-based interprofessional collaborative standardized patient encounter. MedEdPORTAL Publ. 2017;13:1–8.
Park CW, Holtschneider ME. Interprofessional simulation. J Nurses Prof Dev. 2016;32(1):44–6 Available from: http://content.wkhealth.com/linkback/openurl?sid=WKPTLP:landingpage&an=01709760-201601000-00009.
van Schaik S, Plant J, O’Brien B. Challenges of interprofessional team training: a qualitative analysis of residents’ perceptions. Educ Heal Chang Learn Pract. 2015;28(1):52–7.
Cardwell R, Davis J, Gray R, Hillel SA, McKenna L. How is clinical credibility defined in nursing? A systematic review. Collegian. 2020;27(1):23–33. Available from: https://doi.org/10.1016/j.colegn.2019.05.007.
Bullock I, Davis M, Lockey A, Mackway-Jones K. Feedback. In: Pocket guide to teaching for clinical instructors. 3rd ed; 2016. p. 59–68.
Jaye P, Thomas L, Reedy G. “The Diamond”: a structure for simulation debrief. Clin Teach. 2015;12(3):171–5 [cited 2018 Jan 3]. Available from: http://doi.wiley.com/10.1111/tct.12300.
Eppich W, Cheng A. Promoting Excellence and Reflective Learning in Simulation (PEARLS). Simul Healthc J Soc Simul Healthc. 2015;10(2):106–15.
Royal College of Physicians. Improving teams in healthcare Resource 2: Team culture. 2017. Available from: https://www.rcem.ac.uk/docs/External%20Guidance/ITIH%20R2%20Final.pdf
Thompson R, Sullivan S, Campbell K, Osman I, Statz B, Jung HS. Does a written tool to guide structured debriefing improve discourse? Implications for interprofessional team simulation. J Surg Educ. 2018:1–6. Available from: https://doi.org/10.1016/j.jsurg.2018.07.001.
The authors would like to acknowledge the following:
• The MSc Clinical Education team at the University of Edinburgh for their teaching and encouragement, particularly Dr Derek Jones for his assistance with some of the practicalities of this project.
• Mr. Mark Taylor for his assistance with proof-reading multiple drafts of this study.
• All of the interview participants who shared their experience and thoughts during interviews.
CH had half of the fees funded for the first two years of her MSc (this dissertation was completed in the non-funded 3rd year of the programme) by Leeds Teaching Hospital Trust Emergency Department. There was no direct funding for this study.
Ethics approval and consent to participate
Approval from the University of Edinburgh Medical Education Ethics committee was sought and received in January 2020 (see Appendix 1). Approval from the Health Research Authority was sought and received in March 2020 (see Appendix 2). Each NHS trust with potential participants working for it received the appropriate documents to approve and sign as per the HRA process, prior to initiating formal contact with the participants.
Consent for publication
This work was part of an MSc in Clinical Education for CH at the University of Edinburgh.
Holmes, C., Mellanby, E. Debriefing strategies for interprofessional simulation—a qualitative study. Adv Simul 7, 18 (2022). https://doi.org/10.1186/s41077-022-00214-3