
Supporting transitions in medical career pathways: the role of simulation-based education

Abstract

Transitions, or periods of change, in medical career pathways can be challenging episodes, requiring the transitioning clinician to take on new roles and responsibilities, adapt to new cultural dynamics, change behaviour patterns, and successfully manage uncertainty. These intensive learning periods present risks to patient safety. Simulation-based education (SBE) is a pedagogic approach that allows clinicians to practise their technical and non-technical skills in a safe environment to increase preparedness for practice. In this commentary, we present the potential uses, strengths, and limitations of SBE for supporting transitions across medical career pathways, discussing educational utility, outcome and process evaluation, and cost and value, and introduce a new perspective on considering the gains from SBE. We provide case-study examples of the application of SBE to illustrate these points and stimulate discussion.

Background

Transitions are inherent in medical education, training, and working life. However, evidence from healthcare and other literature indicates that transitions can be challenging for medical students and doctors, who report feeling underprepared in terms of technical and non-technical skills and who report high levels of associated stress [1–4]. Calls have been made for further formalised training to support doctors in tackling the specific challenges experienced during educational transitions [5]. In this paper, we discuss the potential utility of simulation-based education (SBE) as a mechanism to support transitions in medical careers. We provide some examples of how this is already happening and suggest ways to expand the use of SBE in terms of preparedness for clinical practice in the broadest sense. Whilst identifying concrete, specific outcomes of SBE for this purpose is not the focus of this paper, we situate our argument in the wider literature on how formal education and practice-based experiences contribute to the development of medical capacities and dispositions [6] and suggest ways to ensure maximum gain from SBE.

Simulation-based education

First, it is important to be clear what we mean by SBE. Simulation is a means of allowing deliberate hands-on practice of clinical skills and behaviours prior to, and alongside, entry into clinical environments. The aim of SBE is to develop safe clinicians by creating alternative situations and environments in which to learn skills and behaviours. SBE encompasses a breadth of approaches, from low-cost bench simulators to high-fidelity manikins, from simulated patients for learning communication skills to complex ward simulations and haptics.

Simulation is required in medical education for a number of reasons. The natural method of teaching clinicians advocated by Osler (1903)—unstructured clinical experience—was shown to be educationally ineffective [7]. The focus of medical education and training therefore shifted to a competency- or outcome-based model of teaching and learning, in which objectives and outcomes, assessment and feedback, and practice and supervision became the norm [8]. Concurrently, reduced availability of patients for teaching and learning medicine, due to changes in healthcare delivery [9], as well as increased emphasis on protecting patients from unnecessary harm [10], placed limits on the nature of patient contact, particularly for relatively inexperienced learners. Last but not least, in many jurisdictions, including Europe and, to a lesser extent, the USA, hours of training are now strictly controlled by working time legislation, which has further increased interest in alternative pedagogic paradigms.

SBE addresses all of these issues—decreasing reliance on training on real patients, allowing instant feedback for correcting errors and directing learning, optimising the use of valuable clinical time, enhancing the transfer of theoretical knowledge into the clinical context, and ensuring learners are competent before exposure to real patients [11–13]. SBE focuses on the needs of the learner: it allows clinical work patterns to be deconstructed so that learners can concentrate on mastering a particular skill or combination of skills, it can be calibrated to the needs and level of the individuals or teams involved, and its timing can be optimised to support skill development in a preparatory fashion [13]. Additionally, given that individuals find it difficult to reliably self-assess their level of preparedness—their strengths and weaknesses [14]—SBE incorporating rigorous, objective, and relevant measures of performance can contribute stronger predictive data regarding readiness for a role and help to identify areas requiring focused educational attention and self-directed learning prior to making a transition. Research has indicated a positive relationship between SBE and learning outcomes, including the development of technical and non-technical skills, confidence, and, critically, patient outcomes [15–18]. Indeed, a number of recent publications have identified that SBE can have a measurable, direct effect on a range of patient and public health outcomes, including lower ICU infection rates, fewer childbirth complications, and reduced post-operative complications and overnight stays [19–21].

How can SBE support transitions in medical education?

It is useful to consider the complexity of transitions before answering this question. As defined by Kilminster et al., the term transition refers to the process of change or movement between one state of work and another [22]. At the undergraduate level, transitions start with entry into medical school and then involve moving from non-clinical to clinical environments and rotating through different medical specialties, culminating for many in the transition from medical school to working as a junior doctor. Following graduation, junior doctors rotate from unit to unit, place to place, specialty to specialty, and then sub-specialty to sub-specialty, before moving on from being a senior trainee or resident to their first fully trained post. Each transition presents an intensive learning period, requiring the individual to adapt to new environments, each with its own values, norms, and beliefs [23], manage uncertainty, master unfamiliar equipment or technology, work with new colleagues, and perhaps take on new roles and responsibilities and/or work with unfamiliar patient groups. Given the potential “breadth” of unfamiliarity associated with the changing working environment, it is perhaps understandable that transition points present risks for patient safety [24] and may stifle progress in skill acquisition [25, 26].

The majority of research on transitions has focused on the move from medical school to junior doctor or internship (Foundation Year 1 in the UK), where the learning curve is steep and the challenges facing new doctors are well defined and relatively well understood. During this transition, the emphasis shifts from learning to balancing education with performing a role in the workplace. Research shows that new doctors often feel they lack the skills and competences for work upon graduation [2]. Studies from the UK context suggest that there are particular areas in which senior medical students or new doctors feel unprepared, such as clinical reasoning and making a diagnosis, diagnosing and managing acute medical emergencies, and prescribing, as well as competencies associated with non-technical skills such as communicating effectively in a multidisciplinary team, speaking up, prioritising patients, handover, and breaking bad news [1, 27–29]. (Note that this feeling of being thrown in at the deep end is not unique to medical graduates: the messages from the literature on the transition from student nurse to staff nurse are very similar [30].) There is a paucity of research on transitions later in medical training where, arguably, the role shifts are less dramatic, but what evidence there is suggests that the issues are broadly the same. Those transitioning from Foundation doctor to specialist trainee (intern to resident) often report a heavy focus on service delivery to the detriment of having time to learn and develop new skills, to pursue sub-specialty interests, and to gain exposure to the responsibility necessary for progressing in their roles [31]. Similarly, doctors transitioning from specialist trainee (senior resident) to consultant frequently recognise that they are deficient in several necessary non-clinical skills, e.g. supervision, handling complaints, decision-making, delegation, managing conflict, and providing feedback [32]. Although, as yet, there has been relatively little research on the effects of these later career transitions on doctors’ performance and patient safety, it would seem prudent to consider all transitions in medical education and training as challenging and with the potential to lead to harm if poorly managed by the individual and the system.

SBE can aid transitions by allowing medical educators to create the conditions in which learners can undertake the practice needed to acquire and maintain essential (pre-determined) skills, behaviours, and expertise [33]. Learners can rehearse specific skills and procedures, and practise broader tasks such as managing competing demands in acute settings, in artificially created environments which are designed to be authentic and to facilitate acquisition of expertise by individuals and teams via practice, assessment, and feedback [34]. By doing so, learners are better prepared for clinical practice and hence may be able to manage transitions more effectively. How does this work? Increased knowledge and skill obtained through SBE allows the necessary information to be accumulated and stored in long-term memory and drawn on as required, freeing working memory to focus on other aspects of the task in question. To borrow an example from Leppink et al. [35], a novice reviewing an x-ray for the first time may see a mass of different elements, all of which need to be processed to make sense of the x-ray. A more experienced learner, who has learned about x-rays and who has a preliminary cognitive schema of what to expect (in terms of physiology, anatomy, and imaging), can make sense of the x-ray more easily, leaving more cognitive resources to process other, unfamiliar aspects of the task. The same applies to practising any other skill, for example non-technical skills such as patient prioritisation or task delegation—once these skills have been rehearsed and incorporated into cognitive schemata in long-term memory, they become more automatic and the capacity to process additional information simultaneously increases. Thus, SBE prepares the individual to manage challenging and new situations by supporting them to learn parts of the puzzle in advance (so that information is in storage to draw on as required), thus freeing working memory to focus on what is new, novel, or unexpected [36]. In short, SBE draws on years of research into deliberate practice [8, 35] and cognitive load in a number of high-risk domains—not just medical education but also aviation, oil and gas, and energy [37, 38]—to create safe learning conditions for learners and, in turn, to ensure safety in real-life situations.

Whilst the theoretical basis of SBE is well recognised and researched, there has been less focus on using SBE effectively to support transitions in the medical career pathway. Doing so requires a number of considerations which draw on the wider literature on deliberate practice. First, the simulated scenarios or tasks must be linked to well-designed learning objectives which are appropriately aligned with the learner’s stage of training and with areas known to be problematic at that transition point [7, 11, 39]. Given that, for example, most UK medical students feel adequately trained in terms of basic medical knowledge, history taking, and certain clinical skills, but less confident in other areas, including non-technical skills such as prioritisation and teamwork [29], SBE to support the transition between medical school and internship could usefully focus on the latter areas. To give an example of using SBE to support the development of non-technical skills: a number of years ago, we identified that newly qualified doctors struggled with seeking help from senior staff in out-of-hours situations, particularly where communication was by telephone. We developed a simple simulation-based session which involved realistic scenarios, a phone, clinical staff taking the calls, feedback from those taking the calls and from faculty (observing the student making the call), and “handy hints”. Our data indicate that the block of teaching of which this is part (the “Professional Practice Block”) has been effective in graduating doctors who are more prepared for practice [40].

Second, for SBE to be effective, it should be integrated into the curriculum in a way that promotes transfer of the skills learnt to clinical practice. For example, it should be initiated at the appropriate educational moment(s); it should include processes for reinforcing learning (including immediate and informative feedback with a focus on areas of weakness); there should be opportunities to amend behaviours (i.e. time for reflection and consideration of current strategies, and repeat sessions to allow learning to be put into practice); and, ideally, it should be possible to track performance gains (or losses) using rigorous and objective performance measures [11, 41, 42]. An example of SBE which was grounded in observable difficulties at the time of transition from medical school and which included many of these essential features of deliberate practice was recently published by Thomas and colleagues (Table 1, Fig. 1) [43].

Table 1 Simulated ward round to support the transition from medical student to junior doctor

Fig. 1 A simulated ward round experience: bridging the gap between undergraduate and postgraduate years

Fewer SBE examples exist to support more senior transitions in a doctor’s career. This reflects the relative paucity of research on the challenges faced during these later transitions and is mirrored by the fact that, whilst the competencies required of junior doctors have generally been outlined by national training bodies, this is not typical for more senior transitions in medical careers. Nevertheless, some promising work is emerging. One relatively new approach to SBE is the “Boot Camp”. Boot Camps are modelled on the principles of a military boot camp: intensive, focused training using experiential learning and hands-on practice to acquire new skills and knowledge in a safe environment. Several US Boot Camps, designed to support interns transitioning into residency programmes and drawing heavily on SBE for technical skill development, report improvements in interns’ confidence levels and procedural skill acquisition following repeated exposure to clinical scenarios in a simulated setting [44–46]. A further example of a Boot Camp to support the transition from junior doctor into a surgical training programme in the UK (Fig. 2), one which includes a novel focus on developing core non-technical skills through SBE, is provided in Table 2 [47].

Fig. 2 Scottish Surgical Boot Camp programme, 2015

Table 2 Scottish Surgical Boot Camp

Projecting further ahead still, to the transition from specialty training to career grade positions (attending physician posts in the USA and consultant posts in the UK and Europe), we were unable to identify any SBE approaches designed to support the development of the new skills and behaviours necessary for mastery of these roles. Since the emerging literature is starting to highlight a need for more support with certain non-technical elements of these positions (e.g. supervision, delegation, influencing culture) [32], it may be time to define and specify the precise learning objectives that can usefully be addressed via deliberate practice and to plan SBE interventions that can support preparedness at these more senior stages.

What other considerations are important when planning SBE to support transitions?

First, in relation to feedback, it is critical to use trained faculty who are skilled in giving immediate, informative feedback and in engaging participants not just with SBE generally but also with its feedback component. More contemporary theories of feedback stress the importance of the learner in the feedback process, conceptualising feedback as a dialogic process in which effective feedback depends on learner engagement and on activation of the internal regulatory processes required for goal-directed learning [48]. Conceptualising feedback in this way sets expectations for the roles of both faculty and learners in the feedback process, which may differ from their experiences of feedback to date, and hence should be explicitly considered in the training and preparation components of any SBE. Related to this, and as discussed earlier, SBE must be matched with performance standards, educational objective(s), and opportunities for repeated practice (with feedback) in order to reach these standards. Moreover, it is essential to set up SBE in a way that recognises that different people require different amounts of practice to accomplish mastery of predefined educational objectives [49]. In other words, one learner might master the outcomes/objectives associated with a simulated anastomosis activity with relative ease, depending on their prior learning, hand-eye coordination, and so on, whereas another learner with different skills, knowledge, experience, and/or attitudes towards learning may need repeated practice.

Clearly, as stated above, SBE outcomes need to be defined in advance. When evaluating SBE, it is very important to go beyond acceptability (e.g. “8 out of 10 students reported that they enjoyed the [SBE]”) and towards outcomes based on measurable change in skill acquisition, whether the objectives are technical or non-technical skills [17, 18], all the way to translation into practice—“from VR-to-OR” [50]—and better outcomes for patients [19–21]. Clearly, it is much easier to collect evaluation data immediately after an SBE intervention than it is to follow up learners once they are out in clinical practice, but it is essential to plan long-term follow-up carefully, as otherwise it will be hard to justify the value of SBE to education providers and funders. One study assessed the impact of SBE on performance in the clinical environment in an Irish teaching hospital. SBE on ordering blood products was delivered to newly graduated medical students as part of a Boot Camp course prior to working as a junior doctor. The training was found to reduce the risk of a rejected sample by 65 % compared with junior doctors who did not receive the training. Moreover, the risk of a rejected sample for trained interns was 45 % lower than for much more experienced doctors. The untrained interns required more than 2 months of clinical experience to reach an error rate that was not significantly different from that of the trained interns [51].
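
As a purely illustrative aside, the arithmetic behind a relative risk reduction (RRR) of this kind can be made concrete as follows; the 10 % baseline rejection rate used below is a hypothetical figure chosen only for ease of calculation and is not a rate reported in the cited study [51].

$$\mathrm{RRR} = 1 - \frac{p_{\mathrm{trained}}}{p_{\mathrm{untrained}}}, \qquad p_{\mathrm{trained}} = p_{\mathrm{untrained}}\,(1 - \mathrm{RRR})$$

For example, if untrained junior doctors were assumed to have a sample rejection rate of 10 per 100 requests ($p_{\mathrm{untrained}} = 0.10$), a 65 % relative risk reduction would correspond to $p_{\mathrm{trained}} = 0.10 \times 0.35 = 0.035$, i.e. roughly 3.5 rejected samples per 100 requests.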

In addition to high-quality outcome evaluation, Moore et al. [52] discuss a number of benefits of complementing this with process evaluations (i.e. understanding how an intervention functions, e.g. How was it implemented? What are its mechanisms of effect?) in relation to complex interventions (and we believe that educational interventions, including SBE, can be considered complex interventions) [53]. Process evaluations can inform educators (e.g. was the simulated intervention delivered as intended, does it need to be redesigned in some way) and/or identify aspects of context which acted as barriers to the new learning being translated into clinical practice.

Value is usually related to cost [54]. Medical schools and medical training providers need to answer to governments, regulators, funders, and the public in terms of whether what they are delivering is fit for purpose. “Fit for purpose” can be considered from a number of perspectives. For example, are we producing the right doctors in terms of knowledge, skills, attitudes, and behaviours to meet the health needs of the communities they serve? Are we delivering these outcomes not just to a high standard but in a fiscally responsible way—can we justify a high-cost simulation over an apparently low-cost clinical experience? What are the gains from SBE that would be unobtainable, or unsafe, in more traditional models of teaching and learning? There is a need to develop an evidence base for SBE which encompasses “fitness for purpose” both educationally and fiscally. A high-cost, low-educational-value SBE is the worst of all possible outcomes. A low-cost, low-value SBE will not meet anyone’s needs in the long term, whereas a high-cost, high-value SBE would probably be acceptable. Thomas and colleagues [55] calculated the cost of their simulated ward round and realised that its high cost limited the feasibility of the simulation as originally designed. By identifying the main cost components, they were able to develop and evaluate a slightly different approach (e.g. group feedback rather than individual feedback). Costs were significantly reduced, but the positive response from participants was maintained. As these are recent studies, it is not yet known whether changing from individual to group feedback has a different impact on more distal outcomes such as clinical care practices. However, paying explicit attention to the cost of their SBE allowed Thomas and colleagues to consider other models of educational delivery without threatening the quality of their product.

Our next consideration is that of fidelity in the broadest sense. Much SBE has focused on individual skill development. However, healthcare is usually a team effort, and many of the problems noted in transitions concern team skills, e.g. communication with other members of the multidisciplinary team, supervision, and speaking up across professional hierarchies [28–30]. This means it is essential both to develop SBE team tasks and to develop outcomes that go beyond individual gains to team outcomes. These group objectives might be “hard” outcomes such as systemic improvements in team performance (e.g. fewer errors, cost savings, more efficient patient throughput), but it is also important to assess softer, process- and team-related outcomes, including the quality of inter-disciplinary teamwork in a global sense as well as specific team tasks such as communication during handovers, transfer of leadership, and task-based coordination [56]. Through the observation of teams working together in realistic scenarios and reflection on performance through debriefing (potentially facilitated by the use of video), SBE might also help to explore more complex team-based dynamics, such as social hierarchy, diversity, and divisions, which can be difficult to pick up in more traditional classroom-based approaches to inter-professional education [57]. In this sense, SBE might also be tailored to target specific team-based issues ongoing at a local level [58].

Finally, whilst it is absolutely critical to know what works in SBE and to understand how it does so in terms of individual, cognitive, and acquisitive learning, limiting the focus of research to outcome and effectiveness studies means that understanding of SBE will remain limited. There has been, to date, no acknowledgement in the literature that much SBE is inherently a social activity, bringing together groups of learners and faculty, away from the everyday clinical environment, sometimes in residential settings. By not recognising this explicitly, we have no understanding of how the relationships between faculty, participants, and activities during SBE influence learning [59], or of the nature or influence of the hidden curriculum [60]. Nor do we know about the influence of the particular cultural contexts in which SBE is situated, for example the wider socio-cultural, institutional, and historical settings and complexities of clinical training [61]. The need to extend the range of approaches to researching this field is real because, if SBE is based on limited models of learning, it risks inadequately preparing learners for practice. Moreover, where theoretical frameworks are lacking, explanations of simulation phenomena that could be elaborated and refined in future research may not be forthcoming. Indeed, some researchers have recently called for a reconceptualisation of simulation education more generally, drawing on contemporary practice and considering questions of learning in complex healthcare systems [62, 63].

Conclusions

Increasing doctors’ preparedness to perform the skills and behaviours required to fulfil the responsibilities of any new role is important for patient safety, service efficiency, and individual psychological well-being. Whilst true mastery of a role cannot be achieved until one is immersed in the workplace itself [6], the literature indicates that we can go some way towards preparing individuals for the technical and non-technical elements of any new role, and indeed the associated psychological challenges, through the judicious and imaginative use of SBE. In this paper, we have provided an overview of some of the key factors associated with planning and evaluating SBE for transitions. We have also highlighted a number of areas for future research in SBE to support medical career transitions. These include developing understanding of the practical factors to be considered when designing SBE, ranging from the delivery of feedback and the incorporation of longer-term outcome measures to analysis of the cost-effectiveness of the approach being undertaken, as well as of the socio-cultural influences on learning in simulated settings. We urge those working in SBE research to consider how best to identify and evaluate concrete, specific outcomes of SBE for this purpose. There remains a need for further investigation into the use of SBE to support the transition from medical student to junior doctor, but we urge those working in this area not to neglect the use of SBE to support later medical career transitions, where “learners” are working with less supervision and increasing responsibility, yet where (largely non-technical) issues pertinent to patient safety remain apparent.

References

  1. Teunissen PW, Westerman M. Junior doctors caught in the clash: the transition from learning to working explored. Med Educ. 2011;45(10):968–70.

  2. Cameron A, Miller J, Szmidt N, et al. Can new doctors be prepared for practice? A review. Clin Teach. 2014;11:188–92.

  3. Westerman M, Teunissen PW, Fokkema JPI, et al. The transition to hospital consultant and the influence of preparedness, social support and perception: a structural equation modelling approach. Med Teach. 2013;35(4):320–7.

  4. Chen A, Kotliar D, Drolet BC. Medical education in the United States: do residents feel prepared? Perspectives on Medical Education. 2015;4:181–5.

  5. Greenaway D. Shape of training: securing the future of excellent patient care. 2013. Available at: http://www.shapeoftraining.co.uk/static/documents/content/Shape_of_training_FINAL_Report.pdf_53977887.pdf. Accessed on 04.02.2016.

  6. Billett S. The practices of learning through occupations. In: Billett S, editor. Learning through practice: models, traditions, orientations and approaches. Netherlands: Springer; 2010. p. 59–81.

  7. McGaghie WC, Kristopaitis T. Deliberate practice and mastery learning: origins of expert medical performance. In: Cleland J, Durning SJ, editors. Researching medical education. Oxford, UK: Wiley Blackwell; 2015. p. 219–30.

  8. Harden RM. AMEE Guide No. 14: outcome-based education: part 1—an introduction to outcome-based education. Med Teach. 1999;21:7–14.

  9. Ker J, Bradley P. Simulation in medical education. In: Swanwick T, editor. Understanding medical education: evidence, theory and practice. London: Wiley-Blackwell; 2010. p. 164–80.

  10. Ziv A, Wolpe PR, Small SD, et al. Simulation-based medical education: an ethical imperative. Acad Med. 2003;78(8):783–8.

  11. Bearman M, Nestel D, Andreatta P. Simulation-based medical education. In: Oxford textbook of medical education. Oxford University Press, Oxford, UK. 2013. p. 186–97.

  12. Watson K, Wright A, Morris N, et al. Can simulation replace part of clinical time? Two parallel randomised controlled trials. Med Educ. 2012;46:657–67.

  13. Weller JM, Nestel D, Marshall SD, et al. Simulation in clinical teaching and learning. Med J Aust. 2012;196:1–5.

  14. Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med. 2005;80:S46–54.

  15. Riley W, Davis S, Miller K, et al. Didactic and simulation nontechnical skills team training to improve perinatal patient outcomes in a community hospital. Joint Comm J Qual Patient Saf. 2011;37(8):357–64.

  16. Fung L, Boet S, Bould MD, et al. Impact of crisis resource management simulation-based training for interprofessional and interdisciplinary teams: a systematic review. J Interprof Care. 2015;29(5):433–44.

  17. Knight LJ, Gabhart JM, Earnest KS, et al. Improving code team performance and survival outcomes: implementation of pediatric resuscitation team training. Crit Care Med. 2014;42(2):243–51.

  18. Bagai A, O’Brien S, Al Lawati H, et al. Mentored simulation training improves procedural skills in cardiac catheterization: a randomized, controlled pilot study. Circ Cardiovasc Interv. 2012;5(5):672–9.

  19. Barsuk JH, Cohen ER, Potts S, et al. Dissemination of a simulation-based mastery learning intervention reduces central line-associated bloodstream infections. BMJ Qual Saf. 2014;23:749–56.

  20. Zendejas B, Cook D, Bingener J, et al. Simulation-based mastery learning improves patient outcomes in laparoscopic inguinal hernia repair: a randomized controlled trial. Ann Surg. 2011;254:502–11.

  21. Inglis SR, Feier N, Chetiyaar JB, et al. Effects of shoulder dystocia training on the incidence of brachial plexus injury. Am J Obstet Gynecol. 2011;204(4):322-e1.

  22. Kilminster S, Zukas M, Quinton N, et al. Preparedness is not enough: understanding transitions as critically intensive learning periods. Med Educ. 2011;45:1006–15.

  23. Martin MJ. “That’s the way we do things around here”. An overview of organizational culture. Electronic Journal of Academic and Special Librarianship 2006;7(1).

  24. Vaughan L, McAlister G, Bell D. ‘August is always a nightmare’: results of the Royal College of Physicians of Edinburgh and Society of Acute Medicine August transition survey. Clin Med. 2011;11(4):322–6.

  25. Kellett J, Papageorgiou A, Cavenagh P, et al. The preparedness of newly qualified doctors—views of Foundation doctors and supervisors. Med Teach. 2015;37(10):949–54.

  26. Teunissen PW, Westerman M. Opportunity or threat: the ambiguity of the consequences of transitions in medical education. Med Educ. 2011;45(1):51–9.

  27. Goldacre MJ, Taylor K, Lambert TW. Views of junior doctors about whether their medical school prepared them well for work: questionnaire surveys. BMC Med Educ. 2010;10:78.

  28. Brennan N, Corrigan O, Allard J, et al. The transition from medical student to junior doctor: today’s experiences of tomorrow’s doctors. Med Educ. 2010;44:449–58.

  29. Tallentire VR, Smith SE, Skinner J, et al. Understanding the behavior of newly qualified doctors in acute care contexts. Med Educ. 2011;45:995–1005.

  30. Monaghan T. A critical analysis of the literature and theoretical perspectives on theory-practice gap amongst newly qualified nurses within the United Kingdom. Nurse Educ Today. 2015;35(8):1–7.

  31. Lambert T, Smith F, Goldacre MJ. Doctors’ views about their work, education and training three years after graduation in the UK: questionnaire survey. JRSM Open. 2015;6(12):2054270415616309.

  32. Morrow G, Burford B, Redfern N, et al. Does specialty training prepare doctors for senior roles? A questionnaire study of new UK consultants. Postgrad Med J. 2012;88:558–65.

  33. McGaghie WC, Issenberg SB, Cohen ER, et al. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86:706–11.

  34. Causer J, Williams AM. Professional expertise in medicine. In: Catheter-based cardiovascular interventions. Berlin Heidelberg: Springer; 2013. p. 97–112.

  35. Leppink J, van Gog T, Paas F, et al. Cognitive load theory: researching and planning teaching to maximise learning. In: Cleland J, Durning SJ, editors. Researching medical education. Oxford, UK: Wiley Blackwell; 2015. p. 207–18.

  36. Qiao YQ, Shen J, Liang X, et al. Using cognitive theory to facilitate medical education. BMC Med Educ. 2014;14:79.

  37. McAllister B. Crew resource management: the improvement of awareness, self-discipline, cockpit efficiency and safety. London: Crowood Press; 1997.

  38. Flin R, O’Connor P, Mearns K. Crew resource management: improving teamwork in high reliability industries. Team Performance Management. 2002;8(3/4):68–78.

  39. Nicholas C. Teaching with simulation. In: An introduction to medical teaching. Netherlands: Springer; 2014. p. 93–111.

  40. Thomson AR, Cleland JA, Arnold R. Communication skills in a multi-professional critical illness course. The Association for Medical Education in Europe (AMEE), Berne, 30 August–3 September 2003.

  41. Cook DA, Hamstra SJ, Brydges R, et al. Comparative effectiveness of instructional design features in simulation-based education: systematic review and meta-analysis. Med Teach. 2013;35(1):e867–98.

  42. Carter O, Brennen M, Ross N, et al. Assessing simulation-based clinical training: comparing the concurrent validity of students’ self-reported satisfaction and confidence measures against objective clinical examinations. BMJ Simulation and Technology Enhanced Learning 2016. In Press.

  43. Thomas IM, Nicol LG, Regan L, et al. Driven to distraction: a prospective controlled study of a simulated ward round experience to improve patient safety teaching for medical students. BMJ Qual Saf. 2014. doi:10.1136/bmjqs-2014-003272.

  44. Fernandez GL, Page DW, Coe NP, et al. Boot cAMP: educational outcomes after 4 successive years of preparatory simulation-based training at onset of internship. J Surg Educ. 2012;69:242–8.

  45. Krajewski A, Filippa D, Staff I, et al. Implementation of an intern boot camp curriculum to address clinical competencies under the new Accreditation Council for Graduate Medical Education supervision requirements and duty hour restrictions. JAMA Surg. 2013;148:727–32.

  46. Parent RJ, Plerhoples TA, Long EE, et al. Early, intermediate, and late effects of a surgical skills “boot camp” on an objective structured assessment of technical skills: a randomized controlled study. J Am Coll Surg. 2010;210:984–9.

  47. Cleland J, Walker KG, Gale M, et al. Simulation-based education: understanding the complexity of a surgical training boot camp. Medical Education 2016; In Press.

  48. Boud D, Molloy E. Feedback in higher and professional education: understanding it and doing it well. UK: Routledge; 2012.

  49. Wong BS, Krang L. Mastery learning in the context of university education. Journal of NUS Teaching Academy. 2012;2:206–22.

  50. Dawe SR, Pena GN, Windsor JA, et al. Systematic review of skills transfer after surgical simulation-based training. Br J Surg. 2014;101(9):1063–76.

  51. Joyce KM, Byrne D, O’Connor P, et al. An evaluation of the use of deliberate practice and simulation to train interns in requesting blood products. Simul Healthc. 2015;10(2):92–7.

  52. Moore GF, Audrey S, Barker M, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ. 2015;350:h1258.

  53. Mattick K, Barnes R, Dieppe P. Medical education: a particularly complex intervention to research. Adv Health Sci Educ. 2013;18(4):769–78.

  54. Brown C, Cleland JA, Walsh K. The costs of medical education assessment. Medical Teacher 2015; In Press.

  55. Ford H, Cleland J, Thomas I. Simulated ward round: reducing costs, not outcomes. The Clinical Teacher 2015; In Press.

  56. Medical Research Council. Developing and evaluating complex interventions: new guidance. Available at: http://www.mrc.ac.uk/documents/pdf/complex-interventions-guidance/. Accessed on 04.02.16.

  57. Palaganas JC, Epps C, Raemer DB. A history of simulation-enhanced interprofessional education. J Interprof Care. 2014;28(2):110–5.

  58. Decker SI, Anderson M, Epps C, et al. Standards of best practice: Simulation Standard VIII: Simulation-Enhanced interprofessional Education (SIM-IPE). Clinical Simulation in Nursing. 2015;11:293–7.

  59. Durning SJ, Artino AR. Situativity theory: a perspective on how participants and the environment can interact: AMEE Guide no. 52. Med Teach. 2011;33(3):188–99.

  60. Mahood SC. Medical education: beware the hidden curriculum. Can Fam Physician. 2011;57(9):983–5.

  61. Koens F, Mann KV, Custers EJFM, et al. Analysing the concept of context in medical education. Med Educ. 2005;39(12):1243–9.

  62. Fenwick T, Dahlgren MA. Towards socio-material approaches in simulation-based education: lessons from complexity theory. Med Educ. 2015;49:359–67.

  63. Bleakley A, Bligh J, Browne J. Medical education for the future: identity, power and location. London, New York: Springer; 2011.

  64. Weigl M, Muller A, Zupance A, et al. Hospital doctors’ workflow interruptions and activities: an observation study. BMJ Qual Saf. 2011;20(6):491–7.

  65. Pottier P, Dejoie T, Hardouin J, et al. Effect of stress on clinical reasoning during simulated ambulatory consultations. Med Teach. 2013;35(6):472–80.

  66. Westbrook J, Woods A, Rob M, et al. Association of interruptions with an increased risk and severity of medication administration errors. Arch Intern Med. 2010;170(8):683–90.

  67. Ryan C, Ross S, Davey P, et al. Prevalence and causes of prescribing errors: the Prescribing Outcomes for Trainee Doctors Engaged in Clinical Training (PROTECT) study. PLoS ONE. 2014;9(1):e79802.

  68. Liu D, Grundgeiger T, Sanderson PM, Jenkins SA, Leane TA. Interruptions and blood transfusion checks: lessons from the simulated operating room. Anesth Analg. 2009;108(1):219–22.

  69. Thomas IM. Student views of stressful simulated ward rounds. Clin Teach. 2015. doi:10.1111/tct.12329.

  70. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27:10–28.

  71. Yule S, Flin R, Paterson-Brown S, Maran N, Rowley D. Development of a rating system for surgeons’ non-technical skills. Med Educ. 2006;40:1098–104.

  72. Cleland J et al. Understanding the sociocultural complexity of a surgical simulation “Boot Camp”. Med Educ. 2015, in press

Download references

Funding

No funding was required for the preparation of this manuscript.

Author information

Corresponding author

Correspondence to Stephanie Russ.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

All authors made a significant contribution towards the conception, design, and content of this paper. Whilst no primary data are presented, authors KW and IT specifically contributed the two case studies presented in Tables 1 and 2. Authors JC, SR, RP, and PO completed the first draft of the paper, which was then critically revised by all authors for intellectual content. All authors have given approval for this version to be published and agree to be held accountable for all aspects of the work. All authors read and approved the final manuscript.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Cleland, J., Patey, R., Thomas, I. et al. Supporting transitions in medical career pathways: the role of simulation-based education. Adv Simul 1, 14 (2016). https://doi.org/10.1186/s41077-016-0015-0
