A simulated “Night-onCall” to assess and address the readiness-for-internship of transitioning medical students

Abstract

Transitioning medical students are anxious about their readiness-for-internship, as are the residency program directors and teaching hospital leaders responsible for care quality and patient safety. A readiness-for-internship assessment program could help ensure optimal quality and safety and be a key element in implementing competency-based, time-variable medical education. In this paper, we describe the development of the Night-onCall program (NOC), a 4-h, multi-instructional-method simulation event addressing readiness-for-internship. NOC was designed and implemented over the course of 3 years to provide an authentic “night on call” experience for near-graduating students and to build measures of students’ readiness for this transition, framed by the Association of American Medical Colleges’ Core Entrustable Professional Activities for Entering Residency. NOC is a product of a program of research focused on questions related to enabling individualized pathways through medical training. The lessons learned and modifications made to create a feasible, acceptable, flexible, and educationally rich NOC are shared to inform the discussion about transition-to-residency curricula and best practices regarding educational handoffs from undergraduate to graduate medical education.

Introduction

“It still doesn’t quite feel like I am able to jump in and start on July 1…the nurses expect you to be the doctor, the patients expect you to be the doctor, your colleagues expect you to be the doctor”.

~4th year medical student 2 weeks before graduation expressing anxiety about transitioning to residency.

“We get to see July 1st as medical students and get to see how a lot of Interns really struggle with some basic skills”.

~3rd year medical student a year before graduation voicing concern about transitioning to residency.

Medical students transitioning from undergraduate medical education (UME) to graduate medical education (GME, also referred to as “residency” or “internship”) experience uncertainty and distress about their readiness-for-internship [1,2,3]. This lack of readiness may be partially responsible for the “July effect”—a reported 10% increase in fatal medical errors in North American teaching hospitals when these new graduates enter the workforce each July [4]. Residency program directors are just as anxious about integrating incoming graduates into a fast-paced and complex health care system because they know that clinical experience and competence during the senior year of medical school vary, both within a single school and across institutions [5,6,7], and a new resident class is typically made up of graduates of many medical schools. This heterogeneity in readiness has led residency programs and hospital leadership to implement orientation programs and increase supervision to ensure patient care quality and safety as new trainees learn to function effectively in their new roles [8, 9]. Some medical schools have also implemented transition courses; however, these are generally organized by clinical discipline [10]. A clinical discipline-agnostic readiness-for-internship program, administered just prior to medical school graduation, would serve many important purposes, including (1) preparing near-graduate medical students for a smooth and safe transition to residency, (2) building an assessment program with the intention of ultimately benchmarking and reporting readiness-for-internship metrics regardless of clinical discipline, and (3) providing a meaningful educational handoff between UME and GME in the USA and beyond.

A competency-based readiness-for-internship assessment program is both timely and critical to the UME-GME continuum [10]. In recent years, hospital patient safety and quality assurance committees and residency program directors have been called upon by accrediting agencies, malpractice insurance companies, and the general public to demonstrate that residents are capable of providing the level of care they are assigned to deliver. Residency Review Committees, the clinical discipline-specific accreditation bodies of the US Accreditation Council for Graduate Medical Education (ACGME), have provided guidelines outlining what a first-year resident can and cannot do without direct supervision until competency has been documented [11]. In 2014, the Association of American Medical Colleges (AAMC), which represents accredited medical schools in the USA, released a set of 13 core Entrustable Professional Activities (EPAs) for entering residency (Core EPAs) (see Fig. 1). EPAs are units of professional practice a trainee can be trusted to accomplish unsupervised once he or she has demonstrated sufficient and specific competence. The authors of the Core EPAs provided detailed guidance meant to drive the community toward refining, measuring, and benchmarking the minimal level of competence expected of a medical school graduate [12]. As yet, there is little consensus on how to assess the Core EPAs of new residents or what type of transition documentation (or “handoff”) to residency programs would be meaningful [13, 14].

Fig. 1

The 2016 NOC activities were tailored to capture and assess the 13 Core EPAs for medical students transitioning to residency. For a complete version of the Core Entrustable Professional Activities for Entering Residency, please go to: www.mededportal.org/icollaborative/resource/887

Although ensuring readiness-for-internship is challenging, there are unacceptable negative consequences for patients, institutions, programs, and for the individual professional if “onboarding” is not done effectively. Simulation has a critical role to play in both reducing the risk of iatrogenic harm to patients [15, 16] and assessing fundamental clinical competence critical to creating an institutional culture of safety [17,18,19,20]. Ideally, with the implementation of a meaningful simulation-based assessment program just prior to medical school graduation, actionable formative feedback can be provided to both the learner and GME Program Directors to achieve these goals.

In this paper, we describe in detail the development of a complex, immersive simulation, the Night-onCall (NOC). We believe that NOC is an innovative program for a number of reasons: (1) it was designed iteratively, in response to specific local needs and evolving research questions; (2) it can be reproduced at most medical schools without sophisticated simulation facilities; and (3) it provides an authentic educational experience for transitioning medical students and is likely to enable high-value assessment of the Core EPAs.

Developing NOC

Conceptual framework underlying NOC

NOC is a multi-station experience in an Objective Structured Clinical Examination (OSCE) format [21,22,23]. Since the 1960s, OSCEs utilizing standardized (a.k.a. “programmed,” “simulated”) patients to assess core clinical skills have become a ubiquitous part of medical education assessment programs, used worldwide in a vast array of formats and for a variety of purposes, including physician licensing examinations. NOC aligns with literature that supports the utility of a well-designed OSCE as an assessment of clinical competence, assuming careful attention is paid to “contextual fidelity,” which includes the interprofessional nature of most medical work and accurate “professional role reproduction” [24]. NOC is the current focus of a research program in which we explore the measurement of clinical competence for the purpose of supporting increasingly individualized pathways through medical training [25].

The team

NOC was developed by a multidisciplinary, interprofessional team consisting of physician, nurse, medical librarian, and PhD-prepared educators from Emergency Medicine, Internal Medicine, Surgery, and Obstetrics and Gynecology. Our team, as a whole, has extensive expertise in using simulation in undergraduate and graduate medical and nursing education.

Table 1 details how we incrementally developed NOC over a 3-year period into a complex, multimodal, immersive simulation and summarizes our experience. The individual components of the 2016 NOC experience were designed and refined to address and assess each of the 13 Core EPAs. Figure 2 illustrates what a medical student would experience in the 2016 iteration of NOC.

Table 1 Development of the Night-onCall (NOC) event
Fig. 2

The Night-onCall experience from the student’s perspective

Types of NOC assessments

Web-based multimedia module

In response to the increasing focus on medical students’ readiness for residency, and based on 10 years of experience building and studying WISE-MD, a web-based core surgery clerkship curriculum [26], our team created WISE-onCall, a set of web-based multimedia modules targeted at enhancing the ability of novices to address common clinical coverage issues. The modules are designed within a cognitive apprenticeship framework [27]: each starts with two “partially worked” case examples, including video demonstrations of interprofessional interaction that employ the instructional strategies of modeling, coaching, scaffolding, and fading of instructional guidance, followed by three text-based practice cases in which the learner applies diagnostic skills and obtains feedback. To date, eight WISE-onCall modules have been completed, with plans to build at least five more in the next 2 years [28]. For NOC, we selected the Oliguria (low urine output) WISE-onCall module because it is a topic with which all students are likely to have basic familiarity by the end of medical school and a condition that interns in all clinical disciplines can expect to encounter during a typical night on inpatient call. (Addresses EPAs 1, 2, 3, 4, 9, 10, and 12.)

Performance-based assessment (PBA)

Initially, in 2014, we designed two standardized patient (SP) and standardized nurse (SN) cases of relatively equal difficulty (case no. 1, case no. 2) to serve as pre- and post-assessments for the WISE-onCall module. In 2015, we developed two additional SP/SN cases (case no. 3, case no. 4) in order to explore how clinical case content concordance and the sequencing of the PBA and WISE-onCall impacted performance across the simulation activities [31]. In 2016, we revised cases no. 3 and no. 4 (cases 3r and 4r) to enable us to address, assess, and align the PBAs to the Core EPAs (see Table 1 for details).

Learners’ clinical skills, including interprofessional teamwork, were assessed using SP/SN-completed checklists developed on the basis of extensive prior research [22]. Clinical reasoning was assessed from student-completed patient coverage notes, scored by a clinician using a rubric [29, 30]. Rigorous methods were employed to develop the SP/SN roles and checklists, as well as to recruit, train, and calibrate actors for both case portrayal (3 h) and rater reliability (3 h) [23]. (Addresses EPAs 1, 2, 3, 4, 5, 9, 10, and 12.)
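Calibration of this kind is typically verified quantitatively. As a minimal illustration only, and not the study team's actual analysis, the Python sketch below computes percent agreement and Cohen's kappa for two raters scoring the same encounter on a done/not-done checklist; the checklist ratings are invented for demonstration.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters rating the same items (any label set)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of items where the raters match
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal label frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(freq_a) | set(freq_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    if expected == 1.0:  # both raters used a single identical label throughout
        return 1.0
    return (observed - expected) / (1 - expected)

# Hypothetical done/not-done ratings of one videotaped encounter by two raters
rater_1 = ["done", "done", "not_done", "done", "not_done", "done", "done", "done"]
rater_2 = ["done", "done", "not_done", "done", "done",     "done", "done", "not_done"]

agreement = sum(a == b for a, b in zip(rater_1, rater_2)) / len(rater_1)
print(f"percent agreement: {agreement:.2f}")   # 0.75
print(f"Cohen's kappa:     {cohens_kappa(rater_1, rater_2):.2f}")  # 0.33
```

Kappa corrects raw agreement for chance, which matters when most checklist items are usually "done"; a calibration session might set a threshold (often kappa above roughly 0.6) before an actor is cleared to rate independently.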

Oral presentation

Experienced physicians from the study team played the role of a standardized attending (SA) for case no. 1. The SA received a phone call from the study participant following the case no. 1 clinical encounter. A detailed guide to the case and the task was provided; specifically, the guide included the clinical details of the case and a set of standardized prompts to be used to encourage the learner to share their clinical reasoning and establish a management plan. The SA was also responsible for assessing the quality of the oral presentation, using a checklist designed from the detailed description of Core EPA no. 6 [31], and for making an entrustment judgment.

Evidence-based medicine activity

Following case no. 3, learners, seated in front of a computer with Internet access, were given 10 min to define a clinical question based on the case and instructed to use Web-based resources to find the best answer to the clinical question (e.g., “What is the best initial management for urgent hypertension?”). A computer program allowed a medical librarian to remotely observe the learner’s progression through the activity, both in real time and from a recording. Using this approach, the medical librarian was also able to assess the learner’s ability to formulate a clinical question and to use digital resources to identify high-quality evidence to guide the patient’s care, as described by EPA no. 7.

Patient handoff

We recruited senior medical students to play the role of the standardized intern (SI) taking over the clinical service. Each SI was trained to use a structured evaluation instrument, modified from a published instrument, to assess the quality of the handoff [32] as well as to provide an entrustment judgment (EPA no. 8).

Culture of safety exercise

Participants were first given time to read a detailed vignette describing a pre-entrustable intern’s approach to a series of common quality and safety challenges on an inpatient ward [31]. Then, in written responses to open-ended prompts, the participants listed the intern’s behaviors and attitudes that interfered with a culture of safety and suggested actions needed for systems improvement. A faculty member (GN) assessed students’ written responses using a rubric based on the description of the AAMC’s EPA no. 13 [33].

Recruitment of students

For all phases of NOC, we recruited near-graduate medical students by email. If the student agreed to participate, he or she could sign up for a scheduled slot in the simulation center by clicking on a Uniform Resource Locator (URL) embedded in the recruitment email. Study staff then confirmed the date with the participant and provided him or her with background information about the study via email. Participation was entirely voluntary; written informed consent was obtained, and a financial incentive was provided.

Resources needed to implement NOC

The NOC experience was hosted by the New York Simulation Center [34]. We estimate our cost-per-student for NOC to be around $500 (US). This includes SP/SN salaries and staff time for planning and running the event, including SP training, student recruitment, and scheduling. This estimated cost does not include a facility fee, the study incentive, case development time, patient note scoring, data entry and management, and physician preceptor time.
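To adapt this estimate to local rates, a simple spreadsheet-style model suffices. The sketch below is purely illustrative: all line items, hours, and hourly rates are assumptions invented for demonstration, chosen only so the output lands near the roughly $500-per-student figure reported above; none are actual NOC budget figures.

```python
# Hypothetical per-session cost model for an 8-student NOC event.
# Every line item, hour count, and hourly rate is an assumed value.
students_per_session = 8

line_items = {
    # description: (hours, hourly_rate_usd)
    "SP/SN actors (6 actors x 10 h, incl. training)": (6 * 10, 35.0),
    "planning and event coordination staff":          (30, 35.0),
    "event-day operations staff (2 staff x 5 h)":     (2 * 5, 45.0),
    "student recruitment and scheduling":             (12, 35.0),
}

total = sum(hours * rate for hours, rate in line_items.values())

for desc, (hours, rate) in line_items.items():
    print(f"{desc:50s} ${hours * rate:9,.2f}")
print(f"{'total':50s} ${total:9,.2f}")
print(f"cost per student: ${total / students_per_session:,.2f}")  # ~$502.50
```

As the text notes, facility fees, participant incentives, case development, note scoring, data management, and physician preceptor time would sit on top of such a model.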

Lessons learned

In building this experience, we have learned many lessons that may be of interest to others seeking to build similar assessment events. While we do not currently share any assessment data with students, we ultimately seek to use the competency assessments and entrustment judgments to give students feedback on their readiness and to provide a handoff to residency training program directors. The following is what we have learned so far.

NOC is feasible

As we have demonstrated, it is feasible to host a NOC event for a large number of students. However, this can only be done with a champion in leadership, adequate funding, and committed professional and administrative personnel. The team met weekly for the 3 years it took to develop materials, pilot, and refine the program. Staging the full NOC event required several months of planning, which included scheduling space; recruiting and training the actors, faculty, and students playing roles; and recruiting and scheduling participants. Data entry, cleaning, analysis, and interpretation also required adequate resources. Advanced simulation facilities or equipment were not required to host the NOC.

NOC is acceptable

NOC is an immersive, complex, mixed-modality simulation experience aimed at creating an authentic opportunity to rehearse being an intern “on call.” Although more work will be required to establish the program as an effective means of measuring students’ readiness-for-internship for high-stakes purposes, participants routinely expressed, in the debriefing portion of the NOC experience, that it helped them better understand their readiness and identify knowledge and skill gaps prior to their transition.

NOC is a flexible structure

Depending on local needs and resources, there are ways to modify the program to reduce cost and shorten the time needed while still achieving the same objectives. Based on our experience, we believe the EPA framework allows for a great deal of creativity and innovation. For instance, we chose to assess EPA no. 13 using a written assessment of a paper case rather than the complex simulation others have used [35]. Other schools prepare students for a night on call by integrating assessments into their required advanced clerkships [36]. In the future, we plan to conduct head-to-head comparisons of various strategies to better understand their relative educational and assessment value and costs.

NOC will likely produce valuable information

One goal of the analysis of our experience and data is to understand the educational value of the components of NOC. From the point of view of the students who volunteered to participate, this low-stakes experience was almost uniformly seen as time well spent, educational, and anxiety-reducing. This may change as we refine the competency measures and entrustment judgments and start providing detailed feedback. Twelve years ago, our school established a Comprehensive Clinical Skills Exam (CCSE), an 8-station OSCE serving as a final performance exam for the core clinical clerkships. Like NOC, the CCSE began as an assessment for learning, a formative experience, and it was very popular with and much appreciated by students. Once we transitioned the CCSE to an assessment of learning (as defined by van der Vleuten et al. [37]), a summative, high-stakes experience in which students were required to pass the exam, its popularity and the enthusiasm among students decreased. We suspect this may be an inevitable trade-off for some students, but we hope to engage students in embracing the value of the data produced by NOC.

Next steps for NOC

We are currently experimenting to find effective ways to visualize the NOC data and report it to students to guide their preparation for internship. Despite desiring educational handoff information about their incoming interns, residency program directors are suspicious of assessments done in the undergraduate setting and do not yet “trust” evidence of readiness [1]. With this in mind, we are exploring how, if at all, residency program directors would find this type of performance data useful for planning supervision during the transition months, given that in the USA they are contractually committed to training incoming residents at the time of medical school graduation.

We are also exploring how best to understand the entrustment judgments generated in NOC [38, 39] and adding self-assessment measures (e.g., context-specific self-efficacy, affect, and cognitive load measures) to examine the value of experiences like NOC for understanding a student’s metacognitive capabilities [40], which are thought to be crucial to the lifelong learning a career in medicine requires in the twenty-first century.

Is NOC a valid approach to enhance readiness for internship?

We embraced the complexity and context-based nature of competence in building NOC. As a consequence, establishing validity and setting standards for the NOC outcome data for the purpose of high-stakes promotion decisions will require a great deal of work; our team is currently working toward this goal. NOC’s design is grounded in both a conceptual framework (situated, mixed-modality clinical experiences in an immersive simulation) and a content framework (the Core EPAs, created through national consensus and endorsed by the AAMC). When available, we based our assessment instruments on tools with previously reported internal validity data, and we are working to ensure acceptable reliability across all of our assessments. NOC balances the difficulty of achieving highly reliable measures against the fact that we generate a large number of assessments of each student from a variety of perspectives (patient, nurse, expert, peer), in effect a simulated 360° workplace assessment. We plan to follow some of our subjects forward into the first year of residency and beyond to see whether the strengths and weaknesses identified during the NOC experience are associated with adjustment to internship and demonstrated skills and, in the longer run, to study whether NOC predicts success in residency training and beyond.
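To illustrate what aggregating such multi-source data might look like in practice, the sketch below rolls hypothetical checklist scores from the four rater perspectives up into a per-EPA profile for one student. The data structure, EPA labels, and scores are invented for illustration and do not reproduce NOC's actual instruments or scoring.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical ratings: (rater_perspective, EPA, fraction_of_checklist_items_done)
ratings = [
    ("patient", "EPA-1", 0.85), ("nurse",   "EPA-1", 0.70),
    ("expert",  "EPA-6", 0.60), ("peer",    "EPA-8", 0.90),
    ("nurse",   "EPA-9", 0.75), ("patient", "EPA-1", 0.80),
]

# Group scores by EPA while keeping each contributing perspective visible
by_epa = defaultdict(list)
for perspective, epa, score in ratings:
    by_epa[epa].append((perspective, score))

# A simple readiness profile: mean score and contributing perspectives per EPA
for epa in sorted(by_epa):
    scores = [s for _, s in by_epa[epa]]
    perspectives = sorted({p for p, _ in by_epa[epa]})
    print(f"{epa}: mean={mean(scores):.2f} "
          f"(n={len(scores)}, perspectives={', '.join(perspectives)})")
```

Keeping the perspectives attached to each score, rather than averaging everything immediately, preserves the 360° character of the assessment: a student can see, for example, that nurses and patients rated an encounter differently.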

The NOC program has already resulted in curriculum changes. For example, our clerkships and sub-internships have incorporated WISE-onCall modules and related exercises that address clinical reasoning and provide examples of professional behavior, teamwork, and communication.

Conclusion

If the AAMC Core EPAs are to become the standard by which we assess transitioning students’ preparedness for residency, we will need to assure that all of our students reach those standards and can continue to perform at that level at the time they transition to graduate medical education. Building programs like NOC will also enable medical schools to move toward the competency-based, time-variable curriculum that many now believe is the best way forward [41,42,43]. We have described a program for achieving these goals that is feasible, acceptable, flexible, and likely to produce valuable information for learners, educational leaders, and policy makers.

References

1. Sozener CB, Lypson ML, House JB, Hopson LR, Dooley-Hash SL, Hauff S, et al. Reporting achievement of medical student milestones to residency program directors: an educational handover. Acad Med. 2016;91:676–84.

2. Minter RM, Amos KD, Bentz ML, Blair PG, Brandt C, D'Cunha J, et al. Transition to surgical residency: a multi-institutional study of perceived intern preparedness and the effect of a formal residency preparatory course in the fourth year of medical school. Acad Med. 2015;90:1116–24.

3. Teunissen PW, Westerman M. Opportunity or threat: the ambiguity of the consequences of transitions in medical education. Med Educ. 2011;45:51–9.

4. Petrilli CM, Del Valle J, Chopra V. Why July matters. Acad Med. 2016;91:910–2.

5. Goren EN, Leizman DS, La Rochelle J, Kogan JR. Overnight hospital experiences for medical students: results of the 2014 clerkship directors in internal medicine national survey. J Gen Intern Med. 2015;30:1245–50.

6. Frischknecht AC, Boehler ML, Schwind CJ, Brunsvold ME, Gruppen LD, Brenner MJ, et al. How prepared are your interns to take calls? Results of a multi-institutional study of simulated pages to prepare medical students for surgery internship. Am J Surg. 2014;208:307–15.

7. Schwind CJ, Boehler ML, Markwell SJ, Williams RG, Brenner MJ. Use of simulated pages to prepare medical students for internship and improve patient safety. Acad Med. 2011;86:77–84.

8. Sachdeva AK, Loiacono LA, Amiel GE, Blair PG, Friedman M, Roslyn JJ. Variability in the clinical skills of residents entering training programs in surgery. Surgery. 1995;118:300–8; discussion 308–9.

9. Lyss-Lerman P, Teherani A, Aagaard E, Loeser H, Cooke M, Harper GM. What training is needed in the fourth year of medical school? Views of residency program directors. Acad Med. 2009;84:823–9.

10. Antonoff MB, Swanson JA, Green CA, Mann BD, Maddaus MA, D'Cunha J. The significant impact of a competency-based preparatory course for senior medical students entering surgical residency. Acad Med. 2012;87:308–19.

11. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med. 2012;366:1051–6.

12. Englander R, Flynn T, Call S, Carraccio C, Cleary L, Fulton TB, et al. Toward defining the foundation of the MD degree: core entrustable professional activities for entering residency. Acad Med. 2016;91:1352–8.

13. Santen SA, Rademacher N, Heron SL, Khandelwal S, Hauff S, Hopson L. How competent are emergency medicine interns for level 1 milestones: who is responsible? Acad Emerg Med. 2013;20:736–9.

14. Warm EJ, Englander R, Pereira A, Barach P. Improving learner handovers in medical education. Acad Med. 2017;92(7):927–31.

15. Barsuk JH, Cohen ER, Wayne DB, Siddall VJ, McGaghie WC. Developing a simulation-based mastery learning curriculum: lessons from 11 years of advanced cardiac life support. Simul Healthc. 2016;11:52–9.

16. Cohen ER, Feinglass J, Barsuk JH, Barnard C, O'Donnell A, McGaghie WC, et al. Cost savings from reduced catheter-related bloodstream infection after simulation-based education for residents in a medical intensive care unit. Simul Healthc. 2010;5:98–102.

17. Cleland J, Patey R, Thomas I, Walker K, O’Connor P, Russ S. Supporting transitions in medical career pathways: the role of simulation-based education. Adv Simul. 2016;1:14.

18. Hallas D, Biesecker B, Brennan M, Newland JA, Haber J. Evaluation of the clinical hour requirement and attainment of core clinical competencies by nurse practitioner students. J Am Acad Nurse Pract. 2012;24:544–53.

19. Ker J, Mole L, Bradley P. Early introduction to interprofessional learning: a simulated ward environment. Med Educ. 2003;37:248–55.

20. Thomas I, Nicol L, Regan L, Cleland J, Maliepaard D, Clark L, et al. Driven to distraction: a prospective controlled study of a simulated ward round experience to improve patient safety teaching for medical students. BMJ Qual Saf. 2015;24:154–61.

21. Zabar S, Hanley K, Altshuler L, Wallach A, Porter B, Fox J, et al. Do clinical skills assessed in OSCEs transfer to the real world of clinical practice? Using unannounced standardized patient visits to assess transfer. Acad Med. In press.

22. Zabar S, Adams J, Kurland S, Shaker-Brown A, Porter B, Horlick M, et al. Charting a key competency domain: understanding resident physician interprofessional collaboration (IPC) skills. J Gen Intern Med. 2016;31:846–53.

23. Zabar S, Kachur E, Kalet A, Hanley K. Objective structured clinical examinations: 10 steps to planning and implementing OSCEs and other standardized patient exercises. Springer Science & Business Media; 2012.

24. Hodges B. Validity and the OSCE. Med Teach. 2003;25:250–4.

25. Szyld D, Unquillas K, Green B, Yavner S, Song HS, Nick M, et al. Improving the clinical skills of near graduating medical students using a blended computer and simulation based approach. Simul Healthc. In press.

26. Yavner SD, Pusic MV, Kalet AL, Song HS, Hopkins MA, Nick MW, et al. Twelve tips for improving the effectiveness of Web-based multimedia instruction for clinical learners. Med Teach. 2015;37:239–44.

27. Stalmeijer RE. When I say… cognitive apprenticeship. Med Educ. 2015;49:355–6.

28. The Web Initiative for Surgical Education (WISE). WISE-OnCall Web-based e-learning modules. Available at: http://www.wisemed.org/wise-oncall-e-learning-page. Accessed 30 June 2017.

29. Berger AJ, Gillespie CC, Tewksbury LR, Overstreet IM, Tsai MC, Kalet AL, et al. Assessment of medical student clinical reasoning by “lay” vs physician raters: inter-rater reliability using a scoring guide in a multidisciplinary objective structured clinical examination. Am J Surg. 2012;203:81–6.

30. Stevens DL, King D, Laponis R, Hanley K, Zabar S, Kalet AL, et al. Medical students retain pain assessment and management skills long after an experiential curriculum: a controlled study. Pain. 2009;145:319–24.

31. Association of American Medical Colleges (AAMC). Core entrustable professional activities for entering residency: curriculum developers’ guide. Available at: https://members.aamc.org/eweb/upload/core%20EPA%20Curriculum%20Dev%20Guide.pdf. Accessed 30 June 2017.

32. Farnan JM, Paro JA, Rodriguez RM, Reddy ST, Horwitz LI, Johnson JK, et al. Hand-off education and evaluation: piloting the observed simulated hand-off experience (OSHE). J Gen Intern Med. 2010;25:129–34.

33. Ng G, Pimentel S, Szyld D, Kalet A. Towards entrusting medical students: recognising safety behaviours. Med Educ. 2016;50:569–70.

34. The New York Simulation Center (NYSIM). Available at: http://www.nysimcenter.org/. Accessed 29 June 2017.

35. Farnan JM, Gaffney S, Poston JT, Slawinski K, Cappaert M, Kamin B, et al. Patient safety room of horrors: a novel method to assess medical students and entering residents’ ability to identify hazards of hospitalisation. BMJ Qual Saf. 2016;25:153–8.

36. Wald D, Peet A, Cripe J, Kinloch M. A simulated night on call experience for graduating medical students. Available at: https://www.mededportal.org/publication/10483. Accessed 27 July 2017.

37. van der Vleuten C, Sluijsmans D, Joosten-ten Brinke D. Competence assessment as learner support in education. In: Mulder M, editor. Competence-based vocational and professional education: bridging the worlds of work and education. Cham: Springer International Publishing; 2017. p. 607–30.

38. Kalet A, Ark T, Eliasz KL, Nick M, Ng G, Szyld D, et al. A simulated night on call (NOC): assessing the entrustment of near graduating medical students from multiple perspectives. J Gen Intern Med. 2017;32:102–3.

39. ten Cate O. Entrustment decisions: bringing the patient into the assessment equation. Acad Med. 2017;92(6):736–8.

40. Cutrer WB, Miller B, Pusic MV, Mejicano G, Mangrulkar RS, Gruppen LD, et al. Fostering the development of master adaptive learners: a conceptual model to guide skill acquisition in medical education. Acad Med. 2017;92:70–5.

41. Frank JR, Mungroo R, Ahmad Y, Wang M, De Rossi S, Horsley T, et al. Toward a definition of competency-based education in medicine: a systematic review of published definitions. Med Teach. 2010;32(8):631–7.

42. Emanuel EJ, Fuchs VR. Shortening medical training by 30%. JAMA. 2012;307(11):1143–4.

43. Cangiarella J, Gillespie C, Shea JA, Morrison G, Abramson SB. Accelerating medical education: a survey of deans and program directors. Med Educ Online. 2016;21:31794.


Acknowledgements

We would like to thank Heather Dumorne, Nadiya Pavlishyn, Gizely Andrade, and Natasha Orzeck-Byrnes for providing support in the running of NOC as well as the many actors who participated as standardized patients and nurses to make this event a success. We also thank Deans of Medical Education, Victoria Harnik and Melvin Rosenfeld, for their support and guidance.

Funding

This project was supported with a grant from the James and Frances Berger Family Foundation.

Availability of data and materials

All materials needed to replicate the NOC Project, including standardized patient case materials and assessment instruments, are available from the corresponding author on reasonable request.

Author information


Contributions

All individuals listed below have substantially contributed to the conception or design of the work and have participated in drafting the manuscript and/or revising critically for important intellectual content. In addition, each has given final approval of this version and has agreed to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved. Specifically, AK is responsible for all aspects of this project. She has led the conception and design of the project, drafted and finalized the manuscript, designed and refined aspects of the project, and championed and supervised its implementation. SZ participated in overall conception of the project, designed SP/SN cases, recruited and trained actors, and participated in conceptualizing, editing, and finalizing the manuscript. DS participated in overall conception of the project, designed SP/SN cases, and participated in manuscript editing and finalizing. SDY participated in design and implementation of NOC, data collection and management, and manuscript editing. HS participated in conception and design, sought and received IRB approval, and participated in manuscript editing. MN has participated in overall conception of the project, implemented the WISE-onCall module, designed data management and reporting components of the project, and participated in manuscript editing. GN has participated in overall conception of the project, in particular, the culture of safety activity, and participated in manuscript editing. MVP participated in design and conception of the oral presentation component and manuscript conceptualization and editing. CD participated in design and conceptualization of the implementation of NOC. CB participated in design and conceptualization of the handoff and in manuscript editing. KLE contributed important intellectual content on educational theory and participated in manuscript editing. JN participated in conceptualization and design of evidence-based medicine component of the NOC and participated in manuscript editing. TSR participated in funding and championing the project and in conceptualizing and editing the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Adina Kalet.

Ethics declarations

Ethics approval and consent to participate

While no data are shared in this manuscript, the NOC Project has been reviewed and approved by the NYU School of Medicine Institutional Review Board.

Consent for publication

Not applicable.

Competing interests

WISE-onCall was developed at NYU School of Medicine. Dr. Thomas Riles is the Executive Director of both WISE-MD, which produces and distributes WISE-onCall, and the New York Simulation Center; both are not-for-profit entities. Mr. Nick is a member of the Program for Medical Education and Technology at the NYU School of Medicine and Technical Director for WISE-MD and WISE-onCall. The other authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Kalet, A., Zabar, S., Szyld, D. et al. A simulated “Night-onCall” to assess and address the readiness-for-internship of transitioning medical students. Adv Simul 2, 13 (2017). https://doi.org/10.1186/s41077-017-0046-1
