Date Published: June 15, 2017
Publisher: BioMed Central
Author(s): David H. Salzman, Diane B. Wayne, Walter J. Eppich, Eric S. Hungness, Mark D. Adler, Christine S. Park, Katherine A. Barsness, William C. McGaghie, Jeffrey H. Barsuk.
This article describes the development, implementation, and modification of an institutional process to evaluate and fund graduate medical education simulation curricula. The goals of this activity were to (a) establish a standardized mechanism for proposal submission and evaluation, (b) identify simulation-based medical education (SBME) curricula that would benefit from mentored improvement before implementation, and (c) ensure that funding decisions were fair and defensible. Our intent was to develop a process that was grounded in sound educational principles, allowed for efficient administrative oversight, ensured approved courses were high quality, encouraged simulation education research and scholarship, and provided opportunities for medical specialties that had not previously used SBME to receive mentoring and faculty development.
The recent expansion and popularity of simulation-based medical education (SBME) have rapidly increased the demand for simulation space and resources. SBME has also become a requirement or recommendation of various US Accreditation Council for Graduate Medical Education (ACGME) residency review committees and subspecialty boards [3–5]. This increasing interest in SBME and the need to meet professional regulatory requirements often coexist with the finite time, staff, faculty expertise, and funding available for educational activities.
A multispecialty group of board-certified physicians (general and pediatric surgery, emergency medicine, anesthesiology, internal medicine, pediatrics), a simulation technician, a nurse educator, and administrative staff was convened in November 2012 to manage SBME proposal submission and review.
The proposal review process has six steps:
1. Develop/revise the SBME proposal-scoring rubric
2. Establish a submission timeline
3. Create a review process for staff and faculty reviewers
4. Calculate proposal priority scores
5. Determine costs of proposed curricula
6. Make funding decisions
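As a concrete illustration of steps 4 and 6, the scoring and funding logic might be sketched as follows. This is only a sketch under stated assumptions: the proposal names, reviewer scores, costs, and budget below are hypothetical, and the article reports mean proposal scores without publishing the exact formula, so the averaging and the lower-is-stronger convention (as in NIH peer review) are assumptions.

```python
from statistics import mean

def priority_score(reviewer_scores):
    """Consensus priority score as the mean of individual reviewer
    rubric scores (illustrative; the actual formula is not published
    in the article)."""
    return mean(reviewer_scores)

def fund_proposals(proposals, budget):
    """Greedy funding sketch: rank proposals by priority score
    (ascending, assuming lower = stronger as in NIH-style review)
    and fund each proposal the remaining budget can cover."""
    funded, remaining = [], budget
    for name, score, cost in sorted(proposals, key=lambda p: p[1]):
        if cost <= remaining:
            funded.append(name)
            remaining -= cost
    return funded, remaining

# Hypothetical proposals: (name, consensus priority score, estimated cost in USD)
proposals = [
    ("Central line insertion", priority_score([1.5, 2.0, 2.0]), 12000),
    ("Pediatric code response", priority_score([2.0, 2.5, 3.0]), 15000),
    ("Laparoscopic skills", priority_score([3.0, 3.0, 3.5]), 20000),
    ("Difficult airway", priority_score([4.0, 4.5, 4.0]), 18000),
]
funded, left = fund_proposals(proposals, budget=50000)
# funded -> the three strongest-scoring proposals; left -> 3000
```

In practice the committee reaches consensus scores through group discussion rather than by formula alone, and borderline proposals receive mentoring rather than a simple rejection; the greedy cut-off above is only the skeleton of the decision.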
In 2013–2014, we received and funded all 29 proposals (mean score 2.88, SD 1.36). In 2014–2015, 38 proposals were submitted and 36 were funded (mean score 3.38, SD 1.24). In 2015–2016, 37 were submitted and 34 were funded (mean score 3.27, SD 1.35). Authors of curricula that do not receive funding are given specific feedback about their proposals and are invited to resubmit the next year. The two unfunded proposals in 2014–2015 were both successfully resubmitted and funded in 2015–2016. During the first 3 years of implementation, approximately 10% of proposals required mentored improvement before implementation; however, all proposals receive mentoring as needed throughout the application process. In the first year, accepted proposals came predominantly from five departments. In subsequent years, over 15 departments submitted successful proposals.
This report describes the implementation of a rigorous SBME curriculum development, evaluation, and funding process for GME simulation activities guided by a defensible rationale. The ACGME has recently required residency programs to use competency-based Milestones to assess residents' achievement of competencies as they progress through residency. Simulation has been advocated as an effective means to assess trainee achievement of these competencies in a safer environment than actual patient care [14–17]. We know that traditional methods of training (vicarious learning) produce uneven skill acquisition [18–20]. Simulation training has advantages over traditional training that have been shown to improve trainee skills [17, 19, 20] and to reduce complications and healthcare costs [14, 15, 21–26]. The demand for simulation-based resources will increase in the coming years as more healthcare training programs adopt simulation-based education and assessment.
Feedback about approved proposals is currently obtained via several mechanisms to ensure ongoing program evaluation. First, simulation staff bring concerns about deviations from approved curricular plans to Northwestern Simulation™ leadership. Second, specific questions on renewal proposals ask authors to describe successes and challenges with implementation of their curricula, intended changes to the curricula for the renewal period, and a summary of learner evaluations. Third, simulation faculty and staff perform random audits by observing SBME curricula to ensure proposals are implemented as described in a safe and nurturing learning environment. Faculty (curriculum authors) receive specific feedback about how they may improve their SBME curricula after these observations.
The course proposal process we describe here has been successful for at least seven reasons. First, it provides a mechanism for identifying and ranking simulation-based courses that should receive funding, based on a previously established NIH peer-review formula. This mechanism ensures that all simulation users have an opportunity to compete for limited space and funding. It also ensures that limited financial resources are used for curricula that are well designed, valuable to learners, and aligned with the institutional mission. Second, faculty members who have limited simulation experience, and might otherwise have been excluded from SBME, are identified and mentored via faculty development programs for curriculum design, simulation scenario development, and debriefing. Third, the organized approach requires applicants to consider creation of SBME curricula in a deliberate fashion using Kern's six-step curriculum design method. Fourth, the transparent process reduces concerns about the allocation of limited GME funds. This is achieved through broad specialty representation, with multiple reviewers reaching consensus after group discussion using a standardized and familiar scoring approach. All proposals, whether submitted by inexperienced faculty or by those with years of simulation experience, are evaluated using the same review system and discussed during the consensus meeting. Fifth, the curriculum development and submission process provides sufficient detail to estimate each project budget, allowing for alignment with institutional goals of patient safety, exposure to infrequent clinical presentations or procedures, and ACGME or specialty board requirements. Sixth, funding of proposals occurs annually.
However, if an accepted and already funded curriculum acquires external grant funding during a fiscal year, the department can use the grant money to cover SBME costs and redirect the center funding to new proposals in the same academic year. The new proposals must be vetted through the same curriculum review process. In fact, regardless of the funding source, all GME-based SBME activities held at Northwestern Simulation™ are required to go through the curriculum review process. Institutional GME funding to support SBME is not a requirement of each application; since implementation of our submission process, only one proposal has not requested institutional GME funding, relying instead on departmental sources. Finally, our model provides a clear understanding of what GME simulation use is planned over the course of the academic year, allowing for effective scheduling across a large group of users.
The simulation curriculum process revealed several challenges that led to annual improvements. First, we needed to enhance communication to ensure all potential users were aware of the GME submission process and the availability of institutional funds. We used email and an annual lecture to publicize and describe the process to potential users. Despite these efforts, after the first year we realized that some faculty members were still unaware of the simulation proposal and funding mechanism. An aggressive advertising campaign was launched, inviting hospital leadership and all faculty members to private tours and a simulation center open house (meet and greet). These efforts resulted in a 31% increase in SBME proposal submissions in year 2 of the process. Second, we revised the curriculum submission form each year to ensure it was “user friendly,” especially for relatively inexperienced SBME curriculum authors. Faculty development workshops were held, and internet links were embedded in the submission form to provide assistance in developing learning objectives and to describe best practices for delivering feedback and formative and summative assessments. Third, the method of submitting a course proposal was modified over time. Proposals were completed in the first year as a Microsoft Word document and submitted via email. This method posed challenges in ensuring completion of all required sections of the proposal, as well as difficulties with organization and processing of the applications after receipt. A user-friendly, web-based form with pull-down menus and skip logic was needed. In year 3, we introduced FluidReview (Fluidsurveys, Ottawa, ON, Canada), an online application system for grants, scholarships, and awards. This system allows both submission and scoring of curriculum proposals. Individual user information can be entered to pre-populate subsequent submissions, and draft proposals can be saved and edited before final submission.
Fourth, we modified the scoring rubric each year. In year 2, we added more specific anchors and provided additional rater training before faculty reviewers graded proposals. In year 3, detailed anchors incorporating translational science criteria were added to the rubric for the Impact score. Fifth, we realized more detail was needed within SBME proposals to estimate costs accurately. We added several items addressing the type and duration of each simulation activity contained in the curriculum proposal, including specific types of simulators, equipment, rooms, and video needs. Finally, we clarified simulation lab use policies and procedures. During the first year, we had many requests to add or reschedule sessions and to change simulation needs after funding was allocated. Authors must now submit changes to curriculum proposals in writing at least 1 month before implementation, and the changes must be approved by the review committee.
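To make the idea of an anchored rubric concrete, a reviewer-facing instrument could be represented as a mapping from criteria to written anchors. The criterion names, anchor wording, 1–5 scale, and the lower-is-stronger convention below are all hypothetical illustrations, not the committee's actual rubric; the translational (T1/T2/T3) framing of the Impact anchors echoes the translational science criteria the text describes.

```python
# Hypothetical anchored scoring rubric (criterion names and anchor
# wording are illustrative, not the committee's actual instrument).
# Assumed scale: 1 = strongest, 5 = weakest, echoing NIH-style review.
RUBRIC = {
    "Needs assessment": {
        1: "Targeted gap documented with institutional data",
        3: "Gap plausibly described; limited supporting evidence",
        5: "No clear educational gap identified",
    },
    "Impact": {
        1: "Expected downstream effect on patient care or cost (T2/T3)",
        3: "Expected skill improvement in the simulation lab (T1)",
        5: "Impact limited to learner satisfaction",
    },
}

def nearest_anchor(criterion, score):
    """Return the written anchor closest to a reviewer's score,
    so raters can check their rating against the anchor text."""
    anchors = RUBRIC[criterion]
    return anchors[min(anchors, key=lambda a: abs(a - score))]
```

Written anchors of this kind are what supports the rater training described above: two reviewers looking at the same anchor text are more likely to assign the same score.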
This proposal process describes an annual method for submission, review, and distribution of funding for GME SBME courses. Our proposal process has improved annually based on feedback. There are still limitations to the process, which we attempt to address in each annual iteration. First, our procedure is based on an educational theory approach to curriculum design, and not all health science centers have experience in SBME curriculum design. Second, successful implementation of our process requires adequate faculty and staff resources. Faculty with SBME expertise are key to our approach to rating proposals and delivering faculty development courses and do so as part of their academic obligations to the medical school. We estimate that approximately 0.35 full-time equivalent of administrative support is needed yearly to oversee curriculum development, submission, and evaluation. Third, our faculty review process was designed to provide guidance for the distribution of institutional funding for GME simulation activities. A separate and distinct process is applied to non-GME activities (e.g., external users seeking space for in situ systems testing, device testing, or clinical research, and medical student or continuing medical education activities). Fourth, although several studies have shown downstream impact from SBME on patient care [14, 19, 21, 22], we do not yet have such outcome data for proposals funded under this mechanism. We are also unable to determine whether this new process has increased academic output from the submitting faculty, given the short duration since implementation. Fifth, the majority of the submitted projects were funded.
However, the purpose of our proposal process was not only to discriminate among the submitted curricula but also to improve curricula by giving all authors feedback, streamline the process, increase operational efficiencies, provide transparent justification for the distribution of limited funds, and identify borderline proposals just below the funding cut-off that might succeed with additional mentoring. Finally, we acknowledge that other schools of health professions education may have limited personnel and financial resources that prevent such an ambitious curriculum review procedure. We encourage tailored adaptation of the model presented in this report to the needs and conditions of other educational settings. For example, healthcare simulation educators may rely solely on the needs assessment feature of the Kern curriculum development model to isolate educational gaps that warrant local attention. Demands on administrative and information technology resources can be reduced by using simpler and cheaper applications to submit proposals (e.g., Microsoft Word). Additionally, for centers considering implementing a similar process, a survey of SBME course directors might provide additional guidance on how to adapt our process.