The use of simulation in neurosurgical education and training

A systematic review

Abstract

Object

There is increasing evidence that simulation provides high-quality, time-effective training in an era of resident duty-hour restrictions. Simulation may also permit trainees to acquire key skills in a safe environment, important in a specialty such as neurosurgery, where technical error can result in devastating consequences. The authors systematically reviewed the application of simulation within neurosurgical training and explored the state of the art in simulation within this specialty. To their knowledge, this is the first systematic review published on this topic to date.

Methods

The authors searched the Ovid MEDLINE, Embase, and PsycINFO databases and identified 4101 articles; 195 abstracts were screened by 2 authors for inclusion. The authors reviewed data on study population, study design and setting, outcome measures, key findings, and limitations.

Results

Twenty-eight articles formed the basis of this systematic review. Several different simulators are at the neurosurgeon's disposal, including those for ventriculostomy, neuroendoscopic procedures, and spinal surgery, with evidence for improved performance in a range of procedures. Feedback from participants has generally been favorable. However, study quality was found to be poor overall, with many studies hampered by nonrandomized design, presenting normal rather than abnormal anatomy, lack of control groups and long-term follow-up, poor study reporting, lack of evidence of improved simulator performance translating into clinical benefit, and poor reliability and validity evidence. The mean Medical Education Research Study Quality Instrument score of included studies was 9.21 ± 1.95 (± SD) out of a possible score of 18.

Conclusions

The authors demonstrate qualitative and quantitative benefits of a range of neurosurgical simulators but find significant shortfalls in methodology and design. Future studies should seek to improve study design and reporting, and provide long-term follow-up data on simulated and ideally patient outcomes.

Abbreviations used in this paper: ACGME = Accreditation Council for Graduate Medical Education; CAS = carotid angioplasty and stenting; MERSQI = Medical Education Research Study Quality Instrument; MeSH = Medical Subject Headings; SIMONT = Sinus Model Oto-Rhino Neuro Trainer; SRSP = Stratathane resin ST-504 polymer; VIST = Vascular Intervention System Training; VR = virtual reality.


Changes to the clinical, educational, and regulatory context of surgery over the past 2 decades have created many opportunities and challenges for neurosurgical training.22 High-profile patient safety incidents have been a primary driver of an increasing focus on patient safety, accountability, and surgical performance.32 The introduction of reduced working hours for doctors in training, most notably in the US35,40 and Europe,50 was an attempt to improve the working conditions of residents and, ultimately, patient safety. However, there is evidence that resident duty-hour restrictions as stipulated by the Accreditation Council for Graduate Medical Education (ACGME) may have increased adverse patient outcomes.17 As such, working-hour restrictions necessitate the delivery of high-quality, time-effective training to surgeons to ensure optimal patient outcomes.

Simulation has been postulated as a potential solution to the challenge of providing appropriate training in less time21,50 and represents a useful proxy measure for expert surgical performance. Simulation is a diverse concept, encompassing technical skills, nontechnical skills, and knowledge. The burgeoning role of simulation in surgical training is due in part to rapid advances in simulation technology but also to the ACGME requirements for resident proficiency-based assessments.21 High-fidelity simulation, such as that involving immersive simulated operating room environments,23 is emerging as a method of providing learners with a safe yet realistic learning environment. Substantial evidence is accumulating from several specialties on the potential benefits of simulation, especially when combined with deliberate practice (repeated practice by motivated individuals who receive feedback on their performance).22 A recent meta-analysis found simulation-based medical education with deliberate practice to be superior to traditional apprenticeship-style clinical education in technical skill acquisition and maintenance across several clinical skills.33

In principle, neurosurgery may be considered an ideal specialty in which simulation could flourish; simulation could permit trainees to acquire key skills in a safe and protected environment in a high-precision specialty where technical error can result in devastating patient outcomes. To the best of our knowledge, the application of simulation within neurosurgical education and training has yet to be systematically reviewed, and the state of the art in simulation within this specialty is currently unknown. This paper reports the results of the first systematic review designed to assess the application of simulation within neurosurgical training.

Methods

Data Sources and Search Strategy

We prespecified the methods used in this systematic review and present them in accordance with PRISMA (Preferred Reporting Items for Systematic reviews and Meta-Analyses) guidelines.34 A literature search was performed using the electronic databases of Ovid MEDLINE (1980 to Week 3 of December 2012), Embase (1980 to Week 52 of 2012), and PsycINFO (1987 to Week 3 of December 2012). A search of PubMed and the Cochrane Database of Systematic Reviews was also performed. The search strategy combined the 3 broad content areas of neurosurgery, simulation, and education (Table 1). These 3 content areas were combined using the Boolean operator “and.” Medical Subject Headings (MeSH) terms were used to ensure our search was comprehensive.

TABLE 1:

The search strategy used to identify relevant studies

Stage | Search Terms | No. of Articles
1 | ‘simulat*'.mp or ‘comput*'.mp or ‘model*'.mp or ‘technolog*'.mp or ‘tactile'.mp or ‘haptic*'.mp or ‘robot*'.mp or ‘augmented reality'.mp or ‘virtual reality'.mp or ‘artificial intelligence'.mp or ‘animal model*'.mp; or MeSH/subject headings: patient simulation/ or computer simulation/ or disease simulation or simulation/ or model/ or computer model/ or artificial intelligence/ or virtual reality/ or clinical models/ or models, animal/ or animal model/ or animal models/ or human computer interaction/ or computer applications/ | 7,151,995
2 | ‘educat*'.mp or ‘train*'.mp or ‘teach*'.mp or ‘learn*'.mp or ‘curricul*'.mp or ‘competen*'.mp or ‘skill*'.mp; or MeSH/subject headings: models, educational/ or motor skills/ or “task performance and analysis”/ or Education, Medical/ or Teaching/ or Learning/ or Curriculum/ or Motor Skills/ or Test Taking Skills/ or Health Education/ or Competency-Based Education/ or Education/ or Education, Medical Graduate/ or clinical education/ or postgraduate education/ or residency education/ or education program/ or educational model/ or skill/ or skill retention/ or curriculum development/ or motor performance/ or Medical Education/ or Curriculum Based Assessment/ or Curriculum Development/ or Motor Performance/ or Professional Competence/ or Competence/ | 3,716,976
3 | ‘neurosurg*'.mp or ‘neuro-surg*'.mp or MeSH/subject heading: Neurosurgery/ | 240,023
4 | 1 and 2 and 3 | 4101

The search strategy of this systematic review incorporated the electronic databases of Ovid MEDLINE (1980 to Week 3 of December 2012), Embase (1980 to Week 52 of 2012), and PsycINFO (1987 to Week 3 of December 2012). The first column in the table indicates the stages of the search strategy. The second column shows the search phrases used in the electronic database search, encompassing the 3 broad content areas of simulation (Stage 1 in the table), education (Stage 2), and neurosurgery (Stage 3). In Stage 4 of the search, the content areas were combined so that only articles including content on all 3 content areas were retrieved to produce a final set of articles to proceed to further review (as per Fig. 1). The final column indicates the number of articles retrieved at each stage of the search. MeSH = Medical Subject Headings.

To be eligible for inclusion in our review, papers had to report primary data, be published in the English language, describe a simulation-based neurosurgical intervention used in an educational or training context, and present outcome data. Papers that described a simulation alone, without outcome data, were excluded. There was no restriction placed on the specialty or training level of participants included in studies, and thus studies recruiting medical students and trainees in other (nonneurosurgical) specialties were also included. Furthermore, papers that incorporated data from simulation in nonneurosurgical specialties were included, but only the data relevant to neurosurgery were analyzed.

An initial title screen performed by a neurosurgical resident with expertise in surgical education (M.A.K.) was followed by an independent review of abstracts by M.A.K. and another clinician with expertise in medical education (M.A.). Full articles were reviewed where ambiguity regarding eligibility remained, and disagreements were resolved by consensus. Additional articles were identified through reading relevant specialty journals and through reviewing reference lists of included studies (M.A.K. and A.F.A.). Figure 1 highlights the study selection process from this systematic review.

Fig. 1.
Fig. 1.

Flow diagram illustrating our search strategy.

Data Extraction and Quality Assessment

Data from studies meeting our inclusion criteria were extracted by M.A.K. using a standardized data extraction proforma (available from the corresponding author upon request) and were critically appraised. Methodological quality of studies was assessed using the validated Medical Education Research Study Quality Instrument (MERSQI),42 a 10-item tool giving a score from 5 to 18 (higher score denotes better quality) in the domains of study design, sampling, data type, assessment validity, data analysis, and outcomes. When a study had components not applicable or relevant to the domain being evaluated, our scores were adjusted appropriately to result in a standard denominator of 18, in line with the original tool description.42
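As an illustration of this rescaling, the following is a minimal sketch of the arithmetic implied by the tool description (the function and variable names are ours, not part of the MERSQI itself):

```python
def adjusted_mersqi_total(achieved: float, achievable: float) -> float:
    """Rescale a raw MERSQI total to the standard 18-point denominator.

    achieved:   points the study earned across the applicable items
    achievable: maximum points available given the applicable items
                (18 when every item applies; lower when some items,
                e.g., certain subscales, are not applicable)
    """
    if not 0 <= achieved <= achievable:
        raise ValueError("achieved must lie between 0 and achievable")
    return achieved * 18.0 / achievable

# Example: 8 points earned out of 15 applicable points maps to 9.6/18.
print(adjusted_mersqi_total(8, 15))  # 9.6
```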

Given the wide variation in study design and outcome measures, a meta-analysis was not feasible.

Results

Selected Articles

Our search strategy identified 4101 articles (Table 1). After duplicates and those not published in the English language were excluded, 3132 articles remained for title screening (Fig. 1), which subsequently left 195 articles for abstract review. Two authors (M.A.K. and M.A.) agreed, after abstract review, on the full review of 81 of these papers (kappa statistic of agreement between reviewers = 0.947, 95% CI 0.901–0.993), and 19 were selected for inclusion in this systematic review. An additional 9 articles were found following hand searching of the reference lists of retrieved articles, leaving 28 articles for analysis.
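For reference, the kappa statistic is a chance-corrected index of interrater agreement; assuming the conventional Cohen formulation was used, it is defined as

\kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the observed proportion of agreement between the 2 reviewers and p_e is the proportion of agreement expected by chance alone; the reported value of 0.947 therefore indicates near-perfect agreement.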

Characteristics of Included Studies and Study Settings

The majority of studies were performed either exclusively in the US (n = 17; 61%) or in combination with Europe (n = 2; 7%) (Table 2). Study participants were of varying grades and specialties. The commonest simulated procedure was ventriculostomy (n = 6; 21%), followed by carotid angioplasty and stenting (CAS) (n = 4; 14%). In one instance, critical care scenarios were simulated as opposed to a specific surgical procedure.36 The commonest simulator used was ImmersiveTouch (n = 7; 25%), followed by the Procedicus Vascular Intervention System Training (VIST; Mentice AB) (n = 4; 14%) and the in vivo rat model (n = 3; 11%). Table 3 shows the different types of procedures simulated using the different surgical models.

TABLE 2:

Descriptive features of the included studies

Parameter | No. of Articles (%)
country of origin
 US | 17 (61)
 United Kingdom | 2 (7)
 US & Europe | 2 (7)
 Brazil | 2 (7)
 Germany | 2 (7)
 Italy | 1 (4)
 Japan | 1 (4)
 Korea | 1 (4)
participant type*
 neurosurgical resident | 15 (54)
 medical student | 5 (18)
 general surgery resident | 3 (11)
 interventional cardiologist | 3 (11)
 neurosurgery attending physician | 3 (11)
 neurosurgical fellow | 3 (11)
 resident, no specialty specified | 3 (11)
 vascular surgeon | 3 (11)
 interventional radiologist | 2 (7)
 neurosurgeon (experienced & inexperienced) | 2 (7)
 otolaryngology resident | 1 (4)
 otolaryngology attending physician | 1 (4)
 neurology resident | 1 (4)
 “experienced neurosurgeon” (no further information) | 1 (4)
 interventional neuroradiologist | 1 (4)
 endovascular neurosurgery fellow | 1 (4)
 fellow, no specialty specified | 1 (4)
 interventional radiology technician | 1 (4)

* These data are not cumulative as most studies included more than 1 participant type.

TABLE 3:

Procedures simulated and models used for simulation*

Subspecialty | Procedure
neurovascular | CAS; cerebral angiography; microsurgical vascular procedures; EC-IC bypass
neurooncology | tumor handling/removal
neurocritical care | clinical scenarios
spine | spinal needle placement; laminectomy; pedicle screw placement
neuroendoscopy | endoscopic 3rd ventriculostomy; tumor resection/removal; no specific procedure described
neurotrauma | craniectomy for traumatic EDH
neuroanatomy | color-coded physical models of periventricular structures
other | transsylvian approach; ventriculostomy

Model categories used across these procedures: in vivo, computer/VR-based, cadaveric, synthetic, patient simulator, & combination of models.

EC-IC = extracranial–intracranial; EDH = extradural hematoma.

Study Quality and Level of Evidence

Only 2 of the included studies were randomized; these were the only studies that had a control group.11,18 None of the studies were performed in high-fidelity environments. Only 1 study57 assessed skills retention, at 1 month after initial simulator training; this study was also the only one to assess whether simulator-based skills training transferred to improved patient outcomes. The mean MERSQI score of included studies was 9.21 ± 1.95 (± SD; range 6–12.5) out of a possible 18 (Table 4). No study scored more than 0 out of 3 in the domain of assessment validity, indicating overall poor validation evidence.

TABLE 4:

Study quality as assessed using the MERSQI

Scale Item (points available) | Subscale (points if present) | No. of Studies (%) | Mean ± SD for Scale Item
study design (3); mean 1.43 ± 0.50
 single group cross-sectional or single group posttest only (1) | 10 (36)
 single group pretest & posttest (1.5) | 16 (57)
 nonrandomized, 2 group (2) | 0
 randomized controlled trial (3) | 2 (7)
sampling (3); mean 1.79 ± 0.64
 a) no. of institutions studied
  1 (0.5) | 15 (54)
  2 (1) | 0
  >2 (1.5) | 13 (46)
 b) response rate, %
  not applicable* | 0
  <50 or not reported (0.5) | 18 (64)
  50–74 (1) | 2 (7)
  ≥75 (1.5) | 8 (29)
type of data (3); mean 2.29 ± 0.98
 assessment by study participant (1) | 10 (36)
 objective measurement (3) | 18 (64)
validity of evaluation instrument (3); mean 0 ± 0
 a) internal structure
  not applicable* | 0
  not reported (0) | 28 (100)
  reported (1) | 0
 b) content
  not applicable* | 0
  not reported (0) | 28 (100)
  reported (1) | 0
 c) relationships to other variables
  not applicable* | 0
  not reported (0) | 28 (100)
  reported (1) | 0
data analysis (3); mean 2.32 ± 0.77
 a) appropriateness of analysis
  data analysis inappropriate for study design or type of data (0) | 5 (18)
  data analysis appropriate for study design and type of data (1) | 23 (82)
 b) complexity of analysis
  descriptive analysis only (1) | 14 (50)
  beyond descriptive analysis (2) | 14 (50)
outcomes (3); mean 1.39 ± 0.28
 satisfaction, attitudes, perceptions, opinions, general facts (1) | 8 (29)
 knowledge, skills (1.5) | 18 (64)
 behaviors (2) | 2 (7)
 patient/health care outcome (3) | 0
total score (18) | 9.21 ± 1.95

* In instances where these subscales are used, the denominators are changed appropriately to scale up to a score out of 18.

Data Synthesis

Study findings are grouped according to the category of simulator used (Table 5).

TABLE 5:

Summary of studies included in this systematic review*

Entries list simulator category, simulator (reference no.), participants (no.), procedure, assessment type, & main findings. Additional attributes tabulated for each study were randomization, controls, didactic component, demonstration, pretesting, practice time, feedback to participants, written assessment, performance metrics, feedback by participants, & follow-up.

Cadaveric: porcine skull (10). Participants: general surgery residents (7). Procedure: craniectomy for traumatic EDH (proctored & unproctored). Assessment: performance & participant survey pre- & postproctoring. Main findings: performance while being proctored was significantly better & faster, w/ better tissue handling & understanding of anatomy; greater satisfaction among residents after being proctored.

Cadaveric: deer head & spine (56). Participants: neurosurgery residents (8). Procedure: MISS laminectomy & pedicle screw placement (dependent on seniority). Assessment: pre- & postintervention self-assessment, participant survey. Main findings: significant & nonsignificant increase in self-reported mean confidence in performing a laminectomy & placement of pedicle screws, respectively; participants felt it was realistic & useful for training; reported as a feasible, inexpensive, & reproducible model.

Combination: human cadaveric head, SRSP polymer to simulate tumor (14). Participants: neurosurgery & otolaryngology faculty surgeons & residents (13). Procedure: skull base tumor removal. Assessment: participant survey. Main findings: overall positive feedback about the model; scored highly for consistency & usefulness for neurosurgical training, less for radiographic visibility for surgical planning.

Combination: synthetic tubing, chicken wing, & in vivo rat model (20). Participants: neurosurgeons & neurosurgery residents (20). Procedure: microanastomosis. Assessment: participant survey. Main findings: living rat model favored for model accuracy, improving skills, & introducing the subject to other neurosurgeons or neurosurgery residents; chicken wing model preferred for practicality.

Combination: in vivo rat model & turkey wing model (1). Participants: neurosurgery residents (15). Procedure: EC-IC bypass. Assessment: participant survey. Main findings: live rat model rated the best model for training; turkey wing model ranked second, preferred over chicken wing & Silastic tube models.

Computer/VR: boundary element based VR simulator w/ haptic feedback (55). Participants: neurosurgery attending physicians or residents (13). Procedure: tumor handling. Assessment: feedback from surgeons. Main findings: all respondents felt the simulator could help in understanding basic surgical acts & that the simulator has a role in surgical training.

Computer/VR: EasyGuide Neuro (26). Participants: 1st-yr trainees to board-certified neurosurgeons working in the neurosurgery department (16). Procedure: ventriculostomy in normal & pathological ventricles. Assessment: ventriculostomy performance. Main findings: 48% of catheters sited correctly; those w/ >8 yrs of neurosurgical experience had a nonsignificantly lower accuracy rate compared w/ those w/ ≤8 yrs of experience; a positive training effect was observed, especially in those who participated regularly in sessions.

Computer/VR: ImmersiveTouch (28). Participants: medical students (no. unreported). Procedure: ventriculostomy. Assessment: ventriculostomy performance. Main findings: medical students improved their performance from 10–50% to 100% in <30 trials.

Computer/VR: ImmersiveTouch (5). Participants: neurosurgical residents & fellows (78). Procedure: ventriculostomy. Assessment: ventriculostomy performance at 1st attempt. Main findings: mean Euclidean distance of catheter tip to target (Monro foramen) was 16.09 mm; 73% of catheter tips successfully reached ventricle; no significant relationship btwn seniority & performance.

Computer/VR: ImmersiveTouch (6). Participants: neurosurgery residents of varying grade (60). Procedure: ventriculostomy. Assessment: ventriculostomy performance at 1st attempt. Main findings: peak performance occurred in 2nd- & 3rd-yr residents; performance dropped in the 5th & 6th yrs of training.

Computer/VR: ImmersiveTouch (27). Participants: neurosurgery residents from Yrs 1 to 7 (48). Procedure: ventriculostomy in presence of shifted ventricles due to mass effect. Assessment: ventriculostomy performance. Main findings: all grades of resident improved skills by the 2nd attempt at ventriculostomy; data suggested initial proficiency lost by midresidency, returning at end of training.

Computer/VR: ImmersiveTouch (57). Participants: neurosurgery residents of varying grade (16). Procedure: ventriculostomy in normal, shifted, & compressed ventricles. Assessment: performance on simulator & in real-life surgery, participant survey. Main findings: significant improvement in simulated performance immediately after intervention &, to a lesser degree, 1 mo after intervention; shifted ventricles were more difficult to cannulate than normal brains, but compressed ventricles were not; best performance was in midresidency (PGY 3–4); simulator practice resulted in significantly higher chance of ventriculostomy success at 1st attempt in real life but did not reduce hemorrhage development in patients; participants found simulator realistic, w/ 3D visualization & haptic feedback often voted the best simulator features.

Computer/VR: ImmersiveTouch (29). Participants: neurosurgery residents & fellows (51). Procedure: thoracic pedicle screw placement. Assessment: task performance. Main findings: overall procedure failure rate was 12.5%, & an improvement in accuracy of screw placement was observed moving from practice to test session.

Computer/VR: ImmersiveTouch (30). Participants: neurosurgical residents & fellows (63). Procedure: percutaneous spinal needle placement. Assessment: performance in needle placement, fluoroscopy time, & failure rate. Main findings: needle placement failure rate of 8%; average error, fluoroscopy exposure, & performance scores improved on 2nd attempt.

Computer/VR: ROBO-SIM (41). Participants: experienced neurosurgeon (1). Procedure: neuroendoscopic visualization of structures & tumor removal. Assessment: feedback on quality of ROBO-SIM, participant survey. Main findings: high realism of ROBO-SIM compared w/ other simulators; felt to be highly useful for training students.

Computer/VR: ANGIO Mentor (52). Participants: interventional cardiologists, radiologists, neuroradiologists, a vascular surgeon (11). Procedure: CAS. Assessment: performance pre- & postintervention, participant survey. Main findings: following the 2-day course, a significant improvement in performance & reduction in fluoroscopy exposure & errors was observed; all participants extremely satisfied w/ the course; simulator felt to be realistic w/ good force feedback.

Computer/VR: ANGIO Mentor (47). Participants: neurosurgery residents & endovascular neurosurgery fellows (14). Procedure: diagnostic cerebral angiography. Assessment: pre- & postintervention participant survey, performance. Main findings: improvement in no. of potentially dangerous actions & shorter procedure & fluoroscopy times among residents over trials; fellows had fewer complications, dangerous maneuvers, & lower contrast use, & also showed improvement w/ trials; all felt the hands-on component of the trial was more educational than the didactic element; good scores for visual presentation & mechanical properties of the simulator.

Computer/VR: VIST (9). Participants: general surgery residents & vascular surgeons (21). Procedure: CAS. Assessment: performance in procedure, participant survey. Main findings: novices (general surgery residents, <5 percutaneous angiographic procedures) required significantly more time to complete the procedure & used fluoroscopy for longer; this improved after training but remained significantly longer than for experts (vascular surgeons, >300 peripheral interventions); 80% of experts felt the clinical & tactile feedback were inadequate; all thought the scenarios were realistic.

Computer/VR: VIST (18). Participants: residents, medical students, attending physicians (radiology, interventional cardiology, vascular surgery), fellows, interventional radiology technician (29). Procedure: CAS. Assessment: performance pre- & postintervention, participant survey. Main findings: whether experienced or not in endovascular procedures, those in the experimental group performed the procedure significantly faster than the control group (no difference in other performance measures); those significantly experienced w/ endovascular procedures took significantly less time to perform the procedure; 75% of experienced group rated the simulator as realistic, but 2 criticized poor haptics & unrealistic catheter behavior.

Computer/VR: VIST (38). Participants: experienced interventional cardiologists who had never performed CAS (20). Procedure: carotid angiography. Assessment: task performance. Main findings: significant improvement in procedure & fluoroscopy time, contrast vol, & catheter handling errors when comparing participants' 1st & last simulation.

Computer/VR: VIST (12). Participants: neurosurgery residents (7). Procedure: 4-vessel angiography. Assessment: pre- & postintervention MCQ, performance, participant survey. Main findings: MCQ score increased significantly following simulator use; significant improvement in faculty assessment of technical skills, reduction in procedure & fluoroscopy time; overall course satisfaction high.

In vivo: rat model (39). Participants: residents (16). Procedure: microsurgical vascular procedures. Assessment: surgical skills assessment. Main findings: improvement in all assessed criteria (intraoperative tremor, bleeding, surgical technique, effectiveness of anastomosis, intraop death not related to anesthesia, suture & anastomosis integrity after 48 hrs) when comparing 1st to last session.

In vivo: swine model (43). Participants: neurosurgery residents (24). Procedure: craniotomy, dural opening, tumor excision. Assessment: participant survey. Main findings: course judged as excellent or good by all participants; best aspects of the course reported as the in vivo model (97%), realistic laboratory setup (94%), working environment (94%), & close supervision (94%).

Patient simulator: human patient simulator (36). Participants: neurosurgery, neurology, & general surgery residents, & senior medical students (29). Procedure: critical care scenarios. Assessment: pre- & postintervention MCQ. Main findings: significant improvement in MCQ test scores observed following simulation scenarios in all participant groups; all participants reported their education was enhanced by the experience & that the simulation was realistic.

Synthetic: cast model of ventricular system (11). Participants: medical students (101). Procedure: construction of 3D color-coded physical models of periventricular structures. Assessment: assessment quiz, participant survey, medical school neuroanatomy examination scores. Main findings: scores in assessment quiz significantly higher in experimental compared to control group, especially for questions requiring 3D understanding of periventricular structures; no difference in medical school neuroanatomy course grades; 84% of respondents found the model helpful.

Synthetic: OMeR (16). Participants: medical students & residents (23). Procedure: transsylvian approach. Assessment: participant survey. Main findings: most participants responded favorably, stating it helped them understand neuroanatomy & made neurosurgery more interesting.

Synthetic: SIMONT (59). Participants: experienced & inexperienced neurosurgeons (37). Procedure: neuroendoscopy (no specific procedure specified). Assessment: participant survey. Main findings: all participants felt SIMONT is an important device that would improve surgical skills; 89% felt it represented surgical situations realistically; the remainder would like to see improvements in consistency & texture.

Synthetic: SIMONT (13). Participants: experienced & inexperienced neurosurgeons (22). Procedure: tumor resection & 3rd ventriculostomy. Assessment: task performance. Main findings: significant improvement in technical skills after performing 6 procedures on the simulator.

MCQ = multiple choice questionnaire; MISS = minimally invasive spine surgery; PGY = postgraduate year.

Computer/Virtual Reality Simulators

Six computer/virtual reality simulators were identified through the systematic review: ImmersiveTouch (Fig. 2), ANGIO Mentor (Simbionix), ROBO-SIM (developed as part of the ROBOSCOPE EU-Telematics program), VIST (Fig. 3), EasyGuide Neuro (Philips Medical Systems), and a boundary element–based virtual reality simulator with haptic feedback. All but one of these (EasyGuide Neuro) incorporate haptic feedback; EasyGuide Neuro was originally designed as a clinical neuronavigation tool for use in the operating room.46 See the Appendix for technical details of some of the simulators identified in this systematic review.

Fig. 2. The ImmersiveTouch system. Left: The system in operation. Right: Simulated ventriculostomy catheter insertion. Reproduced with permission from Banerjee et al: J Neurosurg 107:515–521, 2007.

Fig. 3. The Procedicus VIST virtual reality simulator (Mentice AB, Gothenburg, Sweden). Reprinted from J Am Coll Cardiol 47, Patel et al., “Learning curves and reliability measures for virtual reality simulation in the performance assessment of carotid angiography,” pp 1796–1802, 2006, with permission from the American College of Cardiology Foundation.

The most commonly used simulator was ImmersiveTouch, an augmented virtual reality system that has been shown to improve performance when used to simulate ventriculostomy,5,6,27,28,57 thoracic pedicle screw placement,29 and percutaneous spinal needle placement30 across different grades of trainees.

Ventriculostomy was the commonest simulated procedure identified from this systematic review, having been reported in 6 articles. In 5 articles the ImmersiveTouch simulator was used;5,6,27,28,57 the remaining study used EasyGuide Neuro.26 Simulated ImmersiveTouch ventriculostomy training in neurosurgical residents, fellows, and medical students was shown to improve accuracy of catheter placement,27,28 even after one attempt,27 and in the presence of anatomically abnormal ventricles as may be encountered in the clinical setting.27,57 The effect of training was shown to persist 1 month after training.57 Of note, there was no clear relationship between seniority and performance at ventriculostomy in the reviewed studies; some studies found peak performance in midresidency,6,57 and others did not,5,27 including the study utilizing EasyGuide Neuro.26 These conflicting findings are perhaps surprising given that all but one of these simulated ventriculostomy studies were performed by the same research group using the same simulator (ImmersiveTouch). However, this may simply reflect differences in study design (for example, the use of normal vs pathological ventricles) or a complex relationship between seniority and performance in this task.
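The accuracy metric reported in several of these ventriculostomy studies is the Euclidean distance from the simulated catheter tip to an ideal target such as the foramen of Monro. A minimal sketch of that computation follows (the coordinates and names are ours, for illustration only):

```python
import math

def tip_to_target_distance(tip: tuple, target: tuple) -> float:
    """Euclidean distance (in mm) from the simulated catheter tip to
    the ideal target point (e.g., the foramen of Monro), the accuracy
    metric reported in the ImmersiveTouch ventriculostomy studies.

    tip, target: (x, y, z) coordinates in mm in the simulator's frame.
    """
    return math.dist(tip, target)

# Example: a tip 12 mm lateral and 9 mm deep to the target lies
# sqrt(12^2 + 9^2) = 15 mm away.
print(tip_to_target_distance((12.0, 0.0, 9.0), (0.0, 0.0, 0.0)))  # 15.0
```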

Utilizing VIST to simulate cerebral angiography12,38 and CAS (including in 1 randomized controlled study)9,18 resulted in improvements in anatomical and procedural knowledge,12 faculty assessment of technical skills,12 time taken to complete the procedure,9,12,18,38 duration of fluoroscopy exposure,9,12,38 contrast volume administered,38 and catheter handling errors.38 The ANGIO Mentor was used to simulate CAS52 and diagnostic cerebral angiography47 with evidence of significantly improved performance following training. Performance in cerebral angiography was better in more experienced participants (endovascular neurosurgery fellows) in the first and last trials; however, residents were able to close the gap and, interestingly, even the fellows showed an overall improvement with training.47

Studies of the ROBO-SIM41 and the boundary element–based virtual reality simulator55 only reported participants' subjective feedback on simulator training, which was positive.

Cadaveric Models

Evaluation of cadaveric porcine, deer head and spine, and human models was identified in the literature. Using a cadaveric porcine skull, remote proctoring of general surgery residents was found to improve performance and satisfaction in craniectomy for traumatic extradural hematoma, compared with being unproctored.10 Mean self-reported confidence in performing a laminectomy significantly increased among neurosurgical residents after use of a cadaveric deer head and spine model, with confidence in pedicle screw insertion also increasing but nonsignificantly so.56 Neurosurgery and otolaryngology faculty surgeons and residents found a human cadaveric head, with Stratathane resin ST-504 polymer (SRSP) instilled to simulate a skull base tumor, to be useful for neurosurgical training, but the participants raised concerns about the ability to visualize the SRSP material with radiographic imaging while planning the procedure.14

In Vivo Models

In vivo rodent and swine models were identified by the systematic review. The in vivo rodent model has been evaluated as a training tool for microvascular surgery,20,39 with improved faculty-rated resident surgical skills in 1 laboratory.39 The rodent model was favored over turkey wing,1 Silastic tube,1,20 and chicken wing training models,1 especially with regard to accuracy and utility.1,20 However, it may not be as practical as the chicken wing model.20 A live swine model used for craniotomy, dural opening, and tumor excision by neurosurgery residents received positive feedback.43

Synthetic Models

The synthetic models we identified include the Sinus Model Oto-Rhino Neuro Trainer (SIMONT; produced using a synthetic thermoretractable and thermosensitive rubber called Neoderma [Pro Delphus Co.]), the OMeR model (ONO & Co. Ltd.), and a cast of the ventricular system made with Model Magic (Crayola). Performance in simulated intraventricular tumor resection and third ventriculostomy by experienced and inexperienced neurosurgeons utilizing the SIMONT improved after performing 6 procedures.13 However, weaknesses in consistency and texture of the model were highlighted.59 The OMeR model, a synthetic head model, received positive feedback from medical students and residents, helping them learn about neuroanatomical relationships and increasing the medical students' interest in neurosurgery.16 In a randomized study using Crayola Model Magic, medical students who constructed 3D color-coded physical models of periventricular structures had significantly higher quiz scores than a control group (using 2D brain cross-sections), especially for questions requiring 3D understanding of periventricular structures.11 Although no difference in medical school grades was observed, the medical school examinations may not have appropriately tested 3D knowledge. Participants found the model helpful for learning. Although the authors' institution has formally integrated the intervention into the curriculum, long-term outcomes were not available.

Patient Simulator

Training of medical students and residents from several specialties in various neurocritical care scenarios using the Human Patient Simulator improved knowledge-based written assessment scores.36

Assessment Tools

Table 6 shows the variety of assessment tools used in the reviewed studies, which comprised the following 3 broad categories:

  1. subjective evaluation or feedback surveys;1,9–12,14,16,18,20,41,43,47,52,55–57,59

  2. performance at a surgical task, assessed by a rater10,39 (including self-rating56), by simulator-derived metrics,5,6,26–30,38,47,57 or by both;9,12,13,18,52 and

  3. knowledge-based written tests.11,12,36

TABLE 6:

Psychometric robustness of the assessment tools identified in this systematic review*

Entries list simulator category, simulator (reference no.), assessment tool(s), features assessed, assessment metrics (reliability; validity), & MERSQI score (/18).

Cadaveric: porcine skull (10). Assessment tools: (1) Operative Performance Rating Scale (8-question, 5-point scale); (2) 2 satisfaction surveys, 1 after unproctored scenario (8 questions), another after proctoring (5-point Likert scales). Features assessed: (1) anatomy knowledge, prevention of complications, tissue handling, flow of operation, principles of operation, knowledge & use of equipment, overall performance; (2) overall experience, familiarity & comfort w/ the procedure. Reliability: NR. Validity: construct. MERSQI score: 10/18.

Cadaveric: deer head & spine (56). Assessment tools: (1) self-assessment survey pre-/postintervention (5-point scales); (2) exit survey feedback from participants (5-point scales). Features assessed: (1) confidence in performing spinal procedure; (2) realism & general opinions about MISS simulator. Reliability: NR. Validity: construct, face. MERSQI score: 8.5/18.

Combination: human cadaveric head, SRSP polymer to simulate tumor (14). Assessment tool: 10-item feedback survey (5-point scale). Features assessed: comparing tumor model & real tumor cases & general impressions of model. Reliability: NR. Validity: face. MERSQI score: 8/18.

Combination: synthetic tubing, chicken wing, in vivo rat model (20). Assessment tool: survey of participants by email. Features assessed: questions about model accuracy & practicality. Reliability: NR. Validity: face (rat model). MERSQI score: 7/18.

Combination: in vivo rat model & turkey wing model (1). Assessment tool: evaluation survey (5-point scale). Features assessed: comparison of 4 different models of EC-IC bypass. Reliability: NR. Validity: face (both models). MERSQI score: 7/18.

Computer/VR: boundary element based VR simulator w/ haptic feedback (55). Assessment tool: feedback from surgeons. Features assessed: ease of use, realism, comfort. Reliability: NR. Validity: face. MERSQI score: 6/18.

Computer/VR: EasyGuide Neuro (26). Assessment tool: accuracy of ventricular catheter placement. Features assessed: results according to level of experience. Reliability: NR. Validity: construct. MERSQI score: 8/18.

Computer/VR: ImmersiveTouch (28). Assessment tool: performance in ventriculostomy. Features assessed: not described in enough detail. Reliability: NR. Validity: construct. MERSQI score: 8/18.

Computer/VR: ImmersiveTouch (5). Assessment tool: accuracy of ventricular catheter placement at 1st attempt. Features assessed: distance of catheter tip from Monro foramen. Reliability: NR. Validity: construct. MERSQI score: 9.5/18.

Computer/VR: ImmersiveTouch (6). Assessment tool: accuracy of ventricular catheter placement at 1st attempt. Features assessed: whether catheter tip punctured & terminated inside the ventricle. Reliability: NR. Validity: construct. MERSQI score: 9.5/18.

Computer/VR: ImmersiveTouch (27). Assessment tool: assessment of accuracy of ventriculostomy placement. Features assessed: Euclidean distance from “ideal target”. Reliability: NR. Validity: construct. MERSQI score: 10/18.

Computer/VR: ImmersiveTouch (57). Assessment tools: (1) performance of ventriculostomy on novel brains presented by simulator & in real surgery; (2) written questionnaire on feedback from participants (Likert & free-text responses). Features assessed: (1) cannulation success rate, ipsilat vs contralat placement, entry into lat ventricle vs other ventricle, hemorrhage, catheter depth in cm; (2) realism of simulator, task difficulty, perceived impact of practice session, satisfaction, & suggestions for improvement. Reliability: NR. Validity: construct, face, & predictive. MERSQI score: 12.5/18.

Computer/VR: ImmersiveTouch (29). Assessment tool: score in ImmersiveTouch simulator. Features assessed: Euclidean distance from “ideal” target, & failure rate based on collision of the bur drill w/ the virtual spine model. Reliability: NR. Validity: construct. MERSQI score: 11/18.

Computer/VR: ImmersiveTouch (30). Assessment tool: simulator-collected performance data. Features assessed: accuracy of needle placement, fluoroscopy time, & failure rate. Reliability: NR. Validity: construct. MERSQI score: 11/18.

Computer/VR: ROBO-SIM (41). Assessment tool: feedback survey from 1 expert neurosurgeon. Features assessed: usability, realism, usefulness for training, quality of visible anatomical structures. Reliability: NR. Validity: face. MERSQI score: 7/18.

Computer/VR: ANGIO Mentor (52). Assessment tools: (1) assessment of performance in carotid artery stenting, pre-/postintervention, subjective & objective measurements; (2) feedback from participants (5-point scale). Features assessed: (1) simulator: procedure performance, x-ray & delivery/retrieval time of embolic protection device, complications, errors recorded by simulators & assessor, & procedure-specific rating scale; (2) prior experience, realism, & training potential of simulator. Reliability: interrater. Validity: construct, concurrent, face. MERSQI score: 12/18.

Computer/VR: ANGIO Mentor (47). Assessment tools: (1) preintervention survey; (2) intervention: performance in procedure; (3) posttask survey. Features assessed: (1) experience w/ cerebral angiography & self-rated knowledge of technique & relevant anatomy; (2) total procedure & fluoroscopy times, no. of dangerous actions, vol of contrast administered; (3) feedback on realism of simulator & educational value of both didactic teaching & hands-on training session. Reliability: NR. Validity: construct, face. MERSQI score: 10/18.

Computer/VR: VIST (9). Assessment tools: (1) performance in simulated carotid artery stenting graded by an experienced interventionalist; (2) simulator-collected data; (3) instructor subjective evaluation (5-point scale, 4 areas); (4) exit survey (5-point scale). Features assessed: (1) checklist of 50 steps (score out of 100); (2) time to complete procedure, fluoroscopy time, amount of dye used; (3) guidewire manipulation, catheter manipulation, catheter exchanges, & monorail balloon technique; (4) realism & utility of simulator. Reliability: NR. Validity: construct, concurrent, face. MERSQI score: 10/18.

Computer/VR: VIST (18). Assessment tools: (1) simulator-generated performance measures pre- & posttraining; (2) assessment by proctor pre- & posttraining; (3) feedback survey by participants at end. Features assessed: (1) total time, contrast used, fluoroscopy time, no. of tool insertions, stent placement accuracy, & other measures of performance; (2) inappropriate stent location; (3) endovascular experience, simulator opinions, experience w/ computer & video games. Reliability: NR. Validity: construct, concurrent, face. MERSQI score: 12/18.

Computer/VR: VIST (38). Assessment tool: simulator-generated performance measures (posttraining only). Features assessed: procedure time, fluoroscopy time, contrast vol, composite catheter handling errors. Reliability: test-retest, internal consistency. Validity: construct. MERSQI score: 10/18.

Computer/VR: VIST (12). Assessment tools: (1) 12-question MCQ; (2) simulator-collected data; (3) 10-point scales rated by 2 faculty pre- & postintervention; (4) feedback from participants (10 questions, 5-point scales). Features assessed: (1) general principles of angiographic anatomy, procedures, & indications; (2) time of procedure, amount of contrast used, total fluoroscopy time, & potentially dangerous actions; (3) technical skills in catheter navigation, use of fluoroscopy & contrast, speed, & other factors; (4) general features of the course & questions about the simulator itself. Reliability: NR. Validity: construct, concurrent, face. MERSQI score: 11/18.

In vivo: rat model (39). Assessment tool: assessment of surgical skills. Features assessed: blood loss, mortality, suture quality after 48 hrs, tremor. Reliability: NR. Validity: construct. MERSQI score: 9.5/18.

In vivo: swine model (43). Assessment tool: evaluation by participants using 1–10 scales & “excellent,” “good,” “satisfactory,” or “inadequate.” Features assessed: questions about the contents & practical use of the program & limitations & improvements that should be made in the future. Reliability: NR. Validity: face. MERSQI score: 8/18.

Patient simulator: human patient simulator (36). Assessment tool: pre- & postexercise testing; 20-question MCQ. Features assessed: questions relevant to the neurocritical care scenarios presented. Reliability: NR. Validity: construct. MERSQI score: 11/18.

Synthetic: cast model of ventricular system (11). Assessment tools: (1) assessment quiz; (2) survey of opinions (4-point scales). Features assessed: (1) understanding of anatomical relationships btwn 2D/3D structures; (2) helpfulness of model in learning & understanding neuroanatomy. Reliability: internal consistency. Validity: construct. MERSQI score: 12.5/18.

Synthetic: OMeR (16). Assessment tool: 8-question feedback questionnaire. Features assessed: questions about the model & simulation. Reliability: NR. Validity: NR. MERSQI score: 6/18.

Synthetic: SIMONT (59). Assessment tool: feedback questionnaire (no further details provided). Features assessed: opinions about SIMONT model. Reliability: NR. Validity: face. MERSQI score: 6/18.

Synthetic: SIMONT (13). Assessment tool: quantitative & qualitative data from using the SIMONT model. Features assessed: time required to complete the procedure, surgical technique applied. Reliability: retest, interrater. Validity: construct, face. MERSQI score: 8/18.

Exact wording from the manuscripts themselves has been used as much as possible. NR = not reported.

Construct validity assessed but no relationship found between years of experience and performance.

As such, a significant number of studies reported participant feedback without objective evidence of simulator efficacy. None of the written assessment tools, whether knowledge based or evaluation based, was used more than once across the identified studies, making comparisons between studies difficult.

Psychometric Evidence for the Assessment Tools (Reliability and Validity)

As shown in Table 6, little reliability evidence is presented by the studies. Only 4 studies described assessment of reliability. One described high interrater reliability in the video assessment of performance in carotid artery stenting using the ANGIO Mentor.52 Another study reported test-retest reliability and internal consistency in the context of simulator-derived performance measures (notably catheter handling errors by participants) when performing carotid angiography using VIST.38 A third study found acceptable internal consistency of a written assessment quiz assessing 2D and 3D anatomical relationships in the context of a cast model of the ventricular system with Model Magic.11 The fourth study reported test-retest reliability and interrater reliability in using SIMONT for neuroendoscopic tumor resection and third ventriculostomy.13
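For context, internal consistency is conventionally quantified with Cronbach's alpha (the individual studies do not all state which statistic they used, so this is given as the standard formulation):

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_{i}^{2}}{\sigma_{\text{total}}^{2}}\right)

where k is the number of items, \sigma_{i}^{2} is the variance of item i, and \sigma_{\text{total}}^{2} is the variance of the total score.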

The validation evidence from the studies was presented predominantly in the form of face validity, with participant feedback reporting realism when using the following: ImmersiveTouch,57 ANGIO Mentor,47,52 VIST,9,12,18,37 ROBO-SIM,41 SIMONT,13,59 the cadaveric deer head and spine model,56 in vivo rodent model,1,20 in vivo swine model,43 human cadaveric model with SRSP to simulate a tumor,14 and boundary element–based virtual reality (VR) simulator with haptic feedback.55

Construct validation, predominantly through the successful differentiation of novices and experts, or individuals pre- and postintervention, has been demonstrated when using the following simulators/devices: cadaveric porcine skull,10 cadaveric deer head and spine,56 EasyGuide Neuro,26 ImmersiveTouch,6,27–30,57 ANGIO Mentor,47,52 VIST,9,12,18,38 in vivo rodent model,39 human patient simulator,36 cast model of ventricular system with Model Magic,11 and SIMONT.13

Predictive validity was reported in a study using the ImmersiveTouch simulator for ventriculostomy training, as simulator practice resulted in a significantly higher chance of ventriculostomy success at first attempt in real patients in the operating room.57 In another study, the assessment scale used (the Operative Performance Rating Scale) was stated to be adapted from a previously validated assessment tool,10 but no validity or reliability evidence was presented.

Discussion

This systematic review, the first to our knowledge to evaluate simulation as an educational and training tool in neurosurgery, has demonstrated the presence of several different simulators at the neurosurgeon's disposal, with evidence for improved performance in a range of procedures, including ventriculostomy, neuroendoscopic procedures, and spinal surgery. In one study, critical care scenarios rather than surgical procedures were evaluated,36 highlighting the diverse role that simulation could play in neurosurgical education and training. The majority of studies present participant feedback, which is generally positive, with the simulators felt to be realistic.

These positive findings should be interpreted against the limitations of the evidence base. We found that many of the studies were hampered by one or more of the following shortcomings: nonrandomized design; presenting normal rather than abnormal anatomy to participants; lack of control groups and long-term follow-up; poor reporting of study methods and data; lack of evidence of improved simulator performance translating into clinical benefit; and poor reliability and validity evidence for the assessment tools used.

These shortcomings were reflected in the systematically appraised quality of the reviewed articles, which scored a mean of 9.21 out of a maximum of 18 on the MERSQI, a multifaceted instrument for assessing the quality of medical education studies.42 The MERSQI has been shown to have reliability and validity evidence, with high interrater and intrarater reliability and scores correlating well with expert ratings of study quality, 3-year citation rate, and journal impact factor.42 The mean score of our studies is lower than that found in systematic reviews of simulation-based education in central venous catheterization (12.6),31 laparoscopic surgery (11.9),58 and technology-enhanced simulation for health professions education (11.6).8 Although there is no preexisting evidence to confirm the relevance of the MERSQI to neurosurgical simulation studies, its generic nature means that its power lies in the ability to make comparisons between specialties on the quality of educational studies being performed. As long as the implications of the MERSQI score are not overstated (it has little direct relevance to clinical outcomes), it could serve as a benchmark against which neurosurgical education researchers strive to advance training and education within neurosurgery in a robust, evidence-based manner.

Traditionally, validity has often been incorrectly sought for the simulator/test itself as opposed to the results obtained from it.24 Most studies identified by this systematic review presented little or no validity evidence to support the simulator and its associated performance metrics as assessment tools, evidence that is important in ensuring that any benefits observed on the simulator are likely to be relevant to clinical practice. Although the “look and feel” of a simulator and assessment tool is important for acceptability to the learner (a concept traditionally referred to as “face validity”), this is notoriously difficult to assess. Some consider face validity to be the weakest form of validity,51 and others disagree that it is a form of validity at all, stating that the term should be abandoned from the literature altogether.25

Indeed, it is interesting to note that while no studies reported presenting the simulator in a high-fidelity setting (for example, a fully simulated operating room), many studies reported that the participants found the simulation realistic. Definitions of validity and reliability used in the surgical literature are inconsistent,53 and inconsistencies in validation study methodologies within this review limited our ability to draw strong conclusions about the effectiveness of simulation-based curricula15,49 and the transfer of skills from the simulation setting to the operating room.48 These open questions remain to be addressed in future research.

The concept of validity in the assessment literature has changed over the past few decades, but the surgical education literature has generally been slow to adapt.25 The use of terms describing distinct types of validity (such as construct and concurrent validity) has long been considered outdated. Since 1985 the American Educational Research Association has stated a preference for referring to types of validity evidence as opposed to distinct types of validity, with validity representing the “appropriateness, meaningfulness, and usefulness of the specific inferences made from test scores.”3,4 Types of validity evidence encompass the need to find validity evidence for the results or metrics of a test through multiple sources, including “test content, response process, internal structure, and relationships to other criteria.”25 A paradigm shift in the surgical education literature toward this modern concept of validity will be vital in ensuring that appropriate judgments of performance are made.

In surgical specialties other than neurosurgery, there is an ever-increasing repertoire of evidence, including from double-blind randomized studies, that simulation training improves skills and reduces error in clinical settings.2,7,44,45,54 In neurosurgery the gravity of technical error is arguably greater, and the potential benefit of simulation in providing a safe environment for trainees to improve their skills should be explored further. In our systematic review, only 1 study of neurosurgical simulation training analyzed simulated performance at a procedure some time after training and correlated simulation training with clinical performance and patient outcomes. In this study, involving 16 neurosurgical residents using the ImmersiveTouch simulator for ventriculostomy,57 the probability of successful ventricular cannulation 1 month after training was higher than that pretraining, but not as high as that immediately posttraining, suggesting the need for ongoing education (such as refresher sessions) to maintain benefits. Furthermore, the probability of successful ventricular cannulation in real patients at the first attempt was also higher after training than before it, although in this small cohort the risk of hemorrhage formation around the site of ventriculostomy postoperatively was not reduced. Importantly, no other outcome measures were reported; the effect of such hemorrhage on clinical outcome is debatable, and it can be argued that more important outcome measures exist. Nevertheless, the authors of this study should be commended for their efforts.

Limitations

This systematic review is limited by the heterogeneity in research methodology and design, as well as the quality of the included studies, as highlighted above. The wide variability in outcome measures, simulators used, and participant profiles made it difficult to interpret findings collectively. Of course, no review (even a systematic review such as this one) can claim to find all relevant papers. Indeed, we included only articles published in the English language, which means that we may have missed some important studies published in other languages. The risk of publication bias is also important, as negative findings are much less likely to be published than positive ones. Despite these limitations, this is the first systematic review to synthesize the current evidence base on simulation in neurosurgical education and training.

Future Directions for Simulation-Based Training in Neurosurgery

Aside from the clear need to improve the methodological quality and reporting of future studies, there are important wider considerations for neurosurgical simulation. The influence of nontechnical skills such as communication and leadership on surgeons' technical performance is well recognized, especially in general surgery.19 Simulation (including high-fidelity full operating room simulation) may serve as a useful platform for nontechnical skills training in neurosurgeons and should be explored. In addition, there is now substantial evidence from other specialties that deliberate practice with simulation is an effective learning strategy,21 and its implementation into neurosurgical curricula should be strongly considered.

Conclusions

This systematic review has demonstrated qualitative and quantitative benefits for a range of neurosurgical simulators, but the identified studies were often of poor quality and reported in a suboptimal manner. Future studies should seek to improve study design and reporting and provide long-term follow-up data on simulated and/or patient outcomes. This will demonstrate the true benefits of simulation in neurosurgery and will facilitate professional bodies' and program directors' decision making regarding optimal integration of simulation training into the neurosurgery curriculum and residency.

Disclosure

M.A.K. is funded by a UK National Institute for Health Research (NIHR) Academic Clinical Fellowship in Neurosurgery. M.A.K. and M.A. are Education Associates at the UK General Medical Council. M.A. and N.S. are affiliated with the Imperial Centre for Patient Safety and Service Quality (www.cpssq.org), which is funded by the NIHR. For the remaining authors no relevant disclosures are declared.

Author contributions to the study and manuscript preparation include the following. Conception and design: Kirkman, Sevdalis. Acquisition of data: Kirkman, Ahmed, Albert. Analysis and interpretation of data: Kirkman, Sevdalis. Drafting the article: Kirkman. Critically revising the article: all authors. Reviewed submitted version of manuscript: all authors. Approved the final version of the manuscript on behalf of all authors: Kirkman. Statistical analysis: Kirkman. Study supervision: Wilson, Nandi, Sevdalis.

Appendix

Technical information about some of the simulators identified in this systematic review.

ImmersiveTouch simulator

Developed at the University of Illinois at Chicago, the ImmersiveTouch augmented virtual reality system combines real-time haptic feedback with a high-resolution stereoscopic display. An electromagnetic head-tracking system provides a dynamic perspective as the user moves his/her head and, in combination with a half-silvered mirror, creates an augmented reality environment integrating the surgeon's hands, virtual surgical instrument, and virtual patient (based on imaging data). Haptic feedback enhances the experience through, for example, decreasing resistance as one penetrates the ventricle in a virtual ventriculostomy.

Vascular Intervention System Training (VIST)

The Procedicus VIST simulator (Mentice AB) consists of a computer-based software interface connected to a haptic device permitting the insertion and manipulation of catheters, balloons, wires, stents, and embolic protection devices with 3D anatomical reconstruction of patient data. VIST can simulate fluoroscopic imaging, contrast angiography, and pathology.

ANGIO Mentor

The ANGIO Mentor (Simbionix USA Corp.) is used to simulate diagnostic and interventional procedures. It consists of a computer, 2 liquid-crystal display screens, a haptics device, and controls that permit, for example, contrast medium injection, table movement, balloon inflation, and stent deployment.

ROBO-SIM

This system benefits from the use of actual patient data and can simulate tissue deformation. Haptic feedback is provided by Laparoscopic Impulse Engines, which use cables to deliver force feedback; a disadvantage is that the high internal friction of the cables makes simulating small forces difficult. To date, the use of ROBO-SIM in neurosurgery has been reported in only 1 study.
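Why cable friction degrades small-force rendering can be shown with a toy Coulomb-friction model. The 0.5 N threshold and the function name are assumptions made purely for illustration, not measured properties of the Impulse Engines: any commanded force below the internal friction threshold is absorbed entirely, so fine haptic cues never reach the user's hand.

    # Toy Coulomb-friction model of a cable-driven haptic device.
    # The friction threshold is an assumed value for illustration only.
    CABLE_FRICTION_N = 0.5

    def rendered_force(commanded_n: float) -> float:
        """Force actually felt by the user after cable friction losses."""
        magnitude = abs(commanded_n)
        if magnitude <= CABLE_FRICTION_N:
            return 0.0  # small commands are swallowed: no haptic cue at all
        sign = 1.0 if commanded_n > 0 else -1.0
        return sign * (magnitude - CABLE_FRICTION_N)

    for f in (0.2, 0.5, 1.0, 3.0):
        print(f"{f} N commanded -> {rendered_force(f)} N felt")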

EasyGuide Neuro

The EasyGuide Neuro (Philips Medical Systems) is a frameless neuronavigation device that uses infrared tracking, a computer, and a display screen to integrate preoperative imaging with live intraoperative views of the brain. It has been used clinically since 1996.
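Frameless neuronavigation of this kind rests on a rigid registration between patient space (as tracked by the infrared camera) and preoperative image space. A minimal point-based sketch using the standard SVD (Kabsch) solution is shown below; the fiducial coordinates are invented for illustration, and this is a generic textbook method, not Philips' implementation.

    import numpy as np

    def rigid_register(patient_pts, image_pts):
        """Least-squares rigid transform (R, t) mapping patient space to
        image space from paired fiducials, via the SVD/Kabsch method."""
        P = np.asarray(patient_pts, dtype=float)
        Q = np.asarray(image_pts, dtype=float)
        p0, q0 = P.mean(axis=0), Q.mean(axis=0)
        H = (P - p0).T @ (Q - q0)               # cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = q0 - R @ p0
        return R, t

    # Invented fiducial pairs for illustration only:
    patient = [[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]]
    image   = [[5, 5, 5], [5, 15, 5], [-5, 5, 5], [5, 5, 15]]
    R, t = rigid_register(patient, image)
    print(np.round(R, 3), np.round(t, 3))  # recovers the 90° rotation + offset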

Sinus Model Oto-Rhino Neuro Trainer

A weakness of cadaveric models for neuroendoscopic training is the absence of ventriculomegaly in most specimens. The Sinus Model Oto-Rhino Neuro Trainer (SIMONT) comprises a prosthetic head model containing silicone and fiberglass molds in the shape of the cerebral ventricles in addition to normal intraventricular features (choroid plexus, blood vessels) and pathology (tumors, cysts).

Cast of the ventricular system with Crayola Model Magic

This synthetic model combines a cast of the ventricular system (Human Brain Ventricles, Carolina Biological Supply Co.) with Model Magic (Crayola) modeling compound, used to construct color-coded 3D physical models of periventricular structures.

OMeR model

The OMeR model (ONO & Co., Ltd.) combines skull bone (polyamide nylon and glass beads), brain (soft elastomer), and cerebral artery (urethane resin) components to simulate the transsylvian approach.

References

1. Abla AA, Uschold T, Preul MC, Zabramski JM: Comparative use of turkey and chicken wing brachial artery models for microvascular anastomosis training. Laboratory investigation. J Neurosurg 115:1231–1235, 2011
2. Ahlberg G, Enochsson L, Gallagher AG, Hedman L, Hogman C, McClusky DA III: Proficiency-based virtual reality training significantly reduces the error rate for residents during their first 10 laparoscopic cholecystectomies. Am J Surg 193:797–804, 2007
3. American Educational Research Association, American Psychological Association, National Council on Measurement in Education: Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association, 1985
4. American Educational Research Association, American Psychological Association, National Council on Measurement in Education: Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association, 1999
5. Banerjee PP, Luciano CJ, Lemole GM Jr, Charbel FT, Oh MY: Accuracy of ventriculostomy catheter placement using a head- and hand-tracked high-resolution virtual reality simulator with haptic feedback. J Neurosurg 107:515–521, 2007
6. Banerjee PP, Yudkowsky R, Lemole M, Charbel F, Luciano C: Using a high-fidelity virtual reality and haptics-based simulation to determine the "learning curve" of neurosurgery residents' surgical skills. Simul Healthc 2:145, 2007 (Abstract)
7. Cho A, Basson S, Tsang T: Outcomes of a structured training programme for paediatric laparoscopic inguinal hernia repair. J Pediatr Surg 48:404–407, 2013
8. Cook DA, Hatala R, Brydges R, Zendejas B, Szostek JH, Wang AT: Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA 306:978–988, 2011
9. Dayal R, Faries PL, Lin SC, Bernheim J, Hollenbeck S, DeRubertis B: Computer simulation as a component of catheter-based training. J Vasc Surg 40:1112–1117, 2004
10. Ereso AQ, Garcia P, Tseng E, Gauger G, Kim H, Dua MM: Live transference of surgical subspecialty skills using telerobotic proctoring to remote general surgeons. J Am Coll Surg 211:400–411, 2010
11. Estevez ME, Lindgren KA, Bergethon PR: A novel three-dimensional tool for teaching human neuroanatomy. Anat Sci Educ 3:309–317, 2010
12. Fargen KM, Siddiqui AH, Veznedaroglu E, Turner RD, Ringer AJ, Mocco J: Simulator based angiography education in neurosurgery: results of a pilot educational program. J Neurointerv Surg 4:438–441, 2012
13. Filho FV, Coelho G, Cavalheiro S, Lyra M, Zymberg ST: Quality assessment of a new surgical simulator for neuroendoscopic training. Neurosurg Focus 30(4):E17, 2011
14. Gragnaniello C, Nader R, van Doormaal T, Kamel M, Voormolen EH, Lasio G: Skull base tumor model. Laboratory investigation. J Neurosurg 113:1106–1111, 2010
15. Gurusamy KS, Aggarwal R, Palanivelu L, Davidson BR: Virtual reality training for surgical trainees in laparoscopic surgery. Cochrane Database Syst Rev (1):CD006575, 2009
16. Harada N, Kondo K, Miyazaki C, Nomoto J, Kitajima S, Nemoto M: Modified three-dimensional brain model for study of the trans-sylvian approach. Neurol Med Chir (Tokyo) 51:567–571, 2011
17. Hoh BL, Neal DW, Kleinhenz DT, Hoh DJ, Mocco J, Barker FG II: Higher complications and no improvement in mortality in the ACGME resident duty-hour restriction era: an analysis of more than 107,000 neurosurgical trauma patients in the Nationwide Inpatient Sample database. Neurosurgery 70:1369–1382, 2012
18. Hsu JH, Younan D, Pandalai S, Gillespie BT, Jain RA, Schippert DW: Use of computer simulation for determining endovascular skill levels in a carotid stenting model. J Vasc Surg 40:1118–1125, 2004
19. Hull L, Arora S, Aggarwal R, Darzi A, Vincent C, Sevdalis N: The impact of nontechnical skills on technical performance in surgery: a systematic review. J Am Coll Surg 214:214–230, 2012
20. Hwang G, Oh CW, Park SQ, Sheen SH, Bang JS, Kang HS: Comparison of different microanastomosis training models: model accuracy and practicality. J Korean Neurosurg Soc 47:287–290, 2010
21. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ: Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 27:10–28, 2005
22. Kirkman MA: Deliberate practice, domain-specific expertise, and implications for surgical education in current climes. J Surg Educ 70:309–317, 2013
23. Kneebone R, Arora S, King D, Bello F, Sevdalis N, Kassab E: Distributed simulation—accessible immersive training. Med Teach 32:65–70, 2010
24. Korndorffer JR Jr, Hayes DJ, Dunne JB, Sierra R, Touchard CL, Markert RJ: Development and transferability of a cost-effective laparoscopic camera navigation simulator. Surg Endosc 19:161–167, 2005
25. Korndorffer JR Jr, Kasten SJ, Downing SM: A call for the utilization of consensus standards in the surgical education literature. Am J Surg 199:99–104, 2010
26. Krombach G, Ganser A, Fricke C, Rohde V, Reinges M, Gilsbach J: Virtual placement of frontal ventricular catheters using frameless neuronavigation: an "unbloody training" for young neurosurgeons. Minim Invasive Neurosurg 43:171–175, 2000
27. Lemole M, Banerjee PP, Luciano C, Charbel F, Oh M: Virtual ventriculostomy with 'shifted ventricle': neurosurgery resident surgical skill assessment using a high-fidelity haptic/graphic virtual reality simulator. Neurol Res 31:430–431, 2009
28. Luciano C, Banerjee P, Lemole GM Jr, Charbel F: Second generation haptic ventriculostomy simulator using the ImmersiveTouch system. Stud Health Technol Inform 119:343–348, 2006
29. Luciano CJ, Banerjee PP, Bellotte B, Oh GM, Lemole M Jr, Charbel FT: Learning retention of thoracic pedicle screw placement using a high-resolution augmented reality simulator with haptic feedback. Neurosurgery 69(1 Suppl Operative):ons14–ons19, 2011
30. Luciano CJ, Banerjee PP, Sorenson JM, Foley KT, Ansari SA, Rizzi S: Percutaneous spinal fixation simulation with virtual reality and haptics. Neurosurgery 72(Suppl 1):89–96, 2013
31. Ma IWY, Brindle ME, Ronksley PE, Lorenzetti DL, Sauve RS, Ghali WA: Use of simulation-based education to improve outcomes of central venous catheterization: a systematic review and meta-analysis. Acad Med 86:1137–1147, 2011
32. Marcus H, Vakharia V, Kirkman MA, Murphy M, Nandi D: Practice makes perfect? The role of simulation-based deliberate practice and script-based mental rehearsal in the acquisition and maintenance of operative neurosurgical skills. Neurosurgery 72(Suppl 1):124–130, 2013
33. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB: Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med 86:706–711, 2011
34. Moher D, Liberati A, Tetzlaff J, Altman DG: Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med 6:e1000097, 2009
35. Moonesinghe SR, Lowery J, Shahi N, Millen A, Beard JD: Impact of reduction in working hours for doctors in training on postgraduate medical education and patients' outcomes: systematic review. BMJ 342:d1580, 2011
36. Musacchio MJ Jr, Smith AP, McNeal CA, Munoz L, Rothenberg DM, von Roenn KA: Neuro-critical care skills training using a human patient simulator. Neurocrit Care 13:169–175, 2010
37. Nicholson WJ, Cates CU, Patel AD, Niazi K, Palmer S, Helmy T: Face and content validation of virtual reality simulation for carotid angiography: results from the first 100 physicians attending the Emory NeuroAnatomy Carotid Training (ENACT) program. Simul Healthc 1:147–150, 2006
38. Patel AD, Gallagher AG, Nicholson WJ, Cates CU: Learning curves and reliability measures for virtual reality simulation in the performance assessment of carotid angiography. J Am Coll Cardiol 47:1796–1802, 2006
39. Pichierri A, Frati A, Santoro A, Lenzi J, Delfini R, Pannarale L: How to set up a microsurgical laboratory on small animal models: organization, techniques, and impact on residency training. Neurosurg Rev 32:101–110, 2009
40. Purcell Jackson G, Tarpley JL: How long does it take to train a surgeon? BMJ 339:b4260, 2009
41. Radetzky A, Nürnberger A: Visualization and simulation techniques for surgical simulators using actual patient's data. Artif Intell Med 26:255–279, 2002
42. Reed DA, Cook DA, Beckman TJ, Levine RB, Kern DE, Wright SM: Association between funding and quality of published medical education research. JAMA 298:1002–1009, 2007
43. Regelsberger J, Heese O, Horn P, Kirsch M, Eicker S, Sabel M: Training microneurosurgery – four years experiences with an in vivo model. Cent Eur Neurosurg 72:192–195, 2011
44. Scott DJ, Bergen PC, Rege RV, Laycock R, Tesfay ST, Valentine RJ: Laparoscopic training on bench models: better and more cost effective than operating room experience? J Am Coll Surg 191:272–283, 2000
45. Seymour NE: VR to OR: a review of the evidence that virtual reality simulation improves operating room performance. World J Surg 32:182–188, 2008
46. Spetzger U, Krombach G, Reinges M, Gilsbach JM, Schmidt T: Navigational microneurosurgery: experience with the EasyGuide Neuro. Medicamundi 41:28–35, 1997
47. Spiotta AM, Rasmussen PA, Masaryk TJ, Benzel EC, Schlenk R: Simulated diagnostic cerebral angiography in neurosurgical training: a pilot program. J Neurointerv Surg 5:376–381, 2013
48. Sturm LP, Windsor JA, Cosman PH, Cregan P, Hewett PJ, Maddern GJ: A systematic review of skills transfer after surgical simulation training. Ann Surg 248:166–179, 2008
49. Sutherland LM, Middleton PF, Anthony A, Hamdorf J, Cregan P, Scott D: Surgical simulation: a systematic review. Ann Surg 243:291–300, 2006
50. Temple J: Time for Training. A Review of the Impact of the European Working Time Directive on the Quality of Training. London: Medical Education England, 2010
51. Trochim WMK: The Research Methods Knowledge Base, ed 2. Cincinnati, OH: Atomic Dog Publishing, 2001
52. Van Herzeele I, Aggarwal R, Neequaye S, Hamady M, Cleveland T, Darzi A: Experienced endovascular interventionalists objectively improve their skills by attending carotid artery stent training courses. Eur J Vasc Endovasc Surg 35:541–550, 2008
53. Van Nortwick SS, Lendvay TS, Jensen AR, Wright AS, Horvath KD, Kim S: Methodologies for establishing validity in surgical simulation studies. Surgery 147:622–630, 2010
54. Van Sickle KR, Ritter EM, Baghai M, Goldenberg AE, Huang IP, Gallagher AG: Prospective, randomized, double-blind trial of curriculum-based training for intracorporeal suturing and knot tying. J Am Coll Surg 207:560–568, 2008
55. Vloeberghs M, Glover A, Benford S, Jones A, Wang P, Becker A: Virtual neurosurgery, training for the future. Br J Neurosurg 21:262–267, 2007
56. Walker JB, Perkins E, Harkey HL: A novel simulation model for minimally invasive spine surgery. Neurosurgery 65(6 Suppl):188–195, 2009
57. Yudkowsky R, Luciano C, Banerjee P, Schwartz A, Alaraj A, Lemole GM Jr: Practice on an augmented reality/haptic simulator and library of virtual brains improves residents' ability to perform a ventriculostomy. Simul Healthc 8:25–31, 2013
58. Zendejas B, Brydges R, Hamstra SJ, Cook DA: State of the evidence on simulation-based training for laparoscopic surgery: a systematic review. Ann Surg 257:586–593, 2013
59. Zymberg S, Vaz-Guimarães Filho F, Lyra M: Neuroendoscopic training: presentation of a new real simulator. Minim Invasive Neurosurg 53:44–46, 2010


Article Information

Address correspondence to: Matthew Kirkman, M.R.C.S., M.Ed., Victor Horsley Department of Neurosurgery, The National Hospital for Neurology and Neurosurgery, Queen Square, London WC1N 3BG, United Kingdom. email: matthew.kirkman@gmail.com.

Please include this information when citing this paper: published online June 20, 2014; DOI: 10.3171/2014.5.JNS131766.

© AANS, except where prohibited by US copyright law.

Figures

[Figure] Flow diagram illustrating our search strategy.

[Figure] The ImmersiveTouch system. Left: The system in operation. Right: Simulated ventriculostomy catheter insertion. Reproduced with permission from Banerjee et al: J Neurosurg 107:515–521, 2007.

[Figure] The Procedicus VIST virtual reality simulator (Mentice AB, Gothenburg, Sweden). Reprinted from J Am Coll Cardiol 47, Patel et al., "Learning curves and reliability measures for virtual reality simulation in the performance assessment of carotid angiography," pp 1796–1802, 2006, with permission from the American College of Cardiology Foundation.

