Browse

You are looking at 1–10 of 10,878 items.

Free access

Florian Bernard, Julien Haemmerli, Gregory Zegarek, Daniel Kiss-Bodolay, Karl Schaller, and Philippe Bijlenga

Visualizing major periventricular anatomical landmarks intraoperatively during brain tumor removal is a decisive step toward preserving those structures and thus the patient's postoperative quality of life. The aim of this study was to describe potential standardized preoperative planning using standard landmarks and procedures and to demonstrate the feasibility of using augmented reality (AR) to assist in performing surgery according to these “roadmaps.” The authors depict stepwise AR surgical roadmaps applied to periventricular brain surgery with the aim of preserving major cognitive functions. Beyond the technological aspects, this study highlights the importance of emerging technologies as potential tools to integrate information and to identify and visualize landmarks to be used during tumor removal.

Free access

Michael E. Ivan, Daniel G. Eichberg, Long Di, Ashish H. Shah, Evan M. Luther, Victor M. Lu, Ricardo J. Komotar, and Timur M. Urakov

OBJECTIVE

Monitor and wand–based neuronavigation stations (MWBNSs) for frameless intraoperative neuronavigation are routinely used in cranial neurosurgery. However, they are temporally and spatially cumbersome; the OR must be arranged around the MWBNS, at least one hand must be used to manipulate the MWBNS wand (interrupting a bimanual surgical technique), and the surgical workflow is interrupted as the surgeon stops to “check the navigation” on a remote monitor. Thus, there is a need for continuous, real-time, hands-free neuronavigation solutions. Augmented reality (AR) is poised to address these issues. The authors present the first reported prospective pilot study investigating the feasibility of using the OpenSight application with an AR head-mounted display to map out the borders of tumors in patients undergoing elective craniotomy for tumor resection, and to compare the degree of correspondence with MWBNS tracing.

METHODS

Eleven consecutive patients undergoing elective craniotomy for brain tumor resection were prospectively identified and underwent circumferential tumor border tracing at the time of incision planning by a surgeon wearing HoloLens AR glasses running the commercially available OpenSight application registered to the patient and the preoperative MRI. Each patient then underwent circumferential tumor border tracing using the StealthStation S8 MWBNS. Postoperatively, the two tumor border tracings were compared by two blinded board-certified neurosurgeons and rated as having excellent, adequate, or poor correspondence based on a subjective assessment of the overlap. Objective overlap area measurements were also obtained.
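The objective overlap measurement can be illustrated with a Dice-style overlap of two binary tracing masks. This is a sketch of a common overlap metric and not necessarily the one the authors used; the disk-shaped masks are invented for illustration:

```python
import numpy as np

def overlap_metrics(mask_a: np.ndarray, mask_b: np.ndarray):
    """Intersection area, union area, and Dice coefficient of two
    equally shaped boolean tracing masks."""
    inter = int(np.logical_and(mask_a, mask_b).sum())
    union = int(np.logical_or(mask_a, mask_b).sum())
    dice = 2.0 * inter / (int(mask_a.sum()) + int(mask_b.sum()))
    return inter, union, dice

# Toy example: two slightly offset disk-shaped "tumor borders" on a grid
yy, xx = np.mgrid[:100, :100]
ar_mask = (xx - 50) ** 2 + (yy - 50) ** 2 <= 20 ** 2   # AR tracing
nav_mask = (xx - 54) ** 2 + (yy - 50) ** 2 <= 20 ** 2  # MWBNS tracing
inter, union, dice = overlap_metrics(ar_mask, nav_mask)
```

Here the two offset disks stand in for the AR-traced and MWBNS-traced borders; a Dice value near 1 would correspond to a high degree of correspondence.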

RESULTS

Eleven patients undergoing craniotomy were included in the study. Five procedures were rated as having excellent correspondence, 5 adequate correspondence, and 1 poor correspondence. The two raters agreed on the rating in all cases, and AR tracing was possible in all cases.

CONCLUSIONS

In this small pilot study, the authors found that AR was implementable in the workflow of a neurosurgery OR, and was a feasible method of preoperative tumor border identification for incision planning. Future studies are needed to identify strategies to improve and optimize AR accuracy.

Free access

Simon Skyrman, Marco Lai, Erik Edström, Gustav Burström, Petter Förander, Robert Homan, Flip Kor, Ronald Holthuizen, Benno H. W. Hendriks, Oscar Persson, and Adrian Elmi-Terander

OBJECTIVE

The aim of this study was to evaluate the accuracy (deviation from the target or intended path) and efficacy (insertion time) of an augmented reality surgical navigation (ARSN) system for insertion of biopsy needles and external ventricular drains (EVDs), two common neurosurgical procedures that require high precision.

METHODS

The hybrid operating room–based ARSN system, comprising a robotic C-arm with intraoperative cone-beam CT (CBCT) and integrated video tracking of the patient and instruments using nonobtrusive adhesive optical markers, was used. A 3D-printed skull phantom with a realistic gelatinous brain model containing air-filled ventricles and 2-mm spherical biopsy targets served as the test object. After initial CBCT acquisition for target registration and planning, ARSN was used for 30 cranial biopsies and 10 EVD insertions. Needle positions were verified by CBCT.

RESULTS

The mean accuracy of the biopsy needle insertions (n = 30) was 0.8 ± 0.43 mm. The median path length was 39 mm (range 16–104 mm) and did not correlate with accuracy (p = 0.15). The median device insertion time was 149 seconds (range 87–233 seconds). The mean accuracy of the EVD insertions (n = 10) was 2.9 ± 0.8 mm at the tip, with an angular deviation of 0.7° ± 0.5° from the planned path; the median insertion time was 188 seconds (range 135–400 seconds).
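Accuracy figures of this kind, tip deviation and angular deviation from the planned path, can be computed from entry and tip coordinates. A minimal sketch, with invented coordinates in millimeters:

```python
import numpy as np

def trajectory_errors(entry, planned_tip, actual_tip):
    """Tip deviation (mm) and angular deviation (degrees) of an inserted
    device relative to its planned path from a common entry point."""
    entry, planned_tip, actual_tip = (np.asarray(p, dtype=float)
                                      for p in (entry, planned_tip, actual_tip))
    tip_error = float(np.linalg.norm(actual_tip - planned_tip))
    v_plan, v_act = planned_tip - entry, actual_tip - entry
    cos_a = v_plan @ v_act / (np.linalg.norm(v_plan) * np.linalg.norm(v_act))
    angle = float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))
    return tip_error, angle

# Toy case: 40-mm planned path, actual tip landing 2 mm off laterally
tip_err, ang = trajectory_errors([0, 0, 0], [0, 0, 40], [2, 0, 40])
```

A 2-mm lateral miss at the end of a 40-mm path corresponds to an angular deviation of just under 3°, which is why longer paths can tolerate smaller angles for the same tip accuracy.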

CONCLUSIONS

This study demonstrated that ARSN can be used for navigation of percutaneous cranial biopsies and EVDs with high accuracy and efficacy.

Free access

Steven Knafo, Nicolas Penet, Stephan Gaillard, and Fabrice Parker

OBJECTIVE

Simulation is gaining momentum as a new modality of medical training, particularly in acute care settings such as surgery. In the present study, the authors aimed to compare individual cognitive skills with manual abilities as assessed by virtual reality (VR) simulation among neurosurgical residents.

METHODS

Participants were asked to complete a multiple-choice questionnaire assessing their surgical abilities regarding three basic neurosurgical procedures (endoscopic third ventriculostomy, cranial meningioma, and lumbar laminectomy). They subsequently performed these same three procedures on a VR simulator (NeuroTouch).

RESULTS

The authors found that cognitive scores correlated with self-evaluated surgical experience and autonomy. In contrast, VR simulation performance, as assessed by NeuroTouch automated scoring, reflected neither participants' cognitive scores nor their self-evaluated surgical proficiency.

CONCLUSIONS

The results of this study suggest that neurosurgical education should focus as much on cognitive simulation (e.g., careful planning and critical appraisal of actual procedures) as on VR training of visuomotor skills.

Free access

Yun-Sik Dho, Sang Joon Park, Haneul Choi, Youngdeok Kim, Hyeong Cheol Moon, Kyung Min Kim, Ho Kang, Eun Jung Lee, Min-Sung Kim, Jin Wook Kim, Yong Hwy Kim, Young Gyu Kim, and Chul-Kee Park

OBJECTIVE

With the advancement of 3D modeling techniques and visualization devices, augmented reality (AR)–based navigation (AR navigation) is being actively developed. The authors present a pilot model of their newly developed inside-out tracking AR navigation system.

METHODS

The inside-out AR navigation technique was developed based on a visual inertial odometry (VIO) algorithm. A Quick Response (QR) marker was created and used for the image feature–detection algorithm. Inside-out AR navigation proceeds through the steps of visualization device recognition, marker recognition, AR implementation, and registration within the running environment. A virtual 3D patient model for AR rendering and a 3D-printed patient model for validating registration accuracy were created. Inside-out tracking was used for the registration. Registration accuracy was validated with intuitive, visualization-based, and quantitative methods that identify coordinate-matching errors. Fine-tuning and opacity-adjustment functions were also developed.
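The marker-based registration step can be sketched as a least-squares rigid alignment of corresponding fiducial points. The Kabsch (SVD-based) algorithm below is an assumption on our part, since the abstract does not state which alignment method the system uses:

```python
import numpy as np

def rigid_register(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (rotation R, translation t) mapping
    src fiducial points onto dst, via the Kabsch algorithm."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])                   # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Toy check: recover a known 30-degree rotation about z plus a translation
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
fiducials = np.array([[0.0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]])
moved = fiducials @ R_true.T + np.array([1.0, 2.0, 3.0])
R_est, t_est = rigid_register(fiducials, moved)
```

With at least three non-collinear fiducials (here four non-coplanar ones), the estimated rotation and translation recover the known transform exactly up to numerical precision.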

RESULTS

ARKit-based inside-out AR navigation was developed. The fiducial markers of the AR model and those of the 3D-printed patient model overlapped correctly at all locations, without error. The tumor and anatomical structures displayed by AR navigation precisely overlapped the tumors and structures placed in the intracranial space of the 3D-printed patient model. Registration accuracy was quantified using coordinates; the average moving errors along the x-axis and y-axis were 0.52 ± 0.35 and 0.05 ± 0.16 mm, respectively, and the gradients from the x-axis and y-axis were 0.35° and 1.02°, respectively. The fine-tuning and opacity-adjustment functions were demonstrated in the accompanying videos.
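Per-axis moving errors of this kind reduce to signed coordinate differences between matched points, summarized as a mean and standard deviation per axis. A minimal sketch with invented coordinates:

```python
import numpy as np

# Invented matched fiducial coordinates (mm): AR overlay vs printed model
ar_pts  = np.array([[10.0, 20.0], [30.0, 40.0], [50.0, 60.0], [70.0, 80.0]])
ref_pts = np.array([[10.5, 20.1], [30.4, 39.9], [50.9, 60.2], [70.3, 79.8]])

errors = ar_pts - ref_pts              # signed moving error per point and axis
mean_err = errors.mean(axis=0)         # mean error along x and y
sd_err = errors.std(axis=0, ddof=1)    # sample standard deviation per axis
```

Note that signed errors in opposite directions can cancel in the mean (as in the y column here), which is why the per-axis standard deviation is reported alongside it.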

CONCLUSIONS

The authors developed a novel inside-out tracking–based AR navigation system and validated its registration accuracy. This technique could form the basis of a navigation system for patient-specific neurosurgery.

Free access

Gagandeep Singh, Tejasvi Kainth, Nihal Manjila, Shubham Jain, Anatoliy Vaysberg, Vadim Spektor, Prateek Prasanna, and Sunil Manjila

Free access

Frederick Van Gestel, Taylor Frantz, Cédric Vannerom, Anouk Verhellen, Anthony G. Gallagher, Shirley A. Elprama, An Jacobs, Ronald Buyl, Michaël Bruneau, Bart Jansen, Jef Vandemeulebroucke, Thierry Scheerlinck, and Johnny Duerinck

OBJECTIVE

The traditional freehand technique remains the most frequently used method for external ventricular drain (EVD) placement, but it is also the primary risk factor for inaccurate drain placement. Because this procedure could benefit from image guidance, the authors set out to demonstrate the impact of augmented reality (AR) assistance on the accuracy and learning curve of EVD placement compared with the freehand technique.

METHODS

Sixteen medical students performed a total of 128 EVD placements on a custom-made phantom head, both before and after receiving a standardized training session. They were guided by either the freehand technique or by AR, which provided an anatomical overlay and tailored guidance for EVD placement through inside-out infrared tracking. The outcome was quantified by the metric accuracy of EVD placement as well as by its clinical quality.

RESULTS

The mean target error was significantly reduced by both AR guidance (p = 0.003) and training (p = 0.02) in a direct comparison with untrained freehand performance. Both untrained (11.9 ± 4.5 mm) and trained (12.2 ± 4.7 mm) AR performances were significantly better than untrained freehand performance (19.9 ± 4.2 mm), which improved after training (13.5 ± 4.7 mm). The quality of EVD placement, as assessed by the modified Kakarla scale (mKS), was significantly impacted by AR guidance (p = 0.005) but not by training (p = 0.07). Both untrained and trained AR performances (59.4% mKS grade 1 for both) were significantly better than untrained freehand performance (25.0% mKS grade 1). Spatial aptitude testing revealed a correlation between perceptual ability and untrained AR-guided performance (r = 0.63).
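A correlation such as the reported r = 0.63 is a Pearson coefficient over paired scores. The sketch below uses invented data in which higher aptitude tracks lower target error, so its r comes out strongly negative; it illustrates the computation only and does not reproduce the study's value:

```python
import numpy as np

# Invented paired scores: perceptual-ability test vs AR-guided target error (mm)
aptitude = np.array([55.0, 60, 62, 70, 75, 80, 85, 90])
target_error_mm = np.array([16.0, 15.2, 14.8, 13.5, 12.9, 11.7, 11.0, 10.1])

# Pearson correlation coefficient between the two paired vectors
r = float(np.corrcoef(aptitude, target_error_mm)[0, 1])
```

The sign of r depends on how performance is scored: against an error metric (lower is better), better aptitude yields a negative r, whereas against an accuracy or quality score it would be positive.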

CONCLUSIONS

Compared with the freehand technique, AR guidance for EVD placement yielded a higher outcome accuracy and quality for procedure novices. With AR, untrained individuals performed as well as trained individuals, which indicates that AR guidance not only improved performance but also positively impacted the learning curve. Future efforts will focus on the translation and evaluation of AR for EVD placement in the clinical setting.

Free access

Fangfang Qi, Yixiang Gan, Shengwen Wang, Yizhe Tie, Jiewen Chen, and Chunhai Li

OBJECTIVE

Minimally invasive procedures have become mainstream in surgery. Percutaneous endoscopic transforaminal discectomy for lumbar disc herniation (LDH) requires profound knowledge of endoscopic lumbar anatomy. Immersive virtual reality (VR) provides three-dimensional patient-specific models that support preclinical surgical preparation. In this study, the authors investigated the efficacy of a VR application in LDH training for orthopedic residents and postgraduates.

METHODS

VR images of the lumbar anatomy were created with immersive VR and the mAnatomy software. The study was conducted among 60 residents and postgraduates. A questionnaire was developed to assess the effectiveness of, and satisfaction with, this VR-based basic and clinical fused curriculum. The teaching effect was also evaluated with a postlecture test, with the results of the prelecture surgical examination taken as the baseline.

RESULTS

All participants in the VR group agreed that, compared with traditional teaching, VR-based education is practical, engaging, and easy to operate, and that it promotes better understanding of the anatomical structures involved in LDH. Learners in the VR group achieved higher scores on an anatomical and clinical fusion test than learners in the traditional group (84.67 ± 14.56 vs 76.00 ± 16.10, p < 0.05).
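A between-group score comparison of this kind can be sketched with Welch's t statistic. The choice of test and the per-student sample values below are assumptions for illustration, not taken from the study:

```python
import numpy as np

def welch_t(a: np.ndarray, b: np.ndarray) -> float:
    """Welch's t statistic for two independent samples (unequal variances)."""
    se2 = a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b)
    return float((a.mean() - b.mean()) / np.sqrt(se2))

# Invented per-student test scores for the two teaching groups
vr_scores = np.array([84.0, 86, 85, 83])
traditional_scores = np.array([76.0, 75, 77, 76])
t = welch_t(vr_scores, traditional_scores)
```

Welch's variant is a safe default when the two groups' variances differ, as the reported standard deviations (14.56 vs 16.10) suggest they may; a p-value would then be read from the t distribution with Welch–Satterthwaite degrees of freedom.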

CONCLUSIONS

An immersive VR-based basic and clinical fused curriculum can increase residents’ and postgraduates’ interest and support them in mastering the structural changes and complicated symptoms of LDH. However, a simplified operational process and more realistic haptics of the VR system are necessary for further surgical preparation and application.