Holographic mixed-reality neuronavigation with a head-mounted device: technical feasibility and clinical application

Ziyu Qi MD1,3, Ye Li MD, PhD2, Xinghua Xu MD, PhD1, Jiashu Zhang MD, PhD1, Fangye Li MD, PhD1, Zhichao Gan MD1,3, Ruochu Xiong MD1, Qun Wang MD, PhD1, Shiyu Zhang MD1, and Xiaolei Chen MD, PhD1
  • 1 Department of Neurosurgery, Chinese PLA General Hospital, Beijing;
  • 2 Department of Neurosurgery, Xuanwu Hospital, Capital Medical University, Beijing; and
  • 3 School of Medicine, Nankai University, Tianjin, China

OBJECTIVE

The authors aimed to evaluate the technical feasibility of a mixed-reality neuronavigation (MRN) system with a wearable head-mounted device (HMD) and to determine its clinical application and accuracy.

METHODS

A semiautomatic registration MRN system on HoloLens smart glasses was developed and tested for accuracy and feasibility. Thirty-seven patients with intracranial lesions were prospectively identified. For each patient, multimodal imaging–based holograms of lesions, markers, and surrounding eloquent structures were created and then imported to the MRN HMD. After a point-based registration, the holograms were projected onto the patient's head and observed through the HMD. The contour of the holograms was compared with standard neuronavigation (SN). The projection of the lesion boundaries perceived by the neurosurgeon on the patient's scalp was then marked with MRN and SN. The distance between the two contours generated by MRN and SN was measured so that the accuracy of MRN could be assessed.

RESULTS

MRN localization was achieved in all patients. The mean additional time required for MRN was 36.3 ± 6.3 minutes, of which the mean registration time was 2.6 ± 0.9 minutes. Preparation time trended shorter as neurosurgeon experience with the MRN system increased. The overall median deviation was 4.1 mm (IQR 3.0–4.7 mm), and 81.1% of the lesions localized by MRN were highly consistent with SN (deviation < 5.0 mm). There was a significant difference between the supine and prone positions (3.7 ± 1.1 mm vs 5.4 ± 0.9 mm, p = 0.001). The magnitudes of the deviation vectors did not correlate with lesion volume (p = 0.126) or depth (p = 0.128). There was no significant difference between operators in additional operating time (37.4 ± 4.8 minutes vs 34.6 ± 4.8 minutes, p = 0.237) or in localization deviation (3.7 ± 1.0 mm vs 4.6 ± 1.5 mm, p = 0.070).

CONCLUSIONS

This study presented a complete, clinically applicable workflow for an easy-to-use MRN system on a wearable HMD and demonstrated its technical feasibility and accuracy. Further development is required to improve the accuracy and clinical efficacy of this system.

ABBREVIATIONS

D = distance; FRE = fiducial registration error; HMD = head-mounted device; MRN = mixed-reality neuronavigation; SN = standard neuronavigation; TRE = target registration error.


Preoperative planning in neurosurgery requires a deep understanding of anatomical relationships to ensure optimal results. However, precise localization of intracranial lesions relies heavily on medical imaging, owing to the lack of natural anatomical landmarks on the head. Since the concept of image-guided surgery was introduced, commercial neuronavigation systems have been developed that can accurately determine the location and boundary of lesions during neurosurgical procedures, promoting minimally invasive surgery.1 Accurate preoperative localization of the target lesion on the patient's scalp is essential with neuronavigation systems. However, as previous literature has described,2 visual coordination between the surgical field and the navigation monitor may distract the surgeon's attention, and it requires mental transformation of 2D images into 3D physical space. Additionally, a standard neuronavigation (SN) system has bulky hardware, including an infrared camera, a navigation workstation, and accessories; with all these expensive components, an SN system usually costs more than $300,000.

Currently, new imaging technologies are emerging that combine real and virtual elements to create an interactive human-computer environment. In virtual reality, users observe virtual objects in a completely virtual environment; in augmented reality, users observe virtual objects in the physical environment.3 Mixed reality, derived from augmented reality, feeds physical scene information into the virtual environment and enables interactive digital data to be displayed over the physical environment. Commercial head-mounted devices (HMDs), such as HoloLens (Microsoft Corp.), make these technologies applicable to many fields, including healthcare,4 education,5 and psychology.6 HMDs have three key features: 1) the holographic display provides depth perception of 3D virtual objects by overlaying holograms on the user's visual field; 2) spatial mapping allows projected holograms to maintain their position in physical space even as the user moves; and 3) an interactive hands-free interface enables the user to control the device through gestures, gaze, and voice. The application of HMDs in neurosurgery has attracted attention.7–9 However, only one study so far has assessed HMDs for neuronavigation, using a manual registration method,10 and the reported registration accuracy is seemingly unsatisfactory. The relatively time-consuming and seemingly complex workflow also makes surgeons hesitant to use an HMD for navigation. An automatic registration method could improve both feasibility and accuracy. We hypothesized that a point-based, semiautomatic registration system programmed on HoloLens might improve localizing accuracy and applicability in clinical settings. We conducted this prospective pilot study to develop a low-cost, easy-to-use system on an HMD and determine its clinical feasibility and accuracy.

Methods

Patient Criteria

Thirty-seven patients who had surgery for the resection of intracranial lesions from April 2018 to February 2021 in two clinical hospitals (Chinese PLA General Hospital, Beijing, and Chinese PLA General Hospital Hainan Branch, Sanya) were prospectively identified. Detailed data on patients are shown in Table 1. The IRB and local ethics committee of the Chinese PLA General Hospital approved this study. Written informed consent was obtained from all patients or their appropriate representatives.

TABLE 1.

Characteristics of 37 patients and deviation in lesion localization with MRN

| Patient No. | Sex | Age (yrs) | Lesion Location | Pathology | Lesion Vol (cm3) | Lesion Depth (mm) | Imaging Module | Surgical Position | Additional Time (mins) | Deviation (mm) |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | F | 55 | Lt temporal | DLBCL | 27.9 | 15.9 | T1+C | Supine | 49 | 5.6 |
| 2 | F | 42 | Rt frontal | Meningioma | 12.5 | 0.0 | T1+C | Supine | 44 | 2.8 |
| 3 | M | 54 | Rt parietal | Cavernous malformation | 12.3 | 12.0 | T2 | Supine | 45 | 3.0 |
| 4 | F | 35 | Rt frontal | Cavernous malformation | 3.2 | 15.4 | T2 | Supine | 36 | 2.9 |
| 5 | F | 34 | Lt frontal | Meningioma | 12.6 | 0.0 | T1+C | Supine | 40 | 2.6 |
| 6 | F | 60 | Lt frontal | Meningioma | 19.9 | 0.0 | T1+C | Supine | 36 | 2.8 |
| 7 | F | 54 | Lt parietal | Meningioma | 14.7 | 0.0 | T1+C | Supine | 37 | 2.7 |
| 8 | F | 62 | Rt temporal | Meningioma | 21.5 | 0.0 | T1+C | Supine | 40 | 4.5 |
| 9 | F | 54 | Lt parietal | Meningioma | 20.7 | 0.0 | T1+C | Supine | 34 | 2.5 |
| 10 | M | 76 | Rt parietal | Metastasis | 13.5 | 14.5 | T1+C | Supine | 36 | 3.5 |
| 11 | M | 57 | Rt parietal | Meningioma | 16.6 | 0.0 | T1+C | Supine | 35 | 2.6 |
| 12 | M | 24 | Lt occipital | Medulloblastoma | 22.5 | 18.6 | T1+C | Prone | 38 | 4.6 |
| 13 | M | 70 | Lt parietal | Metastasis | 15.3 | 15.0 | T1+C | Prone | 41 | 5.7 |
| 14 | M | 70 | Lt frontal | Glioblastoma | 17.6 | 24.8 | T1+C | Supine | 39 | 4.3 |
| 15 | M | 37 | Rt frontal | Ependymoma | 20.9 | 20.3 | T1+C | Supine | 37 | 3.2 |
| 16 | M | 53 | Rt basal ganglia | Hematoma | 35.1 | 26.8 | CT | Supine | 33 | 4.2 |
| 17 | M | 80 | Rt basal ganglia | Hematoma | 40.5 | 27.6 | CT | Supine | 35 | 4.1 |
| 18 | M | 44 | Rt basal ganglia | Hematoma | 38.4 | 35.4 | CT | Supine | 29 | 3.0 |
| 19 | M | 40 | Rt basal ganglia | Hematoma | 41.6 | 28.1 | CT | Supine | 28 | 3.8 |
| 20 | F | 73 | Rt parietal | Metastasis | 15.5 | 16.8 | T1+C | Supine | 36 | 4.2 |
| 21 | F | 68 | Rt frontal | Meningioma | 20.4 | 0.0 | T1+C | Supine | 38 | 4.3 |
| 22 | F | 57 | Rt occipital | Meningioma | 15.7 | 0.0 | T1+C | Prone | 37 | 4.9 |
| 23 | M | 69 | Rt frontal | Metastasis | 16.4 | 0.0 | T1+C | Supine | 43 | 2.6 |
| 24 | M | 31 | Rt temporal | Meningioma | 8.1 | 0.0 | T1+C | Supine | 49 | 3.0 |
| 25 | M | 16 | Rt temporal | Cavernous malformation | 20.1 | 0.0 | T1+C | Supine | 45 | 3.0 |
| 26 | M | 49 | Lt temporal | Metastasis | 5.0 | 11.0 | T1+C | Supine | 43 | 5.2 |
| 27 | M | 84 | Rt frontal | Hematoma | 42.8 | 0.0 | CT | Supine | 31 | 3.3 |
| 28 | M | 8 | Lt frontal | Cavernous malformation | 15.4 | 21.3 | T2 | Supine | 29 | 6.6 |
| 29 | M | 58 | Rt temporal | Diffuse astrocytoma | 19.8 | 24.5 | T2 | Supine | 33 | 4.5 |
| 30 | M | 62 | Lt occipital | Meningioma | 27.7 | 0.0 | T1+C | Prone | 44 | 6.9 |
| 31 | M | 41 | Lt occipital | DLBCL | 16.8 | 24.5 | T1+C | Prone | 39 | 4.8 |
| 32 | M | 63 | Rt basal ganglia | Hematoma | 33.9 | 42.3 | CT | Supine | 28 | 3.6 |
| 33 | M | 57 | Rt basal ganglia | Hematoma | 34.0 | 37.5 | CT | Supine | 25 | 6.8 |
| 34 | M | 51 | Lt basal ganglia | Hematoma | 30.0 | 36.0 | CT | Supine | 19 | 3.1 |
| 35 | M | 51 | Rt occipital | Metastasis | 61.9 | 0.0 | T1+C | Prone | 27 | 4.7 |
| 36 | F | 27 | Lt frontal | Meningioma | 111.2 | 0.0 | T1+C | Supine | 31 | 4.2 |
| 37 | F | 66 | Rt occipital | Metastasis | 38.5 | 0.0 | T1+C | Prone | 33 | 6.3 |

DLBCL = diffuse large B-cell lymphoma; T1+C = T1-weighted with contrast.

The boundary of each lesion was defined as follows: 1) the boundary of a homogeneously enhancing lesion (including meningioma, glioblastoma, and metastasis) was determined by the enhancing part of the lesion; 2) the boundary of a heterogeneously enhancing lesion (including cavernous malformation and diffuse astrocytoma) was determined by the edge of the abnormal signal on a T2-weighted FLAIR sequence; and 3) the boundary of an intracerebral hematoma was determined by the extent of abnormal hyperdensity on CT. Patients with an intracranial lesion whose highly diffuse boundary was hard to identify were excluded.

Imaging Acquisition

The scanning parameters were defined as follows. MRI was performed on a 1.5-T scanner (Espree, Siemens) with a T2-weighted sequence (TR 5500 msec, TE 93 msec, matrix size 512 × 512, FOV 230 mm, and slice thickness 3 mm) and a 3D T1-weighted gadolinium-enhanced magnetization-prepared rapid acquisition gradient-echo sequence (MPRAGE; TR 1650 msec, TE 3.02 msec, FOV 250 × 250 mm, and slice thickness 1 mm). CT was performed on a 128-slice CT scanner (SOMATOM, Siemens) with the following parameters: tube voltage 120 kVp, window width 120, window level 40, matrix 128 × 128, FOV 251 × 251 mm, and slice thickness 1.25 mm.

To enable rigid preoperative registration of the holograms to the patient's head, 6 to 7 markers were attached to each patient's scalp around the target area for surgery, in positions where skin shift is minimal. A blue CT marker or a green MRI marker was embedded in each marker socket before scanning (Fig. 1A). After scanning, the marker sockets were replaced by registration sockets at the same positions for later holographic registration.

FIG. 1.

Materials for holographic registration. A: Disposable patient marker kit (Brainlab AG) showing registration markers for MR (1) and CT (2), the marker socket (3) into which the markers can be embedded (4 and 5), and the registration socket (6). B: Three-dimensional–printed position tool placed onto the registration socket, used for adding the registration point sequence. C: Virtual tools positioned over the real field of view.

Data Postprocessing and Holographic Visualization

All preoperative CT or MRI data were exported in DICOM format. The imaging sequence was chosen based on the radiological characteristics of the lesion and surgical requirements. After the data were imported into 3D Slicer version 4.8.1 (http://www.slicer.org/, free open-source software), automatic segmentation and 3D modeling of the lesion were performed to calculate its volume (cm3). The depth of the lesion was defined as the distance from its closest boundary to the brain surface. Lesions were categorized by hemisphere (left vs right), lobe (frontal, parietal, temporal, occipital, and basal ganglia), volume (above vs below the median), and depth (superficial [distance = 0 mm] vs deep [distance > 0 mm]). Models were created for the target lesion, the position markers, and natural anatomical landmarks on the patient's head (the nose, left ear, or right ear). All virtual models were exported in OBJ format and imported into Unity version 3.1 (Unity Technologies) for compatibility with holographic navigation. We developed a Unity module to upload the processed 3D holograms to a dedicated server, from which they were later downloaded with HoloLens. We also developed a HoloLens-based application (MR Neuron) to download the holograms and perform navigation registration. Wireless access to the data enabled visualization of high-definition holograms through the transparent lenses in front of the neurosurgeon's eyes. Figure 2 gives an overview of the entire data postprocessing workflow.
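The lesion-volume calculation described above reduces to counting voxels in a binary segmentation mask and scaling by the voxel spacing. The following is a minimal stand-in for that step, not the authors' 3D Slicer pipeline; the function name and the NumPy dependency are assumptions for illustration.

```python
import numpy as np

def lesion_volume_cm3(mask: np.ndarray, spacing_mm: tuple) -> float:
    """Volume of a binary lesion mask, given voxel spacing in mm."""
    voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    return float(mask.sum()) * voxel_mm3 / 1000.0  # mm^3 to cm^3

# A 20 x 20 x 25 voxel block at 1 x 1 x 2 mm spacing -> 20.0 cm^3
mask = np.zeros((64, 64, 64), dtype=np.uint8)
mask[10:30, 10:30, 10:35] = 1
print(lesion_volume_cm3(mask, (1.0, 1.0, 2.0)))  # 20.0
```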

FIG. 2.

An overview of the whole data postprocessing workflow. A: Preoperative MRI data in DICOM format. B: The models of the lesion (green), right ear (purple), and registration points (blue) were made in 3D Slicer software. C: The models were imported into Unity software for rendering and conversion into a HoloLens-compatible format. D and E: Wearing the HMD enables visualization of the holograms on a phantom head.

Hologram Registration

The HMD was fixed comfortably and firmly on the operator's head by adjusting the headband and overhead strap to avoid slipping during movement. Calibration of the HMD was performed before each registration using the system-provided calibration program: the operator used their index finger to "air tap" 6 holographic targets so that the device could align the actual finger position with the virtually displayed tap cursor. This took about 1 minute, and once the correct interpupillary distance was set, the holograms were clear and easy to interact with. We used a 3D printer to produce a positioning tool with a logo sticker (Fig. 1B). The MRN system recognized the tool and projected a virtual tool over the real one in the field of view (Fig. 1C); the tip of the virtual tool was always at the same location as the physical tip. With the tip positioned at the desired location, a point could be added at that exact position through a gesture command ("air tap"). Each position point was added according to the sequence indicated by the program. After all points were tapped, the system automatically matched the point data sets. An overview of the MRN registration process is shown in Fig. 3A–D.
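The automatic matching of the two ordered point sets can be sketched with the classic Kabsch/SVD solution for the least-squares rigid transform between corresponding fiducials. This is a generic illustration of point-based registration under the assumption of known correspondences, not the MR Neuron implementation; NumPy is assumed.

```python
import numpy as np

def rigid_register(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (R, t) mapping src points onto dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# 6 synthetic fiducials rotated 30 degrees about z and shifted, then recovered
rng = np.random.default_rng(0)
pts = rng.uniform(-50, 50, (6, 3))
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
moved = pts @ R_true.T + np.array([10.0, -5.0, 2.0])
R, t = rigid_register(pts, moved)
print(np.allclose(pts @ R.T + t, moved))  # True
```

With noise-free correspondences the recovered transform reproduces the target points exactly (up to floating point); with real, noisy fiducials the residual of this fit is what the FRE summarizes.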

FIG. 3.

Holographic lesion localization procedures and comparison with an SN system. A: Axial preoperative MR image obtained in a 51-year-old man with a right occipital lesion. B: Registration markers attached to the patient's head before scanning. C: Registration sockets attached to the patient's head at the same locus after scanning and anesthesia. D: Registration procedure by the neurosurgeon with the HMD. E and F: Measurement of the medial (E) and anterior (F) deviation between MRN (the boundary of the green hologram) and SN (black dashes). G and H: After coregistration with the navigation system, the operator points the navigation pointer (Brainlab AG) to the anterior boundary of the lesion. The probe direction is adjusted to be parallel with the sagittal plane and vertical to the midsagittal line. I and J: The operator points the navigation probe to the medial boundary of the lesion. K and L: On the SN monitor, the “inline 1” navigation mode is chosen; the thick green line indicates the navigation probe which is vertical to the head surface. Its extended lines touch the anterior (K) and posterior (L) boundary of the lesion. M and N: The “inline 2” navigation mode is chosen. The extended navigation pointer lines touch the medial (M) and lateral (N) boundary of the lesion.

Preoperative Planning and Accuracy Assessment

After anesthesia had been induced, the patient was positioned based on surgical requirements. The operator performed registration to an SN system (Curve, Brainlab; or S7, Medtronic) using the surface-matching mode. The navigation system calculated the fiducial registration error (FRE) to indicate registration accuracy. Subsequently, the operator wearing the HoloLens HMD performed registration, superimposing the 3D holograms onto the patient's head; the program automatically measured the FRE of the MRN system. Similar to the method used by Incekara et al.,11 the 2D projection of the lesion boundaries perceived by the operator on the patient's scalp was then marked with MRN and SN. The maximal extension of the perceived holographic lesion boundary was marked as 4 checkpoints in 4 directions: anterior, posterior, medial (upper), and lateral (lower). The same process was performed with the SN system by the same operator. A vernier caliper measured the distance (D) between corresponding checkpoints (Fig. 3E and F). The integrated deviation between MRN and SN was calculated as D = √{[(D_anterior + D_posterior)/2]² + [(D_medial + D_lateral)/2]²} or D = √{[(D_anterior + D_posterior)/2]² + [(D_superior + D_inferior)/2]²}.
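The integrated-deviation formula can be expressed directly in code. A minimal sketch with a hypothetical function name; inputs are the four checkpoint distances in mm:

```python
from math import hypot

def integrated_deviation(d_ant: float, d_post: float,
                         d_med: float, d_lat: float) -> float:
    """Integrated MRN-vs-SN deviation: Euclidean norm of the two axis-wise means."""
    return hypot((d_ant + d_post) / 2.0, (d_med + d_lat) / 2.0)

# Mean anterior-posterior offset 4 mm, mean medial-lateral offset 4 mm
print(round(integrated_deviation(3.0, 5.0, 4.0, 4.0), 2))  # 5.66
```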

In some cases, we outlined the lesion boundary on the patient's scalp under guidance with a microscope that was equipped with an augmented-reality navigation function as the gold standard.

Eventually, we performed the skin incision, craniotomy, and surgery according to the SN-localized lesion. All registrations and measurements were performed by the corresponding author (X.C., doctor A) and first author (Z.Q., doctor B), both of whom are skilled in HoloLens usage. Patients were not eligible unless both authors attended the surgery.

To acquire an impression of complete holographic lesion projection alignment, in the case of patient 35 we placed the navigation pointer at the 4 checkpoints, keeping the pointer parallel to the sagittal plane and perpendicular to the head surface (Fig. 3G–J). The SN automatically added an extension line from the pointer tip in the "inline 1" and "inline 2" modes. In the case of patient 37, after exposing the lesion, we qualitatively compared the holographic boundary of the metastasis with the actual boundary seen within the surgical field. The patient's head did not shift during the entire process, from navigation registration to lesion exposure.

The neurosurgeon or an assistant recorded the additional time required for the holographic method, defined as the duration from system setup to lesion outlining on the patient's head, including marker embedding, data postprocessing, holographic visualization, and holographic registration. Imaging time was not included, because imaging is equally indispensable for SN.

Statistical Analysis

All statistical analyses were performed using IBM SPSS version 23 (IBM Corp.). A two-related-samples Wilcoxon signed-rank test was used to compare the FRE of MRN with that of SN. Differences in deviation among groups of lesion localizations were tested with one-way ANOVA. Deviation between surgical positions was compared using the Student t-test. Spearman rank correlation was applied to analyze the relationship between deviation and lesion volume or depth. The patients were divided into two subgroups by operator: doctor A operated on group A (patients 1–22, n = 22) and doctor B on group B (patients 23–37, n = 15). The Mann-Whitney U-test was used to detect differences in deviation or additional time between the two subgroups. Quadratic regression of additional time against case number was used to fit the learning curve and test whether the operators became adept at using MRN. The threshold of statistical significance was set at p = 0.05.
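The Spearman rank correlation used above can be illustrated with a small pure-Python sketch: assign average ranks (ties share the mean rank), then compute the Pearson correlation on the ranks. This mirrors the textbook definition, not the SPSS internals; the function names are hypothetical.

```python
def rank(values):
    """Average ranks, with ties sharing the mean rank (1-based)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman rho: Pearson correlation of the two rank vectors."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Perfectly monotonic data gives rho = 1.0
print(spearman_rho([1, 2, 3, 4], [10, 20, 30, 40]))  # 1.0
```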

Results

The overall study design is demonstrated in Fig. 4. The study included 37 patients with a median age of 35 years (range 8–84 years). The lesions had a median depth of 12.0 mm (IQR 0–24.5 mm), and 17 of 37 lesions (45.9%) reached the surface of the brain. The median lesion volume was 20.10 cm3 (IQR 15.35–33.95 cm3). The supine position was employed in 30 patients (81.1%); the remaining patients were positioned prone. Table 1 further details the patient characteristics. 3D modeling and hologram registration were achieved in all patients. Overall, the median deviation between MRN and SN was 4.1 mm (IQR 3.0–4.7 mm), and 81.1% of the lesions were highly consistent (deviation < 5.0 mm). The median MRN FRE was 4.1 mm (IQR 2.8–5.2 mm), compared with a median SN FRE of 2.5 mm (IQR 2.2–3.1 mm) (p < 0.001). No significant difference in deviation was detected between the left and right sides (4.4 ± 1.5 mm vs 3.9 ± 1.1 mm, p = 0.228). The difference in deviation among lesion locations (frontal, parietal, temporal, occipital, and basal ganglia) was statistically significant (3.6 ± 1.2 mm vs 3.4 ± 1.2 mm vs 4.3 ± 1.1 mm vs 5.3 ± 1.0 mm vs 4.0 ± 1.3 mm, p = 0.040). Post hoc comparisons showed that the difference arose between the occipital and frontal locations (p = 0.005) and between the occipital and parietal locations (p = 0.006). There was a significant difference between the supine and prone positions (3.7 ± 1.1 mm vs 5.4 ± 0.9 mm, p = 0.001). The Spearman rank correlation test showed that the magnitudes of the deviation vectors did not correlate with lesion volume (p = 0.126) or depth (p = 0.128) (Fig. 5). The mean time needed for MRN was 36.3 ± 6.3 minutes, of which the mean registration time was 2.6 ± 0.9 minutes.

FIG. 4.

Flowchart of the study structure.

FIG. 5.

The magnitude of deviation vectors. A: Scatterplot illustrating the magnitude of deviation vectors of the 37 patients in the study. The median and the 1st and 3rd quartile (Q1 and Q3) are labeled. B: Scatterplot illustrating the magnitude of deviation vectors on the y-axis and the lesion volumes on the x-axis. C: Scatterplot illustrating the magnitude of deviation vectors on the y-axis and the lesion distance from the surface on the x-axis.

Subgroup analysis showed no significant difference between group A and group B in additional operating time (37.4 ± 4.8 minutes vs 34.6 ± 4.8 minutes, p = 0.237) or in localization deviation (3.7 ± 1.0 mm vs 4.6 ± 1.5 mm, p = 0.070); other variables, including lesion volume (p = 0.076) and depth (p = 0.861), also did not differ significantly. Furthermore, in both subgroups, quadratic regression of the additional time against the number of cases showed a steep learning curve (Fig. 6). The trend toward a shorter time required for the MRN method suggests that the method is easy to master.

FIG. 6.

The learning curve of the point-based holographic registration system. Quadratic regression analysis of the additional time and number of cases experienced by doctor A (A) and doctor B (B).

The qualitative assessment showed that, in the case of patient 35, the extension lines drawn from the navigation pointer tip at the 4 MRN checkpoints were approximately tangent to the SN lesion boundary (Fig. 3K–N). In the case of patient 37, the actual boundary lay within the holographic lesion projection after the lesion was exposed (Fig. 7).

FIG. 7.

Patient 37: image overlay during neurosurgery. A: Axial preoperative MR image obtained in a 66-year-old woman with bilateral occipital lesions. B: Point-based rigid coregistration between the holograms and the patient's head before surgery. C: Exposure of the physical lesion after dural incision; the left occipital lesion was selected for accuracy assessment. D: View of the surgical field showing the physical lesion overlaid with the hologram.

Discussion

This prospective pilot study suggests that a wearable head-mounted MRN system is technically feasible. We have provided a complete set of a clinically applicable workflow and tested the accuracy in patients with intracranial lesions using quantitative and qualitative methods.

An MRN system may provide several advantages in an actual clinical environment. First, it provides the visual experience of a holographic display. To appreciate this experience, we encourage the reader to view Video 1.

VIDEO 1. Video showing holographic MRN after fiducial registration in a 54-year-old woman with a cerebral parafalx meningioma. Copyright Xiaolei Chen. Published with permission.

Without turning the head to other monitors, the operator can visualize the hologram of the target lesion and the surrounding eloquent structures overlaid on the patient's head. The holograms are seen via the heads-up display in front of the user's eyes. In this way, the user can acquire an intuitive understanding of the target lesion's location and shape. The system has ergonomic advantages and reduces the diversion of attention.

Second, the efficiency of the work process is improved. The operator may control the device through gestures or voice without requiring a surgical team member to control the workstation. Third, compared with commercial SN systems, the cost of our system is much lower (exclusive of the computer, approximately $3000). Moreover, the system is easy to set up after our simplification. The user can perform the imaging data processing and holographic visualization using only an HMD and a computer anytime and anywhere, free from an unwieldy workstation hardware system, as with SN. Compared with standard optic neuronavigation systems, our MRN system is not tethered to an infrared camera. We can achieve lesion localization and simple navigation, holding only a 3D-printed positioning tool.

Last but not least, other team members can share the operator's real-time perspective with another HoloLens device through a local area network, which applies to teaching practice. The 3D Slicer software provides a platform for image segmentation and model creation, facilitating the holographic visualization in this study. The critical step in this holographic localizing method is the hologram's registration to the patient, for which we designed a semiautomatic coregistration procedure. Compared with manual hologram placement, this fiducial matching–based registration has the potential advantage of reducing visual misinterpretation, so that the operator does not need to adjust the position of the hologram from various perspectives.10,12

Various methods of accuracy measurement have been used in the published literature. Incekara et al.11 measured accuracy by comparing lesion centers on the scalp using both MRN and SN; the median distance between the centers was 4 mm (IQR 0–8 mm). Van Doormaal et al.10 measured the FRE of MRN and reported an error of 7.2 ± 1.8 mm on a phantom and 4.4 ± 2.5 mm in 3 patients. Li et al.13 performed external ventricular drainage under holographic guidance, with a mean target deviation of 4.34 ± 1.63 mm in the holographic-guided group. McJunkin et al.14 measured the target registration error (TRE) on a 3D-printed phantom; the mean TRE was 5.8 ± 0.5 mm. The median deviation between MRN and SN in our study was 4.1 mm, which is consistent with the accuracy ranges reported in previous literature. This is acceptable for navigated procedures in which extremely high accuracy is not needed. Our MRN system may be helpful for localization of large brain lesions, intracerebral hematoma evacuation, and external ventricular drainage. However, the difference in FRE between MRN and SN suggests that the precision of MRN requires improvement. According to our results, deviation in the prone position appears to be greater than in the supine position. The main reason may be stretching of the suboccipital muscles and skin during patient positioning; the resulting offset of the registration points due to skin shift may increase the deviation.

We measured the additional time required for the MRN method to assess its feasibility in the clinical setting. The mean additional time before surgery was 37.2 ± 7.9 minutes, an efficiency comparable with that of commercial SN systems. Data postprocessing accounts for most of the total time and could be shortened as experience accumulates.

Limitations do exist when using MRN. First, perceived drift of the holograms may affect system accuracy because of the relatively high error of the tracking data accessed and the information displayed by the HoloLens glasses. The visual effect of holographic drift occurs predominantly while walking around the hologram rather than toward it. Theoretically, the perceived drift would be minimal if the operator observes from the perspective from which the registration was performed, because the hologram is registered from that position. This implies that the error attributed to perceived drift can be reduced if the operator performs the registration from the perspective of the surgery, at an angle perpendicular to the skin of the head. Second, observing holograms may distract the operator from the physical space, in that the hologram may obstruct the view of the surgical area and cause inattentional blindness. The operator may experience discomfort when trying to perform technical maneuvers, such as boundary or incision marking, while wearing an HMD. HoloLens screen brightness and hologram opacity can be easily adjusted with buttons on the device, so that the operator can clearly see both the hologram and the patient. Third, the time spent on data postprocessing accounts for a significant proportion of the entire workflow. One challenge is providing enough well-trained staff to reconstruct and visualize the holograms. For data postprocessing, 3D Slicer is already used by many clinicians worldwide. With further simplification of the process, this MRN system would become easier to use.

Although our preliminary results are promising, some improvements are still needed. First, any movement of the patient's head after registration will cause severe deviation of the hologram, and reregistration must then be performed to ensure that the hologram remains aligned with the matched volume. We have been developing markers to be placed in the surgical area and continuously tracked by the HMD, to keep the hologram accurately positioned on the patient's head. Another limitation is that the mean TRE in the holographic scene was not measured. TRE is defined as the offset of a specific selected point (not one of the registration points) between virtual space and physical space, and it is often used to verify the accuracy of infrared-registration neuronavigation systems. A previous study suggested that FRE is not a recommended index of the accuracy of a specific registration, because a specific FRE is not related to a specific TRE.15 The automatically measured FRE in this study was therefore used only as an index of the precision of MRN, rather than its accuracy. However, owing to hardware and software limitations, the surface mesh created by HoloLens is too crude to measure the MRN TRE with sufficient precision.16 Future studies need to focus on automatic MRN TRE measurement for standardized validation.
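For reference, the FRE reported by both systems is conventionally the root-mean-square residual between matched fiducial pairs after registration. A minimal sketch under that standard definition (the function name is hypothetical, and `math.dist` requires Python 3.8+):

```python
from math import dist, sqrt

def fre(registered_fiducials, measured_fiducials) -> float:
    """Root-mean-square distance between matched fiducial pairs after registration (mm)."""
    sq = [dist(a, b) ** 2 for a, b in zip(registered_fiducials, measured_fiducials)]
    return sqrt(sum(sq) / len(sq))

virtual = [(0, 0, 0), (10, 0, 0), (0, 10, 0)]
physical = [(0, 0, 3), (10, 0, 3), (0, 10, 3)]  # uniform 3 mm residual offset
print(fre(virtual, physical))  # 3.0
```

Note that TRE is evaluated at a point away from the fiducials, which is why a low FRE, as computed here, does not by itself guarantee accuracy at the surgical target.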

Conclusions

This study presents a complete, clinically applicable workflow for an easy-to-use MRN system based on a wearable HMD, and demonstrates its technical feasibility and accuracy. Further development is still required to improve the accuracy and efficacy of this system.

Acknowledgments

Funding was provided by The National Key Research and Development Program of China, No. 2018YFC1312602 (to Xiaolei Chen, MD, PhD); the National Natural Science Foundation of China, No. 81771481 (to Xiaolei Chen, MD, PhD); and The Young Scholar Medical Research Fund of Chinese PLA General Hospital, No. QNF19018 (to Xinghua Xu, MD, PhD).

We thank Dr. Zhizhong Zhang and Dr. Kefan Yi (Department of Neurosurgery, Chinese PLA General Hospital, Beijing, China) for performing MRI scanning.

Disclosures

The authors report no conflict of interest concerning the materials or methods used in this study or the findings specified in this paper.

Author Contributions

Conception and design: Chen, Qi, Y Li, Xu, J Zhang. Acquisition of data: Chen, Qi, Xu, J Zhang. Analysis and interpretation of data: Chen, Qi. Drafting the article: Chen, Qi. Critically revising the article: Chen. Reviewed submitted version of manuscript: Chen. Approved the final version of the manuscript on behalf of all authors: Chen. Statistical analysis: Qi. Administrative/technical/material support: Chen, Y Li, Xu, J Zhang, F Li, Gan, Xiong, Wang, S Zhang. Study supervision: Chen.

References

1. Barone DG, Lawrie TA, Hart MG. Image guided surgery for the resection of brain tumours. Cochrane Database Syst Rev. 2014;2014(1):CD009685.

2. Guha D, Alotaibi NM, Nguyen N, Gupta S, McFaul C, Yang VXD. Augmented reality in neurosurgery: a review of current concepts and emerging applications. Can J Neurol Sci. 2017;44(3):235-245.

3. Haemmerli J, Davidovic A, Meling TR, Chavaz L, Schaller K, Bijlenga P. Evaluation of the precision of operative augmented reality compared to standard neuronavigation using a 3D-printed skull. Neurosurg Focus. 2021;50(1):E17.

4. Jütten LH, Mark RE, Maria Janssen BWJ, Rietsema J, Dröes RM, Sitskoorn MM. Testing the effectivity of the mixed virtual reality training Into D'mentia for informal caregivers of people with dementia: protocol for a longitudinal, quasi-experimental study. BMJ Open. 2017;7(8):e015702.

5. Hoffman MA, Provance JB. Visualization of molecular structures using HoloLens-based augmented reality. AMIA Jt Summits Transl Sci Proc. 2017;2017:68-74.

6. Pas ET, Johnson SR, Larson KE, Brandenburg L, Church R, Bradshaw CP. Reducing behavior problems among students with autism spectrum disorder: coaching teachers in a mixed-reality setting. J Autism Dev Disord. 2016;46(12):3640-3652.

7. Mascitelli JR, Schlachter L, Chartrain AG, Oemke H, Gilligan J, et al. Navigation-linked heads-up display in intracranial surgery: early experience. Oper Neurosurg (Hagerstown). 2018;15(2):184-193.

8. Cutolo F, Meola A, Carbone M, Sinceri S, Cagnazzo F, et al. A new head-mounted display-based augmented reality system in neurosurgical oncology: a study on phantom. Comput Assist Surg (Abingdon). 2017;22(1):39-53.

9. Tepper OM, Rudy HL, Lefkowitz A, Weimer KA, Marks SM, et al. Mixed reality with HoloLens: where virtual reality meets augmented reality in the operating room. Plast Reconstr Surg. 2017;140(5):1066-1070.

10. van Doormaal TPC, van Doormaal JAM, Mensink T. Clinical accuracy of holographic navigation using point-based registration on augmented-reality glasses. Oper Neurosurg (Hagerstown). 2019;17(6):588-593.

11. Incekara F, Smits M, Dirven C, Vincent A. Clinical feasibility of a wearable mixed-reality device in neurosurgery. World Neurosurg. 2018;118:e422-e427.

12. Frantz T, Jansen B, Duerinck J, Vandemeulebroucke J. Augmenting Microsoft's HoloLens with Vuforia tracking for neuronavigation. Healthc Technol Lett. 2018;5(5):221-225.

13. Li Y, Chen X, Wang N, Zhang W, Li D, et al. A wearable mixed-reality holographic computer for guiding external ventricular drain insertion at the bedside. J Neurosurg. 2019;131(5):1599-1606.

14. McJunkin JL, Jiramongkolchai P, Chung W, Southworth M, Durakovic N, et al. Development of a mixed reality platform for lateral skull base anatomy. Otol Neurotol. 2018;39(10):e1137-e1142.

15. Fitzpatrick JM. Fiducial registration error and target registration error are uncorrelated. Paper presented at: Medical Imaging 2009: Visualization, Image-Guided Procedures, and Modeling; February 7, 2009; Lake Buena Vista, FL.

16. Kuhlemann I, Kleemann M, Jauer P, Schweikard A, Ernst F. Towards X-ray free endovascular interventions—using HoloLens for on-line holographic visualisation. Healthc Technol Lett. 2017;4(5):184-187.
Materials for holographic registration. A: Disposable patient marker kit (Brainlab AG) showing registration markers for MR (1) and CT (2), the marker socket (3) into which the markers can be embedded (4 and 5), and the registration socket (6). B: Three-dimensional–printed position tool placed onto the registration socket, used for adding the registration point sequence. C: Virtual tools positioned over the real field of view.

An overview of the whole data postprocessing workflow. A: Preoperative MRI data in DICOM format. B: The models of the lesion (green), right ear (purple), and registration points (blue) were made in 3D Slicer software. C: The models were imported into Unity software for rendering and conversion into a HoloLens-compatible format. D and E: Wearing the HMD enables visualization of the holograms on a phantom head.

Holographic lesion localization procedures and comparison with an SN system. A: Axial preoperative MR image obtained in a 51-year-old man with a right occipital lesion. B: Registration markers attached to the patient's head before scanning. C: Registration sockets attached to the patient's head at the same locus after scanning and anesthesia. D: Registration procedure by the neurosurgeon with the HMD. E and F: Measurement of the medial (E) and anterior (F) deviation between MRN (the boundary of the green hologram) and SN (black dashes). G and H: After coregistration with the navigation system, the operator points the navigation pointer (Brainlab AG) to the anterior boundary of the lesion. The probe direction is adjusted to be parallel with the sagittal plane and vertical to the midsagittal line. I and J: The operator points the navigation probe to the medial boundary of the lesion. K and L: On the SN monitor, the "inline 1" navigation mode is chosen; the thick green line indicates the navigation probe, which is vertical to the head surface. Its extended lines touch the anterior (K) and posterior (L) boundary of the lesion. M and N: The "inline 2" navigation mode is chosen. The extended navigation pointer lines touch the medial (M) and lateral (N) boundary of the lesion.

Flowchart of the study structure.

The magnitude of deviation vectors. A: Scatterplot illustrating the magnitude of deviation vectors of the 37 patients in the study. The median and the 1st and 3rd quartile (Q1 and Q3) are labeled. B: Scatterplot illustrating the magnitude of deviation vectors on the y-axis and the lesion volumes on the x-axis. C: Scatterplot illustrating the magnitude of deviation vectors on the y-axis and the lesion distance from the surface on the x-axis.

The learning curve of the point-based holographic registration system. Quadratic regression analysis of the additional time and number of cases experienced by doctor A (A) and doctor B (B).

Patient 37. Image overlay during neurosurgery in a patient. A: Axial preoperative MR image obtained in a 66-year-old woman with bilateral occipital lesions. B: Point-based rigid coregistration between the holograms and the patient's head before surgery. C: Exposure of the physical lesion after dural incision. The left occipital lesion was selected for accuracy assessment. D: View of the surgical field showing the physical lesion overlaid with the hologram.