A novel 3D-vision–based collaborative robot as a scope holding system for port surgery: a technical feasibility study

1 Medical School of Chinese PLA, Beijing; 2 Department of Neurosurgery, First Medical Center of Chinese PLA General Hospital, Beijing; and 3 Medical School, Nankai University, Tianjin, China

OBJECTIVE

A clear, stable, suitably located visual field is essential for port surgery. A scope is usually held by hand or by a fixing device. The former causes fatigue and requires lengthy training, while the latter is inconvenient because the scope must be repeatedly adjusted. The authors therefore developed a novel robotic system that can recognize the port and automatically place the scope in an optimized position. In this study, the authors performed a preliminary experiment to test the system’s technical feasibility and accuracy in vitro.

METHODS

A collaborative robotic (CoBot) system consisting of a mechatronic arm and a 3D camera was developed. With the 3D camera and programmed machine vision, the CoBot can find a marker attached to the opening of the surgical port and then automatically align the scope’s axis with the port’s longitudinal axis, achieving optimal illumination and visual observation. Three tests were conducted. In test 1, the robot positioned a laser range finder, attached to the robot’s arm, in alignment with the sheath’s central axis. Successful positioning was defined as the laser passing through two holes on the port sheath’s central axis. Researchers recorded the finder’s readings, which indicated the actual distance between the finder and the sheath. In test 2, the robot held a high-definition exoscope and relocated it to the set position. Test 3 was similar to test 2, but a metal holder, adjusted manually by trained neurosurgeons, was substituted for the robot. The manipulation time was recorded. Additionally, a grading system was designed to score each image captured by the exoscope at the set position, and the scores in the two tests were compared using the rank-sum test.

RESULTS

The CoBot system positioned the finder successfully in all rounds in test 1; the mean height errors ± SD were 1.14 mm ± 0.38 mm (downward) and 1.60 mm ± 0.89 mm (upward). The grading scores of images in tests 2 and 3 were significantly different. Regarding the total score and four subgroups, test 2 showed a more precise, better-positioned, and more stable vision field. The total manipulation time in test 2 was 20 minutes, and for test 3 it was 52 minutes.

CONCLUSIONS

The CoBot system successfully acted as a robust scope holding system to provide a stable and optimized surgical view during simulated port surgery, providing further evidence for the substitution of human hands, and leading to a more efficient, user-friendly, and precise operation.

ABBREVIATIONS

CoBot = collaborative robotic.

Driven by rapid technological development, robotic systems, as a branch of artificial intelligence technology, are advancing quickly. The first robot introduced in surgery was the PUMA 200 (Westinghouse Electric), which completed a neurosurgical biopsy in 1985.1 Subsequently, adoption in other surgical disciplines, such as cardiac, abdominal, and urological surgery, led to a boom in robotic surgery and in new robotic systems, such as the Da Vinci (Intuitive Surgical, Inc.) and ZEUS (Computer Motion) robotic surgical systems.2

Although the first robotic operation was a neurosurgical operation, the application of robotics in neurosurgery is not as comprehensive as it is in other disciplines. Some studies have also suggested that robots such as Da Vinci are not suitable for neurosurgery, especially intracranial procedures;3 the limited space and delicate brain tissue involved in neurosurgery may hinder the application of surgical robots. Thus, most robotic systems are used for biopsy and stereotactic procedures4 or designed for facilitating viewing, among which there are well-known systems such as Neuroarm (MDA, Inc.),5 Neuromate (Renishaw, Inc.),6 ROSA (Zimmer Biomet),7 MKM microscopic system (Zeiss AG),8,9 ROVOT-m (Synaptive Medical, Inc.),10 CyberKnife (Accuray, Inc.),11 and Remebot (Beijing Baihui Weikang Technology Co.).12

Surgical robots can be divided into three types, as follows:13,14 1) supervisory-controlled systems, in which the surgeon predetermines a plan on a computer model based on the patient’s diagnostic imaging and supervises the robot as it executes the surgical plan; 2) telesurgical systems, which allow the surgeon to operate in real time (with force feedback from the surgical instruments, the surgeon operates through a haptic interface and the robot copies the surgeon’s movements); and 3) shared-control systems, in which the robot assists and provides steady-hand manipulation of the instruments. Compared with the first two, the third type is less frequently studied. However, a robotic assistant can be very useful because it increases dexterity and reduces human hand tremor, ensuring a more accurate and safer operation.15 A typical example of this type is a robotic system that holds surgical instruments or supports the surgeon’s hands and arms.16–18 For example, Rachinger and colleagues introduced a robotic system used as an instrument holder13 based on a robot called Evolution 1;19 the earlier robot had been used as an endoscope holder in transsphenoidal surgery with navigational support.

During endoscopic or exoscopic neurosurgery, holding the cumbersome equipment stably and flexibly is demanding for the neurosurgeon. Generally, the surgeon holds the equipment with one hand and manipulates instruments with the other, which quickly leads to fatigue and limits the surgeon’s manipulation capabilities,14 thus reducing stability and safety during the procedure. Sometimes an assistant holds the equipment to support the surgeon,20 which requires perfect coordination between surgeon and assistant, a collaborative relationship forged over years of working together. Because such manual approaches usually cannot provide a stable and suitable surgical field of view, a holding device can be a suitable substitute for human hands, with stability as its greatest merit.21,22

Nevertheless, when using the transparent sheath sets during a port surgery, adjustment of the holding device adds complexity and difficulty to the already tedious procedure. To adjust the sheath to another surgical spot, the surgeon must first loosen the holding system, take down the scope, unlock the retractor clamping the sheath, and adjust the sheath. After that, it is necessary to relock the clamp and port sheath, relocate the scope, and fasten the holding system of the scope. The biggest obstacle is not the complexity but the difficulty in relocating the scope to a correct point and angle in order to display a clear and complete field of view.22 We developed a novel collaborative robotic (CoBot) system with a machine-vision tracking system to simplify these surgical procedures. In this article, we report the preliminary assessment of the feasibility and accuracy of this system.

Methods

CoBot System

This CoBot system mainly consists of a mechatronic arm and a 3D camera (Fig. 1). The mechatronic arm is a UR5 (Universal Robots), a lightweight, adaptable, collaborative industrial robot designed for medium-duty applications. It has 6 degrees of freedom and weighs 18.4 kg, with a payload of 5 kg, a footprint diameter of 149 mm, a reach of 850 mm, and a precision of ± 0.1 mm. Patented technology lets operators with no programming experience set up the system within hours and operate it through an easy-to-use touchscreen tablet. Furthermore, the German Technical Inspection Association has approved and certified the safety of this system, the primary safety requirement for a surgical robot. The robotic camera, a Basler acA2440-20gm GigE (Basler AG) with an accuracy of 0.0694 mm and a visual field of 150 × 170 mm, is the eye of the CoBot system and is fixed onto the front of the mechatronic arm. Its resolution is 2448 × 2048 pixels, and its pixel size is 3.45 μm × 3.45 μm. In cooperation with WEILINKRT Corp., we designed a program that enables the camera to recognize the shape or pattern of a specific marker on the port sheath and determine its spatial location, so that the robot can precisely align the held exoscope with the longitudinal axis of the port when the port is tilted or pivoted.
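In essence, once the camera has located the marker on the port opening, the alignment step reduces to placing the scope on the port’s longitudinal axis at a fixed standoff and pointing its optical axis down that axis. The following NumPy sketch illustrates this geometry under stated assumptions: the function name, its inputs (marker position and port axis in the robot frame), and the 380-mm standoff (borrowed from test 1’s set height) are illustrative, not the actual CoBot program.

```python
import numpy as np

def coaxial_pose(marker_center, port_axis, standoff_mm=380.0):
    """Compute a scope pose coaxial with the port.

    marker_center: (3,) marker position on the port opening, robot frame (mm).
    port_axis:     (3,) outward direction of the port's longitudinal axis.
    standoff_mm:   fixed scope-to-port distance (380 mm was test 1's set height).
    Returns (position, R); R's third column is the scope's optical axis.
    """
    a = np.asarray(port_axis, dtype=float)
    a = a / np.linalg.norm(a)                   # unit axis out of the port
    position = np.asarray(marker_center, dtype=float) + standoff_mm * a
    z = -a                                      # optical axis looks back into the port
    ref = np.array([1.0, 0.0, 0.0])             # arbitrary reference for the frame
    if abs(ref @ z) > 0.9:                      # avoid a near-parallel cross product
        ref = np.array([0.0, 1.0, 0.0])
    x = np.cross(ref, z)
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)
    return position, np.column_stack([x, y, z])
```

For a port whose axis points straight up, the scope is placed 380 mm above the marker and looks straight down; re-running this computation whenever the marker moves is what lets the arm track a tilted or pivoted port.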

FIG. 1.

The construction of the robotic system. The black arrow indicates the robotic camera, Basler acA2440-20gm GigE (Basler AG), and the white arrow indicates the robotic arm, UR5 (Universal Robots).

Sheath and Model

Based on our patented obturator and transparent sheath set23–25 (patent no. 201210066281.1, Victor Medical Instruments Co.), we made two revisions of the sheath to facilitate the camera’s recognition and the testing. At the upper opening of both versions, a circular facet with a central hollow was added, onto which we attached a piece of paper printed with the character string “ABCDHKM STVWXYZ” to help the camera recognize and learn the marker. Inside the tube of version 1, we added two slices 2 cm apart, each with a 2-mm-diameter hole at its center (Fig. 2A and C), while on the bottom of version 2 we pasted a piece of paper with a pattern (Fig. 2B and D). The revised sheaths are 3D-printed samples (UnionTech Lite600) stabilized by a clamp and a holder. During the tests, we changed the direction of the sheath’s facet by adjusting the clamp and the holder.

FIG. 2.

Two versions of the revised sheath. A: Top opening of version 1 (used in test 1). B: Top opening of version 2 (used in tests 2 and 3). C: Bottom of version 1 (used in test 1). D: Bottom of version 2 (used in tests 2 and 3).

High-Definition Exoscope and Laser Range Finder

We used a high-definition exoscope (HUIBOSHI Technology) to capture images used to evaluate the result of positioning. The size and focus could be adjusted to demonstrate the details in the images.

A laser range finder, with a measurement error of ± 2 mm, was used to measure the distance between the tip of the finder and the first layer within the sheath and to prove the robotic system’s coaxial alignment.

Procedures and Measurement

Test 1

The first test was performed to determine the positioning accuracy of the robotic system, using version 1 of the sheath. The robot, with a laser range finder attached to its arm, was calibrated. Theoretically, if the robotic system accurately places the exoscope in coaxial alignment with the sheath, the laser will pass through the small holes on the two layers, creating a spot of laser light visible on the testing platform (Fig. 3A). Under this condition, the laser range finder measures the distance vertically. We recorded the measurement value shown by the laser range finder. Next, we tilted the sheath to different angles, after which the CoBot system automatically recognized the sheath facet and relocated the laser range finder. These steps were repeated 20 times (Video 1).

VIDEO 1. Demonstration of the testing of coaxial alignment and positioning accuracy. Copyright Xiaolei Chen. Published with permission.

FIG. 3.

A: Illustration of the coaxial alignment test. B: The setting of test 2. The thick black arrow indicates the robotic arm, the thick white arrow indicates the holder, the thin black arrow indicates the robotic camera, and the thin white arrow indicates the high-definition camera. C: The high-definition camera and the holder used in test 3.

The successful positioning rate was defined as the proportion of rounds in which the laser passed through the holes. In addition, we recorded the actual height readings shown by the laser range finder when the robot accurately positioned it. The actual height was the distance between the tip of the finder and the first layer of the sheath. We then calculated the height error, defined as the difference between the actual height and the height at which the system was set to place the finder. Errors are reported as mean ± standard deviation.
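The error statistics described above amount to a simple computation, sketched below; the finder readings are hypothetical placeholders, since the study’s raw measurements are not published.

```python
import statistics

SET_HEIGHT_MM = 380.0  # height at which the system was set to place the finder

# Hypothetical laser range finder readings (mm), for illustration only.
readings = [381.2, 378.9, 382.0, 379.4, 380.6, 378.0]

errors = [r - SET_HEIGHT_MM for r in readings]
above = [e for e in errors if e > 0]    # finder ended up beyond the set height
below = [-e for e in errors if e < 0]   # finder ended up short of the set height

def mean_sd(values):
    """Mean and sample standard deviation, as reported in the paper."""
    sd = statistics.stdev(values) if len(values) > 1 else 0.0
    return statistics.mean(values), sd

print("error beyond set height: %.2f mm ± %.2f mm" % mean_sd(above))
print("error short of set height: %.2f mm ± %.2f mm" % mean_sd(below))
```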

Test 2

This test sought to evaluate the feasibility of the robotic system. Version 2 of the sheath was used and clamped onto a holder. Mounted on the robot, the exoscope was connected to a computer (Fig. 3B) to display the surgical field. After calibration of the robotic system, the robotic arm was set to the starting position. After adjusting the direction of the sheath, we started the timer and operated the robotic system to recognize the sheath’s facet. Next, the robotic system relocated the exoscope to an optimized position, after which we recorded the image captured by the high-definition exoscope on the computer for subsequent evaluation. The robotic arm then returned to the starting position. These steps were repeated 20 times, and we stopped the timer at the last image recording (Video 2).

VIDEO 2. Demonstration of the process of test 2. Copyright Xiaolei Chen. Published with permission.

Test 3

The third test was aimed at comparing the fixing device with the robotic system while simulating general port surgery. Test 3 was identical to test 2, except that a metal fixing arm replaced the robotic arm as the holder for the high-definition camera (Fig. 3C). Following setup, we began the testing procedure by adjusting the direction of the sheath and starting the timer. First, operator X changed the sheath’s direction to a position that might be encountered during port surgery. Next, operator Z manually moved the high-definition exoscope to a suitable place, fastened the lock on the holder, and adjusted the image size and focus, after which operator X captured a screenshot of the image transmitted by the camera to the laptop. We repeated these procedures 20 times and collected 20 images for analysis. The timer was stopped at the moment of the last screenshot.

Three researchers (Z.G., Z.Q., and M.L.), blinded to the study design, assessed four statements after evaluating the images taken by the high-definition camera: 1) The pattern was demonstrated clearly, representing clearness. 2) The pattern was located in the center of the top opening, representing coaxial alignment. 3) The pattern was located in the center of the visual field, representing the location. 4) The visual field was stable and comfortable, representing comfortableness.

The researchers rated each statement on a 5-point Likert scale26–28 (from 1 = strongly disagree to 5 = strongly agree) to assess each image. The scores for the four statements were summed into a total for each image. We used medians and interquartile ranges as descriptive statistics26,29 for the scores of the 20 images in each test. Finally, using IBM SPSS Statistics version 25 (IBM Corp.), we compared the image scores of the two tests with the rank-sum test for two independent samples. A p value < 0.05 was regarded as statistically significant.
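As a sketch of this statistical comparison, the rank-sum (Mann-Whitney U) test for two independent samples can be written in pure Python with the normal approximation, which is reasonable for 20 images per group; the scores below are hypothetical stand-ins, as the study used IBM SPSS on its actual data.

```python
import math

def rank_sum_test(xs, ys):
    """Two-sided Mann-Whitney U (rank-sum) test, normal approximation.

    U counts pairs (x, y) with x > y; ties count 0.5. Returns (U, p).
    """
    u = sum(1.0 if x > y else 0.5 if x == y else 0.0 for x in xs for y in ys)
    n1, n2 = len(xs), len(ys)
    mu = n1 * n2 / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (u - mu) / sigma
    # two-sided p from the standard normal CDF, Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return u, p

# Hypothetical per-image mean scores for the robot-held (test 2) and
# manually held (test 3) groups; the real per-image scores are not published.
robot = [17.3, 16.7, 17.0, 17.3, 16.7, 17.0, 17.3, 17.0, 16.7, 17.0]
manual = [14.0, 13.0, 14.7, 13.3, 14.0, 12.7, 14.9, 13.0, 14.3, 13.7]

u, p = rank_sum_test(robot, manual)
print(f"U = {u}, two-sided p = {p:.2e}")
```

Note that with many tied Likert scores, statistical packages apply a tie correction to the variance that this sketch omits.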

Results

Test 1

This robotic system placed the laser range finder successfully in all 20 rounds, which indicates a relatively high location accuracy on the plane parallel to the opening of the sheath and a satisfactory coaxial alignment. The set height of the system (the distance between the finder’s tip and the first layer of the sheath) was 380.00 mm. The mean height errors ± SD were 1.14 ± 0.38 mm (downward), with a maximum error of 2.00 mm, and 1.60 ± 0.89 mm (upward), with a maximum error of 3.00 mm.

Test 2

The image ratings of each researcher are demonstrated in Table 1. The mean total score of 20 images was 17.00 (range 16.70–17.30). For each statement, the results were 4.30 (IQR 4.00–4.30) for clearness, 4.30 (IQR 4.08–4.30) for coaxial alignment, 4.30 (IQR 4.30–4.30) for location, and 4.30 (IQR 4.08–4.30) for comfortableness. Test 2 took 20 minutes in total, and each round was 1 minute on average. Figure 4 shows typical images made with the high-definition exoscope.

TABLE 1.

The ratings of each researcher for the images in test 2

Researcher | Clearness | Coaxial Alignment | Location | Comfortableness | Total
M.L. | 4.00 (3.00–4.00) | 4.00 (4.00–4.00) | 4.00 (4.00–4.00) | 4.00 (3.25–4.00) | 15.50 (14.25–16.00)
Z.Q. | 5.00 (5.00–5.00) | 5.00 (5.00–5.00) | 5.00 (5.00–5.00) | 5.00 (5.00–5.00) | 20.00 (20.00–20.00)
Z.G. | 4.00 (4.00–4.00) | 4.00 (4.00–4.00) | 4.00 (4.00–4.00) | 4.00 (4.00–4.00) | 16.00 (16.00–16.00)
Overall mean (range) | 4.30 (4.00–4.30) | 4.30 (4.08–4.30) | 4.30 (4.30–4.30) | 4.30 (4.08–4.30) | 17.00 (16.70–17.30)

Values represent the median (IQR) unless stated otherwise.

FIG. 4.

Typical images made with the high-definition camera in test 2.

Test 3

Table 2 illustrates the ratings of each statement. The mean total score of 20 images was 14.00 (range 13.00–14.93). For each statement, the results were 3.85 (IQR 3.30–4.30) for clearness, 3.30 (IQR 3.30–4.30) for coaxial alignment, 3.00 (IQR 2.40–3.30) for location, and 3.70 (IQR 3.00–3.70) for comfortableness. Test 3 took 52 minutes in total, and each round took 2.6 minutes on average. Figure 5 shows typical images made with the high-definition camera.

TABLE 2.

The ratings of each researcher for the images in test 3

Researcher | Clearness | Coaxial Alignment | Location | Comfortableness | Total
M.L. | 4.00 (2.25–4.00) | 2.00 (2.00–4.00) | 2.00 (2.00–4.00) | 3.00 (2.00–3.00) | 11.00 (9.25–12.75)
Z.Q. | 5.00 (4.25–5.00) | 5.00 (5.00–5.00) | 4.00 (2.00–4.75) | 5.00 (5.00–5.00) | 18.00 (16.25–19.75)
Z.G. | 3.00 (3.00–4.00) | 3.00 (3.00–4.00) | 3.00 (2.00–3.00) | 3.00 (2.25–3.00) | 12.50 (10.25–14.00)
Overall mean (range) | 3.85 (3.30–4.30) | 3.30 (3.30–4.30) | 3.00 (2.40–3.30) | 3.70 (3.00–3.70) | 14.00 (13.00–14.93)

Values represent the median (IQR) unless stated otherwise.

FIG. 5.

Typical images made with the high-definition camera in test 3.

Comparison of Tests 2 and 3

For each image, the mean score of the three researchers was calculated. We then compared the scores of the two groups of images. The p value of the rank-sum test comparing the total scores was < 0.001, reflecting a significant difference in image quality between the two groups. Subgroup analyses of clearness (p = 0.003), coaxial alignment (p = 0.002), location (p < 0.001), and comfortableness (p < 0.001) also showed significant differences between the two groups.

Discussion

Our novel CoBot system can act as a scope holding system for port surgery with high accuracy, presenting a more robust localization ability compared with traditional scope holders.

This robot is designed for port surgery using our patented obturator and transparent sheath set. It replaces the assisting surgical staff to some extent while leaving the surgeon in full control. Using our CoBot, a surgeon only needs to manipulate the port sheath and can therefore operate with both hands. Meanwhile, the CoBot automatically adjusts the scope to the optimal position at a constant distance for a clear focus; in this way, significant time and effort can be saved.

Robotic scope holding systems have already been introduced in neurosurgery.30 However, these systems either rely on a navigation system for tracking or work passively via a controller.13,19,31 Most are integrated platforms that merge the scope, navigation, and robotic arm into a whole.10 Additionally, systems such as ROSA and NeuroMate are cumbersome and costly, which hampers their integration in the operating room.32 One of our breakthroughs is that our untethered CoBot system can work without a navigation system, tracks the sheath automatically, and thus displays the surgical field actively. Its compactness and portability also facilitate its integration in the general operating room and widen its application in special situations, such as during wartime. In addition, the system can be used with scopes of various kinds and brands, and the self-designed program (the core of our robotic system) can be matched with different robotic arms and cameras. Both attributes expand the system’s potential applications. Finally, our experimental conditions and apparatus allowed us to evaluate the system’s feasibility and positioning accuracy.

In test 1, the laser successfully passing through the two central holes reflected both excellent coaxial alignment and excellent positioning. The success rate of positioning was 100% over 20 rounds. The mean height errors were 1.14 mm (downward), with a maximum error of 2 mm, and 1.60 mm (upward), with a maximum error of 3 mm, which, in our view, reflects relatively high accuracy for port surgery.

For a scope holding system, coaxial alignment matters more than the positioning error at a given point. Nevertheless, few studies of scope holding systems have paid attention to coaxial alignment. Because direct measurement of coaxial alignment requires professional equipment that was unavailable under our testing conditions, we used the laser light to demonstrate coaxial alignment qualitatively. Two holes lay on the central axis of version 1 of the sheath. When the laser finder substituted for the scope, the laser represented the central axis of the scope. Therefore, if the laser aligned with the sheath’s central axis, light could pass through the holes and reach the testing platform, which could easily be confirmed by a visible spot of light. Although qualitative, this method avoided artificial measurement errors as much as possible. We had tried other, quantitative methods to measure the errors in a previous pilot study, but the results were disappointing. Another merit of the current method is that the passing of light ensures a vertical measurement of height, which we had found difficult to define and measure correctly in the pilot study.

Tests 2 and 3 simulated exoscopic port surgery. In test 3, two operators manipulated the sheath and scope holder, as during an actual port surgery. Operator Z reported difficulty and inconvenience in adjusting the joints of the holder and aligning the camera with the sheath. We frequently refocused, zoomed in, or zoomed out, which could not ensure a stable picture. In addition, the imperfect position of the sheath’s facet in the display was constantly changing, causing considerable discomfort when we stared at the screen. In test 2, by contrast, the robotic system provided a stable visual field with high positioning accuracy; once an optimal location was set, we did not need to refocus or zoom. These two tests yielded 40 images made with the high-definition camera. Because there is no consensus on how to evaluate the imaging quality of a surgical scope, we designed a scale covering four aspects, rated by three researchers who were blinded to the study design and process. The four statements reflected four elements of a scope holding system: a clear visual field, coaxial alignment, correct location, and comfort and stability. The comparison demonstrated that the images made with the robotically placed camera were of higher quality, meaning that the CoBot could reasonably provide an appropriate visual field for surgery. Test 3 took 52 minutes in total for 20 rounds (an average of 2.6 minutes per round), whereas the robotic system (test 2) took only 20 minutes (1 minute per round). Although we did not compare times statistically, the difference suggests that the robotic scope holding system saved considerable time.

Although we designed this system for port surgery, its capacity has further potential. Face identification could be achieved via its robust algorithm, making it possible to turn the robotic system into a navigation system with facial registration. The ability to automatically hold and place equipment may enable its use in general endoscopic and other types of surgery, or even in other surgical fields. Moreover, the system’s ability to identify particular objects among disordered items could allow it to substitute for surgical assistants and instrument nurses. All these possibilities brighten the future application of this robotic system in the medical field.

It is worth mentioning that this study is a successful attempt to apply an industrial robot in neurosurgery. As far as we know, few studies have applied industrial robots to intracranial procedures, although such robots have already been introduced in other fields such as radiosurgery, spine surgery, and otosurgery.33–37 A previous study argued that the application of industrial robots in surgery is possible but complicated;38 we believe that appropriate and thoroughly designed translational research on industrial robots can uncover even more potential uses for surgical robots and simplify surgical procedures.

Nevertheless, although the robot could insert the endoscope model into the sheath, our testing conditions did not allow us to test its performance during simulated endoscopic surgery. Human error in the image evaluation also cannot be neglected. In a future study, we will refine the testing process and conditions in order to study the robot in the context of endoscopic port surgery. For clinical application of this system, a further safety test is ongoing to ensure that the CoBot will brake immediately if it collides with any object. We added force sensors to the robotic arm joints; these sensors were previously used in the industrial version of our CoBot. With them, the robotic arm stops moving when it encounters a resistance force > 1 N (about 102 g). These safety measures are very effective in industrial applications, and we are now adopting this technology in our system. Furthermore, we hope to carry out a clinical feasibility study with a small sample of cases after verifying the safety of the CoBot system.
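The braking rule described above (stop when any joint senses more than 1 N of resistance) can be sketched as a simple threshold monitor; the sensor readings and the stop callback are hypothetical interfaces, not the actual CoBot firmware.

```python
FORCE_LIMIT_N = 1.0  # braking threshold from the text (roughly 102 g-force)

def safety_monitor(joint_forces_n, stop_motion):
    """Brake the arm if any joint force sensor exceeds the limit.

    joint_forces_n: iterable of current per-joint force readings (N).
    stop_motion:    callback that halts the arm (hypothetical interface).
    Returns True if the arm was stopped.
    """
    if any(force > FORCE_LIMIT_N for force in joint_forces_n):
        stop_motion()
        return True
    return False
```

In a real controller, such a check would run inside the servo loop at a fixed rate, so the arm brakes within one control cycle of a collision.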

Conclusions

Our newly developed CoBot system can act as an efficient instrument holding system for port surgery, providing further evidence for the substitution of human hands, and leading to a more user-friendly, precise, and safer operation.

Acknowledgments

This study was funded by the National Key Research and Development Program of China (No. 2018YFC1312602) and National Natural Science Foundation of China (NSFC No. 81771481).

We would like to thank WEILINKRT Corp., Mr. Xiaochuan Chen, and Mr. Zhiwei Deng for their support in developing the program of the robotic system. We extend our appreciation to Mr. Shuaifeng Yang, Mr. Tianyou Xing, and Mr. Peng Liu of WEILINKRT Corp. for their assistance in operating the robotic system in tests 1 and 2.

Disclosures

The authors report no conflict of interest concerning the materials or methods used in this study or the findings specified in this paper.

Author Contributions

Conception and design: Chen. Acquisition of data: Xiong, S Zhang, Gan, Qi, Liu. Analysis and interpretation of data: Xiong. Drafting the article: Xiong. Critically revising the article: Chen, Xiong. Reviewed submitted version of manuscript: Chen, Xiong. Approved the final version of the manuscript on behalf of all authors: Chen. Statistical analysis: Xiong. Administrative/technical/material support: Chen. Study supervision: Chen, Xu, Wang, J Zhang, Li.

Supplemental Information


FIG. 1. The construction of the robotic system. The black arrow indicates the robotic camera, Basler acA2440-20gm GigE (Basler AG), and the white arrow indicates the robotic arm, UR5 (Universal Robots).

FIG. 2. Two versions of the revised sheath. A: Top opening of version 1 (used in test 1). B: Top opening of version 2 (used in tests 2 and 3). C: Bottom of version 1 (used in test 1). D: Bottom of version 2 (used in tests 2 and 3).

FIG. 3. A: Illustration of the coaxial alignment test. B: The setup of test 2. The thick black arrow indicates the robotic arm, the thick white arrow indicates the holder, the thin black arrow indicates the robotic camera, and the thin white arrow indicates the high-definition camera. C: The high-definition camera and the holder used in test 3.

FIG. 4. Typical images captured with the high-definition camera in test 2.

FIG. 5. Typical images captured with the high-definition camera in test 3.

