How precise is PreSize Neurovascular? Accuracy evaluation of flow diverter deployed-length prediction

Tufail Patankar, MD (Department of Neuroradiology, Leeds Teaching Hospital, Leeds); Jeremy Madigan, MBChB (Atkinson Morley Neurosciences Centre, St. George’s University Hospital, London); Jonathan Downer, MD (Department of Clinical Neurosciences, Royal Infirmary of Edinburgh, Edinburgh); Hemant Sonwalkar, MD (Department of Neuroradiology, Lancashire Teaching Hospitals, Preston); Peter Cowley, MD (Department of Neuroradiology, National Hospital for Neurology and Neurosurgery, London); and Francesco Iori, PhD (Oxford Heartbeat Ltd., London, United Kingdom)

OBJECTIVE

The use of flow-diverting stents has been increasingly important in intracranial aneurysm treatment. However, accurate sizing and landing zone prediction remain challenging. Inaccurate sizing can lead to suboptimal deployment, device waste, and complications. This study presents stent deployment length predictions offered in medical software (PreSize Neurovascular) that provides physicians with real-time planning support, allowing them to preoperatively "test" different devices in the patient’s anatomy in a safe virtual environment. This study reports the software evaluation methodology and accuracy results when applied to real-world data from a wide range of cases and sources as a necessary step in demonstrating its reliability, prior to impact assessment in prospective clinical practice.

METHODS

Imaging data from 138 consecutive stent cases using the Pipeline embolization device were collected from 5 interventional radiology centers in the United Kingdom and retrospectively analyzed. Prediction accuracy was calculated as the degree of agreement between stent deployed length measured intraoperatively and simulated in the software.

RESULTS

The software predicted the deployed stent length with a mean accuracy of 95.61% (95% confidence interval [CI] 94.87%–96.35%), the highest reported accuracy in clinical stent simulations to date. By discounting 4 outlier cases, in which events such as interactions with coils and severe push/pull maneuvers impacted deployed length to an extent the software was not able to simulate or predict, the mean accuracy further increases to 96.13% (95% CI 95.58%–96.69%). A wide discrepancy was observed between labeled and measured deployed stent length, in some cases by more than double, with no demonstrable correlation between device dimensions and deployment elongation. These findings illustrate the complexity of stent behavior and need for simulation-assisted sizing for optimal surgical planning.

CONCLUSIONS

The software predicts the deployed stent length with excellent accuracy and could provide physicians with real-time accurate device selection support.

ABBREVIATIONS

3DRA = 3D rotational angiography; CI = confidence interval; DSA = digital subtraction angiography; FD = flow diverter; PED = Pipeline embolization device; ROI = region of interest; UK = United Kingdom; VMTK = Vascular Modeling Toolkit.


In Brief

This study reports the evaluation methodology and accuracy results, applied to real-world data, for the medical software PreSize Neurovascular, which provides physicians with real-time planning support for brain stenting procedures. In a multicenter cohort of 138 cases using the Pipeline embolization device, the software predicted the deployed stent length with a mean accuracy of 95.61%, the highest accuracy reported to date. This evaluation was a necessary step in demonstrating the software's reliability, prior to impact assessment in prospective clinical practice.

Flow-diverting stents are an important endovascular treatment option for intracranial aneurysms.1–3 These stents are placed in the parent artery across the aneurysm neck to divert blood flow. This procedure promotes thrombosis of the aneurysm sac and provides a scaffold for re-endothelialization across the aneurysm neck, ultimately sealing the aneurysm off from the circulation.4 A number of studies have shown the effectiveness of these stents.3,5,6

Accurate flow diverter (FD) sizing is of utmost importance to ensure good device wall apposition and optimize the length of the deployed FD. Accurate FD sizing avoids unnecessary branch vessel coverage and can make deployment easier by allowing the operator to avoid tortuous vessel segments, e.g., the carotid genu. Inappropriate FD sizing can lead to complications, potentially related to side vessel obstruction, poor wall apposition, and stent migration.7–9

To aid in FD sizing, device manufacturers indicate the so-called "labeled" length, which is a metric indicating the length assumed by the device when deployed in a straight tube of uniform diameter equal to the labeled diameter of the FD (not the diameter of the unconstrained, fully expanded device). Braided stents, when deployed in real clinical conditions, have been shown to elongate by up to approximately 80% of their labeled length.10 Deployed FD elongations complicate precise and consistent device sizing.

Currently, in conventional clinical practice, devices are selected by physicians based on manual measurements taken on 2D digital subtraction angiography (DSA) or 3D rotational DSA images. Because of the intricate mechanical behavior of braided FDs and irregularities of each patient’s anatomy, this process is time-consuming, intrinsically prone to error, mostly dependent on physician experience, and as such, hardly reproducible.11 To support physicians and facilitate stent sizing, various numerical methods and tools have been developed to predict stent deployment.12–18 However, there is currently no standard-of-care FD sizing tool validated for its accuracy across FD products available for clinical use.

The software described in this study (PreSize Neurovascular, Oxford Heartbeat Ltd.) is a novel visualization and simulation tool developed to enhance the planning of neurovascular FD interventions in aneurysm treatment. This software provides physicians with real-time planning support, allowing them to preoperatively "test" different devices in a patient’s anatomy in a safe virtual environment. This work reports the evaluation methodology and accuracy results of the PreSize software overall and for each processing component: image segmentation, meshing, centerline extraction, and stent deployment, all of which together contribute to the final prediction accuracy. This work is a necessary step in demonstrating the reliability of the software to be used by physicians, prior to its impact assessment in prospective clinical practice.

Methods

Software Processing Steps

The processing steps within the software for 3D visualization of patient anatomy, and "best-fit" stent selection for preoperative surgery planning, are overviewed and illustrated in Fig. 1. The process was designed to be automated and involve a limited number of manual steps.

FIG. 1.

Illustration of the processing pipeline steps within the software: load preoperative patient images (A), select ROI (B), automatic visualization of a 3D model of the vessels and aneurysm (C), automatic visualization of vessel centerlines (yellow; D), select proximal (P) and distal (D) points along the centerlines (E), automatic visualization of deployed stent based on proximal/distal point selection (F), manually adjust stent position by dragging (G), and change stent model and size using the available menu (H).

Study Design

A reference measurement of the deployed stent length is needed for comparison to validate the accuracy of virtual deployment. Direct measurements in patients are impossible due to the procedure’s minimally invasive nature. Clinicians rely on live 2D radiography and radiographic angiography images to guide deployment. Three-dimensional images of the deployed stents are not routinely acquired in the United Kingdom (UK), meaning that the reference, or measured, length of the deployed device in 3D has to be inferred from 2D postoperative images. These 2D images were used to obtain reference measurements (similar to the study of Narata et al.10) that constituted the "ground truth," against which the accuracy of the software deployed-length predictions is assessed. The protocol for the accuracy evaluation between reference and simulated lengths is overviewed and illustrated in Fig. 2.
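
The reference length itself is the arclength of the vessel centerline between the two transferred markers (Fig. 2F). As a minimal illustration of that measurement, assuming the centerline is available as an ordered list of 3D points (the software's internal representation is not public), the arclength can be computed by summing consecutive point-to-point distances:

import numpy as np

def centerline_arclength(points, idx_distal, idx_proximal):
    # points: (N, 3) array of ordered 3D centerline coordinates (hypothetical
    # export of the vessel centerline); idx_*: indices of the centerline points
    # closest to the transferred distal and proximal stent markers.
    i, j = sorted((idx_distal, idx_proximal))
    segment = np.asarray(points, dtype=float)[i:j + 1]
    steps = np.diff(segment, axis=0)                     # consecutive 3D steps
    return float(np.sum(np.linalg.norm(steps, axis=1)))  # sum of step lengths (mm)

# Usage on a synthetic quarter-circle centerline of radius 10 mm
t = np.linspace(0.0, np.pi / 2, 200)
pts = np.column_stack([10 * np.cos(t), 10 * np.sin(t), np.zeros_like(t)])
L_ref = centerline_arclength(pts, 0, 199)  # ~15.7 mm (quarter circumference)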

FIG. 2.

A–F: Illustration of the evaluation protocol to calculate the accuracy of FD deployment length. Preoperative 2D images are loaded in the software and the software automatically produces a 3D model of the vessel tree (A) along with the centerlines (B). Pre- and postoperative images are coregistered for an aligned view of the distal and proximal locations of the stent in relation to the preoperative anatomy (C). With the use of postoperative 2D images without (D) and with (E) contrast, the distal and proximal ends of the deployed stents are marked and transferred to the 3D model (F). The length of the vessel centerline between the two markers is recorded as measured, or reference, deployed stent length (Lref). An FD of the same dimensions as those used in surgery is virtually deployed between distal and proximal markers, and its length is recorded as simulated stent length (Lsim). G and H: One illustrative example case from the evaluation set illustrates the comparison between a 2D angiographic image with the FD visible (G; the measured length is the length of the blue curve between the markers) and a 2D angiographic image with contrast showing the vasculature, with an overlay of the virtually deployed stent (H; the simulated length is the length of the green curve).

The study received Health Research Authority approval (legal compliance and ethics review for research projects in the UK’s National Health System). Due to the study’s retrospective nature and use of anonymized data, the requirement for informed patient consent was waived.

Case Selection Criteria and Sample Size

Any consecutive case in which an intracranial aneurysm was treated with a single Pipeline embolization device (PED; Medtronic Inc.), and the FD was assessed as fully open at the end of the procedure, in any of the participating hospitals between July 2014 and July 2021, was considered suitable for study inclusion. While other FD models are available in the software, this evaluation study focused on the PED, given its widespread clinical use.

Further selection criteria were based on availability of the following required minimal image data set: 1) preoperative 3D rotational angiography (3DRA) of the aneurysm and adjacent vessels, resliced in a parallel view; 2) biplane intraoperative 2D radiography and radiographic angiography showing the fully opened stent after deployment (assessed by the clinician or hospital radiographer performing data export); and 3) information on the labeled diameter and length of the implanted FD, e.g., PED-250-14, referring to a PED of 2.5-mm labeled diameter and 14-mm labeled length. A minimum sample size between 24 and 50 was recommended19,20 to produce statistically significant parameters, e.g., standard deviation and mean.

Image Acquisition

Preoperative 3DRA images and 2D postoperative sequences were acquired using imaging systems from the three main UK vendors: Siemens Medical Solutions (n = 51), Philips Healthcare (n = 57), and General Electric Healthcare (n = 30). Voxel sizes of the 3DRA images ranged from 0.139 × 0.139 × 0.139 mm to 0.484 × 0.484 × 0.484 mm.

Segmentation and Meshing

Preoperative 3DRA images were segmented in the software using a fully automated method. Segmentation is the first step in building 3D anatomical models from raw medical imaging data (Fig. 2A). An anatomical region of interest (ROI) is demarcated, and all other irrelevant pixel data are disregarded. Once this process is completed, a 3D mesh is extracted from the selected ROI.
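
The software's segmentation algorithm itself is proprietary and not detailed here. Purely as an illustration of the general ROI-thresholding-and-meshing idea, the following sketch uses scikit-image's marching cubes with a hypothetical intensity isovalue; it is not the method implemented in PreSize:

import numpy as np
from skimage import measure

def mesh_from_roi(volume, roi, level, voxel_size):
    # volume: 3D array of 3DRA intensities; roi: tuple of slice objects
    # demarcating the region of interest; level: intensity isovalue assumed to
    # separate contrast-filled vessel from background; voxel_size: (dz, dy, dx) in mm.
    sub_volume = volume[roi]
    verts, faces, normals, values = measure.marching_cubes(
        sub_volume, level=level, spacing=voxel_size)
    return verts, faces  # triangulated surface mesh of the vessels and aneurysm

# Usage with a synthetic volume containing a bright "vessel" block
vol = np.zeros((64, 64, 64))
vol[20:44, 20:44, :] = 100.0
roi = (slice(0, 64), slice(0, 64), slice(0, 64))
verts, faces = mesh_from_roi(vol, roi, level=50.0, voxel_size=(0.25, 0.25, 0.25))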

Over- and undersegmentation can cause the merging or splitting of nonconnected or connected structures, respectively; over- or underestimations of the local vessel diameter; and modifications to the correct vessel and aneurysm shape. Validating the segmentation and meshing is therefore of utmost importance to ensure accurate vessel tree representation, subsequently impacting the evaluation of vessel centerlines, FD deployment simulation, and ultimately, the accuracy of the virtual deployment. The accuracy of this step was evaluated with the same methodology used for the Aneurisk data set.21 The meshes created through the software were first coregistered and then compared with other meshes constructed either manually or automatically from the same DICOM image set.

Forty-three meshes from various sources were collected: 1) 24 cases from the Aneurisk database, a collection of semiautomated segmentations from patient cases captured in a study with the university and research hospital in Milan (2005–2008);21 2) 14 cases from the SwissNeuroFoundation, an ongoing study of manually segmented cases performed by clinicians across Switzerland;22 and 3) 5 cases from Visible Patient (France), a company that specializes in and is CE-marked for semiautomated segmentation for medical imaging. They produce automatically segmented cases, which are then cross-checked by expert clinicians.23 The segmentation evaluation was performed using a metric called Hausdorff distance, a widely accepted method of comparing the morphology and distance of two meshes, especially within the neurovascular community.24 The Hausdorff distance was normalized by a characteristic length (i.e., mean vessel tree radius) to produce an accuracy metric.
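
As a simplified sketch of such a comparison, the symmetric Hausdorff distance between two coregistered meshes can be approximated from their vertex sets and normalized by the mean vessel tree radius; the exact surface-to-surface variant and accuracy conversion used in the evaluation are not specified beyond this normalization:

import numpy as np
from scipy.spatial import cKDTree

def normalized_hausdorff(verts_a, verts_b, mean_radius):
    # verts_a, verts_b: (N, 3) and (M, 3) vertex coordinates of two coregistered
    # surface meshes of the same vessel tree; mean_radius: characteristic length (mm).
    d_a_to_b = cKDTree(verts_b).query(verts_a)[0]  # distance from each A vertex to B
    d_b_to_a = cKDTree(verts_a).query(verts_b)[0]  # distance from each B vertex to A
    hausdorff = max(d_a_to_b.max(), d_b_to_a.max())
    return hausdorff / mean_radius  # dimensionless; smaller means closer agreement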

Centerline Extraction

Centerlines are powerful descriptors of a vascular geometry. Briefly, centerlines are represented as a tree where every branch is a polyline, determined as the weighted shortest path, traced between two or more extremal points (Fig. 2B).25 Additionally, each point on the polyline in 3D space has an associated radius, calculated as the shortest distance from the surface mesh.

The centerline and its radius are central to the way the software calculates how an FD unfolds within a vessel. The centerlines extracted by the software were validated against those extracted from the same meshes by the Vascular Modeling Toolkit (VMTK), the leading industry standard, widely used in the scientific community.26 The validation was conducted for a subset (n = 15) of all collected cases as follows: two sets of centerlines (PreSize software vs VMTK) were compared using a simple Hausdorff distance between center points. At these associated nearest points, the relative difference between centerline radii was calculated, providing a comparison in terms of the location of the central point and the approximated size of the local vessel.
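
A simplified sketch of this comparison, assuming each centerline is given as a set of 3D center points with associated inscribed-sphere radii (a stand-in for the exact matching procedure, which is not fully specified here):

import numpy as np
from scipy.spatial import cKDTree

def compare_centerlines(pts_ref, rad_ref, pts_test, rad_test):
    # pts_*: (N, 3) center point coordinates; rad_*: (N,) maximal inscribed
    # sphere radii at those points. For each test point, find the nearest
    # reference point, then report the mean point-to-point distance and the
    # mean relative difference between the associated radii.
    dists, idx = cKDTree(pts_ref).query(pts_test)
    rel_radius_diff = np.abs(rad_test - rad_ref[idx]) / rad_ref[idx]
    return float(dists.mean()), float(rel_radius_diff.mean() * 100.0)  # mm, %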

Data Analysis

For all cohort cases, we assessed the following:

  • Relative change in length between labeled FD length (Llabeled) and measured reference length (Lref): Lchange = (|Lref – Llabeled|/Llabeled) × 100%.

  • Prediction error (absolute percentage difference between the reference and simulated [Lsim] device length): Error = (|Lsim – Lref|/Lref) × 100%.

  • Prediction accuracy: Accuracy = (1 – |Lsim – Lref|/Lref) × 100% = 100% – Error.

Measured and simulated lengths were also compared using the Bland-Altman plot and a two one-sided t-test for equivalence. A significance level of 0.05 was chosen for the statistical analysis. Data analysis was performed in Matlab (MathWorks Inc.) by expert engineers of the software company.
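
The study's analysis was performed in Matlab; below is a minimal Python sketch of the per-case metrics defined above (inputs are fabricated lengths in mm, not study data):

import numpy as np

def deployment_metrics(L_labeled, L_ref, L_sim):
    # All inputs are 1D arrays of per-case lengths in mm.
    length_change = np.abs(L_ref - L_labeled) / L_labeled * 100.0  # Lchange (%)
    error = np.abs(L_sim - L_ref) / L_ref * 100.0                  # prediction error (%)
    accuracy = 100.0 - error                                       # prediction accuracy (%)
    return length_change, error, accuracy

# Example with three fabricated cases (illustrative only)
L_labeled = np.array([14.0, 20.0, 16.0])
L_ref = np.array([22.5, 27.1, 24.9])
L_sim = np.array([22.0, 28.0, 24.0])
change, err, acc = deployment_metrics(L_labeled, L_ref, L_sim)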

Results

Segmentation and Meshing

An average 99% accuracy was obtained with respect to manual and semiautomated segmentation, confirming the high accuracy of the segmentation method implemented in the PreSize software.

Centerline Extraction

A mean point-to-point centerline distance of 0.0257 mm was calculated, corresponding to a mean error of 1.68% when normalized by the mean centerline radius. A mean relative difference between PreSize and VMTK radii of 2.19% was also calculated. Overall, the aggregate results show good quantitative agreement between the centerline extraction methods implemented in PreSize and VMTK.

Labeled Versus Measured FD Length

A multicenter anonymized data set of 138 consecutive PED FD cases was collected from 5 UK interventional radiology centers (see Table 1 for case numbers by center), exceeding the minimum sample size requirement set out in the Methods section. Figure 3A shows a scatterplot of the labeled diameters and lengths for the devices used in the processed cases, grouped by center. The FDs appear to be uniformly distributed, and no specific distribution pattern could be observed. The labeled FD length was consistently shorter than the measured FD length. Figure 3B shows the distribution of relative length change between Lref and Llabeled, and Fig. 3C presents the linear regression (R2 = 0.7646) between these quantities. A mean length change of 43.87% was observed, with changes as high as 106.6%, meaning that some devices became more than twice as long as their labeled length. The standard error of the regression was 3.925 mm, corresponding to a 95% confidence interval [CI] of −7.69 to 7.69 mm. This wide interval indicates that there is no demonstrable correlation between FD dimensions and their deployment elongation, further demonstrating that the labeled length alone cannot predict the implanted length with satisfactory precision.
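
To make the reported regression quantities concrete, the sketch below fits Lref against Llabeled and derives the standard error of the regression and the corresponding ±1.96 × SE band quoted above; the input lengths are fabricated for illustration, not study data:

import numpy as np
from scipy import stats

# Fabricated labeled and measured deployed lengths (mm), for illustration only
L_labeled = np.array([10.0, 12.0, 14.0, 16.0, 18.0, 20.0, 25.0, 30.0, 35.0])
L_ref = np.array([15.2, 17.0, 21.5, 22.8, 27.9, 28.5, 36.0, 44.1, 50.3])

fit = stats.linregress(L_labeled, L_ref)
predicted = fit.intercept + fit.slope * L_labeled
residuals = L_ref - predicted
se_regression = np.sqrt(np.sum(residuals**2) / (len(L_ref) - 2))  # standard error (mm)
r_squared = fit.rvalue**2
ci95_halfwidth = 1.96 * se_regression  # in the study, 1.96 * 3.925 mm ~ 7.69 mm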

TABLE 1.

Summary of the deployment accuracy by interventional radiology center and overall, including and excluding outliers

Deployment accuracy given as mean ± SD (%), with minimum accuracy (%) and number of cases (n), including (*) and excluding (†) the outlier cases:

Royal Infirmary of Edinburgh: 94.25 ± 5.74, min 73.95, n = 45*; 95.46 ± 3.51, min 84.46, n = 42†
Leeds Teaching Hospitals: 96.07 ± 4.18, min 81.27, n = 35*; 96.51 ± 3.34, min 83.81, n = 34†
St. George’s University Hospitals: 96.41 ± 2.49, min 90.40, n = 30*; 96.41 ± 2.49, min 90.40, n = 30†
Royal Preston Hospital: 96.43 ± 3.88, min 82.71, n = 22*; 96.43 ± 3.88, min 82.71, n = 22†
National Hospital for Neurology and Neurosurgery: 96.21 ± 2.32, min 91.85, n = 6*; 96.21 ± 2.32, min 91.85, n = 6†
Overall: 95.61 ± 4.43, min 73.95, n = 138*; 96.13 ± 3.27, min 82.71, n = 134†

* Including outliers. † Excluding outliers.

FIG. 3.

A: Scatterplot of PED stent models in the study. Each box represents a deployed device and contains the number of occurrences for each participating interventional radiology center. B: Distribution of relative length change between the measured deployed length and the labeled stent length. C: Linear regression between the measured deployed length and the labeled stent length. NHNN = National Hospital for Neurology and Neurosurgery.

Measured Versus Simulated FD Length

Equivalence testing was used to validate software length predictions against the measured implanted lengths for the 138 PED FD cases.27 A two one-sided paired t-test was used to test equivalence between the measured (24.06 ± 8.06 mm) and simulated (23.97 ± 8.00 mm) lengths, within a margin of equivalence of δ = −0.5 to 0.5 mm. This value was chosen as an estimate of the uncertainty introduced by the operator when placing the distal and proximal deployment markers in the software.
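
A simplified sketch of such a two one-sided paired t-test (TOST), assuming the measured and simulated lengths are available as paired arrays; this illustrates the statistical procedure, not the authors' Matlab implementation:

import numpy as np
from scipy import stats

def tost_paired(measured, simulated, delta=0.5, alpha=0.05):
    # Two one-sided paired t-tests for equivalence of means within +/- delta (mm).
    diff = np.asarray(simulated) - np.asarray(measured)
    n = diff.size
    se = diff.std(ddof=1) / np.sqrt(n)
    t_lower = (diff.mean() + delta) / se     # tests H0: mean difference <= -delta
    t_upper = (diff.mean() - delta) / se     # tests H0: mean difference >= +delta
    p_lower = stats.t.sf(t_lower, df=n - 1)  # reject if small
    p_upper = stats.t.cdf(t_upper, df=n - 1) # reject if small
    equivalent = (p_lower < alpha) and (p_upper < alpha)
    return equivalent, (t_lower, t_upper), (p_lower, p_upper)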

The test showed the equivalence between the two sets within the chosen margin and with a significance level of 5% (with the two t-statistics being t1(137) = −3.21, t2(137) = 4.61, p1 and p2 < 0.001). A mean deployment accuracy of 95.61% ± 4.43% (median 97.03%), with a 95% CI of 94.87%–96.35%, was calculated. One hundred twenty-nine (93%) of 138 processed PED cases produced a deployment accuracy greater than 90%, and 94 (68%) of 138 greater than 95% deployment accuracy. A histogram of the distribution of deployment accuracy is shown in Fig. 4A, and box plots of the accuracy per interventional center in Fig. 4B. The agreements between reference and simulated lengths were compared using Bland-Altman plots (Fig. 4C).

FIG. 4.

A: Bar graph showing the distribution of the deployment accuracy. B: Box plots of the deployment accuracy grouped by interventional radiology center. Red crosses indicate cases in which accuracy is more than twice the interquartile Q3-Q1 distance from the sample median. C: Bland-Altman plot comparing FD reference and simulated lengths for the 138 analyzed cases.

Among the 138 cases, 4 outliers were identified that displayed particularly low accuracy (< 82%). In those cases, events such as interactions with coils and severe push/pull maneuvers appear to have affected the deployed length to an extent that the software was unable to simulate or predict (see Discussion for a more detailed overview of these cases). After excluding these 4 outliers, the mean deployment accuracy was 96.13% ± 3.27% (median 97.09%), with a 95% CI of 95.58%–96.69%. In 129 (96.27%) of these 134 PED cases, accuracy remained greater than 90% (error < 10%), while a relatively low error rate of 18.73% was recorded in the worst reported case.

Table 1 summarizes the mean and standard deviation of the deployment accuracy calculated for each interventional radiology center and overall, both including and excluding the outliers.

Discussion

Labeled FD Length

Conventionally, FD diameter is sized to match the largest diameter of the selected vessel segment, but vessels have nonconstant diameters and distal tapering, which usually results in an oversized FD in the distal section. Considering that the deployed length of braided stents is dependent on the local diameter of the targeted vessel segment, FD oversizing usually causes stent over-elongation. In addition, the FD labeled length is the length assumed by the device when deployed in a straight tube of uniform labeled diameter, which differs from the stress-free condition; specifically for PEDs, the stress-free diameter is 0.25 mm larger than the labeled diameter.1 This means that sections of the FDs could actually shorten compared to their labeled state in regions in which the vessel width is larger than the labeled diameter (for example, at the aneurysm neck, or within large fusiform aneurysms). These two observations indicate that estimating the deployed length of a device in real-patient aneurysmal anatomy is nontrivial, and labeled length is not a reliable predictor of the deployed length.

The results reported in this study indeed show a consistently large relative change between labeled and postdeployment FD length for the PED, of up to 106%. This finding not only agrees with the previously observed FD elongations of approximately 80%,10 but also indicates that elongation may be even more pronounced, further illustrating the complexity of accurate and consistent FD sizing.

Simulated FD Length

To the authors’ knowledge, there is no published study that validates the overall accuracy of this type of software as well as its individual components (image segmentation, centerline extraction, virtual FD deployment). Furthermore, the current study represents the largest multicenter cohort of aneurysm cases treated with a single device (Medtronic’s PED) for this type of study.10,28 PreSize deployed-length predictions in this study showed a mean percentage error of 4.39% (median 2.97%) and a mean error of 0.1 mm, which is (to our knowledge) the lowest reported error rate for tools with similar functionality.10 When grouping by hospital or imaging system, no significant differences were observed between groups.

Among the 138 cases, 4 outliers were identified that displayed a particularly low accuracy of < 82%. In 2 of the 4 identified outlier cases, the FD was deployed after jailing the microcatheter into the aneurysm in preparation for coiling, as reported by the operating physician. This partially obstructed the lumen during deployment, causing a greater implanted length that could not have been simulated by the software. The third outlier case presented a large fusiform aneurysm. In such cases, stent deployment is more sensitive to push/pull maneuvers, with a number of equally valid deployed configurations (and thus deployed lengths) possible. The fourth and final outlier was a case with a “major push” maneuver, as reported by the operating physician. As a result of these events, the deployed length could not have been simulated or predicted by the software, nor is simulating such events the intended use of the software. When these outlier cases are discounted, the mean percentage error decreases further to 3.87% (median 2.91%).

The software predicts the deployed length based on an average/standard deployment technique. FD deployed length, in certain aneurysm types (such as large/giant aneurysms), might also be dependent on the loading forces applied during deployment. However, considering that there is no strong evidence that intraoperative forces such as push/pull maneuvers or postdeployment balloon angioplasty can be accurately and consistently quantified or validated, they have not been incorporated in the software. Nevertheless, it is interesting to note that even though the software does not simulate such push/pull stent manipulations, the PreSize prediction accuracy was demonstrated to be very high in this study involving a consecutive retrospective data set acquired from multiple sources, i.e., from cases performed by multiple operators with varying experience and practice and multiple imaging systems that could have impacted simulation accuracy. This indicates that, at least for the PED, not simulating such forces does not appear to compromise the accuracy of FD deployed-length predictions.

The fact that the software’s deployed-length prediction does not appear to be affected by not simulating those forces raises an interesting question of the actual impact the deployment maneuvers have on the final deployed stent length for this particular FD device. Separately, intraoperative stent manipulations are often conducted during the procedure due to suboptimal stent fit. Having established high simulation accuracy with PreSize, the aim is for the software tool to be used prospectively to inform preoperative planning and stent selection that subsequently might minimize the need for such forces, thus also minimizing the need to simulate such forces preoperatively.

Limitations

The main limitation of the study is due to its retrospective nature. The analysis in this work was performed on the available data at the time of collection. Poor positioning, complications in stent deployment, and potential intraoperative manipulations of the implanted devices were not consistently recorded, and thus could not be analyzed. Due to lack of such information about the deployment procedure, it is not possible to conclusively evaluate whether some accuracy variations were caused by intraoperative stent manipulations.

Cases in which more than one FD is deployed to achieve sufficient coverage were considered out of the scope for this study and the software’s functionality. While FDs normally used for the treatment of intracranial aneurysms are braided stents, and thus are expected to behave similarly, the present retrospective evaluation was only performed on the PED. These results should therefore not be directly extended to FDs produced by other manufacturers.

Conclusions

The presented study reports a thorough evaluation of intracranial FD length predictions with PreSize Neurovascular software using real clinical data. With a large multicenter cohort of aneurysm cases treated with Medtronic’s PED, including a range of anatomies, neurovascular imaging systems, and multiple operators, the robustness of the software to process data was thoroughly tested. Segmentation and centerline extraction components showed high accuracy, and the software yielded a very good estimate of the stent length after deployment (> 95%). This study also demonstrated a large discrepancy between labeled and measured deployed FD length. These are important findings, given the challenge in consistently and accurately predicting deployed length to inform stent sizing in current practice. Inaccurate sizing can result in suboptimal deployments, device removal and waste, and potentially, intraoperative and late complications. This present accuracy evaluation using real clinical data was the necessary step in demonstrating the reliability of the PreSize software to be used by physicians, prior to its impact assessment in prospective clinical practice.

Acknowledgments

We gratefully acknowledge Dr. Philippe Bijlenga and the contribution of SwissNeuroFoundation for providing data used to validate segmentation and meshing, and Dr. Max Whitby, who helped perform the segmentation and centerline evaluations. This research was supported by a National Institute for Health Research grant (no. NIHR II-BP-0817-10020) and an Innovate UK grant (no. BMC-P 105100).

Disclosures

Dr. Iori is employed at the company (Oxford Heartbeat Ltd.) that developed the software included in the presented study. Dr. Downer reports being a consultant for Medtronic, MicroVention, Stryker, and Neurologic.

Author Contributions

Acquisition of data: Patankar, Madigan, Downer, Sonwalkar, Cowley. Analysis and interpretation of data: Iori. Drafting the article: Patankar. Critically revising the article: Madigan, Downer. Reviewed submitted version of manuscript: Sonwalkar, Cowley. Statistical analysis: Iori.

Supplemental Information

Previous Presentations

Preliminary results of this study were previously presented at the ABC-Win Seminar on January 12–17, 2020, in Val d’Isere, France.

References

1. Shapiro M, Raz E, Becske T, Nelson PK. Variable porosity of the pipeline embolization device in straight and curved vessels: a guide for optimal deployment strategy. AJNR Am J Neuroradiol. 2014;35(4):727–733.
2. Nelson PK, Lylyk P, Szikora I, Wetzel SG, Wanke I, Fiorella D. The pipeline embolization device for the intracranial treatment of aneurysms trial. AJNR Am J Neuroradiol. 2011;32(1):34–40.
3. Becske T, Potts MB, Shapiro M, et al. Pipeline for uncoilable or failed aneurysms: 3-year follow-up results. J Neurosurg. 2017;127(1):81–88.
4. Zanaty M, Chalouhi N, Starke RM, et al. Flow diversion versus conventional treatment for carotid cavernous aneurysms. Stroke. 2014;45(9):2656–2661.
5. Becske T, Brinjikji W, Potts MB, et al. Long-term clinical and angiographic outcomes following pipeline embolization device treatment of complex internal carotid artery aneurysms: five-year results of the pipeline for uncoilable or failed aneurysms trial. Neurosurgery. 2017;80(1):40–48.
6. Rangel-Castilla L, Munich SA, Jaleel N, et al. Patency of anterior circulation branch vessels after Pipeline embolization: longer-term results from 82 aneurysm cases. J Neurosurg. 2017;126(4):1064–1069.
7. Chalouhi N, Tjoumakaris SI, Gonzalez LF, et al. Spontaneous delayed migration/shortening of the pipeline embolization device: report of 5 cases. AJNR Am J Neuroradiol. 2013;34(12):2326–2330.
8. Heller RS, Dandamudi V, Calnan D, Malek AM. Neuroform intracranial stenting for aneurysms using simple and multi-stent technique is associated with low risk of magnetic resonance diffusion-weighted imaging lesions. Neurosurgery. 2013;73(4):582–591.
9. Gascou G, Lobotesis K, Brunel H, et al. Extra-aneurysmal flow modification following pipeline embolization device implantation: focus on regional branches, perforators, and the parent vessel. AJNR Am J Neuroradiol. 2015;36(4):725–731.
10. Narata AP, Blasco J, Roman LS, et al. Early results in flow diverter sizing by computational simulation: quantification of size change and simulation error assessment. Oper Neurosurg (Hagerstown). 2018;15(5):557–566.
11. Drescher F, Weber W, Berlis A, et al. Treatment of intra- and extracranial aneurysms using the flow-redirection endoluminal device: multicenter experience and follow-up results. AJNR Am J Neuroradiol. 2017;38(1):105–112.
12. Babiker MH, Gonzalez LF, Ryan J, et al. Influence of stent configuration on cerebral aneurysm fluid dynamics. J Biomech. 2012;45(3):440–447.
13. Babiker H, Kalani Y, Levitt M, et al. Clinical validations of simulated neurovascular braided stent deployments. Ann Biomed Eng. 2016;44(12):3723–3725.
14. Larrabide I, Kim M, Augsburger L, Villa-Uriol MC, Rüfenacht D, Frangi AF. Fast virtual deployment of self-expandable stents: method and in vitro evaluation for intracranial aneurysmal stenting. Med Image Anal. 2012;16(3):721–730.
15. Bernardini A, Larrabide I, Petrini L, et al. Deployment of self-expandable stents in aneurysmatic cerebral vessels: comparison of different computational approaches for interventional planning. Comput Methods Biomech Biomed Engin. 2012;15(3):303–311.
16. Larrabide I, Radaelli A, Frangi A. Fast virtual stenting with deformable meshes: application to intracranial aneurysms. Med Image Comput Comput Assist Interv. 2008;11(Pt 2):790–797.
17. Spranger K, Ventikos Y. Which spring is the best? Comparison of methods for virtual stenting. IEEE Trans Biomed Eng. 2014;61(7):1998–2010.
18. Bouillot P, Brina O, Ouared R, et al. Geometrical deployment for braided stent. Med Image Anal. 2016;30:85–94.
19. Julious SA. Sample sizes for clinical trials with normal data. Stat Med. 2004;23(12):1921–1986.
20. Sim J, Lewis M. The size of a pilot study for a clinical trial should be calculated in relation to considerations of precision and efficiency. J Clin Epidemiol. 2012;65(3):301–308.
21. AneuriskWeb. Accessed December 20, 2021. http://ecm2.mathcs.emory.edu/aneuriskweb
22. Swiss Neuro Foundation. Accessed December 20, 2021. https://www.swissneurofoundation.org
23. Visible Patient. Accessed December 20, 2021. https://www.visiblepatient.com
24. Ghaffari M, Sanchez L, Xu G, et al. Validation of parametric mesh generation for subject-specific cerebroarterial trees using modified Hausdorff distance metrics. Comput Biol Med. 2018;100:209–220.
25. Antiga L. Patient-Specific Modeling of Geometry and Blood Flow in Large Arteries. Doctoral thesis. Politecnico di Milano; 2002.
26. VMTK. Accessed December 20, 2021. http://www.vmtk.org
27. Robinson AP, Froese RE. Model validation using equivalence tests. Ecol Modell. 2004;176(3-4):349–358.
28. Ospel JM, Gascou G, Costalat V, Piergallini L, Blackham KA, Zumofen DW. Comparison of pipeline embolization device sizing based on conventional 2D measurements and virtual simulation using the Sim&Size software: an agreement study. AJNR Am J Neuroradiol. 2019;40(3):524–530.


