Hyung Rae Lee, Dong-Ho Lee, Jae Hwan Cho, Eui Seung Hwang, Sang Yun Seok, Sehan Park, and Choon Sung Lee

OBJECTIVE

The objective of this study was to evaluate the feasibility and complications of the over-the-arch (OTA) technique for screw insertion into the C1 lateral mass in patients in whom conventional techniques (i.e., posterior arch [PA] and inferior lateral mass [ILM]) are not feasible due to 1) PA with a very small height (< 3.5 mm), 2) a caudally tilted PA blocking the inferior part of the C1 lateral mass, or 3) loss of height at the ILM (< 3.5 mm).

METHODS

The authors reviewed the medical records of 60 patients who underwent C1 screw fixation with the OTA technique (13 screws) and the PA/ILM technique (107 screws) between 2011 and 2019. Vertebral artery (VA) injuries, screw malposition, and bony union were radiologically assessed. Clinical outcome measures, including Neck Disability Index (NDI), Japanese Orthopaedic Association (JOA) scale score, and occipital neuralgia, were recorded.

RESULTS

Thirteen OTA screws were successfully inserted without any major complications. NDI and JOA scale scores did not show significant differences between the two groups at final follow-up. No VA injuries were recognized during screw insertion. There was no evidence of ischemic damage to the VA or bony erosion in the occiput or atlas. Medial wall violation was observed in 1 screw (7.7%); however, no C0–1, C1–2, or lateral wall violations were observed. No patients developed new-onset neuralgia postoperatively after C1 fixation with the OTA technique.

CONCLUSIONS

The OTA technique was safe and useful for C1 screw fixation in patients in whom conventional techniques could not be employed.

Hisanori Ikuma, Tomohiko Hirose, Shinichiro Takao, Masataka Ueda, Kazutaka Yamashita, Kazutoshi Otsuka, and Keisuke Kawasaki

OBJECTIVE

Patients with ankylosing spinal disorders (ASDs), such as ankylosing spondylitis and diffuse idiopathic skeletal hyperostosis, often have rigid kyphosis of the spine. The fracture site is sometimes unintentionally displaced when surgery is performed with the patient prone. To prevent this complication, the authors adopted the lateral decubitus position for intraoperative positioning in this pathology. The aim of this study was to retrospectively assess the perioperative impact of the lateral decubitus position on posterior fixation for thoracolumbar fractures with ASD.

METHODS

Thirty-seven consecutive patients who underwent posterior instrumentation for thoracolumbar fracture with ASD at the authors’ institute were divided into a lateral decubitus group (group L, 15 patients) and a prone group (group P, 22 patients). Surgical time, estimated blood loss (EBL), number of levels fused, perioperative complications, length of stay (LOS), fracture void ratio, and anterior wall height ratio were investigated. The fracture void ratio and the anterior wall height ratio are radiological measures on CT of the degree of reduction of the vertebral fracture.

RESULTS

Age, sex, BMI, fracture level, and LOS were similar between the groups. The number of levels fused was significantly smaller and EBL significantly lower in group L (p < 0.001 and p = 0.04, respectively), but there was no significant difference in surgical time. The complication rate was similar, although 1 death within 90 days after surgery occurred in group P. The fracture void ratio was 85.4% ± 12.8% in group L and 117.5% ± 37.3% in group P. A significantly larger proportion of patients with a fracture void ratio of 100% or less was found in group L (86.7% vs 36.4%, p = 0.002). The anterior wall height ratio was 107.5% ± 12.3% in group L and 116.9% ± 18.8% in group P. A significantly larger proportion of patients with an anterior wall height ratio of 100% or less was also found in group L (60.0% vs 27.3%, p = 0.046).

CONCLUSIONS

The results of this study suggest that the lateral decubitus position may help close or maintain the fracture void and may prevent the unintentional intraoperative extension displacement of the fracture site that is often seen with the prone position during surgery for thoracolumbar fractures involving ASD.

Hao You, Xing Fan, Jiajia Liu, Dongze Guo, Zhibao Li, and Hui Qiao

OBJECTIVE

The current study investigated the correlation between intraoperative motor evoked potential (MEP) and somatosensory evoked potential (SSEP) monitoring and both short-term and long-term motor outcomes in aneurysm patients treated with surgical clipping. Moreover, the authors sought to identify the optimal neurophysiological predictor of postoperative motor deficits (PMDs) in patients with ruptured and unruptured aneurysms.

METHODS

A total of 1017 patients (216 with ruptured aneurysms and 801 with unruptured aneurysms) were included. Patient demographic characteristics, clinical features, intraoperative monitoring data, and follow-up data were retrospectively reviewed. The efficacy of using changes in MEP/SSEP to predict PMDs was assessed using binary logistic regression analysis. Subsequently, receiver operating characteristic curve analysis was performed to determine the optimal critical value for duration of MEP/SSEP deterioration.
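
A minimal sketch of how such an optimal cutoff is commonly derived from an ROC curve (e.g., via Youden's J statistic); the data and variable names below are hypothetical, and the authors' exact procedure is not specified beyond "ROC curve analysis":

```python
# Illustrative sketch only: choosing an "optimal critical value" for the duration of
# MEP deterioration from an ROC analysis using Youden's J. Toy data, not study data.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

duration_min = np.array([3, 25, 8, 40, 15, 5, 30, 2, 18, 22], dtype=float)  # minutes of MEP deterioration
pmd = np.array([0, 1, 0, 1, 0, 0, 1, 0, 1, 1])  # 1 = short-term postoperative motor deficit

fpr, tpr, thresholds = roc_curve(pmd, duration_min)
auc = roc_auc_score(pmd, duration_min)

# Youden's J = sensitivity + specificity - 1; the threshold maximizing J is one
# common definition of the optimal critical value reported in such analyses.
j = tpr - fpr
optimal_cutoff = thresholds[np.argmax(j)]
print(f"AUC = {auc:.3f}, optimal duration cutoff = {optimal_cutoff:.1f} min")
```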

RESULTS

Both intraoperative MEP and SSEP monitoring were significantly effective for predicting short-term (p < 0.001 for both) and long-term (p < 0.001 for both) PMDs in aneurysm patients. The critical values for predicting short-term PMDs were amplitude decrease rates of 57.30% for MEP (p < 0.001 and area under the curve [AUC] 0.732) and 64.10% for SSEP (p < 0.001 and AUC 0.653). In patients with an unruptured aneurysm, the optimal critical values for predicting short-term PMDs were durations of deterioration of 17 minutes for MEP (p < 0.001 and AUC 0.768) and 21 minutes for SSEP (p < 0.001 and AUC 0.843). In patients with a ruptured aneurysm, the optimal critical values for predicting short-term PMDs were durations of deterioration of 12.5 minutes for MEP (p = 0.028 and AUC 0.706) and 11 minutes for SSEP (p = 0.043 and AUC 0.813).

CONCLUSIONS

The authors found that both intraoperative MEP and SSEP monitoring are useful for predicting short-term and long-term PMDs in patients with unruptured and ruptured aneurysms. The optimal intraoperative neuromonitoring method for predicting PMDs varies depending on whether the aneurysm has ruptured or not.

Jenny C. Kienzler, Stephen Tenn, Srinivas Chivukula, Fang-I Chu, Hiro D. Sparks, Nzhde Agazaryan, Won Kim, Antonio De Salles, Michael Selch, Alessandra Gorgulho, Tania Kaprealian, and Nader Pouratian

OBJECTIVE

Precise and accurate targeting is critical to optimize outcomes after stereotactic radiosurgery (SRS) for trigeminal neuralgia (TN). The aim of this study was to compare the outcomes after SRS for TN in which two different techniques were used: mask-based 4-mm cone versus frame-based 5-mm cone.

METHODS

The authors performed a retrospective review of patients who underwent SRS for TN at their institution between 1996 and 2019. The Barrow Neurological Institute (BNI) pain score and facial hypesthesia scale were used to evaluate pain relief and facial numbness.

RESULTS

A total of 234 patients were included in this study; the mean age was 67 years. In 97 patients (41.5%) radiation was collimated by a mask-based 4-mm cone, whereas a frame-based 5-mm cone was used in the remaining 137 patients (58.5%). The initial adequate pain control rate (BNI I–III) was 93.4% in the frame-based 5-mm group, compared to 87.6% in the mask-based 4-mm group. This difference between groups persisted, with adequate pain control rates at ≥ 24 months of 89.9% and 77.8%, respectively. Pain relief was significantly different between groups from initial response until the last follow-up (≥ 24 months, p = 0.02). New, permanent facial hypesthesia occurred in 30.3% of patients (33.6% in the frame-based 5-mm group vs 25.8% in the mask-based 4-mm group). However, no significant association between the BNI facial hypesthesia score and group was found. Pain recurrence occurred earlier (median time to recurrence 12 months vs 29 months, p = 0.016) and more frequently (38.1% vs 20.4%, p = 0.003) in the mask-based 4-mm group than in the frame-based 5-mm group.

CONCLUSIONS

Frame-based 5-mm collimator SRS for TN resulted in better long-term pain relief, with a toxicity profile similar to that seen with mask-based 4-mm collimator SRS.

Mark D. Dijkman, Martine W. T. van Bilsen, Michael G. Fehlings, and Ronald H. M. A. Bartels

OBJECTIVE

Degenerative cervical myelopathy (DCM) is a major global cause of spinal cord dysfunction. Surgical treatment is considered a safe and effective way to improve functional outcome, although information about long-term functional outcome remains scarce despite increasing longevity. The objective of this study was to describe functional outcome 10 years after surgery for DCM.

METHODS

A prospective observational cohort study was undertaken in a university-affiliated neurosurgery department. All patients who underwent surgery for DCM between 2008 and 2010 as part of the multicenter Cervical Spondylotic Myelopathy International trial were included. Participants were approached for an additional virtual assessment 10 years after surgery. Functional outcome was assessed according to the modified Japanese Orthopaedic Association (mJOA; scores 0–18) score at baseline and 1, 2, and 10 years after surgery. The minimal clinically important difference was defined as a 1-, 2-, or 3-point improvement for mild, moderate, and severe myelopathy, respectively. Outcome was considered durable when stabilization or improvement at 2 years was maintained at 10 years. The self-evaluated effect of surgery was assessed using a 4-point Likert-like scale. Demographic, clinical, and surgical data were compared between patients who worsened and those who improved or remained stable, using descriptive statistics. Functional outcome was compared across follow-up time points with linear mixed models.
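
As an illustration of the linear mixed model comparison across time points, a minimal sketch with a random intercept per patient; the data are simulated and the model specification is an assumption, since the abstract does not detail the model structure:

```python
# Illustrative sketch only: mJOA score compared across follow-up time points with a
# random-intercept linear mixed model. Simulated data; not the study dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
records = []
for pid in range(37):                      # cohort size taken from the abstract
    base = rng.normal(13.1, 2.3)           # baseline mJOA, mean (SD) from the abstract
    for time, shift in [("baseline", 0.0), ("1y", 0.8), ("2y", 1.0), ("10y", 1.1)]:
        records.append({"patient": pid, "time": time,
                        "mjoa": float(np.clip(base + shift + rng.normal(0, 1), 0, 18))})
df = pd.DataFrame(records)

# mJOA ~ time point, with patient as the grouping factor (random intercept).
model = smf.mixedlm("mjoa ~ C(time, Treatment('baseline'))", df, groups=df["patient"])
print(model.fit().summary())
```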

RESULTS

Of the 42 originally included patients, 37 participated in the long-term follow-up (11.9% loss to follow-up, 100% response rate). The mean patient age was 56.1 years, and 42.9% of patients were female. Surgical approaches were anterior (76.2%), posterior (21.4%), or posterior with fusion (2.4%). The mean follow-up was 10.8 years (range 10–12 years). The mean mJOA score increased significantly from 13.1 (SD 2.3) at baseline to 14.2 (SD 3.3) at 10 years (p = 0.01). A minimal clinically important difference was achieved in 54.1% of patients, and stabilization of functional status was maintained in 75.0% in the long term. Patients who worsened were older (median 63 vs 52 years, p < 0.01) and had more comorbidities (70.0% vs 25.9%, p < 0.01). A beneficial effect of surgery was self-reported by 78.3% of patients.

CONCLUSIONS

Surgical treatment for DCM results in satisfactory improvement of functional outcome that is maintained at 10-year follow-up.

MirHojjat Khorasanizadeh, Yu-Ming Chang, Alejandro Enriquez-Marulanda, Satomi Mizuhashi, Mohamed M. Salem, Santiago Gomez-Paz, Farhan Siddiq, Peter Kan, Justin Moore, Christopher S. Ogilvy, and Ajith J. Thomas

OBJECTIVE

Middle meningeal artery embolization (MMAE) is an increasingly utilized approach for the treatment of chronic subdural hematomas (CSDHs). The course of morphological progression of CSDHs following MMAE is poorly understood. Herein, the authors aimed to describe these morphological changes and assess their prognostic significance for the outcomes on follow-up.

METHODS

A single-institution retrospective cohort study of CSDH cases treated by upfront MMAE, without prior or adjunctive surgical evacuation, was performed. Clinical outcomes, complications, and the need for rescue surgery on follow-up were recorded. Hematomas were categorized into 6 morphological subtypes. All baseline and follow-up head CT scans were assessed for CSDH structural appearance, density, and loculation. Changes in CSDH size were quantified via 3D reconstruction for volumetric measurement.
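
A minimal sketch of the kind of volumetric quantification described (hematoma volume approximated as segmented voxel count times voxel volume); the segmentation step and the function name are assumptions, not the authors' pipeline:

```python
# Illustrative sketch only: CSDH volume from a binary segmentation mask and voxel spacing.
import numpy as np

def hematoma_volume_ml(mask: np.ndarray, spacing_mm: tuple) -> float:
    """mask: 3D boolean array marking hematoma voxels; spacing_mm: (dx, dy, dz) in mm."""
    voxel_volume_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    return float(mask.sum()) * voxel_volume_mm3 / 1000.0  # mm^3 -> mL

# Toy example: compare baseline and follow-up masks against the >= 50% reduction criterion.
baseline_mask = np.zeros((64, 64, 32), dtype=bool); baseline_mask[20:40, 20:40, 10:20] = True
followup_mask = np.zeros((64, 64, 32), dtype=bool); followup_mask[25:35, 25:35, 12:18] = True
v0 = hematoma_volume_ml(baseline_mask, (0.5, 0.5, 5.0))
v1 = hematoma_volume_ml(followup_mask, (0.5, 0.5, 5.0))
print(f"baseline {v0:.1f} mL, follow-up {v1:.1f} mL, reduction {100 * (1 - v1 / v0):.0f}%")
```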

RESULTS

Overall, 52 CSDHs in 45 patients treated with upfront MMAE were identified. Hematomas were followed for a mean of 92.9 days. Volume decreased by ≥ 50% in 79.6% of the CSDHs. The overall rescue surgery rate was 9.6%. A sequence of morphological changes after MMAE was identified. Hematomas that diverged from this sequence (5.4%) all progressed toward treatment failure and required rescue surgery. The CSDHs were categorized into early, intermediate, and late stages based on the baseline morphological appearance. Progression from early to intermediate and then to late stage took an average of 12.7 and 30.0 days, respectively. The volume of early/intermediate- and late-stage hematomas decreased by ≥ 50% at a mean of 78.2 and 47.6 days after MMAE, respectively. Early- and intermediate-stage hematomas showed a trend toward more favorable outcomes compared with late-stage hematomas. The density of homogeneous hypodense hematomas (HSDHs) transiently increased immediately after MMAE (p < 0.001). HSDHs showed a marked decrease in density and volume 1 to 3 weeks after MMAE; the absence of this decrease indicated an eventual need for rescue surgery. In HSDHs, a baseline mean density of < 20 HU and a density lower than baseline by 1 month post-MMAE were predictors of favorable outcomes. Baseline hematoma volume, axial thickness, midline shift, and loculation were not correlated with MMAE outcomes. Loculated, trabecular, and laminar hematomas, which are known to have unfavorable surgical outcomes, had MMAE outcomes similar to those of other "surgical" hematomas.

CONCLUSIONS

This study is the first to describe the nature, sequence, and timing of the morphological changes of CSDHs after MMAE treatment, and it identifies structural features that can predict treatment outcomes.

Ayoub Dakson, Michelle Kameda-Smith, Michael D. Staudt, Pascal Lavergne, Serge Makarenko, Matthew E. Eagles, Huphy Ghayur, Ru Chen Guo, Alwalaa Althagafi, Jonathan Chainey, Charles J. Touchette, Cameron Elliott, Christian Iorio-Morin, Michael K. Tso, Ryan Greene, Laurence Bargone, and Sean D. Christie

OBJECTIVE

External ventricular drainage (EVD) catheters are associated with complications such as EVD catheter infection (ECI), intracranial hemorrhage (ICH), and suboptimal placement. The aim of this study was to investigate the rates of EVD catheter complications and their associated risk factor profiles in order to optimize the safety and accuracy of catheter insertion.

METHODS

A total of 348 patients with urgently placed EVD catheters were included as a part of a prospective multicenter observational cohort. Strict definitions were applied for each complication category.

RESULTS

The rates of misplacement, ECI/ventriculitis, and ICH were 38.6%, 12.2%, and 9.2%, respectively. Catheter misplacement was associated with midline shift (p = 0.002), operator experience (p = 0.031), and intracranial catheter length (p < 0.001). Although mostly asymptomatic, ICH occurred more often in patients receiving prophylactic low-molecular-weight heparin (LMWH) (p = 0.002) and in those who required catheter replacement (p = 0.026). Infectious complications (ECI/ventriculitis and suspected ECI) occurred more commonly in patients whose catheters were inserted at the bedside (p = 0.004) and in those with smaller incisions (≤ 1 cm) (p < 0.001). ECI/ventriculitis was not associated with preinsertion antibiotic prophylaxis (p = 0.421), catheter replacement (p = 0.118), or catheter tunneling length (p = 0.782).
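
As an illustration of how associations like these are typically tested (e.g., bedside vs operating room insertion and infectious complications as a 2 × 2 table), a minimal sketch with invented counts; the study's actual statistical methods are not detailed in the abstract:

```python
# Illustrative sketch only: testing an association between insertion setting and infection
# with Fisher's exact test on a 2x2 contingency table. Hypothetical counts, not study data.
from scipy.stats import fisher_exact

#                  infection   no infection
bedside         = [       18,           122]
operating_room  = [        6,           202]

odds_ratio, p_value = fisher_exact([bedside, operating_room])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```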

CONCLUSIONS

EVD-associated complications are common. These results suggest that insertion in the operating room, but not preoperative antibiotic prophylaxis, can help reduce the risk of infection. Although EVD-related ICH was associated with LMWH prophylaxis for deep vein thrombosis, it had no significant clinical manifestations in the majority of patients. Catheter misplacement was associated with operator level of training and midline shift. Information from this multicenter prospective cohort can be utilized to increase the safety profile of this common neurosurgical procedure.

Jason I. Liounakos, Asham Khan, Karen Eliahu, Jennifer Z. Mao, Christopher R. Good, John Pollina, Colin M. Haines, Jeffrey L. Gum, Thomas C. Schuler, Ehsan Jazini, Richard V. Chua, Eiman Shafa, Avery L. Buchholz, Martin H. Pham, Kornelis A. Poelstra, and Michael Y. Wang

OBJECTIVE

Robotics is a major area for research and development in spine surgery. The high accuracy of robot-assisted placement of thoracolumbar pedicle screws is documented in the literature. The authors present the largest case series to date evaluating 90-day complication, revision, and readmission rates for robot-assisted spine surgery using the current generation of robotic guidance systems.

METHODS

An analysis of a retrospective, multicenter database of open and minimally invasive thoracolumbar instrumented fusion surgeries using the Mazor X or Mazor X Stealth Edition robotic guidance systems was performed. Patients 18 years of age or older and undergoing primary or revision surgery for degenerative spinal conditions were included. Descriptive statistics were used to calculate rates of malpositioned screws requiring revision, as well as overall complication, revision, and readmission rates within 90 days.

RESULTS

In total, 799 surgical cases (Mazor X: 48.81%; Mazor X Stealth Edition: 51.19%) were evaluated, involving robot-assisted placement of 4838 pedicle screws. The overall intraoperative complication rate was 3.13%. No intraoperative implant-related complications were encountered. Postoperatively, 129 patients suffered a total of 146 complications by 90 days, representing an incidence of 16.1%. The rate of an unrecognized malpositioned screw resulting in a new postoperative radiculopathy requiring revision surgery was 0.63% (5 cases). Medical and pain-related complications unrelated to hardware placement accounted for the bulk of postoperative complications within 90 days. The overall surgical revision rate at 90 days was 6.63% with 7 implant-related revisions, representing an implant-related revision rate of 0.88%. The 90-day readmission rate was 7.13% with 2 implant-related readmissions, representing an implant-related readmission rate of 0.25% of cases.

CONCLUSIONS

The results of this multicenter case series and literature review suggest current-generation robotic guidance systems are associated with low rates of intraoperative and postoperative implant-related complications, revisions, and readmissions at 90 days. Future outcomes-based studies are necessary to evaluate complication, revision, and readmission rates compared to conventional surgery.

Xiaopeng Guo, Zihao Wang, Lu Gao, Wenbin Ma, Bing Xing, and Wei Lian

OBJECTIVE

Opioid-minimizing or nonopioid therapy using nonsteroidal antiinflammatory drugs (NSAIDs) or tramadol has been encouraged for pain management. This study aimed to examine the noninferiority of NSAIDs to tramadol for pain management following transsphenoidal surgery for pituitary adenomas in terms of analgesic efficacy, adverse events, and rescue opioid use.

METHODS

This was a randomized, single-center, double-blind noninferiority trial. Patients 18–70 years old with planned transsphenoidal surgery for pituitary adenomas were randomly assigned (in a 1:1 ratio) to receive NSAIDs (parecoxib injection followed by loxoprofen tablets) or tramadol (tramadol injection followed by tramadol tablets). The primary outcome was the pain score assessed by a visual analog scale (VAS) for 24 hours following surgery; the secondary outcomes were the VAS scores for 48 and 72 hours. Other prespecified outcomes included nausea, vomiting, dizziness, upset stomach, skin rash, peptic ulcer, gastrointestinal bleeding, and pethidine use to control breakthrough pain. Noninferiority of NSAIDs to tramadol was established if the upper limit of the 95% confidence interval (CI) of the VAS score difference was < 1 point and if the rate differences for adverse events and pethidine use were < 5%. Superiority of NSAIDs was assessed only when noninferiority was verified. All analyses were performed on an intention-to-treat basis.
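
A minimal sketch of the noninferiority decision rule for the primary outcome, using the reported 24-hour summary statistics and a plain two-sample normal approximation for the confidence interval (the authors' exact CI construction is not specified):

```python
# Illustrative sketch only: noninferiority check on the 24-hour VAS difference,
# margin of 1 point, using reported group means/SDs and a normal-approximation CI.
import numpy as np
from scipy import stats

n1, mean1, sd1 = 101, 2.6, 1.8   # NSAIDs group
n2, mean2, sd2 = 101, 3.5, 2.1   # tramadol group
margin = 1.0                      # prespecified noninferiority margin (VAS points)

diff = mean1 - mean2
se = np.sqrt(sd1**2 / n1 + sd2**2 / n2)
z = stats.norm.ppf(0.975)
lo, hi = diff - z * se, diff + z * se

print(f"difference = {diff:.1f}, 95% CI ({lo:.1f}, {hi:.1f})")
print("noninferior:", hi < margin, "| superior:", hi < 0)  # superiority assessed only if noninferior
```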

RESULTS

Two hundred two patients were enrolled between November 1, 2020, and May 31, 2021 (101 in the NSAIDs group, 101 in the tramadol group). Baseline characteristics between groups were well balanced. Mean VAS scores for 24 hours following transsphenoidal surgery were 2.6 ± 1.8 in the NSAIDs group and 3.5 ± 2.1 in the tramadol group (−0.9 difference, 95% CI −1.5 to −0.4; p value for noninferiority < 0.001, p value for superiority < 0.001). Noninferiority and superiority were also achieved for both secondary outcomes. VAS scores improved over time in both groups. Incidences of nausea (39.6% vs 61.4%, p = 0.002), vomiting (3.0% vs 42.6%, p < 0.001), and dizziness (12.9% vs 47.5%, p < 0.001) were significantly lower, while incidence of upset stomach (9.9% vs 2.0%, p = 0.017) was slightly higher in the NSAIDs group compared with the tramadol group. The percentage of opioid use was 4.0% in the NSAIDs group and 15.8% in the tramadol group (−11.8% difference, 95% CI −19.9% to −3.7%; p value for noninferiority < 0.001, p value for superiority = 0.005).

CONCLUSIONS

Compared with tramadol, NSAIDs significantly reduced acute pain following transsphenoidal surgery, caused fewer adverse events, and reduced rescue opioid use.

Pravesh S. Gadjradj, Nicholas V. R. Smeele, Mandy de Jong, Paul R. A. M. Depauw, Maurits W. van Tulder, Esther W. de Bekker-Grob, and Biswadjiet S. Harhangi

OBJECTIVE

Lumbar discectomy is a frequently performed procedure to treat sciatica caused by lumbar disc herniation. Multiple surgical techniques are available, and the popularity of minimally invasive techniques is increasing worldwide. Clinical outcomes between these techniques may not show any substantial differences. As lumbar discectomy is an elective procedure, patients’ own preferences play an important role in determining which procedure they will undergo. The aims of the current study were to determine the relative preference weights patients apply to various attributes of lumbar discectomy, determine whether patient preferences change after surgery, identify preference heterogeneity in choosing surgery for sciatica, and calculate patients’ willingness to pay for the other attributes.

METHODS

A discrete choice experiment (DCE) was conducted among patients with sciatica caused by lumbar disc herniation. A questionnaire was administered to patients before they underwent surgery and to an independent sample of patients who had already undergone surgery. In the DCE, patients chose between two surgical techniques, or opted out, across 12 choice sets with varying attribute levels: waiting time for surgery, out-of-pocket costs, size of the scar, need for general anesthesia, need for hospitalization, effect on leg pain, and duration of the recovery period.

RESULTS

A total of 287 patients were included in the DCE analysis. All attributes except scar size had a significant influence on the overall preferences of patients. The effect on leg pain was the most important characteristic in the decision for a surgical procedure (relative importance 44.8%), followed by the potential out-of-pocket costs for the procedure (28.8%), the waiting time (12.8%), the need for general anesthesia (7.5%), the need for hospitalization (4.3%), and the recovery period (1.8%). Preferences were independent of the scores on patient-reported outcome measures and of baseline characteristics. Three latent classes with specific preference patterns could be identified. Willingness to pay was highest for effectiveness on leg pain, with patients willing to pay €3133 for a treatment with 90% effectiveness instead of 70%.
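
As an illustration of how a willingness-to-pay figure like the €3133 above is typically derived from DCE coefficients (the ratio of the utility gain for an attribute change to the marginal utility of cost), a minimal sketch; the coefficient values are hypothetical and merely chosen to land near the reported figure:

```python
# Illustrative sketch only: willingness to pay (WTP) from discrete-choice model coefficients.
# WTP = -(utility change for the attribute) / (utility per euro of out-of-pocket cost).
def willingness_to_pay(beta_attribute_change: float, beta_cost_per_euro: float) -> float:
    return -beta_attribute_change / beta_cost_per_euro

beta_effect_90_vs_70 = 1.567   # hypothetical utility gain: 90% vs 70% effectiveness on leg pain
beta_cost = -0.0005            # hypothetical marginal utility per euro of cost
print(f"WTP = EUR {willingness_to_pay(beta_effect_90_vs_70, beta_cost):.0f}")
```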

CONCLUSIONS

The effect on leg pain is the most important factor for patients in deciding to undergo surgery for sciatica. Not all proposed advantages of minimally invasive spine surgery (e.g., smaller scar, no need for general anesthesia) are necessarily perceived as advantages by patients. Spine surgeons should propose surgical techniques for sciatica based not only on their own abilities and the proposed eligibility, but also on patient preferences, as part of shared decision making.