Nitin Agarwal, Michael D. White, Xiaoran Zhang, Nima Alan, Alp Ozpinar, David J. Salvetti, Zachary J. Tempel, David O. Okonkwo, Adam S. Kanter and D. Kojo Hamilton

OBJECTIVE

Stand-alone lateral lumbar interbody fusion (LLIF) is a useful minimally invasive approach for select spinal disorders, but implant subsidence may occur in up to 30% of patients. Previous studies have suggested that wider implants reduce the subsidence rate. This study aimed to evaluate whether a mismatch between endplate and implant area predicts the rate and grade of implant subsidence.

METHODS

The authors conducted a retrospective review of prospectively collected data on consecutive patients who underwent stand-alone LLIF between July 2008 and June 2015; 297 patients (623 surgical levels) met inclusion criteria. Imaging studies were examined to grade graft subsidence according to Marchi criteria. Thirty patients had radiographic evidence of implant subsidence. The endplates above and below the implant were measured.
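For orientation, the Marchi grading referenced above classifies subsidence by the percentage loss of postoperative disc height. The following sketch is an illustrative helper, not code from the study; the marchi_grade name is hypothetical, and the 0–24%, 25–49%, 50–74%, and 75–100% cut points are the commonly cited thresholds, assumed here:

    def marchi_grade(disc_height_loss_pct: float) -> str:
        """Classify cage subsidence by percentage loss of postoperative disc
        height, using the commonly cited Marchi cut points (assumed here)."""
        if disc_height_loss_pct < 25:
            return "grade 0 (0-24% loss)"
        elif disc_height_loss_pct < 50:
            return "grade I (25-49% loss)"
        elif disc_height_loss_pct < 75:
            return "grade II (50-74% loss)"
        else:
            return "grade III (75-100% loss)"

    # Example: a level that lost 40% of its postoperative disc height
    print(marchi_grade(40))  # grade I (25-49% loss)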

RESULTS

A total of 30 patients with implant subsidence were identified. Of these patients, 6 had Marchi grade 0, 4 had grade I, 12 had grade II, and 8 had grade III implant subsidence. There was no statistically significant correlation between the endplate-implant area mismatch and subsidence grade or incidence. There was also no correlation between endplate-implant width and length mismatch and subsidence grade or incidence. However, there was a strong correlation between the use of 18-mm-wide implants and the development of higher-grade subsidence necessitating surgery (p = 0.002). There was no significant association between the degree of mismatch or Marchi subsidence grade and the presence of postoperative radiculopathy. Of the 8 patients with 18-mm implants demonstrating radiographic subsidence, 5 (62.5%) required reoperation. Of the 22 patients with 22-mm implants demonstrating radiographic subsidence, 13 (59.1%) required reoperation.
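As a quick arithmetic cross-check of the reoperation figures above (a reader's calculation, not an analysis reported in the study), the proportions can be reconstructed from the quoted counts, and the two implant widths compared with a Fisher exact test:

    from scipy.stats import fisher_exact

    # Reoperation counts quoted in the abstract
    reop_18, total_18 = 5, 8     # 18-mm-wide implants with radiographic subsidence
    reop_22, total_22 = 13, 22   # 22-mm-wide implants with radiographic subsidence

    print(f"18 mm: {reop_18 / total_18:.1%}")   # 62.5%
    print(f"22 mm: {reop_22 / total_22:.1%}")   # 59.1%

    # Illustrative comparison only; this test is not part of the published analysis
    table = [[reop_18, total_18 - reop_18],
             [reop_22, total_22 - reop_22]]
    odds_ratio, p_value = fisher_exact(table)
    print(f"Fisher exact p = {p_value:.2f}")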

CONCLUSIONS

There was no correlation between endplate-implant area, width, or length mismatch and Marchi subsidence grade for stand-alone LLIF. There was also no correlation between either endplate-implant mismatch or Marchi subsidence grade and postoperative radiculopathy. The data do suggest that the use of 18-mm-wide implants in stand-alone LLIF may increase the risk of developing high-grade subsidence necessitating reoperation compared to the use of 22-mm-wide implants.

Lateral lumbar interbody fusion in the elderly: a 10-year experience

Presented at the 2018 AANS/CNS Joint Section on Disorders of the Spine and Peripheral Nerves

Nitin Agarwal, Andrew Faramand, Nima Alan, Zachary J. Tempel, D. Kojo Hamilton, David O. Okonkwo and Adam S. Kanter

OBJECTIVE

Elderly patients, often presenting with multiple medical comorbidities, are considered to be at increased risk of peri- and postoperative complications following spine surgery. Various minimally invasive surgical techniques have been developed and employed to treat an array of spinal conditions while minimizing complications. Lateral lumbar interbody fusion (LLIF) is one such approach. The authors describe clinical outcomes in patients over the age of 70 years following stand-alone LLIF.

METHODS

A retrospective query of a prospectively maintained database was performed for patients over the age of 70 years who underwent stand-alone LLIF. Patients with posterior segmental fixation and/or fusion were excluded. The preoperative and postoperative values for the Oswestry Disability Index (ODI) were analyzed to compare outcomes after intervention. Femoral neck T-scores were acquired from bone density scans and correlated with the incidence of graft subsidence.

RESULTS

Among the study cohort of 55 patients, the median age at the time of surgery was 74 years (range 70–87 years). Seventeen patients had at least 3 medical comorbidities at the time of surgery. Twenty-three patients underwent a 1-level, 14 a 2-level, and 18 a 3-level or greater stand-alone lateral fusion. The median estimated blood loss was 25 ml (range 5–280 ml). No statistically significant relationship was detected between the volume of blood loss and the number of operative levels. The median length of hospital stay was 2 days (range 1–4 days). No statistically significant relationship was observed between the length of hospital stay and age at the time of surgery. There was one intraoperative death secondary to cardiac arrest, for a mortality rate of 1.8%. One patient developed a transient femoral nerve injury. Five patients with symptomatic graft subsidence subsequently underwent posterior instrumentation. A femoral neck T-score below −1.0 correlated with a higher incidence of graft subsidence (p = 0.006). The mean ODI score 1 year postoperatively (31.1) was significantly lower than the mean preoperative ODI score (46.2; p = 0.003).
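The bone density finding lends itself to a simple preoperative screening rule. The sketch below is an illustrative reading of that finding, not a tool from the study; the subsidence_risk_flag name is hypothetical, and the category boundaries follow the conventional WHO T-score definitions of osteopenia and osteoporosis:

    def subsidence_risk_flag(femoral_neck_t_score: float) -> str:
        """Illustrative screening rule based on the finding that a femoral neck
        T-score below -1.0 correlated with graft subsidence (p = 0.006).
        Category boundaries follow WHO bone density definitions."""
        if femoral_neck_t_score <= -2.5:
            return "osteoporotic range: elevated subsidence risk"
        elif femoral_neck_t_score < -1.0:
            return "osteopenic range: elevated subsidence risk"
        return "normal bone density: lower subsidence risk"

    print(subsidence_risk_flag(-1.8))  # osteopenic range: elevated subsidence risk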

CONCLUSIONS

Stand-alone LLIF can be safely and effectively performed in the elderly population. Careful evaluation of preoperative bone density parameters should be employed to minimize risk of subsidence and need for additional surgery. Despite an association with increased comorbidities, age alone should not be a deterrent when considering stand-alone LLIF in the elderly population.

David J. Salvetti, Zachary J. Tempel, Ezequiel Goldschmidt, Nicole A. Colwell, Federico Angriman, David M. Panczykowski, Nitin Agarwal, Adam S. Kanter and David O. Okonkwo

OBJECTIVE

Nutritional deficiency negatively affects outcomes in many health conditions. In spine surgery, evidence linking preoperative nutritional deficiency to postoperative surgical site infection (SSI) has been limited to small retrospective studies. Authors of the current study analyzed a large consecutive cohort of patients who had undergone elective spine surgery to determine the relationship between a serum biomarker of nutritional status (preoperative prealbumin levels) and SSI.

METHODS

The authors conducted a retrospective review of the electronic medical charts of patients who had undergone posterior spinal surgeries and whose preoperative prealbumin level was available. Additional data pertinent to the risk of SSI were also collected. Patients who developed a postoperative SSI were identified, and risk factors for postoperative SSI were analyzed. Nutritional deficiency was defined as a preoperative serum prealbumin level ≤ 20 mg/dl.

RESULTS

Among a consecutive series of 387 patients who met the inclusion criteria, the infection rate for those with preoperative prealbumin ≤ 20 mg/dl was 17.8% (13/73), versus 4.8% (15/314) for those with preoperative prealbumin > 20 mg/dl. On univariate and multivariate analysis, a low preoperative prealbumin level was a risk factor for postoperative SSI, with a crude OR of 4.29 (p < 0.01) and an adjusted OR of 3.28 (p = 0.02). In addition, several previously known risk factors for infection, including diabetes, spinal fusion, and number of operative levels, were significantly associated with the development of an SSI.
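The crude odds ratio can be reconstructed directly from the counts quoted above. The sketch below shows the standard 2 × 2 calculation with a normal-approximation confidence interval; the study's exact figure (4.29) may differ slightly from this back-calculation depending on the modeling used:

    import math

    # 2 x 2 table from the abstract (prealbumin <= 20 mg/dl vs > 20 mg/dl)
    ssi_low, no_ssi_low = 13, 73 - 13      # nutritionally deficient group
    ssi_high, no_ssi_high = 15, 314 - 15   # normal prealbumin group

    odds_ratio = (ssi_low * no_ssi_high) / (no_ssi_low * ssi_high)

    # 95% CI via the usual log odds ratio standard error
    se = math.sqrt(1 / ssi_low + 1 / no_ssi_low + 1 / ssi_high + 1 / no_ssi_high)
    lower = math.exp(math.log(odds_ratio) - 1.96 * se)
    upper = math.exp(math.log(odds_ratio) + 1.96 * se)

    print(f"crude OR = {odds_ratio:.2f} (95% CI {lower:.2f}-{upper:.2f})")
    # roughly 4.3, in line with the crude OR of 4.29 reported above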

CONCLUSIONS

In this consecutive series, preoperative prealbumin levels, a serum biomarker of nutritional status, correlated with the risk of SSI in elective spine surgery. Prehabilitation before spine surgery, including strategies to improve nutritional status in patients with nutritional deficiencies, may increase value and improve spine care.

Nitin Agarwal, Prateek Agarwal, Ashley Querry, Anna Mazurkiewicz, Zachary J. Tempel, Robert M. Friedlander, Peter C. Gerszten, D. Kojo Hamilton, David O. Okonkwo and Adam S. Kanter

OBJECTIVE

Previous studies have demonstrated the efficacy of infection prevention protocols in reducing infection rates. This study investigated the effects of developing and implementing an infection prevention protocol augmented by increased physician awareness of spinal fusion surgical site infection (SSI) rates, as well as the resultant cost savings.

METHODS

A cohort clinical investigation over a 10-year period was performed at a single tertiary spine care academic institution. Preoperative infection control measures (chlorhexidine gluconate bathing, Staphylococcus aureus nasal screening and decolonization), followed by postoperative infection control measures (surgical dressing care), were implemented. After the implementation of these infection control measures, an awareness intervention was instituted in which all attending and resident neurosurgeons were informed of their individual, independently adjudicated spinal fusion surgery infection rates and rankings among their peers. During the course of these interventions, the overall infection rate was tracked, as well as the rates for those neurosurgeons who complied with the preoperative and postoperative infection control measures (protocol group) and those who did not (control group).

RESULTS

With the implementation of postoperative surgical dressing infection control measures and physician awareness, the postoperative spine surgery infection rate decreased by 45% from 3.8% to 2.1% (risk ratio 0.55; 95% CI 0.32–0.93; p = 0.03) for those in the protocol cohort, resulting in an estimated annual cost savings of $291,000. This reduction in infection rate was not observed for neurosurgeons in the control group, although the overall infection rate among all neurosurgeons decreased by 54% from 3.3% to 1.5% (risk ratio 0.46; 95% CI 0.28–0.73; p = 0.0013).
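As a check on how those percentages relate, the risk ratios and relative reductions can be recomputed from the reported rates alone (the confidence intervals require the underlying case counts, which the abstract does not give; the risk_ratio_and_reduction helper is illustrative):

    def risk_ratio_and_reduction(rate_after_pct: float, rate_before_pct: float):
        """Return the risk ratio and relative reduction implied by two rates."""
        rr = rate_after_pct / rate_before_pct
        return rr, 1 - rr

    # Protocol cohort: 3.8% -> 2.1%
    print(risk_ratio_and_reduction(2.1, 3.8))   # roughly (0.55, 0.45)

    # All neurosurgeons: 3.3% -> 1.5%
    print(risk_ratio_and_reduction(1.5, 3.3))   # roughly (0.45, 0.55); the reported
                                                # RR of 0.46 likely reflects unrounded rates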

CONCLUSIONS

A novel paradigm for spine surgery infection control, combined with physician awareness methods, resulted in significantly decreased SSI rates and an associated cost reduction. Thus, information sharing and physician engagement, as a supplement to formal infection control measures, can improve surgical outcomes and reduce costs.