Journal of Neurosurgery: Spine
Isaac O. Karikari, Christopher L. Gilchrist, Liufang Jing, David A. Alcorta, Jun Chen, William J. Richardson, Mostafa A. Gabr, Richard D. Bell, Michael J. Kelley, Carlos A. Bagley and Lori A. Setton
Chordoma cells can generate solid-like tumors in xenograft models that express some molecular characteristics of the parent tumor, including positivity for brachyury and cytokeratins. However, there is a dearth of molecular markers that relate to chordoma tumor growth, as well as the cell lines needed to advance treatment. The objective in this study was to isolate a novel primary chordoma cell source and analyze the characteristics of tumor growth in a mouse xenograft model for comparison with the established U-CH1 and U-CH2b cell lines.
Primary cells from a sacral chordoma, called “DVC-4,” were cultured alongside U-CH1 and U-CH2b cells for more than 20 passages and characterized for expression of CD24 and brachyury. While brachyury is believed essential for driving tumor formation, CD24 is associated with healthy nucleus pulposus cells. Each cell type was subcutaneously implanted in NOD/SCID/IL2Rγnull mice. The percentage of solid tumors formed, time to maximum tumor size, and immunostaining scores for CD24 and brachyury (intensity scores of 0–3, heterogeneity scores of 0–1) were reported and evaluated to test differences across groups.
The DVC-4 cells retained chordoma-like morphology in culture and exhibited CD24 and brachyury expression profiles in vitro that were similar to those for U-CH1 and U-CH2b. Both U-CH1 and DVC-4 cells grew tumors at rates that were faster than those for U-CH2b cells. Gross tumor developed at nearly every site (95%) injected with U-CH1 and at most sites (75%) injected with DVC-4. In contrast, U-CH2b cells produced grossly visible tumors in less than 50% of injected sites. Brachyury staining was similar among tumors derived from all 3 cell types and was intensely positive (scores of 2–3) in a majority of tissue sections. In contrast, differences in the pattern and intensity of staining for CD24 were noted among the 3 types of cell-derived tumors (p < 0.05, chi-square test), with evidence of intense and uniform staining in a majority of U-CH1 tumor sections (score of 3) and more than half of the DVC-4 tumor sections (scores of 2–3). In contrast, a majority of sections from U-CH2b cells stained modestly for CD24 (scores of 1–2) with a predominantly heterogeneous staining pattern.
This is the first report on xenografts generated from U-CH2b cells, in which low tumorigenicity was discovered despite evidence of chordoma-like characteristics in vitro. Tumors derived from the primary chordoma cells and from the U-CH1 cell line showed similarly intense staining for CD24, which may correspond to their similar potential to grow tumors. In contrast, U-CH2b tumors stained less intensely for CD24. These results suggest that markers in addition to brachyury, such as CD24, may be useful in distinguishing among chordoma cell types and their tumorigenicity in vivo.
Amit Jain, Hamid Hassanzadeh, Varun Puvanesarajah, Eric O. Klineberg, Daniel M. Sciubba, Michael P. Kelly, D. Kojo Hamilton, Virginie Lafage, Aaron J. Buckland, Peter G. Passias, Themistocles S. Protopsaltis, Renaud Lafage, Justin S. Smith, Christopher I. Shaffrey, Khaled M. Kebaish and the International Spine Study Group
Using 2 complication-reporting methods, the authors investigated the incidence of major medical complications and mortality in elderly patients after surgery for adult spinal deformity (ASD) during a 2-year follow-up period.
The authors queried a multicenter, prospective, surgeon-maintained database (SMD) to identify patients 65 years or older who underwent surgical correction of ASD from 2008 through 2014 and had a minimum 2 years of follow-up (n = 153). They also queried a Centers for Medicare & Medicaid Services claims database (MCD) for patients 65 years or older who underwent fusion of 8 or more vertebral levels from 2005 through 2012 (n = 3366). They calculated cumulative rates of the following complications during the first 6 weeks after surgery: cerebrovascular accident, congestive heart failure, deep venous thrombosis, myocardial infarction, pneumonia, and pulmonary embolism. Significance was set at p < 0.05.
During the perioperative period, rates of major medical complications were 5.9% for pneumonia, 4.1% for deep venous thrombosis, 3.2% for pulmonary embolism, 2.1% for cerebrovascular accident, 1.8% for myocardial infarction, and 1.0% for congestive heart failure. Mortality rates were 0.9% at 6 weeks and 1.8% at 2 years. When comparing the SMD with the MCD, there were no significant differences in the perioperative rates of major medical complications except pneumonia. Furthermore, there were no significant intergroup differences in the mortality rates at 6 weeks or 2 years. The SMD provided greater detail with respect to deformity characteristics and surgical variables than the MCD.
The incidence of most major medical complications in the elderly after surgery for ASD was similar between the SMD and the MCD and ranged from 1% for congestive heart failure to 5.9% for pneumonia. These complications data can be valuable for preoperative patient counseling and informed consent.
Christopher S. Bailey, Marcel F. Dvorak, Kenneth C. Thomas, Michael C. Boyd, Scott Paquette, Brian K. Kwon, John France, Kevin R. Gurr, Stewart I. Bailey and Charles G. Fisher
The authors compared the outcome of patients with thoracolumbar burst fractures treated with and without a thoracolumbosacral orthosis (TLSO).
As of June 2002, all consecutive patients satisfying the following inclusion criteria were considered eligible for this study: 1) the presence of an AO Classification Type A3 burst fracture between T-11 and L-3, 2) skeletal maturity and age < 60 years, 3) admission within 72 hours of injury, 4) initial kyphotic deformity < 35°, and 5) no neurological deficit. The study was designed as a multicenter prospective randomized clinical equivalence trial. The primary outcome measure was the score on the Roland-Morris Disability Questionnaire assessed at 3 months postinjury. Secondary outcomes were assessed through 2 years of follow-up, and these domains included pain, functional outcome, generic health-related quality of life, sagittal alignment, length of hospital stay, and complications. Patients in whom no orthosis was used were encouraged to ambulate immediately following randomization, maintaining "neutral spinal alignment" for 8 weeks. Patients in the TLSO group were weaned from the brace over a 2-week period beginning at 8 weeks.
Sixty-nine patients were followed to the primary outcome time point, and 47 were followed for up to 1 year. No significant difference was found between treatment groups for any outcome measure at any stage in the follow-up period. There were 4 failures requiring surgical intervention, 3 in the TLSO group and 1 in the non-TLSO group.
This interim analysis found equivalence between treatment with a TLSO and no orthosis for thoracolumbar AO Type A3 burst fractures. The influence of a brace on early pain control and function and on long-term 1- and 2-year outcomes remains to be determined. However, the authors contend that a thoracolumbar burst fracture, in the absence of an associated posterior ligamentous complex injury, is inherently a very stable injury and may not require a brace.
Presented at the 2009 Joint Spine Section Meeting
Charles A. Sansur, Davis L. Reames, Justin S. Smith, D. Kojo Hamilton, Sigurd H. Berven, Paul A. Broadstone, Theodore J. Choma, Michael James Goytan, Hilali H. Noordeen, Dennis Raymond Knapp Jr., Robert A. Hart, Reinhard D. Zeller, William F. Donaldson III, David W. Polly Jr., Joseph H. Perra, Oheneba Boachie-Adjei and Christopher I. Shaffrey
This is a retrospective review of 10,242 adults with degenerative spondylolisthesis (DS) and isthmic spondylolisthesis (IS) from the morbidity and mortality (M&M) index of the Scoliosis Research Society (SRS). This database was reviewed to assess complication incidence, and to identify factors that were associated with increased complication rates.
The SRS M&M database was queried to identify cases of DS and IS treated between 2004 and 2007. Complications were identified and analyzed based on age, surgical approach, spondylolisthesis type/grade, and history of previous surgery. Age was stratified into 2 categories: > 65 years and ≤ 65 years. Surgical approach was stratified into the following categories: decompression without fusion, anterior, anterior/posterior, posterior without instrumentation, posterior with instrumentation, and interbody fusion. Spondylolisthesis grades were divided into low-grade (Meyerding I and II) versus high-grade (Meyerding III, IV, and V) groups. Both univariate and multivariate analyses were performed.
In the 10,242 cases of DS and IS reported, there were 945 complications (9.2%) in 813 patients (7.9%). The most common complications were dural tears, wound infections, implant complications, and neurological complications (range 0.7%–2.1%). The mortality rate was 0.1%. Diagnosis of DS had a significantly higher complication rate (8.5%) when compared with IS (6.6%; p = 0.002). High-grade spondylolisthesis correlated strongly with a higher complication rate (22.9% vs 8.3%, p < 0.0001). Age > 65 years was associated with a significantly higher complication rate (p = 0.02). History of previous surgery and surgical approach were not significantly associated with higher complication rates. On multivariate analysis, only the grade of spondylolisthesis (low vs high) was in the final best-fit model of factors associated with the occurrence of complications (p < 0.0001).
The rate of total complications for treatment of DS and IS in this series was 9.2%. The total percentage of patients with complications was 7.9%. On univariate analysis, the complication rate was significantly higher in patients with high-grade spondylolisthesis, a diagnosis of DS, and in older patients. Surgical approach and history of previous surgery were not significantly correlated with increased complication rates. On multivariate analysis, only the grade of spondylolisthesis was significantly associated with the occurrence of complications.
Khoi D. Than, Paul Park, Kai-Ming Fu, Stacie Nguyen, Michael Y. Wang, Dean Chou, Pierce D. Nunley, Neel Anand, Richard G. Fessler, Christopher I. Shaffrey, Shay Bess, Behrooz A. Akbarnia, Vedat Deviren, Juan S. Uribe, Frank La Marca, Adam S. Kanter, David O. Okonkwo, Gregory M. Mundis Jr., Praveen V. Mummaneni and the International Spine Study Group
Minimally invasive surgery (MIS) techniques are increasingly used to treat adult spinal deformity. However, standard minimally invasive spinal deformity techniques have a more limited ability than traditional open surgery to restore sagittal balance and correct pelvic incidence–lumbar lordosis (PI-LL) mismatch. This study sought to compare "best" versus "worst" outcomes of MIS to identify variables that may predispose patients to postoperative success.
A retrospective review of minimally invasive spinal deformity surgery cases was performed to identify parameters in the 20% of patients who had the greatest improvement in Oswestry Disability Index (ODI) scores versus those in the 20% of patients who had the least improvement in ODI scores at 2 years' follow-up.
One hundred four patients met the inclusion criteria, and the top 20% of patients in terms of ODI improvement at 2 years (best group, 22 patients) were compared with the bottom 20% (worst group, 21 patients). There were no statistically significant differences in age, body mass index, pre- and postoperative Cobb angles, pelvic tilt, pelvic incidence, levels fused, operating room time, and blood loss between the best and worst groups. However, the mean preoperative ODI score was significantly higher (worse disability) at baseline in the group that had the greatest improvement in ODI score (58.2 vs 39.7, p < 0.001). There was no difference in preoperative PI-LL mismatch (12.8° best vs 19.5° worst, p = 0.298). The best group had significantly less postoperative sagittal vertical axis (SVA; 3.4 vs 6.9 cm, p = 0.043) and postoperative PI-LL mismatch (10.4° vs 19.4°, p = 0.027) than the worst group. The best group also had better postoperative visual analog scale back and leg pain scores (p = 0.001 and p = 0.046, respectively).
The authors recommend that spinal deformity surgeons using MIS techniques focus on correcting a patient's PI-LL mismatch to within 10° and restoring SVA to < 5 cm. Restoration of these parameters seems to determine which patients attain the greatest improvement in ODI outcomes, whereas the spines of patients who fare worst are not appropriately corrected and may be fused into a fixed sagittal plane deformity.
Paul Park, Michael Y. Wang, Virginie Lafage, Stacie Nguyen, John Ziewacz, David O. Okonkwo, Juan S. Uribe, Robert K. Eastlack, Neel Anand, Raqeeb Haque, Richard G. Fessler, Adam S. Kanter, Vedat Deviren, Frank La Marca, Justin S. Smith, Christopher I. Shaffrey, Gregory M. Mundis Jr. and Praveen V. Mummaneni
Minimally invasive surgery (MIS) techniques are becoming a more common means of treating adult spinal deformity (ASD). The aim of this study was to compare the hybrid (HYB) surgical approach, involving minimally invasive lateral interbody fusion with open posterior instrumented fusion, to the circumferential MIS (cMIS) approach to treat ASD.
The authors performed a retrospective, multicenter study utilizing data collected in 105 patients with ASD who were treated via MIS techniques. Criteria for inclusion were age older than 45 years, coronal Cobb angle greater than 20°, and a minimum of 1 year of follow-up. Patients were stratified into 2 groups: HYB (n = 62) and cMIS (n = 43).
The mean age was 60.7 years in the HYB group and 61.0 years in the cMIS group (p = 0.910). A mean of 3.6 interbody fusions were performed in the HYB group compared with a mean of 4.0 interbody fusions in the cMIS group (p = 0.086). Posterior fusion involved a mean of 6.9 levels in the HYB group and a mean of 5.1 levels in the cMIS group (p = 0.003). The mean follow-up was 31.3 months for the HYB group and 38.3 months for the cMIS group. The mean Oswestry Disability Index (ODI) score improved by 30.6 and 25.7, and the mean visual analog scale (VAS) scores for back/leg pain improved by 2.4/2.5 and 3.8/4.2 for the HYB and cMIS groups, respectively. There was no significant difference between groups with regard to ODI or VAS scores. For the HYB group, the lumbar coronal Cobb angle decreased by 13.5°, lumbar lordosis (LL) increased by 8.2°, sagittal vertical axis (SVA) decreased by 2.2 mm, and LL–pelvic incidence (LL-PI) mismatch decreased by 8.6°. For the cMIS group, the lumbar coronal Cobb angle decreased by 10.3°, LL improved by 3.0°, SVA increased by 2.1 mm, and LL-PI decreased by 2.2°. There were no significant differences in these radiographic parameters between groups. The complication rate, however, was higher in the HYB group (55%) than in the cMIS group (33%) (p = 0.024).
Both HYB and cMIS approaches resulted in clinical improvement, as evidenced by decreased ODI and VAS pain scores. While there was no significant difference in degree of radiographic correction between groups, the HYB group had greater absolute improvement in degree of lumbar coronal Cobb angle correction, increased LL, decreased SVA, and decreased LL-PI. The complication rate, however, was higher with the HYB approach than with the cMIS approach.
Zoher Ghogawala, Christopher I. Shaffrey, Anthony L. Asher, Robert F. Heary, Tanya Logvinenko, Neil R. Malhotra, Stephen J. Dante, R. John Hurlbert, Andrea F. Douglas, Subu N. Magge, Praveen V. Mummaneni, Joseph S. Cheng, Justin S. Smith, Michael G. Kaiser, Khalid M. Abbed, Daniel M. Sciubba and Daniel K. Resnick
There is significant practice variation and considerable uncertainty among payers and other major stakeholders as to whether many surgical treatments are effective in actual US spine practice. The aim of this study was to establish a multicenter cooperative research group and demonstrate the feasibility of developing a registry to assess the efficacy of common lumbar spinal procedures using prospectively collected patient-reported outcome measures.
An observational prospective cohort study was conducted at 13 US academic and community sites. Unselected patients undergoing lumbar discectomy or single-level fusion for spondylolisthesis were included. Patients completed the 36-item Short-Form Survey Instrument (SF-36), Oswestry Disability Index (ODI), and visual analog scale (VAS) questionnaires preoperatively and at 1, 3, 6, and 12 months postoperatively. Power analysis estimated a sample size of 160 patients: 125 patients with lumbar disc herniation, and 35 with lumbar spondylolisthesis. All patient data were entered into a secure Internet-based data management platform.
Of 249 patients screened, 198 were enrolled over 1 year. The median age of the patients was 45.0 years (49% female) for lumbar discectomy (n = 148) and 58.0 years (58% female) for lumbar spondylolisthesis (n = 50). At 30 days, 12 complications (6.1% of the study population) were identified. Ten patients (6.8%) with disc herniation and 1 (2%) with spondylolisthesis required reoperation. The overall follow-up rate for the collection of patient-reported outcome data over 1 year was 88.3%. At 30 days, both lumbar discectomy and single-level fusion procedures were associated with significant improvements in ODI, VAS, and SF-36 scores (p ≤ 0.0002), which persisted over the 1-year follow-up period (p < 0.0001). By the 1-year follow-up evaluation, more than 80% of patients in each cohort who were working preoperatively had returned to work.
It is feasible to build a national spine registry for the collection of high-quality prospective data to demonstrate the effectiveness of spinal procedures in actual practice. Clinical trial registration no.: 01220921 (ClinicalTrials.gov).