Epilepsy is a chronic neurological disorder that affects 0.5–1% of the population. Up to one-third of patients will have incompletely controlled seizures or debilitating side effects of anticonvulsant medications. Although some of these patients may be candidates for resection, many are not. The desire to find alternative treatments for epilepsy has led to a resurgence of interest in the use of deep brain stimulation (DBS), which has been used quite successfully in movement disorders. Small pilot studies and open-label trials have yielded results that may support the use of DBS in selected patients with refractory seizures. Because of the diversity of regions involved with seizure initiation and propagation, a variety of targets for stimulation have been examined. Moreover, stimulation parameters such as amplitude, frequency, pulse duration, and continuous versus intermittent stimulation vary from one study to the next. More studies are necessary to determine if there is an appropriate population of seizure patients for DBS, the optimal target, and the most efficacious stimulation parameters.
Thomas L. Ellis and Andrew Stevens
Andrew C. Vivas, Steven W. Hwang and Joshua M. Pahys
Phrenic stimulators offer an alternative to standard mechanical ventilation as well as the potential for ventilator independence in select patients with chronic respiratory failure. Young patients (< 10 years old) with high cervical spinal cord injuries often develop paralytic scoliosis due to loss of muscle tone caudal to their spinal cord lesion. Growing rod systems allow for stabilization of spinal deformity while permitting continued growth of the spine and thoracic cavity. Magnetically controlled growing rods (MCGRs) offer the advantage of noninvasive expansion, as opposed to the operative expansion required in traditional growing rod systems. To the authors’ knowledge, this is the first reported case of MCGRs in a patient with a diaphragmatic pacemaker (DP). A 7-year-old boy with ventilator dependence after a high cervical spinal cord injury presented to the authors’ institution with paralytic scoliosis that progressed to > 120°. The patient had previously undergone insertion of phrenic nerve stimulators for diaphragmatic pacing. The decision was made to insert MCGRs bilaterally to stabilize his deformity, because the planned lengthening surgeries that are necessary with traditional growing rods would be poorly tolerated in this patient. The patient’s surgery and postoperative course were uneventful. The DP remained functional after insertion and lengthening of the MCGRs by using the external magnet. The DP had no effect on the expansion capability of the MCGRs. In conclusion, the MCGRs appear to be compatible with the DP. Further studies are needed to validate the long-term safety and compatibility of these 2 devices.
Sheeraz A. Qureshi, Steven McAnany, Vadim Goz, Steven M. Koehler and Andrew C. Hecht
In recent years, there has been increased interest in the use of cervical disc replacement (CDR) as an alternative to anterior cervical discectomy and fusion (ACDF). While ACDF is a proven intervention for patients with myelopathy or radiculopathy, it does have inherent limitations. Cervical disc replacement was designed to preserve motion, avoid the limitations of fusion, and theoretically allow for a quicker return to activity. A number of recently published systematic reviews and randomized controlled trials have demonstrated positive clinical results for CDR, but no studies have revealed which of the 2 treatment strategies is more cost-effective. The purpose of this study was to evaluate the cost-effectiveness of CDR and ACDF by using the power of decision analysis. Additionally, the authors aimed to identify the most critical factors affecting procedural cost and effectiveness and to define thresholds for durability and function to focus and guide future research.
The authors created a surgical decision model for the treatment of single-level cervical disc disease with associated radiculopathy. The literature was reviewed to identify possible outcomes and their likelihood following CDR and ACDF. Health state utility factors were determined from the literature and assigned to each possible outcome, and procedural effectiveness was expressed in units of quality-adjusted life years (QALYs). Using ICD-9 procedure codes and data from the Nationwide Inpatient Sample, the authors calculated the median cost of hospitalization by multiplying hospital charges by the hospital-specific cost-to-charge ratio. Gross physician costs were determined from the mean Medicare reimbursement for each Current Procedural Terminology (CPT) code. Uncertainty in both the cost and effectiveness estimates was assessed using sensitivity analysis.
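The hospitalization-cost step described above can be sketched as follows. This is a hypothetical illustration only: the admission records and cost-to-charge ratios below are invented for the example, not taken from the Nationwide Inpatient Sample.

```python
# Illustrative sketch: estimated hospitalization cost = hospital charges
# x hospital-specific cost-to-charge ratio, with the median taken across
# admissions. All figures below are invented for demonstration.
import statistics

admissions = [
    {"charges": 45_000, "cost_to_charge": 0.42},
    {"charges": 60_000, "cost_to_charge": 0.35},
    {"charges": 52_000, "cost_to_charge": 0.40},
]

costs = [a["charges"] * a["cost_to_charge"] for a in admissions]
median_cost = statistics.median(costs)
print(f"median hospitalization cost: ${median_cost:,.0f}")
```

The median is preferred over the mean here because hospital charge data are typically right-skewed by a small number of very expensive admissions.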
In the reference case, the model assumed a 20-year duration for the CDR prosthesis. Cervical disc replacement led to higher average QALYs gained at a lower cost to society if both strategies survived for 20 years ($3042/QALY for CDR vs $8760/QALY for ACDF). Sensitivity analysis revealed that CDR needed to survive at least 9.75 years to be considered a more cost-effective strategy than ACDF. Cervical disc replacement becomes an acceptable societal strategy as the prosthesis survival time approaches 11 years and the $50,000/QALY gained willingness-to-pay threshold is crossed. Sensitivity analysis also indicated that CDR must provide a utility state of at least 0.796 to be cost-effective.
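The cost-effectiveness comparison in the reference case reduces to a ratio of total cost to QALYs gained, judged against a willingness-to-pay threshold. The sketch below reproduces the reported $/QALY figures; the total-cost and QALY denominators are illustrative assumptions, not values reported in the study.

```python
# Hypothetical sketch of the cost-effectiveness comparison. The ratios
# ($3042/QALY for CDR, $8760/QALY for ACDF) match the reference case;
# the underlying totals and QALY counts are invented for illustration.
def cost_per_qaly(total_cost, qalys_gained):
    """Cost-effectiveness ratio in dollars per quality-adjusted life year."""
    return total_cost / qalys_gained

WILLINGNESS_TO_PAY = 50_000  # $/QALY societal threshold used in the study

cdr_ratio = cost_per_qaly(total_cost=30_420, qalys_gained=10.0)
acdf_ratio = cost_per_qaly(total_cost=87_600, qalys_gained=10.0)

for name, ratio in [("CDR", cdr_ratio), ("ACDF", acdf_ratio)]:
    verdict = "cost-effective" if ratio < WILLINGNESS_TO_PAY else "not cost-effective"
    print(f"{name}: ${ratio:,.0f}/QALY -> {verdict}")
```

Under this framework both strategies fall well below the $50,000/QALY threshold, which is why the sensitivity analysis on prosthesis survival time, rather than the reference-case ratio alone, drives the comparison between the two procedures.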
Both CDR and ACDF were shown to be cost-effective procedures in the reference case. Results of the sensitivity analysis indicated that CDR must remain functional for at least 14 years to establish greater cost-effectiveness than ACDF. Since the current literature has yet to demonstrate with certainty the actual durability and long-term functionality of CDR, future long-term studies are required to validate the present analysis.
Impreet Gill, Andrew G. Parrent and David A. Steven
Cranial nerve (CN) deficits following anterior temporal lobectomy (ATL) are an uncommon but well-recognized complication. The usual CNs implicated in post-ATL complications include the oculomotor, trochlear, and facial nerves. To the authors’ knowledge, injury to the trigeminal nerve leading to neuropathic pain has not been previously described in the literature. This paper presents 2 cases of trigeminal neuropathic pain following temporal lobe resections for pharmacoresistant epilepsy. The possible pathophysiological mechanisms are discussed and the microsurgical anatomy of surgically relevant structures is reviewed.
Daniel H. Fulkerson, Steven W. Hwang, Akash J. Patel and Andrew Jea
External orthosis is the historically accepted management of odontoid synchondrosis fractures; however, this conservative therapy carries significant complication and fracture nonunion rates among young children. The purpose of this study was to evaluate the authors' own experience in the context of the literature and to explore surgical fixation as a primary treatment for unstable fractures. The authors retrospectively reviewed 2 cases of unstable odontoid synchondrosis fractures treated at their institution; both showed radiographic progression of deformity and subsequently underwent an open surgical reduction and fusion. A literature review was conducted to compare the authors' management strategy with those in published data. External orthosis for treatment of odontoid synchondrosis fractures has a strong history of success. However, in the literature, patients treated with a halo orthosis had a 43.3% rate of complications and an 11.4% risk of nonunion. There are radiographic findings that suggest instability, such as severe angulation and displacement of the odontoid process. Both patients in the present report underwent successful fusion without complication, as documented on CT scans obtained 3 months after surgery. Given the high rate of fusion attained with conservative therapy, it is recommended for most synchondrosis fractures. However, there is a recognized subgroup of synchondrosis fractures with severe angulation (> 30°) and displacement suggestive of significant ligamentous injury. In these patients, surgical fixation may be a safe and efficacious alternative to halo orthosis as the primary treatment.
E. Andrew Stevens, Elizabeth Palavecino, Robert J. Sherertz, Zakariya Shihabi and Daniel E. Couture
Treatment of ventriculoperitoneal shunt infections frequently requires placement of an external ventricular drain (EVD). Surveillance specimens obtained from antibiotic-impregnated (AI) EVDs may be less likely to demonstrate bacterial growth, potentially resulting in undertreatment of an infection. The purpose of this study was to assess whether AI EVDs had any significant effect on bacterial culture results compared with nonantibiotic-impregnated (NAI) EVDs.
In vitro assays were performed using AI EVDs containing minocycline and rifampin (VentriClear II, Medtronic) and NAI EVD controls (Bioglide, Medtronic). The presence of antibiotics was evaluated via capillary electrophoresis of sterile saline drawn from AI and NAI EVDs after predefined incubation intervals. Antimicrobial activity was assessed by evaluating zones of inhibition created by the catheter aspirates on plates inoculated with a quality control strain of Staphylococcus epidermidis (American Type Culture Collection strain 12228). To determine the effects of cultures drawn through AI compared with NAI EVDs, the quality control strain was then incubated within 4 new AI and 4 new NAI EVDs for predefined intervals before being plated on culture media. Spread and streak plate culture results from each type of catheter were compared at each time interval.
Capillary electrophoresis showed that more minocycline than rifampin was eluted from the AI EVDs. Sterile saline samples incubated within the AI EVDs demonstrated zones of growth inhibition when placed on plates of S. epidermidis at all time intervals tested. No zones of inhibition were noted on NAI EVD control plates. When a standardized inoculum of S. epidermidis was drawn through AI and NAI EVDs, antimicrobial effects were observed after incubation in the AI EVD group only. Colony counting demonstrated that significantly fewer colonies resulted from samples drawn through AI compared with NAI EVDs at multiple time intervals. Similarly, streak plating yielded significantly more false-negative results from AI than from NAI EVDs at 2 time intervals.
The findings in the current study indicate that the risk of a false-negative culture result may be increased when a CSF sample is drawn through an AI catheter. In the management of a known shunt infection, a false-negative result from an EVD culture specimen may lead to an inappropriately short duration of antibiotic therapy. These data have significant clinical implications, particularly given the widespread use of AI drains and the current high rates of shunt reinfection after EVD use worldwide.
Guillermo Aldave, Daniel Hansen, Steven W. Hwang, Amee Moreno, Valentina Briceño and Andrew Jea
Tethered cord syndrome is the clinical manifestation of an abnormal stretch on the spinal cord, presumably causing mechanical injury, a compromised blood supply, and altered spinal cord metabolism. Tethered cord release is the standard treatment for tethered cord syndrome. However, direct untethering of the spinal cord carries potential risks, such as new neurological deficits from spinal cord injury, a CSF leak from opening the dura, and retethering of the spinal cord from normal scar formation after surgery. To avoid these risks, the authors applied spinal column shortening to children and transitional adults with primary and secondary tethered cord syndrome and report treatment outcomes. The authors' aim with this study was to determine the safety and efficacy of spinal column shortening for tethered cord syndrome by analyzing their experience with this surgical technique.
The authors retrospectively reviewed the demographic and procedural data of children and young adults who had undergone spinal column shortening for primary or secondary tethered cord syndrome.
Seven patients with tethered cord syndrome caused by myelomeningocele, lipomyelomeningocele, and transitional spinal lipoma were treated with spinal column shortening. One patient with less than 24 months of follow-up was excluded from further analysis. There were 3 males and 4 females; the average age at the time of surgery was 16 years (range 8–30 years). Clinical presentations for our patients included pain (in 5 patients), weakness (in 4 patients), and bowel/bladder dysfunction (in 4 patients). Spinal column osteotomy was most commonly performed at the L-1 level, with fusion between T-12 and L-2 using a pedicle screw-rod construct. Pedicle subtraction osteotomy was performed in 6 patients, and vertebral column resection was performed in 1 patient. The average follow-up period was 31 months (range 26–37 months). Computed tomography–based radiographic outcomes showed solid fusion and no instrumentation failure in all cases by the most recent follow-up. Five of 7 patients (71%) reported improvement in preoperative symptoms during the follow-up period. The mean differences between initial and most recent Scoliosis Research Society-22 (SRS-22) questionnaire and Oswestry Disability Index (ODI) scores were 0.26 and –13%, respectively; the minimum clinically important differences for the SRS-22 and ODI were assumed to be 0.4 and –12.8%, respectively.
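The comparison of mean score changes against minimum clinically important differences (MCIDs) can be made explicit as below. The figures come from the abstract; the variable names and comparison logic are my own illustration of the standard MCID convention (improvement is a score increase for the SRS-22 and a decrease for the ODI).

```python
# Hypothetical sketch: judging mean outcome changes against assumed MCIDs.
mean_change = {"SRS-22": 0.26, "ODI": -13.0}  # most recent minus initial score
mcid = {"SRS-22": 0.40, "ODI": -12.8}         # assumed thresholds

meets_mcid = {}
for measure, delta in mean_change.items():
    threshold = mcid[measure]
    # The comparison direction follows the sign of the threshold:
    # positive MCID means improvement is an increase, negative a decrease.
    meets_mcid[measure] = delta >= threshold if threshold > 0 else delta <= threshold

print(meets_mcid)
```

Under these assumptions the mean ODI change exceeds its MCID while the mean SRS-22 change falls just short of its threshold.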
Spinal column shortening seems to represent a safe and efficacious alternative to traditional untethering of the spinal cord for tethered cord syndrome.
Leonardo Rangel-Castilla, Steven W. Hwang, George Al-Shamy, Andrew Jea and Daniel J. Curry
The surgical treatment of refractory epilepsy has evolved as new innovations have been developed. Disconnective procedures such as hemispherectomy have been progressively refined; presently, hemispherotomy has replaced hemispherectomy to reduce complication rates while maintaining good seizure control. Several disconnective techniques have been described, including the Rasmussen, vertical, and lateral approaches. The lateral approach, or periinsular hemispherotomy, was derived from modifications of the functional hemispherectomy and involves removal of the mesial temporal structures, exposure of the atrium via the circular sulcus, transection of the internal capsule beneath the central sulcus, intraventricular callosotomy, and frontobasal disconnection. The purpose of this article is to describe and illustrate in detail the anatomy and operative technique of periinsular hemispherotomy, as well as to discuss the nuances and issues involved with this procedure.
Max K. Kole, David Steven, Andrew Kirk and Stephen P. Lownie
Andrew M. Garfinkle, Irena R. Danys, David A. Nicolle, Austin R. T. Colohan and Steven Brem
Terson's syndrome refers to the occurrence of vitreous hemorrhage with subarachnoid hemorrhage (SAH), usually due to a ruptured cerebral aneurysm. Although it is a well-described entity in the ophthalmological literature, it has been only rarely commented upon in the neurosurgical discussion of SAH.
Fundus findings are reported in a prospective study of 22 consecutive patients with a computerized tomography- or lumbar puncture-proven diagnosis of SAH. Six of these patients had intraocular hemorrhage on initial examination. In four patients vitreous hemorrhage was evident on presentation (six of eight eyes). Over the subsequent 12 days, vitreous hemorrhage developed in the remaining two patients (three of four eyes) due to breakthrough bleeding from the original subhyaloid hemorrhages.
The initial amount of intraocular hemorrhage did not correlate with the severity of SAH. Two of the six patients with intraocular hemorrhage died, whereas five of the 16 remaining SAH patients without intraocular hemorrhage died. Of the four survivors with intraocular hemorrhage, three showed gradual but significant improvement in their visual acuity by 6 months. The fourth underwent vitrectomy at 8 months after presentation and had a good visual result. With modern and aggressive medical and microsurgical management, Terson's syndrome should be recognized as an important reversible cause of blindness in patients surviving SAH.