Editorial. Reflections on the first decade of neurosurgical science of practice: what has been accomplished; what ambitions remain to be fulfilled?

  • 1 Neuroscience and Levine Cancer Institutes, Atrium Health, Charlotte, North Carolina;
  • 2 Carolina Neurosurgery & Spine Associates, Charlotte, North Carolina;
  • 3 Department of Neurosurgery, Mayo Clinic, Rochester, Minnesota; and
  • 4 Department of Neurosurgery, College of Medicine, Pennsylvania State University, Hershey, Pennsylvania


The chief purpose of learning is to “discover the meaning of experience.”

Eduard Lindeman, American educational theorist

Almost 10 years ago, we outlined in Neurosurgical Focus an ambitious mission to develop a national competency in the “science of neurosurgical practice,” which, at its core, is a requirement that all physicians (as opposed to a small scientific elite) engage in scientific inquiry and quality improvement through the acquisition and analysis of practice data.1,2 Here we briefly review the conditions that led to the development of that ambition, the resulting national projects, and our thoughts about the future of cooperative data collection and management in neurosurgery.

Conditions for the Development of a National Competence in “The Science of Practice”

In the first decade of the current century, the specialty of neurosurgery experienced a collective awakening regarding the potential of information derived from daily practice to improve the safety and quality of patient care. This transformation was influenced by the convergence of a variety of observations, trends, and forces, many of which have been described previously:1

  • 1. A renewed interest in education and cognitive psychology research that emphasized the potential for deep conceptual learning and new knowledge generation through the “analysis of experience.”
  • 2. A recognition that high-performance “knowledge workers” (i.e., non–routine problem solvers) in a variety of professional domains habitually engage in a process of reflection on experience, identification of conspicuous knowledge gaps, and the generation of new insights.
  • 3. A reappraisal of medicine’s general role in our “knowledge economy,” with a recognition that physicians were often less engaged than other modern professionals in routinely using daily experience to facilitate individual and collective quality improvement.
  • 4. An increasing perception that traditional methods of evidence (i.e., knowledge) generation were failing our profession, particularly in procedural specialties, and that newer methods of investigation, focused on the end objectives of our therapies in “real world” settings (i.e., outcomes research), were more likely to result in more rapid and meaningful healthcare progress.4
  • 5. A dramatic societal migration from traditional incentives for expanding utilization of increasingly costly healthcare services to incentives for objective documentation of safety, quality, and cost-effectiveness (i.e., “value”) through collection and analysis of practice data.5
  • 6. An increased awareness of significant structural challenges related to the practical implementation of value-based approaches, including the realities that “quality” of care remained variably defined, that extant classification schemes failed to reflect critical clinical distinctions, and that medical “administrative databases” had substantial limitations as tools for empowering healthcare improvement.

By 2008, neurosurgeons across the country had achieved a shared awareness of the conditions for meaningful participation in our emerging value-based healthcare system and informatics society. Specifically, it was determined that success in this new environment would require adopting the skills necessary to critically analyze practice, determine opportunities for improvement, and generate new knowledge. These essential activities defined a “science of practice,” which it was hoped would “support a fundamental shift in professional culture, embedding quality improvement into the fabric of daily practice and harnessing the collective scientific potential of neurosurgeons in all practice settings.”1

This common awareness and purpose were ultimately manifested in the creation of our national cooperative data programs and associated curricula promoting widespread clinical data competencies, the largest of which, the Quality Outcomes Database (QOD) Spine registry, is retrospectively evaluated in this issue of Neurosurgical Focus.

QOD and the Science of Practice: Have We Achieved Our Ambitions?

We previously described in this journal a plan to “advance the quality of care and meet the research needs of a broad range of health care stakeholders” through the creation of “nationally coordinated practice data collection platforms, targeted support of scholarship and research in outcomes methodology, and educational programs to advance the understanding of practice science.”1

We report here that over the last 10 years, our QOD and related quality data efforts have resulted in some of the most successful and impactful cooperative endeavors in the history of our specialty.

We have successfully developed novel, national data systems, quality improvement tools, and an associated “quality culture” within neurosurgery. We have obtained essential credibility among important national healthcare quality stakeholders. Neurosurgeons in all practice environments are now engaged in collective clinical data projects related to the care of patients with spinal disorders, brain tumors, movement disorders, and cerebrovascular disease. These surgeons receive site-specific, risk-adjusted benchmarks for patient-reported outcomes and utilization. This information has been used to support advanced quality improvement projects, public and private insurer reporting, payer negotiations, and value-based contracting, along with evidence-based decision support. Approximately 200,000 patients have been enrolled across all registry programs to date. QOD projects have produced dozens of published manuscripts and hundreds of abstracts, and have allowed for the objective demonstration of the effectiveness and safety of some of our most common procedures in “real world” settings. Numerous educational programs now advance the understanding and importance of neurosurgical practice science.
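To illustrate what a site-specific, risk-adjusted benchmark might look like computationally, the minimal sketch below (in Python, using a simple logistic regression) compares a site’s observed outcome rate with the rate expected for its case mix, expressed as an observed-to-expected ratio. The field names, risk factors, and modeling choices are hypothetical illustrations only, not a description of the actual QOD risk-adjustment methodology.

```python
# Minimal sketch of risk-adjusted benchmarking (illustrative only; the
# outcome, field names, and model are hypothetical, not the QOD specification).
import pandas as pd
import statsmodels.api as sm

def observed_to_expected(registry: pd.DataFrame, site: str) -> float:
    """Compare one site's observed 90-day readmission count with the count
    predicted for its case mix by a registry-wide logistic model."""
    risk_factors = ["age", "asa_grade", "diabetes", "prior_surgery"]

    # Fit the risk model on the full registry.
    X = sm.add_constant(registry[risk_factors])
    results = sm.Logit(registry["readmitted_90d"], X).fit(disp=False)

    # Predict expected events for the site's own patients.
    site_rows = registry[registry["site_id"] == site]
    site_X = sm.add_constant(site_rows[risk_factors], has_constant="add")
    expected = results.predict(site_X).sum()
    observed = site_rows["readmitted_90d"].sum()
    return observed / expected  # >1.0 suggests worse than expected for case mix

# Example (hypothetical data frame): oe = observed_to_expected(df, "site_017")
```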

By most common measures of strategic effectiveness, these substantial achievements, gained within a relatively compressed timeframe on a national scale, would seem to constitute success. Despite our progress, however, we believe that there is much room to expand the scope and impact of our efforts. In particular, data collection remains cumbersome and overly manual; a lack of high-level audits and automated data collection methods has sometimes resulted in less than adequate data completeness; we have yet to optimize the application of collected data to continuous methods of improvement at the practice and individual surgeon level; and our use of the data platforms for high-quality prospective research remains in its infancy.

Neurosurgical Science of Practice: Ambitions for the Next Decade

Building on our own ability to “generate knowledge from experience,” the leadership of our national data programs has identified the following areas for improvement, growth, and maturation in the coming decade.

National Consensus Regarding Diagnostic and Procedural Definitions

For many clinical areas in neurosurgery (particularly spine surgery), we lack the ability to precisely and reproducibly define distinct and comparable patient cohorts. In particular, reliable systems for classifying patients based on structural, symptomatic, and possibly biological criteria need to be constructed and prospectively validated. In the absence of such consistent definitions, we remain handicapped in our collective ability to identify focused opportunities for improvement, compare the efficacy of various treatments, and make determinations regarding the relative experience of different patient populations. We propose here the development of multidisciplinary groups to help create meaningful definitional criteria for our most common sets of clinical conditions.
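As a purely illustrative example of what a precise, reproducible cohort definition could look like in machine-readable form, the sketch below encodes structural and symptomatic inclusion rules that could be applied identically at every participating site. The specific condition, criteria, and thresholds are assumptions for illustration, not proposed clinical standards.

```python
# Illustrative, machine-readable cohort definition (hypothetical criteria;
# not a proposed or validated clinical standard).
from dataclasses import dataclass

@dataclass
class Patient:
    spondylolisthesis_grade: int    # structural: Meyerding grade 0-4
    leg_pain_score: float           # symptomatic: numeric rating 0-10
    symptom_duration_weeks: int     # persistence of symptoms
    prior_fusion_at_level: bool     # prior surgery at the index level

def in_grade1_spondylolisthesis_cohort(p: Patient) -> bool:
    """Apply explicit structural and symptomatic inclusion criteria so that
    every site assembles the same comparable cohort."""
    return (
        p.spondylolisthesis_grade == 1        # structural criterion
        and p.leg_pain_score >= 4             # symptomatic criterion
        and p.symptom_duration_weeks >= 12    # persistence criterion
        and not p.prior_fusion_at_level       # exclusion criterion
    )
```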

Improved Data Collection Efficiencies

It is imperative that we continue to develop improved methods for automating the collection of clinical data relevant to our quality improvement and scientific processes. Such improved methods will enhance data quality, completeness, and quantity while reducing the costs associated with obtaining essential clinical data. We have already begun to make progress toward this objective through partnerships with various national “middleware” health information technology (HIT) enterprises. Our hope with respect to this aim is to achieve near-total automation of quality clinical data acquisition in our national data systems by the end of the decade.
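One plausible form of such automation, sketched below in Python, is retrieving structured patient-reported outcome data directly from an electronic health record through a standards-based interface such as HL7 FHIR. The endpoint URL, authorization token, observation code, and field names are placeholders for illustration; this is not a description of any existing QOD or middleware integration.

```python
# Sketch of automated clinical data capture from an EHR via a FHIR API.
# The base URL, bearer token, and observation code are placeholders only.
import requests

FHIR_BASE = "https://ehr.example.org/fhir"   # hypothetical endpoint
HEADERS = {
    "Authorization": "Bearer <token>",        # placeholder credential
    "Accept": "application/fhir+json",
}

def fetch_outcome_scores(patient_id: str) -> list[dict]:
    """Retrieve patient-reported outcome observations for one patient,
    rather than abstracting them manually from the chart."""
    resp = requests.get(
        f"{FHIR_BASE}/Observation",
        params={"patient": patient_id, "code": "12345-6"},  # placeholder code
        headers=HEADERS,
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()
    return [
        {
            "date": entry["resource"]["effectiveDateTime"],
            "score": entry["resource"]["valueQuantity"]["value"],
        }
        for entry in bundle.get("entry", [])
    ]
```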

Use of Our Data Platforms for Prospective Research

The regular use of registry platforms and similar data tools to promote and support prospective research was among the original goals of our national data programs but remains an elusive target. This is largely because clinical registries have commonly been conceptualized as ongoing, relatively stable data instruments designed for continuous improvement, as opposed to more versatile tools that can be serially applied in focused scientific projects. Furthermore, we have observed over the last several years that only subsets of registry participants are generally interested in focused prospective research questions (although many, if not most, remain interested in the retrospective use of data to generate new insights). As such, we propose here the creation of “vanguard” groups of registry participants who have the interest, talents, and resources to engage in time-limited, prospective research embedded within our existing and developing platforms, using methods similar to those proposed by Harbaugh3 in this issue of Neurosurgical Focus. We also propose the development of funding mechanisms (public and private) to support such research.

Enhancing the Value of Collected Data

It has been proposed that clinical data collection/analysis and applied quality improvement represent distinct intellectual and practical competencies.6 Because few individuals in healthcare possess both sets of capabilities, a common knowledge gap likely limits the potential of our clinical information systems to empower continuous quality improvement. Therefore, to allow collected clinical data to achieve their true applied value, we recommend the development of a national competency in comprehensive quality improvement. This goal envisions an expanded role for clinical registries, to include using registries to define opportunities for improvement and embedding insights from registry data within established continuous quality improvement programs. We have already begun work on this objective through partnerships with internationally regarded quality improvement groups, such as the Institute for Healthcare Improvement (IHI).
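As one illustration of how registry data might be embedded within an established continuous quality improvement method, the minimal sketch below computes control limits for a monthly complication rate in the form of a p-chart, a standard statistical process control tool used in IHI-style improvement work. The column names and the choice of metric are assumptions for illustration only.

```python
# Minimal p-chart sketch: flag months whose complication rate falls outside
# 3-sigma control limits derived from registry data (column names hypothetical).
import pandas as pd

def p_chart(monthly: pd.DataFrame) -> pd.DataFrame:
    """`monthly` has columns 'cases' and 'complications', one row per month."""
    p_bar = monthly["complications"].sum() / monthly["cases"].sum()
    sigma = (p_bar * (1 - p_bar) / monthly["cases"]) ** 0.5

    out = monthly.copy()
    out["rate"] = monthly["complications"] / monthly["cases"]
    out["ucl"] = (p_bar + 3 * sigma).clip(upper=1.0)  # upper control limit
    out["lcl"] = (p_bar - 3 * sigma).clip(lower=0.0)  # lower control limit
    out["signal"] = (out["rate"] > out["ucl"]) | (out["rate"] < out["lcl"])
    return out
```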

Leverage Strategic Partnerships to Achieve Synergies/Scale

The essence of practice science involves widespread and routine cooperation related to the sharing of data, experience, resources, and technical capabilities. Perhaps no other activity related to the acquisition and application of clinical data promises a greater return on investment than fostering multidisciplinary collaboration. Over the last several years, we have experienced an explosion of scientific activity within our original registry communities through the development of “neural networks” through which individuals from programs around the country collaborate in formal and organically generated research groups. Taking this concept one step further, many of our existing national data programs now involve cooperative data collection and sharing agreements with other medical specialties and societies. The NPA-SNIS alliance (between the NeuroPoint Alliance and the Society of NeuroInterventional Surgery) is a novel collaborative data collection platform designed to acquire data related to the care of patients with cerebrovascular disease in multiple specialty and practice settings. The American Spine Registry (ASR) is a newly created cooperative endeavor between the American Association of Neurological Surgeons and the American Academy of Orthopaedic Surgeons, designed to facilitate the collection of practice data from North American spine surgeons in all practice settings. These latter efforts were generated with the intent of creating synergies, scale, and efficiencies through the combination of unique resources and skills. We see intelligent cooperation with a variety of healthcare stakeholders as a critically important element of our development strategy in the coming years.

Conclusions

In summary, we previously offered our belief that “the evidence neurosurgeons need to improve care and shape the future of our profession is created in daily practice. Tremendous scientific and economic potential resides untapped within our routine clinical activities. The methods for realizing that potential now exist. The promise of those methods can only be fulfilled through concerted effort and organized action. Every neurosurgeon should embrace practice science as an essential component of modern neurosurgical practice and the creation of a sustainable health care system. By doing so, we will define the relevance of neurosurgical practice within the broader realm of medicine, surgery, and society.”1

Almost a decade after publishing those words, we remain fully committed to that vision. We would perhaps add that we have made great progress as a united and dedicated group of surgical practice scientists, but we have barely scratched the surface of what we can ultimately achieve to the benefit of our specialty and our patients.

Disclosures

The authors report no conflict of interest.

References

1. Asher AL, McCormick PC, Kondziolka D: Introduction: the science of practice: addressing the challenges of modern health care. Neurosurg Focus 34(1):Introduction, 2013
2. Berwick DM: Measuring surgical outcomes for improvement: was Codman wrong? JAMA 313:469–470, 2015
3. Harbaugh RE: How the science of practice will improve evidence-based care. Neurosurg Focus 48(5):E7, 2020
4. Porter ME: What is value in health care? N Engl J Med 363:2477–2481, 2010
5. Selden NR, Ghogawala Z, Harbaugh RE, Litvack ZN, McGirt MJ, Asher AL: The future of practice science: challenges and opportunities for neurosurgery. Neurosurg Focus 34(1):E8, 2013
6. Wallis CJD, Detsky AS, Fan E: Establishing the effectiveness of procedural interventions: the limited role of randomized trials. JAMA 320:2421–2422, 2018


Contributor Notes

Correspondence Anthony L. Asher: a.asher@cnsa.com.

DOI: 10.3171/2020.2.FOCUS20147.

