Impact of improved documentation on an academic neurosurgical practice

Clinical article


Object

Accuracy in documenting clinical care is becoming increasingly important; it can greatly affect the success of a neurosurgery department. As patient outcomes are being more rigorously monitored, inaccurate documentation of patient variables may present a distorted picture of the severity of illness (SOI) of the patients and adversely affect observed versus expected mortality ratios and hospital reimbursement. Just as accuracy of coding is important for generating professional revenue, accuracy of documentation is important for generating technical revenue. The aim of this study was to evaluate the impact of an educational intervention on the documentation of patient comorbidities as well as its impact on quality metrics and hospital margin per case.

Methods

All patients who were discharged from the Department of Neurosurgery of the Penn State Milton S. Hershey Medical Center between November 2009 and June 2012 were evaluated. An educational intervention to improve documentation was implemented and evaluated, and the next 16 months, starting in March 2011, were used for comparison with the previous 16 months in regard to All Patient Refined Diagnosis-Related Group (APR-DRG) weight, SOI, risk of mortality (ROM), case mix index (CMI), and margin per discharge.

Results

The APR-DRG weight was corrected from 2.123 ± 0.140 to 2.514 ± 0.224; the SOI was corrected from 1.8638 ± 0.0855 to 2.154 ± 0.130; the ROM was corrected from 1.5106 ± 0.0884 to 1.801 ± 0.117; and the CMI was corrected from 2.429 ± 0.153 to 2.825 ± 0.232, and as a result the average margin per discharge improved by 42.2%. The mean values are expressed ± SD throughout.

Conclusions

A simple educational intervention can have a significant impact on documentation accuracy, quality metrics, and revenue generation in an academic neurosurgery department.

Abbreviations used in this paper: APR-DRG = All Patient Refined Diagnosis-Related Group; CC = complications or comorbidities; CMI = case mix index; CMS = Centers for Medicare and Medicaid Services; HIS = Health Information Systems; ICD-9-CM = International Classification of Diseases, Ninth Revision, Clinical Modification; I-MR = individuals and moving range; LOS = length of stay; MCC = major complications and comorbidities; MS-DRG = Medicare Severity DRG; PPACA = Patient Protection and Affordable Care Act; ROM = risk of mortality; SOI = severity of illness.


It has been a century since Codman's innovation of the “End Result Idea and System.” As recounted by Iezzoni,16,17 in Codman's own words, it is “merely the common sense notion that every hospital should follow every patient it treats, long enough to determine whether or not the treatment has been successful, and then to inquire ‘if not, why not?’ with a view to prevent a similar failure in the future.” Codman was one of the first physicians to advocate what we now call “outcome measures.” The current emphasis by the Joint Commission on indicators of hospital performance is a continuance of the principles that the initial band of reformers, led by Codman, persistently advanced as a basis for the “standardization of hospitals.”12,18–20

Outcome measures have become an increasingly important factor for the current practice of physicians. With the signing of the Patient Protection and Affordable Care Act (PPACA) of 2010, the US health system increased the level of accountability of health care professionals for both the quality and efficiency of the care they provide.27 As the disparities in quality of care are becoming more apparent between providers, payers are increasing efforts to reduce these inequalities.21–23 The PPACA adds to these efforts by authorizing numerous value-based payment and delivery reforms that will fundamentally change the way in which medical care is delivered, evaluated, and paid for.24

The Centers for Medicare and Medicaid Services (CMS), the largest public payer, mandates the reporting of 2 sets of SCIP (Surgical Care Improvement Project) measures covering infection and venous thromboembolism. This is part of an initiative to advance transparency within the Medicare program. Hospitals are required to submit quarterly data, which are posted on the Hospital Compare website (http://www.hospitalcompare.hhs.gov/), to receive annual Medicare payment updates.25 The CMS has been publicly reporting facility-level care and performance rates for heart attack, heart failure, pneumonia, surgery, and patient satisfaction scores on its Hospital Compare website. The CMS is also in the process of developing a Physician Compare website. The goal is to monitor physician performance data, including quality, efficiency, and patient experience data, and to make it available to the public through the PQRS (Physician Quality Reporting System) initiative.26

In addition to these efforts, medical professional societies are moving toward investing resources in the development of data collection and quality recognition programs that more accurately reflect the care provided by their members, such as the N2QOD (National Neurosurgery Quality and Outcomes Database) for neurosurgeons.24 One concern is that the source of these outcome measures is the clinical documentation for coding that is completed for each patient. Unfortunately, clinical documentation is susceptible to errors; studies comparing departments within and between hospitals have demonstrated great variation.2,3

The evolution of documentation and coding has been driven by its use for billing. As practitioners in a surgical field, neurosurgeons focus more on surgical procedure codes for reimbursement and often neglect the documentation of comorbidities. Surgeons also tend to overlook comorbidities that they regard as part of the principal diagnosis; for example, they may not separately document coagulopathy in a patient admitted for a subdural hematoma who is taking an anticoagulant and has an elevated prothrombin time/international normalized ratio.

With the increase in demand for greater accountability and quality management in health care institutions, jurisdictions have released public report cards that compare outcomes across hospitals or practice groups. For example, the California OSHPD (Office of Statewide Health Planning and Development) publishes annual reports on risk-adjusted hospital outcomes for medical, surgical, and obstetric patients. The outcome indicators report on postoperative complications and postoperative length of stay (LOS) for cervical and lumbar discectomy.4 The report cards are created with the use of administrative data collected for administrative, financial, or managerial purposes.5 Risk adjustment is used to ensure that hospitals treating sicker patients are not unfairly penalized.6

As reimbursement becomes increasingly tied to quality, more neurosurgeons become hospital employees, and hospitals operate on narrower margins, a better understanding of documentation is increasingly important for a truer reflection of both the quality of care and the financial success of physicians and their institutions. A thorough understanding of the documentation and coding system, with careful attention to comorbidities, is essential for the proper coding of cases to determine appropriate reimbursement for services rendered. Proper documentation of the complexity of cases is also important for benchmarking quality measures, which shows how well a hospital compares with others providing "similar" care, and for service-line reporting, which shows how well a department uses its resources to deliver care to patients.1

Several studies have demonstrated that comorbidities and risk factors are undercoded in administrative data compared with patients' medical records.7–11 Underreporting of comorbidities in documentation makes patients appear less complex than they are. This can result in some hospitals being classified as having inappropriately high mortality rates because of insufficient allowance for the prevalence of comorbidities.5

One reason for physician inaccuracies in documentation is that few physicians have any formal training in proper documentation. Professional coders, who have studied coding and received certification, work in advisory roles to medical providers. There is also a belief among physicians that evaluation and management coding guidelines are clinically irrelevant and overly complex,13,14 which leads to poor understanding of them.

In evaluating the appropriate reimbursement for disease processes, the inclusion of comorbidities affects the reimbursement level because more complex cases require more resources for care. These comorbidities significantly affect the reimbursement for a diagnosis-related group (DRG). Therefore, inaccurate coding has a great impact on the reimbursement for services rendered. In an effort to improve documentation accuracy, the coding office was asked to aid us in developing a training program to improve the accuracy of documentation and coding in the department. The aim of our study was to evaluate the impact of this intervention on the documentation of patient comorbidities and its effect on the All Patient Refined DRG (APR-DRG), risk of mortality (ROM), severity of illness (SOI), case mix index (CMI), and, as a result, the margin per case.

Methods

Definition of Terms Used

Medicare Severity DRGs (MS-DRGs) are determined from the ICD-9 (International Classification of Diseases, Ninth Revision) code assignments, as selected by the Health Information Services coding staff, following review of chart documentation. The coding staff is formally trained in ICD-9 diagnosis/procedure coding guidelines and the application of those guidelines. Health Information Services coders are certified through AHIMA (American Health Information Management Association) as CCSs (Certified Coding Specialists). The ICD-9 codes are entered into coding software that then uses algorithms to assign the appropriate MS-DRG. The MS-DRGs are assigned using several criteria, including principal diagnosis, other diagnoses, presence or absence of major complications and comorbidities (MCC) and/or complications or comorbidities (CC), procedure codes (ICD-9; inpatients are not assigned CPT [Current Procedural Terminology] codes), sex of the patient, discharge status, and birth weight for neonates.
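The link between documented secondary diagnoses and the MS-DRG severity tier can be pictured with a toy sketch. The Python snippet below is purely illustrative and is not the CMS/3M grouper logic; the term lists and relative weights are hypothetical. It shows only the idea that a documented CC or MCC term moves a case into a higher-weighted variant of the same base MS-DRG.

```python
# Toy sketch only: NOT the actual CMS/3M MS-DRG grouper. Term lists and
# relative weights are hypothetical. The point is that documented secondary
# diagnoses decide whether a case lands in the "with MCC", "with CC", or
# "without CC/MCC" variant of a base MS-DRG.

MCC_TERMS = {"respiratory failure", "status epilepticus"}               # hypothetical MCC terms
CC_TERMS = {"coagulopathy", "hyponatremia", "acute blood loss anemia"}  # hypothetical CC terms

# Hypothetical relative weights for the three variants of one base MS-DRG
WEIGHTS = {"with MCC": 5.0, "with CC": 3.5, "without CC/MCC": 2.5}

def drg_variant(secondary_diagnoses):
    """Return the severity variant implied by the documented secondary diagnoses."""
    documented = {dx.lower() for dx in secondary_diagnoses}
    if documented & MCC_TERMS:
        return "with MCC"
    if documented & CC_TERMS:
        return "with CC"
    return "without CC/MCC"

# Coders may use only what is explicitly documented: "elevated INR" alone does
# not map to a comorbidity, whereas the specific term "coagulopathy" does.
print(drg_variant(["elevated INR"]))      # -> without CC/MCC
print(drg_variant(["coagulopathy"]))      # -> with CC
```

This mirrors the constraint described in the next paragraphs: without the exact terminology in the chart, the coder cannot assign the comorbidity, and the case defaults to the lowest-weighted variant.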

The data are gathered using documentation that is used for ICD-9 code assignment. These include the emergency department summary, admission history and physical, all progress notes, consults, discharge summary, physician orders, and operative report. Coders are not authorized to interpret any findings, lab values, or terminology other than what is explicitly documented in the chart. For example, if the chart contained the terms “mass effect with midline shift” (which cannot be coded as a comorbidity), the coders are not authorized to reinterpret those terms as brain compression (which can be coded as a comorbidity). The coders are authorized, however, to query the physician for coding clarification on items in the chart, such as notes, pathology reports, laboratory results, radiographic studies, and nursing notes.

The APR-DRG expands the basic DRG structure by adding patient differences. Several data elements are used to determine the APR-DRGs; the assignment is made with the aid of an algorithm developed, with the assistance of clinicians, by 3M Health Information Systems (HIS) and the NACHRI (National Association of Children's Hospitals and Related Institutions).15 The APR-DRG determination software requires input of several data elements, including principal diagnosis coded in ICD-9-CM, secondary diagnoses coded in ICD-9-CM, procedures coded in ICD-9-CM, age, sex, and discharge disposition. These data elements are combined on a patient-specific basis to determine the patient's SOI and ROM. The 4 subclasses (Levels I–IV) of both SOI and ROM are highly correlated for many conditions, but they often differ because they relate to distinct patient attributes. The APR-DRG in conjunction with the SOI subclass has higher utility in evaluating resource use, whereas the APR-DRG in conjunction with the ROM subclass has higher utility in evaluating patient mortality.15

The APR-DRG system was developed as an all-payer alternative to MS-DRGs. This system shifted the focus from facility characteristics to patient characteristics and provides a better predictive model for resource use and outcomes. The MS-DRGs used for neurosurgery have varying relative weights and reimbursement depending on the inclusion of CC and MCC, which are reflected in the documented SOI and ROM. For example, failure to code an MCC for craniotomy and endovascular intracranial procedures reduces reimbursement from $36,475.75 with an MCC to $16,336.56 without CC/MCC. Most patients undergoing craniotomy and/or endovascular treatment have comorbidities that require a higher level of care and carry greater risk; when these comorbidities are undocumented or improperly documented, that complexity is not reflected in the MS-DRG. Documentation must include specific terminology for disease processes to permit proper coding, because coders are not allowed to infer codes without the exact terminology.
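For a sense of scale, the reimbursement figures quoted above imply the per-case gap and hypothetical annual impact computed below; the annual case volume is an assumed number used only for illustration.

```python
# Simple arithmetic on the figures quoted above for craniotomy/endovascular
# MS-DRGs: reimbursement with an MCC documented vs. without any CC/MCC.
# The annual case volume is a hypothetical number chosen for illustration.
with_mcc = 36_475.75
without_cc_mcc = 16_336.56

per_case_gap = with_mcc - without_cc_mcc          # $20,139.19 per case
annual_cases = 50                                 # hypothetical volume
print(f"per-case gap: ${per_case_gap:,.2f}")
print(f"impact at {annual_cases} cases/yr: ${per_case_gap * annual_cases:,.2f}")
```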

The CMI is a measure of the relative cost or resources needed to treat the mix of patients in each licensed hospital during the calendar year. To calculate the CMI, MS-DRGs and their associated weights, which are assigned to each MS-DRG by the CMS, are used. Each MS-DRG has a numeric weight reflecting the national “average hospital resource consumption” by a patient in that MS-DRG, relative to that of all patients.
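As a minimal illustration of the CMI arithmetic described here, the index can be computed as the mean of the CMS-assigned MS-DRG relative weights across all discharges in the period; the weights in the sketch below are hypothetical.

```python
# Minimal sketch of the CMI calculation described above: the case mix index is
# the average CMS-assigned MS-DRG relative weight over all discharges in the
# period. The weights below are hypothetical, for illustration only.

def case_mix_index(drg_weights):
    """Mean MS-DRG relative weight across all discharges."""
    return sum(drg_weights) / len(drg_weights)

# One hypothetical relative weight per discharge
discharge_weights = [5.12, 1.03, 2.47, 3.88, 0.95]
print(f"CMI = {case_mix_index(discharge_weights):.3f}")   # CMI = 2.690
```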

Documentation Intervention

Starting in March 2011, the Neurosurgery Department at Penn State Hershey College of Medicine worked with the HIS coding department to educate our providers (faculty, residents, and advanced practice clinicians) on proper terms and techniques for accurate documentation. Two 45-minute presentations on the common pitfalls of documentation were given at our monthly departmental meetings (attendance required for all providers) in the first 2 months. These presentations covered the importance of accurate documentation, common documentation errors, terminology to aid in documentation, and easy-access hotlines for any questions regarding documentation. Also at these sessions, pocket cards listing the most commonly used and misused terms (Table 1) were distributed for reference to all providers (attending physicians, residents, physician assistants, and nurse practitioners). From the start of the intervention in March 2011, members of the coding department joined neurosurgery inpatient rounds with each provider once every 2 weeks. On rounds, the coding department members helped ensure that proper documentation terminology was being used and served as a resource for any questions that had arisen since the last visit. There were also frequent email communications from the coding department to our staff regarding proper documentation of clinical activity.

TABLE 1:

Substitution of documented words to make it more coder friendly*

Instead of Documenting… | Document…
midline shift, mass effect, increased ICP | brain compression &/or brain herniation
hypertensive crisis, hypertensive urgency | accelerated or malignant hypertension
respiratory distress/insufficiency (when in need of BIPAP or mechanical ventilation) | respiratory failure (this can be a PSI, Patient Safety Indicator, following surgery)
respiratory distress/insufficiency (when due to prolonged surgery, airway or facial edema, or airway protection following surgery or trauma, even when on BIPAP or mechanical ventilation; not a true respiratory failure) | pulmonary insufficiency
elevated INR, abnormal coag profile | coagulopathy
microbleeds in brain | small intracranial hemorrhages
left/right sided weakness | hemiparesis or hemiplegia
status | status epilepticus
brain swelling, vasogenic edema | cerebral or brain edema
acute mental status changes | encephalopathy
decreased or ↓ Na | hyponatremia
increased or ↑ Na (even if being induced with 3% NaCl gtt) | hypernatremia
decreased or ↓ H/H postoperative, or postoperative anemia | acute blood loss anemia
seizures | epilepsy
Chiari malformation | specify Type 1, 2, 3, or 4

BIPAP = bilevel positive airway pressure; coag = coagulation; gtt = drip; H/H = hemoglobin/hematocrit; ICP = intracranial pressure; INR = international normalized ratio.

Data Analysis

Coding for the 16-month period from November 2009 to February 2011 (preintervention) was compared with the following 16-month period, from March 2011 to June 2012 (postintervention). A preintervention data set of 2092 cases and a postintervention data set of 2057 cases were studied. Both sets included all patient discharges from the neurosurgery service during that period. The coders in the HIS Department at Penn State Hershey Medical Center coded the patient data and compared it against documented clinical activity. The neurosurgeons were not involved directly in the coding, so there was no change from November 2009 to June 2012 in the manner of coding of the patients' illnesses by HIS.

Data obtained were LOS, CMI, APR-DRG weight, SOI, ROM, and margin per case. The margin per case for each month was expressed as a ratio relative to the average of the preintervention period of the study. The payer mix was analyzed between the 2 periods to check for reimbursement changes. Descriptive statistics were used to summarize the sample characteristics. Statistical analyses were performed and figures were constructed using version 14 of the Minitab statistical software package (Minitab, Inc.), with a 2-tailed t-test. The mean values are expressed ± SD throughout.
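A rough sketch of the kind of pre/post comparison reported in Table 2 is shown below. It uses synthetic monthly values drawn around the published APR-DRG means and SDs (2.123 ± 0.140 vs 2.514 ± 0.224) rather than the study data, Python/SciPy rather than Minitab, and Welch's form of the 2-tailed t-test; it is intended only to make the analysis concrete, not to reproduce the authors' results.

```python
# Illustrative sketch only: synthetic monthly APR-DRG values drawn around the
# published means/SDs (not the study data), analyzed with SciPy rather than
# Minitab, using Welch's 2-tailed t test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre = rng.normal(loc=2.123, scale=0.140, size=16)    # 16 monthly means, Nov 2009-Feb 2011
post = rng.normal(loc=2.514, scale=0.224, size=16)   # 16 monthly means, Mar 2011-Jun 2012

# Two-sample, two-tailed t test (Welch's form, since the SDs differ)
t_stat, p_value = stats.ttest_ind(post, pre, equal_var=False)

# 95% CI for the difference in means, from the Welch degrees of freedom
diff = post.mean() - pre.mean()
se = np.sqrt(post.var(ddof=1) / 16 + pre.var(ddof=1) / 16)
df = (post.var(ddof=1) / 16 + pre.var(ddof=1) / 16) ** 2 / (
    (post.var(ddof=1) / 16) ** 2 / 15 + (pre.var(ddof=1) / 16) ** 2 / 15
)
half_width = stats.t.ppf(0.975, df) * se
print(f"difference = {diff:.3f}, t = {t_stat:.2f}, p = {p_value:.4f}, "
      f"95% CI = ({diff - half_width:.3f}, {diff + half_width:.3f})")
```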

Results

The 16-month period from November 2009 to February 2011 (preintervention) was compared with March 2011 to June 2012 (postintervention). A preintervention data set of 2092 cases and a postintervention data set of 2057 cases were studied. A summary of all the relevant factors investigated and the pre- and postintervention values are shown in Table 2.

TABLE 2:

Summary of results of coding intervention*

Variable | Preintervention | Postintervention | Difference | t-Test CI | p Value
LOS | 4.834 ± 0.613 | 4.934 ± 0.644 | −0.099375 | (−0.554141 to 0.355391) | 0.656
APR-DRG weight | 2.123 ± 0.140 | 2.514 ± 0.224 | 0.390625 | (0.254664–0.526586) | <0.001
ROM | 1.5106 ± 0.0884 | 1.801 ± 0.117 | 0.290625 | (0.215484–0.365766) | <0.001
SOI | 1.8638 ± 0.0855 | 2.154 ± 0.130 | 0.290000 | (0.210063–0.369937) | <0.001
CMI | 2.429 ± 0.153 | 2.825 ± 0.232 | 0.396250 | (0.253068–0.539432) | <0.001
margin per case | 100% ± 30.4% | 142.2% ± 26.4% | 42.2% | (0.2166–0.6282) | <0.001

The pre- and postintervention values are expressed as the mean ± SD.

The primary outcome was the comparison of the APR-DRG weight pre- and postintervention as a measure of documentation accuracy. The APR-DRG weight was calculated for each case, with a statistically significant increase from 2.123 preintervention to 2.514 postintervention (p < 0.001). There was also a statistically significant increase in SOI and ROM following the intervention. Figure 1 shows the monthly trend of the APR-DRG pre- and postintervention.

Fig. 1. An individual and moving range (I-MR) chart for APR-DRG, showing CIs and mean for the pre- and postintervention periods. LCL = lower control limit; UCL = upper control limit.
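The control limits shown in Figs. 1–3 are those of a standard I-MR chart. The sketch below illustrates the conventional individuals-chart arithmetic (mean ± 2.66 × mean moving range) on hypothetical monthly APR-DRG values; it is not the study data or the authors' Minitab output.

```python
# Conventional I-MR (individuals and moving range) control-limit arithmetic,
# applied to hypothetical monthly APR-DRG values (not the study data):
# individuals chart limits = mean +/- 2.66 * mean moving range.
import numpy as np

monthly_apr_drg = np.array([2.10, 2.25, 1.98, 2.14, 2.30, 2.05, 2.22, 2.17])  # hypothetical

moving_range = np.abs(np.diff(monthly_apr_drg))   # |x_i - x_(i-1)| between consecutive months
mr_bar = moving_range.mean()
center = monthly_apr_drg.mean()

ucl = center + 2.66 * mr_bar   # upper control limit (UCL)
lcl = center - 2.66 * mr_bar   # lower control limit (LCL)

print(f"mean = {center:.3f}, UCL = {ucl:.3f}, LCL = {lcl:.3f}")
```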

The CMI reflects the relative cost or resources needed to treat the mix of patients in each licensed hospital during the calendar year. The higher the CMI above 1.00, the lower the adjusted cost per patient per day. In our group, the CMI increased from 2.429 preintervention to 2.825 postintervention (p < 0.001), a statistically significant increase. Figure 2 shows the CMI pre- and postintervention. Table 3 shows the comparison of the payer mix between the pre- and postintervention periods, which remained stable. The number of cases varied by only 1.67%, a difference that did not reach statistical significance (p = 0.6772).

Fig. 2. An I-MR chart for CMI, showing CIs and mean for the pre- and postintervention periods.

TABLE 3:

Payer mix pre- and postintervention

5 Major Insurance Groups | Nov 2009–Feb 2011 | March 2011–June 2012
Medicare | 24.28% | 25.64%
Medical Assistance | 5.44% | 5.59%
Blue Cross/Highmark | 30.79% | 29.80%
managed care | 25.52% | 26.00%
all other | 13.96% | 12.96%
total | 100.00% | 100.00%

The margin per case for the Department of Neurosurgery increased by 42.2% postintervention (p < 0.001). Figure 3 shows the monthly trend in margin per case, demonstrating a steady and sustained improvement.

Fig. 3. An I-MR chart for profit margin per case, showing CIs and mean for the pre- and postintervention periods.

The LOS was 4.834 days preintervention and 4.934 days postintervention (p = 0.656); the difference was not statistically significant.

In evaluating the hospital (excluding the Department of Neurosurgery), the CMI for the preintervention period was 1.784 ± 0.057, and for the postintervention period it was 1.860 ± 0.054. In comparing the 2 periods, the hospital's increase in profit margin per case was only 40% of that achieved by the Department of Neurosurgery.

Discussion

In this study we investigated the impact of the implementation of an educational intervention on clinical documentation within the Department of Neurosurgery. Accurate documentation is important for appropriate coding of patient disease severity and for accurate assessment of outcomes. To properly reflect the APR-DRG, SOI, ROM, CMI, and margin per case, the comorbidities of the patients need to be accurately documented. All these data are obtained from the MS-DRG records.

Unfortunately, clinical coding is susceptible to errors, with great variation found between institutions.2,3 Neurosurgery has a high level of complexity, with many subspecialties and an array of procedures that differ subtly in anatomical location, mode of access, and use of auxiliary equipment. Furthermore, the wide range of complexity of the illnesses in the patients we treat is difficult to document without a clear understanding of the requirements for properly documenting these levels of care.

Historically, studies have demonstrated that comorbidities and risk factors are undercoded in administrative data compared with the patients' medical records.7–11 Quan et al.24 compared the coding of comorbidities between Canadian administrative discharge abstracts and the patients' medical charts. Their study showed that the rates of cerebrovascular disease, congestive heart failure, and diabetes with chronic complications according to the patients' charts were 6.1%, 10.7%, and 3.9%, whereas the rates of these comorbidities in administrative data were 2.8%, 9.4%, and 2.5%, respectively.

The lack of formal training of most physicians and advanced practice clinicians regarding these methods of documentation may contribute to the issue of undercoding. Unfortunately, many physicians hold the belief that the evaluation and management coding guidelines are clinically irrelevant and overly complex.13,14 It is essential for physicians and advanced practice clinicians to learn the nuances of documentation to record and reflect comorbidities properly. It is important to note that we are in no way suggesting inappropriate upcoding of disease severity. Rather, we are advocating accurate coding for all patients. Not accurately documenting comorbidities can lead to undercoding, making patients' illnesses appear less complex than they are. Misrepresenting the complexity of patients' illnesses directly affects the APR-DRG, ROM, SOI, and CMI. These measures have multiple uses: for benchmarking, which shows how well a hospital compares with others providing "similar" care; for service-line reporting, which shows how well a department uses its resources to deliver care to patients; and for reimbursement, which is based on the complexity of care.1

In the current study, we showed that a brief educational intervention led to an increase of 0.39 in APR-DRG weight, 0.29 in ROM, 0.29 in SOI, and 0.40 in CMI; as a result, there was a 42.2% increase in margin per case. This intervention enabled us to record more accurately the complexity and severity of illness of the patients treated by the department, resulting in increases in these measures and in margin per case that were sustained over time. These findings demonstrate the impact of this simple intervention.

Although it was important to have the coders join rounds every 2 weeks, we believe the most useful method of instruction was the provision of the reference cards for use as a quick guide while delivering patient care; further studies are required to determine which components of the intervention were most effective. In our view, the cost incurred by asking the coders to make presentations and to increase their involvement with the providers was far outweighed by the improved accuracy of the quality metrics and the improved profit margin.

A limitation of this study was that this was a single intervention at a single institution that lasted for 16 months, and the long-term effects of the intervention remain unknown. A future study with long-term outcomes could help determine if this intervention is adequate to maintain the improvement in documentation accuracy.

In an effort to examine other factors that might affect CMI and profit margin, we investigated the payer mix for the department between the 2 time periods and found no significant differences (Table 3). Another method of raising the margin is decreasing LOS; however, as shown in the results, the LOS was slightly longer (4.834 days preintervention vs 4.934 days postintervention), although this difference did not reach statistical significance. Accurate documentation was the main effort to improve quality and efficiency during the study period. Furthermore, there were no major changes in the Department of Neurosurgery during the pre- and postintervention analysis periods: no changes in faculty, midlevel staff, number of residents, referral patterns, or the number and types of cases that would affect the variety of cases or the type of practice.

To determine how the department compared with the rest of our institution, we evaluated the change in CMI and margin per case for the hospital (excluding the Department of Neurosurgery) between the 2 periods. Although the hospital also increased its CMI and profit margin per case between these periods, the Department of Neurosurgery increased its CMI by more than 5 times that of the hospital, and its profit margin by more than 2.5 times that of the hospital. Whereas other departments in the hospital had their own projects to improve quality and efficiency, the Department of Neurosurgery was the only one with a project on accurate coding at the time of the study. Evaluation of the hospital statistics further supports our finding that the intervention was responsible for the significant improvement in the APR-DRG, ROM, SOI, CMI, and profit margin of the department.

Of note, our health care system is in transition; with passage of the PPACA, quality is increasingly being used as a measure for comparing practices and for allocating resources. All proposed reimbursement models share an emphasis on quality of care. Improving the accuracy and validity of documentation will allow better use of quality measures for comparing hospitals, allocating resources, and, ultimately, improving patient care.

Conclusions

With stronger emphasis on quality measures and physicians working on ever-narrowing margins, proper documentation for the level of severity of cases is essential. A simple and brief intervention on proper documentation terminology for coding can be an effective method of significantly improving coding in an academic neurosurgery practice.

Acknowledgments

We acknowledge the help, support, and opportunity afforded to Dr. Zalatimo by the Council of State Neurosurgical Societies Resident Fellowship Program 2012–2013 in pursuing and completing this project. We also express our thanks to Pat Swetland, Shannon Durovick, and Jay Schoen for all their help and support.

Disclosure

The authors report no conflict of interest concerning the materials or methods used in this study or the findings specified in this paper.

Author contributions to the study and manuscript preparation include the following. Conception and design: Zalatimo, Harbaugh, Iantosca. Analysis and interpretation of data: Zalatimo, Ranasinghe, Iantosca. Drafting the article: Ranasinghe. Critically revising the article: all authors. Reviewed submitted version of manuscript: all authors. Approved the final version of the manuscript on behalf of all authors: Zalatimo. Statistical analysis: Zalatimo. Administrative/technical/material support: Zalatimo. Study supervision: Harbaugh, Iantosca.

This article contains some figures that are displayed in color online but in black-and-white in the print edition.

References

1. Agency for Healthcare Research and Quality: APR™-DRG Classification Software–Overview (http://www.ahrq.gov/qual/mortality/Hughessumm.pdf) [Accessed November 18, 2013]
2. Albertsen PC, Kamens EA: Variations in coding practices among Connecticut urologists for the Medicare population. Conn Med 54:508–511, 1990
3. Austin PC, Tu JV, Alter DA, Naylor CD: The impact of under coding of cardiac severity and comorbid diseases on the accuracy of hospital report cards. Med Care 43:801–809, 2005
4. Berwick DM: E.A. Codman and the rhetoric of battle: a commentary. Milbank Q 67:262–267, 1989
5. Best WR, Khuri SF, Phelan M, Hur K, Henderson WG, Demakis JG: Identifying patient preoperative risk factors and postoperative adverse events in administrative databases: results from the Department of Veterans Affairs National Surgical Quality Improvement Program. J Am Coll Surg 194:257–266, 2002
6. Birkmeyer JD, Siewers AE, Finlayson EV, Stukel TA, Lucas FL, Batista I: Hospital volume and surgical mortality in the United States. N Engl J Med 346:1128–1137, 2002
7. Birkmeyer JD, Stukel TA, Siewers AE, Goodney PP, Wennberg DE, Lucas FL: Surgeon volume and operative mortality in the United States. N Engl J Med 349:2117–2127, 2003
8. Centers for Medicare and Medicaid Services: Physician Quality Reporting System (http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/PQRS/index.html?redirect=/pqrs) [Accessed November 18, 2013]
9. Centers for Medicare and Medicaid Services: Reporting Hospital Quality Data for Annual Payment Update (RHQDAPU) (http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/downloads/HospitalFactSheetAP.pdf) [Accessed November 20, 2013]
10. Codman EA: The product of a hospital. Surg Gynecol Obstet 18:491–496, 1914
11. Codman EA: Report of the committee on the standardization of hospitals. Surg Gynecol Obstet 22:119–120, 1916
12. Daley J, Khuri SF, Henderson W, Hur K, Gibbs JO, Barbour G: Risk adjustment of the postoperative morbidity rate for the comparative assessment of the quality of surgical care: results of the National Veterans Affairs Surgical Risk Study. J Am Coll Surg 185:328–340, 1997
13. Donabedian A: The end results of health care: Ernest Codman's contribution to quality assessment and beyond. Milbank Q 67:233–267, 1989
14. Groman RF, Rubin KY: Neurosurgical practice and health care reform: moving toward quality-based health care delivery. Neurosurg Focus 34(1):E1, 2013
15. Hawker GA, Coyte PC, Wright JG, Paul JE, Bombardier C: Accuracy of administrative data for assessing outcomes after knee replacement surgery. J Clin Epidemiol 50:265–273, 1997
16. Iezzoni LI: The demand for documentation for Medicare payment. N Engl J Med 341:365–367, 1999
17. Iezzoni LI: Risk Adjustment for Measuring Health Care Outcomes. Ann Arbor, MI: Health Administration Press, 1994
18. Jameson SS, Reed MR: Payment by results and the surgeon: implications for current and future practice. Surgeon 6:133–135, 2008
19. Lasker RD, Marquis MS: The intensity of physicians' work in patient visits—implications for the coding of patient evaluation and management services. N Engl J Med 341:337–341, 1999
20. Lehmann RD: Joint commission sets agenda for change. Qual Rev Bull 13:148–150, 1987
21. Lembcke PA: Evolution of the medical audit. JAMA 199:543–550, 1967
22. Malenka DJ, McLerran D, Roos N, Fisher ES, Wennberg JE: Using administrative data to describe casemix: a comparison with the medical record. J Clin Epidemiol 47:1027–1032, 1994
23. Powell H, Lim LL, Heller RF: Accuracy of administrative data to assess comorbidity in patients with heart disease. An Australian perspective. J Clin Epidemiol 54:687–693, 2001
24. Quan H, Parsons GA, Ghali WA: Validity of information on comorbidity derived from ICD-9-CM administrative data. Med Care 40:675–685, 2002
25. Romano PS, Zach A, Luft HS, Rainwater J, Remy LL, Campa D: The California Hospital Outcomes Project: using administrative data to compare hospital performance. Jt Comm J Qual Improv 21:668–682, 1995
26. Sorrel AL: Patient Protection and Affordable Care Act update. J Med Assoc Ga 101:10–12, 2012
27. Turnipseed WD, Lund DP, Sollenberger D: Product line development: a strategy for clinical success in academic centers. Ann Surg 246:585–592, 2007


Article Information

Address correspondence to: Omar Zalatimo, M.D., 30 Hope Dr., C 110, Hershey, PA 17033. email: ozz101@gmail.com.

Please include this information when citing this paper: published online December 20, 2013; DOI: 10.3171/2013.11.JNS13852.

© AANS, except where prohibited by US copyright law.
