Excess Cost and Length of Stay Associated With Voluntary Patient Safety Event Reports in Hospitals

Andrew R. Paradis, MBA; Valerie T. Stewart, PhD; K. Bruce Bayley, PhD; Allen Brown; Andrew J. Bennett

This study estimates excess cost and length of stay associated with voluntary patient safety event reports at 3 hospitals. Voluntary patient safety event reporting has proliferated in hospitals in recent years, yet little is known about the cost of events captured by this type of system. Events captured in an electronic reporting system at 3 urban community hospitals in Portland, Oregon, are evaluated. Cost and length of stay are assessed by linking event reports to risk-adjusted administrative data. Hospital stays with an event report are 17% more costly and 22% longer than stays without events. Medication and treatment errors are the most expensive and most common events, representing 77% of all event types and 77% of added costs. Ninety percent of events result in no measurable harm. Patient safety events captured by voluntary event reporting reflect significant waste and inefficiency in hospital stays. (Am J Med Qual. 2009;24:53-60)

Voluntary event reporting systems have proliferated at hospitals around the country1 since the Institute of Medicine report To Err Is Human2 recommended them in an effort to improve patient safety. Previous research on patient safety–related cost and length of stay has focused on adverse events or injuries defined by some level of patient harm, which were most often captured via medical record review or electronic surveillance.3-8 Voluntary reporting systems capture a different spectrum of events than those captured by medical record review or electronic surveillance,9 and most of these events do not reflect acute patient harm.10 The events captured by voluntary event reporting systems frequently have been described as latent errors, near misses, or unsound practices that have the potential to cause future patient harm.11-13 As a result, costs associated with these events may reflect waste and inefficiency from poor patient safety practices more than harm or injury. Little is known about the additional cost and length of stay associated with hospital stays that produce voluntary patient safety event reports.

Voluntary event reporting systems vary widely in their purpose and design. Some national voluntary systems focus on certain events or specific outcomes, such as hospital-acquired infections.14 For the purposes of this study, we consider only general-purpose voluntary systems that capture a broad range of unsafe conditions, events, and situations. This type of system can be purchased from several national vendors15 or can be developed internally.16,17

We used a multivariate regression model with case matching, risk adjustment, and log transformation of highly skewed dependent variables to estimate the excess cost and length of stay associated with voluntary event reports. This study represents a methodological improvement on the sole previous study18 that has examined the costs associated with voluntary event reports.

Keywords: patient safety; cost; length of stay; voluntary reporting

AUTHORS’ NOTE: At the time of the study, all authors were affiliated with the Center for Outcomes Research and Education, Providence Health and Services, Portland, Oregon. Mr Paradis is now with Ingenix, Eden Prairie, Minnesota. The authors disclosed no conflicts of interest. Corresponding author: Andrew R. Paradis, MBA, Ingenix, 2525 Lake Park Blvd, Salt Lake City, UT 84120 ([email protected]). This study was funded in part by contract 290-00-0018 (task order 11) from the Agency for Healthcare Research and Quality. The remainder was funded by Providence Health and Services. American Journal of Medical Quality, Vol. 24, No. 1, Jan/Feb 2009 DOI: 10.1177/1062860608327610 Copyright © 2009 by the American College of Medical Quality


Table 1
Distribution of Event Types and Outcomes

                                        Outcome Distribution, %
Event Type       % of Event Types   No Incident   No Harm   Harm   Death
Medication              38               20          70       10     …
Treatment               39               36          54        9     0.3
Fall                     9                …          81       19     …
Equipment                5               41          49       10     …
Behavioral               5               56          35        9     0.4
Loss/exposure            5               59          36        5     …
Overall                  …               30          62       10     0.1

METHODS

Data Sources

The Center for Outcomes Research and Education analyzed data from 3 Providence Health and Services community hospitals in the Portland, Oregon, metropolitan area. Providence St Vincent Medical Center operates 451 beds, Providence Portland Medical Center operates 483 beds, and Providence Milwaukie Hospital operates 77 beds. Data included all 123 281 discharges in the study period from April 1, 2002, through April 30, 2004. The study period was selected because it included the most recent administrative data available at the time this analysis was initially conducted. We extracted and linked data from 2 routinely available data sources in hospitals. One source was an administrative database, and the other was a database of voluntary patient safety event reports. A patient safety report was provided when any event occurred that deviated from routine or standard care. A report could also be generated when an event or process placed a patient in an unplanned risky situation. Information about the individual submitting a report was not collected, but anecdotal evidence indicates that most reports were completed by nurses. Use of this system began at the study hospitals in May 2001 using machine-readable forms; an online option was added in 2003. The reporting system was developed as part of efforts to encourage a culture of patient safety.


The administrative database, developed by CareScience Inc19 (a benchmarking vendor), contained actual cost, actual length of stay, age, sex, payer, diagnosis related group, predicted cost, and predicted stay for each hospitalization. A patient with multiple hospitalizations during the study period would be included once for each hospitalization. In the CareScience database, cost is derived by applying cost-to-charge ratios to patient charge data. This database uses proprietary diagnosis-specific risk-adjustment models calculated from a client database of more than 200 hospitals representing more than 4 million discharges. The risk models use variables for chronic diseases, comorbidities, principal diagnosis, major procedures, urgency of the admission, age, sex, race/ethnicity, median household income in the patient zip code, relative travel distance to the facility, admission source, and transfer status to provide patient-specific estimates of expected cost and length of stay.

The voluntary event report database contained 29 019 submissions from the 3 hospitals for the study period. The submissions fell into the following event categories, referred to as types: medication errors, patient falls, treatment events, equipment problems, behavioral issues, and loss/exposure events (Table 1). The hospitals in our study had a reporting rate of about 60 per 1000 patient-days, higher than the median of 35 per 1000 patient-days reported by Milch et al.10 However, this difference is not surprising because reporting rates varied widely across hospitals in that study. In addition to an event type, each report was assigned an outcome using 14 categories that described the event’s potential effect on the patient, based on the National Coordinating Council for Medication Error Reporting and Prevention scale20 or, for falls, the National Center for Nursing Quality scale.21 An outcome could be categorized as “no incident,” “error/no harm,” “error/harm,” or “error/death” (Table 1). Each report was reviewed and validated by the manager of the department in which the event occurred and by hospital quality management personnel.

The final data set contained 15 851 encounters that were linked to a voluntary event report. A total of 10 352 event reports could not be linked with the CareScience data because of invalid patient identification, and another 2816 reports contained no patient identification (Figure 1). However, linked and unlinked event reports had similar distributions of types and outcomes.
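As a rough illustration of how the 2 sources can be combined, the sketch below joins event reports to administrative encounters on an encounter identifier, keeps only the first report per encounter (as in Figure 1), and derives cost by applying a cost-to-charge ratio to billed charges. This is not the authors' code (the published analysis was performed in SPSS), and all column names (encounter_id, charges, cost_to_charge_ratio, report_date, event_type) are hypothetical.

```python
# Hypothetical sketch: link event reports to administrative encounters and derive
# cost from charges with a cost-to-charge ratio. Column names are invented.
import pandas as pd

def link_reports(admin: pd.DataFrame, reports: pd.DataFrame) -> pd.DataFrame:
    # Derive cost by applying a cost-to-charge ratio to the billed charges.
    admin = admin.assign(cost=admin["charges"] * admin["cost_to_charge_ratio"])

    # Keep only the first report per encounter, mirroring the handling in Figure 1.
    first_reports = (
        reports.dropna(subset=["encounter_id"])
               .sort_values("report_date")
               .drop_duplicates(subset="encounter_id", keep="first")
    )

    # Left join: encounters without a report keep NaN in the report columns.
    linked = admin.merge(first_reports, on="encounter_id", how="left")
    linked["has_event"] = linked["event_type"].notna()
    return linked
```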

Case Matching

Patient encounters resulting in voluntary event reports were matched against 1 to 4 controls using facility, initial department, diagnosis related group, sex, and age (±10 years). This procedure follows the method used by Zhan and Miller8 and was performed to create a data set of control cases similar to the patient stays that produced a voluntary event report. The goal of this procedure was to reduce statistical confounding by controlling for differences in hospital processes that might influence the likelihood of an error and of the event being reported. Matching was performed without replacement so that each case was matched to a different control. Of the 15 851 encounters linked to an event report, 11 568 were successfully matched with at least 1 control case. The matched cases and controls represented the distribution of patient types found in the overall hospital patient population in all areas other than obstetrics and newborns, which had a small number of event reports relative to their large proportion of hospital volume. This process is shown in Figure 1.
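A minimal sketch of this matching procedure is shown below. It assumes pandas DataFrames with hypothetical column names (encounter_id, facility, department, drg, sex, age, and a has_event flag); the study's matching was performed as part of the SPSS analysis and may have differed in detail.

```python
# Illustrative 1-to-4 case-control matching without replacement on facility,
# initial department, DRG, sex, and age within ±10 years. Column names are
# hypothetical placeholders for the administrative extract.
import pandas as pd

def match_controls(encounters: pd.DataFrame, max_controls: int = 4,
                   age_window: int = 10) -> pd.DataFrame:
    """Match each event case to 1-4 controls without replacement."""
    is_event = encounters["has_event"].astype(bool)
    cases = encounters[is_event]
    controls = encounters[~is_event].copy()
    controls["used"] = False

    matches = []
    for _, case in cases.iterrows():
        eligible = controls[
            (~controls["used"])
            & (controls["facility"] == case["facility"])
            & (controls["department"] == case["department"])
            & (controls["drg"] == case["drg"])
            & (controls["sex"] == case["sex"])
            & (controls["age"].sub(case["age"]).abs() <= age_window)
        ]
        chosen = eligible.head(max_controls)
        if chosen.empty:
            continue  # cases with no eligible control drop out of the matched set
        controls.loc[chosen.index, "used"] = True  # matching without replacement
        for ctrl_id in chosen["encounter_id"]:
            matches.append({"case_id": case["encounter_id"], "control_id": ctrl_id})
    return pd.DataFrame(matches)
```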

Multivariate Modeling

To isolate the effect of an event from patient characteristics that influence cost and length of stay, we initially constructed 2 linear regression models. A cost model used the logarithm of cost as the dependent variable and included as independent variables the logarithm of expected cost, the logarithm of expected length of stay, age, sex, payer, a surgery indicator variable, and a dummy variable indicating that an event was reported for that patient encounter, as well as interaction terms for payer, the logarithm of expected cost, and the logarithm of expected length of stay. We also constructed a model for length of stay as a dependent variable using the same independent variables as in the cost model. In these 2 regression models, the coefficient assigned to the event indicator variable was interpreted as the increase in cost or length of stay associated with an event, while accounting for differences in patient characteristics.

To model the cost or length of stay of a particular event type or outcome, we replaced the event dummy variable with dummy variables for each event type or outcome category. This yielded a total of 8 regression equations: 2 overall cost and length of stay equations and 2 equations each for cost and length of stay by (1) type, (2) outcome, and (3) type and outcome (Figures 2 and 3). Significant differences between types and/or outcomes were identified by examining the 95% confidence interval (CI) around the parameter estimate for the respective event dummy variable. This approach is similar to a 2-tailed t test. Relative differences are shown in Figures 2 and 3.

Using the logarithm of cost or length of stay was necessary to ensure that our models satisfied the assumptions of linear regression.22,23 As a result of this transformation, the coefficient of the event indicator variable estimates the logarithm of the proportional or percentage change in cost or length of stay and must be transformed to be more easily interpreted.24 The transformation took the antilogarithm of the event dummy coefficient, providing the proportional change in cost or length of stay given an event report. The result was then multiplied by the median cost or length of stay of nonevent patient encounters to provide a “per event” cost or excess days resulting from an event. After matching, the per-event estimate was multiplied by the total number of events in the data set to calculate overall total cost and days. All analyses were performed using SPSS 13.0 (SPSS Inc, Chicago, Illinois).
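The sketch below illustrates the cost model and the retransformation to per-event dollars under stated assumptions: hypothetical column names, the payer, surgery, and interaction terms omitted for brevity, and the proportional increase read as exp(beta) − 1. The published models were fit in SPSS 13.0, so this is an illustration of the approach rather than the authors' implementation.

```python
# Illustrative log-cost regression with an event-report dummy, followed by the
# retransformation to excess dollars per event. Column names are hypothetical and
# the payer, surgery indicator, and interaction terms are omitted for brevity.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def excess_cost_per_event(df: pd.DataFrame) -> float:
    df = df.copy()
    df["log_cost"] = np.log(df["cost"])
    df["log_exp_cost"] = np.log(df["expected_cost"])
    df["log_exp_los"] = np.log(df["expected_los"])
    df["has_event"] = df["has_event"].astype(int)

    model = smf.ols(
        "log_cost ~ log_exp_cost + log_exp_los + age + C(sex) + has_event",
        data=df,
    ).fit()

    beta = model.params["has_event"]        # log-scale effect of an event report
    pct_increase = np.exp(beta) - 1.0       # e.g., exp(0.157) - 1 is roughly 0.17
    median_nonevent = df.loc[df["has_event"] == 0, "cost"].median()
    return pct_increase * median_nonevent   # estimated excess dollars per event
```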

RESULTS

Descriptive Results

Only 10% of all event reports were classified as “error/harm,” and 0.1% of event reports documented events involving death (Table 1). Serious events were few in number, and some of the most serious, particularly those resulting in death, were not always recorded in this voluntary system because staff used more formal legal reporting systems instead.


Figure 1. Case progression from 2 data sources. DRG indicates diagnosis related group. The flow diagram traces 123 281 discharges/encounters from the administrative database and 29 019 event reports from the voluntary reporting system. Of the event reports, 10 352 had invalid patient identification and 2816 had no patient identification; the remaining 15 851 encounters were linked to an event report (when an encounter was linked to more than one report, only the first report was used in the analysis), leaving 107 430 encounters not matched to a report. Case matching with 1 to 4 controls using hospital, initial department, DRG, sex, and age (±10 years) yielded 11 568 matched cases and 39 000 controls, and these data were used in the regression analysis.

Regression

After controlling for patient risk factors, hospitalizations with any type of event report were 17% more expensive than those without an event report (Figure 2). Similarly, length of stay was 22% longer for patients with an event report compared with those without. Medication and fall events were the most expensive (21% higher cost), followed by behavioral events (15%), loss/exposure events (13%), treatment events (12%), and equipment events (11%). The 95% CI for each regression coefficient was used to assess significant differences between increases associated with different dummy variables (Figure 2). Overall, there was a significant difference in cost increase between events that did not reach patients (“no incident” 95% CI, 11%-15%) and those that reached patients (“error/no harm” 95% CI, 18%-21%; “error/harm” 95% CI, 17%-23%). Of those that reached patients, there was no significant difference in cost increase between events with harm and those without harm.
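For readers who want to see the arithmetic behind these intervals, the short snippet below converts a log-scale coefficient and its 95% CI endpoints into percentage increases of the kind quoted here. The coefficient and standard error are purely illustrative values (exp(0.157) − 1 is approximately 17%), not figures taken from the fitted models.

```python
# Converting a log-scale coefficient and its 95% CI into percentage increases.
# The point estimate and standard error below are illustrative, not published values.
import numpy as np

beta, se = 0.157, 0.008
ci_low, ci_high = beta - 1.96 * se, beta + 1.96 * se
pct = (np.exp([ci_low, beta, ci_high]) - 1) * 100
print(f"{pct[1]:.0f}% increase (95% CI, {pct[0]:.0f}%-{pct[2]:.0f}%)")
```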


Figure 2. Percentage increase in cost associated with inefficiencies described in voluntary event reports. Each point shows the percentage increase in cost and the associated 95% confidence interval. All coefficients were significant at P < .05.


Figure 3. Percentage increase in length of stay associated with inefficiencies described in voluntary event reports. Each point shows the percentage increase in length of stay and the associated 95% confidence interval. All coefficients were significant at P < .05.


Table 2
Total Incremental Costs and Patient Days Associated With Voluntary Event Reportsa

Event Type       No. of Events   Days per Event   Cost per Event, $   Total Days   Total Cost, $
Medication            4543            0.52              913              2364        4 149 346
Treatment             4622            0.25              501               701        2 316 702
Fall                  1025            0.68              897              1162          919 507
Behavioral             569            0.20              659               238          375 126
Equipment              635            0.25              489               124          310 808
Loss/exposure          542            0.42              552               265          299 448
Overall             11 936            0.43              749              4854        8 370 937

a The total cost and days for each event report type are calculated by multiplying the type-specific per-event estimate by the number of events of that type. The total cost and days reported in the Overall row are the sums of the type-specific totals, reflecting the frequency of each event type. The overall days-per-event and cost-per-event estimates (0.43 days and $749) are from the initial regressions that did not contain type or outcome dummy variables and are not used in the calculation of total cost and days.

Fall event reports were associated with the greatest increase in length of stay (34% longer stays), followed by medication events (26%), loss/exposure events (25%), behavioral events (21%), treatment events (13%), and equipment events (10%). There was a significant difference in the length of stay increase between events that did not reach patients (“no incident” 95% CI, 16%-21%) and events that reached patients but did not cause harm (“error/no harm” 95% CI, 22%-25%). As was observed with the cost model, there was no significant difference in length of stay increase between events with harm and those without harm. The cost models had an R² of 0.72, and the length of stay models had an R² of 0.51.

Extrapolation of Cost and Length of Stay

Percentage increases in cost and length of stay can be translated into dollars and days by multiplying the increase in cost and length of stay by the corresponding nonevent report median values. This step provides a picture of the total effect of the events reported because it combines both the percentage increase in cost or length of stay and the frequency of each event type. In the 2 years represented by our study, unplanned patient care events resulted in an estimated $8.3 million in additional patient care costs and an additional 4854 patient days (Table 2). Medication events, which were common and expensive per event, accounted for an estimated $4 million in patient care costs and more than 2300 bed days alone. Treatment events were the next most expensive, accounting for roughly $2.3 million in extra costs, followed by fall events, which accounted for more than $900 000 in additional costs. Falls had the greatest per-event increase in length of stay and accounted for more than 1100 additional bed days during 2 years.
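The type-specific totals in Table 2 follow directly from the reported per-event cost estimates and event counts. The sketch below reproduces the total-cost arithmetic; small differences from the published total of $8 370 937 reflect rounding of the per-event figures.

```python
# Reproducing the total-cost arithmetic in Table 2: per-event excess cost times the
# number of events of each type, summed over types. Values are taken from Table 2.
events = {
    # event type: (number of events, excess cost per event in dollars)
    "Medication":    (4543, 913),
    "Treatment":     (4622, 501),
    "Fall":          (1025, 897),
    "Behavioral":    ( 569, 659),
    "Equipment":     ( 635, 489),
    "Loss/exposure": ( 542, 552),
}

total = sum(n * cost for n, cost in events.values())
print(f"Estimated total excess cost: ${total:,}")  # roughly $8.37 million
```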

DISCUSSION

Our study differs from other efforts to measure patient safety–related costs primarily in the means used to identify and define events. Prior investigations in this area have relied mainly on medical record review and electronic surveillance to identify events generally defined by some level of patient harm. In our study, the events collected through a voluntary safety reporting system also represent costly events, although they generally do not reflect patient harm. We suspect that most voluntary event reports, even those that do not result in patient harm, are markers for a degree of confusion, delay, missed communication, or lack of coordination in the care provided to a patient. Another portion of event reports represents human errors. Despite our best attempts at system design, human errors can occur; they illustrate the pervasive nature of human error but also point to our inadequacies in building error prevention into system design. A final portion of event reports captures rule violations or behavior outside of established policy that increases risk to the patient. Further work is needed to fully understand the conditions that give rise to different types of event reports. However, it is clear that event reporters are identifying situations that involve added cost to the system.

Thomas and Petersen11 developed a conceptual model that provides support for the “process defect” interpretation. They suggest that the strength of methods such as voluntary reporting is in the identification of latent errors or “system defects such as poor design, incorrect installation, faulty maintenance, poor purchasing decisions, and inadequate staffing.”11(p62)


In contrast, methods such as medical record review and electronic surveillance are better suited to identifying active errors, or those that occur at the “sharp end” of care. The additional cost and length of stay associated with voluntary event reports illustrate an important connection between patient safety and health care waste and inefficiency. These events arise from a broad cross section of hospital processes and probably are not unique to the 3 hospitals studied. The cost associated with these events is a reason to study low-acuity events and to give them the attention they deserve, even though they do not result in direct patient harm.

Limitations

Our study is limited primarily by the reporting biases inherent in voluntary reporting systems, which influence the type and severity of events reported. In the error measurement framework developed by Thomas and Petersen,11 these limitations prevent the use of voluntary event reporting to establish error rates or to measure the effect of interventions to improve patient safety. Consequently, our estimates of the cost and length of stay associated with the events captured by voluntary reporting are imprecise. However, the estimates are sufficiently large in aggregate to safely conclude that these events are costly and, as such, represent an important source of waste and inefficiency.

Furthermore, this study is constrained to 3 hospitals within a single health system, with most reports filed by nursing staff. Consequently, generalization of these results to new hospital environments is limited. Other hospitals may have different reporting cultures and systems that capture different event types and frequencies. Finally, not all real costs are captured in the administrative database. For example, the cost of investigating and reviewing these events is not captured. Costs also are not captured for the numerous instances of miscommunication among medical care teams or for the days of work lost by patients because of longer hospital stays. Despite these limitations, the methods reported herein could be repeated in other hospital systems to estimate levels of waste and inefficiency for quality improvement activities unique to their specific environments.

Summary

The present study estimates the large aggregate effect of events and near misses in a large hospital system as a specific case study. There is ample reason to believe that these estimates still underrepresent total costs: costs of review and investigation, risk management expenses, and nonbillable costs are not included in the cost estimates. Using the conceptual framework developed by Thomas and Petersen,11 previous calculations of the national cost of patient safety events have been based on studies that capture primarily “active errors” and often define events by some level of patient harm. Our study involves chiefly those events that would be considered “latent errors” and near misses that do not cause patient harm. Compared with the other methods presented in that framework that are capable of identifying this type of event, voluntary event reporting provides more accessible, standardized data on a far larger set of events. As a result, voluntary event reporting is likely the best way to capture these types of events and their associated costs.

Unfortunately, we cannot make national extrapolations with our data and add these results to existing national estimates, as there is some overlap between the types of events reported herein and those of other studies. Given the voluntary nature of the data collection, we also cannot be confident about the sampling of the events reported. Nevertheless, our study substantially expands the total picture of unnecessary costs in inefficient and potentially unsafe patient care because it captures events that do not rise to the level of patient harm or even to the general awareness of anyone but care providers. A larger analysis using multiple hospitals and reporting systems should be conducted to compare the distribution of report types and the estimates from this study and to determine the full extent of inefficiencies in hospital care.

ACKNOWLEDGEMENTS

The authors would like to thank Cynthia Palmer, MSc, Eugene A. Kroch, PhD, and Irvin Paradis, MD, for providing comments and suggestions.

REFERENCES

1. Longo DR, Hewett JE, Ge B, Schubert S. The long road to patient safety: a status report on patient safety systems. JAMA. 2005;294(22):2858-2864.
2. Kohn LT, Corrigan JM, Donaldson M, eds. To Err Is Human: Building a Safer Health System. Washington, DC: Institute of Medicine; 1999.
3. Thomas EJ, Studdert DM, Newhouse JP, et al. Costs of medical injuries in Utah and Colorado. Inquiry. 1999;36(3):255-264.


4. Classen DC, Pestotnik SL, Evans RS, Lloyd JF, Burke JP. Adverse drug events in hospitalized patients: excess length of stay, extra costs, and attributable mortality. JAMA. 1997;277(4):301-306.
5. Bates DW, Spell N, Cullen DJ, et al; Adverse Drug Events Prevention Study Group. The costs of adverse drug events in hospitalized patients. JAMA. 1997;277(4):307-311.
6. Einbinder JS, Scully K. Using a clinical data repository to estimate the frequency and costs of adverse drug events. J Am Med Inform Assoc. 2002;9(6)(suppl 1):s34-s38.
7. Senst BL, Achusim LE, Genest RP, et al. Practical approach to determining costs and frequency of adverse drug events in a health care network. Am J Health Syst Pharm. 2001;58(12):1126-1132.
8. Zhan C, Miller MR. Excess length of stay, charges, and mortality attributable to medical injuries during hospitalization. JAMA. 2003;290(14):1868-1874.
9. Nuckols TK, Bell DS, Liu H, Paddock SM, Hilborne LH. Rates and types of events reported to established incident reporting systems in two US hospitals. Qual Saf Health Care. 2007;16(3):164-168.
10. Milch CE, Salem DN, Pauker SG, Lundquist TG, Kumar S, Chen J. Voluntary electronic reporting of medical errors and adverse events: an analysis of 92,547 reports from 26 acute care hospitals. J Gen Intern Med. 2006;21(2):165-170.
11. Thomas EJ, Petersen LA. Measuring errors and adverse events in health care. J Gen Intern Med. 2003;18(1):61-67.
12. Aspden P, Corrigan JM, Wolcott J, Erickson SM, eds. Patient Safety: Achieving a New Standard for Care. Washington, DC: Institute of Medicine; 2004.
13. Jha AK, Kuperman GJ, Teich JM, et al. Identifying adverse drug events: development of a computer-based monitor and comparison with chart review and stimulated voluntary report. J Am Med Inform Assoc. 1998;5(3):305-314.

14. Leape L. Reporting of adverse events. N Engl J Med. 2002;347(20):1633-1638.
15. Roumm AR, Sciamanna CN, Nash DB. Health care provider use of private sector internal error–reporting systems. Am J Med Qual. 2005;20(6):304-312.
16. Martin SK, Etchegaray JM, Simmons D, Belt WT, Clark K. Development and Implementation of The University of Texas Close Call Reporting System. Rockville, MD: Agency for Healthcare Research and Quality; 2005. AHRQ publication 050021 (1-4).
17. Mekhjian HS, Bentley TD, Ahmad A, Marsh G. Development of a Web-based event reporting system in an academic environment. J Am Med Inform Assoc. 2004;11(1):11-18.
18. Nordgren LD, Johnson T, Kirschbaum M, Peterson ML. Medical errors: excess hospital costs and lengths of stay. J Healthc Qual. 2004;26(2):42-48.
19. CareScience Inc. CareScience [computer program]. Philadelphia, PA: CareScience Inc; 2006.
20. U.S. Pharmacopeia Web site. USP Medication Errors Reporting Form. 2006. https://secure.usp.org/hqi/patientSafety/mer/merform.html. Accessed February 15, 2006.
21. National Database of Nursing Quality Indicators. Guidelines for Data Collection and Submission on Quarterly Indicators. Kansas City: University of Kansas School of Nursing; 2008:52.
22. Berry WD. Understanding Regression Assumptions. Newbury Park, CA: Sage Publications; 1993.
23. Manning WG. The logged dependent variable, heteroscedasticity, and the retransformation problem. J Health Econ. 1998;17(3):283-295.
24. Austin PC, Ghali WA, Tu JV. A comparison of several regression models for analyzing cost of CABG surgery. Stat Med. 2003;22(17):2799-2815.