REPORTS FROM THE FIELD
QUALITY IMPROVEMENT
A Multicenter Intervention to Improve the Care of Hospitalized Patients with Community-Acquired Pneumonia

Prabhu P. Gounder, MD, David C. Rhew, MD, Lynette Landry, RN, PhD, Tracy Sklar, RD, MS, MBA, Gregory H. Dorn, MD, MPH, John Y. Wong, and Scott R. Weingarten, MD, MPH
Abstract
• Objective: To assess process and outcome measures before and after a multifaceted intervention for patients hospitalized on the medical floor with community-acquired pneumonia (CAP).
• Setting and participants: 954 patients with CAP admitted to 12 acute care community hospitals from March 1998 to February 2000 (baseline, n = 517) and from October 2000 to March 2001 (postintervention, n = 437).
• Intervention: Hospitals could select 1 or more of 4 tools (preprinted orders, care pathways, a clinical pathway development tool, and sticker reminders) and 2 improvement strategies (physician profiling/performance reports and the Institute for Healthcare Improvement's [IHI] Rapid Cycle Methodology). Update meetings and conference calls were held periodically to promote collaboration among hospitals.
• Measurements: Time (in hours) from patient triage to initiation of intravenous (IV) antibiotics, percentage of patients who received antibiotics within 8 hours and within 4 hours of triage, days from triage to conversion from IV to oral antibiotics, days of in-hospital observation after switch from IV to oral antibiotics, length of stay (LOS), and in-hospital mortality rate.
• Results: Significant improvements were noted in the mean time to administration of the first dose of IV antibiotics (4.01 to 3.57 hours, P = 0.002), the percentage of patients receiving antibiotics within 4 hours (67.9% to 75.9%, P = 0.001), the mean days of in-hospital observation after switch from IV to oral antibiotics (2.02 to 1.25 days, P < 0.001), and the mean LOS (5.62 to 5.04 days, P = 0.006).
• Conclusion: Certain CAP process and outcome measures improved after the multicenter intervention. A controlled study is needed to verify that this type of intervention can result in improvements in antibiotic timing and LOS.
Community-acquired pneumonia (CAP) remains one of the leading causes of morbidity and mortality in the United States and is a source of significant financial costs for acute care institutions. Hospitals in the United States admit more than 900,000 patients with pneumonia each year and spend an estimated $9.7 billion treating them [1,2]. Interventions that improve outcomes for patients with pneumonia and reduce costs have been documented [3–8]. A meta-analysis of randomized controlled trials demonstrated that in low-risk adults, pneumococcal vaccinations can reduce rates of bacteremic pneumococcal pneumonia [3], while 3 cohort studies have demonstrated that early initiation of antibiotics is associated with improved survival [4–6]. One study has demonstrated that inpatient observation of patients with CAP after switch from intravenous (IV) to oral antibiotics has little clinical benefit [7]. Many of these patients can be safely discharged on the day of IV-to-oral conversion, thereby reducing length of stay (LOS) and resulting in cost savings [8].

Despite the documented benefits of these processes of care, physicians do not consistently adhere to them when treating patients with CAP [4,9,10]. Furthermore, retrospective analyses have shown that variations in processes of care are associated with poor patient outcomes [4,5]. Thus, interventions to reduce variations in physician practice patterns may enable community-based acute care hospitals to improve the quality of care for patients with CAP.

Several projects to increase physician adherence to specific processes of care have been initiated by nonacademic health systems but have produced mixed results [10–12]. A statewide multifaceted quality improvement initiative that involved feedback of performance data and implementation of a pneumonia care pathway was associated with significant improvements in delivery of antibiotics within 8 hours, collection of blood cultures before initiating antibiotics, checking patient oxygenation status within 24 hours of hospital arrival, and shorter LOS [11].
From the Cedars-Sinai Health System, Department of Medicine, Los Angeles, CA (Dr. Gounder); David Geffen School of Medicine, University of California at Los Angeles, Los Angeles, CA (Drs. Rhew and Weingarten); Zynx Health Inc. and Cedars-Sinai Health System, Department of Health Services Research, Beverly Hills, CA (Drs. Dorn and Weingarten and Mr. Wong); and Catholic Healthcare West, San Francisco, CA (Dr. Landry and Ms. Sklar).
In another study, an institution implemented a pneumonia clinical pathway to increase pneumococcal vaccination rates but did not document increased rates [12]. In this paper, we describe the efforts of Catholic Healthcare West (CHW) to improve the quality of care for patients admitted with CAP.

The CHW Pneumonia Project

Setting
Initiated in January 2000, the Pneumonia Project was designed as a collaborative project involving hospitals from the CHW system that volunteered to participate. The project was coordinated by the CHW care management team, which consisted of individuals from the CHW corporate quality improvement group. At this time, CHW consisted of 47 facilities located in California (43), Nevada (2), and Arizona (2), and all were invited to participate in the project. Hospitals that fulfilled the following 3 criteria were eligible for participation: (1) pneumonia had been identified by senior leadership as a focus for quality improvement; (2) the hospital agreed to form an active and motivated CAP improvement team consisting of, at a minimum, a physician leader, nurse, clinical pharmacist, and member of the quality improvement team; and (3) the hospital had identified a need or opportunity for improvement in antibiotic utilization.

Hospitals that wished to participate were asked to provide at least 40 randomly selected charts of patients admitted with CAP for review during a baseline period (March 1998 to February 2000) and a postintervention period (October 2000 to March 2001). The study coordinators decided upon a 2-year timeframe for baseline data collection to ensure that all hospitals would be able to find an adequate number of complete charts. A considerably shorter time period was allocated for postintervention data collection because the project interventions could be sustained in the participating hospitals for only 6 months due to limited resources. Furthermore, participating hospitals requested feedback in a timely manner. Sixteen hospitals responded to the invitation to participate. However, 2 of the hospitals did not complete the follow-up chart review, and 2 others were unable to provide the required diagnosis-related group (DRG) codes, leaving 12 hospitals available for analysis.

Chart Review and Patient Selection
In January 2000, participating hospitals began conducting retrospective chart reviews to collect baseline demographic and process of care data on patients admitted with CAP.
Patients with DRG codes 89 (simple pneumonia and pleurisy, age > 17 years with comorbid conditions) or 90 (simple pneumonia and pleurisy, age > 17 years without comorbid conditions) were included for analysis. Patients were excluded if they were transferred into the hospital, admitted to the intensive care unit, or had incomplete data. The 12 hospitals provided 697 charts during the baseline period and 531 during the postintervention period. After applying the inclusion and exclusion criteria, 517 charts from the baseline period and 437 charts from the postintervention period were available for analysis.

All data were collected through extensive chart reviews rather than through secondary ICD-9-CM codes. These chart reviews were conducted by a facility representative from quality improvement, a representative from the hospital's pharmacy, or an individual selected by the care management team. The same person did not necessarily collect the preintervention and postintervention data. A computer-based data collection tool was developed to help the abstractors gather chart information. Drop-down menus were used when possible. The data collection tool was simple to use, which decreased the likelihood of data entry errors; however, validation of the data was not performed due to limited resources.

Quality Indicators
The project targeted and tracked the following quality indicators for CAP care established by the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) [13] and the Centers for Medicare and Medicaid Services (CMS) [14]:

Early administration of antibiotics
• Time to administration of antibiotics (JCAHO)
• Percentage of patients receiving antibiotics within 8 hours [4] (CMS 6th Scope of Work)
• Percentage of patients receiving antibiotics within 4 hours [5,6,14] (CMS 7th Scope of Work)

LOS
• Time from admission to conversion from IV to oral antibiotics [15,16]
• Days of observation after switch from IV to oral antibiotics [7,17,18]

The collaborative also aimed to improve performance in other areas, including assessment of oxygenation status within 24 hours of hospital arrival, collection of blood cultures prior to antibiotic administration, collection of blood cultures within 24 hours of triage, screenings for pneumonia and influenza vaccinations, and initiation of appropriate empiric antibiotics. Unfortunately, we were unable to assess the effect of the collaborative interventions on these process of care measures because the data collected were insufficient to perform the necessary analyses.
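To make these indicator definitions concrete, the sketch below shows one way the abstracted chart data could be reduced to the tracked measures (inclusion by DRG code, exclusion of transfers and intensive care unit admissions, and the timing calculations). It is an illustration only: the column names are hypothetical placeholders, not fields of the project's actual data collection tool, and the per-measure exclusions described later under Project Assessment Measures are omitted for brevity.

```python
import pandas as pd

def cap_quality_indicators(charts: pd.DataFrame) -> dict:
    """Illustrative reduction of abstracted chart data to the CAP indicators above.

    Assumed (hypothetical) columns: drg_code, transfer_in, icu_admission (booleans),
    triage_time, first_iv_abx_time, iv_to_oral_order_time, discharge_time (datetimes).
    """
    # Inclusion: DRG 89 or 90; exclusion: transfers in, ICU admissions, incomplete data.
    cohort = charts[
        charts["drg_code"].isin([89, 90])
        & ~charts["transfer_in"]
        & ~charts["icu_admission"]
    ].dropna(subset=["triage_time", "first_iv_abx_time", "discharge_time"])

    # Hours from emergency department triage to the first IV antibiotic dose.
    hours_to_abx = (
        cohort["first_iv_abx_time"] - cohort["triage_time"]
    ).dt.total_seconds() / 3600.0

    # Days from triage to the IV-to-oral switch order, and observation days after the switch.
    days_to_oral_switch = (
        cohort["iv_to_oral_order_time"] - cohort["triage_time"]
    ).dt.days
    observation_after_switch = (
        cohort["discharge_time"] - cohort["iv_to_oral_order_time"]
    ).dt.days

    # LOS defined as discharge date minus triage date.
    length_of_stay = (
        cohort["discharge_time"].dt.normalize() - cohort["triage_time"].dt.normalize()
    ).dt.days

    return {
        "mean_hours_to_first_iv_abx": hours_to_abx.mean(),
        "pct_abx_within_8h": (hours_to_abx <= 8).mean() * 100,
        "pct_abx_within_4h": (hours_to_abx <= 4).mean() * 100,
        "mean_days_to_oral_switch": days_to_oral_switch.mean(),
        "mean_observation_days_after_switch": observation_after_switch.mean(),
        "mean_length_of_stay": length_of_stay.mean(),
    }
```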
Interventions
Interventions to improve the care of patients admitted with CAP were implemented in May 2000. The CHW care management team offered the participating facilities a "pneumonia tool kit" to help them meet the JCAHO and CMS quality measures. Hospitals could select 1 or more of 4 tools (preprinted orders, care pathways, a clinical pathway development tool, and sticker reminders) and 2 improvement strategies (physician profiling/performance reports and the Institute for Healthcare Improvement's [IHI] Rapid Cycle Methodology).

The care management team provided hospitals with examples of preprinted orders and care pathways. However, most of the hospitals either modified these orders and pathways or devised their own. Therefore, no standard set of preprinted orders or care pathways was used at all the hospitals. The orders and pathways were placed in convenient locations within the emergency department and in nursing stations. To help in pathway development, evidence-based reference material was available through the Clinical Pathway Constructor, an internet-based tool developed by Zynx Health Incorporated (Beverly Hills, CA). Sticker reminders focused on changing the practice of individual physicians and ancillary staff. For example, stickers with conversion guidelines as well as "antibiotic alert" stickers reminding nurses to administer antibiotics as soon as possible were placed on the front of patients' charts.

Physician profiling and periodic feedback reports allowed physicians to compare their performance on quality indicators with that of other physicians. For example, pharmacists reviewed charts to determine whether switches from IV to oral antibiotics were appropriate and contacted physicians who did not adhere to the recommended strategy. Individual and group rates of appropriate IV-to-oral switches were reported to all physicians as part of printed periodic feedback reports. Finally, the IHI Rapid Cycle Methodology allowed for accelerated improvements through rapid cycle tests of changes [19]. Briefly, the IHI methodology is a 2-part model that tries to accelerate change by improving on existing models for change within an organization. The first part requires answering a series of questions in order to identify goals. The second part utilizes the Plan-Do-Study-Act cycle to test and implement changes in real work settings. All participating hospitals received training in the methodology during the first meeting of the collaborative.

All 16 of the hospitals received a data sheet that summarized the baseline characteristics and discharge outcomes for patients with CAP admitted to the CHW system and were invited to participate in a variety of meetings that occurred throughout the quality improvement effort to help keep team members focused on the project and to facilitate the exchange of ideas and information between hospitals.
The first meeting of the quality improvement initiative took place in April 2000 at a central location with representatives from each of the participating hospitals present. At this meeting, the goals of the collaborative were explained and the pneumonia tool kit was distributed. Midway through the intervention period, a meeting was held to update the teams from the participating hospitals on what was working and what was not. In addition, the teams at the hospitals took part in biweekly telephone conference calls, mainly to provide and receive updates. Some of these conference calls included guest speakers who reviewed specific topics related to the project. In the intervening time periods between conference calls and collaborative meetings, the teams at each facility met to discuss the implementation of the tools and strategies. Of the 12 hospitals that provided complete charts, all participated in the facility team meetings, 11 participated in the initial meeting and biweekly conference calls, and 9 participated in the update meeting.

Project Assessment Measures
Data on patient demographics, comorbid conditions, findings on physical examination, laboratory values, and the CAP quality indicators were collected. Severity of illness was stratified according to the Pneumonia Severity Index (PSI) [20]. The mean time to administration of antibiotics was calculated from the triage time (ie, time upon entering the emergency department) to the time the first dose of IV antibiotics was administered. Overall LOS was calculated as the discharge date minus the triage date. Patients who were not started on IV antibiotics during the hospitalization were excluded from analyses relating to IV-to-oral switch and in-hospital observation after IV-to-oral switch. Also, patients who did not have an in-hospital (including discharge) order to discontinue all IV antibiotics and start an oral antibiotic were excluded from the analysis relating to in-hospital observation after IV-to-oral switch.

Statistical Analysis
Hospital admission was used as the unit of analysis. The distribution of patients among PSI classes for hospitals in the pre- and postintervention periods was compared using chi-square analysis. For binary outcome measures (eg, receipt of antibiotics within 8 hours), we calculated adjusted odds ratios for preintervention versus postintervention status using logistic regression. The covariates included in the model were hospitals (12 categories) and severity level (5 categories, PSI class 4 used as the reference category).
Table 1. Hospital Characteristics*

                                     Participating          Nonparticipating
Characteristic                       Hospitals (n = 12)     Hospitals (n = 25)
Average number of available beds     209                    205
Teaching hospitals, n                2                      1
Geographic location, n
  Northern California                4                      9
  Bay Area                           5                      2
  Southern California                3                      11
  Arizona and Nevada                 0                      4

*Describes 37 of the 47 hospitals in the Catholic Healthcare West system at the beginning of the collaborative. The other 10 nonparticipating hospitals did not provide demographic information or were not acute care hospitals.
Table 2. Tools and Strategies Used by the 12 Participating Hospitals

Tools and Strategies             Hospitals Using, n
Preprinted orders                9
Care pathways                    8
Clinical pathway constructor     10
Reminders/stickers               10
Physician profiling              6
IHI Rapid Cycle Methodology      11

IHI = Institute for Healthcare Improvement.
Both covariates were entered as sets of indicator variables, as we could not assume that they were interval scaled. For continuous variables (eg, time to administration of IV antibiotics), we calculated adjusted differences from the baseline to postintervention periods using linear regression models after log-transforming the variables to normalize their distribution. Covariates included in the models were the same as those included in the previously described logistic regression models. Estimates of differences between preintervention and postintervention time periods were then transformed back to the original units. All calculations were performed using SAS software, version 8.1 (SAS Institute, Cary, NC).
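As an illustration of this adjustment strategy, a rough sketch of equivalent models follows. This is not the project's SAS code; the data frame, the variable names, and the back-transformation step (evaluating the fitted log-scale model and averaging on the original scale) are assumptions made for the purpose of the example.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def adjusted_estimates(df: pd.DataFrame):
    """Sketch of the adjusted comparisons described above.

    Assumed (hypothetical) columns, one row per admission:
      post           1 for postintervention admissions, 0 for baseline
      hospital       hospital identifier (12 categories)
      psi_class      PSI class 1-5 (class 4 as the reference category)
      abx_within_8h  1 if antibiotics were given within 8 hours of triage
      hours_to_abx   hours from triage to the first IV antibiotic dose (> 0 assumed)
    """
    covariates = "C(hospital) + C(psi_class, Treatment(reference=4))"

    # Binary outcome: adjusted odds ratio for postintervention vs baseline,
    # with hospital and PSI class entered as sets of indicator variables.
    logit = smf.logit(f"abx_within_8h ~ post + {covariates}", data=df).fit(disp=False)
    odds_ratio = np.exp(logit.params["post"])
    ci_lower, ci_upper = np.exp(logit.conf_int().loc["post"])

    # Continuous outcome: linear regression on the log-transformed variable,
    # then back-transform fitted values to express the adjusted baseline-to-
    # postintervention difference in the original units (hours).
    ols = smf.ols(f"np.log(hours_to_abx) ~ post + {covariates}", data=df).fit()
    fitted_hours = np.exp(ols.fittedvalues)
    adjusted_difference = (
        fitted_hours[df["post"] == 1].mean() - fitted_hours[df["post"] == 0].mean()
    )

    return odds_ratio, (ci_lower, ci_upper), adjusted_difference
```

The chi-square comparison of PSI class distributions mentioned above could similarly be carried out with scipy.stats.chi2_contingency applied to the 5 x 2 table of class counts.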
Table 3. Baseline versus Postintervention Distribution of Severity of Illness in Admissions

                                  Baseline Group   Postintervention
Characteristic                    (n = 514)*       Group (n = 367)    P Value
PSI class, %                                                          0.76
  I                               7.59             7.55
  II                              16.54            15.10
  III                             22.18            19.68
  IV                              39.49            41.42
  V                               14.20            16.25
Age > 65 years, %                 74.47            69.11              0.07
Male, %                           45.07            48.05              0.36
Nursing home residents, %         13.73            12.13              0.46
Comorbidities, %
  Congestive heart failure        25.53            32.49              0.02
  Neoplastic disease              14.70            13.04              0.46
  Liver disease                   5.42             3.66               0.20
  Cerebrovascular disease         9.67             19.45              < 0.001
  Renal disease                   11.22            15.79              0.04

PSI = Pneumonia Patient Outcomes Research Team (PORT) Severity Index.
*n = 517, but 3 charts were excluded because there was insufficient data to calculate PSI class.
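For context, the PSI classes in Table 3 are derived from a weighted point score computed from roughly 20 demographic, comorbidity, examination, and laboratory variables [20]. The sketch below shows only the final mapping from a precomputed point total to a risk class, using the thresholds of the original prediction rule; the scoring of the individual variables is omitted, and the rule's first step (class I for patients younger than 50 years with none of the specified comorbidities or examination abnormalities) is represented here by a flag rather than recomputed.

```python
def psi_class(points: int, low_risk_step_one: bool = False) -> int:
    """Map a precomputed PSI point total to risk class 1-5 (thresholds per Fine et al. [20]).

    low_risk_step_one flags patients already assigned to class I by the rule's
    first step; the point-score computation itself is not reproduced here.
    """
    if low_risk_step_one:
        return 1
    if points <= 70:
        return 2
    if points <= 90:
        return 3
    if points <= 130:
        return 4
    return 5
```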
Results
There was no statistically significant difference in the average available bed size (P = 0.92) or the regional distribution (P = 0.15) between the 12 hospitals that participated and nonparticipating hospitals (Table 1). Two of the 3 teaching hospitals in the CHW system participated in the project. Of the 12 hospitals that completed the chart review, 11 used the IHI Rapid Cycle Methodology, 10 used sticker reminders and the Zynx Health Clinical Pathway Constructor tool, 9 used preprinted orders, 8 used care pathways, and 6 used physician profiling (Table 2). The mean number of interventions used by the hospitals was 4.6 (range, 2 to 6).

At baseline, 76% of the patients who were admitted were in PSI classes III–V, compared with 77% of patients in the postintervention period (Table 3). Logistic regression analysis showed no difference in the rate of admission for patients with more severe disease (ie, PSI classes III–V) in the postintervention period as compared with the baseline period (odds ratio [OR], 1.086 [95% confidence interval {CI}, 0.803–1.468]). Patients at baseline were more likely to be older than 65 years (P = 0.07). However, postintervention patients were more likely to have congestive heart failure (P = 0.02), cerebrovascular disease (P < 0.001), and renal disease (P = 0.04).
Table 4. Comparison of Outcome Measures Between Baseline and Postintervention Groups

Outcome Measure                                  Baseline Group   Patients       Postintervention   Patients       Adjusted OR           Adjusted      P Value
                                                 (n = 517)        Excluded, %*   Group (n = 437)    Excluded, %*   (95% CI)              Difference†
Mean time to first dose of IV antibiotics        4.01 (2.80)      22.8           3.57 (2.30)        5.9            —                     –0.634        0.002
  (median), hr‡
Received antibiotics within 8 hours, %           88.5             22.8           88.6               5.9            1.190 (0.719–1.969)   —             0.498
Received antibiotics within 4 hours, %           67.9             22.8           75.9               5.9            1.795 (1.253–2.571)   —             0.001
Mean time to antibiotic oral conversion          3.82 (3)         2.9            3.75 (3)           8.5            —                     0.276         0.246
  (median), days
Mean length of oral observation (median), days   2.02 (1)         4.6            1.25 (0)           6.4            —                     –0.434        < 0.001
Mean length of stay (median), days               5.62 (5)         0.4            5.04 (4)           0              —                     –0.551        0.006
No. of in-hospital deaths (% mortality)§         17 (3.3)         0              17 (3.9)           0              —                     —             —

CI = confidence interval; IV = intravenous; OR = odds ratio.
*Patients were excluded because of insufficient data to calculate the measure.
†Adjusted for severity of illness and hospital effect; reflects the difference in the means.
‡Insufficient data from 2 hospitals accounted for almost two thirds of the 22.8% loss to follow-up at baseline for the timeliness of antibiotic delivery measures.
§The study was not sufficiently powered to detect a statistically significant difference in mortality.
In analyses adjusted for severity and hospital effect, 2 of the 3 measures assessing timeliness of antibiotic delivery showed improvements from the baseline to the postintervention period (Table 4). Specifically, we found a statistically significant decrease in the mean time to initiation of IV antibiotics (4.01 to 3.57 hours; P = 0.002) and an increase in the percentage of patients receiving antibiotics within 4 hours (67.9% to 75.9%; OR, 1.795 [95% CI, 1.253–2.571]). The percentage of patients receiving antibiotics within 8 hours remained unchanged from an already high baseline compliance rate of 89%. Although linear regression analyses did not indicate a statistically significant change in mean days to conversion from IV to oral antibiotics, the mean days of in-hospital observation after conversion decreased significantly from 2.02 (median, 1) to 1.25 days (median, 0) (P < 0.001) over the course of the intervention. Analyses adjusted for severity and hospital effect showed a significant decrease in the mean LOS over the study period, from 5.62 days (median, 5) to 5.04 days (median, 4) (P = 0.006). Finally, the statistical power of the study (power = 0.079, β = 0.921) was insufficient to detect any significant differences in in-hospital mortality between the 2 study periods.
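As a rough cross-check of this power figure (the authors' exact calculation is not reported, so the approach below, a standard two-sample comparison of proportions at α = 0.05, is only an assumption), the estimate can be approximated as follows:

```python
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

# Observed in-hospital mortality: 17/517 (3.3%) at baseline, 17/437 (3.9%) postintervention.
n_baseline, n_post = 517, 437
p_baseline, p_post = 17 / n_baseline, 17 / n_post

# Cohen's h effect size for the two proportions, then the power of a
# two-sided two-sample test with the observed group sizes.
effect_size = proportion_effectsize(p_post, p_baseline)
power = NormalIndPower().power(
    effect_size=effect_size,
    nobs1=n_baseline,
    ratio=n_post / n_baseline,
    alpha=0.05,
    alternative="two-sided",
)
print(f"approximate power: {power:.3f}")  # roughly 0.07-0.08, in line with the reported 0.079
```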
Although all analyses were adjusted for severity, the only process of care or outcome measure that varied significantly between PSI classes was LOS. As expected, we found that each progressive PSI class was associated with a statistically significant increase in the LOS. However, there was no statistically significant difference between the various PSI classes in terms of timing of antibiotics, days to conversion from IV to oral antibiotics, or days of observation on oral antibiotics.

Discussion
Statistically significant improvements in several key CAP process of care and outcome measures were seen after the implementation of a multifaceted intervention. The percentage of patients receiving antibiotics within 4 hours of triage increased, and the median time to antibiotic delivery decreased from baseline to postintervention. However, there was no significant change in the percentage of patients receiving antibiotics within 8 hours, which was likely due to a "ceiling effect": the high baseline adherence rate for this measure (89%) left little opportunity for further improvement. This also suggests that the 4-hour threshold (CMS 7th Scope of Work measure) was more sensitive than the 8-hour threshold (CMS 6th Scope of Work measure) in detecting a change in the median time to administration of antibiotics (the JCAHO measure). Also, the mean days of in-hospital observation after conversion from IV to oral antibiotics decreased, as did the overall mean LOS, while the mean time from admission to switch from IV to oral antibiotics did not decrease.
This finding suggests that the decrease in LOS was primarily attributable to the decrease in the duration of in-hospital observation after conversion from IV to oral antibiotics, and that hospitals interested in reducing LOS for CAP patients should consider paying closer attention to the observation period after the switch from IV antibiotics. Finally, the same number of patients (17) died in the baseline (3.3%) and postintervention (3.9%) periods. However, it should be noted that this study was not sufficiently powered to evaluate the effect of the interventions on in-hospital mortality, and the decreased LOS in the postintervention period compared with the baseline period may have biased the in-hospital mortality comparison.

There are at least 4 potential explanations for the observed improvements in the postintervention period. First, the intervention itself may have resulted in the improvements. Hospitals implemented several strategies to educate and remind physicians to adhere to the key processes of care in which improvements were seen. A follow-up survey indicated that many of the participating hospitals felt that the baseline comparative data and the meetings that occurred throughout the collaborative (specifically, the initial meeting where training and education were provided for the various tools and strategies and the facility team meetings that helped to keep hospitals focused on quality improvement in pneumonia) were the most useful elements of the intervention.

Second, these changes may have resulted from secular changes, normal fluctuations in practice patterns, or the effect of a concurrent CHW intervention to reduce LOS for a variety of medical conditions, including CAP. However, it is unlikely that seasonal differences in the data collection periods before and after the intervention positively impacted the findings: in the preintervention period, 69% of patients were discharged in the fall/winter months, while in the postintervention period, 100% of patients were discharged in the fall/winter months. Since patients who are admitted with pneumonia during the fall/winter months generally have a longer LOS than those admitted in the spring/summer months, it is more likely that the postintervention patients would have had a longer LOS than those treated in the preintervention period. Consequently, this biases the results against showing a significant reduction in LOS.

Third, since participation in the study was voluntary, it is likely that the hospitals in our evaluation were the ones most likely to demonstrate improvements. Fourth, the positive effects observed in this study may be partially attributed to the "Hawthorne effect" [21]. That is, participants in a study may be motivated to perform better than usual simply because they know that their actions are being measured. This effect was unlikely to have occurred in this project, however, because for the most part, the physicians who were responsible for making the clinical decisions were not informed that their actions were being measured as part of a study.
On the other hand, several physicians were made aware that their actions were being monitored as part of a quality improvement intervention (eg, physician profiling), and this may have contributed to improvements in their performance.

If in fact the observed improvements were either partially or wholly attributable to the quality improvement intervention, then there could be a few reasons to explain the success of the intervention. First, hospitals were not required to implement all of the quality improvement interventions but rather had the opportunity to choose the number and type of tools and strategies that were most likely to succeed within each hospital setting. A secondary analysis (data not shown) demonstrated that incremental use of evidence-based practices (Clinical Pathway Constructor, care pathways, preprinted orders) was associated with a lower mean number of hours to first dose of antibiotics (P = 0.006) and with fewer days of in-hospital observation after switch from IV to oral antibiotics (P = 0.011). Also, incremental use of the IHI methodology was associated with improvements in administration of antibiotics within 4 hours of arrival at the hospital (P = 0.022) as well as with earlier switch from IV to oral antibiotics (P < 0.001). There was no incremental benefit to the increased use of physician-directed interventions (physician profiling, physician reminders) for any of the indicators. Second, all the selected hospitals had a commitment from senior leadership to improve pneumonia care within the facility. Previous studies have shown that local physician champions and opinion leaders are important in changing physician behavior [22,23]. Furthermore, all participating hospitals already had a team in place that was dedicated to improving CAP quality of care. Finally, the participating hospitals all had an identified need or opportunity for improvement in antibiotic utilization. It has been previously documented that when baseline adherence rates to process of care measures are high, the opportunity to make improvements is diminished [24,25].

A primary limitation in the analysis of this project was that there was no control group. This study is intended to be an observational analysis of a multidimensional intervention in a real-world setting and was not designed to test a specific hypothesis. Therefore, we were unable to conclude whether the observed improvements in performance were due to the interventions or secondary to other reasons such as secular trends and confounding concurrent interventions. In addition, the results may have been biased by a high loss to follow-up. We were unable to analyze 23% of the baseline admissions with respect to the timeliness of antibiotic delivery measures (mean time to delivery of IV antibiotics from time of triage, percentage of patients receiving IV antibiotics within 4 hours of triage, and percentage of patients receiving antibiotics within 8 hours of triage) due to insufficient data.
On the other hand, for all the other measures (days from admission to conversion from IV to oral antibiotics, days of in-hospital observation after conversion to oral antibiotics, LOS, in-hospital mortality), the loss to follow-up was less than 5%.

We do not know whether the observed decrease in overall LOS occurred due to a "squeezing the balloon" phenomenon [26]. That is, did LOS decrease at the expense of increases in the rate of re-hospitalization after discharge, the number of return visits to the emergency department, or the 30-day mortality? Since we do not have data regarding postdischarge follow-up, we cannot report whether these outcomes worsened as a result of the reduction in LOS from 5.62 to 5.04 days. However, it should be noted that the reductions in LOS were likely a result of applying evidence-based guidelines that were based on previously validated studies addressing early switch and discharge [15,16] and avoidance of in-hospital observation after switch [7,16–18]. In these original studies [7,15–18], the practices to reduce LOS were shown to be both effective and safe (ie, no significant difference between early-switch and non–early-switch groups in terms of postdischarge outcomes). Nonetheless, it remains an unanswered question whether patients fulfilled all of the appropriate eligibility criteria prior to switching from IV to oral antibiotics and being discharged from the hospital.

Finally, while all hospitals were asked to submit at least 40 charts for review during both the preintervention and postintervention periods, not all hospitals were able to fulfill this goal. Of the 12 hospitals analyzed, 4 hospitals during the baseline period and 7 hospitals during the follow-up period were unable to meet the 40-chart minimum. Since hospitals had varying outcomes in the baseline period and the ratio of preintervention to postintervention numbers of charts varied across hospitals, we were concerned that a differing mix of hospitals in the preintervention and postintervention admissions might bias our results. We mitigated this concern by applying appropriate adjustments in our comparative analyses between the 2 periods.

Our findings suggest that performing a multifaceted intervention for patients with CAP can result in increased adherence to evidence-based quality measures that may potentially improve clinical outcomes. A controlled study is needed to verify these findings.

Acknowledgment: The authors thank Kevin Knight, MD, MPH, for his assistance in the statistical analysis of the data.

Corresponding author: David C. Rhew, MD, Zynx Health, Inc., 9100 Wilshire Blvd., East Tower, Ste. 655, Beverly Hills, CA 90212, drhew@cerner.com.
References
1. Vital and health statistics: detailed diagnoses and procedures, national hospital discharge survey, 1990. Washington (DC): National Center for Health Statistics; 1992. Publication no. 113, series 13.
2. Lave JR, Lin CJ, Fine MJ, Hughes-Cromwick P. The cost of treating patients with community-acquired pneumonia. Semin Respir Crit Care Med 1999;20:189–97.
3. Fine MJ, Smith MA, Carson CA, et al. Efficacy of pneumococcal vaccination in adults. A meta-analysis of randomized controlled trials. Arch Intern Med 1994;154:2666–77.
4. Meehan TP, Fine MJ, Krumholz HM, et al. Quality of care, process, and outcomes in elderly patients with pneumonia. JAMA 1997;278:2080–4.
5. Kahn KL, Rogers WH, Rubenstein LV, et al. Measuring quality of care with explicit process criteria before and after implementation of the DRG-based prospective payment system. JAMA 1990;264:1969–73.
6. McGarvey RN, Harper JJ. Pneumonia mortality reduction and quality improvement in a community hospital. QRB Qual Rev Bull 1993;19:124–30.
7. Rhew DC, Hackner D, Henderson L, et al. The clinical benefit of in-hospital observation in "low-risk" pneumonia patients after conversion from parenteral to oral antimicrobial therapy. Chest 1998;113:142–6.
8. Fine MJ, Pratt HM, Obrosky DS, et al. Relation between length of hospital stay and costs of care for patients with community-acquired pneumonia. Am J Med 2000;109:378–85.
9. Cleves MA, Weiner JP, Cohen W, et al. Assessing HCFA's Health Care Quality Improvement Program. Jt Comm J Qual Improv 1997;23:550–60.
10. Metersky ML, Galusha DH, Meehan TP. Improving the care of patients with community-acquired pneumonia: a multihospital collaborative QI project. Jt Comm J Qual Improv 1999;25:182–90.
11. Meehan TP, Weingarten SR, Holmboe ES, et al. A statewide initiative to improve the care of hospitalized pneumonia patients: the Connecticut Pneumonia Pathway Project. Am J Med 2001;111:203–10.
12. Metersky ML, Fine JM, Tu GS, et al. Lack of effect of a pneumonia clinical pathway on hospital-based pneumococcal vaccination rates. Am J Med 2001;110:141–3.
13. Joint Commission on Accreditation of Healthcare Organizations. Overview of the community acquired pneumonia (CAP) core measure set. Available at www.jcaho.org/pms/core+measures/cap_overview.htm. Accessed 18 Jul 2003.
14. Centers for Medicare and Medicaid Services. National pneumonia project. Available at www.nationalpneumonia.org/index.html. Accessed 18 Jul 2003.
15. Rhew DC, Tu GS, Ofman J, et al. Early switch and early discharge strategies in patients with community-acquired pneumonia: a meta-analysis. Arch Intern Med 2001;161:722–7.
16. Ramirez JA, Vargas S, Ritter GW, et al. Early switch from intravenous to oral antibiotics and early hospital discharge: a prospective observational study of 200 consecutive patients with community-acquired pneumonia. Arch Intern Med 1999;159:2449–54.
17. Beumont M, Schuster MG. Is an observation period necessary after intravenous antibiotics are changed to oral administration? Am J Med 1999;106:114–6.
18. Dunn AS, Peterson KL, Schechter CB, et al. The utility of an in-hospital observation period after discontinuing intravenous antibiotics. Am J Med 1999;106:6–10.
19. Institute for Healthcare Improvement. IHI quality improvement measures: a model for accelerating improvement. Available at www.ihi.org/resources/qi. Accessed 18 Jul 2003.
20. Fine MJ, Auble TE, Yealy DM, et al. A prediction rule to identify low-risk patients with community-acquired pneumonia. N Engl J Med 1997;336:243–50.
21. Parsons HM. What happened at Hawthorne? Science 1974;183:922–32.
22. Thomson OB, Oxman AD, Haynes RB, et al. Local opinion leaders: effects on professional practice and health care outcomes. Cochrane Database Syst Rev 2000;(2):CD000125.
23. Oxman AD, Thomson MA, Davis DA, Haynes RB. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ 1995;153:1423–31.
24. Rhew DC, Riedinger MS, Sandhu M, et al. A prospective, multicenter study of a pneumonia practice guideline. Chest 1998;114:115–9.
25. Weingarten SR, Riedinger MS, Hobson P, et al. Evaluation of a pneumonia practice guideline in an interventional trial. Am J Respir Crit Care Med 1996;153:1110–5.
26. Schroeder SA, Cantor JC. On squeezing balloons. Cost control fails again. N Engl J Med 1991;325:1099–100.
Copyright 2003 by Turner White Communications Inc., Wayne, PA. All rights reserved.