RESIDENCY ADMINISTRATION

An Automated Procedure Logging System Improves Resident Documentation Compliance

Thomas S. Seufert, MD, Patricia M. Mitchell, RN, Allison R. Wilcox, Julia E. Rubin-Smith, MSPH, Laura F. White, PhD, Kerry K. McCabe, MD, and Jeffrey I. Schneider, MD

Abstract

Objectives: The purpose of this study was to determine the effect of an automated procedure logging (APL) system on the number of procedures logged by emergency medicine (EM) residents. Secondary objectives were to assess the APL's effect on completeness and accuracy of procedure logging and to measure resident compliance with the system.

Methods: This was a before-and-after study conducted at a university-affiliated, urban medical center, with an annual emergency department census of >130,000. The EM residency is a 4-year, Residency Review Committee (RRC)-accredited program with 12 residents per year. We developed software to electronically search and abstract resident procedures documented in the electronic medical record (EMR) and automatically export them into a Web-based residency management system. We compared the mean daily number of procedures logged for two 6-month periods, October 1, 2009, to March 31, 2010 (pre-APL), and October 1, 2010, to March 31, 2011 (post-APL), using a two-sample t-test. We also generated a random sample of 231 logged procedures from both the pre- and post-APL time periods to assess for completeness and accuracy of data transfer. Completeness and accuracy in the pre- and post-APL periods were compared using Fisher's exact test. Aggregate resident compliance with the system was also measured.

Results: The mean daily number of procedures logged increased by 168% (10.0 vs. 26.8, mean difference = 16.8, 95% confidence interval [CI] = 15.4 to 18.2, p < 0.001) after the implementation of APL. Procedures logged with the APL system were more complete (76% vs. 100%, p < 0.001) and more accurate (87% vs. 99%, p < 0.001). Most residents (42/48, 88%) used APL to log at least 90% of procedures. Only 4% of procedures eligible for automation were logged manually in the post-APL period.

Conclusions: There was a significant increase in the daily mean number of procedures logged after the implementation of APL. Recorded data were more complete and more accurate during this time frame. This innovative system improved resident logging of required procedures and helped our assessment of Accreditation Council for Graduate Medical Education (ACGME) Patient Care and Practice-Based Learning Competencies for individual residents.

ACADEMIC EMERGENCY MEDICINE 2011; 18:S54–S58 © 2011 by the Society for Academic Emergency Medicine

Health information technology (HIT) is a rapidly growing sector of the health care field. Electronic medical records (EMRs) are being widely adopted in an effort to improve care and efficiency. The federal government has set forth a series of "meaningful use objectives" as part of an effort to both encourage and regulate the use of HIT.1,2

Research in the field has informed these regulations, including ways to use HIT to improve documentation. For instance, computerized physician order entry has been shown to reduce medication errors,3 and electronic charting enables clinical decision support, which has demonstrated improvements in patient care.4

From the Department of Emergency Medicine, Boston Medical Center, Boston University School of Medicine, Boston, MA. Received April 22, 2011; revision received June 24, 2011; accepted June 28, 2011. Presented at the Council of Emergency Medicine Residency Directors (CORD) Academic Assembly, San Diego, CA, March 2011; and the New England Regional Society for Academic Emergency Medicine (SAEM) meeting, Hartford, CT, April 2011. The authors have no relevant financial information or potential conflicts of interest to disclose. Supervising Editor: John H. Burton, MD. Address for correspondence and reprints: Jeffrey I. Schneider, MD; e-mail: [email protected].


© 2011 by the Society for Academic Emergency Medicine. doi: 10.1111/j.1553-2712.2011.01183.x


Medical education also has the potential to benefit from advances in electronic documentation and HIT. In recent years, there has been increasing pressure on resident physicians, residency training programs, and employers to document the procedural training, experience, and competency of residents. Accurate recording of these data is required by the Accreditation Council for Graduate Medical Education (ACGME); however, the lack of documentation of the procedural experience of trainees is a frequent source of Residency Review Committee (RRC) citations,5 and the core competency-based evaluation of resident procedural experience has proved challenging. In addition, potential employers often request that graduating residents provide documentation of procedures performed during their training as part of the hiring and credentialing process.

The method by which residents and training programs track procedural experience varies widely (e.g., paper logs, personal card systems or spreadsheets, or electronic residency management systems such as we use), and handwritten, Web-based, and personal digital assistant (PDA)-based software systems have been plagued by a lack of reliability and accuracy, as well as poor resident compliance.6–9 One study found that only 60% of procedures actually performed were logged when done so via manual entry.7

This study sought to examine the effect of a novel, automated procedure logging (APL) system on the number of procedures logged by our emergency medicine (EM) residents. We hypothesized that automating the resident procedure logging process would result in an increase in the number of procedures logged, as well as improve the completeness and accuracy of procedure logging documentation.

METHODS

Study Design
This was a before-and-after study of resident procedures logged during two 6-month time periods: October 1, 2009, to March 31, 2010 (pre-APL), and October 1, 2010, to March 31, 2011 (post-APL). This study was deemed exempt from informed consent requirements by the Boston University Medical Center Institutional Review Board.

Study Setting and Population
The study was conducted at Boston Medical Center, an urban academic emergency department (ED) with a postgraduate year (PGY) 1–4 EM residency program enrolling 12 residents per year. The hospital is a Level I trauma center, and the ED has an annual census in excess of 130,000 visits. The ED EMR is Picis ED Pulsecheck (Wakefield, MA), and all training programs at Boston Medical Center rely on New Innovations (NI; Uniontown, OH) for residency management support (i.e., case logging, evaluations, monitoring conference attendance, duty hours, and general personnel tracking).

Study Protocol
We developed APL software, which enables the seamless communication between our ED EMR and NI.




The software retrieves procedural data identified in the EMR and automatically exports it into the procedure logs of individual residents. To record a procedure prior to the implementation of APL, residents were required to log in to NI, in addition to documenting the procedure in the EMR. This extra step, which interrupted normal work flow patterns, was seen as a critical obstacle to the successful and accurate documentation of procedural experience.

The APL uses structured query language to access the EMR database and generate a procedure list. Specifically, the process relies on three distinct, yet overlapping and complementary, search strategies, which aim to ensure a complete and accurate data set: 1) while documenting a procedure in the EMR, residents can "flag" the procedure by using a drop-down menu added to the EMR procedure documentation screen (Figure 1); 2) current procedural terminology codes located in a patient chart are matched to residents who performed the procedures; and 3) a list of unstable patients who required resuscitation is generated using a combination of billing and visit data. Patient encounters are deemed "resuscitations" if they used one of three critical care/trauma bays in the ED or if the attending physician indicated on the chart that a patient should be billed for critical care time. Resuscitations are categorized as trauma or medical using natural language processing of the chief complaint, diagnosis, and admitting service. For example, trauma resuscitations are identified by chief complaints and/or diagnoses such as "burn," "fall," "fracture," "gunshot wound," "motor vehicle accident," "pedestrian struck," or "stab wound."

Prior to the APL development, residents were required to use the EMR procedure-charting screens to document procedures for the medical record and facilitate billing. However, the EMR did not store this information in its relational database. As a result, there was no ability to link these data to individual resident procedure logs, requiring residents to separately log procedures in NI. To minimize the effect on resident workflow, a new and simple drop-down menu was added to the existing EMR procedure-charting screen (Figure 1).
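The trauma-versus-medical categorization described above amounts to keyword matching against free-text visit fields. As a minimal illustrative sketch only, such a rule could be expressed as follows in Java, the language the APL was written in; the class, method, and field names are hypothetical, and the paper does not publish the APL source or its exact matching rules.

```java
import java.util.List;
import java.util.Locale;

// Hypothetical sketch of the keyword-based trauma/medical categorization described
// in the Study Protocol; names and matching rules are assumptions, not the APL source.
public class ResuscitationClassifier {

    // Trauma-related terms listed as examples in the text.
    private static final List<String> TRAUMA_KEYWORDS = List.of(
            "burn", "fall", "fracture", "gunshot wound",
            "motor vehicle accident", "pedestrian struck", "stab wound");

    /** Classify a resuscitation encounter as "trauma" or "medical". */
    public static String classify(String chiefComplaint, String diagnosis, String admittingService) {
        // Scan the chief complaint, diagnosis, and admitting service as one lowercase string.
        String text = String.join(" ",
                nullToEmpty(chiefComplaint), nullToEmpty(diagnosis), nullToEmpty(admittingService))
                .toLowerCase(Locale.ROOT);
        for (String keyword : TRAUMA_KEYWORDS) {
            if (text.contains(keyword)) {
                return "trauma";
            }
        }
        return "medical";
    }

    private static String nullToEmpty(String s) {
        return s == null ? "" : s;
    }

    public static void main(String[] args) {
        // Example: a chart documenting a pelvic fracture is categorized as a trauma resuscitation.
        System.out.println(classify("pedestrian struck by car", "pelvic fracture", "trauma surgery"));
    }
}
```

The published description folds the admitting service into the same natural language processing step; treating it as part of one keyword scan, as above, is an assumption rather than the authors' documented approach.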

Figure 1. EMR procedure charting screen showing how up to three procedures may be flagged for logging using the drop-down menus at the bottom. Procedures that the resident observed or supervised, but did not perform, are indicated by checking "Procedure(s) observed." EMR = electronic medical record.


This enables residents to simultaneously complete charting and log procedural data. A checkbox allows residents to indicate whether they performed, supervised, or observed the procedure. Entries in this new procedure logging section are stored in the EMR database, allowing the APL to generate reports on all procedures recorded.

Procedure data sets from each of the above methods are combined, and duplicates are automatically removed. Data are transmitted securely via the Internet to NI. Any errors that occur during importation are reported via an automatically generated e-mail to the system administrator. The APL was written in the Java programming language and was designed to run on all major platforms, including Windows, Mac OS, and Linux.

Data Analysis
The procedure logs of all EM residents were analyzed from the pre- and post-APL time periods. All data from the pre-APL period came from logs manually entered by residents; post-APL data were a combination of procedure logs automatically generated via the new software and any logs manually entered into NI during that time period. The mean daily number of procedures performed in each period was calculated, and the average number of procedures logged per day was compared pre- and post-APL using a two-sample t-test.

A secondary outcome compared the pre- and post-APL procedure logs for completeness and accuracy. Completeness was defined as inclusion of all of the following required data elements: date, medical record number, birthdate, and procedure type. Accuracy was defined as the absence of errors in all of the required elements. Records were excluded from the pre-APL sample of logs analyzed for completeness and accuracy if they were performed during off-service rotations (e.g., obstetrics), in an ED other than our university-affiliated one (e.g., a community ED in which the residents rotate), or in a simulation laboratory, as there was no way for the investigators to assess the accuracy of data recorded.

From preliminary data we estimated the proportion of complete records pre-APL to be 0.77 (95% confidence interval [CI] = 0.70 to 0.83). To be conservative, we used the lower bound of the CI, 0.70, to determine an appropriate sample size. It was determined that 231 records pre- and post-APL were needed to detect a 10% increase from 0.70 with 80% power. To detect a 10% change from the baseline accuracy rate of 87%, we needed 112 records pre- and post-APL. The sample size needed for the accuracy rate was lower than that needed for completeness; as a result, the sample size for completeness was used for analysis.

A random sample of 231 procedure logs was generated from both the pre- and post-APL time periods, for a total of 462 records, and the completeness and accuracy of the records were reviewed by a single investigator (TS). A second trained investigator (AW), using explicit criteria for coding, independently reviewed a random sample of 10% of these records to determine inter-rater reliability. A kappa coefficient was subsequently calculated. We compared accuracy and completeness of pre- and post-APL procedure logs using Fisher's exact test.
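For context, the stated requirement of 231 records per period is consistent with a standard two-proportion sample-size formula under assumed parameters: baseline completeness p1 = 0.70, detectable completeness p2 = 0.80, power 0.80, and a one-sided α of 0.05. The exact settings the authors used are not reported, so the following is a reconstruction rather than their calculation:

```latex
n = \frac{\left( z_{1-\alpha}\sqrt{2\bar{p}\,\bar{q}} + z_{1-\beta}\sqrt{p_1 q_1 + p_2 q_2} \right)^{2}}{(p_2 - p_1)^{2}},
\qquad \bar{p} = \tfrac{p_1 + p_2}{2} = 0.75,\ \bar{q} = 1 - \bar{p}

n = \frac{\left( 1.645\sqrt{2(0.75)(0.25)} + 0.842\sqrt{(0.70)(0.30) + (0.80)(0.20)} \right)^{2}}{(0.10)^{2}} \approx 231.
```

With a two-sided α of 0.05 the same formula gives roughly 293 per group, which is why the one-sided assumption is flagged here as a reconstruction.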


Table 1. Daily Average Procedure Counts and Percentage Increase by Procedure Type

Procedure Name                        Pre-APL   Post-APL   % Increase
Nasopharyngoscopy                       0.03      0.27        900
Patient follow-up                       0.74      4.32        487
Pediatric medical resuscitation*        0.11      0.52        375
Adult medical resuscitation*            1.69      7.03        317
Adult trauma resuscitation*             0.92      3.77        309
Nerve block                             0.19      0.52        171
Pediatric trauma resuscitation*         0.14      0.32        127
Intubation*                             0.64      1.42        121
Procedural sedation*                    0.24      0.52        116
Dislocation reduction*                  0.32      0.66        109
Cardiac pacing*                         0.05      0.09         89
Cardioversion                           0.08      0.14         86
Incision and drainage                   0.77      1.41         83
Arthrocentesis                          0.16      0.27         69
Casting and splinting                   0.21      0.32         55
Central venous access*                  0.64      0.95         48
Laceration repair                       1.96      2.85         45
Pericardiocentesis*                     0.03      0.04         33
Lumbar puncture*                        0.79      0.98         25
Chest tube*                             0.29      0.36         25
Total                                  10.0      26.8         168

*Procedure with ACGME-specified target.
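The percentage increases in Table 1 are relative to the pre-APL daily averages; for instance, the Total row follows directly from the two period means:

```latex
\frac{26.8 - 10.0}{10.0} \times 100\% = 168\%.
```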

For all analyses, we considered those procedures with specific recommended numbers found in the ACGME guidelines,10 as well as several other procedures that the authors felt were important to EM training (Table 1). Vaginal deliveries and cricothyrotomies were excluded from analysis due to curriculum changes in the training program that occurred between the two study periods, preventing a valid and appropriate comparison.

Finally, we examined resident compliance with the APL system. Resident compliance was defined as the percentage of residents who used APL to log at least 90% of their ED procedures in the post-APL period. All data analysis was performed using SAS 9.2 (SAS Institute, Inc., Cary, NC). Results are reported with 95% CIs and p-values.

RESULTS

Residents logged an average of 168% more procedures per day during the postautomation period: 10.0 versus 26.8 (mean difference = 16.8, 95% CI = 15.4 to 18.2, p < 0.001). Postautomation procedure logs were more likely to be complete (76% vs. 100%, p < 0.001) and accurate (87% vs. 99%, p < 0.001). Errors in the preautomation group included incorrect medical record number (16), procedure date (7), date of birth (4), and procedure type (3). Errors in the postautomation group were the result of EMR data entry errors made by residents, resulting in incorrect procedure type (2), and of incorrect categorization of a resuscitation by APL (1).


Inter-rater reliability was excellent for both completeness and accuracy, with perfect agreement between the two reviewers for completeness and only one record discordant for accuracy (κ = 0.85, 95% CI = 0.55 to 1.00). The number of active residents (48) was the same during the pre- and post-APL periods; ED volume was also similar between the two periods. Most residents (42 of 48, 88%) used APL to log at least 90% of procedures. Only 4% (174 of 4,249) of procedures eligible for automation were logged manually in the post-APL period.

DISCUSSION

We report a significant increase in the daily average number of procedures logged after the implementation of APL, suggesting that the new system facilitated resident procedure logging. In addition to increasing the number of procedures logged, we found that APL significantly improved the completeness and accuracy of procedural data entered. Resident compliance with the new system was high, enabling the residency leadership to gain a much more accurate understanding of the true procedural experience of residents. This novel software solution to a problem common to many residency training programs addresses concerns that resident procedure logging may not accurately reflect true practice.

While the EM RRC mandates that the procedural experience of residents be assessed, tracked, and monitored, there is no specific requirement as to how these data are obtained or recorded. Although previous studies have discussed the benefits of computerized tracking of resident procedures, PDA- and Web-based procedure tracking systems are suboptimal in that they require residents either to carry (and learn to use) another device or to access another software application through an additional log-in. Studies of these systems report poor compliance.6–9 In contrast, our compliance rate suggests residents successfully integrated APL into their normal work flow.

To our knowledge, APL represents the first reported attempt to automate and integrate procedure logging with EMR documentation. APL is unique in that it minimally alters the typical resident work flow, eliminates duplicate data entry, and requires very little active resident participation for the successful logging of a performed procedure. As increasing demands are placed on the time of both residents and residency leadership, any interventions that facilitate operational efficiency are of value to the field of EM.

LIMITATIONS

This study was performed at a single institution, which may limit the generalizability of its findings. The software was specifically developed to enable communication between Picis ED Pulsecheck and NI. The ability to replicate the process with other EMRs and procedure logging applications is unknown; however, the system can export the procedure data into an Excel spreadsheet, perhaps enabling an interface with other procedure logging applications.




Other training programs could collaborate with their own information technology departments to develop comparable integrated applications. It should be noted that 280 EDs nationwide use Picis ED Pulsecheck11 and 67% of EM training programs use NI.12

Procedures that are performed on off-service rotations, in our simulation laboratory, or in other EDs in which the residents rotate (e.g., a community ED affiliate that uses a different EMR) are not captured by APL. Although two identical 6-month periods were chosen for comparison and there was no appreciable change in patient volume, there is a small possibility that the measured change in procedure documentation was not a result of APL, but rather of a change in the frequency of performed procedures. Finally, as the study periods involve 2 academic years, the residents pre- and post-APL were not exactly the same (12 graduates were replaced by 12 new interns). However, there is no reason to suspect that the total number of procedures would change from year to year.

CONCLUSIONS

There was a significant increase in the daily mean number of procedures logged after the implementation of automated procedure logging. Recorded data were also more complete and more accurate during this time frame. This innovative system improved resident logging of required procedures and helped our assessment of the ACGME Patient Care and Practice-Based Learning competencies for individual residents. Other training programs might benefit from a similar system.

The authors would like to acknowledge Mark B. Mycyk, MD, and James A. Feldman, MD, MPH, for their guidance in the preparation and review of the manuscript, and Layla Rahimi, BA, for administrative support.

References

1. U.S. Government. American Recovery and Reinvestment Act of 2009, Division A, Title XIII. Available at: http://en.wikisource.org/wiki/American_Recovery_and_Reinvestment_Act_of_2009/Division_A/Title_XIII. Accessed Jul 28, 2011.
2. Blumenthal D, Tavenner M. The "meaningful use" regulation for electronic health records. N Engl J Med. 2010; 363:501–4.
3. Bates DW, Leape LL, Cullen DJ, et al. Effect of computerized physician order entry and a team intervention on prevention of serious medication errors. JAMA. 1998; 280:1311–6.
4. Kawamoto K, Houlihan CA, Balas EA, et al. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005; 330:765.
5. Carter W, Meyer L. Session 21–specialty update: emergency medicine. Accreditation Council for Graduate Medical Education (ACGME) 2011 Annual Education Conference. Nashville, TN, March 3–6, 2011.
6. Dire DJ, Kietzman LI. A prospective survey of procedures performed by emergency medicine residents during a 36-month residency. J Emerg Med. 1995; 13:831–7.


7. Langdorf MI, Montague BJ, Bearie B, et al. Quantification of procedures and resuscitations in an emergency medicine residency. J Emerg Med. 1998; 16:121–7.
8. Garvin R, Otto F, McRae D. Using handheld computers to document family practice resident procedure experience. Fam Med. 2000; 32:115–8.
9. Langdorf MI, Strange G, Macneil P. Computerized tracking of emergency medicine resident clinical experience. Ann Emerg Med. 1990; 19:764–73.


10. Accreditation Council for Graduate Medical Education. Emergency Medicine Guidelines. Available at: http://www.acgme.org/acWebsite/RRC_110/110_guidelines.asp. Accessed Jul 28, 2011.
11. Picis, Inc. Press Releases. Available at: http://www.picis.com/news/press_releases.aspx. Accessed Apr 20, 2011.
12. New Innovations, Inc. News and Market Stats. Available at: http://www.new-innov.com/pub/news.aspx. Accessed Jul 28, 2011.