Initial Steps toward Validating and Measuring the Quality of Computerized Provider Documentation

Kenric W. Hammond, MD^1; Efthimis N. Efthimiadis, PhD^2; Charlene R. Weir, PhD^3; Peter J. Embi, MD^4; Stephen M. Thielke, MD^1; Ryan M. Laundry^2; Ashley Hedeen, MD^1

^1 VA Puget Sound Health Care System, Seattle, WA; ^2 University of Washington, Seattle, WA; ^3 George E. Wahlen VA Medical Center, Salt Lake City, UT; ^4 University of Cincinnati, Cincinnati, OH
Abstract

Background: Concerns exist about the quality of electronic health care documentation. Prior studies have focused on physicians. This investigation studied document quality perceptions of practitioners (including physicians), nurses and administrative staff. Methods: An instrument developed from staff interviews and literature sources was administered to 110 practitioners, nurses and administrative staff. Short, long and original versions of records were rated. Results: Length transformation did not affect quality ratings. On several scales, practitioners rated notes less favorably than administrators or nurses. The original source document was associated with the quality rating, as was tf·idf, a relevance statistic computed from document text. Tf·idf was strongly associated with practitioner quality ratings. Conclusion: Document quality estimates were not sensitive to modifying redundancy in documents. Some perceptions of quality differ by role. Intrinsic document properties are associated with staff judgments of document quality. For practitioners, the tf·idf statistic was strongly associated with the quality dimensions evaluated.

Background

Narrative documentation of patient care is increasingly prominent in electronic health record (EHR) systems. Compared to paper records, electronic documents allow quick access and efficient review by multiple users in multiple locations, and increasingly, the benefit of better availability is seen to justify the cost of creating and managing electronic patient care documentation. Direct input requires valuable clinician time, and alternatives such as dictation and voice recognition are also costly. Electronic documentation does more than assist clinical decision-making by affording physician and nurse access to patient records [1]; it also assists real-time quality and utilization review, billing, workload management and research. Document creation is enlisted to implement quality improvement policy via structured inputs, checks and clinical reminders. Adherence to documentation standards is essential for facility accreditation, and compliance review comprises much of the work of health information management departments.

Computerized provider^a documentation (CPD) systems exert a strong and sometimes disruptive influence on workflow patterns and document production [1, 3], and some aspects of document quality may have been compromised in the transition to electronic systems [4-6]. Cognitive approaches to characterizing document quality have begun to complement traditional quality reviews based on regulatory compliance. Improved understanding of human factors has the potential to inform interface design and lead to more effective CPD systems. One analysis of provider interviews identified five factors influencing satisfaction with clinical documentation work: Time Efficiency, Availability, Expressivity, Structure and Quality [7]. Another study, using paired adjectives presented to practitioners, identified four document features consistently associated with the perception of quality: Well-formed, Comprehensible, Accurate and Compact [8].

In the present investigation, we interviewed VA practitioners (physicians, nurse practitioners and physician assistants), nurses (RN and LPN), and administrative staff (billers, coders, medical information specialists and quality assurance staff) who work with CPD and elicited their opinions about using documentation in their typical work. Qualitative analysis identified grounded concepts of document quality, which we incorporated in an instrument. We will describe instrument development, testing methods, and initial results.
^a Defined in 45 CFR 160.103, p. 701, to include "…any other person or organization who furnishes, bills, or is paid for health care in the normal course of business" [2].
Methods

Authorization to view and analyze patient data and to recruit VA staff interview and test subjects was granted by the Institutional Review Boards at the Puget Sound and Salt Lake City VA facilities. Quality concepts related to documentation were developed from theme analysis of transcripts from fourteen focus groups conducted in 2007 and 2008 at four Department of Veterans Affairs (VA) sites. We conducted guided semi-structured interviews of groups of VA practitioners, nurses and administrative personnel who used the VA EHR system, 129 subjects overall. Subjects reported examples of benefits and challenges encountered when navigating and working with CPD. Core themes, as well as concepts of document quality identified by others [7, 8], were extracted. Ten instrument items based on identified themes were created using the language of interviewees, expressed as semantic differentials, and presented as response items (Table 1).

1. "This note doesn't at all tell me what's going on with the patient" vs. "This note fully tells me what's going on with the patient"
2. "It's very difficult to skim to important information in this note" vs. "It's very easy to skim to important information in this note"
3. "It's very difficult to distinguish the author's text from template text" vs. "It's very easy to distinguish the author's text from template text"
4. "This note doesn't at all help me anticipate the needs of the patient" vs. "This note fully helps me anticipate the needs of the patient"
5. "I always skip over this sort of note" vs. "I always read this sort of note"
6. "I can't at all follow what the author was really thinking in this note" vs. "I can fully follow what the author was really thinking in this note"
7. "I have to wade through this note to get what's important to me" vs. "I don't have to wade through this note to get what's important to me"
8. "This note is incomplete (for this type of note)" vs. "This note is complete (for this type of note)"
9. "I don't at all trust the information in this note" vs. "I fully trust the information in this note"
10. "This note is not at all consistent with the overall clinical picture" vs. "This note is fully consistent with the overall clinical picture"
Table 1. Quality Dimensions expressed as semantic differentials.

Twelve test documents covering inpatient and outpatient episodes were developed from inpatient and outpatient records of a single VA patient judged to be typical on the basis of gender, problem complexity and age. Documents were chosen to represent a range of themes, types and authors. The patient's notes and data were manually de-identified following a procedure approved by the VA Puget Sound Privacy Officer, removing all names and HIPAA identifiers, and altering all time-stamps.
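For analysis, each Table 1 item can be represented as a pair of anchors together with a numeric response. The minimal Python sketch below is illustrative only; the 1-to-5 response scale and the record_rating helper are assumptions for illustration, not details taken from the study.

from dataclasses import dataclass

@dataclass
class SemanticDifferentialItem:
    """One Table 1 item: a negative anchor paired with a positive anchor."""
    negative: str
    positive: str

# Two of the ten Table 1 items, quoted verbatim; the remaining eight follow the same pattern.
ITEMS = [
    SemanticDifferentialItem(
        "This note doesn't at all tell me what's going on with the patient",
        "This note fully tells me what's going on with the patient"),
    SemanticDifferentialItem(
        "It's very difficult to skim to important information in this note",
        "It's very easy to skim to important information in this note"),
]

def record_rating(item_index: int, rating: int) -> dict:
    # Assumed 1-5 scale: 1 = negative anchor fully applies, 5 = positive anchor fully applies.
    if not 1 <= rating <= 5:
        raise ValueError("rating must be between 1 and 5")
    return {"item": item_index, "rating": rating}

print(record_rating(1, 4))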
Because the focus groups viewed redundant text negatively, we hypothesized that greater amounts of template-derived boilerplate text and inserted patient data (e.g., lab results, medications and problem lists) in test notes would result in lower quality judgments. To systematically investigate this, the test documents were prepared in three versions: short, long or original. The short transformation eliminated boilerplate and redundant inserted data, the long transformation maximized inserted data, and the original document was left unchanged. Specifics of the transformations are shown in Table 2.

1. Original form to Short form:
   a. Suppress headers and footers.
   b. Display abnormal labs, but otherwise suppress inserted lab, medication and problem lists.
   c. One-line display of vital signs.
   d. Summarize boilerplate lists of normal findings with the phrase "within normal limits".
2. Original form to Long form:
   a. Insert problem, lab and medication lists in provider notes, in the longest available format.
   b. Display vital signs in 8-line format.

Table 2. Test document transformations.
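As an illustration only, the short-form rules in Table 2 can be approximated with simple text processing. The Python sketch below is a hypothetical rendering of rules 1a, 1c and 1d; the header, vital-sign and normal-findings patterns are assumptions rather than the patterns used on the actual VA notes, and rule 1b is omitted because it requires structured laboratory results.

import re

# Hypothetical patterns standing in for the Table 2 "short form" rules.
HEADER_FOOTER = re.compile(
    r"^(LOCAL TITLE:|STANDARD TITLE:|DATE OF NOTE:|/es/).*$", re.MULTILINE)      # rule 1a
NORMAL_FINDINGS = re.compile(
    r"(?:^[\w ./-]+:\s*(?:normal|negative|clear|unremarkable)\s*$\n?){2,}",
    re.IGNORECASE | re.MULTILINE)                                                # rule 1d
VITALS_BLOCK = re.compile(
    r"(?:^(?:BP|Pulse|Resp|Temp|Wt|Ht|Pain):.*\n?){2,}", re.MULTILINE)           # rule 1c

def shorten_note(text: str) -> str:
    # Apply an approximation of the Table 2 short-form transformation.
    text = HEADER_FOOTER.sub("", text)                                     # 1a: suppress headers and footers
    text = NORMAL_FINDINGS.sub("Findings within normal limits.\n", text)   # 1d: summarize boilerplate normals
    text = VITALS_BLOCK.sub(lambda m: " ".join(m.group(0).split()) + "\n", text)  # 1c: one-line vital signs
    return text

The long transformation works in the opposite direction, inserting problem, lab and medication lists in their longest available format and displaying vital signs in the 8-line format.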
Quality dimension ratings: F and p values for effects of document Form, rater Role, Source Note, and tf·idf (NS = not significant; – = not available).

Quality Dimension | Form (long/original/short) (F, p) | Role (F, p) | Source Note (F, p) | tf·idf (all roles) (F, p) | tf·idf Admin (p) | tf·idf Nurse (p) | tf·idf Pract (p)
1 "tells what's going on" | 1.45, 0.2348 | 13.64, *** | 30.18, *** | 65.96, *** | .0101 | .0008 | ***
2 "skim" | 0.80, 0.4502 | 13.80, *** | 16.84, *** | 49.74, *** | .014 | .28 NS | ***
3 "distinguish author text" | 2.39, 0.0924 | 13.52, *** | 8.85, *** | 29.80, *** | .22 NS | .48 NS | ***
4 "anticipate patient need" | 2.06, 0.1273 | 7.89, 0.0004 | 24.38, *** | 62.03, *** | .0111 | .023 | ***
5 "skip" | 1.79, 0.17, NS | 10.71, *** | 58.65, *** | 77.64, *** | *** | .14 NS | ***
6 "I can follow the author's thinking" | 2.46, 0.0860 | 3.04, 0.0482 | 26.39, *** | 71.09, *** | .0104 | .0442 | ***
7 "wade" | 0.73, 0.4800 | 8.21, 0.0003 | 16.49, *** | 53.93, *** | .0323 | *** | –
8 "complete" | 2.84, 0.0586 | 1.21, 0.30, NS | 19.00, *** | 15.24, *** | .0141 | .0001 | –
9 "trust" | 0.77, 0.4642 | 10.56, *** | 13.48, *** | 22.70, *** | .0804 NS | .6793 NS | .31 NS
10 "consistent" | 1.77, 0.1701 | 4.23, 0.0148 | 21.11, *** | 36.46, *** | .0443 | *** | –

Mean ratings by role with pairwise comparisons (A, N, P = administrative, nurse and practitioner mean ratings; – = not available).

Quality Dimension | A | N | P | Comparison; p
1 "tells what's going on" | 3.83 | 3.72 | 3.20 | A>P and N>P; ***
2 "skim" | 3.95 | 3.88 | 3.36 | A>P and N>P; ***
3 "distinguish author text" | 3.62 | 3.47 | 3.14 | A>P and N>P; ***
4 "anticipate patient need" | 3.80 | 3.71 | 3.25 | A>P; 0.0001, N>P; 0.0095
6 "follow author's thinking" | 3.62 | 3.50 | 3.10 | A>P; ***, N>P; 0.0004
7 "wade" | – | – | – | A>P; 0.0002, N>P; 0.0004
9 "trust" | 4.03 | 3.82 | 3.49 | A>P; ***, N>P; 0.0073
10 "consistent" | 3.85 | 3.64 | 3.51 | A>P; 0.0035, N>P; 0.2676
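For reference, tf·idf is the term frequency–inverse document frequency weight used in information retrieval. A standard formulation for a term t in a document d drawn from a collection D of N documents is

    tf·idf(t, d) = tf(t, d) × log( N / df(t) ),

where tf(t, d) is the number of occurrences of t in d and df(t) is the number of documents in D containing t. A document-level statistic can be obtained by aggregating these weights over the terms of a note (for example, by summing or averaging); the aggregation described here is illustrative rather than the exact computation applied to the test documents.

The F and p values above are consistent with analysis-of-variance F tests over the ratings. The Python sketch below is purely illustrative, using synthetic data and a simple fixed-effects model with the factors named in the table; it is not the authors' actual model, which may have handled repeated ratings differently.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic example data: one row per (rater, note) rating.
rng = np.random.default_rng(0)
n = 300
data = pd.DataFrame({
    "rating": rng.integers(1, 6, n),                        # assumed 1-5 response scale
    "form": rng.choice(["short", "original", "long"], n),   # document transformation
    "role": rng.choice(["admin", "nurse", "practitioner"], n),
    "note_id": rng.integers(1, 13, n),                      # 12 source notes
    "tfidf": rng.normal(size=n),                            # document-level tf-idf statistic
})

# Fixed-effects ANOVA/ANCOVA with the factors shown in the table above.
model = smf.ols("rating ~ C(form) + C(role) + C(note_id) + tfidf", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))                      # F and p value for each term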