Radiography xxx (2014) 1–5
Reducing image interpretation errors – Do communication strategies undermine this?

B. Snaith a,*, M. Hardy b, E.F. Lewis a

a Radiology Department, Mid Yorkshire Hospitals NHS Trust, Wakefield, UK
b School of Health, University of Bradford, Bradford, UK
Article info

Article history: Received 23 January 2014; received in revised form 11 March 2014; accepted 14 March 2014; available online xxx.

Keywords: X-ray; Radiographer; Interpretation; Error; Red dot; Emergency department

Abstract

Introduction: Errors in the interpretation of diagnostic images in the emergency department are a persistent problem internationally. To address this issue, a number of risk reduction strategies have been suggested but only radiographer abnormality detection schemes (RADS) have been widely implemented in the UK. This study considers the variation in RADS operation and communication in light of technological advances and changes in service operation.
Methods: A postal survey of all NHS hospitals operating either an Emergency Department or Minor Injury Unit and a diagnostic imaging (radiology) department (n = 510) was undertaken between July and August 2011. The questionnaire was designed to elicit information on emergency service provision and details of RADS.
Results: 325 questionnaires were returned (n = 325/510; 63.7%). The majority of sites (n = 288/325; 88.6%) operated a RADS, with most (n = 227/288; 78.8%) employing a visual ‘flagging’ system as the only method of communication, although the symbols used were inconsistent and contradictory across sites. 61 sites communicated radiographer findings through a written proforma (paper or electronic), but this was run in conjunction with a flagging system at 50 sites. The majority of sites did not have guidance on the scope or operation of the ‘flagging’ or written communication system in use.
Conclusions: RADS is an established clinical intervention to reduce errors in diagnostic image interpretation within the emergency setting. The lack of standardisation in communication processes and practices alongside the rapid adoption of technology has increased the potential for error and miscommunication.
© 2014 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
Introduction

Errors in the interpretation of diagnostic images in the emergency department (ED) are a persistent problem internationally.1–4 Where injuries remain undiagnosed, or a delay to injury diagnosis is experienced as a consequence of interpretive error, patients may be predisposed to long term morbidity4,5 and organisations to the potential of litigation.6 Similarly, the unnecessary treatment of ‘normal’ conditions, whilst not resulting in preventable morbidity, may impact on the lifestyle and psychological experience of the patient. Importantly, with increasing financial constraints being applied to healthcare, the overtreatment of patients also results in unnecessary resource utilisation, although few authors have
* Corresponding author. Radiology Department, Mid Yorkshire Hospitals NHS Trust, Aberford Road, Wakefield WF1 4DG, UK. Tel.: +44 01924 542034. E-mail address: [email protected] (B. Snaith).
considered this directly.7–9 To address these issues, a number of strategies to reduce the risk of diagnostic error in the ED have been suggested, including senior medical review of images,4 immediate radiology reporting5,6 and initial evaluation of images by the examining radiographer.7–9 With increasing ED attendances and staff shortages placing unprecedented pressures on ED services,10 the opportunity for senior medical scrutiny of diagnostic images may be limited.11 Likewise, whilst immediate radiology reporting has been shown to be clinically and cost effective,9,12 widespread implementation has not been achieved, presumably due to the overall rise in radiology activity and competing pressures. As a result, the only widely implemented intervention to date in the UK has been review of diagnostic images by the examining radiographer and immediate communication of these findings to the ED clinician to assist clinical diagnosis.13 Often described as a radiographer abnormality detection scheme (RADS), this system offers a second pair of eyes in the image review process but does not replace the review and interpretation of images by the treating
http://dx.doi.org/10.1016/j.radi.2014.03.006 1078-8174/© 2014 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
Please cite this article in press as: Snaith B, et al., Reducing image interpretation errors – Do communication strategies undermine this?, Radiography (2014), http://dx.doi.org/10.1016/j.radi.2014.03.006
clinician and ultimate definitive radiology report. Initially developed in the UK, RADS have now been implemented in approximately 85%–90% of UK hospitals and acute care centres13,14 and are increasingly being adopted internationally.15,16 Two common methods of radiographer communication through RADS have been reported. The first is the use of a visual flag (symbol) on the diagnostic image or electronic system (the colloquial ‘red dot system’17) to highlight abnormal findings, and the second is communication via an electronic or paper comment proforma13,18 requiring the radiographer to write a description of their image observations. A number of studies have evaluated the success of RADS in reducing ED interpretive errors.18 Yet despite initial implementation in the 1980s and subsequent multidisciplinary acceptance and widespread adoption, the absence of RADS guidance or standards resulted in UK hospitals developing local systems in response to patient pathways and clinician preferences.13 Prior to the introduction of digital imaging technologies, these locally driven approaches were a relatively effective and safe method of communication. However, advances in technology over the last decade have revolutionised imaging services and the filmless environment now provides greater opportunities for image sharing both within and across organisations. This rapid introduction of digital imaging technologies did not overtly consider operational needs beyond image acquisition and radiology reporting. Consequently, opportunities to standardise RADS were overlooked and individual organisations were once again left to implement RADS communication in an ad hoc way.19 Due to the differences in working practices associated with digital imaging advancements (e.g.
remote image review, telemedicine, electronic image transfer to specialist hospitals), these inconsistencies in RADS communication may now pose a threat to patient safety and service quality, although the potential size of any risk remains uncertain as no study detailing the variety of approaches to RADS communication has been published.

Objectives

This article uses data from a UK survey of imaging departments within hospitals providing emergency care services to identify the prevalence of RADS and variation in the communication methods used. While the results of this study represent UK practice, the findings have significant implications internationally as many countries have implemented, or are planning to implement, RADS in mainstream practice.15,16,20

Method

Data collection and analysis comprised a national postal survey undertaken between July and August 2011. Inclusion was restricted to NHS hospitals operating either an ED or Minor Injury Unit (MIU) and a diagnostic imaging (radiology) department (n = 510). The sample was compiled from the UK Government ED Statistics and National Hospital databases (Health and Social Care in Northern Ireland 2011; Health in Wales 2011; Hospital Episode Statistics 2011; The Scottish Government, 2011) and facilities were confirmed through hospital websites. Where uncertainty remained, telephone contact was made with the individual hospital to confirm eligibility for inclusion. Surveys were addressed to the lead ED/MIU radiographer. The questionnaire (available from the authors) was designed to elicit information on the type of emergency service provided and, where in operation, details of RADS including method of communication, anatomical scope and terminology adopted. In addition, all sites using an electronic or paper comment proforma were invited to return an example with the questionnaire. The survey was piloted prior to distribution to ensure question accuracy,
appropriateness and relevance. No postal reminders were employed as the response rate was comparable to that of a similar study13 and analysis of non-response bias,21 using both analysis of characteristics and methods aligned to continuum of resistance theory, suggested sample representativeness. The pre-coded quantitative responses were analysed using Microsoft Excel; open-ended responses were analysed using the original questionnaire items as a framework and grouped under broad themes. The survey was considered to represent service evaluation and therefore did not require ethical approval.

Results

We received completed questionnaires from 325 respondents (n = 325/510; 63.7%). Responses reflected the total sample in terms of emergency service provided, with 58.8% (n = 191/325) being from centres with an ED. 288 sites (n = 288/325; 88.6%) operated a RADS, with the majority (n = 227/288; 78.8%) employing a ‘flagging’ system as the only method of communication. Variation in system operation and anatomical scope was evident (Table 1).

Flagging systems

Of the 277 sites that operated a ‘flagging’ system, 248 (89.5%) marked the image directly, most commonly by annotating the term ‘red dot’ (n = 142/248; 57.3%), although other annotations and a wide range of symbols were reported (Table 2). 29 of the responding sites indicated the annotation (flag) is placed adjacent to the site of the injury, but gave no indication of how multiple injury sites are managed. Importantly, only 53.8% (n = 149/277) of sites reported having RADS guidelines in place and the majority of sites (n = 197/277; 71.1%) considered radiographer participation to be voluntary. Inconsistency in RADS operation and application of annotations and symbols by radiographers within individual departments or organisations was highlighted in a number of textual responses.

“Variation [exists] between radiographers, some add? some just write red dot” Respondent 58

Table 1. Description of RADS.
RADS description                                          Responses no. (%)
RADS system employed
  Flagging system (red dot or similar) only               227/288 (78.8%)
  Commenting system only                                  11/288 (3.8%)
  Both flagging and commenting systems operated           50/288 (17.4%)
Flagging system employed to indicate:
  Normal appearances                                      8/277 (2.9%)
  Uncertain appearances                                   235/277 (84.8%)
  Abnormal appearances                                    273/277 (98.6%)
  Not stated                                              1/277 (0.4%)
Flagging system: anatomical scope
  Musculoskeletal examinations only                       168/277 (60.6%)
  Musculoskeletal and chest examinations                  35/277 (12.6%)
  All radiographic examinations                           73/277 (26.4%)
  Not stated                                              1/277 (0.4%)
Commenting proforma employed to indicate:
  Normal appearances                                      28/61 (45.9%)
  Uncertain appearances                                   50/61 (82.0%)
  Abnormal appearances                                    57/61 (93.4%)
  Not stated                                              2/61 (3.3%)
Commenting proforma: anatomical scope
  Musculoskeletal examinations only                       35/61 (57.4%)
  Musculoskeletal and chest examinations                  11/61 (18.0%)
  All radiographic examinations                           13/61 (21.3%)
  Not stated                                              2/61 (3.3%)
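The percentages reported above are simple proportions of the relevant denominators (510 surveyed sites, 325 respondents, 288 RADS sites). As a minimal illustrative sketch (not part of the study's Excel analysis; the `pct` helper name is an assumption), the headline figures can be re-derived and cross-checked as follows:

```python
# Re-derive the headline proportions reported in the survey results.
# Counts are taken from the text and Table 1; pct() is an illustrative helper.

def pct(numerator: int, denominator: int) -> float:
    """Percentage rounded to one decimal place, matching the reporting style."""
    return round(100 * numerator / denominator, 1)

# Response rate: 325 completed questionnaires from 510 eligible sites.
assert pct(325, 510) == 63.7

# RADS prevalence: 288 of 325 responding sites operated a scheme.
assert pct(288, 325) == 88.6

# Communication method among the 288 RADS sites (Table 1, first group).
assert pct(227, 288) == 78.8   # flagging system only
assert pct(11, 288) == 3.8     # commenting system only
assert pct(50, 288) == 17.4    # both systems operated

# The three method counts partition the 288 RADS sites exactly,
# which also explains the 277 (227 + 50) flagging and 61 (11 + 50)
# commenting denominators used elsewhere in the results.
assert 227 + 11 + 50 == 288
assert 227 + 50 == 277
assert 11 + 50 == 61
```

This also makes explicit why the flagging-system denominators switch from 288 to 277: the 50 sites operating both systems are counted in both subgroups.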
Table 2. Variations in RADS ‘flags’ across UK hospitals.
“Participation in scheme mandatory but not always adhered to!” Respondent 462

“at main site [we write] ‘red dot’ on image – at MIU [we use] opinion slips” Respondent 205

A small number of respondents indicated that the ‘flagging’ system had been suspended as a result of workload pressures or anxiety over its implementation and operation, both within radiology:

“‘red dot’ system was stopped to avoid confusion as not every radiographer participated” Respondent 385

and within the ED:

“We used to operate a ‘red dot’ until 2 months ago when a doctor during a SUI [serious untoward incident] investigation used the lack of red dot on a chest x-ray as mitigating circumstances for why he missed the lesion. Radiology service manager withdrew the scheme.” Respondent 165

“We used to run red dot but stopped as it was felt that MIU were not scrutinizing the images adequately.” Respondent 320

A further respondent suggested that even though a RADS was not in operation, radiographers placed ‘flags’ on images:

“although not on a scheme, some radiographers input ‘red dot’” Respondent 177
Sites that did not apply ‘flags’ to the image indicated radiographer findings by placing ‘flags’ on imaging referral forms, PACS, radiology/ED information systems or patient return slips (Table 2). The effectiveness of these modes of communication was questioned in a small number of open text responses.

“All A&E minors [patients] take back slip with their exam label [flag] on and realistically, I think it gets as far as the reception desk in A&E [ED] and isn’t seen by a doctor.” Respondent 148
Commenting systems

Of the 61 sites operating RADS through a comment proforma, operation guidelines were in place at 35 sites (57.4%) and radiographer participation was considered voluntary at 62.3% (n = 38/61). Comments were recorded on an electronic data system at 24 sites (39.3%) and 36 sites (59.0%) used a paper proforma, with 15 different examples being returned for evaluation. These paper proformas varied in both size (4 smaller than A5; 8 A5; 3 A4) and colour (12 white; 2 yellow; 1 pink). Communication on the majority of forms was by tick boxes (n = 10/15; 66.7%) supplemented by an area for free text, although the phrasing and terminology adopted varied. A single form also included a skeleton image to allow radiographers to indicate the site of concern. Although all proformas expected the identity of the completing radiographer to be stated (signature, initials or staff number), only 11 identified the hospital or NHS Trust. Inconsistency in proforma description or title was also noted, with the terms radiographer ‘comment’, ‘report’, ‘opinion’, ‘observation’ and ‘assessment’ being used. Despite this variation, the majority of forms (n = 11) emphasised the status of the radiographer evaluation, stating it was the opinion of the examining radiographer (n = 3/11; 27.3%), not a formal report (n = 2/11; 18.2%), or both (n = 6/11; 54.5%).
Discussion

This study is the first to consider RADS operation and participation practices from a quality and safety perspective and, without doubt, the findings paint a worrying picture. RADS remain an accepted practice and evidence of their value in reducing interpretive errors within the ED is well documented.13,22 However, much of this work was undertaken prior to widespread implementation of digital imaging systems, and reported evaluations have been of educational cohorts23,24 or from individual hospital sites/organisations.16,25,26 No identified studies have considered RADS in the light of advances in imaging technology or the impact of this technology on work and communication practices. Further, no study has considered how the wide range of digital imaging platforms and suppliers within the UK (or even within individual organisations or localities) has impacted on healthcare practices. These issues are exacerbated as imaging technology developments have been predominantly based around the American market, where image acquisition is undertaken by radiologic technologists, a profession aligned to UK radiographers but with a severely restricted contribution to clinical decision making. As a result, the adoption of new technologies without cognisance of variations in clinical practice or professional roles may have inadvertently created new hazards and increased the risk of diagnostic error through inconsistent and fragmented systems of communication. Standardisation of RADS operation is required, particularly as the electronic sharing of images between healthcare organisations increases. We must also consider the impact of increasing ED service pressures, staff shortages and rotational or short term medical and nursing cover.10,27 If ED staff are increasingly working or rotating through different sites and organisations, variation in RADS could lead to confusion and increase the risk of error.
This is particularly evident as symbols have conflicting meanings, concurrent multiple injuries are identified by a single flag (which may be adjacent to a single site) and commenting proformas vary in both appearance and type (electronic or paper). A crucial finding of this study was the lack of local RADS guidelines explaining the role, remit and operation of the scheme. This in itself will lead to inconsistencies and the potential for communication confusion and error if neither the operators nor the users of the scheme have a clear understanding of its remit and purpose. If we add to this the voluntary nature of RADS participation at the majority of hospital sites, the inconsistent anatomical scope of RADS between sites, the conflicting meaning of symbols and the variation in methods of communication adopted across and within organisations, then the potential for error is amplified. Since distribution of this survey, guidance by the UK radiography body (Society and College of Radiographers) has been published28 and offers some direction towards RADS unification, addressing questions around anatomical scope, participation, education, governance and audit. However, whilst this guidance advocates the abolition of ‘flagging’ systems in favour of written radiographer evaluations to aid transparency of communication in light of previous concerns,22 it stops short of RADS standardisation. Indeed, the guidance states that “A proforma system is recommended, either electronic or paper-based or both, and should be developed in accordance with identified clinical need locally”.28 We have yet to see what impact this guidance has had on UK clinical practice, but anecdotal evidence suggests that the use of visual flags persists as established practice. It is clear that the issues identified in this survey require multifactorial intervention aligned with macroergonomics and other aspects of human factors science.
In this digital era, systems of communication extend far beyond individual hospital boundaries and their potential reach should be taken into account in their design. However, resolution is only possible if we recognise that a problem exists, and the findings of this survey have clearly highlighted an overlooked aspect of practice in need of intervention to ensure service quality and patient safety are not unwittingly compromised.

Conclusions

RADS is an established clinical intervention to reduce errors in diagnostic image interpretation within the ED and as such is valued by both radiographers and ED staff alike. However, the lack of RADS standardisation alongside the rapid adoption of technology has increased the potential for error and miscommunication, an ironic situation given that RADS were initially introduced to reduce the risk of diagnostic errors.

Conflict of interest

None.

Acknowledgements

Philip English for assistance in data collection.

References

1. Enderson BL, Reath DB, Meadors J, Dallas W, DeBoo JM, Maull KI. The tertiary trauma survey: a prospective study of missed injury. J Trauma 1990;30:666–9.
2. Guly H. Diagnostic errors in an accident and emergency department. Emerg Med J 2001;18:263–9.
3. Wei CJ, Tsai WC, Tiu CM, Wu HT, Chiou HJ, Chang CY. Systematic analysis of missed extremity fractures in emergency radiology. Acta Radiol 2006;47:710–7.
4. Hallas P, Ellingsen T. Errors in fracture diagnoses in the emergency department – characteristics of patients and diurnal variation. BMC Emerg Med 2006;6:4.
5. Sharma H, Bhagat S, Gaine WJ. Reducing diagnostic errors in musculoskeletal trauma by reviewing non-admission orthopaedic referrals in the next-day trauma meeting. Ann Roy Coll Surg 2007;89:692–5.
6. Fileni A, Magnavita N. A 12-year follow-up study of malpractice claims against radiologists in Italy. Radiol Med 2006;111:1009–22.
7. Williams SM, Connelly DJ, Wadsworth S, Wilson DJ. Radiological review of accident and emergency radiographs: a 1-year audit. Clin Radiol 2000;55:861–5.
8. Benger JR, Lyburn ID. What is the effect of reporting all emergency department radiographs?
Emerg Med J 2003;20:40–3.
9. Hardy M, Snaith B, Scally A. The impact of immediate reporting on interpretive discrepancies and patient pathways within the emergency department: a randomized controlled trial. Br J Radiol 2013;86:20120112.
10. Flowerdew L, Brown R, Russ S, Vincent C, Woloshynowych M. Teams under pressure in the emergency department: an interview study. Emerg Med J 2012. http://dx.doi.org/10.1136/emermed-2011-200084.
11. Guly HR. Risk management. J Accid Emerg Med 1997;14:408–10.
12. Hardy M, Hutton J, Snaith B. Is a radiographer led immediate reporting service for emergency department referrals cost effective? Radiography 2013;19:23–7.
13. Snaith B, Hardy M. Radiographer abnormality detection schemes in the trauma environment – an assessment of current practice. Radiography 2008;14:475–81.
14. Price RC, Le Masurier SB. Longitudinal changes in extended roles in radiography: a new perspective. Radiography 2007;13:18–29.
15. Okeji MC, Udoh BE, Onwuzu SW. Appraisal of reporting of trauma images: implications for evolving red dot system in Nigeria. ARPN J Sci Technol 2012;2:533–5.
16. Hlongwane ST, Pitcher RD. Accuracy of after-hour ‘red dot’ trauma radiograph triage by radiographers in a South African regional hospital. S Afr Med J 2013;103:638–40.
17. Berman L, de Lacey G, Twomey E, Twomey B, Welch T, Eban R. Reducing errors in the accident department: a simple method using radiographers. Brit Med J 1985;290:421–2.
18. Brealey S, Scally A, Hahn S, Thomas N, Godfrey C, Crane S, et al. Accuracy of radiographers red dot or triage of accident and emergency radiographs in clinical practice: a systematic review. Clin Radiol 2006;61:604–15.
19. Coelho JM, Rodrigues PP. The red dot system: emergency diagnosis impact and digital radiology implementation – a review. HEALTHINF; 2011:508–11. Available from: http://repositorio.chporto.pt/handle/10400.16/842 [accessed 15.01.13].
20. McConnell J, Devaney C, Gordon M. Queensland radiographer clinical descriptions of adult appendicular musculo-skeletal trauma following a condensed education programme. Radiography 2013;19:48–55.
21. Lewis EF, Hardy M, Snaith B. Estimating the effect of nonresponse bias in a survey of hospital organizations. Eval Health Prof 2013;36:330–51.
22. Dimond B. Red dots and radiographers’ liability. Health Care Risk Rep 2000:10–2.
23. Hargreaves J, Mackay S. The accuracy of the red dot system: can it improve with training? Radiography 2003;9:283–9.
24. Hardy M, Culpan G. Accident and emergency radiography: a comparison of radiographer commenting and ‘red dotting’. Radiography 2007;13:65–71.
25. Renwick IGH, Butt WP, Steele B. How well can radiographers triage x-ray films in an accident and emergency department? Brit Med J 1991;302:568–9.
26. Snaith BA. Are trusts replacing the red dot? Br J Radiol UKRC Suppl 2004:46–7.
27. Pham JC, Andrawis M, Shore AD, Fahey M, Morlock L, Pronovost PJ. Are temporary staff associated with more severe emergency department medication errors? J Healthc Qual 2011;33:9–18.
28. Society and College of Radiographers. Preliminary clinical evaluation and clinical reporting by radiographers: policy and practice guidance. London: SoR; 2013.