Internet-Based Error Reporting Systems: Usability is Power

Marc Resnick, Ph.D.
Industrial and Systems Engineering
Florida International University
Miami, FL 33199
[email protected]

Error reporting systems have been around for many decades, in domains such as aerospace, with great success. In contrast, domains such as health care have resisted broad-based systems, due in part to cultural issues and fear of litigation. A recurring issue in the development of all of these systems is usability. Usability affects the development, growth, usage, and sustainability of error-reporting systems in many ways. As these systems migrate to the Internet and become more broadly accessible, usability will become a dominant factor in system success. Whether a system is publicly accessible, Intranet-based behind a company firewall, or semi-private and managed through a government agency or non-governmental organization, some usability issues will apply to all systems and others will shift in importance. This paper applies an existing knowledge management model to the analysis of error reporting systems, highlighting the significant impact of usability on, and its necessity for, the success of error-reporting systems, using examples from a variety of domains.

Introduction

The purpose of error reporting systems is to identify the errors that are made within a particular domain, evaluate whether and how they can be prevented or minimized, and implement the necessary process changes. In the aerospace domain, systems such as the Aviation Safety Reporting System (ASRS) have been successful in translating error reports into safety improvements. However, many challenges, both cultural and technical, restrict the success of reporting systems.

Errors that are typically addressed include those caused by an inappropriately applied procedure, an incorrectly diagnosed situation, or overgeneralization of processes to contexts in which they are inapplicable. Levitt and March (1988) reported that individuals in organizations behave and make decisions based on a process of matching known procedures to recognized situations. When a procedure is inappropriate for a particular context, it is critical to modify it. This is where error reporting systems can be helpful. Error reporting systems can cover many error domains, including operations, management, and communication. There are also errors caused by physical limitations such as lack of sufficient hand-eye coordination or insufficient visual detection capability. These can also be captured by an error reporting system and used by system designers to modify the design of controls and displays.

One key to the success of error reporting systems is to uncover the tacit knowledge that makes up almost half of organizational knowledge (Rosalski, 2001). By making tacit knowledge explicit, knowledge can be shared among all individuals within the organization and errors can be prevented throughout.

The CaSIDA Model

The CaSIDA Model (see Figure 1) was developed by Resnick (2002, 2004) to describe the human factors requirements for knowledge management systems, specifically addressing the development of these systems for organizational learning. Designers and managers of error-reporting systems, particularly for extensive domains such as national databases of health care errors or aerospace errors, can use these insights to enhance the effectiveness of their systems. A human factors focus is essential because technology can significantly constrain the human aspects of knowledge management (Argyris and Schon, 1978; Huber, 1991).

Resnick (2004) defined knowledge management as "a method to convert the unarticulated ideas of employees into structured information and then transform that information into useful knowledge." Managed dissemination is also necessary to ensure that participants who can benefit from the knowledge are made aware of its reliability, context, and other details. The overall purpose of knowledge management systems is to improve the performance of the organization using them. Thus the knowledge must be presented to the right person, at the appropriate time and place, and in the correct format in order to produce learning and measurable benefits. Specifically, knowledge management requires a focus on the five stages of the CaSIDA model in Figure 1. The rest of this paper applies the model to the design of error reporting systems, focusing on aspects where human factors is particularly relevant.
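As an illustration only, the sketch below expresses the five stages of Figure 1 as a simple processing pipeline through which each error report would flow. The stage names follow the figure; the class and function names are hypothetical and are not part of the published model.

```python
from enum import Enum

class CasidaStage(Enum):
    """The five stages of the CaSIDA model, in the order shown in Figure 1."""
    CAPTURE = "capture"
    STORAGE = "storage"
    INTERPRETATION = "interpretation and transformation"
    DISTRIBUTION = "distribution"
    AUDIT = "audit"

def process_report(report, handlers):
    """Pass one error report through each stage in order.

    `handlers` maps each CasidaStage to a function that performs that
    stage's work and returns the (possibly transformed) report.
    """
    for stage in CasidaStage:  # Enum iteration preserves definition order
        report = handlers[stage](report)
    return report
```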

Capture

Capture includes the active or passive collection of information. For error reporting systems, only active collection is feasible. Even then, many participation management issues affect whether individuals are willing to enter errors into the system.

First, the people involved must be aware that a reportable error has occurred. Because not all errors lead to salient incidents, this is not as simple as it seems. Near misses (near hits) occur much more frequently than damage-causing errors and often reflect the same incorrect procedure (Nielsen, Carstensen, and Rasmussen, 2006). To gain insight into frequent errors, submission of near misses is essential. But near misses often go unreported because the individuals involved do not recognize that an error has been made (Tamuz, Thomas, and Franchois, 2004). Errors are also commonly caused by multitasking or interruptions of a primary task (Garrett and Caldwell, 2006); when the error is discovered later, key details may not be recalled.

A second major factor is the time and effort required to enter errors into the system. Escoto, Karsh, and Beasley (2006) report that the hassle of inputting errors is a common barrier to participation in medical error reporting systems. Common, minor incidents may not seem important enough to be worth the time required. This can be overcome by providing a shorter input dialog for less important errors. To be useful, characteristics of the error such as its context, cause, frequency, and possible solutions should be included wherever possible. The input interface must therefore balance the need to make the process fast and easy (to encourage widespread use) against the need for significant detail about each error (to make it useful). Structured dialogs with pulldown menus for common attributes can be used as long as text entry remains available for errors that do not match. Domains with highly structured procedures, such as aerospace, can benefit most from this format. The attributes and fields that are included should prompt users to consider aspects of the error they may not originally have been aware of, without significantly increasing the time required to input each error.

A third factor is the perceived costs and benefits for the individual doing the reporting. Each domain will need to address this challenge in a way that is customized to its specific culture. Escoto, Karsh, and Beasley (2006) reported that health care practitioners are motivated by ethical values and the desire to learn from their mistakes. However, Daly (2005) found that medical errors are particularly sensitive because of the fear of litigation and the organizational culture of the medical profession. In such cases, voluntary and/or anonymous participation rules must be used. Contributors must be assured that the security and privacy levels they expect from the system are maintained. Morag and Gopher (2006) found that framing the capture process as a collection of system difficulties and hazards rather than human errors significantly increases reporting and yields a wider spectrum of errors that better represents the true distribution of real-world experience.
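To make the speed-versus-detail trade-off concrete, the sketch below shows one way an input form could branch between a short dialog for minor incidents and a fuller structured dialog, with controlled values that would back pulldown menus plus a free-text fallback. This is a minimal sketch for illustration only; the field names, vocabularies, and severity threshold are hypothetical and are not drawn from any of the systems cited above.

```python
from dataclasses import dataclass
from typing import Optional

# Controlled vocabularies that would back pulldown menus in the form.
SEVERITIES = ("near miss", "minor", "moderate", "severe")
CONTEXTS = ("routine task", "interruption", "multitasking", "emergency")

@dataclass
class QuickReport:
    """Short dialog: enough to log a common, minor incident quickly."""
    description: str                 # free text for errors that fit no category
    severity: str = "near miss"
    context: Optional[str] = None

@dataclass
class DetailedReport(QuickReport):
    """Longer dialog: probes cause, frequency, and possible solutions."""
    suspected_cause: Optional[str] = None
    estimated_frequency: Optional[str] = None  # e.g., "weekly", "first occurrence"
    possible_solution: Optional[str] = None

def choose_dialog(severity: str):
    """Route the reporter to the short or detailed form based on severity."""
    return QuickReport if severity in ("near miss", "minor") else DetailedReport
```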

Storage

Storage involves several components. Errors that are entered must be compared with existing errors to identify similarities and overlap; this is necessary to facilitate the integration stage. The storage system must be manageable by the IT workers responsible for maintaining it (Tamuz, Thomas, and Franchois, 2004). The usability of data warehousing software has not been addressed extensively in the peer-reviewed literature, although companies often run in-house usability analyses of their products (Eckerson, 2006) and guidelines exist, such as those by Zolly (2001).

In general, error reporting systems use error taxonomies to structure the errors. Iden and Shappell (2006) divide errors, using an information processing framework, into perceptual, decision, and skill-based errors. These can be further subdivided to distinguish sensory limitations, working memory capacity limitations, long-term memory accessibility, and others. Errors can also be classified according to the motivation of the individual who made the error. Alper et al. (2006) use several dimensions, including:

• Intention: intentional or unintentional
• Purpose: malevolence, necessary to overcome unanticipated constraints, inapplicable requirement, necessary to comply with another rule, save time or effort, necessary to get the job done, exception/emergency
• Beneficiary: patient, employee, company

Within each error category, a standard set of fields should be used to represent the details, while retaining the flexibility to support unusual and infrequent error types (Escoto, Karsh, and Beasley, 2006). For example, the Human Factors Analysis and Classification System (HFACS) includes four components for aviation errors: unsafe acts, preconditions for unsafe acts, unsafe supervision, and organizational influences (Shappell and Wiegmann, 2000). To facilitate end users searching this taxonomy, it should be structured around the situational models in which users will search. For example, medical practitioners may alternately search by the treatment they are applying, for a procedural question, or by a tool name, if they are unsure of its use and want to identify common errors associated with it.

Another important human factors contribution to the storage phase is to support strong passwords (Bhargav et al., 2004). Despite the increasing sophistication of encryption technologies, the weak link of human behavior continues to undermine system security. For error-reporting systems this is particularly critical because of findings such as those of Fazel and McMillan (2001), who found that the trust surrounding an error reporting system is key to its acceptance. Or, more familiarly: garbage in, garbage out. It is also particularly relevant for medical systems because the public's trust in the health care system can be undermined if the public has access to salient details of medical errors (Anonymous, 2001).
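The sketch below illustrates how a stored record might combine a position in the taxonomy with a standard set of fields and a free-text escape hatch for unusual cases. It uses the HFACS levels and two of the Alper et al. (2006) dimensions discussed above, but the record structure itself is a hypothetical example rather than the schema of any cited system.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class HfacsLevel(Enum):
    """The four HFACS components (Shappell and Wiegmann, 2000)."""
    UNSAFE_ACTS = "unsafe acts"
    PRECONDITIONS = "preconditions for unsafe acts"
    UNSAFE_SUPERVISION = "unsafe supervision"
    ORGANIZATIONAL = "organizational influences"

class Intention(Enum):
    """Alper et al. (2006): was the act intentional?"""
    INTENTIONAL = "intentional"
    UNINTENTIONAL = "unintentional"

@dataclass
class StoredError:
    """One record in the error database: taxonomy position plus standard fields."""
    hfacs_level: HfacsLevel
    intention: Intention
    beneficiary: str                          # e.g., patient, employee, company
    purpose: Optional[str] = None             # e.g., save time, comply with another rule
    treatment_or_tool: Optional[str] = None   # supports search by situational model
    narrative: str = ""                       # free text for unusual, infrequent cases
```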

Interpretation

The next stage of the CaSIDA model is interpretation. For error reporting systems to be truly useful, they need to identify trends, make generalizations, and perhaps create new knowledge based on the stored errors. Simple interpretation can be accomplished through statistical analysis and data mining to identify errors that have high frequency, cost, or importance (Ma and Drury, 2003). This relies on an effective taxonomy in the storage stage so that similar errors can be aggregated. Morag and Gopher (2006) found that multiple input interfaces or generic text entry can create significant challenges in the interpretation and integration of errors: when different terms are used to describe the same errors, an automated system cannot perform this aggregation, and if the interpretation process is perceived as too onerous, it will not be done. Systems that use controlled vocabularies are therefore preferred.

The error descriptions can also be evaluated to determine the cause of the error. With current technology this is an error-by-error process that can be labor intensive, but it can yield valuable insights. For example, decision errors can be reviewed to identify whether decision-making heuristics contributed (Croskerry, 2002) and to identify preventive measures. Similarly, experts can review errors to generalize the set of contexts to which an error is relevant and in which the preventive measures are appropriate.

More extensive interpretation can be accomplished using semantic network analysis (Fisher and Hoffman, 2006). This involves mapping the concepts, contexts, and characteristics of each error with the intention of identifying new knowledge. Emergent knowledge that was previously unknown to any particular individual can be discovered in this way. The process requires a partnership between technology and active interpretation by human experts. Significant advances in artificial intelligence will make the analysis easier, but the usability of the interpretation interface can already be improved through the application of user-centered design.
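As a simple illustration of this frequency-oriented analysis, the sketch below aggregates stored reports by their controlled-vocabulary category and returns the most frequent ones. It assumes reports already share a common vocabulary; as noted above, free-text synonyms would defeat this kind of automated aggregation. The attribute name is hypothetical.

```python
from collections import Counter

def frequent_error_categories(reports, top_n=5):
    """Count reports per taxonomy category and return the most frequent.

    `reports` is any iterable of records with a `category` attribute drawn
    from a controlled vocabulary; synonyms in free text would scatter the
    counts, which is why controlled vocabularies are preferred.
    """
    counts = Counter(report.category for report in reports)
    return counts.most_common(top_n)
```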

Dissemination

Dissemination is one of the most challenging aspects of an effective error reporting system. If the insight gained from the system is not provided to decision makers in the format and at the time at which it is needed, it is clearly wasted. However, this is a more complicated task than it may seem. In the medical domain, for example, an error and its corresponding preventive measure would have to be presented to a practitioner during the diagnosis or treatment process in which the error may occur. If the practitioner recognizes the potential for an error, he or she may query the system. But in most medical care situations this is unlikely, either because an emergency does not allow the practitioner time to query the system or because the practitioner is unaware of the error risk. Therefore, the knowledge gained from the error reporting system would have to be fed into the medical education system or into continuing education requirements.

The challenge is somewhat easier in domains such as aerospace or manufacturing, where checklists and process maps are often used to manage procedures. In these cases, the checklists can be modified immediately upon discovery of an error-causing situation. However, when use of the system requires active querying, the same challenges as in medical care remain. In general, the value of using the system depends on the perceived value of the results. When the error database is sparse, the chance that any query will yield a result is small, discouraging use. When users develop a low perception of system utility, this perception can linger even after the database has grown. External incentives and motivation may be necessary to elicit use, increasing the management overhead required to administer the system.

An additional challenge that must be overcome is the likelihood of compliance or reliance errors due to overconfidence in the system. As with other decision support systems, users may assume the system has more capability than it does and accept recommendations when they are inappropriate, thus leading to new errors. Human factors guidelines for designing decision support systems to minimize reliance and compliance errors should be followed (Lee and See, 2004). Future studies should investigate specific guidelines for error reporting systems.
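One way to close the loop in checklist-driven domains is to attach preventive measures directly to the procedure step where the error tends to arise, so that the guidance reaches users without an explicit query. The sketch below is a hypothetical illustration of that linkage, not a description of any fielded system; all names are placeholders.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ChecklistStep:
    """A single step in a procedure checklist."""
    instruction: str
    cautions: List[str] = field(default_factory=list)  # preventive measures shown with this step

def attach_preventive_measure(step: ChecklistStep, measure: str) -> None:
    """Attach a lesson learned from the error database to the relevant step,
    so it is presented at the time and place the error is likely to recur."""
    if measure not in step.cautions:
        step.cautions.append(measure)
```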

Auditing

Auditing an error reporting system is necessary to ensure that the trends identified and the solutions provided are accurate, and that they remain so as the state of the art within the domain evolves. In medical systems, for example, new medical procedures can make old errors inapplicable. These should be removed from the system to prevent false alarms from slowing down use of the system and decreasing its perceived utility. Similarly, new models of air traffic systems or new cockpit designs can have the same effect in aerospace. Auditing can be done by subject matter experts who systematically review the contents of the system to eliminate outdated content. Human factors has a primary role to play here because of the complexity of the search and sort processes involved. There are many existing guidelines for designing search interfaces (see, for example, Resnick and Vaughan, 2006), but these need to be customized to the specific nature of searching through an extensive corpus of error descriptions.

Empirical data can also be used for system auditing. Because the purpose of the error reporting system is to reduce the occurrence of errors, the frequency of errors captured by the system should decline over time. If it does not, there is a bottleneck in the system's use, caused either by poor interpretability or by poor dissemination.
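The frequency check described above can be approximated automatically: if the rate of captured errors is not falling over time, the audit should look for a bottleneck in interpretation or dissemination. The sketch below compares recent monthly report counts with an earlier baseline; the window sizes and threshold are arbitrary placeholders, not recommended values.

```python
from statistics import mean
from typing import Sequence

def frequency_declining(monthly_counts: Sequence[int],
                        baseline_months: int = 6,
                        recent_months: int = 6,
                        required_drop: float = 0.10) -> bool:
    """Return True if recent error counts are at least `required_drop`
    (e.g., 10%) below the earlier baseline; otherwise flag for audit."""
    if len(monthly_counts) < baseline_months + recent_months:
        return False  # not enough history to judge a trend
    baseline = mean(monthly_counts[:baseline_months])
    recent = mean(monthly_counts[-recent_months:])
    return recent <= (1.0 - required_drop) * baseline
```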

System-wide Considerations

There are several issues that apply to all stages of the knowledge management process for error reporting systems. Technological limitations are a key challenge. When a system is managed by an external organization such as a government agency or industry trade group, hardware and software compatibility among the systems housed within each participating organization must be considered. If each organization has different capabilities or requirements, the effectiveness of the system can be significantly diminished. Even within a corporate Intranet, many divisions will have different systems. In the medical domain this is especially problematic because even within a single unit such as a hospital, there may be many different networks that need to access the system. While human factors often does not receive primary consideration when technology compatibility is addressed, it plays an essential role: translating incompatibilities and developing error messages and transfer protocols that can be used by the typical user, who is not from the IT department, is critical.

Managerial issues also must be considered. Organizational culture, team communication, shift change management, and many other aspects have significant human factors impacts (Resnick, 2007). Developing a framework of use cases and user requirements for each domain, so that an error reporting system can be customized and effectively applied, is critical. When this is done correctly, not only are errors reduced, but the organization's culture becomes perceived by its members as fundamentally just (von Thaden et al., 2006).

Summary

An effective error reporting system, based on a well-designed error database, usable input and output interfaces, effective participation and dissemination policies, and appropriate management and oversight, has the potential to significantly reduce the errors made in a variety of domains. Minimizing the prevalence and severity of errors, and creating a culture of continuous improvement in which new errors are quickly identified and addressed, can lead to fewer and less severe errors, reduced cost, and thus improved organizational performance. In some domains, such as medical care, this improvement can save lives. Using a model such as CaSIDA can structure this effort and improve the overall outcome. Human factors can play an important role in the specification and development of effective error reporting systems.

References

Alper S.J., Karsh B., Holden R.J., Scanlon M.C., Patel N., and Kaushal R. (2006). Protocol violations during medication administration in pediatrics. Proceedings of the Human Factors and Ergonomics Society 50th Annual Meeting. Human Factors and Ergonomics Society: Santa Monica, CA.

Anonymous (2001). Medical errors and medical culture. British Medical Journal, 322, 1236-1237.

Argyris C. and Schon D.A. (1978). Organizational Learning: A Theory of Action Perspective. Addison Wesley: Reading, MA.

Bhargav A., Proctor R., Schultz E., Tai B. and Vu K. (2004). Promoting memorability and security of passwords through sentence generation. Proceedings of the Human Factors and Ergonomics Society 48th Annual Meeting. Human Factors and Ergonomics Society: Santa Monica, CA.

Croskerry P. (2002). Achieving quality in clinical decision making: Cognitive strategies and detection of bias. Academic Emergency Medicine, 9, 11, 1184-1204.

Daly R. (2005). Voluntary system to collect medical-error data. Psychiatric News, 40, 17, 11.

Eckerson W. (2006, February 2). A CIO's data warehouse: nine steps to success. SearchCIO. Retrieved on 02/11/07 at searchsap.techtarget.com/originalContent/0,289142,sid21_gci1164625,00.html

Escoto K.M., Karsh B. and Beasley J.W. (2006). Multiple user considerations and their implications in medical error reporting system design. Human Factors, 48, 1, 48-58.

Fazel S. and McMillan J. (2001). Commentary: A climate of secrecy undermines public trust. British Medical Journal, 322, 1239-1240.

Fisher K.M. and Hoffman R. (2006). Knowledge and Semantic Network Theory. Semantic Research, Inc. white paper. Retrieved on 02/11/07 at www.semanticresearch.com/downloads/whitepapers/theory_whitepaper.pdf

Garrett S.K. and Caldwell B.S. (2006). Task coordination and group foraging in healthcare delivery teams. Proceedings of the Human Factors and Ergonomics Society 50th Annual Meeting. Human Factors and Ergonomics Society: Santa Monica, CA.

Huber G.P. (1991). Organizational learning: The contributing processes and the literatures. Organization Science, 2, 1, 89-115.

Iden R. and Shappell S.A. (2006). A human error analysis of U.S. fatal highway crashes 1990-2004. Proceedings of the Human Factors and Ergonomics Society 50th Annual Meeting. Human Factors and Ergonomics Society: Santa Monica, CA.

Lee J.D. and See K.A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors, 46, 50-80.

Levitt B. and March J.G. (1988). Organizational learning. Annual Review of Sociology, 14, 319-340.

Ma J. and Drury C. (2003). The human factors issues in data mining. Proceedings of the Human Factors and Ergonomics Society 47th Annual Meeting. Human Factors and Ergonomics Society: Santa Monica, CA.

Morag I. and Gopher D. (2006). A reporting system of difficulties and hazards in hospital wards as a guide for improving human factors and safety. Proceedings of the Human Factors and Ergonomics Society 50th Annual Meeting. Human Factors and Ergonomics Society: Santa Monica, CA.

Nielsen K.J., Carstensen O. and Rasmussen K. (2006). The prevention of occupational injuries in two industrial plants using an incident reporting scheme. Journal of Safety Research, 37, 479-486.

Resnick M.L. (2002). Knowledge management in the virtual organization. Proceedings of the 11th Annual Conference on Management of Technology. International Association for Management of Technology: Miami, FL.

Resnick M.L. (2004). Management requirements for knowledge management systems in the virtual organization. International Journal of Networking and Virtual Organizations, 2, 4, 287-297.

Resnick M.L. and Vaughan M. (2006). Best practices and future visions for search user interfaces. Journal of the American Society for Information Science and Technology, 57, 6, 781-787.

Resnick M.L. (2007). The effects of organizational culture on system reliability: A cross-industry analysis. Proceedings of the Industrial Engineering Research Conference. Institute of Industrial Engineers: Norcross, GA.

Rosalski D. (2001). Personal portals: the key to knowledge on demand. CRM Forum. Retrieved on 11/30/01 at www.crmforum.com/cgi-bin/item.cgi?id=64795&d=101&nl=nd48

Shappell S.A. and Wiegmann D.A. (2000). The Human Factors Analysis and Classification System (HFACS). DOT/FAA/AM-00/7.

Tamuz M., Thomas E.J., and Franchois K.E. (2004). Defining and classifying medical error: lessons for patient safety reporting systems. Quality and Safety in Health Care, 13, 13-20.

Von Thaden T., Hoppes M., Li Y., Johnson N., and Schriver A. (2006). The perception of just culture across disciplines in healthcare. Proceedings of the Human Factors and Ergonomics Society 50th Annual Meeting. Human Factors and Ergonomics Society: Santa Monica, CA.

Zolly L. (2001). The role of usability testing in web site design. USGS Center for Biological Informatics. Retrieved on 02/11/07 at cendi.dtic.mil/presentations/LisaZollyApr17rev.ppt

Figure 1. CaSIDA Model of Knowledge Management. (The figure depicts the five stages of capture, storage, interpretation and transformation, distribution, and audit, together with associated considerations such as participation management, opting policy, active/passive collection, privacy, access, technology, data models, data mining, and semantic networks.)