How to Submit Proof Corrections Using Adobe Reader

Using Adobe Reader is the easiest way to submit your proposed amendments for your IGI Global proof. If you don't have Adobe Reader, you can download it for free at http://get.adobe.com/reader/. The comment functionality makes it simple for you, the contributor, to mark up the PDF. It also makes it simple for the IGI Global staff to understand exactly what you are requesting to ensure the most flawless end result possible.

Please note, however, that at this point in the process the only things you should be checking for are: spelling of names and affiliations, accuracy of chapter titles and subtitles, figure/table accuracy, minor spelling errors/typos, and equation display. As chapters should have been professionally copy edited and submitted in their final form, please remember that no major changes to the text can be made at this stage.

Here is a quick step-by-step guide on using the comment functionality in Adobe Reader to submit your changes.

1. Select the Comment bar at the top of the page to view or add comments. This will open the Annotations toolbar.

2. To note text that needs to be altered, such as a subtitle or your affiliation, you may use the Highlight Text tool. Once the text is highlighted, right-click on it and add your comment. Please be specific, and include what the text currently says and what you would like it to be changed to.

3. If you would like text inserted, such as a missing comma or punctuation mark, please use the Insert Text at Cursor tool. Please make sure to include exactly what you want inserted in the comment box.

4. If you would like text removed, such as an erroneous duplicate word or punctuation mark, please use the Add Note to Replace Text tool and state specifically what you would like removed.

This proof is copyrighted by IGI Global. It is being provided as a courtesy to the author to review prior to publication. Posting on any online site (for example, ResearchGate, Academia.edu, ArXiv) or distributing this proof without written permission of IGI Global is prohibited and a violation of copyright law.

Impact of Medical Errors and Malpractice on Health Economics, Quality, and Patient Safety

Marina Riga
Health Economist-Research, Greece

A volume in the Advances in Medical Education, Research, and Ethics (AMERE) Book Series

Detailed Table of Contents

Preface ................................................................................................................. xi


Chapter 1
Clinical Pathways and the Human Factor: Approaches to Control and Reduction of Human Error Risk ............................................................................................... 1
Vaughan Michell, Reading University, UK
Jasmine Tehrani, Reading University, UK


A key approach to improving patient safety is to seek to modify both formal and informal behaviours in response to the extensive reporting of error causes in the literature. This response is primarily in two parts: a) actions to minimise the risk of error, or b) actions to control against error. For a), very valuable work has been undertaken in running human factors courses to demonstrate, and try to change, poor behaviour via best practice models. In the case of b), much work has been done on increasing control regimes, such as checklists, and on formal rules in formal procedures. However, these actions tend to be specific to particular health units, are often piecemeal, and are not integrated to complement each other. Little work has been done to integrate these formal and informal/social behaviours into clinical pathways or health activities. This chapter reviews current thinking and develops a methodology and proposal for the identification and control of human error in clinical pathways, based on the research of the two authors.

Chapter 2
Medical Errors: Impact on Health Care Quality ................................................. 32
Jayita Poduval, Pondicherry Institute of Medical Sciences, India


The impact of medical errors on the delivery of health care is massive, and it significantly reduces health care quality. Errors can largely be attributed to system failures rather than human weakness. Improving health care quality and ensuring quality control in health care would therefore mean making systems function better. To achieve this, all sections of society, as well as industry, must be involved. Reporting of medical error needs to be encouraged, and this may be ensured if health care professionals, as well as administrators and health consumers, can come forward without fear of being blamed. To get to the root of the problem, literally and metaphorically, a root cause analysis and audit must be carried out whenever feasible. Persons outside the medical care establishment also need to work with medical service providers to set standards of performance, competence and excellence.

Chapter 3
Patient Safety and Medical Errors: Building Safer Healthcare Systems for Better Care ............................................................................................................ 59
Vasiliki Kapaki, University of Peloponnese, Greece
Kyriakos Souliotis, University of Peloponnese, Greece


Patient safety is considered to be the most important parameter of quality at which every contemporary healthcare system should be aiming. The terms "patient safety" and "medical errors" are directly linked to the "safety culture and climate" of every organization. It is widely accepted that medical errors constitute an index of insufficient safety and are defined as any unintentional event that diminishes, or could diminish, the level of patient safety. This chapter indicates that a beneficial safety culture is essential to enhance and assure patient safety. Furthermore, health care staff with a positive safety culture are more likely to learn openly and successfully from errors and injuries.

Chapter 4
Application of Quality Management in Promoting Patient Safety and Preventing Medical Errors .................................................................................... 88
Ali Mohammad Mosadeghrad, Tehran University of Medical Sciences, Iran
Abraha Woldemichael, Mekelle University, Ethiopia


The combination of healthcare professionals, processes and technologies brings significant benefits for patients. However, it also involves an inevitable risk of adverse events. Patients receiving care in health institutions have the potential to experience some form of medical error. The term medical error commonly encompasses mistakes, near misses, and active and latent errors, which signifies the complexity and multidimensional nature of error. The consequences can be costly to patients, health professionals, health care institutions, and the entire health care system, and may involve human, economic, and social aspects. Thus, ensuring quality health care can contribute to patients' safety by reducing potential medical errors in practice. This chapter aims to introduce a quality management framework for improving the quality and effectiveness of services, reducing medical errors and making the healthcare system safer for patients.


Chapter 5
The Perspectives of Medical Errors in the Health Care Industry ....................... 109
Kijpokin Kasemsap, Suan Sunandha Rajabhat University, Thailand


This chapter presents an overview of medical errors; drug prescription and prescribing errors; medical error disclosure; medical errors and telemedicine; medical errors and medical education; nursing medication errors; and the aspects of medical errors in the health care industry. Reducing medical errors, increasing patient safety, and improving the quality of health care are the major goals in the health care industry. Medical errors are caused by mistakes in drug prescription, dosing, and medical administration in inpatient and outpatient settings. Health care-related guidelines, institutional safety practices, and modern health care technologies must be applied in hospitals, clinics, and medical offices to reduce the occurrence of medical errors. The chapter argues that understanding the perspectives of medical errors has the potential to enhance health care performance and help reach strategic goals in the health care industry.

Chapter 6
The Role of Forensic Medicine in Medical Errors ............................................. 139
Grigorios Leon, Hellenic Society of Forensic Medicine, Greece


This chapter presents the importance of legal and forensic medicine in medical malpractice and explains how autopsies play a crucial role in the evaluation and prevention of medical errors. Health systems vary from country to country; however, experts are indispensable in each system. In fact, experts' opinions are sought for the resolution of specific court cases. The standard of care is often assessed by expert medical witnesses who testify for one of the litigants. The physician who acts as an expert witness is one of the most important figures in malpractice litigation. Therefore, a doctor who is an expert witness has to have certain training and qualifications and must act under common recommendations. The ideal medical expert would seem to be the forensic doctor. In the future, a harmonization of practices could be applied in medical liability cases, and guidelines provided by the medico-legal community could constitute a stable basis for their evaluation.

Chapter 7
The Psychological Impact of Medical Error on Patients, Family Members, and Health Professionals ..................................................................................... 166
Mary I. Gouva, TEI of Epirus, Greece


The current chapter examines the psychological implications emerging from medical errors. Whilst the psychological effects have been studied, the consequent impacts and the underlying psychological causes have not been sufficiently analysed and/or interpreted. The chapter adds to the literature by using a psychodynamic approach in analysing the psychological impact of medical errors and by providing interpretations of the underlying causes. The chapter concludes that medical errors lead to a series of implications. For the patient, the quality of interactions with health professionals is directly affected, usually with immediate consequences; the impact of these consequences on the patient is mediated by the patient's personality, individual history and psychoanalytic destiny. For the patient's relatives, medical errors create emotional cracks leading to regression and the eventual transference of the medical error as a "bad" object. For health professionals, medical errors impact upon the psychological defence mechanisms of the psychic Ego.

Chapter 8
The Second Victim Phenomenon: The Way Out ................................................ 190
Paraskevi Skourti, National and Kapodistrian University of Athens, Greece
Andreas Pavlakis, Neapolis University Pafos, Cyprus


A medical error happens when an action within the medical field is not carried out as planned, or when the plan itself is wrong. The patient and family are the first victims of an adverse event. Damage to a patient's health leads to a distressing situation not only for the patient but also for the clinician responsible for the outcome. The term "second victim" refers to the trauma that a health professional sustains due to a serious adverse event in the healthcare system. After a medical error, caregivers experience the aftermath in their personal and professional lives. They feel isolated and abandoned, and some of them come up against the law, with penal and disciplinary ramifications, as a consequence of the blame culture in the health care system. Some health professionals experience the consequences of an unfortunate incident even if it did not lead to harm to the patient's health.

About the Contributors .................................................................................... 215



About the Contributors

Mary Gouva is Associate Professor of Psychology at the School of Health & Social Welfare of the Technological Educational Institute of Epirus and is Head of the Research Laboratory on the Psychology of Patients, Families and Health Professionals. She specialized in Social Psychiatry, and her Ph.D. research concerned the psychological characteristics of patients with acute leukemia and of their families. Her studies and research interests concern the investigation of psychological factors in somatic diseases, the psychological profile of patients and their families, and the psychological characteristics of health professionals. Her clinical work has a psychodynamic orientation and includes psychological support and counseling interventions for patients and their family members, as well as participation in therapeutic groups of hospital clinics at the University Hospital of Ioannina. Her published scientific work comprises articles, conference presentations and books.

Kijpokin Kasemsap received his BEng degree in Mechanical Engineering from King Mongkut's University of Technology, Thonburi, his MBA degree from Ramkhamhaeng University, and his DBA degree in Human Resource Management from Suan Sunandha Rajabhat University. He is a Special Lecturer in the Faculty of Management Sciences, Suan Sunandha Rajabhat University, based in Bangkok, Thailand. He is a Member of the International Association of Engineers (IAENG), the International Association of Engineers and Scientists (IAEST), the International Economics Development and Research Center (IEDRC), the International Association of Computer Science and Information Technology (IACSIT), the International Foundation for Research and Development (IFRD), and the International Innovative Scientific and Research Organization (IISRO). He also serves on the International Advisory Committee (IAC) for the International Association of Academicians and Researchers (INAAR). He has published numerous original research articles in top international journals, conference proceedings, and books on the topics of business management, human resource management, and knowledge management.


Grigorios Leon is the President of the Hellenic Society of Forensic Medicine. As a certified Forensic Pathologist and a sole trader, he maintains one of the few private medico-legal consulting practices in Greece. He is a graduate (MD) of the Medical School of the University of Rome "La Sapienza", where he also obtained two Master's degrees (MSc). In 2009 he received his PhD from the Medical School of the National and Kapodistrian University of Athens. He has trained and worked in the Department of Legal Medicine and Toxicology at the University of Athens, in the Children's Hospital "Agia Sophia", and in the Office of the Medical Examiner and trauma services of Broward County in the State of Florida, USA. He is a Professor of Forensic Pathology at the Police Academy of Greece. His research interests are in the areas of Forensic Pathology, Medical Deontology and Bioethics. For his scientific work he has been awarded scholarships from, amongst others, UNESCO and the European Committee.

Vaughan Michell is an Informatics Lecturer and Business Technology Consulting Programme Director within the Informatics Research Centre at Henley Business School. He is also an honorary Senior Lecturer in Health Informatics in the Simulation Unit at the Royal Berkshire Hospital. Vaughan's research focuses on the informatics of the business and technology interface at the design and operational level. He supervises PhD students in health informatics and related areas and has published papers in the areas of health informatics and semiotics, patient safety, clinical pathways, medical device capability and cognition. His research interests include semiotics, affordance and human and machine capability, knowledge intensive processes and cognition, device design and invention, and man-machine interaction. Vaughan has a BSc in Mechanical Engineering from UCL, an MBA from Warwick University and a D.Phil in Robotics Image Processing from Oxford University.

Ali Mohammad Mosadeghrad is an Assistant Professor of Health Policy, Management and Economics at Tehran University of Medical Sciences. He received his PhD in Health Policy and Management from the University of London. He has a wealth of experience in health policy, management and economics, and is an author, speaker, and professional management consultant and trainer. Mosadeghrad has written extensively on many aspects of organization and management, covering a full spectrum of subjects in strategy formulation, implementation and evaluation. His research work appears in international journals such as the International Journal of Health Care Quality Assurance, International Journal of Strategic Change Management, International Journal of Health Policy and Management, and Health Services Management Research. He has also contributed to many international conferences. His research interests include public sector management, strategic management, quality management, knowledge mobilization, organisational health, and organisational change. His latest research is focused on international strategies.

Andreas Pavlakis studied Nursing at the School of Nursing in Nicosia, Cyprus; in 1982 he was awarded a Degree in Law (Major in Public Law and Social Sciences) by the Aristotelion University of Thessaloniki, Greece. He then proceeded with postgraduate studies in Open and Distance Learning at the Hellenic Open University (1999) and in Legislative Drafting (Commonwealth Distance Training Programme), awarded by the Commonwealth of Learning. In 1991 he completed his doctoral studies in the field of nursing at the National and Kapodistrian University of Athens, Greece. In 2003 he was appointed by the Council of Ministers as a Member of the Secretariat of the Open University of Cyprus, serving until 2005. From August 2006 to 2015 he was an Assistant Professor of Health Care Management at the OUC. He has participated in a number of research projects dealing with health in the public and private sectors. Since October 2015 he has been an Associate Professor at Neapolis University Pafos.

Jayita Poduval is a practicing otolaryngologist based in India. She completed her training in Mumbai in 1997 and has since been working in academic practice, medical research and postgraduate education in various places in India, with brief stints in Nepal and Malaysia. She also writes on medical and social issues and has several publications on these topics. She is also a trekker, amateur photographer and travel blogger.

Paraskevi Skourti is an R.N. who studied at the Technological Educational Foundation in Athens. She gained her Magister Artium degree in "Health Units Management" from the Open University of Cyprus in July 2015. She currently studies at the Law Faculty of the National and Kapodistrian University of Athens.

Jasmine Tehrani recently completed her PhD in the area of clinical pathways and health informatics.

Abraha Woldemichael is currently a Ph.D. student in Health Policy at Tehran University of Medical Sciences. He is also a lecturer at the School of Public Health, Mekelle University, Ethiopia, where he teaches healthcare management, fundamentals of public health, public health law and ethics, health economics, and health monitoring and evaluation to both undergraduate and postgraduate students. He has a B.Sc. in Nursing, a B.A. in Management, a Bachelor's degree in Law and an M.Sc. in Health Monitoring and Evaluation.


Chapter 1

Clinical Pathways and the Human Factor:

Approaches to Control and Reduction of Human Error Risk

Vaughan Michell
Reading University, UK

Jasmine Tehrani
Reading University, UK

ABSTRACT

A key approach to improving patient safety is to seek to modify both formal and informal behaviours in response to the extensive reporting of error causes in the literature. This response is primarily in two parts: a) actions to minimise the risk of error, or b) actions to control against error. For a), very valuable work has been undertaken in running human factors courses to demonstrate, and try to change, poor behaviour via best practice models. In the case of b), much work has been done on increasing control regimes, such as checklists, and on formal rules in formal procedures. However, these actions tend to be specific to particular health units, are often piecemeal, and are not integrated to complement each other. Little work has been done to integrate these formal and informal/social behaviours into clinical pathways or health activities. This chapter reviews current thinking and develops a methodology and proposal for the identification and control of human error in clinical pathways, based on the research of the two authors.

DOI: 10.4018/978-1-5225-2337-6.ch001
Copyright © 2017, IGI Global. Copying or distributing in print or electronic forms without written permission of IGI Global is prohibited.


1. INTRODUCTION AND BACKGROUND

1.1 Patient Safety

Although large numbers of people continue to be successfully cared for and treated in the National Health Service (NHS), a significant number of errors and other forms of harm occur. It has been calculated that up to 10% of patients admitted to NHS hospitals are subject to a patient safety incident, and that up to half of these iatrogenic or accidental incidents could have been prevented (Osborn and Williams, 2004; Vincent et al., 2001; Michell et al., 2012). The Bristol Royal Infirmary Inquiry (Bristol HMSO, 2001) estimated that around 25,000 preventable deaths occur in the NHS each year due to patient safety incidents. These incidents also generate a significant financial burden that includes avoidably prolonged care, additional treatment and litigation costs.

Avoidable unintended or accidental outcomes of medical care (medical errors) are also a serious and challenging issue in many other countries, including North America. The influential Institute of Medicine (IOM) report To Err Is Human: Building a Safer Health System (1999) highlighted the extent of the problem and the need for remediation, estimating that between 44,000 and 98,000 people die in hospitals each year as the result of medical errors. There is broad international agreement on the importance of achieving improvements to quality in this area (Milligan, 2007).

The recorded event where an error is noticed, i.e. a safety incident, is defined by the National Patient Safety Agency (NPSA, 2004) as 'any unintended or unexpected incident which could have or did lead to harm for one or more patients receiving NHS funded care'. These types of incidents are also referred to in the literature as adverse events/incidents, medical error and clinical error, and include the concept of the near miss. The latter is a situation in which an error or some other form of patient safety incident is averted, such as noticing, and therefore avoiding, giving the wrong drug to a patient. In the UK, the terminology for errors caused by clinicians and health workers has evolved from 'serious untoward incident' to 'significant event' or, in extreme cases, 'never event', with over 1,600 serious incidents occurring in one NHS region in a single year (Rosenorn-Lanng, 2014).

Whatever the terminology, these events are all dependent on the human in the room and in the loop, clearly driving the need to understand the human as a source of error. The study of the effect of the human condition on safety events and human errors is often termed 'human factors' and is clearly important in the understanding of safety problems, since care and intervention activities are primarily human driven.


1.2 Human Factors

Chapanis defines human factors as 'a body of information about human abilities, limitations and characteristics that are relevant to the design process' (Chapanis, 1996). In a work context, human factors include environmental, organisational and job factors, and individual characteristics, which influence behaviour in a work environment. Clinicians have suggested that 'human factors relate to the aspects of human behaviour that reduce certainty of actions and can set conditions for, and create, human errors'. This alludes to the fact that human factors not only relate to the way an error is driven by human actions, or inaction, but also that a human failing may be a precursor and contributing factor to an error by other individuals, and indeed by machines that are predicated on human decisions. Human factors can perhaps be more simply understood as all the factors or conditions that affect human behaviour, and particularly human fallibility, or the propensity for error and unintended outcomes. Sadly, whilst human fallibility leading to errors can be moderated, it cannot be eliminated. It is inevitable that errors will occur in healthcare, as they do in other safety critical industries, because they are an intrinsic human trait: to err is human (Kohn et al., 2000).

There have been a number of attempts to propose a categorisation of the human factors that lead to errors and patient safety issues. Reason (1995) analysed the conditions under which human factors can contribute to safety failures and proposed a generic model of accident causation. Chang et al. (2005) conducted a series of similar studies and presented an evaluation of existing patient safety terminologies and classifications, grouping the findings into five complementary root nodes: impact, type, domain, cause and prevention. In this chapter, the cause and type root nodes are further analysed for the purpose of better understanding human factors, working towards a generic taxonomy and classification schema of the human factors influencing near misses and adverse events.

As a basis for understanding the range of human factors, Rosenorn-Lanng and Michell developed the 'SHEEP' structured factor model as an acronym for classifying the human factor variables that influence error into five groups: (S) systems, (H) human interaction, (E) environment, (E) equipment and (P) personal (Rosenorn-Lanng & Michell, 2014). This approach can provide a useful checklist of human factors, both causal and influential, against which safety events and errors can be categorised to understand the influence of human factors on activity and task failures. Ongoing capture and categorisation of these events can then yield a database of human factor patterns in a specific clinical environment that can be statistically analysed to focus sparse improvement resources on resolving them. Example factors from the SHEEP model are used to illustrate this chapter and can also be integrated with the other models mentioned. Our concern is to understand the human factors affecting human clinical actions and to seek ways to moderate the impact of these factors.
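To make this classification concrete, the following sketch (in Python) shows one possible way to tag a safety event with SHEEP factor groups so that events can later be analysed statistically. The event record, field names and example data are illustrative assumptions, not part of the published model, and the two E groups are given distinct tags here purely so the code can distinguish them.

from enum import Enum

class SheepFactor(Enum):
    """The five SHEEP human factor groups (Rosenorn-Lanng & Michell, 2014)."""
    SYSTEMS = "S"            # e.g. confusing protocols, duplicated records
    HUMAN_INTERACTION = "H"  # e.g. communication and handover failures
    ENVIRONMENT = "E"        # e.g. layout, noise, interruptions
    EQUIPMENT = "Eq"         # e.g. device design and availability
    PERSONAL = "P"           # e.g. fatigue, stress, workload

# A hypothetical safety event tagged with the factor groups judged to have
# contributed; accumulating such records builds the database of human
# factor patterns described above.
event = {
    "description": "wrong dose drawn up during a busy night shift",
    "factors": [SheepFactor.PERSONAL, SheepFactor.SYSTEMS],
}

for factor in event["factors"]:
    print(f"Contributing factor group: {factor.name} ({factor.value})")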


Figure 1. Factors affecting patient safety outcomes (adapted from Flin et al., 2009)

The following sections investigate the causes of error and the range of human factor drivers. To develop the approach we adopt a modified form of Jackson and Flin's model of factors affecting patient safety (Figure 1) (Flin et al., 2009).

2. ERRORS

2.1 Individual Human Error Factors

Reason defined an error as 'a failure of a planned action' (Reason, 1995) and identified three key types of individual human error. He distinguished between slips, failures in carrying out an intended action, and mistakes, failures in the plan or choice of action itself. Leape suggested slips were due to attention issues or intention failures (Leape, 1994). Mistakes relate mainly to errors in the conscious human mind's judgement and decision making, and cover rule based and knowledge based types. Rule based errors involve either applying the correct rule to the wrong context, due to incorrect situation perception, or applying a rule that has been recorded incorrectly to the right situation (Leape, 1994). Knowledge based errors result from cognitive processing failures, that is, using an incorrect, familiar or incomplete mental model that does not represent the actual situation.



Perception Errors

Some errors due to forgetfulness (slips, P1) or incorrect decision making (mistakes) are due to perception errors, where a specific pattern may not be noticed, or the clinician forgets to search for it, or the pattern is not recognised correctly (Michell et al., 2012). A good example of forgetting is given by the clinical case of missing an obvious elbow injury when a patient was brought in for treatment (Smits et al., 2009; Panella et al., 2003). Another type of perception error is where the focus of the action is on the wrong object, i.e. description errors (P2). Individuals usually have a plan or model of what they expect to perceive, and another source of error relates to the wrong plan or expected perception (P3). A final source of perception error is seeing a specific cue, e.g. bruised skin, and taking the wrong action by making assumptions about the implications of the perceived visual cue (P4).

Rule Based Errors

Work activities that are often repeated as routine clinical actions, or experienced activities, are often encoded by human cognitive rules (Rasmussen, 1983; Shappel and Wiegmann, 2000). Leape (1994) defined three types of rule based errors. Firstly, the correct rule may be used in the wrong context (C1), for example due to a perception error, e.g. mis-assigning a clinician with an inappropriate skill to a specific patient problem (Smits et al., 2009). Alternatively, it may involve applying a rule that has been understood incorrectly by the individual (C2) to the right situation, for example the miscalculation of medication doses (Smits et al., 2009). Further details can be found in Michell (2014).

Knowledge Based Errors

Knowledge based errors are errors in cognitive action plans due to incorrect or incomplete mental models (C4/C5) of the clinical situation and context (Michell, 2014). This is often due to lack of experience, or the problem of applying and extrapolating existing experience and knowledge in novel situations (Smits et al., 2009), for example the human factor compulsion to revert to a familiar mental model and decision, rather than a realistic one produced by cognitively assessing all the facts to select a superior decision (Reason, 2000). Another driver of knowledge based errors is the human impulse to re-use and fit the situation facts to a pre-existing and well known habitual cognitive response (C3) (Reason, 2000). A summary of the personal error types is given in Figure 2. Identifying the points and situations at which these types of errors may be more prevalent can help in reducing patient safety risk.


Figure 2. Individual errors (adapted from Michell, 2014)
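The labelled error types above (P1 to P4 and C1 to C5) lend themselves to a simple lookup structure. The sketch below paraphrases this section's definitions into a hypothetical Python encoding that error reports could reference; the code table and helper function are illustrative assumptions rather than anything defined in the chapter.

# Hypothetical encoding of the individual error taxonomy of Figure 2.
INDIVIDUAL_ERROR_TYPES = {
    # Perception errors
    "P1": "Slip: pattern not noticed, not searched for, or forgotten",
    "P2": "Description error: focus of the action is on the wrong object",
    "P3": "Wrong plan or expected perception",
    "P4": "Wrong action from assumptions about a perceived cue",
    # Rule based errors
    "C1": "Correct rule applied in the wrong context",
    "C2": "Incorrectly understood rule applied to the right situation",
    "C3": "Situation facts fitted to a pre-existing habitual response",
    # Knowledge based errors
    "C4": "Incorrect mental model of the clinical situation",
    "C5": "Incomplete mental model of the clinical situation",
}

def describe(code: str) -> str:
    """Return the taxonomy description for an error code such as 'P2'."""
    return INDIVIDUAL_ERROR_TYPES.get(code.upper(), "Unknown error code")

print(describe("C1"))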

2.2 Cognitive and Physiological Abilities

The importance of the condition of the human in the action loop suggests that actors in clinical processes must become self-aware of aspects of their own condition that may affect the state or outcome of a situation and could lead to error. Research has identified a large number of factors that can act as error contributors. These include physiological conditions such as stress, fatigue, cognitive workload, time pressure, lack of knowledge and the need for help (Rosenorn-Lanng, 2014). Within clinician self-awareness, a number of clinically specific conditions have been identified that are important to our discussion.

Situational Awareness and Coning of Attention

Some key errors have been caused by clinicians being too focused on their task and missing a deteriorating situation around them, resulting in error and patient harm. For example, the process of intubating patients can so focus the clinician that they ignore the fact that a patient cannot breathe (Rosenorn-Lanng & Michell, 2014). This is the human factor of situational awareness (Reason, 1990), first identified by the German First World War flying ace Oswald Boelke (Stanton et al., 2001). It relates to awareness of what is going on around the person: the perception, comprehension of meaning and projection of the status of elements of a person's environment (Flin et al., 2009). It can include lack of recognition of critical cues for decisions, failure to interpret meanings, lack of understanding of individual task responsibilities and failure to communicate (Stanton et al., 2001). These human factor tendencies seem to occur where a cognitive task requiring focused attention by an individual is required. Such tasks are typical of most medical interventions. The issue here is ensuring that salient events, states or information relevant to the activity focus are noticed by the concentrating clinicians. One solution is to encourage clinicians to vocalise their thoughts and actions, to make clear what their focus is on and to enable others to raise awareness of missing critical factors and concerns (Rosenorn-Lanng, 2014).

A factor related to situational awareness is the coning of attention of a typical surgeon, used to focusing on the detail of a surgical intervention. Three types of attention cone have been proposed: (1) the eye focus cone of vision; (2) a side cone of peripheral vision, for example in checking machine readings and the locations of devices and individuals; and (3) the auditory cone, or focus of hearing, which can sometimes block out important information as well as noise (Rosenorn-Lanng, 2014). An additional factor in coning may be related to habit: Leape identified that slips resulted from failures in automatic skills, for example a tendency for humans to follow the most habitual rather than the correct routine (Leape, 1994).

2.3 Personal and Physiological Factors: Stress and Fatigue

Humans live and work in complex environments that can affect our physiology and hence alter our cognitive and perception responses and decision making. Stress, an individual's negative response to the pressure of work, is the key driver of these changes (Flin et al., 2009). Stress varies with the balance between an individual's capability and what is demanded of them, which in turn depends on their skills, education and training. Examples of personal factors that increase stress and impact safety are low job satisfaction and morale and high workload (Rosenorn-Lanng & Michell, 2014). A good example of the very wide range of factors found to affect performance is given in Figure 3.

2.4 Organisational Factors

A combination of process, human and system controls constitutes a defence in depth against errors. Organisational errors result from errors in systems, people and resources, and represent a failure of multiple checks and controls due to 'active and passive failures' and gaps in the safety defences in depth, as defined by Reason in his 'Swiss cheese' model (Reason, 1998). Work on the SHEEP model identified a wide range of factors that influence physiology and stress.


Figure 3. Example personal factors (adapted from Rosenorn-Lanng, 2014)

The Impact of Culture

Swidler identified the impact of culture on actions: 'Culture provides the material from which individuals and groups construct strategies of action', and groups and individuals call upon these resources selectively, bringing to bear different styles and habits of action (Swidler, 1986). This is particularly visible in the important influence of the hospital, the professional clinical group, and even the cross-cultural team to which clinicians belong. Safety culture ideally involves everyone focusing on the 'value and priority of the patient' (Weigmann et al., 2010). The culture of an organisation therefore provides a set of standards of behaviour which its members follow and to which they aspire. Hence a culture in which procedures and standards are flouted, and in which there is little concern about, or management and control of, slack practices, can provide a fertile environment for a multitude of quality problems and errors.

A key weapon in error reduction and quality improvement in healthcare is the adjustment of culture (Davies et al., 2000). The World Health Organisation defined safety culture in terms of individual and group attitudes, competencies and patterns of behaviour (Flin et al., 2009). Critically, this culture must be open to views and findings, and ideally should encourage learning and improvement (Reason, 1998). However, culture, as Davies et al. suggest, has imprecise definitions, but can be described as 'the emergent property of that organisation's constituent parts'; the behaviour of the organisation at different levels does impact actions and their outcomes, as seen in well-known phrases such as 'it is the way things are done around here', as well as the way things are understood, judged and valued (Davies et al., 2000). Culture drives behaviour and hence is a precursor to the human factor failings we seek to avoid.

Managing Clinical Activities: Workload

Managing and controlling clinical behaviour and activity is, however, itself open to human factor failings. The focus on managing humans in their activities relies on a) allocating work efficiently and effectively, i.e. 'planning, scheduling and forecasting', and b) observing and controlling violations of process and appropriate behaviour to avoid error and patient risk situations. With increased medical demands it is no surprise that increases in task workload and in the cognitive complexity of the task are both error-inducing factors (Weigmann et al., 2010). Mis-scheduling of the right staff can also be an issue (Helmreich, 2000). Good management and control requires accurate, timely and appropriate situational information in the right context. Cognitive and workload overload, and the inability to be aware of everything, can also lead to additional or 'knock-on' errors (Smits et al., 2009).

2.5 Team/Unit Culture

Leadership and Role Errors in Teams

Leadership, the process of influencing individuals to achieve goals (Flin et al., 2009), can be vital in directing the clinical intervention towards the desired solution. Leadership in a medical situation is de facto a case of safety leadership, i.e. responsibility for encouraging everyone to make the right decisions to ensure safety. But all too often in a complex and hectic clinical emergency situation it is unclear who is in charge. This can lead to the assumption that others are making, or have already made, critical decisions, or to delays in action. A lack of clear leadership can result in decisions being deferred, tasks not being delegated and completed, and a lack of information flow and coordination that can lead to serious errors (Rosenorn-Lanng, 2014; Mohr et al., 2002).

Role Conflict

Role conflict, where an individual is uncertain of their role, has unreasonable job demands or faces incompatible requests, is a well known driver of stress (Piko, 2006). Even ignoring stress, role conflict in team situations where it is uncertain whose task it is to lead or act can result in the task being left undone, and hence turn into a potential safety error by default. Ideally, any clinical process must ensure there is a clear definition and understanding of roles in terms of who is responsible for, and who must undertake, each action or activity. The technique of process mapping, using swim lanes to map responsibilities against activities within a process, helps to identify who performs what actions and where there is ambiguity or duplication that may lead to delay in action, inaction or error (Wohed et al., 2006).
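As a toy illustration of the swim-lane idea, the following Python sketch checks a role-to-activity mapping for activities with no owner or with multiple owners; the activity and role names are invented for the example and are not taken from the chapter.

# Hypothetical pathway activities mapped to responsible roles (swim lanes).
activity_roles = {
    "prescribe medication": ["doctor"],
    "administer medication": ["nurse"],
    "confirm patient identity": [],                      # no owner: risk of inaction
    "interpret blood results": ["doctor", "registrar"],  # two owners: ambiguity
}

# Flag the two failure patterns discussed above: unassigned and duplicated
# responsibility, both of which may lead to delay, inaction or error.
for activity, roles in activity_roles.items():
    if not roles:
        print(f"UNASSIGNED: '{activity}' has no responsible role")
    elif len(roles) > 1:
        print(f"AMBIGUOUS:  '{activity}' is owned by {', '.join(roles)}")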

Communication

Communication, 'the transfer of information, ideas, feelings' (Flin et al., 2009), is often critical to efficient and effective task execution in a medical environment. Communication failures are a major contributor to medical errors, particularly in operating theatres. Communication takes many forms, in both human and machine transfer of information, and failures range from the absence of communication, to the type and comprehension of the message, to whether the message is timely and appropriate (Rosenorn-Lanng & Michell, 2014). Problem identification, for example in a surgical situation, requires the free input of open views, which may be inhibited by human factors (Rosenorn-Lanng, 2014). Similarly, critical decision making, as in medical diagnosis, requires the generation, understanding and communication of alternative solutions, which can be affected by human error factors. Normal workload and task allocation and reporting rely on clear and unambiguous communication that may be affected by personal and environmental factors (Davies, 2005). Macintosh et al. identified five key characteristics of communication in healthcare: it is necessary to reduce morbidity; it must be used by all team members; it must be able to occur in the situation; it must be effective, i.e. focused on the salient points; and it must be the right type of communication (Davies, 2005).

Handover Error

Particular error examples often concern the problem of miscommunication or misinterpretation of meanings when transferring information between teams (Noble and Donaldson, 2011). This leads to the need for clear communication and handover points for information in medical processes, along with clear responsibility for communicating and, ideally, guidelines on what to communicate, if these errors are to be reduced. The World Health Organisation actively promotes pre-task briefings via a checklist (Flin et al., 2009), and a format for ensuring efficient and effective communication is given by SBAR (Situation, Background, Assessment, Recommendation), which is widely promoted among clinicians but in many cases not always used (Haig et al., 2006).
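An SBAR handover can be thought of as a four-field record. The sketch below is a minimal, assumed Python representation (the clinical content is invented) showing how the format forces each element of the communication to be stated explicitly.

from dataclasses import dataclass

@dataclass
class SbarHandover:
    """A handover message structured in the SBAR format."""
    situation: str       # what is happening now
    background: str      # relevant clinical context
    assessment: str      # what the clinician thinks the problem is
    recommendation: str  # what the clinician wants to happen

# Invented example content for illustration only.
handover = SbarHandover(
    situation="Patient in bed 4 is hypotensive, BP 85/50.",
    background="Day 2 post-op bowel resection; on IV fluids.",
    assessment="Possible hypovolaemia or early sepsis.",
    recommendation="Review within 15 minutes; repeat bloods and cultures.",
)
print(handover.recommendation)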


Systems and Human Factors

The human within a clinical situation almost always operates within a framework of information and organisational structures that we refer to as 'systems' in this context. A health organisation operates within an arrangement of informal behaviour rules, driven by values and beliefs; formal behaviour rules, as dictated by standard procedures; and technical behaviours, driven by the technology used within the health enterprise. Formal information artefacts within a hospital may range from detailed protocols and procedures to care bundles and pathways. Typical sources of information error are multiple and confusing copies of information, and ambiguous information leading to different perceptions and actions (Rosenorn-Lanng & Michell, 2014).

2.6 The Work Environment

Reason coined the term 'local traps' to identify working environment conditions that, in conjunction with violations of procedures and human error, can create unsafe and risky patient situations (Reason, 1998). Rosenorn-Lanng, an experienced clinical practitioner, in her research stratified the environmental conditions contributing to error into static elements, relating to the physical structure and the location and arrangement of resources, and dynamic elements, relating to interruptions to process and location issues (Rosenorn-Lanng, 2014). Environmental factors such as lack of resources, whether through lack of knowledge of their location or inability to access them, can be instrumental in denying vital and sometimes lifesaving care. Hospitals also have a frequent need to move clinicians and patients, and dynamic issues such as journey time, or blockages and delays in the movement of resources, can reduce decision time and create pressure that leads to mistakes and slips. A simplified set of environmental factors is illustrated in Figure 4. The main goal of introducing human factor controls into clinical pathways is not so much to minimise a particular error but to enhance human performance at different levels of the system.

2.7 Clinical Pathways

The key means of managing planned clinical intervention in many hospitals is the grouping of procedures, information and guidelines around a route or path of treatment for a specific medical condition: what are often known as clinical pathways. There are a range of definitions of clinical pathways. Clinical pathways, also known as care maps or anticipated recovery pathways, were introduced in 1985 and are an attempt at developing practical standard operating procedures for clinical processes (Li et al., 2014).


Figure 4. Static environment factors (adapted from Rosenorn-Lanng, 2014)

Clinical pathways (CPs) represent an approach by healthcare organisations to develop an ideal planned sequence of steps to minimise risks and variations in clinical intervention (Cabitza et al., 2008). Despite the lack of formal industrial-type process design of pathways in many health organisations, other than formalisation by committee, their specifications draw on proven clinical best practices from medical guidelines (Cabitza et al., 2008) and are an attempt to standardise care processes (Ye et al., 2008). In reality, CPs are often realised as collections of disparate and abbreviated blocks of information for use by many disciplines (nurses, surgeons, anaesthetists) that represent the distillation of best clinical practice for the treatment of a specific medical condition; they are specific to each institution and are generally used as an organisational management tool for coordinating clinicians' actions for a specific patient condition (Audimoolam et al., 2005). Most clinical pathways are still paper-based and designed for the ideal patient scenario, and include both planning information and mechanisms to record variations in actual clinical interventions (Michell et al., 2012). A CP serves as a useful guide for more detailed analysis of activities and of the relationship of human factors to clinical work. Formal clinical pathways, using and enforcing well documented tasks, protocols and specified goals, are known to reduce slips because of the structured support information for activities in the pathway documentation (Panella et al., 2003). A correctly identified and disseminated clinical pathway can act as a scaffold onto which human factor knowledge and controls can be welded to better manage their outcomes.
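Since a pathway is described here as a planned sequence of steps plus a mechanism for recording variations, one minimal, assumed data model is sketched below in Python; the condition, step names and variance note are illustrative only.

from dataclasses import dataclass, field

@dataclass
class PathwayStep:
    name: str
    planned_action: str

@dataclass
class ClinicalPathway:
    """A planned sequence of steps with a log of variations from plan."""
    condition: str
    steps: list[PathwayStep]
    variances: list[str] = field(default_factory=list)

    def record_variance(self, step_name: str, note: str) -> None:
        # Variance recording mirrors the paper forms described above.
        self.variances.append(f"{step_name}: {note}")

pathway = ClinicalPathway(
    condition="elective hip replacement",
    steps=[PathwayStep("pre-op", "anaesthetic assessment"),
           PathwayStep("surgery", "implant per protocol")],
)
pathway.record_variance("pre-op", "assessment delayed by 24 hours")
print(pathway.variances)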



3. MODELLING HUMAN BEHAVIOUR AND ERROR

This section addresses how we can model human behaviour by examining how humans use cultural rules. These cultural rules take the form of behavioural norms that drive many human actions and mistakes, and modelling them is a precursor to better identification of human factors and their influence on the safety outcomes of clinical pathways. We follow this with a review of the human factor and risk measurements that can be applied to discover where human factor errors are most likely to occur within pathways.

3.1 Rules and Human Norms

An organisation can be visualised as a social system in which people behave in an organised manner, conforming to a certain system of norms. These norms relate to rules, regulations and patterns (Wright, 1963). Norms are often referred to as rules that are shared by and defined by a culture, such as norms in religion, law and social convention. In an organisation, norms reflect the regular behaviour of members and enable the co-ordination of their actions. Norms are developed through the practical experiences of agents in a society and in turn have the functions of directing, coordinating and controlling actions within that society (Liu, 2000). An organisation can be modelled as a system of social agents where people conduct themselves in an organised way by conforming to regularities of perception, behaviour, belief and value. The function of a norm is to determine whether patterns of behaviour are lawful or acceptable in the context of the society. Norms can thus be seen as a form of standards for executing behaviour for the members of a cultural group who wish to conform to these norms to coordinate their actions. Identifying, capturing and modelling norms, or human rules governing behaviour, enables us partly to predict and anticipate human actions and human co-ordination with other agents.

Norms can be categorised in a variety of ways. Four types of norms that govern substantive human actions or behaviour can be identified, each of which controls an aspect of human behaviour: perceptual norms, evaluative norms, cognitive norms and behavioural norms. Perceptual norms are human rules to guide pattern recognition, for example how a clinician might perceive a cancerous lump. Perceptual norms need to be used in conjunction with evaluative norms, which are essentially rules that identify the relative value and ranking necessary for decision making, for example between the importance of one pattern or view over another. Cognitive norms relate to cause and effect and logic, such as norms that relate to evidence based medical treatments and the knowledge and implications of specific perceptions and beliefs. Finally, behavioural norms are essentially human rules for performing specific activities. Norms can also be categorised according to


whether they reflect the human rule for a specific action (substantive norms), a rule to communicate only (communication norms), or a human rule that dictates the control of actions (control norms) (Stamper, 1994). For a complete description of the different types of norms, see Stamper (1994).

Norms relate to human behaviour, which can be categorised according to the type of interaction. Firstly, informal human behaviour is shaped by norms related to living, such as religion, laws and cultural rules. Secondly, the evolution of industry has led to formal norms, seen in standard operating procedures and rules regarding business interaction; these formal norms are governed by roles and organisational sanctions. Thirdly, in operating technical equipment, technical norms have developed, such as the need to swipe the screen of an iPhone in order to operate it. Identifying these repeated rules can help provide a type of formal logic around what can sometimes be seen as complex human behaviour.

But formalisation has limits, even when the norms can be defined explicitly, because members of an organisation need to interpret them. As we have seen with human error, human interpretation, even of important and life critical rules, can be prone to error. Also, many norms defy formalisation because of their complexity and sensitivity to human values, views and beliefs, which are subject to many dynamic factors in a situation. Hence, any formalisation devised must include agents (authorities of action who can take explicit responsibility for their actions) who are part of the system and cannot be expressed in mathematical and logical symbols. Identifying implicit norms in examples of human factor behaviour helps us to understand and develop appropriate controls, i.e. control norms to counteract the influence of human factor driven errors and slips.

Modelling Norms

Norms can be modelled in the form: if <condition> then <action>. Risk measurement techniques such as Failure Mode and Effects Analysis (FMEA) score candidate failures, and a score greater than 7 is considered a high risk factor contributor. FMEA assumes that a process model or documentation is available, that events leading to failure can be identified, and that remedial risk reduction actions are also identifiable (Reason, Manstead et al., 1990). Muehlen et al. advocate a technique for risk aware process modelling that takes into account a risk structural model for risk composition, a goal model relating risks to missed goals, and a risk state model to evaluate dynamic risk combinations (Rosemann and Zur Muehlen, 2005). Reason reminds us that successful high reliability organisations use tools and reminders to help them remember the safe approach to actions (Reason, 2000). Human behaviour factors are factors that other authors suggest should be borne in mind both when patient safety problems are mapped and when tools are chosen (Chiozza and Ponzetti, 2009). It is well known that clinical risk can be contained by risk management initiatives provided they cover all areas in which risk can emerge (Verbano and Turra, 2010), thus necessitating a process that considers a wide range of risk factors in an accessible form. However, little work has been undertaken to integrate these methods. Our approach builds on Muehlen's three models and FMEA risk analysis to a) identify the safety risks in clinical pathways and b) identify the specific human factor risks based on knowledge of human factors in an accessible form, such as the SHEEP model, so that these c) can complement traditional hard predictive controls and softer human behaviour controls.
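As a rough sketch of how the if-condition-then-action view of norms might be combined with FMEA-style scoring, the Python code below rates hypothetical pathway failure modes using the conventional risk priority number (the product of severity, occurrence and detectability scores, a standard FMEA convention rather than a formula given in this chapter) and emits a control norm for high-risk steps. All step names, scores and the threshold are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class FailureMode:
    step: str
    severity: int    # 1-10: how serious the harm would be
    occurrence: int  # 1-10: how often the failure is expected
    detection: int   # 1-10: how hard the failure is to detect

    @property
    def rpn(self) -> int:
        # Conventional FMEA risk priority number.
        return self.severity * self.occurrence * self.detection

# Hypothetical failure modes identified for steps in a pathway.
modes = [
    FailureMode("drug dose calculation", severity=9, occurrence=4, detection=6),
    FailureMode("patient identity check", severity=8, occurrence=2, detection=3),
]

RPN_THRESHOLD = 100  # illustrative cut-off, not taken from the chapter
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    if m.rpn > RPN_THRESHOLD:
        # A control norm expressed in if-condition-then-action form.
        print(f"IF performing '{m.step}' THEN require an independent "
              f"double-check (RPN={m.rpn})")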

4. INTEGRATING HUMAN FACTORS INTO CLINICAL PATHWAY MANAGEMENT

4.1 Controls

Controls involve the management of a goal situation by understanding and perceiving a difference between the desired and actual goal state (the control error) and then identifying an action, the 'control action' or 'control', to move the undesired state back towards the desired state. Muehlen identified four business risk reduction approaches, of which three apply in the case of clinical processes. A risk relates to the probability that some undesired state may occur, such as the opportunity for an error. Firstly, risks can be reduced by the introduction of controls to better ensure goal outcomes and reduce risk events (Roseman and Muehlin, 2005). Secondly,


risks can be avoided by altering or redesigning the process for specific actions and resources. Finally, risks can be accepted and their impact reduced by pre-prepared contingency plans.

Sadiq et al. (2007) suggest the need to ensure a systematic approach to business objectives and control objectives in process design. Although Sadiq's focus was compliance controls, it applies equally well to error controls. What is needed is explicit analysis and modelling of the process, with defined and reasoned control objectives set against a defined clinical risk, and a set of internal controls to reduce the risk. What often happens in clinical processes is that controls in the form of checklists (Grieshaber et al., 2009) are added as a system of reminders to prevent reoccurrence only after errors have occurred and a root cause analysis has been carried out. We will focus on controls, which mainly relate to quality control after the event, and on process changes, which relate to integrating quality assurance into the process by removing or significantly reducing the possibility of error.

4.2 Predictive, Personal and Cultural Controls

Predictive Controls

The use of FMEA or other methods can produce a set of 'predictive error control points', i.e. Muehlen's 'controls to reduce and mitigate risk events', where predictive controls are defined as controls able to be set up for a known process a priori, or ahead of time, to catch predicted failures. Many medical procedures are designed as sets of formal rules or norms acting as predictive controls, identified when reviewing evidence based practice or experience. There are two main types. The first are controls that focus on checking quality after the activity or state change has occurred, i.e. quality controls; one of the most frequent examples is the use of checklists or check points to affirm that the control goal has been reached (Semel et al., 2010). The second type of predictive control is a quality assurance based control, where the process and activities are designed to actively prevent the error occurring, i.e. ensuring quality by not allowing error outcomes to occur, through process design or change (Roseman and Muehlin, 2005). Predictive controls require a good knowledge of the actual activities and behaviours and their variations, the actions and states of the stakeholders, and known failure modes. This enables the identification

Clinical Pathways and the Human Factor

of control objectives and needed controls at appropriate risk points. Predictive controls can be modelled as formal control norms or standard operating procedures.

Control of Perception Errors

Perception activities can be identified in clinical process models by the use of verb–noun combinations such as 'check', 'monitor', 'review' and 'evaluate'. Typically, clinical activities relating to perception involve diagnosis and assessment using the clinician's knowledge and experience of similar patterns and their meaning. Errors of perception such as slips can often be controlled by the inclusion of checklists, which reduce or remove forgetfulness and ensure the clinician focuses on the specific features that fit the pattern. Checklists are widely used in medicine [ref] and are often included as part of clinical pathway paperwork. However, repeated use can numb the cognitive need for the checklist and result in simple box ticking, or in reverse engineering of the checklist after the event. Incorrect action plans can be reduced by cues and guides, such as protocols and phone apps that provide assurance of the right action given specific cues, and by visual pattern charts that reduce perception cue error.
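As a rough illustration of how such perception activities might be flagged automatically, the sketch below scans activity labels in a pathway model for the verb cues listed above. The activity labels and the simple leading-verb matching rule are illustrative assumptions, not part of the authors' method.

```python
# Illustrative sketch: flag pathway activities that involve perception
# (and hence are candidate points for checklist-style controls) by
# scanning activity labels for the verb cues named in the text.
# The activity labels below are invented examples.

PERCEPTION_VERBS = {"check", "monitor", "review", "evaluate"}

activities = [
    "Check patient identity band",
    "Administer prophylactic antibiotic",
    "Monitor blood pressure post-op",
    "Review discharge medication list",
]

def is_perception_activity(label: str) -> bool:
    # Match on the leading verb of the activity label.
    return label.split()[0].lower() in PERCEPTION_VERBS

control_candidates = [a for a in activities if is_perception_activity(a)]
print(control_candidates)
# ['Check patient identity band', 'Monitor blood pressure post-op',
#  'Review discharge medication list']
```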

Control of Rule Based Errors

Context and rule error points are more difficult to identify, often because they are part of the tacit knowledge of the physician or clinician and their internal decision process. However, they are highly likely to occur at clinical decision points of diagnosis and substantive intervention. Peleg and Tu (2009) advocate formal specification of clinical guidelines, with detailed checking and updating, to detect rule-based and related errors in clinical guideline and pathway documents. The introduction of electronic records, related automation and personal phone apps is likely to make this checking more automatic, especially with the increasing use of artificial intelligence and the development of Internet of Things applications in medical systems (see Michell, 2014 for more details). However, one partial control is to make the rule and decision-making process explicit by ensuring clinicians verbalise their decision rules, making them open to feedback.
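A minimal sketch of what making decision rules explicit could look like in software is given below: each rule carries a stated rationale that is voiced or logged whenever it fires, so the reasoning is auditable. The rule content and thresholds are hypothetical and do not represent a real clinical guideline.

```python
# Sketch of making a tacit decision rule explicit: each rule carries a
# human-readable rationale that is voiced/logged whenever it fires, so
# the reasoning is open to feedback. The rule content is hypothetical.

from dataclasses import dataclass
from typing import Callable

@dataclass
class DecisionRule:
    name: str
    condition: Callable[[dict], bool]
    action: str
    rationale: str

rules = [
    DecisionRule(
        name="escalate-sepsis-review",
        condition=lambda obs: obs["temp_c"] >= 38.3 and obs["heart_rate"] > 90,
        action="request senior review for possible sepsis",
        rationale="temperature and heart rate both exceed screening thresholds",
    ),
]

def decide(observations: dict) -> None:
    for rule in rules:
        if rule.condition(observations):
            # Verbalising/logging the rule makes the tacit step auditable.
            print(f"Applying rule '{rule.name}': {rule.action} "
                  f"(because {rule.rationale})")

decide({"temp_c": 38.6, "heart_rate": 102})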

Control of Knowledge Based Errors

Knowledge based error is also difficult to plan for and manage due to its tacit nature. However, it can often be identified in clinical simulations, where there is the possibility of discussing and analysing the thought process and knowledge required for the activity. In future, the use of the Internet of Things (IoT) and sensor information may enable the detection of such errors from the record of follow-on activities (Michell, 2014). A potential solution to knowledge based errors is increased planning and design of the process or clinical pathway and its work activities to facilitate a predictive, common shared understanding. Provided there is minimal risk of groupthink or dominance by one party, this should by default enable reasoned mental models to be developed that can survive error situations (Jalote-Parmar et al., 2008). This often involves extensive training and simulation to ensure the shared knowledge is identified, integrated and used effectively as varying situations occur.

Identifying Predictive Control Points

A key issue is to identify where specific failures may occur and where specific predictive controls can be applied. The use of the SHEEP model at one hospital has helped by providing a checklist of factors and categories of failure (Rosenorn-Lanng & Michell, 2014). When a medical department develops an error report, the SHEEP human factor groups and types can be used to categorise the contributing factors and the weight of their contribution. Analysing the contributing factors yields a measure of the highest-impact or most frequently occurring human factors. This information can then be used to identify where in a clinical pathway such human factor failures are likely to occur, and to develop the predictive controls mentioned above.
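The following sketch illustrates the kind of analysis described above: tallying contributing human factors across a set of error reports to surface the most frequent categories. The category labels follow the SHEEP acronym (Systems, Human interaction, Environment, Equipment, Personal), but the report data is invented for illustration.

```python
# Sketch of the error-report analysis described above: tally SHEEP
# human-factor categories across incident reports to find the most
# frequent contributors. The reports themselves are invented.

from collections import Counter

incident_reports = [
    {"id": 1, "factors": ["Personal:fatigue", "Systems:handover"]},
    {"id": 2, "factors": ["Human interaction:communication"]},
    {"id": 3, "factors": ["Personal:fatigue", "Environment:interruptions"]},
]

factor_counts = Counter(
    factor for report in incident_reports for factor in report["factors"]
)

# The most common factors point to where in the pathway predictive
# controls are most likely to be needed.
for factor, count in factor_counts.most_common():
    print(f"{factor}: {count}")
```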

4.3 Formal and Informal Norms

As we have seen, modification and control of human behaviours ultimately depends on the individual and their adherence to rules. Hence, to increase control over human factors, we can develop and apply more formal norms through the training and enforcement of new procedures and role specifications as part of a process improvement or redesign exercise.

Control Norms for Situational Awareness and Coning of Attention

Both situational awareness and coning of attention suggest a need to be aware of where there is a high risk of these situations arising in clinical activities. Solutions for reducing loss of situational awareness include organising information to ease understanding and match personal goals, the addition of cues, and training in pattern recognition and multi-tasking (Stanton et al., 2001). This suggests a general rule or norm: whenever <high risk action> and <focused cognitive work by a single individual or specialist> then <ensure an independent second observer with the knowledge and authority to enable the effective transmission of salient events, states or information to the individual or team doing the cognitive work>. Other solutions involve more elaborate rule sets.
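A hedged sketch of how such a norm could be represented as a data structure, so that norms can be attached to pathway activities and checked mechanically, is shown below; the field names are our own illustrative choices, not a notation from the chapter.

```python
# Sketch of the "whenever <trigger> ... then <control>" norm above as a
# data structure. Field names and wording are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ControlNorm:
    whenever: str   # triggering situation
    if_cond: str    # qualifying condition on the situation or actor
    must: str       # the required control behaviour

second_observer_norm = ControlNorm(
    whenever="high risk action",
    if_cond="focused cognitive work by a single individual or specialist",
    must=("ensure an independent second observer with knowledge and "
          "authority to transmit salient events, states or information "
          "to the individual or team doing the cognitive work"),
)

print(f"Whenever <{second_observer_norm.whenever}> "
      f"and <{second_observer_norm.if_cond}> "
      f"then <{second_observer_norm.must}>")
```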

Personal Controls

Whilst formal controls can be enforced by the organisation, the human focus of medical processes can also be moderated by the humans themselves. Bénabou and Tirole's research on personal commitment and self-control identified personal rules that individuals use to manage their behaviour (Bénabou and Tirole, 2004). Ainslie (1992, cited in Bénabou and Tirole, 2004) defined personal rules as willpower impulse controls that prevent the temptation to act in a way that damages the individual. We adapt Bénabou and Tirole's and Ainslie's view to identify a generic set of personal norms. However, norms are subject to willpower and imperfect recall, which can alter their use and effectiveness. Nevertheless, in our research we identified good examples of personal rules that clinicians had used, driven by strong willpower (an important self-regulating mechanism advocated by Baumeister et al. (1994)) following a bad event or 'near miss' that emphasised the need for personal control over future events of this type.

We therefore define a second set of controls as personal controls: informal control rules or heuristics, i.e. behavioural norms used by the individual to ensure the correct outcome of an action. Personal controls are informal in that they are tacit and not formally codified by the organisation. They depend on the individual's character and self-discipline for their introduction, and are typically the result of experience and concern about the outcome of an activity; they form part of the individual's set of behavioural norms. For example, one interviewee from a patient safety survey (Rosenorn-Lanng & Michell, 2014) always verbally repeated drug volume and strength information and asked for a second check whenever they knew themselves to be tired, reducing the possibility of a perception or epistemic error. Hence an applicable norm might be:

Whenever <administering a drug> if <the clinician is tired> must <verbally repeat the drug volume and strength and ask for a second check>

Another clinical practitioner suggested that their personal control norms for the parts of the clinical pathway (CP) they were responsible for involved careful checking: 'It is easy to become blasé and so I check each stage carefully and I ensure that I involve others and their views.'

Whenever <reviewing a clinical pathway> if <responsible physician> must <check each stage carefully and involve others and their views>

In FMEA terms, high patient-risk activities are those with a potentially catastrophic impact (I > 7) on the patient if a key action is mistaken – '1 step to disaster' – for example, miscalculating the quantity of a lethal drug so as to give an overdose, or removing the wrong organ [ref]. Such activities also have low detection rates, with few steps in which errors are detected as a result of few or poor control norms, i.e. D > 7. But how do we identify P > 7? For example, anaesthetists and nurses routinely administer potentially lethal drugs with very few failures, and hence P is low (1 or 2). This is often because the task really is routine: there is no significant change in routine leading to unexpected actions or to a reduction in perception, evaluative or cognitive capability. As Reason asserts, it is often events and unfamiliar or unprepared-for situations that give rise to safety errors. The key to how P, the probability of a patient safety risk occurring, might suddenly increase is to identify the conditions under which P will drastically rise due to the removal of 'normal' safety measures as a result of a change in the situation or human behaviour.
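The P, I and D scores above can be read in the usual FMEA way, where the risk priority number is the product RPN = P × I × D on 1–10 scales. The sketch below applies this convention to a few invented activities and flags the '1 step to disaster' pattern of high impact and poor detection; the activities and scores are illustrative, not measured.

```python
# Sketch of the P/I/D scoring discussed above, using the usual FMEA
# convention RPN = P * I * D on 1-10 scales (P: probability of
# occurrence, I: impact/severity, D: difficulty of detection).
# The activities and scores are invented for illustration.

activities = [
    # (activity, P, I, D)
    ("Routine anaesthetic drug administration", 2, 9, 3),
    ("Drug administration by a fatigued clinician", 8, 9, 8),
    ("Patient identity confirmation", 3, 10, 8),
]

for name, p, i, d in activities:
    rpn = p * i * d
    # Flag '1 step to disaster' candidates: high impact with poor
    # detection, where a rise in P would be catastrophic.
    flag = "HIGH RISK" if i > 7 and d > 7 else ""
    print(f"{name}: RPN={rpn} {flag}")
```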

Figure 7. The proposed HF-FMEA process

This requires identifying the human factor failure points and the controls employed in the formal process – typically a codified clinical pathway. If we can define these high-risk points for each elective clinical process, we can then define the human factor behaviours that could lead to a catastrophic result and design countermeasures to them. An excellent example from aviation is the risk of cabin pressure failure due to external cabin doors not being correctly locked, which led to the now-universal safety countermeasure of each cabin crew member who checks a door also checking that their opposite number's doors are safely locked or unlocked.

To identify the human factor risks and their respective controls, a multi-step process is proposed (Figure 7). Firstly, a clinical pathway process model needs to be made available or developed, typically in BPMN swim-lane format, to show the activities and the roles responsible together with any necessary metrics such as timings and decision conditions. The traditional Failure Modes and Effects Analysis (FMEA) method is used to identify failure points and modes within processes and activities, and their relative probability of occurrence and level of impact, to produce the risk priority number. Existing formal controls (quality control and assurance measures) should also be identified to determine the detectability and impact of each error, given that many errors may be contained by existing pre-event quality assurance controls. This is the first pass.

A second review, or pass, of the process is then made to identify human factor drivers of failure modes, i.e. what possible factors are likely to occur that could stimulate error.

This covers the perception and cognitive error types and the different levels of Jackson and Flin's model, i.e. cognitive skills and personal resources through to team and unit culture and organisational factors. Use should be made of models and lists of situational factors affecting safety, such as the SHEEP model, to identify which factors may occur at specific activities in the process; the environmental and systems factor lists, for example, will almost always have some relevance.

A third pass is then made to assess any dangerous combinations of failure modes that could cause knock-on effects and amplify impacts, i.e. where a minor error may drive a larger impact and probability of occurrence. For example, an incorrect patient name may lead to an incorrect, and uncorrectable, operation if a single further control point is missed, or if such an event occurs in conjunction with human factors such as tired surgeons in an emergency situation. The FMEA risk priority number list should then be re-prioritised to confirm the major risks and the major human factor drivers.

The high-risk, human-factor-driven parts of the process and its activities are then studied to identify predictive controls, in the form of active checks on quality and/or process redesign to provide quality assurance. Where possible, personal and cultural control points should be identified and the related behavioural norms extracted to support training and discussion. These behavioural norms, together with details of the potential human-factor-driven failure mechanisms, should then be used to develop relevant training and simulation exercises, with workshops organised to develop these behaviours – for example, the non-technical skills training often provided to healthcare professionals to ensure human factors awareness (Rosenorn-Lanng, 2014).
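A minimal sketch of the three-pass review as a processing pipeline is given below. The record fields, the human-factor uplift rule and the data are our own illustrative assumptions, not the authors' published algorithm.

```python
# Sketch of the three-pass HF-FMEA review described above, reduced to
# three functions over a shared list of failure-mode records.
# Fields, uplift rule and data are illustrative assumptions.

def first_pass(failure_modes):
    """Classic FMEA: score each failure mode and compute its RPN."""
    for fm in failure_modes:
        fm["rpn"] = fm["p"] * fm["i"] * fm["d"]
    return failure_modes

def second_pass(failure_modes, human_factors):
    """Attach likely human-factor drivers (e.g. SHEEP factors) to each mode."""
    for fm in failure_modes:
        fm["hf_drivers"] = human_factors.get(fm["activity"], [])
    return failure_modes

def third_pass(failure_modes):
    """Uplift modes whose human-factor drivers make knock-on effects likely,
    then re-prioritise by the adjusted RPN."""
    for fm in failure_modes:
        if fm["hf_drivers"]:
            fm["rpn"] = int(fm["rpn"] * (1 + 0.5 * len(fm["hf_drivers"])))
    return sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True)

modes = [
    {"activity": "confirm patient identity", "p": 3, "i": 10, "d": 8},
    {"activity": "record observations", "p": 4, "i": 3, "d": 2},
]
drivers = {"confirm patient identity": ["fatigue", "interruptions"]}

for fm in third_pass(second_pass(first_pass(modes), drivers)):
    print(fm["activity"], fm["rpn"], fm["hf_drivers"])
```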

6. CONCLUSION

This chapter has reviewed current thinking and developed a methodology for the identification and control of human error in clinical pathways, based on the research of the two authors. Using examples from the literature, we have proposed and described a method of identifying and integrating the study and control of human factors into a traditional FMEA method of process and clinical risk analysis. We have suggested a semiotic, norm-based approach for the identification of human factor behaviours and norm-based controls, and have explained how traditional predictive controls and risk avoidance by process redesign can be combined with the identification of human factor controls. We have shown how the HF-FMEA approach and the identification of organisational, team and individual risks and errors can be used to define risk points, and how new cultural and personal controls can augment traditional formal control regimes to reduce human factor error. This is achieved through the application of an adapted risk model to identify the human factor risks, and the modelling and integration of human factors and controls in clinical pathways, as demonstrated by an actual simulation example. We hope this will result in more rigorous control of the care process, ensuring completeness, consistency and patient safety by enabling the mapping of formal and informal safety controls into clinical pathways.

REFERENCES

Audimoolam, S., Nair, M., Gaikwad, R., & Qing, C. (2005). The role of clinical pathways in improving patient outcomes. Retrieved from cs.dal.ca

Bénabou, R., & Tirole, J. (2004). Willpower and personal rules. Journal of Political Economy, 112(4), 848–886. doi:10.1086/421167

Cabitza, F., Simone, C., & Sarini, M. (2008). Knowledge artifacts as bridges between theory and practice: The clinical pathway case. In Knowledge management in action (pp. 37–50). Springer US.

Chang, A., Schyve, P. M., Croteau, R. J., O'Leary, D. S., & Loeb, J. M. (2005). The JCAHO patient safety event taxonomy: A standardized terminology and classification schema for near misses and adverse events. International Journal for Quality in Health Care, 17(2), 95–105. doi:10.1093/intqhc/mzi021 PMID:15723817

Chapanis, A. (1996). Human factors in systems engineering. John Wiley & Sons.

Davies, H. T., Nutley, S. M., & Mannion, R. (2000). Organisational culture and quality of health care. Quality in Health Care, 9(2), 111–119. doi:10.1136/qhc.9.2.111 PMID:11067249

Dhillon, B. S. (2003). Methods for performing human reliability and error analysis in health care. International Journal of Health Care Quality Assurance, 16(6), 306–317. doi:10.1108/09526860310495697

Flin, R., Winter, J., & Cakil Sarac, M. R. (2009). Human factors in patient safety: Review of topics and tools. World Health, 2.

Grieshaber, D. C., Armstrong, T. J., Chaffin, D. B., Keyserling, W. M., & Ashton-Miller, J. (2009). The effects of insertion method and force on hand clearance envelopes for rubber hose insertion tasks. Human Factors, 51(2), 152–163. doi:10.1177/0018720809336414 PMID:19653480

Haig, K. M., Sutton, S., & Whittington, J. (2006). SBAR: A shared mental model for improving communication between clinicians. Joint Commission Journal on Quality and Patient Safety, 32(3), 167–175.

Kohn, L. T., Corrigan, J. M., & Donaldson, M. S. (Eds.). (2000). To err is human: Building a safer health system (Vol. 6). National Academies Press.

Leape, L. L. (1994). Error in medicine. Journal of the American Medical Association, 272(23), 1851–1857. doi:10.1001/jama.1994.03520230061039 PMID:7503827

Li, W., Liu, K., Yang, H., & Yu, C. (2014). Integrated clinical pathway management for medical quality improvement – based on a semiotically inspired systems architecture. European Journal of Information Systems, 23(4), 400–417. doi:10.1057/ejis.2013.9

Liu, K., & Dix, A. (1997, May). Norm governed agents in CSCW. In The First International Workshop on Computational Semiotics. IGI.

Michell, V. (2014). The Internet of Things and opportunities for pervasive safety monitored health environments. In Patient safety and quality care through health informatics. IGI Global. doi:10.4018/978-1-4666-4546-2.ch020

Michell, V., Tehrani, J., & Liu, K. (2012). Are clinical documents optimised for patient safety? A critical analysis of patient safety outcomes using the EDA error model. Health Policy and Technology, 1(4), 214–227. doi:10.1016/j.hlpt.2012.10.003

Milligan, F. J. (2007). Establishing a culture for patient safety – the role of education. Nurse Education Today, 27(2), 95–102. doi:10.1016/j.nedt.2006.03.003 PMID:16713030

Mohr, J. J., Abelson, H. T., & Barach, P. (2002). Creating effective leadership for improving patient safety. Quality Management in Health Care, 11(1), 69–78. doi:10.1097/00019514-200211010-00010 PMID:12455344

Noble, D. J., & Donaldson, L. J. (2011). Republished paper: The quest to eliminate intrathecal vincristine errors: A 40-year journey. Postgraduate Medical Journal, 87(1023), 71–74. doi:10.1136/qshc.2008.030874rep PMID:21173052

Osborn, S., & Williams, S. (2004). Seven steps to patient safety: An overview guide for NHS staff. Retrieved from http://www.npsa.nhs.uk/nrls/improvingpatientsafety/patient-safety-tools-and-guidance/7steps/

Panella, M., Marchisio, S., & Di Stanislao, F. (2003). Reducing clinical variations with clinical pathways: Do pathways work? International Journal for Quality in Health Care, 15(6), 509–521. doi:10.1093/intqhc/mzg057 PMID:14660534

Peleg, M., & Tu, S. W. (2009). Design patterns for clinical guidelines. Artificial Intelligence in Medicine, 47(1), 1–24. doi:10.1016/j.artmed.2009.05.004 PMID:19500956

Piko, B. F. (2006). Burnout, role conflict, job satisfaction and psychosocial health among Hungarian health care staff: A questionnaire survey. International Journal of Nursing Studies, 43(3), 311–318. doi:10.1016/j.ijnurstu.2005.05.003 PMID:15964005

Pronovost, P., & Sexton, B. (2005). Assessing safety culture: Guidelines and recommendations. Quality & Safety in Health Care, 14(4), 231–233. doi:10.1136/qshc.2005.015180 PMID:16076784

Rasmussen, J. (1983). Skills, rules, and knowledge; signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13(3), 257–266.

Reason, J. (1995a). Understanding adverse events: Human factors. Quality in Health Care, 4(2), 80–89.

Reason, J. (1995b). Safety in the operating theatre – Part 2: Human error and organisational failure. Current Anaesthesia and Critical Care, 6(2), 121–126. doi:10.1016/S0953-7112(05)80010-9

Reason, J. (1998). Achieving a safe culture: Theory and practice. Work and Stress, 12(3), 293–306. doi:10.1080/02678379808256868

Reason, J. (2000). Human error: Models and management. BMJ, 320(7237), 768–770. doi:10.1136/bmj.320.7237.768 PMID:10720363

Reason, J., Manstead, A., Stradling, S., Baxter, J., & Campbell, K. (1990). Errors and violations on the roads: A real distinction? Ergonomics, 33(10-11), 1315–1332. doi:10.1080/00140139008925335 PMID:20073122

Rosemann, M., & Zur Muehlen, M. (2005). Integrating risks in business process models. ACIS 2005 Proceedings, 50.

Rosenorn-Lanng, D. (2014). Human factors in healthcare: Level one. Oxford University Press.

Rosenorn-Lanng, D., & Michell, V. (2014). The SHEEP model: Applying near miss analysis. In Patient safety and quality care through health informatics. IGI Global.

Sadiq, S., Governatori, G., & Namiri, K. (2007). Modeling control objectives for business process compliance. In Business process management (pp. 149–164). Springer Berlin Heidelberg. doi:10.1007/978-3-540-75183-0_12

Semel, M. E., Resch, S., Haynes, A. B., Funk, L. M., Bader, A., Berry, W. R., & Gawande, A. A. (2010). Adopting a surgical safety checklist could save money and improve the quality of care in US hospitals. Health Affairs, 29(9), 1593–1599. doi:10.1377/hlthaff.2009.0709 PMID:20820013

Shappell, S. A., & Wiegmann, D. A. (2000). The human factors analysis and classification system – HFACS (DOT/FAA/AM-00/7). US Federal Aviation Administration, Office of Aviation Medicine.

Smits, M., Groenewegen, P., Timmermans, D., van der Wal, G., & Wagner, C. (2009). The nature and causes of unintended events reported at ten emergency departments. BMC Emergency Medicine, 9(1), 16. doi:10.1186/1471-227X-9-16 PMID:19765275

Spath, P. L. (2003). Using failure mode and effects analysis to improve patient safety. AORN Journal, 78(1), 15–37. doi:10.1016/S0001-2092(06)61343-4 PMID:12885066

Stamper, R. (1994). Social norms in requirements analysis: An outline of MEASUR. Academic Press Professional.

Stanton, N. A., Chambers, P. R., & Piggott, J. (2001). Situational awareness and safety. Safety Science, 39(3), 189–204. doi:10.1016/S0925-7535(01)00010-8

Swidler, A. (1986). Culture in action: Symbols and strategies. American Sociological Review, 51(2), 273–286. doi:10.2307/2095521

Verbano, C., & Turra, F. (2010). A human factors and reliability approach to clinical risk management: Evidence from Italian cases. Safety Science, 48(5), 625–639. doi:10.1016/j.ssci.2010.01.014

Vincent, C., Neale, G., & Woloshynowych, M. (2001). Adverse events in British hospitals: Preliminary retrospective record review. BMJ, 322(7285), 517–519. doi:10.1136/bmj.322.7285.517 PMID:11230064

Wiegmann, D. A., Eggman, A. A., ElBardissi, A. W., Parker, S. H., & Sundt, T. M., III. (2010). Improving cardiac surgical care: A work systems approach. Applied Ergonomics, 41(5), 701–712. doi:10.1016/j.apergo.2009.12.008 PMID:20202623

Wohed, P., van der Aalst, W. M., Dumas, M., ter Hofstede, A. H., & Russell, N. (2006, September). On the suitability of BPMN for business process modelling. In International Conference on Business Process Management (pp. 161–176). Springer Berlin Heidelberg. doi:10.1007/11841760_12

Ye, Y., Jiang, Z., & Yang, D.-A. (2008). Semantics-based clinical pathway workflow and variance management framework. In IEEE/SOLI 2008: IEEE International Conference on Service Operations and Logistics, and Informatics.

Zur Muehlen, M. (2002). Workflow-based process controlling: Foundation, design, and application of workflow-driven process information systems.

APPENDIX

Figure 8. Index to Key Acronyms
