Proceedings of the Human Factors and Ergonomics Society 2016 Annual Meeting


DIGITAL HUMAN MODELS FOR AUTOMATED ULTRASOUND USER INTERFACE DESIGN

Giuseppe Andreoni, Department of Design, Politecnico di Milano, Milano, Italy
Marco Mazzola, Department of Design, Politecnico di Milano, Milano, Italy
Tiziana Atzori, IRCCS Fondazione Don Carlo Gnocchi, Florence, Italy
Federica Vannetti, IRCCS Fondazione Don Carlo Gnocchi, Florence, Italy
Lucia Modi, Esaote S.p.A., Florence, Italy
Sara D'Onofrio, Esaote S.p.A., Florence, Italy
Leonardo Forzoni, Esaote S.p.A., Florence, Italy

ABSTRACT

The purpose of this theoretical paper is to describe the development of a new technology for the automated analysis and design definition of Ultrasound (US) system User Interfaces (UI) and US transducers. US examination is a real-time, multi-factor approach which involves the sonographer's whole body; its automated evaluation, analysis and design must therefore take into account many different factors and aspects which need to be evaluated and implemented. The proposed technology, based on Digital Human Modeling (DHM) systems, would take input from multi-factor technologies such as Motion Analysis, Eye Tracking, Superficial Electromyography and Stereo Imaging, as well as physiological information such as temperature, ECG, respiration activity, etc., applied to different US users for different clinical applications and protocols. The utilization of DHM to manage and analyze these diverse requirements would drive the automated optimization of system design, in terms of ergonomics and workflow.

Copyright 2016 by Human Factors and Ergonomics Society. DOI 10.1177/1541931213601129

INTRODUCTION/THESIS

Ultrasound (US) systems are usually shared-service devices which should be able to perform examinations for the various clinical applications of different users (different professions, levels of knowledge, countries, clinical protocols). This means the US User Interface (UI) has multiple aspects that are unique in the real-time diagnostic world [1]. In order to take all of these aspects into account, the US system UI can be defined, in general terms, as the entire US system, including the control panel, main screen and any screen or reconfigurable input system such as the Soft Key Menu, US probe or US scanner body [2]. The methodology for the real-time acquisition of detailed and heterogeneous US imaging data makes it unique and complex, requiring the operator to be focused on different things at the same time: the probe (usually in the right hand), the US device (usually operated with the left hand), the patient, the main screen, the control panel and the touch screen when present (interfaced with the eyes) [3]. In addition, the ergonomic evaluation of US systems has to consider an elaborate and varied range of aspects related to the clinical application, the protocols, the type of US system and probe used, and the user's expertise. Furthermore, the variability of patients' body sizes and characteristics, and the different US system settings for the investigation of different anatomical parts, add another level of variation to be considered. The complexity of the interface, methodology and ergonomic evaluation makes it difficult to define a unique US UI design in a way that satisfies all possible users and scenarios. The approach to characterizing and analyzing US UI and probe design is difficult to define and target. This process requires the development of new technology with a suitable approach to drive the definition and implementation of US design in terms of UI and workflow solutions.

REVIEW

Ultrasound System User Interfaces – Overview

Diagnostic US systems are characterized by real-time diagnostic examination, which means operators must hold the US probe in one hand, with the other hand free for the US system UI. The concepts of UI usability and workflow, probe design and ergonomics have been discussed by US system designers and users for many years. US system and probe ergonomics have been described and addressed in many Standards and Guidance Documents from Regulatory Organizations, Healthcare Institutions and Sonographers' Associations [4-7]. Yet today there are still no Industrial Standards available for the design of US system UIs. The different design outputs of the various US system producers are usually based on the company's experience, and in most cases they prefer to continue with their own UI solutions, which their loyal customers are familiar with, even if these are not ergonomically optimized from a universal perspective. The utilization of these US system UIs can lead to the
development of Work-Related Musculoskeletal Disorders (WRMSD) and discomfort among US operators. The problem of WRMSD is even more crucial now, due to the increased number of US systems sold per year and the consequent increase in users, patients, examinations performed worldwide each year, working hours and stress [8-14]. Recently, regulatory bodies such as the FDA have also been paying growing attention to the ergonomics and usability aspects of US systems (e.g. U.S. Department of Health and Human Services, Food and Drug Administration, 2011; European Standard EN 62366, Medical devices, 2008), but so far no optimized process has been implemented. US system UIs are lacking in design standards: observational techniques and general knowledge about their utilization (by designers, consultants and operators) are the common strategies for defining a US system UI [15-22].

The expert US user, who can perform examinations following the different examination protocols used for each application, level of expertise and national/local standard, practically does not exist. Experience tells us that a US UI designed by a Cardiologist does not satisfy a Radiologist, and a US UI designed by an American sonographer does not completely satisfy European users [1, 8, 13, 14]. Gathering data and information from sonographers is complex because each subject/group practices the applications of a US scanner in different ways, with different modalities; in this scenario, it is not possible to have a general user/group reference point for all applications. Moreover, US systems are shared-service devices, designed to perform examinations for all clinical environments: the concept and technology of an automated design process for US system UIs can be helpful in addressing these concerns. Modularity and easy configuration are needed to adjust and adapt to each sonographer [13, 14]. The need to create shared-service diagnostic US products is a direct consequence of market demand.
Economic constraints require maximum return on investment, with capital equipment used in the largest possible number of clinical scenarios (and even departments).

NEW CONTRIBUTION

The technology proposed here would lead to an automated definition for US system UIs in a Digital Human Modeling (DHM) environment, where the operator is represented by an avatar that interacts with a virtual US system and its virtual UI. This new technology could be useful both for new system and probe design and for new US software development, particularly the user display and touch screen layout. Feedback from user interaction with different real US systems is used to produce an empirical model of a US system as a starting point for running simulations on the virtual US machine. The user's avatar is defined by recording and transferring the interaction between the real user and the real US system into the digital world: this could be obtained using gesture and motion recognition systems such as Motion Analysis, Eye Tracking and Superficial Electromyography (SEMG), depending on which US system UI element is under investigation (i.e., Motion Analysis for touch screen and control panel layout, SEMG for probe ergonomics and the body system, Eye Tracking for main screen layout) (see Figure 1). An optimal solution would be SEMG with wireless sensors and data management, avoiding cables and thus significantly facilitating EMG data analysis during the simulation. This would allow the measurements to be performed without the reduced comfort and space caused by a multi-lead cable system.

Figure 1 – Conceptual structure and relationship between the Real and Virtual Environment for DHM Automated US UI Design
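Before such heterogeneous recordings (motion capture, eye tracking, SEMG) can drive a single avatar, they must be brought onto a common timeline. As a minimal sketch of that alignment step, assuming illustrative sampling rates and signal names (none of which come from the paper), irregularly sampled streams can be linearly resampled onto one shared grid:

```python
# Hypothetical sketch: aligning heterogeneous sensor streams (motion capture,
# eye tracking, SEMG) onto one common timeline before feeding them to a DHM
# input collector. All names, rates and signals are illustrative assumptions.

def resample(timestamps, values, grid):
    """Linearly interpolate an irregularly sampled stream onto a common grid."""
    out = []
    i = 0
    for t in grid:
        # advance to the last sample at or before t
        while i + 1 < len(timestamps) and timestamps[i + 1] <= t:
            i += 1
        if i + 1 < len(timestamps) and timestamps[i] <= t:
            t0, t1 = timestamps[i], timestamps[i + 1]
            v0, v1 = values[i], values[i + 1]
            w = (t - t0) / (t1 - t0) if t1 > t0 else 0.0
            out.append(v0 + w * (v1 - v0))
        else:
            out.append(values[min(i, len(values) - 1)])  # hold last value
    return out

# Example: a 100 Hz motion channel resampled onto a 50 Hz common timeline.
grid = [k / 50.0 for k in range(50)]            # 1 s shared grid
motion_t = [k / 100.0 for k in range(100)]
motion_v = [t * 2.0 for t in motion_t]          # synthetic ramp signal
aligned_motion = resample(motion_t, motion_v, grid)
```

In practice the same resampling would be applied to each channel (gaze coordinates, joint angles, SEMG envelopes) so that every simulation input refers to the same instants.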

All the acquired ergonomic analyses are entered into the DHM system, using an input collector to direct all signals (if possible, also considering other physiological signals, such as ECG and respiration activity) onto the same platform. When the avatar and the empirical model are fully defined, the simulation can start. The outputs of each simulation become the inputs of the next, in order to reach an optimal result through trial and error, in an iterative process. Every simulation has the same fixed avatar, but the US UI is redesigned at each iteration, until a satisfactory result is obtained (see Figure 2). At the end of the process, the designers always have the possibility to prioritize the outputs and manage the obtained results, in terms of the major characteristics and definitions driven by the operator's needs.
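The trial-and-error loop described above can be sketched as an iterative optimization in which the avatar stays fixed while the virtual UI layout is perturbed and re-evaluated each iteration. The following is an illustrative sketch, not the authors' implementation: the cost function (a stand-in for one DHM simulation run) and the layout encoding are invented placeholders.

```python
import random

def ergonomic_cost(layout):
    """Placeholder for one DHM simulation run: lower is better.
    Here, cost is the avatar's total (Manhattan) reach to each control."""
    hand = (0.0, 0.0)  # assumed fixed hand rest position of the avatar
    return sum(abs(x - hand[0]) + abs(y - hand[1]) for x, y in layout)

def redesign(layout, rng):
    """Perturb one control position: the per-iteration UI redesign step."""
    new = list(layout)
    i = rng.randrange(len(new))
    x, y = new[i]
    new[i] = (x + rng.uniform(-0.05, 0.05), y + rng.uniform(-0.05, 0.05))
    return new

def optimize(layout, iterations=500, seed=0):
    """Fixed avatar, iteratively redesigned UI: keep only improving layouts."""
    rng = random.Random(seed)
    best, best_cost = layout, ergonomic_cost(layout)
    for _ in range(iterations):
        candidate = redesign(best, rng)
        cost = ergonomic_cost(candidate)
        if cost < best_cost:
            best, best_cost = candidate, cost
    return best, best_cost

initial = [(0.4, 0.2), (0.3, -0.1), (0.5, 0.5)]  # three control positions
final, final_cost = optimize(initial)
```

A real system would replace the cost stub with the full DHM simulation outputs (posture scores, muscular load, gaze travel), but the control flow — simulate, redesign, re-simulate until satisfactory — is the same.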


This analysis could be repeated, varying the clinical application, the clinical protocol, the patient's anthropometry, the level of user expertise, professional characteristics, etc., in order to identify an optimized solution taking different needs into account. The real user test is video-recorded and triggered by the input data collection systems, in order to make data review easier if necessary. Generally speaking, this kind of technological concept can be applied to any real-time UI, e.g. an airplane cockpit, a motor vehicle, a real-time video-recording system, a manual or semi-automated machine for industrial automation, or logistics and goods management systems for docks, stocks, ships, etc.
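Repeating the analysis across these factors amounts to a sweep over the Cartesian product of scenario variables, running one simulation per combination. A minimal sketch, assuming illustrative factor names and a deterministic scoring stub (neither taken from the paper):

```python
# Hedged sketch of repeating the DHM analysis across scenario factors.
# Factor values and the scoring stub are invented for illustration only.
from itertools import product

applications = ["cardiology", "radiology", "ob-gyn"]
anthropometries = ["5th percentile", "50th percentile", "95th percentile"]
expertise_levels = ["novice", "expert"]

def simulate(application, anthropometry, expertise):
    """Stub standing in for one full DHM simulation run (invented score)."""
    score = len(application) + len(anthropometry) + len(expertise)
    return {"scenario": (application, anthropometry, expertise),
            "score": score}

# 3 applications x 3 anthropometries x 2 expertise levels = 18 simulations
results = [simulate(a, p, e)
           for a, p, e in product(applications, anthropometries,
                                  expertise_levels)]
best = min(results, key=lambda r: r["score"])
```

Keeping one result record per scenario also supports the prioritization step described earlier: designers can rank or filter the per-scenario outputs rather than accepting a single global optimum.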


Constraints: a