
Exploring Complexity in Health: An Interdisciplinary Systems Approach A. Hoerbst et al. (Eds.) © 2016 European Federation for Medical Informatics (EFMI) and IOS Press. This article is published online with Open Access by IOS Press and distributed under the terms of the Creative Commons Attribution Non-Commercial License 4.0 (CC BY-NC 4.0). doi:10.3233/978-1-61499-678-1-302

A Practical Method for Data Handling in Multi-Method Usability Research Studies

Mattias GEORGSSON a,b,1 and Nancy STAGGERS a,c

a Department of Biomedical Informatics, University of Utah, Salt Lake City, Utah, USA
b Faculty of Computing, Blekinge Institute of Technology, Karlskrona, Sweden
c Summit Health Informatics, Salt Lake City, Utah, USA

Abstract. Background: Analyses of large, complex data sets are common in health informatics and usability research. Researchers need feasible ways of streamlining data handling and analyses. We offer a useful approach across qualitative and digitalized data. Methods: Illustrated by a usability evaluation study of a mHealth system, we present methods for managing a large set of usability data using qualitative data analysis software (QDAS). Three different data collection methods were used (usability testing, in-depth interviews, and open-ended questionnaire responses). Results: The process began at initial transcription, and all data were imported into the system. Content analysis was used throughout, from problem identification to assigning problem classifications and severity ratings to linkages with system views. Conclusion: This approach was practical and useful as it allowed the capture and synthesis of a large number of multifaceted usability problems. We recommend this approach to other researchers performing usability evaluations on large data sets.

Keywords. QDAS, content analysis, usability evaluation, chronic disease self-management

1. Introduction

Chronic diseases such as diabetes mellitus, cardiovascular disease, chronic respiratory disease, cancer and stroke cause the deaths of about 38 million people worldwide each year [1]. With the increase in the number of patients with chronic conditions comes a concomitant increase in patients' need to self-manage their diseases [2]. Information and communication technology (ICT) solutions can assist [3, 4], but usability is a known issue for these systems and applications, and it needs to be improved to increase the effectiveness and long-term adoption of these self-management solutions [5, 6].

Performing usability evaluations of these complex self-management systems, especially when multi-method approaches are used, requires the analysis of large data sets. Finding ways of streamlining data management, data analyses and data presentation is important for efficient handling of large sets of rich qualitative data. Qualitative data analysis software (QDAS) traditionally provides support for content analysis [7], but it can also be used to support the handling and analyses

Corresponding Author: Faculty of Computing, Blekinge Institute of Technology, 371 79 Karlskrona, Sweden. E-Mail: [email protected]


of large, combined data sets. This can lead to more efficient and improved analyses. We offer new ways of using QDAS that provide additional insights into large data sets, drawing descriptions and examples from a prior multi-method usability evaluation study of a mHealth self-management system [8]. The goal of this paper is to describe how QDAS can support usability problem determination across digitalized, qualitative data, with both inductive and deductive coding, for three data collection methods: a Think Aloud usability test, in-depth interviews, and descriptive, open-ended questionnaire responses (all qualitative data).

2. Methods

The purpose of the mHealth study for diabetes self-management was to evaluate the usability experienced by 10 patients [8]. The patient sample was drawn from a larger randomized controlled trial [9]. The evaluation began with a Think Aloud (TA) session in which each participant performed a series of validated tasks [8] while expressing their thoughts out loud; an observer took notes about usability issues [10]. The session was recorded digitally using audio and screen capture software. The next phase was in-depth interviews, also captured with digital audio recording software. Last, the participants filled in an electronic questionnaire.

The TA usability test and in-depth interview recordings were transcribed and uploaded into QDAS along with the digital questionnaire responses and the observer's digital notes, making all data available in a uniform textual format for further analyses. Before importing the data into the QDAS, a hierarchical structure was also created based on the different usability tasks, self-management processes and sub-tasks. This was done to facilitate tracing the origin of usability problems in the later analysis phases.

3. Results

Specific details about the results of the mHealth study are described elsewhere [9]. This methods article outlines the usefulness of QDAS in analyzing large data sets and identifying observed usability problems.

After importing study data into QDAS, the combined data were analyzed by the researchers using traditional content analysis techniques [11]. The researchers read the data multiple times and used text to describe usability issues in the system, which allowed inductive coding for themes. In the QDAS used here, NVivo 10 (QSR International Pty Ltd, Doncaster, Victoria, Australia), coding themes are called nodes. For example, the node for an identified usability problem might contain text from several different data collection methods and from several different participants, all identified and linked to the same usability problem in one theme (node).

After usability problems were identified, they were associated with the specific interaction task participants had performed. Then, the type of usability problem was deductively coded using the Usability Problem Taxonomy (UPT) [12]. This provided problem classification into an artifact component (Visualness, Language and


Manipulation) and/or a task component (Task-mapping and Task-facilitation) [12]. Severity ratings for each usability problem were assigned on a scale from 1 to 4 [13], depending on the seriousness of the problem to the user. Finally, each usability issue was also associated with the specific view or graphical interface in the system.

Critical linkages and analyses were made among the UPT artifact or task component classifications and the associated severity levels for each usability problem. Both inductive and deductive ratings were coded and, importantly, linked to each usability problem by adding the choices as attributes to each theme (node) in NVivo. This created linkages between each identified problem, its task in the mHealth system, and its associated UPT classification and severity rating. It was also important to ascribe each problem to the specific view or graphical interface where it was detected. To facilitate this in the QDAS, free themes (case nodes) were created that provided attribute assignment for each system view in the mHealth system.

Finally, a series of analyses were performed in QDAS to query the system on specific questions. We could then select and compare different coding attributes with each other and visualize the answers in tables and graphical representations such as diagrams and figures (Model explorer), as well as theme maps that visualized the usability data and supported further analyses in QDAS.
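To make the linkages described above concrete, the following is a minimal sketch in plain Python of the underlying data model: each usability problem carries attributes for UPT component, severity (1-4), task and system view, which can then be queried and cross-tabulated in the spirit of an NVivo matrix query. This is an illustrative sketch, not NVivo's actual feature set, and all example problems, tasks and views are hypothetical.

```python
# Hedged sketch of problem records with coded attributes (not NVivo itself).
from collections import Counter
from dataclasses import dataclass

@dataclass
class UsabilityProblem:
    description: str
    upt_component: str   # e.g. "Visualness", "Language", "Task-mapping"
    severity: int        # 1 (low) .. 4 (critical), per Travis [13]
    task: str            # interaction task where the problem was observed
    view: str            # system view / graphical interface

# Hypothetical example data, not findings from the study.
problems = [
    UsabilityProblem("Glucose unit label unclear", "Language", 3,
                     "Enter a glucose reading", "Entry view"),
    UsabilityProblem("Save button hard to find", "Visualness", 4,
                     "Enter a glucose reading", "Entry view"),
    UsabilityProblem("History graph lacks zoom", "Task-facilitation", 2,
                     "Review glucose history", "History view"),
]

# Matrix-style query: count problems per (UPT component, severity) pair.
matrix = Counter((p.upt_component, p.severity) for p in problems)
for (component, severity), n in sorted(matrix.items()):
    print(f"{component} / severity {severity}: {n}")

# Attribute filter: all critical (severity 4) problems in one system view.
critical_entry = [p.description for p in problems
                  if p.severity == 4 and p.view == "Entry view"]
```

Once attributes are attached to each problem record in this way, any combination of them can be compared, which mirrors how the NVivo attributes supported cross-comparisons in the study.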

Figure 1. Coded usability problem with assigned UPT, severity rating, tasks and system views
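The hierarchical structure of self-management processes, tasks and sub-tasks created before import (Section 2), used to trace the origin of each usability problem, can be sketched as a simple tree. This is a plain-Python illustration under assumed names; the process and task labels are hypothetical, not the study's validated task list.

```python
# Minimal sketch of the hierarchical task structure used for tracing
# problem origins: process -> task -> sub-task. Names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    children: list["Node"] = field(default_factory=list)

    def find(self, name):
        """Depth-first search so a usability problem can be traced back
        to the task or sub-task where it was observed."""
        if self.name == name:
            return self
        for child in self.children:
            hit = child.find(name)
            if hit:
                return hit
        return None

hierarchy = Node("Blood glucose self-management", [
    Node("Enter a glucose reading", [
        Node("Select measurement time"),
        Node("Type the value"),
    ]),
    Node("Review glucose history"),
])

# Locate the sub-task a hypothetical problem was observed in.
print(hierarchy.find("Type the value").name)
```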

4. Discussion

Improved system usability is needed to promote and meet patients' ICT usage requirements for self-management of chronic diseases [5, 6]. Improving patients' conditions places high demands on the effectiveness of this technology [14]. The aim of this study was to investigate how QDAS might support data handling and analyses for a multi-method usability evaluation. QDAS, specifically NVivo 10, facilitated all steps of data handling and analyses for this usability evaluation study and was particularly helpful in several respects.


Large quantities of digital, qualitative data could be imported into NVivo in a simple manner. Its interface provided good data display and a clear overview of the collected data compared to conventional manual handling of these kinds of data. Linkages could be made with screen views (graphical user interfaces), tasks and various aspects of usability problems. This kind of complex association was previously noted as an advantage in the literature because it frees researchers from manual and clerical tasks, saves time, makes it possible to deal with large amounts of qualitative data, increases flexibility, and improves the validity and auditability of qualitative research [7].

We found that QDAS supported changes in all parts of the data handling and analysis process. For example, theme refinement was easily accomplished across the different data sets: new themes could be inserted for usability problems, or existing ones simply updated. An important feature was that QDAS allowed for both inductive and deductive coding. Creating and updating codes or changing themes based on new insights was easily accomplished, providing a more comprehensive picture of usability problems and their analyses.

Another major advantage was that validated tasks representing patient self-management processes in the system could be used to structure and support the analysis process. This helped identify the origin of usability problems, along with the ability to assign the identified problems to free case nodes representing different views/interfaces of the system. These associations saved considerable time and effort compared to manual coding or trying to construct associations across hybrid electronic and paper data. QDAS also allowed usability problem classification by supporting deductive coding of UPT classifications and severity ratings for each usability problem.
The digital environment in QDAS contributed transparency to the analysis process and to the overview of the data. It made it possible to trace codes backwards when needed and to surface different ideas, assumptions and analyses without risking losses in later data analyses and insights. This traceability can add trustworthiness to the content analysis process [15]. The digital environment also made it easy to share analyzed data between or among coders, as was the case in our study.

By linking various types of data, new insights were gained. An important point was that QDAS allowed powerful analyses across the original documents with their rich transcribed data and allowed new attributes to be coded based on participant and data collection method. Associations linked coded text to themes, usability problems, problem attributes, UPT classifications, tasks, severity ratings and system views. Comparisons could then be made across identified usability problems, participants, tasks and system views, allowing for new insights. The powerful analyses and visualization of results in graphical models showed complex connections and increased our depth of understanding of the existing usability problems.

Despite these benefits, limitations exist, and we experienced each of them. Other authors stress the importance of reflecting on the merits and limitations of QDAS [16]. Researchers should not be overwhelmed by the volume and breadth of data; instead, they should focus on the depth and meaning of the data and not be distracted from the real work of analysis, which can be very time-consuming. The time and energy spent learning to use a complex computer package also need to be taken into consideration. The intelligence and integrity that a researcher brings to the research process must also be applied in the choice and use of tools and analytical processes [7].
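The backward tracing of codes described above can also be sketched as a simple data model: every coded text segment keeps a link to its theme, participant and data collection method, so a theme can be traced back to its evidence. This is an assumed illustration, not NVivo's internal model, and the segments, themes and participant IDs are hypothetical.

```python
# Hedged sketch of code-to-source traceability (hypothetical data).
from dataclasses import dataclass

@dataclass(frozen=True)
class Segment:
    text: str
    theme: str         # usability problem the segment was coded to
    participant: str
    method: str        # "think-aloud", "interview" or "questionnaire"

segments = [
    Segment("I can't find where to save", "Save button hard to find",
            "P01", "think-aloud"),
    Segment("Saving was confusing", "Save button hard to find",
            "P04", "interview"),
    Segment("The graph was fine", "History graph lacks zoom",
            "P02", "questionnaire"),
]

def trace(theme):
    """Backward trace: which participants and data collection methods
    contributed evidence for a given theme (coded usability problem)."""
    return [(s.participant, s.method) for s in segments if s.theme == theme]

print(trace("Save button hard to find"))
```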


Other authors have stated that it is important to include as much detail as possible about the analysis process and to describe the features used at each step, with specific details of how the program assisted in achieving specific measures of quality [16]; this is what we attempted in our study.

5. Conclusion

We found that QDAS greatly assisted us in the coding and analysis of usability problems. It was practical and useful as it allowed us to determine, capture and synthesize a large number of multifaceted usability problems. Although the process was laborious, we recommend this approach for usability evaluations because it allows linkages across a variety of data types, provides transparency during analyses and allows new insights from large amounts of data. Linkages could be made across coded rich text data, attributes and cases, and QDAS also facilitated both inductive and deductive coding. QDAS can benefit other researchers performing usability evaluations with multi-method approaches, especially on large amounts of data. The end result of solid usability evaluations can be better designed systems with higher levels of usability and adoption for the end user.

References

[1] World Health Organization (WHO). Noncommunicable diseases. January 2015 [cited 2016 Feb 25]; Available from: http://www.who.int/mediacentre/factsheets/fs355/en/.
[2] T. Bodenheimer et al., Patient self-management of chronic disease in primary care. JAMA, 2002. 288(19): p. 2469-75.
[3] S.E. Wildevuur and L.W. Simonse, Information and communication technology-enabled person-centered care for the "big five" chronic conditions: scoping review. J Med Internet Res, 2015. 17(3): p. e77.
[4] M. Dadgar and K.D. Joshi, ICT-Enabled Self-Management of Chronic Diseases: Literature Review and Analysis Using Value-Sensitive Design, in Proceedings of the 2015 48th Hawaii International Conference on System Sciences. 2015, IEEE Computer Society. p. 3217-3226.
[5] C. Or and D. Tao, Usability study of a computer-based self-management system for older adults with chronic diseases. JMIR Res Protoc, 2012. 1(2): p. e13.
[6] P.R. Sama et al., An evaluation of mobile health application tools. JMIR Mhealth Uhealth, 2014. 2(2): p. e19.
[7] W. St John and P. Johnson, The pros and cons of data analysis software for qualitative research. J Nurs Scholarsh, 2000. 32(4): p. 393-7.
[8] M. Georgsson and N. Staggers, An evaluation of patients' experienced usability of a diabetes mHealth system using a multi-method approach. J Biomed Inform, 2015.
[9] K. Capozza et al., Going mobile with diabetes support: a randomized study of a text message-based personalized behavioral intervention for type 2 diabetes self-care. Diabetes Spectr, 2015. 28(2): p. 83-91.
[10] K.A. Ericsson and H.A. Simon, Protocol analysis: verbal reports as data. 1984, Cambridge, Mass.: MIT Press.
[11] S. Elo and H. Kyngas, The qualitative content analysis process. J Adv Nurs, 2008. 62(1): p. 107-15.
[12] S. Keenan et al., The Usability Problem Taxonomy: A Framework for Classification and Analysis. Empirical Software Engineering, 1999. 4(1): p. 71-104.
[13] D. Travis, How to prioritise usability problems. 2009 October 15 [cited 2015 August 26]; Available from: http://www.userfocus.co.uk/articles/prioritise.html.
[14] A. Carter et al., Mobile Phones in Research and Treatment: Ethical Guidelines and Future Directions. JMIR Mhealth Uhealth, 2015. 3(4): p. e95.
[15] S. Elo et al., Qualitative Content Analysis. SAGE Open, 2014. 4(1).
[16] T. Paulus et al., The discourse of QDAS: reporting practices of ATLAS.ti and NVivo users with implications for best practices. International Journal of Social Research Methodology, 2015: p. 1-13.
