Combining Methods in Evaluating Information Systems: Case Study of a Clinical Laboratory Information System
Bonnie Kaplan, Department of Computer Science and Information Systems, The American University, Washington, DC 20016
Dennis Duchon, Division of Management and Marketing, College of Business, University of Texas at San Antonio, San Antonio, TX 78285-0634
ABSTRACT
This paper reports how quantitative and qualitative methods were combined in a case study of a clinical laboratory information system. The study assessed effects of the system on work in the laboratories, as reported by laboratory technologists seven months post implementation. Primary changes caused by the computer system were increases in the amount of paperwork performed by technologists and improvements in laboratory results reporting. Individual technologists, as well as laboratories, differed in their assessments of the system according to their perceptions of how it affected their jobs. The combination of qualitative and quantitative data led to the development of a theoretical explanatory model of these differences. The paper discusses methodological implications and the need for an approach to evaluating medical computer systems that takes into account the interrelationships and processes of which the system is a part.
INTRODUCTION
Although few studies address the impact of computer systems on medical workers [30], an estimated 45% of computer-based information systems in medicine fail due to user resistance and sabotage, even though these systems are technologically sound [11]. Because computer systems change the role of individual workers and alter their work behavior [12], users' attitudes toward a computer system are likely to be influenced by how it affects their work. The importance of users' views in system acceptance and use of a new technology has been well documented. Findings such as the empirical evidence that forms the basis of Rogers's classic diffusion model [39] indicate a need for systematic exploration of how well a computer application matches users' perceptions of their needs.
Little formal study of this subject has been undertaken. Most evaluations of medical computer applications focus on the outcomes and impacts of system use--such as timeliness, completeness, error rate, retrievability, usage rates, cost, and user satisfaction [4, 41]--rather than on the processes underlying users' reactions to these systems. Few go beyond cost/benefit analysis. As with studies of information technologies in other settings, selected features of the information technology, the organization, the user, and the information needs are considered independent and objective. Their essential character is treated as unchanging over the course of the study [1, 28, 33], except for the changes an intervention may cause in some set of characteristics under study.

Most studies neglect social, cultural, and political issues [27, 28] that could affect outcomes and constructs [33]. Individual differences among users also are ignored, although even users of the same system may view it differently [23]. An alternative approach would view change as emerging from complex, indeterminate interactions between the technology, the users, and the environment [35]. There is a need for methods to study the ways in which the technology and individuals interact with each other, and also to observe and analyze contexts [22]. Rather than treating information systems or computers as having "impacts" [10], information systems can be considered as socially constructed concepts with meanings that are affected by those "impacts" and which change from person to person or over time.

Qualitative Methods
Interpreting information technology in terms of meanings and social action is becoming more popular as evidence grows that information system development and use is a social as well as technical process [20, 27, 28, 33, 34, 37]. The study of social systems involves so many uncontrolled--and unidentified--variables that methods for studying closed systems do not apply as well in natural settings as in controlled ones [8, p. 32]. The value of context-dependent research for studying social systems has been remarked upon by researchers in a variety of disciplines. Immersion in context is a hallmark of qualitative research. Qualitative methods allow researchers to elicit and observe what is significant and important to those being studied in the total situations where the behavior ordinarily occurs [6].
Researchers using these methods often attempt not to impose preexisting theoretical categories on the data [46, p. 25], but to derive them iteratively and inductively from the data until an adequately coherent interpretation is reached [6, p. 124; 17]. In this way, although qualitative methods provide less explanation of variance in statistical terms than quantitative methods, they can yield data from which process theories and richer explanations of how and why processes and outcomes occur can be developed [35]. Qualitative approaches to studies of information technology, though unusual, have not been absent from the literature [e.g., 2, 18, 20, 27, 34, 37]. The value of qualitative methods for studying computer system development and use has recently been reaffirmed [40].
There has been a move in other fields towards combining qualitative and quantitative methods to provide a richer, contextual basis for interpreting and validating results [e.g., 5, 9, 44]. Moreover, using multiple methods increases the robustness of results because findings can be strengthened through triangulation, the cross-validation achieved when conclusions from different sources of data converge and are found congruent [2, 5, 21, 46], or when an explanation is developed to account for all of the data when they diverge [43]. Case studies, which often use multiple methods, are investigations using multiple sources of evidence to study a contemporary phenomenon within its real-life context [5; 16; 46, p. 23]. They have been advocated for information systems research in order to understand the nature and complexity of the processes taking place [2, 18]. Very few medical information systems have been studied in this way.

Current Study
This paper describes how qualitative methods and quantitative methods were combined during the first phase of a longitudinal study. Detailed research results are reported elsewhere [24-26] and summarized below. Here the focus is methodological.

METHODS

Setting
This study concerned the impact of a clinical laboratory computer information system on laboratory technologists' jobs, as perceived by the laboratory technologists. It explored how these perceptions affected technologists' responses to the computer system.

Clinical laboratories represent a microcosm of hospital data handling needs [32]. Studies of clinical laboratory computer information systems show that these systems change the process of work in the laboratories. They affect the number of phone calls, the time spent on tasks directly affected by the computer system, productivity, efficiency, and laboratory staffing and organization. These systems also change the service product of laboratories. Not only is turnaround time for test results affected, but results reports improve in legibility, organization, and accuracy [7, 14, 31, 36, 38, 45]. Different laboratories can be affected differently [45], and laboratory personnel may respond to the system according to how these changes affect them differentially. Different groups within a laboratory have accepted a computer system for different reasons [15].

Research was conducted at a 650-bed midwestern metropolitan university medical center that installed a commercial laboratory computer information system for use by all nine laboratories within Laboratory Medicine.

Data Collection
Both qualitative and quantitative data were collected. There were three sources of data: interviews, observations, and a questionnaire.

Interviews and Observations
Interviews and observations were conducted immediately prior to system installation. Directors and chief supervisory personnel from each of the laboratories were interviewed during the week immediately preceding system implementation. The purpose of these intensive interviews was two-fold: (1) to determine what interviewees expected the potential effects of the computer system would be on patient care, laboratory operations, and hospital operations, and (2) to generate questionnaire items for a survey of laboratory technologists. Once the computer system began to be used, laboratory directors met at least weekly to discuss system-related problems. One of the authors attended these meetings regularly as a participant observer. Six months after the system was installed, researchers were present in the laboratories to talk with laboratory staff and observe their activities.

Survey Questionnaire
Next, a questionnaire was developed. It included scaled-response items which assessed job satisfaction variables adapted from the Job Diagnostic Survey [19] and also specific computer-related variables developed through the interviews, participant observation at meetings, and laboratory observations. Questions concerning expectations were adapted from Kjerulff et al. [29]. Finally, subjects responded to four open-ended questions to assess changes caused by the computer system and to elicit suggestions for improved system use. This questionnaire was given to all 248 members of the laboratory staff seven months post-implementation. Useable questionnaires were returned by 119 (48%) respondents; 108 (representing 44% of the total sample) had useable open-ended question responses.
FIRST ANALYSIS RESULTS
Analysis of Qualitative Data
Detailed analyses of interviews and responses to open-ended questions showed a clear similarity between the expectations discussed by interviewees and the themes in respondents' answers to open-ended questions. Both sets of informants emphasized changes in the amount of work and improvements in results reporting.
Interviewees generally agreed that, although there would be more work for laboratory technologists, their jobs would not be changed by the computer system. This assessment that technologists' work would not change was made with full knowledge that, as one interviewee said, "bench techs won't change what they do; the computer will be one more thing they have to do."

Seven months later, there was a general sense among laboratory technologists that clerical duties and paperwork had increased and that productivity had suffered. Respondents, however, credited the computer system with making laboratory test results available more quickly. They also said that reports were more complete, more accurate, easier to read, and provided a picture of "the whole patient." Nevertheless, they felt that doctors and nurses expected to get test results by calling the laboratories rather than through the computer system.

When responses were grouped by laboratory, marked differences were evident between some of the laboratories. Also, individual laboratory technologists seemed to focus either on the changes in workload and number of irritants (such as phone calls) in their work, or on the improved results reporting.
Scaled Response Questions
A factor analysis of the Job Diagnostic Survey items resulted in the extraction of four factors: Skill Variety, Task Identity, Autonomy, and Feedback. These four common factors [3, 42] were used as job characteristic variables in subsequent analyses. Factor analysis of the computer-related questionnaire items resulted in the identification of five variables: External Communications, Service Outcomes, Personal Intentions, Personal Hassles, and Increased Blame. These five computer variables reflect themes very similar to those in the interview data and responses to open-ended questions: changes in workload and changes in results reporting.
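To make this step concrete, the following is a minimal sketch, not the study's original analysis, of how common factors might be extracted from scaled-response items. The item names (jds_q1 ... comp_q10) and the synthetic data are hypothetical; in practice the loadings would be inspected to name factors such as Skill Variety or External Communications.

```python
# Hypothetical sketch: extracting common factors from scaled-response questionnaire items.
# The data and item names are invented for illustration only.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
survey_items = pd.DataFrame(
    rng.integers(1, 8, size=(119, 20)),  # 119 respondents, 7-point items
    columns=[f"jds_q{i}" for i in range(1, 11)] + [f"comp_q{i}" for i in range(1, 11)],
)

def extract_factors(items: pd.DataFrame, n_factors: int) -> pd.DataFrame:
    """Return an item-by-factor loadings table for inspection and naming."""
    z = StandardScaler().fit_transform(items)                  # standardize items
    fa = FactorAnalysis(n_components=n_factors, random_state=0).fit(z)
    return pd.DataFrame(
        fa.components_.T,
        index=items.columns,
        columns=[f"factor_{i + 1}" for i in range(n_factors)],
    )

# Four job characteristic factors and five computer-related variables, as in the study design.
jds_loadings = extract_factors(survey_items.filter(like="jds_"), n_factors=4)
computer_loadings = extract_factors(survey_items.filter(like="comp_"), n_factors=5)
print(jds_loadings.round(2))
```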
A job complexity score was created by summing an individual's scores on the separate job characteristic measures [13]. Analyses of differences in job complexity attributable to individual difference factors (age, gender), job experience (time in laboratory, time in present job, previous laboratory experience, previous laboratory computer experience), or environmental factors (shift, laboratory) indicated no differences attributable to these factors.
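A brief sketch of this step follows, using hypothetical column names (skill_variety, task_identity, autonomy, feedback, laboratory) and synthetic data. The additive index and the one-way test illustrate the kind of analysis described, not the authors' exact procedure.

```python
# Hypothetical sketch: additive job complexity index and a one-way test of group differences.
import numpy as np
import pandas as pd
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "skill_variety": rng.normal(5, 1, 119),
    "task_identity": rng.normal(5, 1, 119),
    "autonomy": rng.normal(5, 1, 119),
    "feedback": rng.normal(5, 1, 119),
    "laboratory": rng.choice(list("ABCDEFGHI"), 119),  # nine laboratories, invented labels
})

# Job complexity score [13]: the sum of the separate job characteristic measures.
df["job_complexity"] = df[["skill_variety", "task_identity", "autonomy", "feedback"]].sum(axis=1)

# One-way ANOVA: does job complexity differ across laboratories (or shift, age group, etc.)?
groups = [g["job_complexity"].to_numpy() for _, g in df.groupby("laboratory")]
stat, p = f_oneway(*groups)
print(f"F = {stat:.2f}, p = {p:.3f}")  # the study found no such differences
```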
Intercorrelations of the job characteristic and computer variables showed weak or nonexistent relationships. However, there were statistically significant differences among the laboratories for all of the computer variables. The qualitative data also had indicated marked differences among the laboratories. The next stage was to determine what might account for these differences.

SECOND ANALYSIS

Development of Model
The interview finding that technologists' jobs were not expected to change, despite the changes in what technologists would be doing, suggested examining how technologists viewed their jobs. The repeated themes of system benefits through improved results reporting and the disadvantages of increased workload led to a way to analyze these views. Because technologists seemed to focus on one of these themes over the other, possibly there was a group of respondents corresponding to each of these two themes.

A theoretical model of these two groups was developed. One group would see their jobs in terms of producing results reports, the other in terms of the bench work necessary to produce those results reports. Although it was expected that the bench work itself would not change, the computer system could affect bench work by changing the amount of paperwork and the number of phone calls, and by requiring time for data entry.

Thus, the group that would see their jobs in terms of the bench work (i.e., the process of producing laboratory results) would respond to the computer system according to how it affected those items measured by the Personal Hassles and Increased Blame variables. The more the computer system interfered with bench work, the less favorable these individuals would be toward the system.

The other group of respondents would see their jobs in terms of the outcome or product, rather than the process, of laboratory work. To them, the quality of results reporting and the service that the laboratories provide would define their work. To the extent that the computer system enhanced results reporting and service, they would be favorable towards it. Their assessment of the computer system according to its effects on service and reporting would be reflected in their survey responses to the External Communications and Service Outcomes items.

Furthermore, according to this model, although the two groups of laboratory technologists differ according to how they view their jobs, such particularistic aspects of how respondents define their jobs are not measured by the job characteristic measures. Consequently, there would be no correlation between job satisfaction and the other measures.

Analysis Based on Model
Two new variables were created, one a combination of scores on External Communications and Service Outcomes and the other a combination of scores on Personal Hassles and Increased Blame. Personal Intentions was omitted because it did not assess the interaction between specific aspects of the computer system and the job.

These two new variables were negatively correlated. This indicates that respondents tended to have high scores on one variable and low scores on the other, i.e., that there were two groups of technologists.
Next, an orientation score was computed for each respondent by subtracting the sum of that person's scores on Personal Hassles and Increased Blame from the sum of that person's scores on External Communications and Service Outcomes. The resulting orientation score indicates an overall job/computer orientation that reflects either an individual's process or product assessment of the computer system.
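The computation can be sketched as follows, again with hypothetical scale names (external_communications, service_outcomes, personal_hassles, increased_blame) and synthetic data; only the arithmetic of the composites and the orientation score is taken from the text.

```python
# Hypothetical sketch: composite variables, their correlation, and the orientation score.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
base = rng.normal(0, 1, 119)
scales = pd.DataFrame({
    "external_communications": base + rng.normal(0, 0.5, 119),
    "service_outcomes": base + rng.normal(0, 0.5, 119),
    "personal_hassles": -base + rng.normal(0, 0.5, 119),
    "increased_blame": -base + rng.normal(0, 0.5, 119),
})

# Composite variables: a product-side (reporting/service) and a process-side (bench work) score.
scales["product_side"] = scales["external_communications"] + scales["service_outcomes"]
scales["process_side"] = scales["personal_hassles"] + scales["increased_blame"]

# The study found these composites negatively correlated, suggesting two groups of respondents.
r = np.corrcoef(scales["product_side"], scales["process_side"])[0, 1]
print(f"correlation between composites: {r:.2f}")

# Orientation score: positive values indicate a product (reporting/service) view of the job,
# negative values a process (bench work) view.
scales["orientation"] = scales["product_side"] - scales["process_side"]
```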
When the orientation score was regressed on laboratories, statistically significant differences in orientation were found across laboratories. When orientation scores were split at the median and technologists assigned to either a high (i.e., product) orientation score group or a low (i.e., process) one, in each laboratory the great majority of technologists had either a product or a process orientation. Consequently, laboratories can be considered as either product or process oriented. The laboratory with the highest orientation score (i.e., the most product oriented) was strikingly favorable toward the computer system in response to open-ended questionnaire items, while the two laboratories with the lowest orientation scores (i.e., the most process oriented) were markedly hostile in these responses. This finding indicates that the orientation score produced results compatible with the qualitative results.
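A final sketch illustrates the laboratory comparison and the median split. It assumes a hypothetical data set with an orientation score and a laboratory label for each respondent; a one-way ANOVA stands in for the regression of orientation on laboratory membership, to which it is equivalent for a single categorical predictor.

```python
# Hypothetical sketch: laboratory differences in orientation and the product/process median split.
import numpy as np
import pandas as pd
from scipy.stats import f_oneway

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "orientation": rng.normal(0, 2, 119),               # in practice, the score computed above
    "laboratory": rng.choice(list("ABCDEFGHI"), 119),   # nine laboratories, invented labels
})

# Regressing the orientation score on laboratory membership is equivalent to a one-way ANOVA.
groups = [g["orientation"].to_numpy() for _, g in df.groupby("laboratory")]
stat, p = f_oneway(*groups)
print(f"F = {stat:.2f}, p = {p:.3f}")

# Median split: scores at or above the median are labeled product oriented, the rest process oriented.
df["orientation_group"] = np.where(
    df["orientation"] >= df["orientation"].median(), "product", "process"
)

# Share of product-oriented technologists per laboratory; a laboratory can then be characterized
# as predominantly product or process oriented.
lab_profile = (
    df.groupby("laboratory")["orientation_group"]
      .apply(lambda s: (s == "product").mean())
      .sort_values(ascending=False)
)
print(lab_profile.round(2))
```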
DISCUSSION
Initially, our statistical analysis, as in previous studies examining relationships between job characteristics and computer acceptance and use, indicated no correlation between these variables. However, analysis of qualitative data from open-ended questionnaire items and interviews suggested some connection between how respondents viewed their jobs, the computer system's effects on their jobs, and their responses to the computer system. Based on the understanding gained from the qualitative data, an interpretive model was developed and further statistical analysis was undertaken. This analysis supported the interpretation derived from the qualitative data. Our analysis indicates that a user's view of a computer system is related to how that system supports or interferes with the performance of a job, as defined by the person holding that job. It should be cautioned that the direction of causality between perceived impacts on the job and assessment of a computer system has not been established by this study. Perhaps what is most important is their interrelationship.

The findings also suggest the value of investigating specifically how a computer system affects users' jobs, because these effects can influence reactions towards the computer system. However, standard job characteristic measures do not take into account differences in how respondents holding ostensibly the same job define their jobs; context-specific measures seem more appropriate.

CONCLUSIONS
This study has methodological implications for evaluating medical information systems. Qualitative methods proved especially valuable. In the absence of qualitative data, it would have been difficult to account for differences among laboratories and little would have been learned from the study. Although the differences were evident both in the quantitative and the qualitative data, it was the qualitative data from which an interpretive model was derived.

Most medical systems are evaluated on such outcome measures as timeliness and accuracy of data, or changes in costs and productivity. Few use a combination of qualitative and quantitative approaches. Those that do provide rich data and provocative analyses. Additional study is needed to focus on the processes by which an information system and work are interrelated in a medical setting. Interactionist studies may improve our understanding of how to design and implement medical computer applications. By extending our ability to examine the specific effects of individual computer systems in particular contexts, through a combination of qualitative and quantitative methods, we can improve our general understanding of what affects the acceptance and use of medical computer systems.
REFERENCES
1 Bakos JY: Dependent variables for the study of firm and industry-level impacts of information technology. In Proc. Eighth International Conf. on Information Systems, pp. 10-23 (Pittsburgh, Penn., Dec. 6-9, 1987).
2 Benbasat I, Goldstein DK, and Mead M: The case research strategy in studies of information systems. MIS Quarterly 11(1987) 369-386.
3 Birnbaum PH, Farh JL, and Wong GYY: The job characteristics model in Hong Kong. J. of Applied Psychology 71(1986) 598-605.
4 Blum BI: Clinical Information Systems (New York: Springer-Verlag, 1986).
5 Bonoma TV: Case research in marketing: Opportunities, problems, and a process. J. of Marketing Res. 22(1985) 199-208.
6 Bredo E and Feinberg W: Part two: The interpretive approach to social and educational research. In E Bredo and W Feinberg (Edits.): Knowledge and Values in Social and Educational Research (Philadelphia: Temple University Press, 1982).
7 Brooks RC, Casey IJ, and Blackmon PW Jr: Evaluation of the Air Force Clinical Automation System (AFAS) at Wright-Patterson USAF Medical Center. I: HDSN-77-4 (NTIS no. AD-A043 664). II: Analysis, HDSN-77-5 (NTIS no. AD-A043 665). (Arlington, Va.: Analytic Services, 1977).
8 Cook TD and Campbell DT: Quasi-Experimentation: Design and Analysis Issues for Field Settings (Boston: Houghton Mifflin, 1979).
9 Cook TD and Reichardt CS (Edits.): Qualitative and Quantitative Methods in Evaluation Research (Beverly Hills: Sage Publications, 1979).
10 Danziger JN: Social science and the social impacts of computer technology. Social Science Quarterly 66(1985) 3-21.
11 Dowling AF Jr: Do hospital staff interfere with computer system implementation? Health Care Management Rev. 5(1980) 23-32.
12 Dowling AF Jr: Medically oriented computer-based information systems. Medical Care 20(1982) 253-254.
13 Ferris GR and Gilmore DC: A methodological note on job complexity indexes. J. of Applied Psychology 70(1985) 225-227.
14 Flagle CD: Operations research with hospital computer systems. In ME Collen (Edit.): Hospital Computer Systems, pp. 418-430 (New York: John Wiley and Sons, 1974).
15 Fleck A et al.: Experience with the introduction and routine operation of a computer-based reporting system in a clinical biochemistry laboratory. International J. of Bio-Medical Computing 5(1974) 189-202.
16 George AL and McKeown TJ: Case studies and theories of organizational decision making. In Advances in Information Processing in Organizations 2, pp. 21-58 (Greenwich, Conn.: JAI Press, 1985).
17 Glaser BG and Strauss AL: The Discovery of Grounded Theory: Strategies for Qualitative Research (New York: Aldine, 1967).
18 Goldstein D et al.: Use of qualitative methods in MIS research. In Proc. Seventh International Conf. on Information Systems, pp. 338-339 (San Diego, Cal., Dec. 15-17, 1986).
19 Hackman JR and Oldham GR: Motivation through the design of work: Test of a theory. Organizational Behavior and Human Performance 16(1976) 250-279.
20 Hirschheim R, Klein H, and Newman M: A social action perspective of information system development. In Proc. Eighth International Conf. on Information Systems, pp. 45-57 (Pittsburgh, Penn., Dec. 6-9, 1987).
21 Jick TD: Mixing qualitative and quantitative methods: Triangulation in action. In J Van Maanen (Edit.): Qualitative Methodology, pp. 135-148 (Beverly Hills: Sage Publications, 1983).
22 Johnston J: Editor's notes. In J Johnston (Edit.): Evaluating New Information Technologies, New Directions for Program Evaluation, no. 23, pp. 1-3 (San Francisco: Jossey-Bass, 1984).
23 Kaplan B: The computer as Rorschach: Implications for management and user acceptance. In Proc. Seventh Ann. Symp. Computer Applications in Medical Care, pp. 664-667 (Silver Spring, Md.: IEEE Computer Society Press, 1983).
24 Kaplan B: Impact of a clinical laboratory computer system: Users' perceptions. In R Salamon, BI Blum, and M Jørgensen (Edits.): Medinfo 86: Fifth World Congress on Medical Informatics, pp. 1057-1061 (Amsterdam: North-Holland, 1986).
25 Kaplan B: Initial impact of a clinical laboratory computer system: Themes common to expectations and actualities. J. of Medical Systems 11(1987) 137-147.
26 Kaplan B and Duchon D: A job orientation model of impact on work seven months post implementation. In Medinfo 89: Sixth World Congress on Medical Informatics (in press) (Amsterdam: North-Holland, 1989).
27 Kling R: Social analyses of computing: Theoretical perspectives in recent empirical research. Computing Surveys 12(1980) 61-110.
28 Kling R and Scacchi W: The web of computing: Computer technology as social organization. In MC Yovits (Edit.): Advances in Computers 21, pp. 2-90 (New York: Academic Press, 1982).
29 Kjerulff KH et al.: Predicting employee adaptation to the implementation of a medical information system. In Proc. Sixth Ann. Symp. Computer Applications in Medical Care, pp. 392-397 (Silver Spring, Md.: IEEE Computer Society, 1982).
30 Krobock JR: A taxonomy: Hospital information systems evaluation methodologies. J. of Medical Systems 8(1984) 419-429.
31 Lewis JW: Clinical laboratory information systems. Proc. of the IEEE 67(1979) 1229-1300.
32 Lincoln TL and Korpman RA: Computers, health care, and medical information science. Science 210 (October 17, 1980) 257-263.
33 Lyytinen K: Different perspectives on information systems: Problems and solutions. ACM Computing Surveys 19(1987) 5-46.
34 Markus ML: Power, politics, and MIS implementation. Communications of the ACM 26(1983) 430-444.
35 Markus ML and Robey D: Information technology and organizational change: Causal structure in theory and research. Management Science 34(1988) 583-598.
36 Morris FJ: Impact of computerization on laboratory staffing. J. of Medical Systems 10(1986) 355-359.
37 Mumford E, Hirschheim R, Fitzgerald G, and Wood-Harper T (Edits.): Research Methods in Information Systems (Amsterdam: North-Holland, 1985).
38 Nicol R and Smith P: A survey of the state of the art of clinical biochemistry laboratory computerization. International J. of Bio-Medical Computing 18(1986) 135-144.
39 Rogers EM and Shoemaker FF: Communication of Innovations: A Cross-Cultural Approach (New York: The Free Press, 1971).
40 Shneiderman B and Carroll JM: Ecological studies of professional programmers: An overview. Communications of the ACM (1988) 1256-1258.
41 Simborg DW and Whiting-O'Keefe QE: Evaluation methodology for ambulatory care information systems. Medical Care 20(1982) 255-265.
42 Stone EF: "The Moderating Effect of Work Related Values on the Core Job-Scope Satisfaction Relationship," unpublished doctoral dissertation, University of California, Irvine, 1974.
43 Trend MG: On the reconciliation of qualitative and quantitative analyses: A case study. In TD Cook and CS Reichardt (Edits.): Qualitative and Quantitative Methods in Evaluation Research, pp. 68-86 (Beverly Hills: Sage Publications, 1979).
44 Van Maanen J (Edit.): Qualitative Methodology, pp. 9-18 (Beverly Hills: Sage Publications, 1983).
45 Wolfe HB: Cost-benefit of laboratory computer systems. J. of Medical Systems 10(1986) 1-9.
46 Yin RK: Case Study Research: Design and Methods (Beverly Hills: Sage Publications, 1984).