Methods in Search of Methodology--Combining HCI and Object Orientation

Susan E. McDaniel, Gary M. Olson, Judith S. Olson
Cognitive Science and Machine Intelligence Laboratory
The University of Michigan
701 Tappan Street
Ann Arbor, MI 48109-1234
mcdaniel@csmil.umich.edu
+1-313-764-6715
KEYWORDS
Object-oriented methods, human computer interaction, user-centered design, business process redesign

ABSTRACT
Software design and user interface design and analysis methods are each insufficient for ensuring good software development. We propose a combination of object-oriented analysis and design, human computer interaction, and process redesign forged into one methodology. We describe the use of these methods in a project case study and conclude with a synopsis of how the methods worked and lessons we learned.

INTRODUCTION
The field of Human Computer Interaction (HCI) offers a variety of methods to help assure that software is user-centered, easy to learn, and easy to use [1]. Although there have been a number of success cases in the use of these methods [12], their adoption into software design in practice is minimal [2]. This may be due in part to the fact that these methods do not cohere into an adoptable methodology for software development but rather have to be attached to some existing software design method.

Software design methods themselves, however, are moving in the direction of recognizing the value of understanding the user. Object-Oriented Methods (OOM) in particular advocate beginning with use cases or scenarios of use extracted from the users themselves and iterative development peppered with user testing [4, 5, 7, 8, 13]. OOM appeared to be ripe for attaching HCI methods in order to create a more complete user-centered design methodology.

An opportunity for testing this combined methodology presented itself to us in 1992. We joined a team that was given the task of designing a Collaborator for space physicists, as described in more detail below. This opportunity had two very promising features to it: they had already decided to use OOM because of its modularity and rapid prototyping capabilities, and one of the Principal Investigators is an expert in HCI. The management team had advocated a strong user-centered design in bidding for this project, and they won it partly on that promise. Thus, from the start, this was seen as a project that would be driven by the users' needs, involve users, and incorporate appropriate HCI methods in various stages throughout the design and development.

In brief, we mixed the methods of OOM and HCI. Since neither field is very strong in the stage at which the functionality of the software is determined, we added one additional method. We borrowed function allocation from Business Process Redesign (BPR) [6, 14]. Thus, we incorporated the following methods:
● collection and distillation of use cases from OOM
● function allocation and priority setting from BPR
● modular software design from OOM
● interface analysis from HCI
● rapid prototyping from OOM
● user testing from HCI
● iterative design and analysis from OOM

In this paper, we first describe the project background, followed by a description of the methods we used, and comments on the value we found for each method and its pitfalls. We conclude by distilling these findings and highlighting the recommendations for practice.

THE PROJECT
The software design project described herein is called the Upper Atmospheric Research Collaborator (UARC). The goal of this project is to develop software to create a collaborative laboratory for scientists doing research on the E and F regions of the upper atmosphere. The community of scientists is geographically dispersed, but all have instruments located in Kangerlussuaq (formerly Sondrestromfjord), Greenland. The scientists are located in Michigan, Maryland, at two sites in California, and in Denmark. The software will provide remote, real-time access to data from 5 instruments, remote control of the instruments, a collaborative workspace for discussing data, papers, and experiments, the ability to discuss data in real-time from
each site, the ability to annotate the data with text, voice and graphics, and the ability to replay data from previous time periods. The project is being developed using object-oriented analysis and design methods and is being programmed using an object-oriented language, Objective-C. The development and deployment environment is NeXTSTEP 3.0 and NeXTSTEP 486. The choice of the NeXTSTEP environment was made because of its excellent development toolkit, attention to user interface issues, and its object-oriented paradigm.

THE METHODS
Collection and Distillation of Use Cases
Jacobson [7] describes use cases as a specific way of uncovering part of the functionality of a system. Our working definition of a use case is a description in the user's language of the process of their work. When designing software for individual use, this entails only a description of each individual's work; when designing group software, the communications between members of the group must be taken into account in addition to each individual's role. There are two core goals of collecting use cases: getting a very detailed account of the steps in the process and finding the objects in the user's domain. We thought the second goal would be easily met if the use cases were detailed enough, so we focused on getting the necessary detail.

Collecting use cases from potential users adds to the designers' understanding of the users' language and also begins the documentation phase of the system requirements. Recording the use cases in the users' language rather than in the language of the designers helps the resulting system to reflect the users' domain. It is an all too common occurrence to have a user request certain functionality and for the designers to 'hear', and design, something other than what the user asked for. Additionally, the use cases provide a basis for determining the major classes and objects needed to implement the proposed systems in an object-oriented fashion [13].

To gather the use cases, we met with both individuals and groups of users. The design team took copious notes during this process to try to capture as much detail as possible. Figure 1 shows an excerpt of the use case collected from a group of users about a joint experiment called the Rodeo Campaign. "Campaign" is a term used to denote experiments where the scientists are looking for similar phenomena over a period of several days using one or more instruments.

The design team asked the users to talk about their work, to describe a particular scenario, the actors in that scenario (people, machines, organizations, etc.), the objective of the scenario, and anything else the users felt was relevant. In addition to ascertaining how the work is currently being done, we asked the users to speculate on how the new technology might change their work, how the technology can improve their lives, and how they would like to see the technology integrated into their work.

radar schedule
10 day period, 6 hours on 6 different nights
each evening look at weather
needs to be clear
call weather bureau at airport
satellite ground station
check ground-based instruments
magnetometer, riometer
look for activity
if likely to be clear, and have arcs, turn on radar
wait one hour
set up antenna files
browse mode
3 parallel scans
radar doing elevation scans (browse mode)
watch all-sky camera display
when sun-aligned arcs occur
determine orientation from all-sky screen
do off-set to 3 parallel scan settings (not automated at present)
3 parallel scans
each scan about a minute
do until arcs go away or need to change orientation
monitoring all-sky camera
arcs last minutes to 10s of minutes
display radar scans on all-sky
all-sky display shows a line where the radar antenna is
when arc goes away go back into browse mode
watch all-sky
six hours of this kind of iterative searching and watching
doubt if spend too much time with other instruments
Fabry-Perot
change mode so points in same direction as radar scans

Figure 1. An excerpt from the use case on the Rodeo Campaign.

We found that initially the scientists were talking only at a very high level about their work and the processes they use. This did not provide a level of detail sufficient for our purposes. The scientists seemed to be reluctant to talk about specifics in their domain, as if they assumed we would not understand them. They also seemed surprised that we were so interested in their work. In order to get the detail needed, the use case recorder continually asked for more detail and clarification. We also found that working with groups was conducive to this process because the scientists would jump in to add something to what another scientist was saying. Use cases contain a huge amount of needed information, so, because we had to leaf repeatedly through the material to find specifics about a particular process, we reorganized them. The first step was to reorder the outline so that related parts of the use cases were adjacent. At the same time we eliminated redundancies, clarified misunderstandings, and identified where more information was needed. Through this distillation process we became ever more familiar with the domain of the users.

After condensing the use cases, we found it very useful to convert them from a textual outline into a timeline or flowchart. This helped to delineate the stages in a particular experiment or campaign and clarified the process of each experiment. Robertson [13] collects scenarios from users but uses a different method to move from the scenarios to the object-oriented design stage. His "systematic question asking method" and reformulation of the scenarios into propositions is an interesting contrast to the methods we have used here.
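To make the distillation step concrete, here is a minimal sketch, ours and not code from the project (which was written in Objective-C), of how a condensed use case might be held as an ordered set of steps and emitted as flowchart edges; the step names are abridged from the Figure 1 excerpt and the structure is only an assumption for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    """One step in a distilled use case (abridged from the Rodeo Campaign excerpt)."""
    name: str
    successors: list = field(default_factory=list)  # names of steps that may follow

# A hand-distilled fragment of the use case in Figure 1 (illustrative only).
steps = [
    Step("check weather", ["check ground-based instruments"]),
    Step("check ground-based instruments", ["turn on radar", "check weather"]),
    Step("turn on radar", ["set up antenna files"]),
    Step("set up antenna files", ["watch all-sky camera display"]),
    Step("watch all-sky camera display", ["do 3 parallel scans", "return to browse mode"]),
    Step("do 3 parallel scans", ["watch all-sky camera display"]),
    Step("return to browse mode", ["watch all-sky camera display"]),
]

def edges(step_list):
    """List the arrows a flowchart of the use case would contain."""
    for step in step_list:
        for nxt in step.successors:
            yield f"{step.name} -> {nxt}"

if __name__ == "__main__":
    for edge in edges(steps):
        print(edge)
```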
Figure 2. Flowchart of part of a use case from the Rodeo Campaign.

Once the information is in flowchart form, it is important to return to the users to make sure that the flowcharts correctly represent the process. This ensures that any errors will be caught before they are incorporated into the system. Figure 2 shows an example flowchart of part of the use case shown in Figure 1. We recommend including the entire task in the flowcharts, from the conception and planning stage through to the final stage of the work. Note that the flowcharts are a representation of the users' work, not a representation of the proposed system.

The process of building the flowcharts was also very instructive and helped to determine which events came before other events, which events were common to many scenarios, and what the objects of the users' domain were. But the use cases and flowcharts do not identify which areas are most likely to benefit from the incorporation of technology. We began looking at other ways to analyze these data. In particular we wanted to know what parts of the various processes (across users) were similar (for interface consistency) and also which portions of the processes were most amenable to automation or support via computer technology.

Although the process of collecting and analyzing the use cases was lengthy and tedious, it was extremely useful. Much of the information gathered was not directly relevant to the software design but provided important background information about the scientific process and the context in which the system would reside.

Function Allocation
The process we used next has not, to our knowledge, been used for software design outside of the business world. We used, and recommend using, a method called Office Analysis, a variant of BPR [14]. It is used to analyze existing processes to determine where bottlenecks occur, which parts of a process can be supported by information technology, and which parts are candidates for full automation. This method provides tools and a vocabulary which can be used to define the objects, actions, and processes of the task at hand. Although Sasso, Olson, & Merten [14] advertise and use the method for office analysis, we found that it could be used for any task for which computer support or automation is being considered.

The Sasso, et al. method [14] suggests six steps in the redesign process. The first four steps parallel the collection and analysis of use cases, building the flowcharts, and returning to the users for clarification and comments. The analysis phase (step 5) gives the analysts the following two tasks:
● Reorganize the processes to rid them of redundancies and inefficiencies.
● Analyze the remaining tasks to determine those best performed by human or computer.

The first of these is also a part of the analysis of use cases, so we will focus on the second. This step is accomplished by identifying all of the actors in a process and by labeling portions of the task using a particular vocabulary. The vocabulary indicates to the analysts which of the tasks in the process are better performed by humans and which are better performed by computer. The vocabulary for coding the tasks is included here in Figure 3. The elements of the vocabulary are listed in order from the most easily automatable to the least automatable. The items in the middle of the list are activities that cannot be automated, but can be supported by providing information, archives, or helpful calculations.

Transport information: The movement of information from one physical location to the next. The location of the information changes, but the form does not.
Transform information: Change information storage medium. The form of the information changes, but the location does not.
Sort information (algorithmically): Sort information according to prespecified, stable, explicit rules.
Sort information (judgmental): Sort information according to multiple, complexly related dimensions.
Retrieve information: Retrieve information from several sources and merge aspects of each into a new record. Several inputs merge to a single output.
Analyze: Look for patterns in retrieved information.
Negotiate: Persuade, teach, learn. This typically involves judgement and interpersonal or interactive communication.
Create information: Organize, synthesize, add new information. There are no explicit inputs or processing rules.

Figure 3. Process redesign vocabulary.
This method and its vocabulary are used to build diagrams of the information flow and the various actors. These diagrams are similar to flowcharts, except that they identify the actors in the process and identify steps in the process using the vocabulary. The actors in a process are all of the people and all of the major objects; the steps are those identified in the use cases and flowcharts, but here they are associated with a particular actor. The diagrams are built by listing the actors across the top of a piece of paper; the activities and events are then entered under the associated actor and represented by boxes in the diagram, each labeled with one of the terms from the vocabulary. The arrows are added to show the direction of information flow between actors and tasks. In general, time moves in the downward direction in the diagram. Figure 4 shows an example diagram illustrating part of the use case shown in Figure 1.

[Figure 4: actor columns include Weather Reports, Scientists, Site Crew, Radar, Camera, Magnetometer, Riometer, Spectrometer, and Fabry-Perot.]
Figure 4. Process redesign diagram from Rodeo Campaign use case.
Once this diagram is built it is very easy to see where automation is most likely to be effective, where the process must remain largely human-centered, and where computer support can be used. In essence, as can be seen in Figure 4, the vocabulary prescribes automation and support: information technology can be used to automate each of the boxes labeled "transport" and "transform." The topmost "analyze" box can be supported, for instance by the automatic gathering of the needed information by an on-line resource manager, so that the information will always be at the scientists' disposal.
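As an illustration of the allocation step, the sketch below tags each step of a process fragment with one of the Figure 3 vocabulary terms and partitions the steps into candidates for automation, computer support, or human work. The grouping of terms and the tags on the example steps are our own reading of the vocabulary, not the assignments the design team actually made, and the code is an illustrative Python sketch rather than anything from the project.

```python
# Vocabulary terms from Figure 3. The grouping below is our reading of the text:
# "transport" and "transform" are automatable, items in the middle of the list can
# be supported with information or calculations, and the rest stays with people.
AUTOMATE = {"transport", "transform", "sort (algorithmic)"}
SUPPORT = {"retrieve", "analyze", "sort (judgmental)"}
HUMAN = {"negotiate", "create"}

# A fragment of the Rodeo Campaign process, each step tagged with a vocabulary term.
# The tags are illustrative, not the ones the design team actually assigned.
process = [
    ("collect weather reports", "transport"),
    ("convert instrument data to display form", "transform"),
    ("look for sun-aligned arcs in the all-sky display", "analyze"),
    ("discuss radar strategy with the other sites", "negotiate"),
    ("write up the night's observations", "create"),
]

def allocate(term):
    """Suggest an allocation for one vocabulary term."""
    if term in AUTOMATE:
        return "automate"
    if term in SUPPORT:
        return "support with technology"
    if term in HUMAN:
        return "leave with the human"
    raise ValueError(f"unknown vocabulary term: {term}")

for step, term in process:
    print(f"{step:50s} [{term}] -> {allocate(term)}")
```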
In addition to using the diagrams to locate possible automation and support by technology, we used the diagrams to develop a list of possible software modules and a list of possible system objects for OOM. Some of the actors in the diagrams as well as items which appear in the boxes are candidates for objects. Objects which appear in Figure 4 include the instruments at the top of the diagram: radar, all-sky camera, magnetometer, and riometer. The data from each of these instruments is also a possible object, as are such domain items as arcs and scans.

Modular Software Design
At this point in the system design process we have used various methods to ensure that our understanding of and familiarity with the users' work is very high. Now the objective is to identify how software and hardware can be used to augment the work. To this end, we generated a list of possible software modules using the process redesign diagrams as a basis. We wanted the modules to cover all aspects of the process which had been identified as either supportable or automatable, and to cover each of the instruments and major experiments that the scientists use. Although some of the proposed software modules had been foreseen, they were now further justified by the design process. There were also several modules which were newly identified by this process.

After listing the possible software modules for the project, we ranked them with regard to the effect they would have on the process, how widely they would be used, the ease of implementation, and how critical they were in terms of meeting the goals of the collaborator. A very brief description of the functionality of the proposed module and open issues for implementation were included in the ranking. We used this ranking to determine the order in which the modules should be implemented and began to implement the highest ranked item. This was software to allow the scientists to view the data from the radar in real-time remotely. The software also provided a communication tool, a message window into which users could type messages, so that the scientists could 'talk' to one another during the experiment.
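One simple way to operationalize such a ranking is a weighted score over the criteria named above. The sketch below is purely illustrative: the module names are drawn from the system description, but the weights and ratings are invented for the example and are not the project's actual ranking.

```python
# Criteria named in the text: effect on the process, breadth of use, ease of
# implementation, and how critical the module is to the goals of the collaborator.
# Weights and 1-5 ratings below are invented for illustration only.
WEIGHTS = {"effect": 0.35, "breadth": 0.25, "ease": 0.15, "critical": 0.25}

candidate_modules = {
    "real-time remote radar display": {"effect": 5, "breadth": 5, "ease": 3, "critical": 5},
    "message window (communication tool)": {"effect": 4, "breadth": 5, "ease": 4, "critical": 4},
    "instrument remote control": {"effect": 5, "breadth": 3, "ease": 2, "critical": 4},
    "data annotation (text/voice/graphics)": {"effect": 3, "breadth": 3, "ease": 2, "critical": 3},
}

def score(ratings):
    """Weighted sum of the criterion ratings for one candidate module."""
    return sum(WEIGHTS[criterion] * ratings[criterion] for criterion in WEIGHTS)

# Highest-scoring modules are implemented first.
ranking = sorted(candidate_modules.items(), key=lambda item: score(item[1]), reverse=True)
for name, ratings in ranking:
    print(f"{score(ratings):.2f}  {name}")
```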
Interface Analysis
The developers took the list of modules and built a system encompassing the highest ranked items without further consulting the HCI people. However, the NeXTSTEP toolkit provides many guidelines for UI design, resulting in a system which was quite usable. Time pressures set in at this point and we were able to do very little interface analysis on this software prior to its use in an experimental campaign. In the quick analysis we did, we looked for obvious problems, ones which could be quickly fixed and would not require any major reworking of the software. Problems which did not fit these criteria were noted so they could be dealt with after the campaign. This meant that the interface was not ideal, the software was not robust, and there were some major bugs in the data display code which were not identified until the first or second day of the campaign.

During the campaign, in addition to the scientists, many of the designers and the programmers were also logged in and therefore available during the experiment. This led to a large amount of discussion about the software during the campaign via the communication tool in the software. If any users had questions about how features of the software worked, they could send messages to the other users and generally get help on how to do what they wanted. Many of the users also used the messages to make suggestions about the software and the interface as well as for socializing and talking about the science.

This initial version of the software, despite the interface flaws and reliability problems, was extremely well received by the scientists during actual use. For the first time, scientists who were not physically in Greenland were able to view the data from afar in real-time, discuss strategies for control of the radar, and discuss the atmospheric activity shown by the radar data. Given that this was an historic event in the lives of these scientists, it is not surprising that the software was so well received despite the flaws.

All messages sent and every action users initiated with the software were logged for later analysis. This included every button clicked, every window move, resize, open, or close, and every other action the user could take with the interface. The message logs were analyzed and coded according to the content of the message. This allowed us to get a baseline for how much of the discussion was about the software (technology confusion) and the science. In addition, we used the technology confusion category of messages as a sort of thinking-aloud protocol, in that these were unsolicited comments and confusions about the software. These messages were used as one guide in deciding what should be modified in the software prior to the next campaign. This process is a further example of UCD in that we used suggestions and feedback from the users to guide interface and functionality changes to the software.
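Logging of this kind amounts to appending a timestamped record for every user action and message and then coding the messages by content. The sketch below shows the general shape of such a logger and a crude keyword-based coder; the file name, categories, and keywords are hypothetical, the code is ours (not the project's Objective-C implementation), and the project's actual coding scheme was certainly richer.

```python
import json
import time

LOG_PATH = "uarc_actions.log"  # hypothetical log file name

def log_event(kind, **details):
    """Append one timestamped user event (button press, window move, message, ...)."""
    record = {"t": time.time(), "kind": kind, **details}
    with open(LOG_PATH, "a") as log_file:
        log_file.write(json.dumps(record) + "\n")

def code_message(text):
    """Crude keyword-based content coding of a logged message (illustrative only)."""
    lowered = text.lower()
    if any(word in lowered for word in ("button", "window", "menu", "how do i")):
        return "technology confusion"
    if any(word in lowered for word in ("arc", "scan", "radar", "riometer")):
        return "science"
    return "social/other"

# Example use:
log_event("button", name="browse mode")
log_event("message", sender="scientist A", text="How do I resize the radar window?")
print(code_message("How do I resize the radar window?"))   # technology confusion
print(code_message("Nice arc on the radar at 02:10 UT."))  # science
```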
After this initial campaign, the software was modified, and the interface changed in some dramatic ways, primarily at the request of the users, but also as a result of extensive user interface analysis. The user interface analyses we performed were:
● generalized transition networks (GTN) [9]
● object-action analysis [10]
● GOMS analysis (on parts of the interface) [3][9][11]
● observation of use (videotapes of users using the software were made at one lab)
● analysis of the message and action logs
● discussions with the users

Building GTNs of an interface requires the analyst to use every feature of the software to determine what happens in response to every possible user action. This process pinpointed places where the interface was problematic in its response, where nothing happened in response to a user action, and where users needed additional information about the state of the system.
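A transition-network analysis treats the interface as a set of states connected by user actions, and walking every action from every state is what exposes the gaps. The toy sketch below checks a hand-built transition table for actions that draw no response; the states and actions are invented for the example, and this is only a stand-in for the full GTN notation of [9], not the analysis the team performed.

```python
# A toy state-action table for a hypothetical fragment of the interface.
# None marks an action to which the interface gives no visible response.
ACTIONS = ("open data window", "send message", "close window")
transitions = {
    "main": {"open data window": "data view", "send message": "main", "close window": None},
    "data view": {"open data window": "data view", "send message": None, "close window": "main"},
}

def missing_responses(table, actions):
    """Report state/action pairs where nothing happens or no transition is defined."""
    for state, moves in table.items():
        for action in actions:
            if moves.get(action) is None:
                yield state, action

for state, action in missing_responses(transitions, ACTIONS):
    print(f"in state '{state}', action '{action}' gives the user no response")
```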
We used object-action analysis to find consistency problems in the human interface of the software. For example, similar objects in the interface required different courses of action to accomplish a similar goal. GOMS analysis is a very fine-grained analysis which shows how well the interface and software match the goals of the user. We expect to use it later for analysis of new features. The message logs were useful in finding areas of the interface which were unclear to users, obtrusive in use, or incorrect in function. The observations (both video and live) during actual use of the software proved invaluable in terms of finding improvements and enhancements as well as identifying unworkable parts of the program. In one case an observer saw a user go through the same sequence of events every time he wanted to send a message. Several steps in the sequence were unnecessary and one step which was made necessary by his actions was always left out. The user expressed frustration with the interface and never figured out that there was a better way to do what he was doing, showing us that there was a problem with the interface. The user action logs showed that some users used certain features extensively while others never used them.

We used all of this information to create a laundry list of modifications, improvements, and reworkings of the software for the programmers to use as a guide. In addition, several possible interfaces for new or reworked features were built by the interface designer and given to the programmers to use.

Rapid Prototyping
The ability to rapidly change the software, which was one reason for selecting an object-oriented development paradigm, also led to some interesting problems. During the campaigns the programmers would take user suggestions and implement them along with bug fixes overnight and distribute new versions of the software to the users prior to the beginning of the experiment the next day. This behavior resulted in many new features being introduced during the course of the campaign, but also contributed to the software development process becoming akin to a runaway freight train. Once the programmer was in this mode of making changes in response to user suggestions, it was very difficult to stop and proceed in an integrated way. Even after the campaigns were over the programmer continued to respond to user suggestions in preference to design team suggestions.

User Testing
User testing has consisted of recording users' actions, and videotaping and observing users using the system. The actual use of the software during the two campaigns has provided us with very real data on the usability of the software. We have analyzed the data collected so far and will analyze data from future use of the software. We are using these analyses to guide changes in the software as well as to determine usage patterns by different classes of users and for different experiments. In addition, we plan to do more formal user testing of various features of the software in the coming months.

Iterative Design and Analysis
Modifications to the software were made in response to three sources: suggestions from users, designer suggestions (the laundry list above), and programmer ideas. The modifications made in response to user suggestions were often implemented by the programmer without consulting the design team. In addition, the programmer is and was much more likely to take suggestions from one user, co-located with the developer, over all other users. The result was that another user, of equal status, repeatedly made a suggestion without getting any acknowledgment from the programmer (who was on-line at the time) and without ever seeing the suggestion implemented. The software began to reflect one particular user's wants and desires to the exclusion of the other users.

The interface designer's suggestions were implemented, although often not the way they were suggested, leading to peculiar results. The laundry list of modifications to be made prior to the next campaign was only partially completed, while the programmers added other features which were not on the list.

For example, during one of the early days in the campaign, a suggestion was made to include a way for users to see who else was logged in and viewing the data, writing and reading messages, and the like. The suggestion was made for a 'who button' which would provide the above functionality. This function was implemented by the following morning of the campaign and showed up as a menu item labeled "Finger." For anyone who has used UNIX systems, this command name makes sense, but in our user population few people knew the UNIX meaning of finger. This is a case of an interface designer making a suggestion, and a programmer hearing the suggestion in his own language, rather than in the language of the users.

CONCLUSIONS AND LESSONS LEARNED
The combination of methods we have used for this project has resulted in very high quality software. It has been very well received by the community of scientists for whom it was designed. On the other hand, the development process has not been without glitches. We summarize the steps we used and their parent methodologies in Table 1 below and then summarize the positives and negatives of the process we used.

Table 1. Steps in the development process and the source method.
The Stages of Function Allocation and Priority Setting Can Be Helped by BPR
The problem of how to determine what features and what functionality a piece of software should have, and also how to match functionality to an existing process, is answered by doing BPR. This methodology provides an easy-to-use vocabulary for determining which parts of a process are conducive to automation and support from technology. This was a very useful process for clarifying the needed functionality and determining where the technology could best be applied.

Users Not Necessarily Good Designers
Involving users in the design and evaluation phases of software development is very important, but must be balanced with reasoned analysis. Users and designers suggest new functions and interfaces one by one, but overall