Usability in Practice: User Experience Lifecycle — Evolution and Revolution

Stephanie Rosenbaum (moderator), Tec-Ed, Inc., +1-734-995-1010, [email protected]
Chauncey E. Wilson (moderator), Bentley College, +1-781-891-2608, [email protected]
Timo Jokela, University of Oulu, +358-8-553-1971, [email protected]
Janice A. Rohn, Siebel Systems, +1-650-477-5852, [email protected]
Trixi B. Smith, Lansing Community College, +1-517-483-1650, [email protected]
Karel Vredenburg, IBM, +1-905-413-2330, [email protected]
ABSTRACT
The practice of usability and user-centered design must integrate with many other activities in the product development lifecycle. This integration requires political savvy, knowledge of a wide variety of methods, flexibility in using methods, inspiration, and innovation. The speakers and their colleagues have met these requirements and describe their experience fitting various methods into design and development efforts. This forum highlights their successes and setbacks.

Keywords
Usability, user experience, user-centered design, current state analysis, competitive analysis

INTRODUCTION
This forum discusses case studies showing how user-centered design methods are integrated into a variety of environments ranging from a college library to two of the world's largest corporations. This forum complements the three Usability in Practice sessions that focus on specific methods. Speakers will share their experience melding individual methods into the design and development process. The case studies described in this forum also highlight flexibility and innovation as key attributes of success. Here are short summaries of each presentation.

Janice Rohn describes how Siebel's centralized user experience (UE) team is responsible for the overall design of products rather than piecemeal design and evaluation. Establishing the UE team as the focal point for design requires strong support from senior management, plus team members who possess both design and evaluation skills. The Siebel UE team meets the challenges of large-scale design by tracking all UI bugs, creating templates and a style guide, actively working with customers, and conducting baseline studies to verify progress.
Timo Jokela and his colleagues want to understand the challenges that occur when user-centered design (UCD) is introduced into a development process. Their starting point is to use assessment of UCD processes as the basis for planning improvement actions. Their experiments led to innovative assessment workshops in which the essentials of UCD are communicated to the design staff and improvement actions are planned as teamwork. A simplified UCD process model is a key asset in the workshops.

Trixi Smith describes how she initiated an effort to redesign a college library Web site. Her effort involved a wide variety of methods: the creation of a collaborative action plan, analysis of competitive library Web sites, storyboards, prototypes, and—perhaps most importantly—constant usability testing of the UCD methods themselves. Usability testing our own UCD methods is a key issue that is sometimes neglected in the rush to get products out the door.

Karel Vredenburg has spent years designing and implementing UCD processes at IBM. Key themes in the integration of UCD into the IBM development process are the education of all new employees through formal training, intranet materials, and case-based training for the managers who will implement UCD practices. As with the UE team at Siebel, a key ingredient for success has been the establishment of core metrics that provide baseline and comparative data on the success of UCD practices.

The key themes that emerge from our four speakers are:
• Assessment and the collection of metrics about both process and product are critical for establishing the worth of UCD activities and planning usability improvements.
• Group involvement is an essential attribute of a successful UCD effort. Individual effort is necessary but not sufficient for ensuring that UCD activities fit well into the overall development process. Communicating the basics of UCD to all design staff is a must.
• Textbook methods like traditional usability testing often have to be adapted to different contexts. UCD practitioners need to be flexible and willing to experiment (and to experience both success and failure).

The rest of this paper describes the speakers' experience fitting UCD activities into the development process.

THE USER EXPERIENCE LIFECYCLE AT SIEBEL
Janice A. Rohn
At Siebel we have implemented a true user experience lifecycle, both with respect to design processes and user research. This includes the organizational structure and composition of the User Experience (UE) group, the activities we perform, and our areas of responsibility.

Organizational Structure of the UE Group
The Siebel UE group is part of Product Marketing, which is responsible for specification of all products. The UE Senior Director reports to the Senior VP of Product Marketing and meets regularly with the President and the CEO of Siebel; these relationships strengthen the position of User Experience within the company. Siebel has a single, centralized UE group responsible for the entire product line. Some companies divide UE groups into separate subgroups by job roles, separating user researchers from product designers; but we believe such separation limits professional development and hinders communication, which leads to poorer designs.

Within the Siebel UE department, there is no division of groups by skill set, since most people have a mixture of skills. We also conduct cross-training: for example, designers are trained in user research, and user research professionals who haven't previously written style guides can learn how. There are still people who specialize in field studies, lab studies, or interaction design. However, the exposure and cross-training result in happier UE professionals (because they are learning and have a variety of activities), greater flexibility in resource balancing, and ultimately better products, because each individual's skill set and knowledge is richer.

Process for Product Design
The UE group is truly responsible for the product design; our interface standards are followed across the company. All teams, including Product Marketing, Engineering, Quality Assurance, Technical Publications, and Corporate Marketing, follow the guidelines set by the UE group. User interface defects are reviewed, approved, and prioritized by UE; other groups can't change these priorities. As the Siebel product line moves to the web, the UE group both creates and owns the HTML templates for the products. Thus we guide compliance not only with user-interface style guides but also with the web templates. We instituted a review and approval process, and we hold weekly meetings across the functional areas to ensure that proposed designs are technically feasible and meet the requirements of the various groups. Changes to the web templates are made by the UE group, so UE must agree to any variations in the product design.
We also created a browser-based style guide for better decision capture, approval tracking, feedback, and communication. Unlike the previous Word documents, this style guide contains both the "Ideal" design, which doesn't change, and the "Current" design, which changes as technical challenges are solved to bring the Current design closer to the Ideal. Additionally, since many people are involved in product development, the style guide includes a decision tracking and approval system. The various people in Product Marketing and UE review and approve the product design before it goes to Engineering; Engineering then signs off when they agree to build the product as specified.

This style guide is evolving to become a design tool in which Product Managers can build screen designs (constrained by a layout editor that only permits designs specified by the UE style guide) and view the screens in multiple languages to ensure that the localized designs work. Ultimately the tool will provide hooks to Engineering so there are no translation errors from the design spec to the product: the design in the tool becomes the product interface.
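To make the structure of such a style guide entry concrete, here is a minimal sketch of the kind of record it might hold: an Ideal design, a Current design, open technical issues, and an approval trail. The class and field names (StyleGuideEntry, Approval, and so on) are illustrative assumptions; the paper does not describe Siebel's actual data model.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Approval:
    """One sign-off in the review chain (e.g., Product Marketing, UE, Engineering)."""
    group: str
    approver: str
    approved_on: date

@dataclass
class StyleGuideEntry:
    """A single guideline with its Ideal and Current designs and decision history."""
    topic: str                      # e.g., "Record navigation toolbar"
    ideal_design: str               # the target design; does not change
    current_design: str             # what ships today; converges toward the Ideal
    open_technical_issues: list[str] = field(default_factory=list)
    approvals: list[Approval] = field(default_factory=list)

    def fully_approved(self, required_groups: set[str]) -> bool:
        """True once every required group has signed off."""
        return required_groups <= {a.group for a in self.approvals}

# Hypothetical usage
entry = StyleGuideEntry(
    topic="Record navigation toolbar",
    ideal_design="Single toolbar with breadcrumb navigation",
    current_design="Separate back/forward buttons per view",
    open_technical_issues=["Browser back-button handling"],
)
entry.approvals.append(Approval("Product Marketing", "J. Doe", date(2001, 11, 5)))
print(entry.fully_approved({"Product Marketing", "UE", "Engineering"}))  # False
```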
User Research Methods and Activities
Often UE groups have organizational barriers to user research. Other departments restrict access to customers, concerned that UE people will either threaten their roles (such as Product Marketing professionals wanting sole access to customers) or have a negative impact (such as Sales professionals concerned that their sale will be put at risk). Siebel takes a different approach: customer information is available across the company, using our own Customer Relationship Management software, and the UE group has full access to every customer. As a result, we have been able to perform field studies with a variety of customers. This data is critical in order to understand the real customer profiles, task flows and frequencies, and product feature and design requirements. Currently, we are performing four customer site visits per product type and producing both individual site visit reports (to understand the individual needs of the different companies) and a field study summary report for the product type.

The data from the field studies drive the designs, which are then evaluated in the usability lab with customers and prospects. Although Siebel's products are configured for each customer, the goals of the UE group include designing the products to require minimal configuration and no training. Data from the user research studies, both in the field and in the lab, are introduced into the products via marketing requirements documents, the defect tracking system, and release plans. We've also performed a series of baseline benchmark studies, including objective measures such as successful task completion, time to complete task, major and minor errors, and others. Now we can perform benchmark studies on subsequent releases and compare the results.
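As a rough illustration of how baseline and follow-up benchmark data can be compared across releases, the sketch below aggregates the objective measures named above (task completion, time on task, major and minor errors). The data values and the summarize helper are hypothetical; Siebel's actual analysis tooling is not described in this paper.

```python
from statistics import mean

# Hypothetical per-participant results for one benchmark task in two releases.
baseline = [
    {"completed": True,  "seconds": 210, "major_errors": 1, "minor_errors": 3},
    {"completed": False, "seconds": 300, "major_errors": 2, "minor_errors": 4},
    {"completed": True,  "seconds": 185, "major_errors": 0, "minor_errors": 2},
]
release_2 = [
    {"completed": True,  "seconds": 150, "major_errors": 0, "minor_errors": 1},
    {"completed": True,  "seconds": 170, "major_errors": 1, "minor_errors": 2},
    {"completed": True,  "seconds": 140, "major_errors": 0, "minor_errors": 1},
]

def summarize(sessions):
    """Aggregate the objective measures collected in a benchmark study."""
    return {
        "completion_rate": mean(1.0 if s["completed"] else 0.0 for s in sessions),
        "mean_time_s": mean(s["seconds"] for s in sessions),
        "major_errors": sum(s["major_errors"] for s in sessions),
        "minor_errors": sum(s["minor_errors"] for s in sessions),
    }

before, after = summarize(baseline), summarize(release_2)
for metric in before:
    print(f"{metric}: {before[metric]:.1f} -> {after[metric]:.1f}")
```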
Looking Ahead
To avoid the trap of creating only short-term designs and "band-aids" for each release, we initiated the "Next Generation" design program, which is not tied to a particular release. For the "Next Generation" products, user research data—combined with data from Product Marketing, Sales, and other sources—drives future product requirements and design. We have generated low-fidelity prototypes (using PowerPoint and other methods) and high-fidelity prototypes in XML, HTML, and other technologies. The UE group is working with Engineering to map which "Next Generation" features can be included in which product release; this approach helps us work toward a shared design vision for the future.

IMPROVING USER-CENTERED DESIGN AT BUSCOM, NOKIA, AND TEAMWARE IN FINLAND
Timo Jokela, Mikko Jämsä, and Netta Iivari
This research focuses on how to improve the role and impact of UCD in development projects—or how to introduce UCD into development organizations. The basic hypothesis of our method, developed during six iterative experiments in 2000 and 2001, is that the first step in improving the role of UCD is to carry out a current state analysis (“assessment”), to understand current strengths and weaknesses before proposing improvements. We did our research at Buscom, Nokia, and Teamware in Finland. We began with an approach widely used in the software community: to start improvement efforts through process assessment. Based on the resulting capability profile, the developers should plan and implement the improvement actions. In our first two experiments, we used “SPICE” [3] style process assessment, with an ISO 13407 [2] based UCD process model (“Human-Centred Lifecycle Process Descriptions” [4]) and the IDEAL guide for software process improvement [7] as references.
Results of Traditional Methodology
Despite the value of the models, the assessment process did not have the desired results. The software developers found the models and the results difficult to understand, and the recommended usability programs were perceived as delaying the development schedule. On the positive side, a process assessment is carried out by interviewing many individuals, and the interviewees found these sessions educational and useful.

In the third experiment, we simplified the UCD process model, examining the current state through the outcomes of processes rather than through activities. This new UCD process model made more sense to the developers, and the assessment team found their job easier. However, here and in the first experiments, the majority of the developers—those who were not actually interviewed—did not learn the benefits of improving usability.

How the Method Evolved
The next major change was in the style of implementing the assessment. To reach all the development staff, we started to carry out the assessments as workshops rather than through individual interviews. The workshops reinforced our realization that assessments were a highly effective training tool for UCD. After the workshops, the participants challenged themselves at an individual level: "Now I know that we should assign more resources to the usability requirements process."

The other major change in the method was to identify the implementer of the improvement actions before the assessment. (The "normal" way is to carry out an assessment and only then start to plan where to concretely improve, and by whom.) The potential implementer is typically a future product development project team whose manager voluntarily expresses an interest in usability issues. When we identified this implementer prior to the assessment workshop, a subproject on usability was set up in a planning workshop soon after the assessment.

We expect the method to be further refined as we carry out further experiments. We especially want to learn how to tailor our approach to the different cultures that companies may have, so we have added a cultural anthropologist to our research team.

Summary of the Method
The main steps of the method [5] are:
• Select an appropriate "pilot" project. The main criterion in selecting the project is the interest and commitment of the project manager or the decision-maker for the development activities. Give a presentation of the method to the manager and other stakeholders in the potential project. If they say "yes," only then go on.
• Set up a management team to make decisions related to the UCD work; the project manager should chair the team. Also set up an implementation team to carry out the UCD activities. Plan the management activities, including ensuring that most project staff will attend the workshops.
• Conduct Workshop 1: an assessment session in which the development process of a previous project is contrasted against our UCD process model. Unlike in the early experiments, the key driver of the assessment is not to produce an exact score of the status of UCD but to train the development staff. Mapping past reality against a UCD model seems to make for a very efficient and effective training session on the essentials of UCD.
• A week or two later, conduct Workshop 2: an action planning session in which the UCD actions of the project are planned as a team. The outcome of Workshop 2 is a rough plan of what UCD activities to do in the project and—more importantly—a plan for the management activities that will organize the work. The UCD activities should be integrated into the overall project plan.
• Implement the UCD action plan under the control and decisions of the management team.
USER-CENTERED DESIGN FOR LIBRARY WEB IMPROVEMENT
Trixi B. Smith
The world of Internet technology is quickly changing in ways that offer new opportunities as well as challenges, and library websites are fast becoming the portals through which patrons interact with library services. As libraries help change the world by offering increasingly effective and user-friendly websites, we must also continue to keep pace with this change in patrons' concepts of the library by providing ever more flexible, patron-centered, and innovative library portals. With this concept in mind, our library Web Improvement Team is involved in a one-year website redesign process incorporating team-based user-centered design and full accessibility coding. Since we have a highly diverse population using our library website, a wide variety of services to provide through our website, and a limited budget, we have been required to be very creative and innovative in our choice of usability methodologies [6, 8, 9, 10, 11].

User-Centered Design Methods
Some of the methods we are using include: application of a flexible action plan; team drafting of website objectives, mission, and index page features; split-team creation of homepage prototype layouts; competitor website usability testing; 5-minute onsite usability testing; trial-run usability testing; link category testing; online form surveys; multiple prototype usability testing; usability testing with visually impaired and physically challenged patrons as well as patrons who speak English as a second language; and online accessibility tools such as the Bobby validator and the Betsie text-only converter.

Collaborative Action Plan
The first step in redesigning our library website was to develop an action plan to be reviewed and approved by the LCC Library Web Improvement Team. This overall plan was adapted from Jessica Burdman's ideas [1]. Since this is the first time that LCC Library has done a major revision of its website, the plan was drafted and adopted with the understanding that it must be flexible, so that we could alter our timeline as we all learned "on the job." Our first action plan included the creation of a mission statement, target audience description, and scope of the project, which together formed a comprehensive website definition. Also listed in the first plan was the plotting of an information architecture, which included the storyboard (website structure), page schematics, a detailed timeline, and task assignments. Finally, the plan detailed the website design process, including creative design, technical design, content planning, template, standards, prototype, and full website construction.

Competitor Website Tests
While the team was working on creating the new website mission statement, we were also running a series of "Competitor Website Tests." This technique was adapted from traditional usability testing, but instead of testing a prototype of our new website during the design process, we tested the current version of our website along with four other local college library websites. We assigned six library patrons ten tasks often done on a library website. These Competitor Website Tests were very illuminating, helping us determine the typical confusions patrons experience in using library websites, as well as showing us the strengths and weaknesses of a group of library websites—including our own.

Split-Team Homepage Layout Drafting
The Competitor Website Tests and Web Improvement Team brainstorming gave us a good indication of what features we should probably include on our redesigned library homepage. Once we had a list of features, we voted on first-, second-, and third-priority items. Then we used another novel technique: we split our web improvement team into two groups; gave them markers, pencils, pens, scrap paper, a whiteboard, overhead pages, and lots of erasers; and sent them into two different rooms to draft their vision of a new library homepage layout, using the prioritized list of features. This technique was not only fun, but gave the team a clear idea of just how challenging it is to fit everything deemed important onto a website homepage. It also gave all team members an opportunity to contribute their skills and experience.

Testing the Tests
After our graphic designer performed her magic, we had two homepage mock-ups to test on library patrons to determine whether we were going in the right direction. But before we performed mock homepage testing with actual patrons, we drafted a usability test questionnaire detailing typical library tasks for the test participants to perform. To "test the test," we first had web improvement team members pair up and run the test on each other during one of our biweekly team meetings. This resulted in many suggestions for clarifying the wording. Once the usability test questionnaire was revised, we ran a trial-run usability test—complete with video camera and clipboard—on several library staff members. This produced another set of useful comments and a further revision of the testing questionnaire and process.

Mock Homepage Usability Testing
Finally we were ready to do our first official round of mock homepage usability testing, with a section aimed at determining the best terminology to use for difficult categories on our website. We called it "link category testing," and it included part A and part B questions such as: (A) What would you call a main link that leads to an area of our website providing information for students who are visually impaired or physically challenged? and (B) What would you expect to find under a link called "Assistive Technology"? The results of the first round of mock homepage usability testing have inspired a major revision in both the layout and the terminology of the first draft of our new library homepage.
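For illustration only, the short sketch below shows one way the free-text answers from link category testing could be tallied to surface the most common label suggestions. The sample answers and the top_labels helper are hypothetical, not the team's actual analysis procedure.

```python
from collections import Counter

def top_labels(responses, n=3):
    """Normalize free-text label suggestions and return the n most frequent."""
    normalized = [r.strip().lower() for r in responses if r.strip()]
    return Counter(normalized).most_common(n)

# Hypothetical answers to: "What would you call a main link leading to
# information for students who are visually impaired or physically challenged?"
answers = [
    "Assistive Technology", "Disability Services", "assistive technology",
    "Accessibility", "Services for Students with Disabilities", "Accessibility",
]
print(top_labels(answers))
# e.g., [('assistive technology', 2), ('accessibility', 2), ('disability services', 1)]
```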
Future Plans
Future plans through June 2002 include development of a prototype website, more usability testing, full website construction, and yet more usability testing, as well as further research and testing on the Betsie text-only converter script and W3C accessibility standards.
INTEGRATED USER-CENTERED DESIGN: ORGANIZATIONAL TRANSFORMATION AND METHODOLOGICAL OPTIMIZATION
Karel Vredenburg
This section of the paper outlines the key success factors in the introduction, deployment, and optimization of the IBM version of User-Centered Design. It describes what worked and what didn't in acquiring requisite skills, developing and delivering education and training, formulating processes and methods for the company and integrating them into our business process, cross-company practitioner communication, development of new corporate ease-of-use positions, benchmark studies with other companies, collaboration with key universities on industry studies, and optimization of methods and methodologies so that UCD keeps improving at crafting compelling user experiences for our customers.

UCD dramatically improved the ease of use of IBM software, hardware, and services [12]. For example, the workstation database product, DB2 Universal Database, used UCD starting with the 5.0 release. The results of our own studies, business results, and trade press reviews all substantiate the improvements made in ease of use. PC Week referred to "a vastly easier client setup procedure, integrated replication and a fresh new interface." InfoWorld pointed out that "Latest DB2 exceeds competition. Administrative functions are well-integrated into the easy-to-use Control Center interface." Information Week wrote that "Installation, on both the server and client, is mind-numbingly easy…Universal Database is breathtaking for its enormous leap into ease of use." In the area of hardware, UCD was first used for our notebook computers, starting with the ThinkPad 770 and 600 models. Gartner Report wrote, "If winning in the notebook game is the result of attention to details, the 770 has it in spades, especially when it comes to usability." The business case derived from these types of results further drives the broad implementation of UCD at IBM.

Origins
A human factors organization was first established at IBM in the mid-1950s, and various usability and human factors methods have been used over the years. IBM’s integrated version of UCD was developed in the early 1990s and continues to evolve [13]. It has incorporated ideas from IBM project teams via the company’s UCD Advisory Council, and from industry peers via the CHI, UPA, and HFES conferences, and standards organizations such as ISO, ANSI, and NIST.
Introduction Strategies
Making the transition from traditional human factors and usability approaches to full-scale UCD involved a major cultural transformation for IBM and a paradigm shift for its practitioners. Several steps were taken to ensure that the key elements of this transition were carried out successfully. These key elements included identifying core principles, carrying out education, and integrating UCD into the company's business and development process.
We gave particular focus to introducing company employees to the new approach. This was especially important given that all members of a development organization were now responsible for the total user experience of the product in ways they had never been before. To address this new responsibility, an overview presentation, including video, was created and delivered to all employees via an internal television broadcast and individual development site visits. This was augmented by overview and practitioner information made available to all employees via our intranet. Multimedia classes were also developed at both the introductory and advanced levels to teach UCD. A case-based executive workshop was also developed for management teams—executives through project leaders—to provide them with the skills and knowledge required to manage UCD projects. Finally, and perhaps most importantly, UCD principles, methods, and metrics were integrated into the company's business and development process. Our experience to date has shown that while all of the above have had an appreciable effect on introducing and deploying UCD, the most significant contributions came from the articulation of core principles, a set of highly efficient methods, and the integration of UCD into the company's business and development process.

A common excuse for not integrating UCD into company engineering processes is that there are no agreed-to elements that can be measured and thus managed. We disagreed and developed a core set of UCD metrics that summarize the key elements important to the management of UCD at the project, division, and corporate level. A metrics dashboard ensures that organizations are using the same information for all products. It includes project information such as the individual responsible for the design of the total user experience, the target user audience, the prime competitor, and user problem fix targets. Included as well are specific ease-of-use objectives for the release, targeted customer satisfaction, and their status. Enablement information (e.g., schedule and budget) is reported, as is the list of total user problems identified and the fix status of each. A running monthly summary of the top five open user problems and their status is also included. Finally, a summary of the number of hours of UCD studies completed is provided in the categories of understanding users, evaluating designs, and hands-on testing.
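A minimal sketch of what one row of such a metrics dashboard might capture is shown below, using the fields named in this section (design lead, target audience, prime competitor, ease-of-use objectives, user problems and fix status, and hours of UCD studies by category). The structure and names are illustrative assumptions rather than IBM's actual dashboard schema.

```python
from dataclasses import dataclass, field

@dataclass
class UserProblem:
    """A user problem identified through UCD work, tracked to closure."""
    description: str
    severity: int          # e.g., 1 = highest impact
    fixed: bool = False

@dataclass
class UCDDashboardEntry:
    """One product release's UCD metrics, rolled up monthly (hypothetical schema)."""
    product: str
    design_lead: str                     # responsible for the total user experience
    target_audience: str
    prime_competitor: str
    ease_of_use_objectives: list[str] = field(default_factory=list)
    target_satisfaction: float = 0.0     # targeted customer satisfaction score
    problems: list[UserProblem] = field(default_factory=list)
    study_hours: dict[str, float] = field(default_factory=dict)  # e.g., "evaluating designs": 40

    def fix_rate(self) -> float:
        """Share of identified user problems that have been fixed."""
        return sum(p.fixed for p in self.problems) / len(self.problems) if self.problems else 1.0

    def top_open_problems(self, n: int = 5) -> list[UserProblem]:
        """The n most severe open problems, for the monthly summary."""
        return sorted((p for p in self.problems if not p.fixed), key=lambda p: p.severity)[:n]
```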
Organization
UCD at IBM is carried out at the product team level by UCD teams. The specialized disciplines of visual design, industrial design, user experience design, and information design constitute the core members of the team, along with marketing, product development, and support specialists.

Teams
In addition to forming parts of UCD teams, the specialized UCD disciplines also report to discipline organizations, and are managed using a matrix approach. Performance plans for team members typically include a discipline contribution as well as a contribution to the product.
Three corporate positions and accompanying organizations have also recently been formed to ensure the attainment of IBM's strategic objectives regarding ease of use. These include a Vice President of Ease of Use, responsible for overall strategy; a Director of Ease of Use Integration, responsible for implementing UCD across the management team; and a Corporate User-Centered Design Architect and Team Leader, responsible for IBM's UCD methods, processes, tools, and staff.

Enhancements
Through work with practitioners and executives, as well as benchmarking with other companies, integrated user-centered design continues to be enhanced. Major recent enhancements include the specification of multidisciplinary role activities, resources, and work products by development phase; the further detailed assessment of core metrics such as user satisfaction throughout the development process; and the integration of the best story-, scenario-, and model-based methods into a new user engineering methodology.

SUMMARY
What are the lessons learned from our forum?
• Putting usability and UCD methods into practice requires both top-down and bottom-up support. Senior management support is good, but support from individual developers and mid-level management is also necessary. Senior VPs may support UCD, but development managers and directors are often the ones who have the power to implement decisions.
• Be innovative in your use of UCD methods, but practice iterative design and evaluation on those methods to ensure that your innovations really work and provide a justifiable return on investment (ROI).
• Involve entire development teams and their managers in practices like UCD workshops so there is a forum for asking questions and discussing different world views about design and development.
• Think of "force multipliers" like training for all new employees on UCD methods, a central design team that channels all UI issues into one group, UI templates and style guides, and continual involvement with customers.
• Usability professionals need to understand that those who design the product—who may not always be usability professionals—will ultimately determine its usability. Communicating the essentials of UCD to the whole product team is a must.
• Work to develop metrics to assess both products and processes. All our speakers highlighted the importance of metrics, even ones as simple as the time spent on various UCD activities. They noted different metrics that were useful in comparing product versions, understanding the status of UCD in an organization, determining the success of UCD methods, and measuring team performance.

REFERENCES
1. Burdman, J. Collaborative Web Development: Strategies and Best Practices for Web Teams. Addison-Wesley, Reading, MA, 1999.
2. ISO/IEC 13407: Human-Centred Design Processes for Interactive Systems. ISO/IEC 13407:1999 (E), 1999.
3. ISO/IEC 15504-2: Software Process Assessment - Part 2: A Reference Model for Processes and Process Capability. ISO/IEC TR 15504-2:1998 (E), 1998.
4. ISO/IEC 18529: Human-Centred Lifecycle Process Descriptions. ISO/IEC TR 18529:2000 (E), 2000.
5. Jokela, T. Assessment of User-Centred Design Processes as a Basis for Improvement Action: An Experimental Study in Industrial Settings. Oulu University Press, Oulu, 2001. Available at http://herkules.oulu.fi/isbn9514265513/.
6. Krug, S. Don't Make Me Think: A Common Sense Approach to Web Usability. Circle.com Library, Indianapolis, IN, 2000.
7. McFeeley, B. IDEAL: A User's Guide for Software Process Improvement. CMU/SEI-96-HB-001, Software Engineering Institute, Pittsburgh, 1996.
8. McMullen, S. Usability Testing and Library Web Site Redesign at Roger Williams University. Available at http://gamma.rwu.edu/users/smcmullen/usable.html.
9. Nielsen, J. Designing Web Usability. New Riders Publishing, Indianapolis, IN, 2000.
10. Pearrow, M. Web Site Usability Handbook. Charles River Media, Rockland, MA, 2000.
11. Prown, S. Detecting 'Broke': Usability Testing of Library Web Sites. Available at http://www.library.yale.edu/~prowns/nebic/nebictalk.html.
12. Vredenburg, K. Increasing Ease of Use: Emphasizing Organizational Transformation, Process Integration, and Method Optimization. Commun. ACM 42 (1999), 67-71.
13. Vredenburg, K., Isensee, S., and Righi, C. User-Centered Design: An Integrated Approach. Prentice Hall, 2001.