Tampereen teknillinen yliopisto. Julkaisu 1339 Tampere University of Technology. Publication 1339
Kati Kuusinen
Integrating UX Work in Agile Enterprise Software Development
Thesis for the degree of Doctor of Science in Technology to be presented with due permission for public examination and criticism in Tietotalo Building, Auditorium TB104, at Tampere University of Technology, on the 6th of November 2015, at 12 noon.
Tampereen teknillinen yliopisto - Tampere University of Technology Tampere 2015
ISBN 978-952-15-3605-2 ISSN 1459-2045
Abstract

Agile development methodologies have become the norm in software development. Simultaneously, user experience (UX) has become an increasingly important factor in the success or failure of software systems. However, agile methodologies do not give guidance on how to conduct UX work. Companies encounter challenges in UX work despite previous academic research on integrating UX work with agile development practices. This doctoral thesis investigates how to integrate UX work as part of agile development of enterprise software, where other factors such as business benefits often overrule end-user needs. The thesis studies the tasks and goals of which agile UX work consists, the challenges companies encounter while integrating UX work with agile development, and the actions that support the integration. This thesis consists of nine publications and an introductory part. The research has been conducted in seven Finland-based international large and medium-sized companies over the years 2011 to 2014. The research approach of the thesis is inductive, aiming to explain and structure the studied phenomena based on empirical findings. The research consists of three rounds of studies utilizing both qualitative and quantitative methods. The main data gathering methods include surveys and interviews. Most of the study participants were members of agile project teams, but end users and other roles significant to research and development activities in the companies also participated. The main contributions of this thesis address the research questions of challenging issues in the integration of agile development and UX, tasks and goals related to UX, and activities that support the integration. We present a framework called BoB (Best of Both Worlds) for integrating UX work in agile enterprise software development.
While most of the previous models for agile UX concentrate on integrating the role of a UX specialist responsible for the UX work, BoB approaches the integration via UX tasks and shared ownership by a cross-functional team including software developers and UX specialists. The results support the conclusion that agile UX work should be a collaborative effort with several contributing roles. Unlike previous models for agile UX work, BoB structures UX work via UX tasks and partially merges upfront design work with development iterations.
Preface

I started my thesis work in 2011 in the Cloud Software program because I had enjoyed my work as a research assistant and thought that I would like to continue in research. I thought the doctoral thesis would appear as a by-product of the fun work. (Little did I know back then...) After working as a software developer for several years for just one organization, I enjoyed the opportunity to work with multiple companies simultaneously, to absorb different ways of working, and to learn both about the companies and about myself. I was enthusiastic to conduct studies, learn research methods, become more objective, create new knowledge, and help companies solve their challenges. Research was truly my calling. Then, after the first two years, I was thrown into darkness, and the next couple of years were pure struggle. The process was far from what I had expected. Nevertheless, being persistent, I am finally there and able to enjoy work and life again. I thank my supervisors Prof. Kaisa Väänänen and Prof. Tommi Mikkonen for guiding me through the struggle, reflecting on my thoughts, and letting me conduct my work independently in my own way. I thank Prof. Jan Gulliksen for agreeing to act as the opponent at the public defense. Asst. Prof. Åsa Cajander and SL Peggy Gregory reviewed my thesis, and I am grateful for all your constructive and thoughtful comments and feedback that helped me improve the thesis. I thank TEKES and DIGILE for funding my work. I am truly grateful to all the company contact persons, who remain anonymous for confidentiality reasons. I have become friends with a couple of you during the process. You have been amazing in arranging study participants and committing to the long research process. I want to thank all the study participants around the world: all interviewees, survey participants, workshop participants, and those of you who participated in my webinars. Without you, the thesis would not exist and the new knowledge would not have been born.
It has also been amazing to see how the input you provided has truly led to organizational change in the companies. I thank my then research assistant Santtu Pakarinen for sharing the enthusiasm and fun with me. I am grateful to Dr. Heli Väätäjä for those numerous discussions, for being a superb co-author, and for advising me with the thesis; you are truly brilliant. Asst. Prof. Timo Partala introduced me to quantitative research methods; thank you for the hours around SPSS. I want to thank Jarmo Palviainen, my longest-time roommate ever, for all the old-married-couple squabbles we have had over the years at the office. Dr. Thomas Olsson, thank you for being such a nice guy. I am grateful to Laura Hokkanen for sharing the last exhausting miles of the thesis marathon with me. I hope the thesis will be only a sprint for you, as it seems it will. I do not know how to express my gratitude to Marie-Elise Kontro, who became a dear friend to me during the process and who was my strongest support when I had hard times. I want to thank all my current and previous colleagues, including Aino Ahtinen, Mari Ahvenainen, Jani Heikkinen, Jarno Ojala, Jani Sundström, Jari Varsaluoma, Tanja Walsh, and many others, for all the laughs and tears over the years. I also want to thank my high-school chemistry teacher Marja-Leena Pylvänäinen for your kindness and for believing in me. Now I guess I am the queen of sciences, as you wrote on my scholarship after I asked not to receive one because it would be too embarrassing for a tough gal like me. Finally, I want to thank my loved ones. Thank you, Eero, for being my Bunny. You are amazing. Thank you for bearing everything beside me, and for your love and support. Now you finally are with The Doctor. Warm thanks to my Mum for all the hard work and for being a friend to me during the past years. Thanks to my brother Henry for being you. Still, the kindest words I have ever heard have come from your rude mouth. Mummo, I owe everything to you, Granny. I do not even dare to think what my life would have been if you had not been there for me. Mummo, you have always encouraged me and been proud of me whatever I did. One single person like that can change the whole world for another person. I have everything in life when I have the desire to be alive and the ability of gratitude. Kindness will save the world.
Tampere, September 8th, 2015
Kati Kuusinen
Supervisors:
Professor Kaisa Väänänen Professor Tommi Mikkonen Department of Pervasive Computing Tampere University of Technology
Pre-examiners:
Assistant Professor Åsa Cajander Department of Information Technology Uppsala University Senior Lecturer Peggy (Amanda Jane) Gregory School of Computing, Engineering and Physical Sciences University of Central Lancashire
Opponent:
Professor, Dean of School Jan Gulliksen School of Computer Science and Communication Royal Institute of Technology (KTH)
Contents

Abstract ............................................................................................................. i
Preface ............................................................................................................. ii
Contents ........................................................................................................... v
List of Included Publications ......................................................................... viii
1. Introduction ................................................................................................. 1
   1.1 Fields of Research ................................................................................. 2
   1.2 Research Goals, Research Questions, and Publications ....................... 3
   1.3 Research Scope ..................................................................................... 5
   1.4 Research Methods and Context ............................................................ 5
   1.5 Results and Contribution ...................................................................... 6
   1.6 Structure of the Thesis .......................................................................... 6
2. Background and Focal Concepts ................................................................. 7
   2.1 Software and its Engineering ................................................................ 7
   2.2 Plan-Driven Development .................................................................... 8
   2.3 Agile Software Development ................................................................ 9
       2.3.1 Scrum ........................................................................................... 11
       2.3.2 Lean Software Development ....................................................... 12
       2.3.3 Kanban Development .................................................................. 12
       2.3.4 Extreme Programming ................................................................ 13
       2.3.5 Continuous Software Engineering .............................................. 14
   2.4 Approaches for HCI Development ..................................................... 14
       2.4.1 Human-Centered Design ............................................................. 15
       2.4.2 Other Approaches to HCI Development ..................................... 16
   2.5 Fundamentals of Development Processes .......................................... 18
   2.6 Enterprise Software ............................................................................ 19
   2.7 User Experience .................................................................................. 20
   2.8 Concepts Related to Agile UX Work ................................................. 21
   2.9 Dimensions of Integration of Agile and UX Development ................ 23
       2.9.1 Dimensions of Project Performance ........................................... 24
       2.9.2 Development Work Performance and Success ........................... 25
3. Related Research ....................................................................................... 26
   3.1 Overview of Secondary Studies and Theses on Agile UX Work ....... 26
   3.2 Approaches for Process Integration of UX and Agile Software Development ... 27
   3.3 Agile UX Work Practices .................................................................... 31
   3.4 People and Social Factors in Agile UX Work .................................... 32
   3.5 Supporting Technological Factors ...................................................... 34
   3.6 Gap in Research .................................................................................. 35
4. Research Approach, Methods, and Process ............................................... 36
   4.1 Research Approach ............................................................................. 36
   4.2 Studied Companies and Research Participants ................................... 39
   4.3 Research Methods ............................................................................... 42
   4.4 Research Ethics ................................................................................... 43
   4.5 Research Process and Schedule .......................................................... 44
   4.6 Answering the Research Questions .................................................... 45
   4.7 Research Validity ................................................................................ 46
5. Results ....................................................................................................... 50
   5.1 Summary of Contributions per Publication ........................................ 50
   5.2 Challenging Issues in Agile UX Work ............................................... 53
   5.3 Tasks and Goals of Agile UX Work ................................................... 56
   5.4 Challenging Tasks and Goals of Agile UX Work .............................. 60
   5.5 Supporting Factors of Agile UX Work ............................................... 61
       5.5.1 People Factors ............................................................................. 62
       5.5.2 Technological Factors ................................................................. 64
       5.5.3 Task Allocation between Contributing Roles ............................. 64
       5.5.4 Process Factors ........................................................................... 66
   5.6 Framework for Organizing Agile UX Work in Enterprise Software Development ... 67
       5.6.1 Inputs and Outputs of BoB .......................................................... 69
       5.6.2 Collaborative Activities ............................................................... 71
       5.6.3 Within-iteration Process for Agile UX Work ............................. 73
       5.6.4 Example Project .......................................................................... 78
6. Discussion and Conclusions ..................................................................... 80
   6.1 Overview of the Research Subject ...................................................... 80
   6.2 Revisiting the Research Questions ..................................................... 80
   6.3 Contributions of the Thesis ................................................................. 83
   6.4 Revisiting the Research Methodology ................................................ 84
   6.5 Future Work ........................................................................................ 85
   6.6 Conclusions ......................................................................................... 86
References ..................................................................................................... 88
Appendices .................................................................................................... 98
Original Publications ................................................................................... 167
List of Included Publications

The thesis consists of a summary and the following original Publications:

P1. Kuusinen, K., Mikkonen, T., Pakarinen, S. Agile user experience development in a large software organization: Good expertise but limited impact. Proc. Human-Centered Software Engineering (HCSE’12), LNCS 7623, Springer Berlin Heidelberg (2012), pp. 94-111.

P2. Kuusinen, K. and Väänänen-Vainio-Mattila, K. How to make agile UX work more efficient: Management and sales perspectives. Proc. 7th Nordic Conference on Human-Computer Interaction: Making Sense through Design (NordiCHI '12), ACM (2012), pp. 139-148.

P3. Kuusinen, K. Improving UX work in Scrum development: A three-year follow-up study in a company. Proc. Human-Centered Software Engineering (HCSE’14), LNCS 8742, Springer Berlin Heidelberg (2014), pp. 259-266.

P4. Kuusinen, K. Overcoming challenges in agile user experience work: Cross-case analysis of two large software organizations. Proc. 41st Euromicro Conference Series on Software Engineering and Advanced Applications (SEAA’15), IEEE Computer Society (2015), DOI 10.1109/SEAA.2015.38.

P5. Kuusinen, K. The impact of user experience work on cloud software development. Communications of Cloud Software, 2 (1), (2013).

P6. Kuusinen, K., Mikkonen, T. Designing user experience for mobile apps: Long-term product owner perspective. Proc. 20th Asia-Pacific Software Engineering Conference (APSEC'13), IEEE Computer Society (2013), pp. 535-540.

P7. Kuusinen, K. Task allocation between UX specialists and developers in agile software development projects. Proc. Interact’15, LNCS 9298, Springer Berlin Heidelberg (2015), pp. 27-44.

P8. Kuusinen, K., Mikkonen, T. On designing UX for mobile enterprise apps. Proc. 40th Euromicro Conference Series on Software Engineering and Advanced Applications (SEAA’14), IEEE Computer Society (2014), pp. 221-228.

P9. Kuusinen, K., Väätäjä, H., Mikkonen, T., Väänänen, K. Towards understanding how agile teams predict user experience. Accepted to Integrating User Centred Design in Agile Development, HCI book series, Springer.
The Publications are reproduced by permission of the publishers. In all the Publications, the candidate has been the lead author. Detailed descriptions of contributions per Publication are as follows: In Publication P1, the candidate planned, conducted, and analyzed the study together with the third author, who assisted in the work. The candidate wrote the majority of the Publication. The second author had a significant contribution in structuring the Publication and writing some of the sections. Publication P2 is based on the same research setting as Publication P1. The candidate conducted and analyzed the majority of the study alone. She also wrote the majority of the Publication. The second author contributed to polishing the writing. In Publication P3, the candidate is the sole author. She planned, conducted, analyzed, and reported the studies alone. Another researcher commented on the writing and helped to form the camera-ready version. Publication P4 is based on the data utilized in Publications P1 and P2. The candidate is the sole author. She analyzed and reported the studies alone. In Publication P5, the candidate is the sole author. She planned, conducted, analyzed, and reported the study by herself. Another researcher commented on the writing and guided her with the analysis methods. In Publication P6, the candidate planned, conducted, and analyzed the study alone. She wrote the majority of the Publication. The second author wrote some of the sections and also contributed to structuring and polishing the writing. In Publication P7, the candidate is the sole author. She planned, conducted, analyzed, and reported the studies by herself. Publication P8 is based on the earlier Publications. The candidate formed the analysis and wrote the majority of the Publication. The second author wrote some of the sections and also contributed to structuring and polishing the writing.
In Publication P9, the candidate was responsible for planning, conducting, and reporting the study. Another researcher guided the candidate in planning the study analysis. The second author contributed to the planning and writing of the paper. The rest of the authors contributed to polishing the paper and wrote some smaller segments.
1. Introduction
Modern software development often follows agile methodologies (Dingsøyr et al. 2012). Simultaneously, companies report that good user experience (UX) of software is becoming an increasingly important business goal (Rohn 2007, Publication P2). However, current agile approaches do not guide how to integrate UX work with software development activities. Despite customer centricity being one of the core values of agile methodologies (Highsmith et al. 2001), it was observed early on that it, as such, does not ensure good UX (Patton 2002). Beck (1999) declared that ”The best customers are those who will actually use the system being developed.” Still, in many cases the customer role in agile projects is unclear (Hoda et al. 2011). In addition, Jokela and Abrahamsson (2004) observed that active user involvement alone during development does not ensure usability of the developed software. Thus, it has become evident that it is essential to understand the actual user need; in general, a fluent user flow alone does not provide good UX (Hassenzahl 2008). In practice, understanding user needs necessitates upfront planning before starting the implementation (ISO 9241-210 2010, Brhel et al. 2015). This has become one of the core problems in integrating agile software development and UX work (Brhel et al. 2015, Salah et al. 2014, da Silva et al. 2011). Several attempts to balance the need for upfront design and the urge to start implementation iterations have been made (Brhel et al. 2015). One of the most common models for integrating UX work with agile methodologies (agile UX) is the “one sprint ahead” approach by Sy (2007) (da Silva et al. 2011, Brhel et al. 2015). However, despite these attempts, problems related to agile UX work, such as balancing the amount of upfront design and maintaining synchronization between UX design and implementation activities, seem to remain (Salah et al. 2014).
Therefore, it appears that UX work is not sufficiently integrated into agile development and improved approaches are needed.
1.1 Fields of Research
The research mainly belongs to the research fields of software engineering (SE) and human-computer interaction (HCI). The primary categories according to the ACM Computing Classification System (ACM CCS 2012) are “Software and its engineering → Software creation and management → Software development process management → Software development methods → Agile software development,” and “Human-centered computing → Human computer interaction → Interaction design.” “Software engineering is an area of engineering literature that deals with processes, methods and tools that aim for enabling complex computer-based systems to be built in a timely manner with quality” (Pressman 2010). ”Human-computer interaction is a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them” (Hewett et al. 1996). More specifically, this thesis concentrates on research of agile user experience development, referring to agile software development (e.g., Highsmith and Cockburn 2001) that puts emphasis on developing software that the user values. Agile development is considered in this thesis as a software engineering methodology which holds that late change in software development should not be avoided, as in many cases it is impossible to anticipate the complete set of requirements early. Instead, it suggests embracing change by reducing the cost of change throughout a project. Approaches to reduce the cost of late change include, for example, prioritizing work based on business value, small incremental iterations and short feedback cycles, cooperative work and open communication, and early and continuous software delivery. In addition, efficiency is sought through decreased management hierarchy, close cooperation with the customer, and self-organizing cross-functional teams. The definition is adapted from Highsmith and Cockburn (2001) and the Agile Manifesto (2001).
User experience is considered in this thesis as a “person's perceptions and responses resulting from the use and/or anticipated use of a product, system or service” (ISO 9241-210 2010). We accepted any related definition of agile development and user experience while collecting empirical data for the thesis, for the following reasons:

1. There are several definitions for both agile development and user experience; the concepts are still unestablished; see, for instance, Dingsøyr et al. (2012) for agile development and Lallemand et al. (2015) or Law et al. (2009) for UX.

2. There is a strong empirical focus in this thesis, as the candidate studied how companies carry out these activities in practice. The conception and application of these concepts vary between companies. In addition, the understanding of what UX is differs between industrial UX development and academic UX research (Väänänen-Vainio-Mattila et al. 2008).

3. Related research uses several terms that can be phenomenologically equated (for instance, integrating usability in agile software engineering, or integrating human-centered development with Scrum, an agile methodology). (See, e.g., da Silva et al. (2011) for the terms.)

We discuss the core concepts in more detail in Chapter 2.
1.2 Research Goals, Research Questions, and Publications
The research goal is to make recommendations on how to integrate UX work as part of agile enterprise software development, where business benefits and efficiency are often the main development goals. We aim at creating means to structure agile UX work in ways that enable software development in a timely manner with the expected quality. The need for the research arose from current challenges in industrial software development in aligning UX work with other agile software development activities in practice. By UX work, we refer to activities that aim at developing software that is usable, fulfills user needs, and provides the desired UX. It contains research, design, development, and evaluation activities. UX work is strongly based on the principles of human-centered development (HCD) (ISO 9241-210 2010), whereas agile software development is described with various
methodologies such as Scrum and Lean development (Highsmith 2002). However, current agile software development methodologies do not instruct how to perform UX development activities or how to ensure good UX of the outcome. Thus, it can be expected that agile UX practices are not established in companies; there are numerous ways to conduct these activities, and challenges are common. Therefore, we assume that improving agile UX development practices will also increase project performance. The explicit research questions are as follows:

RQ1. What are the current challenges in conducting UX work in software development projects following agile methodologies?

RQ2. What are the tasks and goals of agile UX work in enterprise software development?

RQ3. Which activities support the integration of agile development and UX work?

All the research questions are addressed both with literature research and via empirical studies. Table 1 presents an overview of the relation between research questions and included Publications P1 - P9 as well as a summary of research methods and main results per study.

Table 1. Mapping between studies, research questions, publications, research methods, and contribution.

Study I: RQ1, RQ2, RQ3. Publications P1, P2, P3, P4. Approach: qualitative, case studies. Methods and sample: surveys (N=150) and interviews (N=50) in 3 companies. Main results: organizational-level practices and challenges, UX tasks.

Study II: RQ2, RQ3. Publications P5, P6, P7. Approach: qualitative and quantitative. Methods and sample: weekly repeated survey (N=50) in 8 projects, interviews (N=14). Main results: UX tasks on project level explained via development roles.

Study III: RQ2, RQ3. Publication P9. Approach: quantitative. Methods and sample: survey (N=55) in 6 projects. Main results: defining dimensions of enterprise software UX, understanding how practitioners and users assess UX.

Studies I, II, III combined: RQ3. Publications: P8, thesis introduction. Approach: inductive; building theories from case studies (Eisenhardt 1989). Main result: a framework (BoB) for agile UX work.
1.3 Research Scope
The research focuses on agile development of tools and work-related systems in large and medium-sized companies. Thus, the research scope excludes, for example, the development of leisure systems such as games and development in an academic context such as student projects. In more detail, the research focuses on agile development of tools and work-related software systems that have a graphical user interface. The research is not limited based on participant roles (e.g., whether or not specialized UX experts participate in the work), nor on the size or maturity of the developed software outcome (e.g., enterprise systems vs. mobile apps, or new development vs. introducing new features to existing software). All domains and categories of tools and work-related software systems (e.g., embedded or safety critical) are accepted. However, considerations related to the domains themselves, such as regulations in certain domains (e.g., medical) or the interplay between hardware and software development, are beyond the scope.
1.4 Research Methods and Context
We studied existing literature following a systematic literature review procedure (Kitchenham et al. 2007) to understand currently popular and recommended practices and the challenges in them, and to reflect empirical findings against previous studies. We conducted empirical research with a case study approach in two phases: first on the company level and later on the project level. Then we formed a preliminary framework for agile UX work based on the findings and earlier research. Finally, we conducted a third round of empirical research using a quantitative approach. Largely, the candidate conducted the research on her own. Altogether, we conducted empirical research in seven Finland-based, international, software-intensive medium-sized and large companies developing enterprise software and specialized tools during the years 2011 to 2014. Mainly, we conducted the research within the Cloud Software program (CSW) (http://www.cloudsoftwareprogram.org) of DIGILE (Finnish Strategic Centre for Science, Technology and Innovation in the field of ICT and digital business), funded by the Finnish funding agency for technology and innovation, TEKES. The goal of CSW was stated as follows: “the Cloud Software program (2010-2013) aims to significantly improve the competitive position of Finnish software intensive industry in global markets”. CSW was a large national research program with 8 research institutes and over 20 software-intensive companies as participants.
1.5 Results and Contribution
The main contributions of this thesis comprise the following three subjects: 1) a listing of commonly challenging issues in agile UX work; 2) a description of identified tasks and goals related to agile UX work in enterprise software development; and 3) factors that support the integration of agile development and UX. Based on these three subjects, which answer research questions RQ1 and RQ2 and partially RQ3, we build a framework called BoB for integrating UX work in agile enterprise software development. While most of the previous models for agile UX concentrate on integrating the role of a UX specialist responsible for the UX work, BoB approaches the integration via UX tasks and shared ownership by the cross-functional team. Additional contributions include increased understanding of 1) UX tasks and roles in agile software development, and 2) dimensions of UX related to enterprise software.
1.6 Structure of the Thesis
The rest of this thesis is structured as follows: Chapter 2 introduces background and focal concepts. Chapter 3 presents related research. Chapter 4 introduces the research approach and methods. In Chapter 5, we present contributions of the included publications and the resulting framework for agile UX work. Chapter 6 discusses the results, presents implications for future work, and draws the final conclusions of the thesis.
2. Background and Focal Concepts
This chapter introduces approaches for software engineering and for HCI development, the concepts of enterprise software and user experience, and focal issues and concepts related to integrating UX work and agile development. We start with the concept of software engineering and continue by describing how it is realized with plan-driven and agile methodologies. We address in more detail only those agile methodologies that are relevant to the thesis, i.e., those that the studied companies utilized or that are essential for introducing the related research. In addition, although most of the methodologies and approaches are general and can be applied to various types of industrial development, we limit the discussion to software development. Considering the concepts of human-computer interaction (HCI), human-centered design (HCD), and user experience (UX), we introduce these only to an extent that allows the reader to understand the concepts and their relevance in academic research and in industry. Subsequently, we discuss the core elements of software engineering and human-centered design. Finally, we introduce the concept of integration and the organizational dimensions relevant to the integration of agile development and UX work.
2.1 Software and its Engineering
Software engineering concerns the processes, methods, and tools utilized for developing complex computer-based systems (Pressman 2010). A core function of software engineering is to fulfill user needs: “You build computer software like you build any successful product, by applying a process that leads to a high-quality result that meets the needs of the people who will use the product. You apply a software engineering approach” (Pressman 2001). A software product is the entity of “the programs, documents, and data produced as a consequence of the software engineering activities defined by the process” (Pressman 2001). On the other hand, it is also “the resultant information that somehow makes the user’s world better” (Pressman 2001). Thus, software is both the actual outcome and the perception of it.
Next, we discuss existing approaches to realize software engineering practices.
2.2
Plan-Driven Development
Plan-driven development methodologies emphasize gathering and documenting a complete set of requirements before starting implementation (Williams et al. 2003). The documentation becomes a plan that is followed throughout the rest of the project. The waterfall model is the best-known plan-driven methodology, and we concentrate on it here. The waterfall model divides development activities into a process of sequential phases based on the target of each activity (Figure 1). The origins of the waterfall model are in Royce (1970), whose model consists of the following sequential phases: requirements specification, design, implementation, integration, testing, installation, and maintenance (Royce 1970).
Figure 1 Royce's (1970) model for software development is considered the origin of the waterfall model.

In the requirements specification phase, all requirements are documented. The design phase results in the architectural design; the UI is also specified in this phase. After these phases, the software system is implemented according to the specification. Test cases are also planned during the implementation phase. When all the required implementation (including testing the code and all logic paths) is ready, software components are integrated to form a complete software system and the actual system can be tested. Once the system passes the defined tests, it can be installed and the maintenance phase begins. Despite the sequential nature of the process, Royce (1970) intended the model to be iterative and incremental. He suggested that the sequence of phases should be conducted twice: the first iteration yielding a prototype and the second the actual deliverable product.

Incremental Model: As the practices included in the waterfall model evolved from Royce's (1970) model, its iterative approach was largely overlooked (Larman et al. 2003). However, an approach called the incremental model later introduced an incremental waterfall process in which the software is produced via several releases, each increasing the functionality (Pressman 2010). Thus, the incremental model realizes Royce's (1970) original model by allowing multiple iterations of the waterfall process. We exclude models contributing towards agile development (such as evolutionary development models (e.g., May et al. 1996) and the Spiral model (Boehm 1988)) from the scope of this thesis. One of these models, Rapid Application Development (Martin 1991), is briefly discussed in section 2.4.1 Human-Centered Design. For instance, Boehm (2006) and Larman et al. (2003) present the history of software engineering in more detail.
2.3
Agile Software Development
Agile software development methods were created to address problems related to rapid change in market forces, system requirements, implementation technology, and project staff (Cockburn et al. 2001). Changes in requirements, scope, and technology can occur during a project’s life cycle, and thus the project team should be able to handle change instead of trying to stop it after a certain phase of the project (Highsmith et al. 2001). Thus, the cost of change needs to be reduced throughout the project (Highsmith et al. 2001). In order to reduce the overall cost of change throughout the project, agile methodologies encourage the following:

1. Reduce the time between a decision and its consequences via rapid and continuous customer feedback loops, through early and continuous delivery and through having user experts in the development team (Cockburn et al. 2001).
2. Reduce the cost of moving information between people by enabling constant communication and by embracing team spirit (Cockburn et al. 2001).
3. Minimize the content that needs to be changed by favoring simple solutions (Highsmith et al. 2001).
4. Improve design quality continuously (considering business, user, and technical issues) (Highsmith et al. 2001).
5. Detect defects early through constant testing (Highsmith et al. 2001).

The underlying values of agile methodologies are defined in the Agile Manifesto (2001). They include the effective use of people via constant collaboration and self-organization. Tools and processes are there to support the interaction between the stakeholders, not to be the focus of the action. Instead of inclusive rules, agile methods offer generative rules for managing software development in order to respect individuals and their creativity. All relevant stakeholders – sponsor, customer, user, and developer – should be in the same team in order to merge their multidisciplinary expertise and share information for more appropriate decisions. (Highsmith et al. 2001)

Another basic principle is the use of working software as a project measure (Highsmith et al. 2001). Since agile methods are feature-based, completed features are a clear means to present the progress and mark the end of an iteration (Highsmith 2002). In addition, working software enables validating the business value of a feature; customers and users understand working software better than documents or diagrams (Highsmith 2002). Highsmith (2002) defines the following as the major agile methodologies:

- Scrum (Schwaber and Beedle 2001)
- Dynamic Systems Development Method (DSDM) (Stapleton 1997)
- Crystal Methods (Cockburn 2004)
- Feature-Driven Development (FDD) (Palmer et al. 2001)
- Lean Development (Poppendieck et al. 2003)
- Extreme Programming (XP) (Beck 1999)
- Adaptive Software Development (ASD) (Highsmith 2000)

Of these, Scrum is the most popular (Dingsøyr et al. 2012).
Next, we present in more detail those agile methodologies that are relevant to this thesis, namely Scrum and Lean Software Development. In addition, we present two approaches (Kanban Development and Continuous Software Engineering) that are utilized to support agile ways of working but are not complete software engineering methodologies themselves. These methodologies and approaches were either in use in the studied companies or present in related research to an extent that requires their introduction here.
2.3.1
Scrum
Scrum (Figure 2) is an agile software development methodology that was first introduced by Takeuchi and Nonaka in 1986 (Takeuchi et al. 1986) and was further developed in the 1990s by, for instance, Schwaber (1997). It defines three core roles, namely the product owner (PO), the scrum master (SM), and the team, who together form the scrum team (Schwaber et al. 2001).
Figure 2 Scrum methodology defines three artifacts (product backlog, sprint backlog and working software) and four ceremonies (sprint planning, daily scrum, sprint review and sprint retrospective).

The team is responsible for building the software, the PO ensures the profit (or return on investment) of the software, and the SM helps the team and the PO to be successful in their tasks (Schwaber et al. 2001). Scrum defines three artifacts that build up the software system: the product backlog, which contains the features of a single release; the sprint backlog, which contains the tasks for a single iteration; and working software that can be demonstrated or delivered to the customer at the end of each sprint (Schwaber et al. 2001). In addition, Scrum defines four ceremonies that help the team to build the software, namely sprint planning, where the team selects tasks for the next iteration; the daily scrum, where the team discusses the daily work and possible impediments; the sprint review, where the customer and PO can use the live software; and the sprint retrospective, where the team together with the SM inspects and adapts the development process (Schwaber et al. 2001). Scrum utilizes time-boxed iterations called sprints that last at most four weeks, the most common length being two weeks (Schwaber et al. 2004).
2.3.2
Lean Software Development
Lean software development adapts the Lean ideology, which originates from Lean Production (Womack et al. 1990) and the Toyota Production System (Ohno 1988), for software development (Poppendieck et al. 2003). Table 2 introduces the core principles of Lean software development.

Table 2 Core principles of Lean software development, descriptions of those, and examples of techniques to realize each principle. (Summarized from Poppendieck et al. 2003)

Eliminate waste: Remove all unnecessary actions (waiting, extra features, partially done work) and warehousing, and focus on the value-adding activities. Techniques: just-in-time development (Kanban), value stream mapping.

Amplify learning: Allow trial and error and evolution instead of trying to get it right the first time; focus on effective learning instead of minimizing the number of required learning cycles. Techniques: learning cycles (build-measure-learn), iterative development, continuous improvement, feedback cycles.

Decide as late as possible: Decrease the cost of change by maximizing the amount of gained information before making a decision. Techniques: iterative development; simple, change-tolerant design to reduce the cost of later change.

Deliver as fast as possible: Delivering the software fast to customers shortens feedback cycles and allows late decisions, thus minimizing the risk. Techniques: pull system (Kanban), short cycle times.

Empower the team: Enrich the work environment, move decisions to the lowest possible level in the organization, remove power hierarchies. Techniques: continuous improvement, fostering intrinsic motivation, experimentation and feedback.

Build integrity in: Build a system that delights the customer (is functional, usable, reliable, and economic) and that is a smooth, cohesive whole. Techniques: model-driven design, early user feedback, refactoring, testing.

See the whole: See the big picture of the system; quality comes from how well the parts work together, so avoid sub-optimizing. Techniques: systems thinking, agile contracts, measuring.

Lean software development is more a collection of principles and enabling techniques than a well-defined process for agile development.
2.3.3
Kanban Development
Kanban is a method of the Lean ideology for avoiding overproduction by managing and limiting the work in progress (WIP) (Liker 2004). It originates from the manufacturing industry, where it serves as a sign to support the one-piece flow in a pull development mechanism (Liker 2004). In manufacturing, there can be a number of Kanban cards equivalent to the capacity of the system (Anderson 2010). One card is attached to each piece under development, and no additional work can be started without a card (Anderson 2010). Thus, the cards serve as a signaling and work-limiting system. When one-piece flow is working fluently, Kanban cards become unneeded. Poppendieck et al. (2003) introduced Kanban for software development in the form of a board. The original board by Poppendieck et al. (2003) is utilized for visualizing the workflow; it does not contain mechanisms for limiting the work in progress, nor does it guide a pull system. Nevertheless, it seems that in software engineering Kanban has been adopted particularly in the form of the Kanban board; for instance, Corona et al. (2013) discuss the Kanban board as the main tool to visualize and coordinate teamwork. In their analysis of 14 different Kanban boards, Corona et al. (2013) identified the following common categories of activities: specification, development, test, and deploy. Each of the task categories had a median WIP limit of 3–4 simultaneous tasks. According to a recent literature review of Kanban in software engineering, Kanban is mainly used as a supplementary technique to visualize and limit the work in progress, and thus it needs to be supported with other agile practices (Ahmad et al. 2013). However, in academic research, the usage of Kanban has mostly been reported on a high level without revealing the details of how it is used in the industry (Ahmad et al. 2013).
2.3.4
Extreme Programming
Extreme programming (XP) aims at reducing project risk, improving responsiveness to business change, improving productivity, and making the work more fun for the software developer (Beck 1999). XP builds on rules related to planning, managing, designing, coding, and testing. The scope of a release is decided in a planning session called the planning game. Iterations are kept small and planned based on business value; only those features are implemented that are required immediately. Automated acceptance tests act as a feature repository. The core practices of XP include early and continuous feedback cycles, test-driven development, automated tests, pair programming, continuous integration, and refactoring. Besides
implementation, pairs add value to analysis, design, and testing. The core team members are developers and an on-site customer. (Beck 1999) 2.3.5
Continuous Software Engineering
Continuous software engineering activities are a means to enable rapid development and release cycles in organizations. They often combine automation in software development and deployment processes in a way that minimizes the time required for such practices as integration, verification, deployment, and delivery of software (Fitzgerald et al. 2014). In addition to automated practices, “continuous” can also refer to practices conducted manually on a steady basis (Fitzgerald et al. 2014). Fitzgerald et al. (2014) define that continuous development consists of the following continuous activities: integration, deployment, delivery, verification and testing, security, and compliance. In addition to software development activities, the entire software life cycle can include continuous activities related to planning, operations, and improvement (Fitzgerald et al. 2014). In relation to agile software development methodologies, the concept of continuous can be equated with the concept of flow used in Lean development (Fitzgerald et al. 2014), and continuous integration is a practice utilized in XP (Beck 1999). Bosch (2014) claims that continuous integration has no business value if the organization does not follow agile working practices. Continuous development practices can be utilized with any agile development methodology. However, as there is a constant possibility to deliver whenever something is ready, continuous delivery can diminish the idea of the time-boxed iterations utilized, for instance, in Scrum.
2.4
Approaches for HCI Development
Human-computer interaction (HCI) has its origins in human-tool interaction and information processing at the dawn of computing (Grudin 2012). HCI started as a research area combining an information-processing psychology perspective with the study of human interaction with technology (Kaptelinin 2012 ref Card et al. 1983). HCI research truly emerged during the 1970s and 1980s, when people with no particular specialist background started to use computers and usability became more important (Grudin 2012). Moreover, there started to be a growing demand for approaches that give
guidance on how to develop for HCI and for good usability. We introduce such approaches next. 2.4.1
Human-Centered Design
HCD is the most prevalent approach for HCI development. The human-centered design (HCD, also called user-centered design, UCD) process is defined in (ISO 9241-210:2010) as follows: “Human-centred design is an approach to interactive systems development that aims to make systems usable and useful by focusing on the users, their needs and requirements, and by applying human factors/ergonomics, and usability knowledge and techniques. This approach enhances effectiveness and efficiency, improves human well-being, user satisfaction, accessibility and sustainability; and counteracts possible adverse effects of use on human health, safety and performance.” (ISO 9241-210:2010) The standard states that HCD can be “incorporated in approaches as diverse as object-oriented, waterfall and rapid application development” (ISO 9241-210:2010). Thus, agile methodologies are not explicitly mentioned in the standard. However, Rapid Application Development (RAD) (Martin 1991) has many of the qualities of agile methodologies, such as focusing on the business problem, iterative and incremental development, an active customer role, speed, and flexibility (Martin 1991, Gottesdiener 1995). On the other hand, it also contains properties supporting HCD, such as a strong focus on prototyping. In addition, the process of RAD is similar to the HCD process. The HCD process aims at setting goals together with users and iterating the design until these goals are met (ISO 9241-210:2010). In HCD, development starts from understanding and specifying the context of use (Figure 3). Then user requirements are specified, and a design is produced and evaluated against the requirements. This cycle is repeated until the implementation meets the requirements. Apart from the UI implementation, the outputs of these activities are numerous specifications and descriptions of the context, requirements, interaction, and evaluation results.
However, HCD utilizes such practices and artefacts as scenarios, mock-ups, and prototypes in design and evaluation (ISO 9241-210:2010). The core principles of HCD are as follows (ISO 9241-210:2010):
15
1. the design is based upon an explicit understanding of users, tasks and environments
2. users are involved throughout design and development
3. the design is driven and refined by user-centred evaluation
4. the process is iterative
5. the design addresses the whole user experience
6. the design team includes multidisciplinary skills and perspectives
Figure 3 Human-centered design process (ISO 9241-210:2010)

In addition to the core principles listed in the standard, Gulliksen et al. (2001, 2003) state that systems development should be both iterative and incremental. UI design should be presented in a simple way that allows users and stakeholders to understand it, and evaluation should happen in the context of use. Gulliksen et al. (2001) emphasize that prototyping should occur from very early in the process: right after specifying a few typical use case scenarios. Moreover, design solutions should be iterated rapidly and continuously in design-evaluate-redesign loops. (Gulliksen et al. 2001, Gulliksen et al. 2003)
2.4.2
Other Approaches to HCI Development
Besides HCD, other approaches for HCI development include activity-centered design (ACD), usage-centered design (UsCD), process-centered design (PCD) and
participatory design (PD). Of these, PCD is aimed at developing UIs for enterprise software that must realize an underlying business process (Henry 2007). The fundamental idea of PCD is to design the UI based on business process modeling (Indulska et al. 2009). Henry (2007) claims that while HCD is well suited for developing consumer software, it fails in the development of enterprise software that involves a business process. Henry (2007) considers PCD better for business software development since the UI should represent the workflow that corresponds to the underlying business process. HCD, in contrast, concentrates on the user interaction, which draws attention away from issues that are critical for business (Henry 2007). ACD concentrates on the activities people engage in instead of on the humans themselves (Gay et al. 2004). ACD is based on activity theory (Norman 2005), which models human activity particularly in the context of tool-mediated production and sign-mediated communication (Engeström et al. 1999). In activity theory, the actor is a subject who utilizes tools to transform an object into an outcome within defined roles, rules, and a social context, or community (Engeström 1987). Thus, the activity itself can be seen as the mediator between the subject and the object (Leontiev 2014)1. An activity has a purpose that is sought via actions (tasks) and operations (Engeström 1987). Activity can be equated with motive, action with goal, and operation with instrumental conditions (Engeström et al. 1999). Norman (2005) motivates the need for ACD by noting that although HCD has improved usability, software is still complex for the user. Moreover, many everyday objects such as cameras and cars have evolved over time based on user feedback and via a deep understanding of the activities that users perform (Norman 2005). Thus, besides a deep understanding of the user, ACD emphasizes the importance of a deep understanding of the underlying technology, the utilized tools, and the human reasons for the activities (Norman 2005). UsCD concentrates on users’ roles and tasks in terms of the system being developed: UsCD “uses abstract models to systematically design the smallest, simplest system that fully and directly supports all the tasks users need to accomplish” (Constantine et al. 2002a). UsCD is a model-driven approach that emphasizes utilizing use cases. It is related to ACD; however, UsCD concentrates more on the usage and users’ task accomplishment than on the higher-level activity.
1
Originally published in Philosophy in the USSR, Problems of Dialectical Materialism (Moscow, 1977, pp. 180–202). Available at http://www.marxists.org
Kuutti et al. (2014) introduced practice theory for HCI research. They consider that HCI has an “interaction perspective” and contrast it with their practice theory, which offers a “practice perspective” (Kuutti et al. 2014). HCD considers user interaction the core phenomenon, which happens in a context that includes all prevailing conditions in which the interaction occurs (Kuutti et al. 2014). Moreover, as the standard definition of usability (ISO 9241-210:2010) states, usability is the extent to which a specified user can use the system under specified conditions, i.e., a context. Thus, the context is considered static (Kuutti et al. 2014). In contrast, practice theory focuses on studying “computer-supported practice” (Kuutti et al. 2014). It gives equal value to the user interaction and the context of use as factors that are important or even interwoven into the practice (Kuutti et al. 2014). Finally, the core idea of PD is to have users participate in design together with designers (Sanoff 2006). In PD, designers design together with users instead of for users (Sanders 2002). Thus, users are seen as partners instead of subjects (Sanders et al. 2008). PD often uses visual artefacts such as prototypes, enhanced with face-to-face discussions, to elaborate the design in a stakeholder group (Bødker et al. 1990, Sanders et al. 2010). Thus, its techniques are similar to those of HCD. To summarize, HCD focuses on the human and her/his needs and abilities. Its fundamental goal is usability and user satisfaction. PD can be considered similar to HCD, but it involves users as active participants instead of subjects. In contrast, ACD and other activity-based approaches focus on users’ goal-oriented tasks and purposeful activities. The fundamental goal of activity-based approaches is supporting users’ tasks and activities.
Finally, PCD concentrates on modeling business processes, and its fundamental goal is aligning the user’s task flow with the underlying business process in enterprise software.
2.5
Fundamentals of Development Processes
Royce (1970) and Pressman (2001) consider analysis and build the core activities of software development. The HCD process (ISO 9241-210:2010) follows an understand-specify-design-evaluate cycle. Lean ideology utilizes the plan-do-check-act process (e.g., Liker 2004) and, later, the more lightweight build-measure-learn cycle (Ries 2011). These were adopted from Deming’s plan-do-study-act and Shewhart’s specification-production-inspection processes and have their roots in 17th-century philosophy of science, namely the hypothesis-experiment-evaluation cycle introduced in Bacon’s New Method (Moen et al. 2010 ref. Deming 1950, Shewhart 1939). In the context of software development, all the mentioned approaches can be equated to a cycle where first a preliminary understanding needs to be gained. Then something measurable is built based on that understanding. The outcome is evaluated, and the cycle is repeated based on the results (if needed). This basic cycle forms an iteration. Repeating the cycle by adding ‘a little’ more in each iteration makes the process incremental. Core questions in different development approaches are the size of an increment, the size of an iteration, and the duration or size of each phase in the cycle. For instance, waterfall development and traditional HCD have emphasized the size of the “understand” phase before building anything. Indeed, as UX or interaction design is often understood holistically (e.g., Gulliksen et al. 2003, Salah et al. 2014), the size of an increment can easily become large (Salah et al. 2014). In software development, the “understand” activity is often called “upfront design” as it traditionally has happened before starting the implementation. In contrast to plan-driven approaches, agile software development is based on little design up front and on gaining the understanding through building and evaluation. It also emphasizes small increments and short iterations, i.e., short validation cycles.
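The shared understand-build-evaluate cycle can be expressed as a simple loop. The sketch below is an illustrative abstraction with placeholder functions; it is not a process prescribed by any of the cited methodologies, and the toy usage exists only to show how repeating the cycle makes the process iterative and incremental.

```python
# Illustrative sketch of the shared iterate-and-increment cycle:
# gain understanding, build something measurable, evaluate, repeat.
# All function names here are placeholders, not part of any cited methodology.

def iterate(understanding, build, evaluate, goal_met, max_iterations=10):
    """Repeat build-evaluate cycles, refining the understanding each round."""
    for _ in range(max_iterations):
        increment = build(understanding)   # add 'a little' more
        result = evaluate(increment)       # measure the outcome
        if goal_met(result):
            return increment
        understanding = result             # feed findings back into the cycle
    return None                            # goal not reached within the budget

# Toy usage: grow a value by one increment per iteration until a target is met.
outcome = iterate(
    understanding=0,
    build=lambda u: u + 1,
    evaluate=lambda inc: inc,
    goal_met=lambda r: r >= 3,
)
print(outcome)  # 3
```

The key point of the abstraction is the size of each increment and the length of each cycle: plan-driven approaches spend long in the first (understand) step, whereas agile approaches keep every step of the loop short.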
2.6
Enterprise Software
By enterprise software we refer to software that is intended or used for work purposes in companies or other work-related organizations. Abrari et al. (2006) define enterprise software as “a software program that is used by business people to increase productivity through automated data processing.” Moreover, enterprise software must realize a set of business requirements (Abrari et al. 2006). Henry (2007) describes that a business application typically enables a business process, or a part of it, in order to achieve specific business objectives. Enterprise software is often a high-cost, long-term investment for a company (Henry 2007). Vuolle et al. (2010) describe that the aim of mobile business software is to create value for the customer organization, including its employees and other stakeholders, instead of for individual consumers. We broaden this definition from mobile business services to include all enterprise software. Moreover, we include the supplier company itself in the definition of customer organization, i.e., in-house tool development. While enterprise software creates value for the customer organization
using the software, employees of the customer organization utilize it to create value for their customers and stakeholders (Vuolle 2011). Moreover, the performance objectives, usage scenarios, and buying criteria of an enterprise are different from those of individual consumers (Henry 2007). The use of enterprise software is often mandatory for the user: the employer selects the software employees will use in their work in order to create value for the organization itself and its customers. Enterprise software is primarily measured by its ability to realize business requirements. UX-related measurement scales that are utilized for the evaluation of enterprise software mainly measure usefulness, productivity, performance, and ease of use. The Technology Acceptance Model (TAM) by Davis (1989) predicts users’ intention to use through perceived usefulness and perceived ease of use. The Technology Satisfaction Model (TSM) is a variant of TAM in which the intention to use is replaced with user satisfaction, since the use of enterprise software often is mandatory for the user (Lee et al. 2008). In addition to perceived usefulness and perceived ease of use, Lee et al. (2008) include perceived loss of control and perceived market performance in their scale. Finally, Task-Technology Fit (Goodhue et al. 1995) measures the impact on individual performance via effectiveness, productivity, and the system’s ability to increase the productivity of the user. While other scales exist, we limit ourselves to introducing only a few popular ones as examples of such scales in this thesis.
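To make concrete how such scales typically yield scores, the sketch below averages Likert-item responses into per-construct scores in the style of TAM. The item names and their grouping into constructs are hypothetical illustrations, not Davis’s (1989) actual questionnaire items, and real studies would additionally validate the scale (e.g., reliability and factor structure) rather than only compute means.

```python
# Hypothetical sketch: aggregating Likert-item responses (scale 1-7) into the
# two TAM constructs, perceived usefulness (PU) and perceived ease of use
# (PEOU). Item names and groupings are illustrative, not Davis's instrument.

def construct_score(responses, items):
    """Mean of the listed items; a simple unweighted aggregate per construct."""
    values = [responses[item] for item in items]
    return sum(values) / len(values)

# One respondent's (fabricated) answers on a 1-7 agreement scale.
responses = {
    "improves_my_productivity": 6,
    "useful_in_my_job": 7,
    "easy_to_learn": 3,
    "easy_to_use": 4,
}

pu = construct_score(responses, ["improves_my_productivity", "useful_in_my_job"])
peou = construct_score(responses, ["easy_to_learn", "easy_to_use"])
print(pu, peou)  # 6.5 3.5
```

In TSM-style studies the same aggregation would feed a satisfaction outcome instead of intention to use; the arithmetic is identical, only the modeled outcome differs.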
2.7
User Experience
The standard definition of user experience (UX) is as follows: “person's perceptions and responses resulting from the use and/or anticipated use of a product, system or service” (ISO 9241-210:2010). The definition is ambiguous, and numerous others exist (Lallemand 2015, Law et al. 2009). Commonly, UX is understood as subjective, context-dependent, and dynamic (Law et al. 2009). UX also has temporal dimensions: it can happen before usage (anticipated UX), during usage (momentary UX), after usage (episodic UX), or over time (cumulative UX) (Lallemand et al. 2015). According to a recent study (Law et al. 2015), in academic research the most commonly utilized frameworks for UX are the hedonic-pragmatic model (Hassenzahl 2004) and sense-making experience (McCarthy et al. 2004). The hedonic-pragmatic model divides user experience into a hedonic, or non-utilitarian, dimension and a pragmatic, or instrumental, dimension (Hassenzahl 2004). Hassenzahl (2004) further divides the hedonic into two subdimensions of identification and stimulation, while the instrumental contains mostly items related to usability and usefulness. Usability is often seen as a necessary precondition for good UX (Hassenzahl 2008, Lallemand et al. 2015). The formal definition of usability is as follows: “extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use” (ISO 9241-210:2010). Preece et al. (2002) describe the difference between UX and usability as follows: “user experience goals differ from the more objective usability goals in that they are concerned with how users experience an interactive product from their perspective rather than assessing how useful or productive a system is from its own perspective.” Väänänen-Vainio-Mattila et al. (2008) discuss the differences in the conception of UX between academic UX research and industrial UX development. They conclude that while the research concentrates mostly on hedonic aspects and emotions, companies concentrate more on functionality and usability issues (Väänänen-Vainio-Mattila et al. 2008). Moreover, although early HCI studies concentrated almost exclusively on task- and work-related usability issues and the achievement of behavioral goals (Hassenzahl et al. 2006), UX research has mainly concentrated on consumers and leisure systems (see, e.g., Diefenbach et al. (2014) for a categorization of publications applying the hedonic). Thus, it is unclear what shapes the UX of enterprise software or work-related tools: what are its dimensions, and does it differ from the UX of leisure systems?
2.8
Concepts Related to Agile UX Work
We have equated certain related terms as we conducted searches for the related research review. We base our taxonomy on the one used in (da Silva et al. 2011). Table 3 presents the equated concepts. Thus, we studied UX AND AGILE (DEVELOPMENT) in the context of SOFTWARE ENGINEERING. We included ‘DEVELOPMENT’ in the substring for searching ‘AGILE’, as using ‘agile’ or ‘lean’ as keywords (without the defining word ‘development’ or an equivalent) resulted in thousands of unrelated articles. We excluded ‘usability’ as such from the UX keywords for the same reason. We limited
the study to the field of software engineering, and thus the third search substring (in addition to UX and agile) was ‘software development’ or ‘software engineering’. Table 4 presents the structure of the search string.

Table 3 Focal concepts related to agile UX work

Core concept: Equated concepts
UX: usability engineering, human-computer interaction, computer-human interaction, human factor, human factors, user-centered design, human-centered design, human-centered software engineering, user-centered software engineering, user experience
AGILE: agile development*, lean development*, scrum, extreme programming, feature driven development, dynamic system development, Kanban
(agile or lean) development*: development, engineering, software development, methodology, method, software methods, practice, practices, project, life cycle, lifecycle
SOFTWARE ENGINEERING: software development, software engineering
We searched the following databases: IEEE Xplore Digital Library, Elsevier ScienceDirect, CiteSeerX, and Scopus and limited the search on title, abstract and keywords. In addition, we conducted individual searches in Google Scholar and various conference proceedings and journals and utilized snowballing method for identifying related research. When we reviewed and analyzed the identified related reseach, we did not utilize as straightforward equation as during the searches. We considered comparable the concepts of human-centered design, user-centered design, and usability engineering as they all describe methodologies or processes for organizing usability or UX work. In addition, we considered ‘usability’ and ‘UX’, ‘usability work’ and ‘UX work’, and ‘UX practitioner’ and ‘usability practitioner’ comparable pairwise as the research considered industrial UX work, and in the industry those concepts are close to each other (Väänänen-Vainio-Mattila et al. 2008, Wright et al. 2007); see, for instance, the use of concepts in (Miller et al. 2009). We further equated the roles of UX/usability/UI designer, -engineer, -specialist etc. and activities (designing, engineering etc.). We did not distinquish the related research based on the studied context of agile methodologies. However, we acknowledged the individual
differences between the methodologies in the analysis. In contrast to agile methodologies, we did not consider HCD or a related process as a prerequisite for conducting agile usability or UX work.

Table 4 Structure of our search string.

UX (or equated concept)
AND
(Agile or Lean) Development (or equated concept), or Scrum or XP or FDD or DSDM or Kanban
AND
software development or software engineering
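The three-part structure above can be sketched programmatically. The following is an illustrative sketch only: the term lists are abbreviated, the quoting and wildcard syntax of the actual databases (IEEE Xplore, Scopus, etc.) differs, and the function and variable names are our own.

```python
# Illustrative sketch: assembling the three-part boolean search string
# described in Table 4. Term lists are abbreviated; exact database
# syntax (quoting, wildcards) varies per search engine.

ux_terms = ["UX", "usability", "user experience", "human-computer interaction",
            "user-centered design", "human-centered design"]
agile_terms = ["agile development*", "lean development*", "scrum",
               "extreme programming", "kanban"]
se_terms = ["software development", "software engineering"]

def any_of(terms):
    # OR-join one concept group, quoting each phrase
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

# AND-join the three concept groups
search_string = " AND ".join([any_of(ux_terms), any_of(agile_terms), any_of(se_terms)])
print(search_string)
```

The same structure maps directly onto the advanced-search fields of the databases listed above, with one OR-group per focal concept.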
2.9 Dimensions of Integration of Agile and UX Development
Integrating agile and UX development involves a combination of aspects from the fields of software engineering, human-computer interaction, and management. Definitions of the integration of HCI work and SE practices mostly concentrate on organizational-level aspects. Bloomer et al. (1997) state that “usability is successfully integrated into an organization when a strategy is developed which leads to key usability benefits and supports overall business objectives.” Schaffer (2004) considers the integration likely to succeed when it has become routine in an organization to include usability professionals and to utilize an HCD process in development projects. Venturi et al. (2004) have an itemized definition: “’UCD integration’ is achieved when every phase of the product lifecycle follows the principles of User Centered Design, when UCD team is provided with the proper skills and experience, it is supported by the management commitment and a proper UCD infrastructure and when awareness and culture are properly disseminated in and out of the organization.” Thus, Schaffer (2004) and Venturi et al. (2004) consider integration successful when usability practitioners and HCD practices have a place in the organization, whereas Bloomer et al. (1997) align usability strategy with business objectives. Seffah et al. (2005) take a more practical approach to integration. Although they do not provide a clear definition, they discuss the interplay between SE and HCI and the extent to which the daily work of the practitioners of those disciplines should be connected. Considering these approaches to the integration, it is clear that the problem is multidimensional and several aspects, from the individual practitioner to the organizational culture, need to be addressed.

2.9.1 Dimensions of Project Performance
Pressman (2001) divides software project management into the following factors: people, product, process, and project. People include the project team and stakeholders, whereas product refers to the scope and decomposition of the problem to be solved during the project. Process refers to the series of predictable steps – or phases and activities – conducted to develop the software (Pressman 2001). Finally, a “project is a temporary endeavor to create the result” (PMI Institute 2004), the result being here the software. In the context of agile development, Chow et al. (2008) identified dimensions related to the success and failure of agile projects based on earlier research (Table 5). Barksdale et al. (2012) conducted a non-systematic literature review of agile usability integration strategies. They categorized their findings under dimensions similar to those of Chow et al. (2008). Brhel et al. (2015) adopted the taxonomy of Barksdale et al. (2012) in their systematic review of user-centered agile software development, with the minor change of combining people and social into one single dimension. Table 5 lists the dimensions identified by each of the aforementioned authors. The organizational dimension includes such factors as organizational culture and management commitment (Chow et al. 2008). The team, customer, and other stakeholders form the people dimension (Brhel et al. 2015, Chow et al. 2008). In the agile UX context, it also includes changes in the team composition to integrate the role of the UX specialist (Brhel et al. 2015). Barksdale et al. (2012) differentiate a social dimension from the people dimension; it mainly includes knowledge creation and sharing and the social context of the team (Barksdale et al. 2012, Brhel et al. 2015). The process dimension considers factors related to the project definition process itself (Chow et al. 2008) and merging UX and agile development processes to include both perspectives (Brhel et al. 2015).
Practices relate more to the daily work and ways of working in the agile UX context, such as using prototypes (Brhel et al. 2015). Chow et al. (2008) define technical factors as working techniques to include “agile software techniques and delivery strategy”, which is more in line with practices as defined by Brhel et al. (2015) and Barksdale et al. (2012). In contrast, Brhel et al. (2015) and Barksdale et al. (2012) define technology as “the use of technological means to support and coordinate activities.” Finally, Chow et al. (2008) also mention a project dimension, which includes such factors as the type, size, and length of the project.

Table 5. Dimensions related to success and failure of agile projects (Chow et al. 2008) and to integrating UX work with agile methodologies.

Chow et al. (2008) | Barksdale et al. (2012) | Brhel et al. (2015)
Organizational | Practices | Practices
People | Process | Process
Process | Technology | Technology
Technical | People | People and Social
Project | Social | –
Dimensions presented in Table 5 are in line with areas of general organizational performance measurement research. For example, Kennerly and Neely (2003) argue that the following are enabling factors for the evolution of performance measurement systems: process, people, systems, and culture. We conclude that the aforementioned broad phenomena have an impact on agile UX work and the integration of the related areas. However, since all these dimensions are extensive and abstract, they need to be operationalized to be measured and managed. Unfortunately, there are practically no established metrics available for measuring the performance of development work (e.g., Aaltonen et al. 2012 p. 73-76).

2.9.2 Development Work Performance and Success
Factors affecting knowledge work productivity are often classified into inputs, processes, and outputs (e.g., Aaltonen et al. 2012, Melo et al. 2013, Stainer et al. 1998). Dimensions for evaluating both the development work performance (inputs and process) and the success of the outcome (outputs) often include aspects of quality, scope, time, and cost. Chow et al. (2008) utilize these four in assessing the success of agile projects. In addition, we utilized them in Study II to evaluate project success. Quality and scope can be considered metrics (or objectives) of effectiveness, that is, success in producing the desired or intended result, while time and cost measure efficiency, throughput, or capacity. However, these are only general dimensions to evaluate project success; the actual metrics need to be formed according to the requirements of the case in question.
3. Related Research
This chapter discusses current approaches to agile UX work and the related research. First, we introduce relevant secondary studies and doctoral theses. Then we present the related research in the context of approaches or processes, practices, people and social factors, and technological or tool-related factors. Finally, we position our work by presenting the gap in current research through a comparison of the results of two recent literature reviews.
3.1 Overview of Secondary Studies and Theses on Agile UX Work
There are a few recent literature reviews available on agile UX or UCD work. The earliest identified one (Sohaib et al. 2010) addresses (i) observed tensions between usability engineering and agile methods that make the disciplines difficult to integrate, and (ii) suggested approaches to realize the integration. Da Silva et al. (2011) report a literature review on popular practices in agile UCD work. Salah et al. (2014) study challenges and practices in agile UCD integration. Jurca et al. (2014) report mainly metadata on agile UX studies in their systematic mapping study. In addition, they report recommendations identified in existing work (Jurca et al. 2014). Salvador et al. (2014) identify usability techniques utilized in agile development. The most recently reported literature review investigates principles that constitute a user-centered agile software development approach (Brhel et al. 2015). To conclude, literature reviews on the topic mainly concentrate on the practices and techniques utilized in agile UX work (da Silva et al. 2011, Salvador et al. 2014, Jurca et al. 2014, Salah et al. 2014). In addition, challenging aspects of the integration have been identified (Sohaib et al. 2010, Salah et al. 2014). The systematic review of Brhel et al. (2015) is so far the most extensive and analytic one, as it classifies various aspects related to agile UX integration. In addition to reviews, some doctoral theses have been published on the topic. The main contributions of the identified ones are as follows. Da Silva (2012) suggests a framework for integrating interaction design with agile development. His framework has a structure similar to Sy’s (2007) “one sprint ahead” model on which
he applies the most common practices identified in a systematic review (da Silva 2012). Ferreira (2012) investigates the combination of agile development and UX work: how it is accomplished and what shapes it in practice. She concludes that integration of agile development and UX design is an ongoing achievement in practice. Moreover, the integration is achieved by (i) maintaining focus and coordination, (ii) mutual awareness, (iii) expectations about acceptable behavior, (iv) negotiating progress, and (v) engaging with each other (Ferreira 2012). Lárusdóttir’s (2012) doctoral research concentrates on user-centered evaluation practices particularly in Scrum development. She concludes that empirical and informal qualitative methods are common in Scrum whereas quantitative evaluation is rarely conducted (Lárusdóttir 2012). Finally, Salah (2013) studies maturity models in the context of integrating agile development and UCD. Her main contribution is a maturity model for assessing specifics, activities, success factors, and challenges in the agile UCD integration domain (Salah 2013). Next, we introduce related research utilizing the taxonomy of Brhel et al. (2015), namely process, practices, people, and technical factors.
3.2
Approaches for Process Integration of UX and Agile Software Development
There are a few methodologies or models for structuring agile UX (or usability) work in agile projects. We present those and give a short discussion contrasting process-related principles derived by Brhel et al. (2015) with fundamentals of software development. In Lightweight Usage-Centered Design (Constantine et al. 2002 b), certain iterative activities are executed within a team that includes both UI designers and software developers, and one or more users or user surrogates. Those activities are related to defining user roles and users’ tasks in relation to the planned system, creating paper prototypes, and finally implementing the UI based on the paper prototype and associated task cases. Components that are not directly dependent on the UI design can be implemented in parallel with the UI refinement process. Constantine et al. (2002 b) argue for minimal up-front design during which the overall interaction architecture is created. Patton (2002) adapts the Lightweight Usage-Centered Design approach in his Interaction Design methodology.
Rapid Contextual Design (Beyer et al. 2004) is an agile UCD methodology based on the following four axioms: 1. Separate design from engineering. 2. Make the user the expert. 3. Keep up-front planning to a minimum. 4. Work in quick iterations. Rapid Contextual Design is based on Contextual Design introduced in (Beyer et al. 1998), and it was developed while working with teams utilizing XP (Beyer et al. 2004). The Rapid Contextual Design process starts with setting a project focus. Then a few users are interviewed in their own context (Contextual Inquiry) and users’ tasks are modeled based on the gathered and analyzed user data. Then user stories are built together with the whole project team and the planning game is run with those user stories. Detailed UI design is drawn during the first implementation iteration, separated from the coding work. The UI design is tested with users on paper prototypes in interview sessions. The tested UI design is then handed over to developers for implementation. UI design and testing work continues similarly during the following iterations. Thus, Beyer et al. (2004) introduced the idea of parallel UI design and development tracks in Rapid Contextual Design. They suggested that the UI is designed and tested on paper prototypes an iteration ahead of development. They also suggested that the UI team can test the working code an iteration behind the implementation. (Beyer et al. 2004) Miller’s (2005) and Sy’s (2007) ‘one sprint ahead’ model (Figure 4) is perhaps the most common framework for agile UX work. Others, for instance Fox et al. (2008) and da Silva (2012), have modified or further developed it. The model is based on experiences of agile adoption in a company utilizing the Adaptive Software Development (ASD) methodology (Miller 2005). The model (Miller 2005, Sy 2007) divides the work into two parts, namely design upfront and during-development activities; the first is analogous to ASD’s speculate cycle and the latter to the collaborate cycle.
Conducting some design upfront (SDUF) is a common and recommended approach in agile UX development (da Silva et al. 2011). SDUF is often described as the ‘sprint zero’ that is conducted prior to implementation work (Sy 2007, Salah et al. 2014). After the sprint zero, the ‘one sprint ahead’ model (Sy 2007) divides implementation and UX design work into parallel tracks similarly to Beyer et al. (2004). The idea is that UX design work constantly keeps an iteration ahead of implementation. Basically, UX specialists work with four iterations at the same time: user studies can be conducted two iterations ahead, UX design work is conducted one iteration ahead, and evaluation is conducted an iteration behind the development.
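The staggering of the tracks can be summarized as iteration offsets. The sketch below is our own simplification for illustration; the function `ux_activities` and the dictionary keys are not part of Sy's (2007) model beyond the two-ahead/one-ahead/one-behind relation described above.

```python
# Illustrative sketch of the 'one sprint ahead' staggering (Sy 2007):
# while developers implement iteration n, UX specialists gather user
# data for iteration n+2, design iteration n+1, and evaluate the
# output of iteration n-1. The function and key names are our own.

def ux_activities(dev_iteration: int) -> dict:
    return {
        "user_studies_for": dev_iteration + 2,   # two iterations ahead
        "ux_design_for": dev_iteration + 1,      # one iteration ahead
        "implementation_of": dev_iteration,      # current development work
        "evaluation_of": dev_iteration - 1,      # one iteration behind
    }

print(ux_activities(3))
```

The offsets make the coordination cost visible: a UX specialist attached to development iteration 3 is simultaneously touching artifacts of iterations 2 through 5.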
Figure 4 The “one sprint ahead” approach (Sy 2007).

Memmel et al. (2007a, 2007b) suggest the CRUISER lifecycle (Cross-discipline user interface and software engineering lifecycle) for integrating the SE and HCI disciplines. They borrow from various user-centered approaches and integrate those with XP. CRUISER starts with (i) an initial requirements up-front phase, where usage scenarios are created based on essential user roles’ needs and task models. In this phase, the UI designer decides on the degree of user involvement and balances user performance, UX, and hedonic quality demands. The second phase is (ii) an initial conceptual phase, where the UI and software architecture are designed in parallel via prototypes towards a single solution. Coding is started in the third phase, (iii) the construction and test phase. Test cases are planned in the beginning and executed only if the usability specialist of the team considers it necessary. Memmel et al. (2007) suggest pairing a usability expert with a developer in pair programming and conducting parallel UI and backend coding. The final step is (iv) the deployment phase, in which actual user feedback is gained and change and new feature requests can be planned for future iterations.

Brhel et al. (2015) present a comprehensive summary of aspects related to the process perspective in agile UX work in their systematic review (Figure 5). They form process principles based on their findings of the included primary studies. Their first principle is to have separate product discovery and product creation phases. The second principle is to conduct design and development iteratively and incrementally. The final principle is to have parallel interwoven creation tracks. Of the analyzed publications, 18 support this practice. However, only five of them are empirically grounded. Moreover, the concept of designing one sprint ahead is mentioned in 17 of the publications that Brhel et al. (2015) analyzed; of those, only two present empirical evidence.

Little design up front (39); Iterative design / development (33); Cohesive overall design (22); Product vision / innovation (22); Parallel tracks (UX and development) (18); Deferred development (17); Incremental design / development (11); Cycle zero (10); Synchronization / integration (7); Big design up front (5)
Figure 5 Suggested process-related aspects of agile UX work in the related research. Numbers are occurrences in the identified publications (N = 83). (Brhel et al. 2015)

To conclude, process models for integrating agile UX work in general value separation of design and development work. The principle of separate product discovery and product creation phases (Brhel et al. 2015) is in line with the idea of separate analysis and build phases of Royce (1970). The second principle of Brhel et al. (2015) follows a core principle of agile methods, iterative and incremental development. The third principle of parallel interwoven creation tracks is not empirically sound, as it is supported by only a few empirical studies. We claim that although numerous studies investigate the process perspective of agile UX integration, current approaches do not provide sufficient solutions for structuring it.
3.3 Agile UX Work Practices
There are also practice-oriented approaches to integrating agile and UCD activities. For instance, Kane (2003) suggests using discount usability engineering techniques, such as simplified thinking aloud, which can be considered lightweight versions of the original methods. Sohaib et al. (2011) integrate discount usability in the XP project life cycle by utilizing the following usability practices: scenarios are utilized along with user stories in the exploration phase, card sorting is conducted as part of release planning, heuristic evaluation is utilized during acceptance tests, and finally thinking aloud is utilized during productionizing (Sohaib et al. 2011). In addition, the Rapid Contextual Design methodology describes in detail the UCD methods that should be utilized during the development process (Beyer et al. 2004). Current practices in agile UCD work have been thoroughly researched (e.g., da Silva et al. 2011, Salvador et al. 2014, Brhel et al. 2015). Table 6 presents the taxonomy of Brhel et al. (2015), which divides agile UCD practices into user research, conceptualization, design, and evaluation practices. Salah et al. (2014) use the categories of UCD infrastructure, people, and process. Salvador et al. (2014) divide the identified usability techniques into inquiry, inspection, prototyping, and testing methods.

Table 6 Classification of the most common agile UCD practices. Occurrences of each practice in the identified publications are in brackets. (Brhel et al. 2015)

User research: Contextual inquiry (10), Task analysis (6), Focus groups (4), Interviews (2), Surveys (2)
Conceptualization: User stories (21), Guidelines (16), Scenarios (16), Requirements (14), Prioritization (14), Personas (13)
Design: Prototyping (40), Mock-ups (17), Wireframes (9)
Evaluation: User testing (37), Expert evaluation (26), Customer or user involvement (21), Done criteria (2)
Earlier survey studies regarding the usefulness and popularity of HCI practices in industrial software development are in line with the results of the SLR of Brhel et al. (2015). Those studies are not included in the SLR, and they do not concentrate on agile development. For instance, Bark et al. (2006) surveyed 179 usability professionals from Nordic countries and found that user tests and rapid prototyping are considered the most useful HCI methods. Mao et al. (2001) surveyed 100 usability practitioners and found field studies (including contextual inquiry), user requirements analysis, and usability evaluation to be the practices rated the most important, whereas the most often used practices were usability evaluation, task analysis, informal expert review, and field studies. In a survey by Gulliksen et al. (2004), 194 practitioners (including usability specialists, managers, developers etc.) rated the following practices the best: (i) think-aloud (a technique utilized especially in user tests, in which the user verbalizes the thoughts behind his or her actions on the UI), (ii) low-fidelity prototyping, (iii) interviews, (iv) field studies, and (v) scenarios. Jia et al. (2012) partially replicated the study of Gulliksen et al. (2004) in a Scrum context and discovered the following practices as the best rated: (i) workshops, (ii) informal usability evaluation with users, (iii) meetings with users, (iv) scenarios, and (v) formal usability evaluation with users. We conclude that prototyping, user testing, user stories, and field studies are the most often used practices in agile usability development. Informal and less structured practices were the most popular before agile methodologies became prominent (Mao et al. 2001), and they seem to be the most popular within agile methodologies as well (Cajander et al. 2013, Jia et al. 2012, Lárusdóttir et al. 2014). Incomplete integration of usability/UX work can be one reason for utilizing informal practices instead of formal ones. However, formality can also hinder the adoption of usability practices in companies. Thus, we consider that practices for UX and usability work should be informal, lightweight, and easy and fast to apply to be efficiently adopted in industrial software development.
3.4 People and Social Factors in Agile UX Work
People factors are related to collaboration, communication, decision-making, knowledge transfer, roles and responsibilities, and organizing people resources (Brhel et al. 2015). There is contradictory evidence in the related research considering people and social factors of agile UX work. Agile methodologies emphasize the role of the multidisciplinary team (e.g., Highsmith et al. 2001), while HCD methodologies usually emphasize the responsibility of a usability specialist in developing usable software (Brhel et al. 2015 ref. (Cooper 2004, Gould et al. 1985, Sharp et al. 2007)). Most of the related research suggests approaches that value separation of UX specialists and developers on the team level: UX specialists and developers should form two separate teams (Brhel et al. 2015, da Silva et al. 2011). However, there is some contradicting evidence suggesting that cross-functional teams including a UX specialist are more efficient (e.g., Ferreira et al. 2011).
Moreover, although separate UX and development teams are commonly valued, close collaboration between those teams is recommended (Brhel et al. 2015, da Silva et al. 2011). Fox et al. (2008) present the concepts of specialist, generalist, and hybrid approaches in project staffing. The specialist approach values explicit competence areas with separate UX and software specialists (Fox et al. 2008). There is no UX specialist involved in the generalist approach, and developers conduct all the work by themselves, while in the hybrid approach there are people (or a person) knowledgeable in both disciplines in the team (Fox et al. 2008). The specialist approach is evidently the most commonly studied one (da Silva et al. 2011), although there are suggestions that UX designers should also be knowledgeable to some extent in software development (e.g., Boivie et al. 2006). Moreover, as UX resources in companies are often scarce (e.g., Wale-Kolade 2015), we assume that a large proportion of projects are still conducted with no contribution from UX specialists, as has been the case previously (Seffah et al. 2004, Vukelja et al. 2007). Ungar (2008) and Leszek et al. (2008) suggest approaches for acknowledging the developer role in UX design work. Ungar (2008) includes developers in design work together with users and UX specialists in the Design Studio method. A Design Studio is a one-day workshop for creating a design concept together with developers, stakeholders, and UX designers. Benefits of the method include improved knowledge transfer, fostering of shared understanding of the design, and providing a rapid way to produce a coherent and agreed-on concept (Ungar 2008). Leszek et al. (2008) support developers with a concept of Office Hours, in which each development team can consult a UX specialist periodically for an hour or two at a time. Leszek et al. (2008) target their approach particularly at situations where UX specialist resources are scarce.
U-SCRUM (Singh 2008) is a people-oriented approach for integrating UX work with agile development (Figure 6). It brings a second product owner (PO), responsible for UX issues, into the project team. A UX vision is created at the beginning of the project, and UX issues are emphasized in the product backlog (Singh 2008). Thus, Singh’s approach operates on the product level, concentrating on planning and the product vision.
Figure 6 U-SCRUM model with the role of a UX-PO. Adapted from (Singh 2008).

Finally, user involvement is an evident people factor and a core principle of HCD. User involvement is also commonly recommended in agile software development, e.g., (Chamberlain et al. 2006, Cockburn et al. 2001). In the related research, it is most often addressed through system evaluation, and usability testing in particular (Brhel et al. 2015). Another commonly addressed place for direct user involvement is during user studies, in order to elicit user needs and requirements (Brhel et al. 2015). However, we could not identify studies considering how user involvement is organized in agile methodologies.
3.5 Supporting Technological Factors
Technological means to support the agile UX integration include tools for coordinating activities and communication. Feiner et al. (2012) introduce a tool for reporting user evaluation results that allows easy import to an issue tracking system. Gonçalves et al. (2011) propose software for building, documenting, and testing low-fidelity prototypes. Hosseini-Khayat et al. (2010) developed a tool that allows remote usability tests on low-fidelity prototypes. Finally, Peixoto et al. (2009) present a conceptual model of a knowledge base and expert system for guiding developers during UI design with the help of user profiles and HCI guidelines. Brhel et al. (2015) conclude that research concerning technological integration is still at an early stage.
3.6 Gap in Research
Agile UX is an emerging research area, and the number of publications is growing rapidly each year (Brhel et al. 2015). Most of the research has addressed the utilization of usability practices in agile development (Brhel et al. 2015). Although there are several studies considering agile UX processes and structuring the work, the area lacks empirical studies (Brhel et al. 2015). Current process models, for instance, Sy (2007) and da Silva (2012), organize agile UX work with a separate design upfront phase and by separating development and UX activities into their own streamlines. Such an approach introduces many problems in development work: developers need to plan their work one to two iterations in advance, and UX specialists test code from the previous cycle while developers already build the next increment on that code. These practices decrease agility. There is evidence that a cross-functional team is more efficient than the ‘one iteration ahead’ approach (Ferreira et al. 2011). However, there are no known models for organizing agile UX work in such a mode. Research reports several challenges in organizing and conducting UX work alongside agile software development. Process-oriented challenges include, for instance, organizing and sizing the design upfront work and chunking the UX design work for agile iterations (Salah et al. 2014). There has been a recent increase in the number of studies considering people-related issues (Brhel et al. 2015). However, understanding the work dynamics between software developers and UX specialists requires more research (Salah et al. 2014), as agile methodologies do not give instructions concerning the UX specialist role or UX work in general. Finally, technological factors related to agile UX integration are the least studied area. This thesis aims at an increased understanding of goals, tasks, and challenges in agile UX work.
Moreover, it aims at structuring a novel approach to support the integration of UX work and agile development that is expected to mitigate challenges related to current integration models. In addition, the thesis increases understanding of work dynamics between software developers and UX specialists. Finally, our focus is on enterprise software development, which is a less studied area.
4. Research Approach, Methods, and Process
The research for the thesis consisted of three rounds of empirical studies (Table 7) and a systematic literature review. Companies are presented in Table 9.

Table 7 Summary of Research Questions and Publications per Study.

Study | Participated companies | RQs | Publications
Study I | A, B, C | RQ1, RQ2, RQ3 | P1, P2, P3, P4
Study II | B, D, E, F, G | RQ2, RQ3 | P5, P6, P7, P8
Study III | B, D, E, F, G | RQ2, RQ3 | P9
Study I concentrated on gaining an understanding of current tasks and challenges in agile UX work at the company level (research questions RQ1 and RQ2). We conducted explorative case studies (as instructed by Runeson et al. (2009) and Yin (2003 b)) on agile UX in three software-intensive companies. In Study II, our goal was to understand roles and tasks related to UX work at the project level (RQ2 and RQ3). Thus, we conducted more structured case studies in which we surveyed eight agile software development projects from five companies over a release cycle. Lastly, in Study III, we compared users’ and agile development team members’ views of UX in a quantitative study with 26 participants from six agile projects from five companies and 29 users of the software developed in those projects. We studied the developers’ ability to predict the end-user UX of the developed software to support the task allocation between developers and UX specialists. Thus, the focus of the third round was on stakeholder roles. To guide and complement those empirical studies, we conducted a systematic literature review following the procedure of Kitchenham et al. (2007) (a guide for performing systematic literature reviews in software engineering) (RQ1, RQ2, and RQ3). In addition to the empirical research conducted in the three studies, we built a framework for agile UX work utilizing the “building theories from case study research” approach as instructed by Eisenhardt (1989).
4.1 Research Approach
The research problem considered the practical application of software development methodologies – namely agile development and UX development – described in the literature. Our research goal was to make recommendations on how to integrate UX work as part of agile software development. The explicit research questions are as follows:

RQ1. What are the current challenges in conducting UX work in projects following agile methodologies?
RQ2. What are the goals and tasks of agile UX work in enterprise software development?
RQ3. Which activities support the integration of agile development and UX work?

Here we address the background of selecting the research approach. The researcher had very limited ability to control or affect the studied phenomenon of industrial software development work. The phenomenon was organizational – considering human beings in their abstract and often intangible specialist work in software development. Measuring the performance of development work is complicated, and in practice there are no established metrics available (e.g., Aaltonen et al. 2012 p. 73-76). In general, several factors affecting or describing development work productivity, such as customer satisfaction, quality of interaction, or worker satisfaction and motivation, are intangible and qualitative (Aaltonen et al. 2012 p. 74). In addition, challenges related to the phenomenon studied in this doctoral research were rather unknown in the beginning and therefore difficult to operationalize and quantify. Earlier research had introduced some models and principles for agile UX work, such as (Sy 2007). However, they were largely based on single cases and lessons learned in companies, and thus their adequacy in explaining the phenomenon was unclear. In addition, theory testing would have been unfeasible because the candidate had in practice no control over the phenomenon. Considering these boundaries, we chose an inductive research approach (Figure 7).
Figure 7 Inductive research process.

Our research goal was to make recommendations on how to integrate UX work as part of agile software development, and we expected the result to be a construction that would be both empirically based and theory-dependent: to be generalizable, the structure should respect both agile methodologies and HCD principles (theory-dependent). In addition, the structure should be empirically based, since current agile methodologies did not guide agile UX work and companies were experiencing problems in integrating UX work with software development activities. Moreover, the richness of the empirical context is related to the generalizability of the results. We started with the idea of assessing new, improved practices in the studied companies. However, after the first round of explorative case studies, it became evident that we could not get such data from an adequate number of companies, as we failed to find committed companies. Thus, we selected Eisenhardt’s (1989) process for building theories from case study research (Table 8) to guide the construction of a framework for agile UX work. It is a “research strategy that involves using one or more cases to create theoretical constructs, propositions, and/or midrange theory from case-based, empirical evidence” (Eisenhardt et al. 2007). The theory-building process builds on recursive cycles of empirical data, emerging theory, and comparison with literature (Eisenhardt et al. 2007). Eisenhardt et al. (2007) claim that since the theory-building approach is deeply embedded in rich empirical data, utilizing the methodology “is likely to produce theory that is accurate, interesting, and testable.” Building theories from case studies is an established research methodology in organizational research (Eisenhardt et al. 2007). The case study is a commonly used research approach in organizational research (Yin 2003 a), including software engineering research (Runeson et al. 2009). According to Yin (2003 b, p. 13), a case study is “an empirical inquiry that investigates a contemporary phenomenon within its real-life context, especially when the boundaries between phenomenon and context are not clearly evident.” Previous research guides data collection and analysis in case studies, and both multiple sources of evidence and research methods are utilized (Yin 2003 b). The selected methodology allowed us to examine the phenomenon of agile UX from several perspectives. First, we could study the phenomenon on different organizational levels, including the company, project, team, and individual persons. Second, we could investigate both the process and its outcome, i.e., software development and the software product, respectively. Third, it enabled us to use methodological triangulation. A large proportion of studies of software development are qualitative due to its complex nature. Starting with qualitative surveys served as our base for more detailed quantitative studies. Finally, the building theories from case studies strategy guided us in building a plausible framework based on our results without the requirement to actually utilize the framework in practice.

Table 8. Process of building theory from case study research (Eisenhardt 1989).

Getting started: Definition of research question
Selecting cases: Theoretical, not random sampling
Crafting instruments and protocols: Multiple data collection methods, qualitative and quantitative
Entering the field: Overlap data collection and analysis
Analyzing data: Within-case analysis; cross-case pattern search using divergent techniques
Shaping hypotheses: Iterative tabulation of evidence; replication; search evidence for “why” behind relationships
Enfolding literature: Comparison with literature
Reaching closure: Theoretical saturation when possible
4.2 Studied Companies and Research Participants
Altogether, we conducted 75 interviews and received 519 survey responses from seven companies. Some participants were interviewed or surveyed more than once. Since we did not identify the respondents of the first-round surveys, and one of the companies participated in the first three rounds of research, we do not know the number of individual participants. Table 9 presents the participating companies, their software development practices, and their main products. Companies B and D were large with around 20 000 employees, and Companies A and C were medium-sized with 1000 to 2000 employees. The rest of the companies had 100 to 500 employees. Companies A, C, and D had their own products, Companies B, E, and F were IT service providers, and Company G had a hybrid business model with both own products and consulting services. All the companies were developing mainly enterprise software or tools for professional use. Companies C, D, and G developed both hardware and software. Table 10 presents an overview of the participant population per Study in the doctoral research. Participants and sampling methods are introduced in more detail in the included Publications. Table 10 names the surveys and interview guides used, which are included in the appendices. Interviews 2 followed the Survey 2 structure, and thus there is no separate guide for those. The layout of the surveys in the appendices does not correspond to the original layout. In addition, we utilized background information surveys that we did not include in the appendices. Those contained questions on demographic information such as age, gender, level and field of education, and working experience in total and in software development, UX work, and with agile methodologies.

Table 9. Description of studied companies.

Company A
Ways of Working: Around 1000 employees worldwide. Utilizes Scrum. Had a centralized UX team with 15 UX specialists and a few distributed ones.
Product Base: Specialized software systems and tools for both business and consumer users. Own products for several platforms.

Company B
Ways of Working: An IT service company with around 20 000 employees worldwide. Utilizes mainly customer-defined processes. Has a centralized UX team and distributed specialists.
Product Base: Mainly customer-ordered business-to-business services. Also some in-house development. Both large IT systems and mobile applications.

Company C
Ways of Working: An engineering company with around 2000 employees worldwide. Utilizes several practices such as Lean, Kanban, and Scrum. No UX team.
Product Base: Embedded systems for specialist use.

Company D
Ways of Working: An engineering and technology company with around 20 000 employees worldwide. Utilizes both waterfall and Scrum practices. Several small distributed UX teams and specialists.
Product Base: Large industrial safety-critical real-time systems consisting of hardware and software.

Company E
Ways of Working: An IT service company with 100-500 employees in Finland. UX specialists working in project teams.
Product Base: Customer-ordered business-to-business solutions and services, both desktop and mobile.

Company F
Ways of Working: An IT service company with 100-500 employees in Europe. Utilizes Scrum. A centralized UX team in one site and distributed specialists in others.
Product Base: Customer projects; mobile and online software.

Company G
Ways of Working: A mobile technology company with 100-500 employees worldwide. Utilizes agile practices and customer processes. A centralized UX team.
Product Base: Mobile solutions, wireless technologies, industrial systems.
Table 10. Summary of methods and participants per Study (Legend: Arc = Architect, Dev = Developer, PO = Product Owner, SM = Scrum Master, UXS = User experience specialist).

Study I
Survey 1 (Appendix 3). Roles: Dev 39.2%, Manager 25.9%, UXS 10.5%, PO/SM 10.5%, other 7.0%, unknown 8.4%. Countries: Finland 58.0%, other 25.9% (mainly France, Sweden, Czech Republic, and Malaysia), unknown 16.1%. N = 143.
Interviews 1 (Appendix 4). Roles: PO, SM, UXS, Dev, Arc, Tester, Manager, Customer. Countries: Finland 95.2%, Sweden 2.4%, China 2.4%. N = 50.
Survey 2 (Appendix 5). Roles: UXS, Dev, Arc, PO. Countries: Finland 100%. N = 8.
Interviews 2. Roles: UXS, Dev, Arc, PO. Countries: Finland 100%. N = 7.

Study II
Pilot interviews (Appendix 8). Roles: UXS, Arc, Dev. Countries: Finland 50.0%, China 25.0%, Belarus 12.5%, India 12.5%. N = 8.
Pilot survey (Appendices 6, 7). Roles: Dev, Tester, SM, UXS, PO. Countries: Finland 36.8%, China 26.3%, Belarus 26.3%, India 10.5%. N = 19.
Weekly survey (Appendices 9, 10, 11). Roles: Dev, UXS, PO. Countries: Finland 45.2%, Russia 25.8%, China 22.6%, Latvia 3.2%, Estonia 3.2%. N = 31.
Retrospect survey (Appendices 12, 13, 14). Roles: Dev, UXS, PO. Countries: Finland 53.8%, Russia 30.8%, China 26.9%. N = 26.
Interviews (Appendix 15). Roles: UXS, PO. Countries: Finland 100%. N = 6.

Study III
Team survey (Appendix 16). Roles: Dev, UXS, PO. Countries: Finland 73.1%, Russia 19.2%, Latvia 3.8%, China 3.8%. N = 26.
User survey (Appendix 17). Roles: User. Countries: majority from Finland. N = 29.
In the following, we introduce the sample per Study. A more detailed introduction to the research methods and participants can be found in the associated Publications.

Study I: We selected the study objects within companies participating in the Cloud Software program (http://www.cloudsoftwareprogram.org/) for convenience reasons; the first research round was quite heavy for the participants (for instance, it included several lengthy interviews), and those companies were committed to participating in academic research. We also aimed at conducting follow-up studies with companies and managed to do so with one company. We conducted Study I in three companies (Companies A, B, and C; the companies are presented in Table 9) with altogether 151 survey participants and 57 interviewees. We give a detailed description of the participants and their selection process, including sampling methods, in Publications P1, P2, P3, and P4.

Study II: As we wanted to continue studying the phenomena determined in Study I by utilizing Eisenhardt’s (1989) approach, we utilized theoretical sampling when selecting the participating projects. In total, we studied eight projects from five companies (Companies B, D, E, F, and G). We interviewed 14 persons and surveyed 50 with a weekly repeated survey or a retrospect survey. Two of the companies were participants of the Cloud Software program; the remaining three we acquired elsewhere. All the companies were Finland-based. Since we followed each project through several weeks, the willingness of the whole team to participate was one practical limitation when selecting projects. We describe the selection process in more detail in Publication P7.

Study III: We conducted the third round of research with six projects participating in Study II, including their users (the total number of participants in Study III was 55). This allowed us to compare the data between Studies II and III, including ways of working in the projects and the quality of the outcome as user evaluations of the software. Publication P9 reports the results of Study III.

Theory building: We constructed a framework called BoB for integrating UX work with agile practices based on the results from Studies I, II, and III. We based the framework on the findings of challenging issues and important aspects of agile UX work and on our suggestions regarding the task allocation of UX work. Publication P8 presents contributions towards the framework. The final framework is presented in this thesis in Chapter 5.
4.3 Research Methods
We selected research methods amongst widely used ones from the fields of software engineering, management, and human-computer interaction, since the research topic was a cross-section of those areas. The main empirical data gathering methods included web surveys and interviews (Table 10). Surveys and interview guides are listed in Appendices 3-17. In addition, we reviewed organizational documentation when possible. We also planned observation studies. However, they were difficult to arrange in global software development. Instead, we used a repeated web survey to gather data on weekly practices in software projects in Study II. The main data analysis methods in Study I included qualitative content analysis, such as the affinity wall method as instructed by Beyer and Holtzblatt (1998, pp. 153-163), and a theory-bound analysis method using an emergent coding approach as described by Lazar et al. (2010, pp. 285-301). When feasible, we quantified qualitative data to enable statistical analysis. Quantitative data analysis methods in Study I included basic cross-tabulations and frequencies of occurrence; statistical hypothesis testing was also utilized. We gathered mainly quantitative data in Study II and Study III. In Study II, we utilized mainly descriptive statistics and correlation analysis. In addition, we utilized qualitative data to interpret statistical results. In contrast, in Study III we utilized statistical analysis methods such as tests of equality and difference to compare between data sets and principal component analysis in order to reveal hidden structures in the data. We utilized parametric tests when the data was normally distributed and non-parametric tests when the data was non-normal. We describe the research methods in detail in the included Publications.
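The test-selection rule described above (parametric tests for normally distributed data, non-parametric tests otherwise) can be sketched as follows. This is an illustration on synthetic data using SciPy, not the analysis code used in the thesis; the group names are hypothetical.

```python
# Illustrative sketch with synthetic data: choosing between a
# parametric and a non-parametric two-sample test based on a
# normality check. Not the original thesis analysis code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
ratings_a = rng.normal(loc=3.5, scale=0.8, size=30)  # hypothetical group A
ratings_b = rng.normal(loc=3.1, scale=0.9, size=30)  # hypothetical group B

def compare_groups(a, b, alpha=0.05):
    """Run an independent t-test if both samples pass a Shapiro-Wilk
    normality check, otherwise fall back to the Mann-Whitney U test."""
    _, p_a = stats.shapiro(a)
    _, p_b = stats.shapiro(b)
    if p_a > alpha and p_b > alpha:
        name = "t-test"
        _, p = stats.ttest_ind(a, b)
    else:
        name = "Mann-Whitney U"
        _, p = stats.mannwhitneyu(a, b, alternative="two-sided")
    return name, p

name, p = compare_groups(ratings_a, ratings_b)
print(f"{name}: p = {p:.3f}")
```

The normality check is applied to both samples so that a single non-normal group is enough to trigger the non-parametric fallback.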
4.4 Research Ethics
The research was conducted following the guidelines of the Finnish Advisory Board on Research Integrity (Tenk website). We utilized the Finnish version of the guidelines published in 2002. Since the 2012 update (Tenk 2012) was published, we have conducted our research according to it. Thus, in general, we strived for integrity, meticulousness, and accuracy in conducting research and in processing the research results. Considering the ethical treatment of respondents, we applied the following. We started all our surveys with informed consent statements (Appendix 2) and utilized informed consent forms in interviews (Appendix 1). In the informed consents, we introduced ourselves, our research group, and the funding sources to the participants. We explained the purpose of the study and the expected duration of the interview session or the web survey. We described how we would maintain confidentiality or anonymity. We also mentioned if we had signed a non-disclosure agreement with the company the interviewee was working for. Each face-to-face interviewee signed an informed consent form. In remote interviews, we emailed or showed the informed consent form to the interviewee and discussed it with them. We first asked their consent to record the remote session, and after they agreed, we started the recording and asked again whether they agreed to the interview and to the recording. Moreover, we told each interviewee that participation was voluntary and that they could refuse to respond or refuse to continue at any moment. We analyzed the data confidentially, utilizing respondent codes attached to each participant in the informed consent form. As we transcribed the interview material or pre-treated the web survey data, we removed company and person names and referred to them with company and person codes or job titles.
4.5 Research Process and Schedule
We started the research with research questions RQ1 and RQ2 in Study I (Figure 8). The design of Study II was based on a systematic review on UX practices and on the findings of Study I; Study I was completely explorative, aiming to reveal unknown issues to enable Study II. We designed Study II to enable utilizing the approach of building theories from case studies by Eisenhardt (1989). First, we piloted the research idea for Study II in Publication P5, and then iterated the Study II design based on our experiences. We started building the framework (theory building) in Publications P4, P8, and P9. We built the majority of the framework while compiling the thesis.
Figure 8 Research process.

Figure 9 illustrates the schedule of the main activities of the doctoral research. We conducted explorative case studies for Study I between February 2011 and June 2012. We started Study II in February 2012 by piloting the research idea in two software development projects. We conducted the systematic literature review between October 2012 and January 2013 and complemented it during fall 2013 and summer 2014. We continued Study II with the main data gathering activities in six software development projects after we had iterated the Study II design based on the pilot study and systematic review findings. Study III finalized the data gathering for the doctoral research between June 2013 and March 2014. We continued the analysis of Studies II and III for the thesis in 2014. The majority of the introductory part of the thesis was written during April and May 2015.
Figure 9 Schedule of the main activities of the doctoral research.
4.6 Answering the Research Questions
The answer to RQ1 is based on the following research: In Study I, we surveyed participants from Companies A and B about the most challenging issues in agile UX. We utilized the Study I interviews in Companies A, B, and C to explain the survey findings. For instance, we asked the interviewed architects to describe problems that arise when architecture and UX work are separated. In addition, in the retrospect survey of Study II, we asked about the importance of and the projects’ performance in those UX issues found challenging in Study I.

The answer to RQ2 is based on the following research: We asked the participants of Study I about the most important tasks of the UX team. We iterated this list first based on the Study I interviews and then with two architects and two UX specialists from Company A, and a UX specialist and a PO from Company B, to cover the majority of UX specialists’ tasks. Later, we iterated the list based on the Study II pilot study results to cover the UX tasks of all the roles, including UX specialists, developers, and the PO. Then, we utilized the iterated list in the Study II pilot survey to gain deeper understanding of how those UX tasks are conducted in practice and which goals they serve.

Our answer to RQ3 is twofold. First, we address the question in terms of supporting factors of agile UX work, and then we build a framework on our results. We base the answer to RQ3 on all our studies, mainly on Study II. Thus, we utilize all our results and related research and use the building theories from case studies strategy (Eisenhardt 1989) to address the RQ.
4.7 Research Validity
Study quality relates to “the extent to which the study minimizes bias and maximizes internal and external validity” (Kitchenham 2007, ref. Cochrane Collaboration 2003, Khan et al. 2001). Validity refers to the trustworthiness of the research (Runeson et al. 2009). Kitchenham et al. (2007) define bias as follows: “a tendency to produce results that depart systematically from the ‘true’ results.” Both Yin (2003 b) and Runeson et al. (2009) present four aspects of validity that need to be considered in case studies, namely construct validity, internal validity, external validity, and reliability. Kitchenham et al. (2007) list four types of bias that they have adapted from medicine and amended for use in software engineering research: selection, performance, measurement, and attrition bias. Table 11 presents the sources and the estimated impact of each type of bias in the doctoral research. The definitions given in the table correspond to those in Kitchenham et al. (2007), but we use more simplified ones here. Next, we discuss each bias type separately and our means to minimize their impact.

Selection bias. All the participating companies were Finland-based. Also, the majority of participants were from Finland. The majority of the research was conducted in companies that participated in the Cloud Software program. We consider all projects that participated in Studies II and III successful; although companies want to improve their ways of working, they seem to tend to allow research only on projects that are more functional. Considering these limitations, it is evident that selection bias has influenced the research. We utilized theoretical sampling to decrease the impact of this bias type. We deliberately concentrated on medium-sized and large software-intensive organizations that were developing mainly enterprise software or software tools. In addition, we had participants from globally distributed teams. However, the research would have benefitted from a wider geographical distribution of participant companies.

Table 11. Sources and impact of different types of bias in the doctoral research. *Definitions are from http://www.dictionarycentral.com/

Selection bias
Definition of source*: Utilizing a non-random sampling method.
Main sources in the doctoral research: Omission and inclusion: using volunteers, a geographically limited sample.
Level of impact in the doctoral research: High

Performance bias
Definition of source*: Not blinding treatment and control groups.
Main sources in the doctoral research: Participants and researchers were aware of what was being measured and that the goal was to improve the current organizational situation.
Level of impact in the doctoral research: Intermediate

Measurement bias
Definition of source*: Inaccuracy in measurements, coding or classification in trials.
Main sources in the doctoral research: No major known sources. Measurements were carefully designed, piloted, and analyzed.
Level of impact in the doctoral research: Low

Attrition bias
Definition of source*: Caused by non-response or withdrawal.
Main sources in the doctoral research: Persons not interested in or against UX work might not have participated as often as persons interested in the topic.
Level of impact in the doctoral research: Intermediate
Performance bias. Some participants might have tried to please the researcher and (unconsciously) respond in the way they thought was expected. We assume this has occurred to some extent. However, we guided the respondents to respond truthfully and emphasized that individual responses would be available only to the research group and not given to, for example, the organization or their supervisor. The candidate did not prefer any approach to agile UX, nor did she assume that UX work is a necessity in agile software development.

Measurement bias. This type of bias was controlled best in our study. Each Study was deliberately designed and piloted with respondents of the target population. A few persons from the target organizations evaluated the survey questions in Study I. In Studies II and III, we discussed question framing with other researchers. In all the Studies, we based the questions on previous research. We also utilized commonly used analysis methods in each Study, following the procedures carefully.

Attrition bias. The response rate in web surveys is generally quite low. We assume that in our studies respondents might have different opinions compared to non-respondents. In Study I, response rates were typical, less than 20%. In Study III, the response rate in surveys for development team members was 65%, and in Study II it was 81.6%. We argue that attrition bias affected our survey studies to a similar degree as other organizational web survey studies, and in Study II and Study III it might even be smaller than average. Considering the interview studies, we deliberately selected interviewee types to represent all target groups. We also had interviewees who were against UX work or considered it futile. Thus, we believe we managed to gain responses also from persons opposed to the topic. However, we agree that attrition bias has affected our study.

Triangulation is a common means of pursuing validity, especially in qualitative research (Denzin 2010). Denzin (2009) divides it into data, investigator, theory, and methodological triangulation. We utilized all these types of triangulation in the doctoral research (Table 12).

Table 12. Utilization of triangulation in Studies I-III.

Data: Study I: Yes. Study II: Yes. Study III: Yes.
Investigator: Study I: Yes. Study II: No. Study III: Partially.
Theory: Study I: Yes. Study II: Yes. Study III: Yes.
Methodological: Study I: Yes. Study II: Yes. Study III: No.
Data triangulation. We had multiple data sources in each Study. We collected data from several companies and projects and from people working in different roles related to software development. We altered our focus in each Study: we started at the organizational level, continued to the project level, and finally to the level of individuals’ roles. We also examined both the process of making the software and the outcome (the developed software).

Investigator triangulation. The candidate designed and conducted Study I with another researcher, and multiple researchers assisted in the analysis. In addition, the candidate designed the analysis of Study III with another researcher. Another researcher revised the quantitative results of Study II. Two researchers reviewed the systematic review protocol. The candidate solely designed, conducted, and analyzed the rest of the research, which decreases the confirmability of those results.

Theory triangulation. The research combined different disciplines and attempted to integrate two approaches originating from separate disciplines. Those issues inherently led to theory triangulation. We designed and analyzed the studies in terms of both HCI and software engineering. We reviewed the results and planned improvement actions in each company with people whose roles and backgrounds were in different disciplines, such as software engineering, social sciences, and management.

Methodological triangulation. We used several methods in data gathering in Studies I and II, web surveys and interviews in particular. In Study III, we utilized only web surveys that were mainly quantitative. Altogether, we utilized both qualitative and quantitative methods as well as a systematic review in the doctoral research.
5. Results
This chapter presents the results of the doctoral research. We start by introducing the contributions of each of the included Publications and continue by summarizing the focal results per research question. First, we address research questions RQ1 and RQ2. Then we summarize results that were common to both RQ1 and RQ2, i.e., issues that are both challenging and important. As RQ3 is broad, we first introduce the results on which we build the framework, and finally, we present the constructed framework that structures agile UX work.
5.1 Summary of Contributions per Publication
The thesis contributes towards structuring agile UX work in a way that supports desired UX in a timely manner with expected quality. Next, we present the core research output and give a brief discussion of the results per Publication. Publications P1 to P4 present results of Study I and address research questions RQ1 and RQ2. First, Publications P1, P2, and P3 present ways of working, challenges and good practices in agile UX work, and UX tasks in two companies as follows:

P1. Publication P1 presents the results of a survey and interview study conducted in Company A considering the practices and challenges in agile UX work. The research output of P1 includes a list of challenges in agile UX and a list of possible solutions. In addition, we present a list of identified UX tasks. The findings indicate that organizational aspects can greatly hinder agile UX work. We conclude that the organizational structure needs to support agile UX work before the impact of individual practices can become meaningful. Thus, improving agile UX practices in software development would not significantly improve agile UX work unless the organizational structure sufficiently supports the work.

P2. Publication P2 presents the results of a survey and interview study conducted in Company B considering the practices and challenges in agile UX development. The research output of P2 includes a description of good and bad practices in agile UX work. Our perspective is on management and sales. The findings of P2 are in line with those of P1. Management and sales strategies concerning software development work decreased the ability to conduct impactful UX work in the company; problems related to UX resources and the timing and amount of UX work were common. Thus, those strategies should be in line with UX strategies; i.e., if the UX of the developed software is important, it should be visible in management and sales strategies as well.

P3. Publication P3 describes the results of a three-year follow-up study in Company A. The company had rearranged its business lines to better support UX work. P3 describes the actions taken in Company A and how the company representatives experienced the change. The research output includes a description of the actions the company took to improve agile UX work over the three-year period and a report of the experienced impact of those actions. The most significant changes in the company considering UX work included closer collaboration between UX specialists and architects and developers, assigning UX specialists to more impactful roles, and a cultural and attitudinal change in the appreciation of UX work.

P4. Publication P4 combines the findings of the studies introduced in P1 and P2. The results of this cross-case analysis concentrate on challenges in UX work and the identification of UX tasks. The main research outcomes of P4 include a detailed listing of the most challenging issues in Companies A and B. We also summarize the identified reasons for unsuccessful UX work. In addition, we describe the current and desired role and tasks of UX specialists in development projects. We discuss the debate of separate UX and development teams versus an interdisciplinary team with collective accountability. Finally, we provide suggestions on how to improve UX work. The main results include that participants wished for constant contribution from UX specialists throughout the project. The gap between the current and desired state was significant in the companies.
In addition to the contributions of Publications P1-P4, this thesis presents new results from a survey study in which we measured the gap between the importance of the identified challenging tasks to project success and the projects’ performance in them, in order to validate the Study I findings considering research questions RQ1 and RQ2. P6 presents part of the survey results. Publications P5, P6, and P7 present results from Study II and contribute towards research questions RQ2 and RQ3. They discuss the task allocation between UX specialists, POs, and software developers in agile projects.

P5. Publication P5 reports results from a pilot study examining the task allocation between UX specialists and other contributors in agile projects. It utilizes the UX task list identified in Study I. The contributions include preliminary understanding of possible task allocation in multi-site development with inexperienced developers. The main results include increased understanding of the tasks UX specialists actually conduct in their weekly duties and with whom they collaborate during those tasks.

P6. Publication P6 compares three projects of a single PO as the PO works towards improving the ways of working in the projects. The contributions include a comparison of different ways of working as experienced by the PO. In addition, P6 contributes towards understanding the UX task allocation between UX specialists and others. Finally, the contributions include an evaluation of the Study I results considering challenging issues in agile UX, in terms of surveying those in the three projects. The main results indicated that it is beneficial to co-locate the UX specialist either with the PO or with the developers (if it is not possible that all are co-located). Pairing the UX specialist with the PO is beneficial in larger projects with an unclear vision or inexperienced developers. In smaller projects, the overhead costs of the approach can grow too large.

P7. Publication P7 presents the results of a study of six projects from five companies based on a research setting similar to the one we utilized in P5, in which we followed development teams over a release cycle with web surveys.
The research outputs include survey statistics of the weekly survey, a classification of collaboration types between the roles, and a description of ways of working when following each of the types. The main results include a classification of collaboration into minimal, PO-UX specialist, and developers-UX specialist types, and a discussion of the characteristics and pros and cons of each collaboration type.

Publication P8 contributes towards research question RQ3 in the form of a framework for integrating UX work into agile development. Publication P9 discusses developers’ ability to predict UX as assessed by users and thus contributes towards data-informed planning of user involvement.

P8. Publication P8 introduces a conceptual model of a process combining Scrum and HCD methodologies in mobile enterprise application development. The research outcomes include a preliminary framework for agile UX in mobile application development and a mapping of activities, goals, activity owners, and participants for organizing the work.

P9. Publication P9 presents results from a study of how project members and users assess the UX of the developed software. The research outcomes include a list of principal components of enterprise software UX and survey statistics of project team members’ ability to predict UX as assessed by end users. The main results include that developers seem to be able to understand the practical dimensions of UX as assessed by end users, but a UX specialist is needed to interpret and understand the hedonic dimensions. Moreover, our results indicate that the UX of enterprise software consists of components related to motivation, productivity, usefulness, and professionalism. Next, we summarize our main results per research question.
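The idea of using principal component analysis to reveal hidden structures in survey data, as in P9, can be illustrated with a minimal sketch. The data below is synthetic and the item structure (three “practical” and three “hedonic” items) is hypothetical, chosen only to show how a few components can capture most of the variance in correlated items; this is not the thesis data or analysis code.

```python
# Minimal PCA illustration on synthetic Likert-style survey data.
# Item structure is hypothetical; not the thesis instruments or data.
import numpy as np

rng = np.random.default_rng(0)
n_respondents = 26

# Two latent factors (e.g., "practical" and "hedonic" quality) driving
# six observed items, plus measurement noise.
latent = rng.normal(size=(n_respondents, 2))
loadings = np.array([[0.9, 0.1], [0.8, 0.2], [0.9, 0.0],   # practical items
                     [0.1, 0.9], [0.2, 0.8], [0.0, 0.9]])  # hedonic items
items = latent @ loadings.T + 0.3 * rng.normal(size=(n_respondents, 6))

# PCA via singular value decomposition of the mean-centered data matrix.
centered = items - items.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
explained = S**2 / np.sum(S**2)  # proportion of variance per component
print("Variance explained by first two components:",
      np.round(explained[:2].sum(), 2))
```

With a genuine two-factor structure, the first two components absorb most of the variance, which is the kind of hidden structure such an analysis is meant to reveal.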
5.2
Challenging Issues in Agile UX Work
The first research question RQ1 addresses challenging issues in agile UX work as follows: What are the current challenges in conducting UX work in projects following agile methodologies?
Publications P1, P2, and P4 address RQ1. Table 13 presents results from the cross-case analysis of Study I data from Companies A and B considering the most challenging issues reported by survey participants in an open-ended question. The list of challenges is adapted from P4 with minor corrections. We categorized the identified challenges according to the dimensions of integration as defined by Brhel et al. (2015), resulting in four process-related, four practice-related, and two people-related challenges (Table 13). In addition, technology-related challenges included, for instance, the means to communicate UX design between UX specialists and developers. There were no agreed-upon means for design communication, which caused problems, for instance, in version control (P1). This challenge arose from the data but was not among the ten most often mentioned ones.

Table 13. Ten most challenging issues in agile UX work in Companies A and B. Adapted from P4. Legend: No. = issue number ordered by the number of respondents who mentioned the issue. % = percentage of respondents who mentioned the issue. The total number of respondents is 102 (60 from Company A and 42 from Company B). (*) Dimension is according to Brhel et al. (2015). WoW = ways of working.

No. | Challenge | % | Dimension* | WoW / Outcome
1.  | Understanding and fulfilling customer and user needs | 29.4 | Practices | WoW
2.  | Maintaining the product vision | 28.4 | Process | Outcome
3.  | Coordinating different work practices of UX specialists and developers | 27.5 | Practices | WoW
4.  | Getting participant users and user feedback | 19.6 | Practices | Outcome
5.  | Having enough interdisciplinary cooperation | 17.6 | People | WoW
6.  | Conducting UX work in time | 16.7 | Process | WoW
7.  | Organizing processes and business models to support UX work | 14.7 | Process | WoW
8.  | Welcoming late change | 11.8 | Process | WoW
9.  | Being competent enough in UX | 10.8 | People | WoW
10. | Ensuring the implementation quality | 9.8 | Practices | Outcome
Table 13 also categorizes the identified challenging items under outcome-related and development-related issues. Issues that are mostly related to the ways of working (WoW) include the following: coordinating work practices between developers and UX specialists, cooperation between disciplines, business and process models, timing of work, welcoming change, and team competence (items 3, 5, 6, 7, 8, and 9, respectively). The rest of the items are related more to the actual outcome, i.e. the software under development: understanding and fulfilling user needs (in the outcome), maintaining the product vision, getting user feedback (of the outcome), and ensuring the quality of the outcome (items 1, 2, 4, and 10). Thus, ways of working should support activities that contribute towards an outcome with the desired quality and scope. Next, we present common causes and consequences of the presence of the aforementioned challenges.

Organizational structures (including business and process models and coordination of work) need to support both agile ways of working and UX work. If only development teams work in an agile fashion while other organizations continue with plan-driven approaches, there will be problems in the cooperation between those organizations. During our studies, it was typical to have only development organizations working along agile methodologies. Having separate organizations or teams for developers and UX specialists seems to hinder agile adoption in UX work. (P1, P2, P3, P4)

Managing the product vision is challenging in agile UX work since in agile methodologies the big picture of the project should be clarified during the course of development instead of in an upfront design phase. This is particularly challenging for UX development due to its holistic nature, and because there are no established procedures to guide agile UX work in companies. For the same reasons, planning UX work in increments of a size and duration adaptable to agile methodologies (design chunking) is challenging. (P1, P2, P4)

Responsibilities and schedules considering UX work should be agreed on. This was rarely the case in our research sample. Unclear responsibilities and schedules make maintaining the product vision more difficult.
In addition, timing of UX work is not straightforward when UX designers and developers are working according to separate schedules. This can also lead to situations where the UX work becomes a bottleneck. Thus, in order to keep to the schedule, POs tend to make rushed decisions considering UX work, which can have an unpredictable impact on the user experience. Such decisions include, for instance, dropping UX work arbitrarily in a hurry. (P1, P2, P4)
Getting user feedback was considered difficult. The number of users is often smaller in enterprise software than in consumer software. Users are often busy specialists, and involving them in the development is challenging. Moreover, understanding the specialist users' needs and the context of use were considered challenging. (P1, P2, P4)

There were issues in communication between UX specialists and developers. Various artifacts and media were used in UX design communication. As we started the research, email threads including static UI images were a common way to communicate UX or UI design. UI design version control was lacking. Tools related to UI design communication seemed to evolve during the course of the research, and towards the end of the research period we observed design being communicated via backlog management tools and in other more systematic ways. Another source of problems in communication between UX specialists and developers was having them in separate teams: cooperation in general was then reduced, which seemed to have an impact on developers' willingness to commit to UX work. In addition, when UX design was communicated to developers rather ready-made, developers felt dictated to, and the design was considered less implementable as such. (P1, P3, P7)

Lacking communication can lead to double work (P7): for example, when communication was scarce, we observed that both the UX specialist and the developers clarified user requirements and made UI designs separately. Developers also often made changes to the UI design and decided by themselves how to implement UI design details without informing the UX specialist. These practices led to differences in the conception of user requirements and weakened the implementation quality as the implementation diverged from what the UX specialist had designed. (P7)
5.3
Tasks and Goals of Agile UX Work
The second research question RQ2 is as follows: What are the tasks and goals of agile UX work in enterprise software development?

We formed a preliminary task list based on the tasks of UX teams that participants of the Study I survey considered the most important (Figure 10). Thus, the original list considered the most important tasks of UX teams. The list was further iterated after the Study I interviews, and again after both the Study II pilot and the actual survey. This work yielded the list presented in Table 14. The list still excludes tasks such as creating graphic design and writing label texts. In the participating companies these tasks were usually handled by external graphic designers and copywriters. We concentrated on those tasks that were conducted in the core project team or between an external UX specialist and the team, and thus we exclude these tasks from our scope although they are important contributions in the development process.

[Figure 10: bar chart of the number of mentions (scale 0 to 35) per task: Create and communicate UX design; Help development, deliver ideas; Conduct studies, understand user; Understand vision, big picture; Create guidelines, share information; Ensure implementation quality; Conduct user tests; Cooperate with stakeholders.]

Figure 10 Most important tasks of the UX team. Cross-case analysis of the Study I survey. N = 117 (66 respondents from Company A and 51 from Company B).

When creating enterprise software, agile UX development consists of creating and maintaining a UX vision (V), creating a high-level concept (C) to communicate the early vision, designing (D) and implementing (I) the user interaction and visual style, and evaluating (E) the UX of the outcome. Evaluation can be conducted on the vision, the concept or the working software. (P8) We call the aforementioned concepts (C, D, E, I, and V) activity goals. We categorize the identified tasks based on their primary goal under these activities in Table 14. Thus, tasks are conducted to perform activities.

Considering the UX specialist's role, it was preferred that a UX specialist would be involved from early on and throughout the project. Participation in design and vision work was especially desired from UX specialists. When there was no UX specialist available, the team desired contribution from a UX specialist most for conducting user studies, creating UI design, clarifying user definitions and target user groups, and reviewing the implementation. Moreover, it was emphasized that UX specialists should be part of agile teams instead of working from outside. (P1, P2, P4, P7)
Table 14 Identified UX tasks in agile enterprise software development (modified from P7). Legend: Cat. = Category, C = concept, D = design, E = evaluation, I = implementation, V = vision.

Task | Description | Cat.
Creating concepts | Designing and sketching early ideas | C
Clarifying user requirements | Gathering and interpreting user requirements | V
Clarifying user definitions or target user groups | Defining who the users and user groups are | V
Planning user data gathering | Planning the user participation, studies and tests | V/E
Conducting user study | Studying users, their behavior and contexts | V
Conducting user testing | Evaluating the system, prototype, concept or idea by testing it on users | E
Creating UX designs | Designing user interaction or flow, making graphic design etc. Design can be of any fidelity | D
Reviewing UX designs | Inspecting and evaluating UX design feasibility | D
Creating architecture designs | Designing the software structure and deciding on the fundamentals of the system | D
Creating or grooming product backlog | Deciding of the scope of the outcome and the order of implementation (Scrum practice) | V
Planning a feature | Planning new or modified features for the release | V
Sharing understanding of the UX design | Discussing the UX design and decisions related to it, communicating the design idea to stakeholders | D
Sharing understanding of the technology | Discussing the technical feasibility and limitations and possibilities related to it | D
Implementing user interaction | Implementing the user interaction and interface | I
Determining how to implement UX design details | Deciding on nuances and details related to the UX design, for instance, when implementing on different platforms | I
Making changes in the UX design | Modifying the UX design, for example, based on feedback or review | D
Reviewing implementation | Checking that the implementation corresponds to the design | I
Having a demo session | Demonstrating the software to customers, users or stakeholders (a Scrum practice) | E
In addition to the UX tasks, we surveyed goals and activities that developers, UX specialists, and POs of Companies B, D, E, F, and G considered important for project success in the retrospect survey of Study II. We derived the list of surveyed UX-related issues from the list of the most challenging issues created in Study I. There is some dissimilarity between the lists because the issue list for Study II was derived from the preliminary results of Study I and because we wanted to increase the unambiguity of the concepts for the Study II survey. Part of the results is presented in P6 and the participants are presented in P7.

Themes related to agile UX work that the participants considered most important for project success included the following: understanding user needs, fulfilling user needs, maintaining the big picture of the project, getting feedback from users, and having a competent team (Table 15). There were some differences in responses between the roles: developers emphasized understanding and fulfilling user needs and maintaining the product vision. POs considered getting user feedback, understanding user needs and maintaining the product vision the most important issues for project success. Finally, UX specialists emphasized understanding user needs, maintaining the product vision and communication between developers and UX specialists.

Table 15 Importance of UX-related issues to project success on a scale from unimportant (0) to important (100). N = 25.

UX-related issue | Mean | Standard deviation
Understanding user needs | 87.2 | 12.1
Fulfilling user needs | 83.2 | 17.7
Maintaining the big picture of the project (product vision) | 82.5 | 13.4
Getting user feedback | 80.8 | 17.5
Competence of the team | 76.1 | 16.1
Cooperation between developers and UX specialists | 73.1 | 18.6
Ensuring the implementation quality | 72.4 | 14.5
Cooperation between the PO and UX specialists | 70.6 | 19.1
Agility of UX work | 70.2 | 14.7
Timing of UX work | 67.8 | 13.2
Welcoming change | 63.4 | 17.0
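For readers who wish to reproduce this kind of summary table from raw ratings, the mean and sample standard deviation can be computed with Python's standard library. The ratings below are hypothetical illustration values, not the thesis data, which are not published in raw form.

```python
from statistics import mean, stdev

# Hypothetical 0-100 importance ratings from eight respondents
# (illustrative only; not the actual survey data).
ratings = [90, 75, 100, 85, 80, 95, 70, 85]

print(round(mean(ratings), 1))   # 85.0
print(round(stdev(ratings), 1))  # 10.0 (sample standard deviation)
```

Note that `stdev` computes the sample standard deviation (dividing by n - 1), which is the usual choice when the respondents are treated as a sample of a larger population.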
To summarize, we divide the goals of agile UX into outcome-related and process-related goals. We identified UX tasks that are conducted to fulfill the following activity goals: creating and maintaining a product vision, creating a concept to communicate the vision, designing and implementing to realize the vision, and evaluating the outcome of any of these activities. These tasks are conducted to deliver a desired outcome, that is, to understand and fulfill the user need, which is the primary outcome-related goal of UX work. The agile methodology, which the project team follows while conducting these activities and tasks, introduces process-related goals, i.e. goals related to the ways of working. These include, for instance, being agile and timely, and working collaboratively.

We conclude that our results considering the tasks and goals of agile UX work provide new insight into the topic. Most of the related research has concentrated on UCD practices and how they are conducted in agile development. In contrast, our research takes a broader perspective. We studied tasks that were considered important and that were actually conducted in agile UX. Moreover, we studied which issues practitioners consider the most focal in agile UX work regarding project success. Thus, the resulting contribution is novel.
5.4
Challenging Tasks and Goals of Agile UX Work
To validate the answer to RQ1, we measured the gap between the importance to project success and the project's performance regarding the goals and activities that developers, UX specialists, and POs considered important for project success in six projects of Companies B, D, E, F, and G. The validation measurement was performed in Study II two years after the original measurement of Study I. Participants (developers, POs and UX specialists) evaluated 1) their project's performance in each theme and 2) the importance of the theme to project success on a fourfold table from poor (0) to excellent (100) and from insignificant (0) to significant (100), respectively. The question was part of the retrospect survey of Study II. We describe the participant population in Publication P7 and part of the results in P6. Table 16 describes the findings.

Participants of Study II considered understanding and fulfilling user needs, maintaining the big picture of the project, and getting user feedback the most important areas for project success. In all these areas, performance was statistically significantly lower than importance: t = -6.09, t = -5.39, t = -8.59, and t = -5.62, respectively, all p < .001. There was no statistically significant difference between performance and importance in the following issues: agility of UX work, project team competence, and welcoming late change. Companies in general were more advanced in utilizing agile methods in 2013 than in 2011, which might have decreased problems related to agile adoption. We discuss this also in Publication P3. We conclude that the majority of the challenging issues found in Study I were challenging also for the projects that participated in Study II. Moreover, participants considered those challenging issues important for project success.

Table 16 Difference between performance and importance in the studied areas in six projects (N = 25). Results of t-test, df = 24 for all pairs. Statistically significant differences are emboldened, p < .01. SD = standard deviation.

Pair | Mean Performance - Importance | SD | T | p
Getting user feedback | -28.40 | 25.24 | -5.63 | .000
Understanding user needs | -26.68 | 21.89 | -6.09 | .000
Maintaining the big picture of the project | -26.48 | 15.41 | -8.59 | .000
Fulfilling user needs | -26.04 | 24.14 | -5.39 | .000
Developer - UX specialist cooperation | -23.88 | 20.44 | -5.84 | .000
Ensuring implementation quality | -23.48 | 21.33 | -5.51 | .000
PO - UX cooperation | -22.00 | 22.97 | -4.69 | .000
Timing of UX work | -18.76 | 25.68 | -3.65 | .001
Agility of UX work | -17.16 | 31.29 | -2.74 | .011
Competence of the team | -8.52 | 17.45 | -2.44 | .022
Welcoming late change | -6.08 | 26.54 | -1.15 | .263
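As a minimal sketch of the analysis behind these figures, the paired-samples t statistic can be recomputed from the reported summary statistics (mean of the per-participant performance-importance differences, their standard deviation, and N). The helper function below is our own illustration, not the original analysis script.

```python
from math import sqrt

def paired_t_from_summary(mean_diff, sd_diff, n):
    """t statistic of a paired-samples t-test (df = n - 1),
    computed from summary statistics of the differences."""
    return mean_diff / (sd_diff / sqrt(n))

# "Welcoming late change": mean difference -6.08, SD 26.54, N = 25
print(round(paired_t_from_summary(-6.08, 26.54, 25), 2))   # -1.15

# "Getting user feedback": mean difference -28.40, SD 25.24, N = 25
print(round(paired_t_from_summary(-28.40, 25.24, 25), 2))  # -5.63
```

Both values match the corresponding t statistics reported above, which is a quick internal consistency check on the tabulated means and standard deviations.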
5.5
Supporting Factors of Agile UX Work
The third research question RQ3 is as follows: Which activities support the integration of agile development and UX work?

Our answer to RQ3 is twofold. We first address our findings considering factors related to integration support in this section. Then, in Section 5.6, we present a framework for agile UX work that enables structuring those factors in order to (i) organize the identified tasks and goals of agile UX work and (ii) diminish the identified challenges of agile UX work. Thus, the activities that support the integration of agile development and UX work consist both of factors that support the integration and of a structuring framework.

To support the integration, process-, task- (or practice-), people-, and tool- (or technology-) related aspects need to be considered (Brhel et al. 2015). We first address aspects related to people and social factors, and to tools that facilitate communication and cooperation. Then we present results related to task allocation regarding agile UX work. We consider that task allocation is comparable with practices as defined by Brhel et al. (2015) since we understand the practice perspective as task-level integration. However, by tasks we do not refer to certain UCD practices but to activities that we identified in our empirical research. In addition, task allocation is closely related to people factors as defined by Brhel et al. (2015) since a large part of the identified tasks are cooperative. Finally, we present process factors related to the integration of agile UX work.

5.5.1
People Factors
We detected the following three collaboration types in agile projects (P7) (Table 17). In the minimal collaboration type, the UX specialist works on design tasks and communicates a rather ready-made design to the developers. There is not necessarily any actual collaboration between the roles. In the PO – UX specialist collaboration type, the UX specialist works closely with the PO, and the work concentrates more on areas relevant to the PO such as the product vision and customer collaboration. UX design can be communicated to developers rather ready-made. In the developers – UX specialist collaboration type, the UX design is created in constant collaboration between the UX designer and the developers. The last type was the most preferred one in Study I (P1, P4).

The aforementioned collaboration types are not distinct. A single project can have characteristics of all of them, and the degree of communication can evolve over time. These collaboration types can be considered archetypes of how the collaboration can be organized in a project. Naturally, cooperation can also be intensive between all the roles. However, there is always an overhead cost involved in communication (P6). Some projects endeavor to minimize such cost by naming the UX specialist as the PO (P4, P7) and then working following the developers – UX specialist collaboration type.

Table 17 Detected collaboration types in Study II (P7). Legend: PO = product owner, UXS = UX specialist, Dev = Developer.

Type | Benefits | Problems
Minimal | UXS can concentrate on UX tasks without disturbance (Ferreira et al. 2010). | Synchronizing work between UXS and developers. Maintaining the big picture of the project. Can lead to unfit design and double work. Spending the UX budget too early.
PO – UXS | UX issues on project level. Helps in maintaining the big picture of the project. | High overhead cost especially in distributed projects. Developers have less impact on the design.
Dev – UXS | Developers have visibility to the reasoning behind design decisions. Discussion over the design is easier. Enables making smarter compromises between design and technical limitations. | Tendency to allocate no or too little time for planning and user studies, risk of piecemeal work, and of compromising UX too much for technical reasons.
Our findings indicate that it is beneficial that all the roles contribute towards the end user UX in collaboration. A distinct UX specialist role that is responsible for almost all the UX work in the project makes the project prone to various obstacles. Moreover, cooperation and knowledge sharing between team members promotes learning: close cooperation between a PO and a UX specialist allows the PO to learn to conduct user studies and gather user feedback (P6). In addition, it is beneficial for developers to learn, for example, UX design principles. Similarly, the UX specialist can learn from the other roles, which can foster the cooperation. Task sharing eases the workload of the often overburdened (or part-time) UX specialist and allows her/him to concentrate on those tasks that require specialism in UX. It also helps in creating and maintaining a shared product vision and in creating a common agreement on responsibilities considering UX work. (P1, P2, P3, P7, P9)

To conclude, UX work is most beneficial for agile projects when a UX specialist is involved from early on and throughout the project. However, involvement alone is not sufficient for successful integration of UX work: UX tasks ought to be integrated via active collaboration of team members. The finding of the need for continuous contribution from a UX specialist is in line with previous studies.
Moreover, related research emphasizes the importance of communication between developers and UX specialists. However, our results differ from most studies in that they strongly support the idea of a single collaborative multidisciplinary team instead of a separation of UX and development work.

5.5.2
Technological Factors
Agile UX integration can be supported with proper tools that speed up the design work or improve communication. Tasks related to UX design and development should be communicated via the backlog similarly to other development tasks. Also, UX design version control should be managed similarly to code version control: the current version needs to be explicitly available for everyone. Thus, design should never be communicated, for example, through email. (P1, P2)

The form of the UX design ought to be considered carefully. For instance, in rapid mobile development, using low-fidelity wireframes can significantly slow down the development, since high-fidelity graphics can be reused as components in the software. In addition, current tools allow creating low- and high-fidelity prototypes and images with similar speed, and high-fidelity images can be more understandable for developers and users. Furthermore, cross-platform compatibility should be considered to avoid rework. For example, responsive web design and HTML prototyping have become popular, and they can be utilized to speed up the design process. (P1, P2, P5)

Our research provides a new contribution on technological factors considering the media and form of communicating the UX design. Previous research considering technical factors has mainly focused on introducing new tools the authors have developed. However, our contribution on technological factors remains shallow, and more research in the area is definitely needed.

5.5.3
Task Allocation between Contributing Roles
Since we determined that it is beneficial to collaborate and share tasks between roles, we studied which tasks can or should be handled by others than the UX specialist and which tasks, on the contrary, require special skills and knowledge in UX.

As it was unclear what shapes the UX of enterprise software, we determined dimensions of enterprise software UX with a principal component analysis of Study III data (P9). Our research suggests that in work-related tools, the main components of UX are motivational factors, productivity factors, usefulness of the tool, and the tool's perceived suitability to professional work (Table 6 in P9). Thus, our results indicate that to offer good UX, enterprise software should foster users' motivation, enable productivity in work, be useful, and support the users' professional identity. (P9)

Considering users' and team members' UX assessments, our results indicate that developers are able to understand instrumental (or pragmatic) aspects of UX in the software they are developing. However, developers seem to have a tendency to overemphasize the instrumental at the expense of the hedonic. Therefore, we recommend having a UX specialist to contribute to understanding and designing for the non-instrumental (hedonic) aspects. (P9)

Clarifying user requirements, feature planning, and reviewing and discussing UX designs are often collaborative activities. Developers conducted these activities irrespective of the UX designer's contribution, either among developers or with the UX specialist (P7). We interpret that this is because those are the core activities required to be able to implement purposeful software. Thus, when there is insufficient communication between the UX designer and the developers, the developers need to form their own understanding of the user need and the UI design solutions. Moreover, the developers' understanding might be significantly different from what the UX specialist designed. Thus, clarifying user needs, feature planning and discussing UX design should be collaborative activities. (P7) In contrast, the UX specialist can conduct the actual activity of UX design creation by herself/himself as long as there is constant communication between the UX specialist and the developers to ensure the implementability of the UX design.

We determined that experienced developers can be more knowledgeable in, for instance, platform style than UX specialists: UX specialists often possess more holistic skills, while developers can be specialized in certain platforms. Moreover, developers can and should be responsible for certain UX tasks such as considerations regarding technical limitations, following platform style, and performance issues. (P1, P5, P6, P7)

We detected that POs were in many cases responsible for communicating with users (P6, P7). It can be a natural role for many POs who are accustomed to collaborating with customers. However, for understanding the hedonic needs and goals of the users, it is beneficial to include the UX specialist in the discussions. Also, if the UX specialist is responsible for making the UX design, she or he needs to have a proper understanding of the users. In addition, meeting the users is beneficial also for the developers. (P5, P6, P7)

Thus, compared to the current approaches for organizing agile UX work, we suggest that the team should be collaboratively responsible for the UX work, whereas the current approaches emphasize the role of the UX specialist as the sole or main contributor of the work.

5.5.4
Process Factors
We consider that the development process should support all the other integration areas. Thus, the process should support UX work, a cooperative way of working between the disciplines, UX design chunking, getting user feedback, and conducting UX design work along with development work instead of conducting upfront design activities. Moreover, it should describe the responsibilities of the different roles and structure UX work in a way that supports agile development. Thus, the development process should support the work itself. However, we consider that UX work should be as lightweight as possible, and activities that do not result in evident user value should not be conducted. (P1, P2, P3, P4, P5, P6, P7, P8, P9)

The process followed in the software development project should give guidance on how to conduct UX work. It should enable validating the UX design as well as the software itself. In many cases, the UX design can be validated in the form of working software. However, this is not always possible, since it can be more affordable to validate the design, for instance, in the form of a prototype. Therefore, the agile value of "working software over comprehensive documentation" (Beck et al. 2001) is not always feasible in agile UX work. Thus, we broaden it as follows: "working design or software over comprehensive documentation." By 'working design' we refer to validated UX of the software: there should be evidence that the design is implementable and brings both business and user value. Moreover, to be able to deliver validated working design from an iteration, evaluation activities should be included in the iteration itself. Finally, the process should incorporate the UX specialist role into the team and give guidance on the task allocation between the roles. Regarding UX work, the process should guide towards short UX design-evaluation loops. Time-consuming activities such as user recruiting should be planned in advance.
We identified two notable differences compared to previous research considering process factors. First, some previous models for integrating agile UX work favor the separation of UX work and development activities, whereas our results indicate that cooperation would be both a preferred and a more effective way of working. Thus, we claim that there is need for a more cooperative process than the separate UX design and software development streamlines presented in, e.g., (Beyer et al. 2004, Sy 2007). Second, there is need for structuring the early design work in a more agile way than the current approaches offer. Next, we present the framework we developed to address the identified factors.
5.6
Framework for Organizing Agile UX Work in Enterprise Software Development
The previous section presented factors related to supporting the integration of UX work and agile software development. This section continues addressing the research question RQ3, Which activities support the integration of agile development and UX work?, by combining results from the contributing Publications into a framework for integrating UX work with agile software development.

We have named the framework BoB, Best of Both Worlds. The framework is named after an episode of the Star Trek: The Next Generation television series called The Best of Both Worlds (Roddenberry et al. 1990). The following quotes are from a scene of the episode where the Borg demand that Picard surrender himself:

– What the hell do they want with you?
– I thought they weren't interested in Human lifeforms, only our technology.
– Their priorities seem to have changed.

The BoB framework has dualistic properties: it merges best practices from both agile methodologies and UX work. In addition, it addresses both the developed enterprise software and the process of developing it. Finally, it aims at maximizing the quality while minimizing the time and cost. The framework consists of the following four main parts (Figure 11):

- Inputs and Outputs of agile UX work,
- Metrics to evaluate the Outputs, Process and Inputs,
- Process in which the Outputs (enterprise software) are created,
- Activities and Tasks that actualize the Process.

The purpose of the framework is to enable the constant production of software that pleases the user and has appropriate quality and scope. Simultaneously, another goal is to be cost-efficient and fast with a short time-to-market. Thus, the idea of the framework is to aim at adequate UX with low-cost methods, avoiding the production of both over- and under-quality. The framework is constructed for enterprise software development, where the aim often is not to produce the best possible UX; instead, fulfilling organizational needs generally is the first priority. The goal of the BoB framework is to enable the team to work towards an outcome with an agreed scope, maximizing the quality, profit and satisfaction while minimizing cost, time and risk. Naturally, the degree of tolerance, for instance, in quality or risk can vary based on strategic decisions; as an example, the company can take large considered risks but still minimize the unwanted or unexpected parts of them, or the project can aim at lower quality but still minimize issues that are unwanted.
Figure 11 Elements of BoB framework (Objectives guide the work; Inputs and Outputs of UX work are evaluated with Metrics; the Process utilizes Inputs to create Outputs; Activities and Tasks realize the Process, which structures them)
We present the framework as follows: First, we present the identified inputs and outputs related to agile UX work on an Ishikawa diagram (Ishikawa 1976). Second, we present metrics related to the inputs, process, and outcome. Third, we present activities and tasks that shape agile UX work and the recommended task allocation between team members. Finally, we present a process model for structuring agile UX work in a way that supports the following:
- Organizing tasks and goals of agile UX work
- Overcoming identified challenges of agile UX work
- Offering a construction for identified supporting factors
- Creating enterprise software with the desired UX in a short time with low costs

The process model includes guidelines for sharing activities and tasks within the development team.

5.6.1 Inputs and Outputs of BoB
The Ishikawa diagram (Figure 12) presents the identified contributors and aspects (team, tasks, process, tools, and user) of agile UX work as inputs for the framework. Arrows on the contributors and aspects describe the supporting factors. We adopt the classification of inputs, processes, and outputs from Aaltonen et al. (2012) and the presentation form (the diagram) from Ishikawa (1976). Performance metrics (quality, scope, time, and cost) are adopted from Chow et al. (2008). Factors that contribute the most towards the UX of the resulting enterprise software are included.

Figure 12 Team, tasks, process, tools, and user act as inputs for software development. The resulting enterprise software is the outcome. Quality, scope, cost, and time are dimensions that are utilized to measure the successfulness of both the development performance and the developed software.
We relate the user and the team to the people factors identified by Chow et al. (2008) and Brhel et al. (2015). Tasks are in line with the practices dimension and tools with the technology dimension of Brhel et al. (2015). The team realizes the tasks using the tools to create the outcome, as described in activity theory (Engeström 2000), while the process provides a structure for the work. Software development is a form of knowledge work. Moreover, since the outcome is enterprise software that has a user, it is also most probably utilized for knowledge work. Thus, similar metrics as presented, for instance, by Aaltonen et al. (2012) can be utilized to evaluate both the developed software and its development process and associated input factors when feasible. The user differs from the other input factors: instead of being evaluated, users are the ones who determine the success of the outcome. Table 18 presents examples of possible sources or preliminary forms of metrics that can be utilized to evaluate both the activity and the outcome.

Table 18 Examples of possible metrics of each assessment dimension in terms of development activities and the outcome or the system being developed.

Dimension | Development | Outcome
Quality | Ability to create desired UX. Implementation quality. Level and composition of skills in the team. | Overall UX as rated by the users. User satisfaction, error freeness. Ability, for example, to motivate the user.
Scope | Ability to respond to user need. Proper composition of the team. Proper team size. Appropriate tools. | Extent to which user need is covered. Amount of unused features.
Time | Total development time. Time to market. Feature lead time. | Productivity metrics. Easiness to learn, fastness of use.
Cost | Development cost. | Reputational cost. Lifetime cost. Cost of acquiring. Cost of adopting. Cost of error.
When evaluating the process of developing the UX of the software, the dimensions for metrics can be utilized to measure the ability of the team and process to create the desired UX (Figure 12). In contrast, when applying the diagram for evaluating the UX of the outcome, i.e. the UX of the developed enterprise system, the "quality" output (Figure 13) can include the identified dimensions of enterprise software UX: motivational factors, usability factors, usefulness of the tool, and the tool's perceived suitability to professional work (P9). The "scope" output then becomes the extent to which the developed software fulfills user needs (P9).
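As an illustrative sketch, the dual use of the four assessment dimensions could be represented as a simple data structure, where each dimension holds metrics for both the development process and the outcome. The metric names and values below are hypothetical examples (drawn loosely from Table 18), not a fixed set prescribed by the framework:

```python
# Sketch only: each assessment dimension (quality, scope, time, cost) is
# evaluated on two sides -- the development process and the outcome.
from dataclasses import dataclass, field

@dataclass
class DimensionAssessment:
    development_metrics: dict = field(default_factory=dict)  # process side
    outcome_metrics: dict = field(default_factory=dict)      # product side

# Hypothetical example values; real projects would choose their own metrics.
assessment = {
    "quality": DimensionAssessment(
        development_metrics={"team_skill_coverage": 0.8},
        outcome_metrics={"overall_ux_rating": 4.1, "user_satisfaction": 3.9},
    ),
    "scope": DimensionAssessment(
        development_metrics={"ability_to_respond_to_user_need": "high"},
        outcome_metrics={"unused_feature_ratio": 0.15},
    ),
    "time": DimensionAssessment(
        development_metrics={"feature_lead_time_days": 12},
        outcome_metrics={"time_to_learn_minutes": 30},
    ),
    "cost": DimensionAssessment(
        development_metrics={"development_cost_keur": 250},
        outcome_metrics={"cost_of_error_keur": 10},
    ),
}

# Both sides of every dimension are assessed with the same four-way grid.
dimensions = sorted(assessment)
```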
Figure 13 Outputs or evaluation factors related to the resulting enterprise software.

5.6.2 Collaborative Activities
The following activities form the process of creating enterprise software that meets the user needs (P8):
- creating a high-level vision of the software in terms of user needs and the resulting user value
- concretizing the vision with a concept
- realizing the user value through UX design and development work
- validating the value creation through evaluation activities

A vision is an overall idea of the software. It can be elaborated throughout the development process. However, a rough vision should be available early, and there should be a common understanding of the vision amongst the team members throughout the project. All the team members should contribute towards the vision. A concept concretizes the vision and helps to communicate it. A concept can be anything from a sketch to working software. UX design comprises creating the user flow or interaction design, creating the visual design, and selecting UI components. Development work consists of software design and implementation (analysis and build (Royce 1970, Pressman 2001)) activities, and it transfers the design into the form of working software. Evaluation aims at validating that a design decision delivers user value. Evaluation can be conducted on the vision, concept, design, or working software. Table 14 (p. 58) maps the identified UX tasks to the aforementioned activities, and Figure 14 below describes how UX tasks can be shared for collective accountability of UX specialists, developers, and POs.
Figure 14 General task allocation between roles. The model is created based on results of Study II and Study III. Legend: PO = product owner, UXS = user experience specialist, DEV = developer. (Kuusinen, forthcoming)

We recommend the following principles for task allocation in enterprise system development considering UX work.

UX specialist role: Involvement of a UX specialist is beneficial throughout the project. The UX specialist's role is especially important early in the project to ensure a viable vision for the project. Early vision work decreases uncertainties and increases the clarity of the big picture throughout the project's lifecycle. A UX specialist is needed for understanding and designing for the non-instrumental qualities of UX. Moreover, the UX specialist's contribution is often needed to ensure fluent user interaction.

PO role: A PO knowledgeable in UX issues can successfully handle the vision work. However, mistakes made while creating the vision are often costly since they affect the whole project: misunderstanding the user need may even lead to developing the wrong product. Therefore, it can be beneficial to refine the vision as the project proceeds instead of trying to fully understand it before starting implementation. Furthermore, the PO can also collect and interpret instrumental user needs, for instance
in a short workshop with users. When user data gathering mechanisms are available, the PO seems to be able to gather user feedback and interpret instrumental results.

Developer role: Developers need to understand the reasons behind UX design decisions in order to be able to make good software design decisions. Thus, developers should be exposed to real-life user needs, behavior, and contexts. If developers are not included in the elicitation work, they define the user requirements or user stories behind development tasks by themselves. This leads to double work, mistakes in understanding user needs, and differences in the conception of the big picture between team members. A developer knowledgeable in UX issues can also successfully lead the UX design work, especially when the product vision is clear or the scope of the project is narrow. This is also the case when non-instrumental aspects are less relevant for the product. However, frequent discussion with a UX specialist is beneficial also in that case for ensuring interaction fluency. In conclusion, developers should meet users, see the developed system in actual use, and have clear reasoning behind UX design decisions.

5.6.3 Within-iteration Process for Agile UX Work
Analysis is required in all software development methodologies and in innovation cycles in general, including agile methodologies (e.g., Preece 2001, Liker 2004, Ries 2011). However, a long analysis phase without validating activities should be avoided. Norman (2005) gives an example: "There were no systematic studies of users. Rather, early automobiles tried a variety of configurations, initially copying the seating and steering arrangements of horse-drawn carriages, going through tillers and rods, and then various hand and foot controls until the current scheme evolved." We also consider HCD in its current form too rigorous an approach for enterprise software development. Nowadays there are methods for more rapid experimentation cycles than at the time when cars evolved to their current form. We consider that user needs and principles of human behavior should guide the development. However, we think that exhaustive user studies are rarely needed if we allow trial and error in the development. We recommend utilizing methodologies that emphasize upfront design for cases in which larger studies are required. Table 19 presents supporting guidelines for the within-iteration approach we are suggesting.
Table 19 Supporting guidelines for the within-iteration approach (Kuusinen, forthcoming).

People: Learn from others: broaden your UX competence areas. Be willing to cooperate. Respect people from other disciplines. Appreciate professionalism when allocating tasks.
Process: Work to produce working design within one iteration. Allow trial and error: accept design debt. Hurry to markets to enable actual user feedback. Allow difference in visual maturity and functional readiness.
Tasks: Integrate UX work via tasks, not via roles. Minimize user studies. Treat UX tasks similarly to other development tasks. Involve the whole team in UX tasks.
Tools: Communicate tasks via backlog. Establish communication and feedback channels. Utilize technologies that allow rapid design and development.
We have derived the supporting guidelines presented in Table 19 from the identified supporting factors discussed in Section 5.5. Our guidelines considering people factors concentrate on interdisciplinary cooperation and on fostering the different competence areas. Our guidelines related to process and task factors aim at enabling rapid UX work via task sharing and a mindset that seeks rapid feedback and embraces human error. Guidelines concerning tools mainly aim at supporting the other factors, i.e. supporting rapid design and development and offering smooth communication and feedback channels.

Our process consists of analysis and build tasks that are conducted continuously during the development cycle (Figure 15). The same cycle is utilized throughout the project; thus, the framework eliminates the concept of a separate design upfront phase. Instead of having a design upfront phase, we suggest including the design and planning work in several "normal" iterations that can contain both UX design and development tasks. Thus, the core vision of the software is created early. In principle, a vision and a conceptual model ought to be available before starting implementation of the actual system. Thus, creating an early vision can be equated with, for instance, roadmapping work, where possible project ideas are fostered before initiating a development project and projects are established based on those ideas. On the other hand, it can also be conducted within the project if the project is started with a less clear focus. When the team is unfamiliar with the domain, creating a high-level vision often requires input from users. To overcome problems related to a
separate upfront design phase, we suggest a framework where vision and concept work are also conducted iteratively as part of the development work. Moreover, the vision can be elaborated throughout the project. However, early visioning work concentrates on understanding the underlying phenomena, and thus it rarely involves code creation. Therefore, early planning work should be kept to a minimum. Ideas can be evaluated on a conceptual level with, for example, prototypes or sketches. However, the viability of an idea often becomes apparent only after working software is available to users. For instance, a vertical approach to visioning, in which only a small set of features is in focus at a time, can be used to enable an early start of implementation.
Figure 15 BoB process. (Kuusinen, forthcoming)

We introduce the process model in (Kuusinen, forthcoming) as follows. The early product definition phase (Figure 15) differs from the rest of the development in that there might be nothing tangible available yet. The understanding of the product vision is often still lacking, and because of this, communication with the user might be more abstract. We propose to start with a few short user workshops with the UXS and a developer, during which the product vision and the most critical user stories are grown. Between those workshops, the team works towards something tangible for the users, to make it easier for all the stakeholders to understand the early product vision similarly enough. Based on the early vision and the most critical user stories created during the early iterations, an initial backlog is created. The deliverable from this early phase (clickable version in Figure 15) can be, for
example, a partially functional prototype or a click-through template that realizes the most important user story (or stories). (Kuusinen, forthcoming)

After the clickable version has been evaluated with users, the team starts to work towards the earliest production version of the system (Development cycle in Figure 15). User interaction is built based on continuous communication within the team and together with users when needed. When the team cannot solve a UX-related issue, the developers start to build the next task on the priority list, and the UX specialist investigates the problem until a solution is found or until it is decided to postpone the task. The UX specialist either implements the user interaction or pairs up with a front-end developer. The team hurries a production version to the market to start getting actual user feedback and then continues growing the product incrementally. The team works similarly throughout the forthcoming iterations. Thus, the approach is similar both in the early product definition phase and during the actual development phase.

We recommend allowing the delivery of working prototypes and partially functioning features to gain user feedback in addition to the delivery of working software. Thus, the system can contain both fully working features and forthcoming features that are delivered to some user groups in order to get early user feedback before launching the feature. This approach is especially beneficial in situations where an actual randomized experiment with control and treatment groups (A/B testing) is not applicable due to a small number of users. (Kuusinen, forthcoming)

During the first iterations, especially in larger projects, the majority of the outcome can be "working design": UX design that has been validated and is potentially shippable. The design that is produced during an iteration can be, for instance, a low-fidelity holistic system design or a highly detailed feature.
A simulated back-end can be utilized in user evaluations and customer inspections. As the iterations proceed, the type of outcome will gradually evolve into working software. Thus, the software can have qualities of both working software and a prototype. The design type can be, for instance, sketches, a wireframe, a paper prototype, a functional prototype, or working software. What matters is that the design has been validated to deliver value to users and that it is both technically and economically feasible. However, the team should start delivering working software as soon as possible, since it allows faster access to the market and feedback from real use; otherwise the way of working can start to resemble traditional upfront design work. Thus, in smaller or more focused
projects, the team most probably will deliver working software from the first iteration. The team tests the outcome of the iteration during the iteration itself and fixes or refines it during the following iterations. The product work list is refined based on the iteration results. This ensures that test results will be taken into consideration in the implementation. Thus, the process diminishes the problem of working on several iterations simultaneously and thereby evaluating the preceding increment only after starting the next one. Performing UX and usability tests during the current iteration necessitates keeping the tests lightweight as well as planning and scheduling them in advance. At best, tests are conducted on working prototypes or on the actual software, as the UX design itself can be communicated in the form of prototypes or working software.

UX designers and developers work together in teams. UX tasks can be allocated to different roles as described in Publications P6, P7, and P8. The idea is that when team members collaborate, they learn from each other, and UX specialists do not need to conduct all UX work. We have observed that a PO can organize user sessions and be responsible for gathering user feedback, while a developer can lead user interface design work, especially on mobile platforms that offer clear style guides. That leaves the UX specialist free to analyze the user data and create the interaction design. In practice, the suggested approach requires removing strict boundaries between roles. Instead, we would refer to competences when allocating tasks between team members. Although certain tasks need to be taken care of during software development, it is not a necessity to involve certain roles. UX tasks should be allocated like other development tasks: the self-organizing team selects them from the backlog. Anyone interested can select tasks that require a smaller amount of learning and less professionalism in UX.
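As a minimal sketch of this backlog-based allocation, UX tasks could be tagged with the competences they require and with how demanding they are UX-wise, so that the self-organizing team can select them like any other task. All task names, tags, and the selection rule below are hypothetical illustrations, not part of the framework itself:

```python
# Sketch only: UX tasks sit in the same backlog as development tasks,
# tagged with required competences; low-demand tasks are open to anyone
# willing to learn, demanding tasks go to matching competences.
from dataclasses import dataclass

@dataclass
class BacklogTask:
    title: str
    required_competences: frozenset  # e.g. {"ux"}, {"ux", "frontend"}
    demand: str                      # "low" or "high" (illustrative scale)

def selectable(task: BacklogTask, member_competences: set) -> bool:
    if task.demand == "low":
        return True  # open to anyone interested in learning
    return task.required_competences <= member_competences

backlog = [
    BacklogTask("Refine icon set per style guide", frozenset({"ux"}), "low"),
    BacklogTask("Design checkout interaction flow", frozenset({"ux"}), "high"),
    BacklogTask("Implement animated transitions",
                frozenset({"ux", "frontend"}), "high"),
]

developer = {"frontend", "backend"}
picks = [t.title for t in backlog if selectable(t, developer)]
# The developer may pick the low-demand UX task, while the demanding
# design task remains for the UX specialist.
```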
This allows the UX specialist to concentrate on tasks that are more demanding UX-wise. Similarly, certain UX tasks necessitate a rather deep understanding of technical aspects, and a developer might have more suitable competence for conducting those tasks. This approach surely requires understanding of UX tasks and their impact and duration when ordering the backlog and selecting tasks for each iteration. We argue that the proposed approach would tackle many of the current problems in agile UX development. The team would have better visibility to user studies and
thus also to user needs. The team would own both the UX design and the code, which makes collaboration easier and increases the level of commitment. Design chunking would be a natural part of the process from the beginning. User studies and tests can be better scheduled when the first iterations are more design-driven. The other activities during the first iterations could be, for example, building working prototypes, building the technology stack and development pipeline, experimenting with possible technologies, creating the architecture design, and conducting back-end development. Iterative and incremental work can be started already during the early product definition. Instead of an iteration 0, there can be multiple design-oriented iterations. The outcome of the first iterations can be, for instance, sketches or prototypes produced in the small team that can be rapidly tested with users. The fidelity of the outcome and the proportion of implemented (instead of drawn or simulated) items can grow from iteration to iteration. Thus, the project can be started with design-oriented iterations, and as the work proceeds, the focus will gradually shift to implementation and working software. The idea is to constantly maintain the design validation: at any moment, the UX design should be viable. Design-oriented iterations are mostly needed in larger projects or in projects where the vision is unclear. In any case, the design should be validated first, and with working software as early as feasible.

5.6.4 Example Project
We give an example of how the process could be utilized in practice. The example is based on our research in companies; it is not an actual project but a representation of several projects. After initiating a project, a small team starts to work towards a working prototype of the software. The team can consist of, for example, one UX specialist and one to two developers. The team, together with users, works towards understanding the project vision and the most critical user paths. This can be done in, for instance, repeated workshops over some hours to some weeks. Between the workshops, the UX specialist drafts the user flow and user interface, which are iterated together with users in the following workshops. Simultaneously, the developer(s) build the development environment and experiment with possible technologies they will utilize in the project. In addition, the team, together with the PO, creates an initial
project backlog. The prototype is an early version of the core functionality of the software that realizes the design idea. It can be, for example, a clickable software prototype with a simulated back-end, which realizes the user paths of the most important use cases. After the prototype is validated with users, the team size is readjusted to meet the capacity requirements of the project. The team starts to work towards the first production version of the system. The UX specialist either implements the user interaction or pairs up with a front-end developer. The user interaction is built based on continuous communication within the team and together with users when needed. The team continues with a similar setup during the forthcoming iterations. In addition to working software, the team delivers prototypes and partially functioning features throughout the process to gain rapid user feedback. Thus, the system can contain both fully working features and forthcoming features that are delivered to some user groups in order to get early user feedback before launching the feature.
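The delivery of forthcoming features to selected user groups, as described above, could be realized in practice with a simple feature gate. The following is a minimal hypothetical sketch (feature and group names are invented for illustration), not an implementation used in the studied companies:

```python
# Sketch only: a forthcoming feature is visible to the user groups chosen
# to give early feedback; a fully launched feature is on for everyone.
FORTHCOMING_FEATURES = {
    "new_search_ui": {"pilot_group", "internal_testers"},  # feedback groups
}

def is_enabled(feature: str, user_group: str) -> bool:
    groups = FORTHCOMING_FEATURES.get(feature)
    # A feature not listed as forthcoming is considered launched.
    return groups is None or user_group in groups
```

When the feature is launched, it is simply removed from the forthcoming set, which turns it on for all user groups without code changes elsewhere.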
6. Discussion and Conclusions
This chapter discusses the results as well as their contribution and validity. The chapter addresses only the validity of summarizing the findings of the individual Publications for the thesis and of building the final framework; we discuss the research validity in more detail in Section 4.7. Before the closing remarks, we present implications for future research.
6.1 Overview of the Research Subject
The objective of this thesis is to make recommendations on the integration of UX work and agile methodologies in the context of enterprise software development in order to create means to support the integration. The phenomenon of agile UX integration is multifaceted for various reasons. The concepts of agile and UX are described via various definitions, models, and methodologies, and the industry's conception of the term UX differs from the academic conception. Companies apply agile methodologies in various ways that differ from the methodology descriptions. Moreover, agile methodologies do not guide UX work, and thus companies tend to create individual practices. The research theme is young but becoming increasingly popular, which probably reflects the topicality of the subject both in companies and in academia. However, despite the growing interest in the topic, the research has not sufficiently addressed focal areas of the phenomenon. Models for agile UX integration are scarce and similar to each other. Moreover, they lack many essential agile practices such as the responsibility of a multidisciplinary team, minimizing and guiding upfront design activities, and fostering collaboration. The research field in general lacks extensive empirical studies, especially considering process, people, and technical factors (Brhel et al. 2015).
6.2 Revisiting the Research Questions
We summarize our answer to each research question and briefly discuss the findings in regard to earlier research.
RQ1. What are the current challenges in conducting UX work in projects following agile methodologies?

Challenges were related to the core goals of UX work, the ways of working, and organizational structures; thus, they hindered agile UX work in several essential ways. Challenges related to the core goals of UX work included difficulties in understanding and fulfilling user needs. Maintaining the product vision, getting user feedback, and ensuring the implementation quality were considered difficult, which also hindered the development of the outcome. Challenges in organizing the interdisciplinary communication and coordinating the different work practices of UX specialists and software developers made the ways of working more complicated. Moreover, timing and conducting UX work in an agile way were also considered challenging. Finally, unsupportive organizational structures and business models were the most impactful category of challenges. Our results are in line with those presented in the systematic review of challenges in agile UX work by Salah et al. (2014). The challenging issues that Salah et al. (2014) identified were related to differences in work practices between agile and UCD, such as differences in the conception of upfront activities, increment sizes, usability testing, and documentation.

RQ2. What are the tasks and goals of agile UX work in enterprise software development?

Goals were related both to the project outcome, i.e. the software under development, and to the process in which the outcome is being created. Outcome-related goals included understanding and fulfilling the user need as well as activity goals that contribute towards the user need fulfillment. The latter include creating and maintaining a product vision, creating a concept to communicate the vision, designing and implementing to realize the vision, and evaluating the outcome of any of these activities.
The project team conducts activities and tasks following an agile methodology, which introduces process-related goals, i.e. goals related to the ways of working. These include, for instance, being agile and timely, and working collaboratively. Participants considered it essential to include UX competence in the team throughout the project with an emphasis on visioning and design work. The amount
of preferred contribution from the UX specialist varied from a guiding role to contributing to user research, creating UX design, and performing user tests. Practitioners considered understanding and fulfilling user needs, maintaining the big picture of the project, and getting user feedback the most important UX-related areas for project success. Areas that were considered both challenging and important to project success included getting user feedback, understanding user needs, maintaining the big picture of the project, fulfilling user needs, cooperation between developers and UX designers, ensuring implementation quality, cooperation between POs and UX specialists, and the timing of UX work. Practitioners' preferences considering the role and contribution of the UX specialist in agile development have not been addressed empirically in earlier research. Nonetheless, the result is in line with the traditional role of the usability specialist. Moreover, there are a few studies addressing the cost-justification of UX work, for example Rajanen (2011) and Bias et al. (2005), while studies addressing the business benefits of UX work are scarce. Rajanen (2011) concludes that costs can be seen instantly whereas benefits only appear later during the software life cycle. Innes (2011) states that managers often focus on short-term cost and risk reduction while ignoring the costs that building inadequate software incurs. In contrast, our results suggest that practitioners consider UX work and fulfilling user needs essential for project success.

RQ3. Which activities support the integration of agile development and UX work?

We address supporting activities through factors that support the integration and a framework that gives a structure to the identified factors, tasks, and goals of UX work. We categorize the factors into process, people, technical, and task sharing (practices), following the categorization of Brhel et al. (2015).
We approach the integration via sharing UX tasks between the roles and via means to decrease the need for upfront design. Thus, our approach supports the integration through collaborative commitment to UX work and by allowing the often-overburdened UX specialist to focus on the most relevant contributions. Our results are in line with previous research in that communication between developers and UX designers is important for the integration. However, despite
promoting communication between the disciplines, the majority of earlier research suggests separation of UX work and development activities. We consider that such a practice decreases the agility of the project, especially in cases where less rigorous planning activities would be sufficient.
6.3 Contributions of the Thesis
Our main contribution is a constructive analysis of activities to support agile UX integration, which resulted in a description of supporting factors and in a framework for agile UX integration (RQ3). The novelty of our work lies in the way the integration is established. While previous models approach the integration via a distinct role for the UX specialist (Brhel et al. 2015) and by separating the UX work from development activities (e.g., Beyer et al. 2004, Sy 2007, da Silva 2012), we realize the integration through collaborative task sharing and by including the UX specialist in the development team. In addition, our work contributes towards diminishing the amount of upfront design. Extensive upfront planning is typical for UX work but discouraged in agile methodologies. There is evidence of improved productivity and communication when the UX specialist and developers work in close cooperation in cross-functional teams (Ferreira et al. 2011). However, there are no models or methodologies guiding the cooperative work, as the previous models value separation of UX and development work (Brhel et al. 2015). Thus, the framework introduced in this thesis is the first identified contribution towards crafting such a model. Singh (2008), Leszek et al. (2008), and Ungar (2008) present methods to increase collaboration between the UX specialist and the PO (Singh 2008) or between the UX specialist and developers (Leszek et al. 2008, Ungar 2008). Ungar's (2008) Design Studio is a good way to integrate developers into concept work, while Singh (2008) describes how a UX specialist can contribute at the product level and towards vision work. Our framework utilizes similar practices through task sharing. Our other contributions include results considering tasks and goals of agile UX work (RQ2). We studied those in relation to project success, which is a novel approach. In addition, we identified elements that practitioners consider both significant to project success and challenging.
Moreover, our research provides new contributions to technological factors considering the media and form of communicating UX design. Previous research considering technical factors has
mainly focused on introducing new tools the authors have developed. However, our contribution on technological factors remains preliminary, and more research in this area is needed. Finally, we contributed towards understanding the UX of enterprise software. As previous research has mostly concentrated on leisure systems, it has been unclear what shapes enterprise software UX. We formed dimensions of enterprise software UX using principal component analysis. However, the resulting model is preliminary, as it has not been validated with a sufficient number of respondents (N = 55). To conclude, our research suggests that instead of trying to integrate the HCD process as such with agile development, researchers should question the degree to which HCD practices are required in each phase. The lack of user evaluation possibilities in agile methodologies seems to drive UX work towards getting everything right at once. Thus, we consider that a process that guides towards short feedback cycles diminishes the need for rigorous upfront planning. As agile development methodologies do not support design work, UX practitioners have tried to resolve the problem with endeavors to keep well ahead of the development. We consider that approaches supporting iterative design-evaluate cycles naturally reduce the need for being prepared in advance.
6.4 Revisiting the Research Methodology
The most obvious limitation of our research is that the resulting framework has not been validated empirically. We utilized the building-theories-from-case-studies approach in constructing the framework, and thus it is strongly based on our empirical findings. The elements of the framework are either based on good practices we have witnessed in the industry or on enablers of such practices. Moreover, the underlying factors are presented in the included Publications. However, the framework as such has not been in use in any of the studied companies. Eisenhardt et al. (2007) claim that since the theory-building approach is deeply embedded in rich empirical data, the methodology is likely to produce theory that is accurate, interesting, and testable. Our results are based on multiple cases and viewpoints: we investigated the phenomenon from different levels (organizational, project, team, tasks, and outcome) and from different perspectives (process, practices, people, and technical) in seven companies utilizing various approaches to agile and UX work. The research approach guided us to retain only those findings that were replicated across most of the cases, which increased the robustness and generalizability of the construed integration factors and the framework (Eisenhardt et al. 2007). Finally, we discuss theoretical and peer evaluation of the BoB framework. An early version of the framework was presented to peer agile UX researchers in the workshop "On the Integration of UCD and Agile Development" at the NordiCHI 2014 conference in order to obtain feedback on the framework. In addition, parts of the framework are presented in Publications P8 and P9. Furthermore, considering theoretical evaluation, according to Pfeffer (Eisenhardt et al. 2007, ref. Pfeffer 1982), a good (organizational) theory is parsimonious, testable, and logically coherent. Corley et al. (2011) argue that a theoretical contribution (in management research) consists of the originality and utility (both scientific and practical) of the theory. Based on the preliminary peer evaluation of the framework, we claim it possesses such qualities.
6.5 Future Work
Our research contributes towards a more agile and more collaborative approach to agile UX integration. Further studies are needed to understand and foster the dynamics of the cooperation between UX specialists, developers, and POs. Moreover, research is needed on task allocation between team members and on supporting developers' ability to contribute towards good UX. The most efficient ways of communicating UX design between UX designers and developers should be understood in order to develop supporting tools and mechanisms. Our research suggests that instead of trying to integrate the HCD process with agile development as such, researchers should question the type of HCD practices and the degree to which they are required in agile development. Thus, future work should concentrate on finding the balance between plan-and-design activities and experiment-and-iterate activities. Embracing late change should be a focus of agile UX research in general; thus, for instance, mechanisms for design chunking and for enabling shorter feedback loops are needed. An evident implication for future research is to empirically test and validate the resulting framework. Future research questions could include the following:
- Does utilizing the BoB framework remove the identified problems? Which of the problems does it fix, and which remain? Does utilizing the framework create new challenges?
- What kind of influence does the taken approach have on agile UX work? For instance, how does it affect the team's experience, feature lead time, efficiency of communication, or the UX of the outcome?
- In which contexts is the framework applicable? Can it be applied to other than enterprise software development? How does BoB work with modern approaches such as DevOps and continuous software engineering? Does it scale?
- How could the BoB framework be improved?
As our work is limited to enterprise software development, it would be interesting to study agile UX integration in the development of leisure systems. Future studies could examine the tasks and goals of leisure systems development and whether our results are transferable to that domain. Finally, research on the UX of enterprise software and work-related tools is still at an early stage. Our contribution offers a preliminary model of the dimensions of enterprise software UX. However, rigorous empirical studies are needed in order to deepen the understanding of what constitutes the UX of enterprise software and how it differs from the UX of leisure systems.
6.6 Conclusions
This thesis contributes towards understanding the integration of user experience (UX) work and agile software development activities in enterprise software development. Our main contributions include investigations of activities that support agile UX integration, which resulted in a description of supporting factors and in a framework (called BoB) for agile UX integration. The BoB framework includes an inspection of inputs, outputs, and metrics related to agile UX work in enterprise systems development, principles for task allocation between the contributing roles, and a process model for conducting UX work together with other agile development activities. Our framework differs from previous ones in that it values unification of UX work and development activities instead of their separation, and in that it realizes the integration via UX tasks rather than via integrating a distinct role of a UX specialist. Our work presents both academic and practical contributions. First, academics can utilize the presented framework and the empirical results to further study and develop means for agile UX integration. Second, the results can aid practitioners with the challenges they encounter in agile UX work. Moreover, the framework offers an empirically grounded model for organizing UX work in agile software development.
References

Aaltonen, I., Ala-Kotila, P., Järnström, H., Laarni, J., Määttä, H., Nykänen, E., et al. (2012) State-of-the-art report on knowledge work: New ways of working. VTT Technology, 106 pp.
Abrari, P. & Allen, M. J. F. (2006) Business rules user interface for development of adaptable enterprise applications. Patent US20060129978
ACM 2012 CCS. Association for Computing Machinery: The 2012 ACM Computing Classification System. Available at: http://www.acm.org/about/class/2012 (retrieved May 13, 2015)
Agile Manifesto (2001). Manifesto for Agile Software Development. Beck, K., Beedle, M., van Bennekum, A., Cockburn, A., Cunningham, W., Fowler, M., et al. Available at: http://agilemanifesto.org/ (retrieved May 13, 2015)
Ahmad, M. O., Markkula, J. & Oivo, M. (2013) Kanban in software development: A systematic review. Proc. 39th Euromicro Conference Series on Software Engineering and Advanced Applications (SEAA'13). IEEE Computer Society, pp. 9-16
Anderson, D. J. (2010) Kanban. Blue Hole Press, 261 pp.
Bark, I., Følstad, A. & Gulliksen, J. (2006). Use and Usefulness of HCI Methods: Results from an Exploratory Study among Nordic HCI Practitioners. People and Computers XIX — The Bigger Picture. Springer London, pp. 201-217
Barksdale, J. T. & McCrickard, D. S. (2012). Software product innovation in agile usability teams: an analytical framework of social capital, network governance, and usability knowledge management. International Journal of Agile and Extreme Software Development, 1, pp. 52-77.
Beck, K. (1999) Extreme Programming Explained: Embrace Change. Addison-Wesley, 224 p.
Beyer, H. & Holtzblatt, K. (1998) Contextual Design: Defining Customer-Centered Systems. Morgan Kaufmann
Beyer, H., Holtzblatt, K. & Baker, L. (2004). An agile customer-centered method: rapid contextual design. In Extreme Programming and Agile Methods - XP/Agile Universe 2004. Springer Berlin Heidelberg, pp. 50-59.
Bias, R. G. & Mayhew, D. J. (2005). Cost-Justifying Usability: An Update for the Internet Age. Morgan Kaufmann Publishers, 660 p.
Bloomer, S., Croft, R. & Kieboom, H. (1997). Strategic Usability: Introducing Usability into Organisations. In Extended Abstracts on Human Factors in Computing Systems: Looking To the Future, CHI'97. ACM, New York, NY, pp. 156-157.
Boehm, B. (1988). A spiral model of software development and enhancement. IEEE Computer, 21 (5), pp. 61-72.
Boehm, B. (2006) A view of 20th and 21st century software engineering. Proc. 28th International Conference on Software Engineering, ICSE '06. ACM, New York, NY, USA, pp. 12-29
Boivie, I., Gulliksen, J. & Göransson, B. (2006). The lonesome cowboy: A study of the usability designer role in systems development. Interacting with Computers, 18 (4), pp. 601-634.
Bosch, J. (2014) Continuous Software Engineering: An Introduction. In J. Bosch (ed.), Continuous Software Engineering. Springer International Publishing Switzerland, pp. 3-13
Bødker, S. & Grønbæk, K. (1991). Cooperative prototyping: users and designers in mutual activity. International Journal of Man-Machine Studies, 34 (3), pp. 453-478.
Brhel, M., Meth, H., Maedche, A. & Werder, C. (2015) Exploring principles of user-centered agile software development: A literature review. Information and Software Technology, 61, pp. 163-181
Cajander, Å., Lárusdóttir, M. & Gulliksen, J. (2013). Existing but Not Explicit - The User Perspective in Scrum Projects in Practice. Proc. Human-Computer Interaction - INTERACT'13, LNCS vol. 8119, pp. 762-779.
Card, S. K., Moran, T. P. & Newell, A. (1983). The Psychology of Human-Computer Interaction. Hillsdale, NJ: Lawrence Erlbaum.
Chamberlain, S., Sharp, H. & Maiden, N. (2006). Towards a framework for integrating agile development and user-centred design. In Extreme Programming and Agile Processes in Software Engineering. Springer Berlin Heidelberg, pp. 143-153.
Chow, T. & Cao, D. B. (2008). A survey study of critical success factors in agile software projects. Journal of Systems and Software, 81 (6), pp. 961-971.
Cochrane Collaboration (2003). Cochrane Reviewers' Handbook. Version 4.2.1, December 2003
Cockburn, A. (2004) Crystal Clear: A Human-Powered Methodology for Small Teams. Pearson Education, 336 pp.
Cockburn, A. & Highsmith, J. (2001) Agile software development: The people factor. Computer, 34 (11). IEEE
Constantine, L. L. & Lockwood, L. (2002 a). Usage-Centered Engineering for Web Applications. IEEE Software, 19 (2), pp. 42-50.
Constantine, L. L. & Lockwood, L. (2002 b). Process agility and software usability: Toward lightweight usage-centered design. Information Age, 8 (8), pp. 1-10.
Cooper, A. (2004). The Inmates Are Running the Asylum. Sams, Indianapolis, IN, USA.
Corley, K. G. & Gioia, D. A. (2011). Building theory about theory building: What constitutes a theoretical contribution? Academy of Management Review, 36 (1), pp. 12-32.
Corona, E. & Pani, F. E. (2013) A review of Lean-Kanban approaches in the software development. WSEAS Transactions on Information Science and Applications, 1 (10), pp. 1-13
Da Silva, T. S. (2012) A framework for integrating interaction design and agile methods. Doctoral Thesis, Pontificia Universidade Catolica do Rio Grande do Sul, 110 pp.
Da Silva, T. S., Martin, A., Maurer, F. & Silveira, M. (2011) User-centered design and Agile methods: a systematic review. Proc. International Conference on Agile Methods in Software Development, AGILE 2011. IEEE Computer Society
Davis, F. D. (1989) A technology acceptance model for empirically testing new end-user information systems: Theory and results. MIS Quarterly, 13 (3), pp. 319-340
Denzin, N. K. (2009) The elephant in the living room: or extending the conversation about the politics of evidence. Qualitative Research, 9, pp. 139-160
Denzin, N. K. (2010). Moments, mixed methods, and paradigm dialogs. Qualitative Inquiry.
Deming, W. E. (1950). Elementary Principles of the Statistical Control of Quality. Japanese Union of Scientists and Engineers.
Diefenbach, S., Kolb, N. & Hassenzahl, M. (2014). The 'Hedonic' in Human-Computer Interaction. Proc. 2014 Conference on Designing Interactive Systems (DIS). ACM, pp. 305-314.
Dingsøyr, T., Nerur, S., Balijepally, V. & Moe, N. B. (2012) A decade of agile methodologies: Towards explaining agile software development. Journal of Systems and Software, 85 (6), pp. 1213-1221. Elsevier
Eisenhardt, K. M. (1989) Building theories from case study research. Academy of Management Review, 14 (4), pp. 532-550
Eisenhardt, K. M. & Graebner, M. E. (2007) Theory building from cases: Opportunities and challenges. Academy of Management Journal, 50 (1), pp. 25-32
Engeström, Y. (1987). Learning by Expanding. Helsinki: Orienta-Konsultit
Engeström, Y. (2000). Activity theory as a framework for analyzing and redesigning work. Ergonomics, 43 (7), pp. 960-974.
Engeström, Y., Miettinen, R. & Punamäki, R.-L. (1999). Perspectives on Activity Theory. Cambridge University Press, 462 p.
Feiner, J. & Andrews, K. (2012) Usability Reporting with UsabML. Proc. 4th International Conference on Human-Centered Software Engineering (HCSE 2012), pp. 342-351.
Ferreira, J. (2012) User experience design and agile development: Integration as an on-going achievement in practice. Doctoral Thesis, The Open University, 262 pp.
Ferreira, J., Sharp, H. & Robinson, H. (2011) User experience design and agile development: managing cooperation through articulation work. Software: Practice and Experience, 41 (9). John Wiley & Sons, pp. 963-974
Fitzgerald, B. & Stol, K.-J. (2014) Continuous software engineering and beyond: trends and challenges. Proc. 1st International Workshop on Rapid Continuous Software Engineering (RCoSE 2014). ACM, New York, NY, USA, pp. 1-9.
Fox, D., Sillito, J. & Maurer, F. (2008). Agile methods and user-centered design: How these two methodologies are being successfully integrated in industry. Proc. AGILE'08, pp. 63-72. IEEE
Gay, G. (2004). Activity-Centered Design: An Ecological Approach to Designing Smart Tools and Usable Systems. MIT Press, 144 p.
Gonçalves, J. & Santos, C. (2011) POLVO - Software for Prototyping of Low-Fidelity Interfaces in Agile Development. Proc. 13th IFIP TC 13 International Conference on Human-Computer Interaction (INTERACT 2011), pp. 63-71.
Goodhue, D. L. & Thompson, R. L. (1995). Task-Technology Fit and Individual Performance. MIS Quarterly, 19 (2), pp. 213-236.
Gottesdiener, E. (1995). RAD realities: Beyond the hype to how RAD really works. Application Development Trends, 2 (8).
Gould, J. D. & Lewis, C. (1985). Designing for usability: key principles and what designers think. Communications of the ACM, 28 (3), pp. 300-311.
Grudin, J. (2012). A moving target - The evolution of human-computer interaction. In Jacko, J. (ed.), Human-Computer Interaction Handbook (3rd ed.). Taylor & Francis
Gulliksen, J., Boivie, I., Persson, J., Hektor, A. & Herulf, L. (2004). Making a difference: a survey of the usability profession in Sweden. Proc. Third Nordic Conference on Human-Computer Interaction. ACM, pp. 207-215.
Gulliksen, J. & Göransson, B. (2001) Reengineering the system development process for user-centered design. Proc. Interact 2001, pp. 359-366
Gulliksen, J., Göransson, B., Boivie, I., Blomkvist, S., Persson, J. & Cajander, Å. (2003). Key principles for user-centred systems design. Behaviour and Information Technology, 22 (6), pp. 397-409.
Hassenzahl, M. (2004). The interplay of beauty, goodness, and usability in interactive products. Human-Computer Interaction, 19 (4), pp. 319-349
Hassenzahl, M. (2005). The thing and I: Understanding the relationship between user and product. In M. Blythe et al. (eds.), Funology (Ch. 3). Kluwer
Hassenzahl, M. (2008). User Experience (UX): Towards an experiential perspective on product quality. Proc. 20th International Conference of the Association Francophone d'Interaction Homme-Machine. ACM, pp. 11-15.
Hassenzahl, M. & Tractinsky, N. (2006). User experience - a research agenda. Behaviour and Information Technology, 25 (2), pp. 91-97.
Henry, P. (2007). Process-UI alignment: new value from a new level of alignment. Align Journal, October 2007
Hewett, T., Baecker, R., Card, S., Carey, T., Gasen, J., Mantei, M., Perlman, G., Strong, G. & Verplank, W. (1996). ACM SIGCHI Curricula for Human-Computer Interaction.
Highsmith, J. (2000) Adaptive Software Development: A Collaborative Approach to Managing Complex Systems. Dorset House Publishing Co., 392 p. ISBN 0-932633-40-4
Highsmith, J. (2002) Agile Software Development Ecosystems. Addison-Wesley Longman Publishing Co., 404 pp.
Highsmith, J. & Cockburn, A. (2001) Agile software development: the business of innovation. Computer, 34 (9). IEEE Computer Society
Hoda, R., Noble, J. & Marshall, S. (2011) The impact of inadequate customer collaboration on self-organizing Agile teams. Information and Software Technology, 53 (5), pp. 521-534. ISSN 0950-5849
Hosseini-Khayat, A., Hellmann, T. D. & Maurer, F. (2010) Distributed and Automated Usability Testing of Low-Fidelity Prototypes. Proc. Agile Conference (AGILE 2010), pp. 59-66.
Indulska, M., Green, P., Recker, J. & Rosemann, M. (2009). Business process modeling: Perceived benefits. In Conceptual Modeling - ER 2009. Springer Berlin Heidelberg, pp. 458-471.
Innes, J. (2011). Why Enterprises Can't Innovate: Helping Companies Learn Design Thinking. In HCII 2011, LNCS 6769, pp. 442-448. Springer Berlin / Heidelberg
Ishikawa, K. (1976) Guide to Quality Control. 2nd ed., Asian Productivity Organization, 226 p.
ISO 9241-210:2010 (2010). Ergonomics of human-system interaction. Part 210: Human-centered design for interactive systems.
Jia, Y., Lárusdóttir, M. & Cajander, Å. (2012). The Usage of Usability Techniques in Scrum Projects. Proc. Human-Centered Software Engineering (HCSE'12), LNCS 7623, pp. 331-341.
Jokela, T. & Abrahamsson, P. (2004). Usability assessment of an Extreme Programming project: Close co-operation with the customer does not equal to good usability. Proc. Product Focused Software Process Improvement (PROFES'05), LNCS vol. 3009, pp. 393-407
Jurca, G., Hellmann, T. D. & Maurer, F. (2014) Integrating Agile and User-Centered Design: A Systematic Mapping and Review of Evaluation and Validation Studies of Agile-UX. Proc. Agile Conference (AGILE 2014), pp. 24-32. IEEE Computer Society
Kane, D. (2003). Finding a place for discount usability engineering in agile development: throwing down the gauntlet. Proc. Agile Development Conference (ADC 2003), pp. 40-46. IEEE Computer Society
Kaptelinin, V. (1996). Activity theory: Implications for human-computer interaction. In Nardi, B. A. (ed.), Context and Consciousness: Activity Theory and Human-Computer Interaction. MIT Press, 400 p.
Kaptelinin, V. & Nardi, B. (2012) Activity Theory in HCI: Fundamentals and Reflections. Morgan & Claypool Publishers, 106 p.
Kennerley, M. & Neely, A. (2003). Measuring performance in a changing business environment. International Journal of Operations & Production Management, 23 (2), pp. 213-229
Khan, K. S., Ter Riet, G., Glanville, J., Sowden, A. J. & Kleijnen, J. (2001). Undertaking systematic reviews of research on effectiveness: CRD's guidance for carrying out or commissioning reviews (No. 4, 2nd ed.). NHS Centre for Reviews and Dissemination.
Kitchenham, B. & Charters, S. (2007). Guidelines for performing Systematic Literature Reviews in Software Engineering. Version 2.3, EBSE Technical Report EBSE-2007-01, Software Engineering Group, School of Computer Science and Mathematics, Keele University, Keele, UK
Kuusinen, K. (forthcoming). Practices to support within-iteration UX work: A cross-functional team approach. Accepted to Integrating User Centred Design in Agile Development, HCI book series, Springer, to be published in 2016.
Kuutti, K. & Bannon, L. J. (2014). The turn to practice in HCI: towards a research agenda. Proc. 32nd Annual ACM Conference on Human Factors in Computing Systems (CHI '14). ACM, New York, NY, USA, pp. 3543-3552.
Lallemand, C., Gronier, G. & Koenig, V. (2015). User experience: A concept without consensus? Exploring practitioners' perspectives through an international survey. Computers in Human Behavior, 43, pp. 35-48. ISSN 0747-5632
Larman, C. & Basili, V. R. (2003). Iterative and incremental development: A brief history. Computer, 36 (6), pp. 47-56. IEEE Computer Society
Lárusdóttir, M. (2012) User Centred Evaluation in Experimental and Practical Settings. Doctoral Thesis, Kungliga Tekniska högskolan, 80 p.
Lárusdóttir, M. K., Cajander, Å. & Gulliksen, J. (2012). The Big Picture of UX is missing in Scrum Projects. Proc. International Workshop on the Interplay between User Experience (UX) Evaluation and System Development (I-UxSED 2012), pp. 49-54
Lárusdóttir, M., Cajander, Å. & Gulliksen, J. (2014). Informal feedback rather than performance measurements: user-centred evaluation in Scrum projects. Behaviour and Information Technology, 33 (11), pp. 1118-1135
Law, E. L.-C., Roto, V., Hassenzahl, M., Vermeeren, A. P. O. S. & Kort, J. (2009). Understanding, scoping and defining user experience: a survey approach. Proc. SIGCHI Conference on Human Factors in Computing Systems (CHI '09). ACM, New York, NY, USA, pp. 719-728.
Law, E. L.-C., Hassenzahl, M., Karapanos, E., Obrist, M. & Roto, V. (2015). Tracing links between UX frameworks and design practices: dual carriageway. Proc. HCI Korea, pp. 188-195. Hanbit Media, Inc.
Lazar, J., Feng, H. F. & Hochheiser, H. (2010) Research Methods in Human-Computer Interaction. John Wiley and Sons
Lee, T. M. & Park, C. (2008). Mobile technology usage and B2B market performance under mandatory adoption. Industrial Marketing Management, 37 (7), pp. 833-840.
Leontiev, A. N. (2014). Activity and consciousness. Revista Dialectus, (4).
Leszek, A. & Courage, C. (2008). The Doctor is "In" - Using the Office Hours concept to make limited resources most effective. Proc. AGILE Conference, pp. 196-201
Liker, J. (2004). The Toyota Way: 14 Management Principles from the World's Greatest Manufacturer. McGraw-Hill, 330 p.
Martin, J. (1991). Rapid Application Development. Macmillan Coll Div, 736 p.
May, E. L. & Zimmer, B. A. (1996). The evolutionary development model for software. Hewlett-Packard Journal, 47, pp. 39-41
Mao, J.-Y., Vredenburg, K., Smith, P. W. & Carey, T. (2001). User-centered design methods in practice: a survey of the state of the art. Proc. Conference of the Centre for Advanced Studies on Collaborative Research (CASCON '01), Stewart, D. A. & Johnson, J. H. (eds.). IBM Press
McCarthy, J. & Wright, P. (2004). Technology as Experience. MIT Press
McInerney, P. & Maurer, F. (2005). UCD in agile projects: dream team or odd couple? interactions, 12 (6), pp. 19-23.
Melo, C. O., Cruzes, D. S., Kon, F. & Conradi, R. (2013) Interpretative case studies on agile team productivity and management. Information and Software Technology, 55 (2), pp. 412-427
Memmel, T., Gundelsweiler, F. & Reiterer, H. (2007 a). CRUISER: A Cross-Discipline User Interface and Software Engineering Lifecycle. In Human-Computer Interaction. Interaction Design and Usability. Springer Berlin Heidelberg, pp. 174-183.
Memmel, T., Gundelsweiler, F. & Reiterer, H. (2007 b). Agile human-centered software engineering. Proc. 21st British HCI Group Annual Conference on People and Computers: HCI... but not as we know it, Vol. 1, pp. 167-175. British Computer Society
Miller, L. (2005). Case study of customer input for a successful product. Proc. Agile Conference '05, pp. 225-234. IEEE Computer Society
Miller, L. & Sy, D. (2009). Agile user experience SIG. In CHI '09 Extended Abstracts on Human Factors in Computing Systems (CHI EA '09). ACM, New York, NY, USA, pp. 2751-2754.
Moen, R. D. & Norman, C. L. (2010). Circling back. Quality Progress, 43 (11), pp. 22-28.
Nardi, B. A. (1996). Context and Consciousness: Activity Theory and Human-Computer Interaction. MIT Press, 400 p.
Norman, D. A. (2005). Human-centered design considered harmful. Interactions, 12 (4), pp. 14-19.
Ohno, T. (1988) Toyota Production System: Beyond Large-Scale Production. Productivity Press, 143 pp.
Palmer, S. R. & Felsing, M. (2001) A Practical Guide to Feature-Driven Development. Pearson Education, 299 p.
Patton, J. (2002). Hitting the target: adding interaction design to agile software development. In OOPSLA '02: OOPSLA 2002 Practitioners Reports, pp. 1-7. ACM Press
Peixoto, C. S. A. & da Silva, A. E. A. (2009) A Conceptual Knowledge Base Representation for Agile Design of Human-Computer Interface. Proc. Third International Symposium on Intelligent Information Technology Application (IITA 2009), pp. 156-160.
Pfeffer, J. (1982) Organizations and Organization Theory. Marshfield, MA: Pitman
PMI Institute (2004) A Guide to the Project Management Body of Knowledge. PMI Standards Committee
Poppendieck, M. & Poppendieck, T. (2003) Lean Software Development: An Agile Toolkit. Addison-Wesley Professional, 203 p.
Preece, J., Rogers, Y. & Sharp, H. (2002) Interaction Design: Beyond Human-Computer Interaction. John Wiley & Sons, New York, NY
Pressman, R. (2001) Software Engineering - A Practitioner's Approach, 5th ed. McGraw-Hill
Pressman, R. (2010) Software Engineering - A Practitioner's Approach, 7th ed. McGraw-Hill
Rajanen, M. (2011). Applying usability cost-benefit analysis - explorations in commercial and open source software development contexts. Doctoral Thesis. Acta Universitatis Ouluensis, Ser. A, Scient. rerum nat. 587
Ries, E. (2011). The Lean Startup: How Today's Entrepreneurs Use Continuous Innovation to Create Radically Successful Businesses. Crown Publishing.
Roddenberry, G. (writer), Piller, M. (writer) & Bole, C. (director) (1990). The Best of Both Worlds: Part 1 [Television series episode]. In Berman, R. (executive producer), Star Trek: The Next Generation. USA
Rohn, J. A. (2007). How to organizationally embed UX in your company. interactions, 14 (3), pp. 25-28.
Royce, W. (1970) Managing the development of large software systems. Proc. IEEE WESCON 26, pp. 1-9
Runeson, P. & Höst, M. (2009) Guidelines for conducting and reporting case study research in software engineering. Empirical Software Engineering, 14, pp. 131-164
Salah, D. (2013) Maturity models in the context of integrating agile development processes and user-centred design. Doctoral Thesis, University of York, 489 pp.
Salah, D., Paige, R. & Cairns, P. (2014) A systematic literature review on agile development processes and user centred design integration. Proc. 18th International Conference on Evaluation and Assessment in Software Engineering (EASE'14). ACM, Article 5, 10 p.
Salvador, C., Nakasone, A. & Pow-Sang, J. A. (2014). A systematic review of usability techniques in agile methodologies. Proc. 7th Euro American Conference on Telematics and Information Systems (EATIS '14). ACM, New York, NY, USA, Article 17, 6 p.
Sanders, E. B.-N. (2002). From user-centered to participatory design approaches. In Design and the Social Sciences. Taylor & Francis Books Limited
Sanders, E. B.-N., Brandt, E. & Binder, T. (2010). A framework for organizing the tools and techniques of participatory design. Proc. 11th Biennial Participatory Design Conference, pp. 195-198. ACM
Sanders, E. B.-N. & Stappers, P. J. (2008). Co-creation and the new landscapes of design. CoDesign, 4 (1), pp. 5-18.
Sanoff, H. (2006). Multiple views of participatory design. METU Journal of the Faculty of Architecture, 23 (2), pp. 131-143.
Schaffer, E. (2004). Institutionalization of Usability - A Step-by-Step Guide. Addison-Wesley, 276 p.
Schwaber, K. (1997) SCRUM Development Process. In Business Object Design and Implementation: OOPSLA'95 Workshop Proceedings. Springer London, pp. 117-134
Schwaber, K. (2004) Agile Project Management with Scrum. Microsoft Press
Schwaber, K. & Beedle, M. (2001) Agile Software Development with Scrum. Prentice Hall PTR, 158 pp.
Seffah, A., Gulliksen, J. & Desmarais, M. C. (eds.) (2005). Human-Centered Software Engineering. Springer, 391 p.
Seffah, A. & Metzker, E. (2004) The obstacles and myths of usability and software engineering. Communications of the ACM, 47 (12), pp. 71-76.
Sharp, H., Rogers, Y. & Preece, J. (2007). Interaction Design: Beyond Human-Computer Interaction, 2nd ed. John Wiley & Sons Inc., West Sussex, UK
Shewhart, W. A. (1939). Statistical Method from the Viewpoint of Quality Control. U.S. Department of Agriculture, reprinted by Dover, 45 pp.
Singh, M. (2008). U-SCRUM: An agile methodology for promoting usability. Proc. AGILE'08, pp. 555-560. IEEE
Sohaib, O. & Khan, K. (2010) Integrating usability engineering and agile software development: A literature review. Proc. International Conference on Computer Design and Applications (ICCDA 2010), Vol. 2, pp. V2-32 - V2-38. IEEE Computer Society
Sohaib, O. & Khan, K. (2011). Incorporating discount usability in extreme programming. International Journal of Software Engineering and Its Applications, 5 (1), pp. 51-62.
Stainer, A. & Stainer, L. (1998) Strategic change in public services: a productivity and performance perspective. Strategic Change, 7 (2), pp. 111-119.
Stapleton, J. (1997). DSDM, Dynamic Systems Development Method: The Method in Practice. Cambridge University Press, 163 p.
Sy, D. (2007). Adapting usability investigations for Agile user-centered design. Journal of Usability Studies, 2 (3), pp. 112-132
Takeuchi, H. & Nonaka, I. (1986). The New New Product Development Game. Harvard Business Review, January/February, pp. 285-305
Tenk (2012). Responsible conduct of research and procedures for handling allegations of misconduct in Finland. Guidelines of the Finnish Advisory Board on Research Integrity. Available at: http://www.tenk.fi/sites/tenk.fi/files/HTK_ohje_2012.pdf (visited 11 Aug 2015)
Tenk website. Finnish Advisory Board on Research Integrity. Available at: http://tenk.fi/en (visited 11 Aug 2015)
Ungar, J. (2008). The Design Studio: Interface design for agile teams. Proc. AGILE Conference, pp. 519-524. IEEE Computer Society
Venturi, G. & Troost, J. (2004). Survey on the UCD integration in the industry. Proc. Third Nordic Conference on Human-Computer Interaction, NordiCHI '04, vol. 82. ACM, New York, NY, pp. 449-452.
Vukelja, L., Müller, L. & Opwis, K. (2007). Are engineers condemned to design? A survey on software engineering and UI design in Switzerland. In Human-Computer Interaction - INTERACT 2007. Springer Berlin Heidelberg, pp. 555-568.
Vuolle, M. (2011). Measuring performance impacts of mobile business services from the customer perspective. Doctoral Thesis. TUT Publication 1013, Tampere University of Technology, Tampere. Suomen Yliopistopaino, 85 p.
Vuolle, M. & Käpylä, J. (2010). Theoretical evaluation models used in mobile work context. Proc. Ninth International Conference on Mobile Business / Ninth Global Mobility Roundtable, pp. 425-431.
Väänänen-Vainio-Mattila, K., Roto, V. & Hassenzahl, M. (2008). Towards practical user experience evaluation methods. In Law, E. L.-C., Bevan, N., Christou, G., Springett, M. & Lárusdóttir, M. (eds.), Meaningful Measures: Valid Useful User Experience Measurement (VUUM), pp. 19-22
Wale-Kolade, A. Y. (2015). Integrating usability work into a large inter-organisational agile development project: Tactics developed by usability designers. Journal of Systems and Software, 100, pp. 54-66.
Williams, L. & Cockburn, A. (2003). Agile software development: It's about feedback and change. Computer, 36 (6), pp. 39-43. IEEE Computer Society
Wright, P. C. & Blythe, M. (2007). User experience research as an inter-discipline: Towards a UX Manifesto. In Law, E., Vermeeren, A., Hassenzahl, M. & Blythe, M. (eds.), Towards a UX Manifesto - Proceedings of a COST-affiliated workshop on BHCI 2007, pp. 65-70.
Womack, J. P., Jones, D. T. & Roos, D. (1990). The Machine That Changed the World. Simon and Schuster, 323 p.
Yin, R. K. (2003 a). Applications of Case Study Research, 2nd ed. SAGE Publications, 173 p.
Yin, R. K. (2003 b). Case Study Research: Design and Methods, 3rd ed. SAGE Publications, 181 p.
APPENDIX 1 – Informed Consent for Interviews

INFORMED CONSENT
To Participate in an Agile UX Research Session

I will participate in an interview organized by Tampere University of Technology, Unit of Human-Centered Technology (IHTE) on ____ / ____ / ____.
The interview is part of a study that aims at developing best practices in Agile UX (user experience) work.

___ I give my consent for audio recording of this session
___ I do not give my consent for audio recording of this session

The data gathered during this session will be used only by the research personnel at IHTE and will not be disclosed to outside parties. Any information you give will be confidential and will be processed anonymously. The results of the study will be reported in aggregate and will not be connected to any identifying data of the participants.
Participant of the study:

_____________________________________
Signature of the participant

_____________________________________
Printed name

Research performer:

_____________________________________
Signature of the researcher

_____________________________________
Printed name

(Researcher fills in)
Identifier of the participant for anonymization: ________________________________________
APPENDIX 2 - Informed Consent for Study I Survey 1 Agile Working Practices
The purpose of this study is to examine the practices and routines in management, R&D and user experience design, and the cooperation between these groups. By answering the questionnaire you will advance the current understanding of working methods in your company and help in designing better practices for agile development.

This questionnaire is part of the Cloud Software Program. The survey is organized by the Unit of Human-Centered Technology (IHTE), which is part of the Department of Software Systems at Tampere University of Technology. The contact persons for this research are Kati Kuusinen, kati.kuusinen(at)tut.fi, and Santtu Pakarinen, santtu.pakarinen(at)tut.fi.

The questionnaire is anonymous. All the results and responses of the questionnaire will be gathered and reported so that respondents cannot be recognized. The names of attending companies will not be reported outside the Cloud Software Program. The questionnaire data will be available to the research personnel at IHTE only, and raw data will not be given to the attending companies.

The questionnaire includes several open and structured questions about the respondents' work and the working practices of the company in general. Answering the questionnaire takes approximately 45 minutes. You may answer either in English or Finnish.

You can move to the questionnaire by pressing the "Next" button below. You can suspend answering and continue later by pressing the "Break" button on any page. After that, follow the instructions on how to continue answering later. Thank you for your valuable contribution!
APPENDIX 3 – Study I Survey 1 Agile Working Practices

Your Work
You may answer either in English or Finnish.
1. Briefly describe your main tasks: ______________________________________________________________ ______________________________________________________________
2. The development process you follow in your work can be described as:
Incremental model
RAD (Rapid Application Development)
RUP (Rational Unified Process) or other UP
Scrum
Spiral model
V model
Waterfall or stage-gate model
XP (eXtreme Programming)
Other: ________________________________
I do not work following any development process
3. How would you estimate your level of professionalism with usability?
Novice ... Expert
4. How would you estimate your level of professionalism with agile working methods?
Novice ... Expert
5. How would you estimate your level of professionalism with user experience (UX)?
Novice ... Expert
Inadequate ... Adequate to successfully complete my assignments
6. In your opinion, what are the three most challenging issues in Agile UX work at the moment? By Agile UX work we mean designing and developing user experience in an agile product development environment.
________________________________________________________________ ________________________________________________________________
Terms
Please describe the following concepts in your own words; do not check the meanings from any source.
7. In your opinion, what does "user experience, (UX)" mean? ________________________________________________________________ ________________________________________________________________
8. In your opinion, what does "agile development" mean? ________________________________________________________________ ________________________________________________________________
9. In your opinion, what does "usability" mean? ________________________________________________________________ ________________________________________________________________
UX Cooperation
UX cooperation means the cooperation between UX specialists and you; if you are a UX specialist, it means your cooperation with other R&D functions. The UX team means the group of people responsible for UX issues in design.
10. In general, which are the most important tasks of the UX team? ________________________________________________________________ ________________________________________________________________
11. Which tasks in your work are related to user experience? ________________________________________________________________ ________________________________________________________________
12. How often do you communicate or work with UX specialists?
Daily
Weekly
Monthly
Few times a year
Yearly
Less than yearly
Never
I am a UX specialist
13. When is the UX cooperation at its best? ________________________________________________________________ ________________________________________________________________
14. When is the UX cooperation unsuccessful or frustrating, why? ________________________________________________________________ ________________________________________________________________
15. How would you improve the UX cooperation? ________________________________________________________________ ________________________________________________________________
UX Specialists' Contribution
16. Describe the work contribution of UX specialists during the last project you were involved in. ________________________________________________________________ ________________________________________________________________
17. What would have been the most desirable work contribution from UX specialists? ________________________________________________________________ ________________________________________________________________
18. In which parts of the product life cycle are you involved in product or software development? By product life cycle we mean the time from ideating possible products or services to the decline of the product or service.
________________________________________________________________ ________________________________________________________________
19. How are UX specialists involved in those parts of the product life cycle you mentioned in question 18? Please mention the phase and methods.
________________________________________________________________ ________________________________________________________________
20. How are UX specialists brought into development in those parts of the product life cycle you mentioned in question 18? Please mention the phase and method.
________________________________________________________________ ________________________________________________________________
Guides and User Interface Documentation
21. How often do you use design guides or guidelines?
We do not have any design guides or guidelines
Daily
Weekly
Monthly
Few times a year
Yearly
Less than yearly
Never, I do not need those in my work
Never, but I should use them
22. Which design guides or guidelines do you use? What is your opinion about them? Please give your opinion after the name or description of each guide or guideline.
________________________________________________________________ ________________________________________________________________
23. How many of the items in the UI specification are implemented? By user interface (UI) specification we mean any UI design documentation that is utilized when implementing the user interface.
Less than 10% of items are implemented
10% - 30% of items are implemented
30% - 50% of items are implemented
50% - 70% of items are implemented
70% - 90% of items are implemented
Over 90% of items are implemented
We do not have documented UI items
I do not know
24. How many of the implemented items are present in the UI specification? By user interface (UI) specification we mean any UI design documentation that is utilized when implementing the user interface.
Less than 10% of items are present in the UI specification
10% - 30% of items are present in the UI specification
30% - 50% of items are present in the UI specification
50% - 70% of items are present in the UI specification
70% - 90% of items are present in the UI specification
Over 90% of items are present in the UI specification
We do not have documented UI items
I do not know
25. How understandable is the user interface specification? By user interface (UI) specification we mean any UI design documentation that is utilized when implementing the user interface.
Not at all understandable
Slightly understandable
Somewhat understandable
Moderately understandable
Very understandable
Completely understandable
We do not have UI documentation
I do not deal with UI documentation
User Requirements
You may answer either in English or Finnish.
26. How are user requirements documented? ________________________________________________________________ ________________________________________________________________
27. How are the validity and scope of user requirements ensured? ________________________________________________________________ ________________________________________________________________
28. How complete are user requirements before any implementation is started?
Less than 10% of requirements are done
10% - 30% of requirements are done
30% - 50% of requirements are done
50% - 70% of requirements are done
70% - 90% of requirements are done
Over 90% of requirements are done
All requirements are done
I do not know
29. How are backlog items prioritized? ________________________________________________________________ ________________________________________________________________
30. If sprint backlog priorities change during the sprint, how does it affect your work? ________________________________________________________________ ________________________________________________________________
31. If sprint backlog priorities change during the sprint, how can UX design be adapted to these changes? ________________________________________________________________ ________________________________________________________________
UX Development
32. How does [COMPANYNAME] define user experience? ________________________________________________________________ ________________________________________________________________
33. How is UX supported in the architecture design? ________________________________________________________________ ________________________________________________________________
34. How and when are usability and user experience evaluated in the design? ________________________________________________________________ ________________________________________________________________
35. How is the realization of UX design ensured in development? ________________________________________________________________ ________________________________________________________________
Users
36. Which are the most essential user groups of your work output? Name and number the items on your list: 1. the most essential user group, 2. the second most essential user group, etc.
________________________________________________________________ ________________________________________________________________
37. How often do you discuss or work with users belonging to the most essential user group?
Daily
Weekly
Monthly
Few times a year
Yearly
Less than yearly
I do not discuss or work with users
38. Which are the applications, features or services whose development you are taking part in? ________________________________________________________________ ________________________________________________________________
39. What does good user experience mean in the applications, features or services whose development you are taking part in? ________________________________________________________________ ________________________________________________________________
40. Do you have enough valid information about the users of your work output? (compared to the information about users that you need in your work)
Not at all
All needed
Background Information
41. Year of birth (combo box of years from 1940 to 1995)
42. Gender
Female
Male
43. Title ________________________________
44. Office location (pre-defined list of office locations)
45. Organization (pre-defined list of organization names)
46. Team ________________________________
47. How long have you been working for [COMPANYNAME]? (combo box: Less than a year, 1, 2, …, 25 years)
48. Your work experience in total (combo box: Less than a year, 1, 2, …, 40, over 40 years)
49. Your level of education
Upper secondary general education
Upper secondary vocational education
Post-secondary non-higher vocational education
Polytechnic degree (Bachelor)
Lower university degree (Bachelor)
Higher polytechnic degree (Master)
Higher university degree (Master)
Licentiate's degree
Doctoral degree
50. Your field of education ________________________________
APPENDIX 4 – Study I Interview 1 Guides Arkkitehdit 1. Kerro lyhyesti mitä työnkuvaasi kuuluu 2. Miten osallistut käyttäjäkokemuksen suunnitteluun tai toteuttamiseen? a. (menetelmät, protot, vastuut UXiin/ käytettävyyteen liittyen) 3. Kuinka työsi vaikuttaa käyttäjiin tai käyttäjäkokemukseen? 4. Minkälaisia UX- ja käytettävyystavoitteita tuotteille asetetaan? a. (milloin tavoitteet asetetaan, miten tavoitteisiin pääseminen varmistetaan, mittarit) 5. Mitä työltäsi odotetaan? (mistä mahdollisesti palkitaan, UX) a. Onko toimenkuvaasi tai tavoitteisiisi määritelty UX-asioita/-vastuita 6. Milloin arkkitehtuurit suunnitellaan? a. (missä prosessin vaiheessa, kuinka suhtautuu myöhemmin tuleviin muutoksiin) 7. Kuinka arkkitehtuuri suunnitellaan a. (huomioon otettavat asiat, mikä on tärkeää arkkitehtuurisuunnittelussa) 8. Kuinka hankit/varmistat riittävän ymmärryksen käyttäjän tarpeista ja tehtävistä voidaksesi suunnittella arkkitehtuurin? a. Miten osallistut käyttäjien tarpeiden ja tehtävät selvittämiseen? 9. Miten UX ja käytettävyys huomioidaan arkkitehtuurin suunnittelussa? (arkkitehtuurit, asiakaslähtöisyys) 10. Miten teet yhteistyötä UX designerien kanssa (milloin, kenen) 11. Käytetäänkö arkkitehtuurien suunnittelun apuna usability patterneja tai usability skenaarioita? (kuinka suunnitellaan) 12. Käytättekö Epicejä? (miten, mihin, miten jalostetaan user storieiksi, käytetäänkö Use Caseja) 13. Milloin käyttäjät ovat mukana tuotekehityksessä? (miten osallistuvat) 14. Osallistutko käyttäjäsessioihin, miten saat tietoa session tuloksista 15. Miten käyttäjäpalautetta kerätään/ mistä käyttäjäpalautetta saadaan? 16. Miten käyttäjäpalaute dokumentoidaan, miten ja milloin hyödynnetään 17. Miten itse hyödynnät käyttäjäpalautetta tai käyttäjätietoa työssäsi? Architects 1. Briefly tell about your job content 2. How do you participate in designing for user experience a. (methods, responsibilities in UX or usability) 3. How does your work affect on users / user experience 4. 
What kind of UX and usability goals are set for product or software 5. What is expected from you in your job? (are you rewarded of something) 6. When is architecture designed? a. (during which phase of the process, how are later changes managed) 7. How is architecture designed? a. What needs to be considered, what is important in architecture design 8. How do you gain (sufficient) understanding of user needs and tasks (to design architecture)? a. How do you participate in defining user needs and tasks? 9. How are UX and usability taken care of when designing architecture 10. How do you cooperate with UX designers? (when, with whom)
108
11. Are usability patterns or usability scenarios used in the architecture design? Why? 12. Do you use epics? How and for what? How are epics translated into user stories? Do you utilize use cases? When & how? 13. How do users participate in product or software development? When? 14. Do you attend these user sessions? How do you get information about session results? 15. How is user feedback gathered or received? 16. How is the feedback documented and how and when is it utilized in design and development 17. How do you utilize user feedback or information about users in your work? Developerit 1. Kerro lyhyesti mitä työnkuvaasi kuuluu: 2. Mitä työltäsi odotetaan? (mistä mahdollisesti palkitaan, UX) 3. Miten olet UX-tiimin kanssa tekemisissä? a. Miten toimii, minkälaisissa asioissa olet tekemisissä, missä vaiheessa yhteistyötä tehdään b. Mitä saat UX-tiimiltä, kuinka valmista / iteroitteko yhdessä, kuinka 4. Kuinka agiilisti / iteratiivisesti UX toimii 5. Miten osallistut käyttäjäkokemuksen tekemiseen? a. (menetelmät, protot, vastuut UXiin/ käytettävyyteen liittyen) b. Kuinka työsi vaikuttaa käyttäjiin / käyttäjäkokemukseen 6. Minkälaisia UX- ja käytettävyystavoitteita tuotteille asetetaan? a. (milloin tavoitteet asetetaan, miten tavoitteisiin pääseminen varmistetaan, mittarit) 7. Miten selvitetään käyttäjien todelliset tarpeet ja tehtävät? a. miten ne esitetään/tulee teille, osallistutteko 8. Milloin käyttäjät ovat mukana tuotekehityksessä? (miten osallistuvat) 9. Miten käyttäjäpalautetta kerätään/ mistä käyttäjäpalautetta saadaan? (miten dokumentoidaan, miten ja milloin hyödynnetään) 10. Miten voit vaikuttaa Product Managementin tekemään listaan (uuden) tuotteen ominaisuuksista? (toteutuuko muutosehdotukset, millä perusteilla priorisoidaan, asioiden riippuvuudet toisiinsa) SM tekee listan toiminnallisuuksista, joka muokataan PB:ksi 11. Miten ja mihin perustuen tuotteen käytettävyys- ja käyttäjäkokemusvaatimukset määritellään? 
(käytetäänkö userstoreja, use caseja) 12. Miten user storyt tehdään? (featureista) a. ketkä osallistuvat 13. Kuka aikatauluttaa Sprint Backlog Itemit? 14. Miten pystytte vaikuttamaan omaan työmäärään sprintin aikana? (sprintin suunnittelu) 15. Miten value määritellään? (syntyykö sitä joka sprintissä, miten varmistetaan ja todennetaan) 16. Onko Acceptance Criteriat käytössä? a. mitä ovat, miten luodaan, kuka tekee, miten varmistetaan, että ovat valideja ja toteuttuva b. UX acceptance criteria 17. Miten työt jakautuvat sprintin aikana tiimin kesken? 18. Miten työ jakaantuu sprinttien ja muun työn välillä, jos?
109
19. Osallistutteko ennen sprinttejä tehtävään työhön? (onko UX tehty täysin valmiiksi ennnen dev-sprinttiä, tehdäänkö UXia samanaikaisesti kun toteutetaan) 20. Kuinka done määritellään? a. (onko määritelty, muuttuuko välillä, miten määritellään, että työ on tehty hyvin) b. UX done? 21. Mitä kaikkia SCRUMin omaisia tapaamisia teillä on? a. review, retrospektiivit, 5% meetings, mitä niissä käsitellään, mitä hyötyä niistä on b. Osallistuuko UX 22. Miten reviewssa tulleet muutokset vaikuttavat työhön? (siirretäänkö seuraavaan sprinttiin, miten otetaan vastaan 23. Mitä menetelmiä käytätte työn edistymisen seurantaan? (burndown chart, technical/ design debt) 24. Miten ristiriidat/erimielisyydet tuotteen omaisuuksista/ toteutuksesta hoidetaan? a. esim. UX- ja DEV-tiimien välillä 25. Mitkä käytettävyyteen ja UXiin liittyvät asiat tulisi olla devaajien vastuulla? 26. Mitä DEV-tiimi tekee projektien ulkopuolella? (mitä DEV tekee tiiminä) 27. Jos voisit muuttaa tai parantaa jotain tuotekehityksessä tai ux:n tekemisessä, mitä tekisit? Developers 1. Briefly tell about your job content 2. What is expected from you in your job? (are you rewarded of something) 3. With whom do you cooperate? 4. How do you participate in designing for user experience a. methods, responsibilities in UX or usability b. how does it work, when (at which phase) c. what do you get from the UX team, how ready-made /do you iterate it together (dev&UX) 5. How agile or iterative is the UX team? 6. How does your work affect on users / user experience 7. What kind of UX and usability goals are set for product or software , a. when the goals are set, how do you ensure to reach goals meters 8. How are users’ real needs and tasks resolved/clarified/found out? a. How are those represented or gave to you? 9. When do users attend design and development? (RD) 10. How is user feedback gathered or received? a. How is user fb documented, how and when is fb utilized 11. 
SM makes a list of functionalities before PB is created. How can you contribute on this list of functionalities a. (or PB creation) b. Is the list modified based on your suggestions? How are functionalities prioritized (toteutusjärjestys vai se mitä ylipäätään toteutetaan jos jotain pitää jättää pois), c. How are dependencies of functionalities considered? 12. How and based on what are usability or UX requirements for a product defined? a. Are user stories or use cases utilized 13. How are user stories created? (from features) a. Who attends
110
14. Who schedules sprint backlog (items)? 15. How can you influence on your own workload during sprint 16. (How is value defined/specified? a. Is value generated in each sprint, how is it ensured and verified) 17. Do you have acceptance criteria, which/what kind of? a. How are they created? By whom? How is it ensured that they are valid and get realized 18. How is work divided between developers inside a sprint? 19. Do you work outside sprints? a. If, how is your work divided into work in sprints and other work? 20. Do you participate in work conducted before first sprint? a. How, what is done before starting sprints 21. How is ‘done’ defined? a. Is it defined, does it vary/change, when work is done well b. UX ‘done’ 22. Which scrum-like meetings do you have? a. review, retrospctives, 5% meetings, what do you think of those b. Does UX team or a UX specialist attend to those 23. How does changes made in scrum reviews affect on your work? a. Is work transferred into the next sprint, how is it regarded? 24. What tools/methods are used to follow how the work is progressing? b. Burndown chart, technical/design debt 25. How are disagreements in design / functionalities / implementation handled? c. Between development & UX team 26. Which issues related to UX and usability should be on developers’ responsibility? 27. (What other work than development projects does your development team do? d. (how many dev teams do you have, do you cooperate with other teams)) 28. If you could change or improve something in design and development, what would it be e. Concerning UX (work/team) Product Managerit 1. Kerro lyhyesti mitä työnkuvaasi kuuluu: 2. Mitä työltäsi odotetaan? (mistä mahdollisesti palkitaan, UX) 3. Miten businessvaatimukset syntyy / kerätään? a. Miten valitaan mitä vaatimuksia toteutetaan seuraavassa releasessa? b. UX-näkökulma? 4. Miten businessvaatimukset muutetaan featureiksi? 5. Kuinka laatimaanne listaa tuotteen ominaisuuksista iteroidaan? a. 
(toteutuuko muutosehdotukset, millä perusteilla priorisoidaan, asioiden riippuvuudet toisiinsa) b. Missä vaiheessa lista esitetään RD:lle ja UX:lle 6. Miten featuret tehdään listaksi, joka lähetetään eteenpäin? a. priorisoidaanko lista jo tässä vaiheessa, kuka priorisoi, miten priorisoidaan, miten UX/käytettävyys huomioidaan tässä vaiheessa, millä perusteilla UX/ käytettävyysvaatimukset tehdään, miten toiminnallisuuksiin päädytään, toiminnallisuuksien riippuvuuksien huomioiminen b. Perustuuko tutkimukseen vai mutuun
111
7. 8.
9. 10. 11. 12. 13. 14. 15. 16. 17. 18. 19. 20. 21.
Kuinka päädytään ensimmäiseen varsinaiseen Product Backlogiin? (ketkä tekee, miten value määritellään, miten UX value huomioidaan Käytättekö Epicejä? a. miten, mihin, miten jalostetaan user storieiksi (dev), käytetäänkö Use Caseja b. epic < feature? Vai toisin päin Miten käytettävyys ja käyttäjäkokemus näkyy tuote-/ businessstrategiassa. a. Onko UX sisällytetty tuotestrategiaan, miten Miten UX huomioidaan varhaisessa vaiheessa? (arkkitehtuurit) Miten ja mihin perustuen tuotteen käytettävyys- ja käyttäjäkokemusvaatimukset määritellään? (käytetäänkö userstoreja, use caseja) Minkälaisia UX- ja käytettävyystavoitteita tuotteille asetetaan? a. (milloin tavoitteet asetetaan, miten tavoitteisiin pääseminen varmistetaan, mittarit) Miten osallistut käyttäjäkokemuksen tekemiseen? a. (menetelmät, protot, vastuut UXiin/ käytettävyyteen liittyen) Miten olet UX-tiimin kanssa tekemisissä? a. Miten toimii, minkälaisissa asioissa Kuinka agiilisti / iteratiivisesti UX toimii Miten selvitetään käyttäjien todelliset tarpeet ja tehtävät? a. miten ne esitetään/tulee teille, osallistutteko Milloin käyttäjät ovat mukana tuotekehityksessä? (miten osallistuvat) Miten käyttäjäpalautetta kerätään/ mistä käyttäjäpalautetta saadaan? (miten dokumentoidaan, miten ja milloin hyödynnetään) Miten palautetta saadaan? (minkälaista palautetta, miten palaute huomioidaan, muuttaako palaute esim. designia) Jos voisit muuttaa tai parantaa jotain tuotekehityksessä tai UX:n tekemisessä, mitä tekisit? Miten ideat syntyvät jo ennen tuotteistamista? (mistä ne tulee, kerätäänkö niitä jonnekin, mitä tehdään ennen tuotteistuspäästöstä, Huomioidaanko UX/käytettävyys jo tässä vaiheessa)
Product Managers 1. Briefly tell about your job content1 2. What is expected from you in your job? (are you rewarded of something)2 3. How do you participate in designing for user experience14 a. methods, responsibilities in UX or usability b. how does it work, when (at which phase) c. what do you get from the UX team, how ready-made /do you iterate it together (dev&UX) 4. How agile or iterative is the UX team?15 5. How does your work affect on users / user experience13 6. What kind of UX and usability goals are set for product or software12 a. when the goals are set, how do you ensure to reach goals, meters 7. How are users’ real needs and tasks resolved/clarified/found out?16 a. How are those represented or gave to you? 8. When do users attend design and development? (concepting, RD)17 9. How is user feedback gathered or received?18 a. How is user fb documented, how and when is fb utilized
112
10. How and based on which issues are usability and UX requirements defined11 a. Are user stories or use cases utilized 11. How is the list of functionalities you create iterated?5 a. When is it represented to RD & UX are other people consulted? b. How do you change the list based on those suggestions, how do you prioritize functionalities, dependencies? 12. How are ideas generated before starting RD project? a. Where do these ideas come from, how are those collected and documented b. What is done before making the decision to start an RD project, is UX considered in this point? 13. How is UX (and usability) present in product or business strategy?9 a. Is UX included in product strategy, how? 14. How is UX considered in the early phase?10 a. Ideating, consepting, making business requirements 15. How are business requirements translated into features?3 16. How are features worked into a list that is presented to RD and UX team?4 a. Is the list prioritized before presented to RD? Who decides the priorities? b. ( How) is UX/usability considered at this phase? How are usability requirements refined? c. How are functionalities selected for the list? Dependencies between functionalities? d. Is this phase based on research or own opinions? 17. Do you use epic stories? How? For what?6 a. Which one is larger or created first, epic or feature 18. How is feedback gathered or received?8 a. Which kind of feedback, how is feedback considered or handled, does feedback change design? 19. How is the first list of functionalities transferred into product backlog19 a. Who does it/ attends? How is value (vs. effort) defined, how is UX value considered? 20. If you could change or improve something in design and development, what would it be7 f. Concerning UX (work/team) Product Owners 1. Kerro lyhyesti mitä työnkuvaasi kuuluu 2. Mitä työltäsi odotetaan? (mistä mahdollisesti palkitaan, UX) 3. Miten osallistut käyttäjäkokemuksen tekemiseen? a. 
(menetelmät, protot, vastuut UXiin/ käytettävyyteen liittyen) 4. Miten olet UX-tiimin kanssa tekemisissä? 5. Minkälaisia UX- ja käytettävyystavoitteita tuotteille asetetaan? a. (milloin tavoitteet asetetaan, kuka ne asettaa, miten tavoitteisiin pääseminen varmistetaan, mittarit) 6. Kuinka hankit riittävän ymmärryksen käyttäjistä ja heidän tarpeistaan (oman työsi kannalta) a. Kuinka selvitetään käyttäjien todelliset tarpeet ja tehtävät? (miten ne esitetään R&D:lle) 7. Miten käytettävyys ja käyttäjäkokemus näkyy tuote-/ businessstrategiassa.
113
8. 9. 10. 11. 12. 13. 14. 15. 16. 17. 18. 19. 20.
Miten ja mihin perustuen tuotteen käytettävyys- ja käyttäjäkokemusvaatimukset määritellään? Miten UX huomioidaan varhaisessa vaiheessa? a. (product managerien lista, konseptisuunnittelu, backlogi, arkkitehtuuri) Käytättekö Epicejä? a. (miten, mihin, miten jalostetaan user storieiksi, käytetäänkö Use Caseja) Kuinka alkuperäinen (SM:n) featurelista priorisoidaan? (kuka/ ketkä osallistuu, millä perusteella priorisoidaan, miten tiedetään asioiden riippuvuudet toisiinsa) Kuinka päädytään ensimmäiseen varsinaiseen Product Backlogiin? (ketkä tekee, miten value määritellään, miten UX value huomioidaan) Miten UX näkyy PBI:ssä? Onko UX:lle Acceptance Criteriat käytössä? (mitä ovat, miten luodaan, kuka tekee, miten varmistetaan, että ovat valideja ja toteuttuvat) Syntyykö (UX-)valueta joka sprintissä? (miten varmistetaan ja todennetaan) Mihin scrum tapaamisiin (review, retrospektiivit, 5% meetings) osallistut? a. mitä niissä käsitellään, mitä hyötyä niistä on määritelläänkö UX-asioille ”done”? Mitä "done" tarkoittaa? (onko määritelty, muuttuuko välillä, miten määritellään, että työ on tehty hyvin, kuinka mitataan) Milloin käyttäjät ovat mukana tuotekehityksessä? (miten osallistuvat) a. Osallistutko käyttäjäsessioihin? Mistä saat tietoa niiden tuloksista? Mistä saat käyttäjäpalautetta (tai tietoa käyttäjistä) Kuinka hyödynnät käyttäjäpalautetta työssäsi a. Miten käyttäjäpalautetta kerätään/ mistä käyttäjäpalautetta saadaan? (miten dokumentoidaan, miten ja milloin hyödynnetään)
Quality Engineers 1. Kerro lyhyesti mitä työnkuvaasi kuuluu: 2. Mitä työltäsi odotetaan? (mistä mahdollisesti palkitaan, UX) 3. Missä vaiheessa olet mukana tuotekehityksessä? a. Ketkä päättää mitä testataan? 4. Millä tavoin annat palautetta testatuista asioista? 5. Kun löydät jotain puutteita, mitä sitten tapahtuu? a. Missä vaiheessa olevaa tuotetta testaat 6. Käytättekö alihankintaa? 7. Miten osallistut käyttäjäkokemuksen tekemiseen? a. menetelmät, protot, vastuut UXiin/ käytettävyyteen liittyen 8. Minkälaisia UX- ja käytettävyystavoitteita tuotteille asetetaan? a. milloin tavoitteet asetetaan, miten tavoitteisiin pääseminen varmistetaan, mittarit 9. Missä muodossa tavoitteet / vaatimukset teille esitetään? 10. Miten UX:aa testataan? 11. Onko Acceptance Criteriat käytössä? (mitä ovat, miten luodaan, kuka tekee, miten varmistetaan, että ovat valideja ja toteutuvat) 12. Miten selvitetään käyttäjien todelliset tarpeet ja tehtävät? (miten ne esitetään R&D:lle) 13. Milloin käyttäjät ovat mukana tuotekehityksessä? (miten osallistuvat) 14. Miten käyttäjäpalautetta kerätään/ mistä käyttäjäpalautetta saadaan? (miten dokumentoidaan, miten ja milloin hyödynnetään)
114
a. Support, receiving UX feedback (ark)
15. What Scrum-like meetings do you have? (reviews, retrospectives, 5% meetings, what is discussed in them, what benefit they provide)
16. If you could change or improve something in product development or in UX work, what would you do?
Scrum Masters
1. Briefly describe your job responsibilities
2. What is expected from your work? (what you may be rewarded for, UX)
3. How do you interact with the UX team? a. How does it work, in what kinds of matters
4. (How agilely / iteratively does UX work)
5. How do you participate in creating the user experience? a. (methods, prototypes, responsibilities related to UX/usability)
6. What kind of UX and usability goals are set for the products? a. (when the goals are set, how reaching the goals is ensured, metrics)
7. How are users' real needs and tasks clarified? a. how are they presented to you / reach you, do you participate
8. When are users involved in product development? (how do they participate)
9. How is user feedback collected / where does user feedback come from? (how it is documented, how and when it is utilized)
10. How can you influence the list of new product features made by Product Management? (are change proposals realized, on what grounds is prioritization done, dependencies between items)
11. How and based on what are the product's usability and user experience requirements specified? (are user stories or use cases used)
12. How is value defined? (is it created in every sprint, how is it ensured and verified)
13. Are acceptance criteria in use? a. what they are, how they are created, who creates them, how it is ensured that they are valid and met b. UX acceptance criteria
14. What kinds of problems are encountered in UX? (how are they handled and who handles them)
15. How is work divided within the team during a sprint?
16. How is work divided between sprints and other work, if applicable?
17. Do you participate in work done before the sprints? (is the UX completely finished before the development sprint, is UX done in parallel with implementation)
18. How is "done" defined? a. (has it been defined, does it change at times, how is it determined that the work has been done well) b. UX done? (has it been defined, does it change at times, how is it determined that the work has been done well)
19. Which Scrum-like meetings do you attend? a. reviews, retrospectives, 5% meetings, what is discussed in them, what benefit they provide b. Does UX participate
20. How do changes arising in reviews affect the work? a. are they moved to the next sprint, how are they received
21. What methods do you use for tracking work progress? (burndown chart, technical/design debt)
22. How are conflicts about product features/implementation, e.g. between the UX and DEV teams, resolved?
23. Which usability and UX related issues should be the developers' responsibility?
24. What does the DEV team do outside projects? (what does DEV do as a team)
25. If you could change or improve something in product development or in UX work, what would you do?
UX Manager
1. Briefly describe your job responsibilities
2. What is expected from your work? (what you may be rewarded for, UX)
3. How do you interact with the UX team? a. How does it work, in what kinds of matters b. Self-direction / leadership of the UX team
4. The UX team's tasks (design, leading/enabling UX work, templates, practices) a. What roles does the UX team consist of; is the composition good
5. How agilely / iteratively does the UX team work a. How agilely is UX done b. Schedules / timeliness
6. In which direction should UX work be developed? a. Mission, vision b. [COMPANYNAME] UX c. Is a zero sprint in use
7. Starting from functionalities? a. Functionality > user story b. Is starting from functionalities OK; user needs c. Feature-driven R&D?
8. Do product managers take UX into account?
9. How is UX taken into account in early-phase design? a. (customer orientation, concepts, architectures)
10. What kind of UX and usability goals are set for the products? a. (when the goals are set, how reaching the goals is ensured, metrics)
11. How is UX visible in the company and company culture? (how is its standing promoted)
12. UX and company functions other than R&D (marketing, sales)?
13. If you could change or improve something in product development or in UX work, what would you do?
UX Team Members
1. Briefly tell about your job content
2. What is expected from you in your work? a. What are you rewarded for?
3. Are you working in a UX team or as an individual UX specialist?
4.
Do you work by Scrum? (Explain "product or service": the product, software, service, or features that are worked on in the project you are working for)
5. Which kind of UX and usability goals are set for the software?
a. When those goals are set, who sets them
b. How is it ensured that the goals are met?
c. Are acceptance criteria in use for UX i. Please give an example: what kind of criteria do you use, how are criteria items created, by whom, how is it ensured that acceptance criteria are valid and met during development
d. Definition of done for UX issues? i. What does done mean in UX issues? Have you defined it, does it change during development, how is it validated that the output is good
6. How is UX value defined? a. Is it created/added up during each sprint, how is it ensured and validated?
7. How are user needs and tasks refined? When?
8. How and based on which issues are usability and user experience requirements specified? a. How are those described to developers? b. Do you use user stories or use cases c. Please describe which kind of user stories / use cases you have
9. Business requirements, technical requirements, user requirements… how are those combined? a. How are disagreements solved? Who makes the decision?
10. How agile is your UX work currently? a. Do you work in sprints; do you have your own sprints b. Is UX work mainly conducted before or during development sprints? c. How can you affect your amount of work / workload? i. Selecting items during the sprint, sprint planning
11. Which Scrum meetings do you attend? a. workshops, meetings, reviews etc. b. which Scrum meetings do you have in general c. sprint review, retrospectives, 5% workshops i. which issues are discussed there, how do you or your work benefit from those?
12. How do changes made in sprint reviews affect your work? a. Is work postponed to following sprints; how do people react to that?
13. Which methods do you use for following the progress of the UX work? a. burndown chart, technical/design debt
14. Which UX related issues or tasks should be somebody else's responsibility than the UX team's? a. Whose responsibility: developers, managers, product owner, scrum master, architect, tester, customer
15. With whom do you cooperate in your work? a. architects b. developers c. quality engineers d. product managers e. product owners f. scrum masters g. other UX specialists h. customer i. user
16. What does the UX team do outside projects; what do you do as a team? a. Do you cooperate with the other UX team members? b. Do you also work outside delivery projects?
17. When and how are users involved in design and development?
18. How is user feedback gathered, or from where do you get user feedback? a. How do you document and utilize user feedback
19. Can you say something about UX work in [COMPANY/ORGANIZATION NAME] in general? What is your understanding of the situation?
20. If you could change or improve something in your current ways of working, maybe concerned with UX work, what would you do?
APPENDIX 5 – Study I Survey 2 (Follow-Up Survey) UX practices in [Companyname]
In this survey, by UX related work we mean any work that contributes to understanding or defining user needs, designing and developing to meet the user needs, and evaluating or ensuring that the needs are being met. The work can be conducted by any project member; we do not limit the definition of UX work to a certain role (e.g. UX specialist).
1. Please select the business line about which you will be answering this survey * Select the business line you are mainly working for or otherwise find the most familiar to you.
[BUSINESSLINE NAME 1] [BUSINESSLINE NAME 2] [BUSINESSLINE NAME 3]
Please answer the following questions considering the business line you selected above. You do not need to be knowledgeable of the situation within the whole business line; you can answer based on your own experience.
2. How satisfied are you with UX related work in the business line currently? *
not at all 1 2 3 4 5 6 7 completely
3. Please list 1-3 issues in UX related work in the business line that you are currently most DISSATISFIED with: (negative issues)
1. ______________________________________________________________ ________________________________________________________________ 2. ______________________________________________________________ ________________________________________________________________ 3._______________________________________________________________ ________________________________________________________________
4. Please list 1-3 issues in UX related work in the business line that you are currently most SATISFIED with: (positive issues)
1._______________________________________________________________ ________________________________________________________________ 2._______________________________________________________________ ________________________________________________________________ 3._______________________________________________________________ ________________________________________________________________
5. How would you improve the current situation? ________________________________________________________________ ________________________________________________________________
6. Please position the following issues on the fourfold table according to 1. how well the business line succeeds in the issue (horizontal), and 2. how important the issue is for project success (vertical). * E.g. if you consider that a certain issue is highly important for project success but your business line has been performing poorly on the issue, place that issue at the top-left corner. To place a point (in order from A to K): just click on the table, do not try to drag&drop. To replace: select the item you want to move from the list on the left and click on the table again.
A. Maintaining the big picture of the project
B. Agility of UX work
C. Timing of UX work
D. Cooperation between the UX specialist and developers
E. Cooperation between the UX specialist and the product owner
F. Understanding user needs
G. Fulfilling user needs
H. Getting user feedback during development
I. Welcoming late change
J. Project team competence
K. Ensuring the quality of UX during implementation
[Fourfold table axes: vertical = Importance for project success (Insignificant to Significant); horizontal = Organization performance (Poor to Excellent)]
7. How do you think the following issues have changed in the business line since last year? * Please refer to the current situation in the business line and to the situation that prevailed a year ago in the organization you were then working for. If you did not work for [Companyname] a year ago, please refer to how you think the situation has changed in [Companyname].
Scale: --- worsened greatly, -- worsened somewhat, - worsened slightly, 0 no change, + improved slightly, ++ improved somewhat, +++ improved greatly
Each of the following items is rated on the scale worsened (---) to improved (+++):
Maintaining the big picture of the project
Agility of UX work
Timing of UX work
Cooperation between the UX specialist and developers
Cooperation between the UX specialist and the product owner
Understanding user needs
Fulfilling user needs
Getting user feedback during development
Welcoming late change
Project team competence
Ensuring the quality of UX during implementation
8. Any comments? You can write here anything on your mind considering UX related work in your company or this study. If you want to comment on the previous two questions, please give the letter (A-K) of the issue you are commenting on and the number of the question.
________________________________________________________________ ________________________________________________________________
APPENDIX 6 – Study II Pilot Survey
Agile UX Weekly Barometer for the [PROJECTNAME] project
1. The big picture of the [PROJECTNAME] project is clear enough for me
no 1 2 3 4 5 6 7 yes
2. I believe that the user experience of [APPNAME] will be
bad 1 2 3 4 5 6 7 good
3. Did you cooperate with the UX person this week?
no
yes, and the cooperation was successful
yes, but the cooperation was unsuccessful
4. Which of the following did you participate in or do this week? Was the UX person involved? Please select all that apply.
(For each activity, mark whether you did it without the UX person or with the UX person:)
Created concepts
Clarified user requirements
Clarified end user definitions or target user groups
Planned user studies or user tests
Conducted a user study
Conducted user testing
Created UI designs
Reviewed UI designs
Created architecture designs
Created or groomed product backlog
Planned a feature
UX person helped me to understand the design
I helped the UX person to understand the technology
Determined how to implement design details
Made changes to the design
Reviewed the implementation
Had a demo session
Other UX-related activity, please specify _____
Other UX-related activity, please specify _____
5. This week, I had to do rework because of UX Here, by UX we mean any issues that are related to user experience or usability. Rework is any work where you redo or correct something already made (excluding normal iterative work).
not at all 1 2 3 4 5 6 7 very much
6. User interface design was Please answer only if you worked with UI design this week. By UI design we mean any documentation or guidance on how to implement user flow, UI structure or such. Usually UI pictures, prototypes, UI specification, animation etc.
not understandable 1 2 3 4 5 6 7 completely understandable
not implementable 1 2 3 4 5 6 7 completely implementable
7. Any comments: Please use this field to tell us anything you want; e.g. clarification to an answer (please provide the number of the question), something that bothered you this week, suggestions for improvement etc.
________________________________________________________________ ________________________________________________________________
APPENDIX 7 – Study II Pilot Survey for UX Specialists
1. The big picture of the [PROJECTNAME] project is clear enough for me
no 1 2 3 4 5 6 7 yes
2. I believe that the user experience of [SOFTWARENAME] will be
bad 1 2 3 4 5 6 7 good
3. Which of the following did you participate in or do this week? Who were involved? Use e.g. the following roles: Project members: product owner (PO), project manager (PM), scrum master (SM), developer (dev), architect (arc), etc. Others: other UX specialist (UX), management (MG), sales (SA), customer (CU), user (US), etc. You can use the above-mentioned abbreviations for these roles. If you work with other roles or projects, please include those too.
(For each activity, mark whether you did it alone or with someone; if with someone, write who were involved:)
Created concepts ________
Clarified user requirements ________
Clarified end user definitions or target user groups ________
Planned user studies or user tests ________
Conducted a user study ________
Conducted user testing ________
Created UI designs ________
Reviewed UI designs ________
Created architecture designs ________
Created or groomed product backlog ________
Planned a feature ________
Helped others to understand the design ________
Implemented UI ________
Determined how to implement design details ________
Made changes to the design ________
Reviewed the implementation ________
Had a demo session ________
Other UX-related activity, please specify _______ ________
Other UX-related activity, please specify _______ ________
4. This week, I had to do rework because of UX Here, by UX we mean any issues that are related to user experience or usability. Rework is any work where you redo or correct something already made (excluding normal iterative work).
not at all 1 2 3 4 5 6 7 very much
5. The UX work I did this week was (each on a 1 to 7 scale):
too late 1 2 3 4 5 6 7 early enough
concentrating on insignificant issues 1 2 3 4 5 6 7 concentrating on significant issues
unsuccessful 1 2 3 4 5 6 7 successful
not agile 1 2 3 4 5 6 7 agile
6. Briefly describe this week: E.g. your tasks in general, the overall situation of the project, any bothering issues, what should have been done differently, what was good this week. Please ensure your answer is unambiguous (e.g. so that we can understand whether you mean something was good or bad, or a general description, etc.).
________________________________________________________________ ________________________________________________________________
7. Any comments: Please use this field to tell us anything you want; e.g. clarification to an answer (please provide the number of the question), suggestions for improvement etc.
________________________________________________________________ ________________________________________________________________
APPENDIX 8 – Study II Pilot Interview Guide
Goals of Piloting and related interview questions
1. Job description / designation
2. Close cooperation between competences (UX/tech/business)
   a. What kind of cooperation do you have with UX specialists in this project / with development / with business owners / with the customer / with users?
      i. What communication channels do you use (and would prefer)? (face-to-face, Live Meeting, email, phone…) [Interviewer note: distinguish cooperation with each group (i.e., if the answer is e.g. "by email", confirm who they are talking about)]
         1. Any issues related to communication channels?
         2. What would you change or improve related to communication? Why?
      ii. When and in which situations do you communicate with UX/dev/business/customer/user? In which phase of the project did you start the communication? Is it continuous/regular or ad hoc (when needed)?
      iii. What is the content of the communication? Which kinds of issues are discussed?
3. Involving users in R&D, user point of view
   a. Which kind of users / user groups does the service or product you are developing have?
   b. How and when are user needs clarified?
   c. Which kind of needs (motives) do users have related to the service/product you are developing?
   d. How are users involved in R&D? [Note: this question was moved from position a to d]
      i. In which phases? Who is responsible?
      ii. In general, how is information about users collected?
      iii. How is the collected information about users utilized?
4. The UX person (person who is doing UX work) is doing the "right" things
   a. Who is doing UI designs in your project? (alone or in cooperation with someone; who?)
      i. How are designs communicated to developers? (pictures, animations, email)
      ii. Are developers involved in making UX designs?
      iii. What happens if developers disagree with the design?
   b. Who has designed (will design) the user flow? (alone or in cooperation with whom? Is the architect involved? Is the UX person involved?)
   c. Do you have a designated person for UX work? Do you utilize his/her competence in your project?
      i. In which situations? What does the UX person do? When do you need him/her?
      ii. Who coordinates the cooperation, or how is it started and when?
   d. Who makes decisions on whether and when the UX person is needed in the project?
5. Everybody is responsible for UX issues
   a. Where do you get information about users? (from a person, a data source, etc.)
      i. Where and how would you prefer getting the information?
   b. Which kind of information do you need about users?
      i. Currently, do you have enough information about users? Can you get it if you want?
   c. Which kind of goals do you have in your project concerning user experience?
      i. How is the realization of those ensured?
   d. Have you had education or training related to user experience?
   e. Is there UX guidance, or are there guidelines or style guides, available for you?
      i. Do you utilize UX guides or guidelines in your work? Would you need those?
   f. How is the realization of the UX design ensured during development?
   g. How are UX issues covered in daily or weekly meetings?
6. Measuring the outcome (user evaluations etc.)
   a. How have you evaluated the UX/usability of the previous release of your product?
   b. How was the product? Which kind of defects did you find?
      i. How do you manage those in the next release (release number)?
   c. Which kind of criteria do you have for a successful product/service? When can the product be considered a success?
7. (Agile and iterative process)
Other questions:
Is your team distributed or co-located? Do you have a dedicated person for communication between sites?
UX person: How are you involved with other UX persons?
UX person: How have you been introduced to the project?
UX person: How do you communicate with project management?
APPENDIX 9 – Study II Weekly Survey for Team Members of Teams with a UX Specialist
Weekly survey about user experience development in agile software projects 1. The big picture of the project is clear enough for me Please answer all the questions concerning the particular project mentioned in the email.
no 1 2 3 4 5 6 7 yes
2. I think that the user experience of the software we are currently developing will be
bad 1 2 3 4 5 6 7 good
3. Did you cooperate with the UX specialist this week? Please refer to the particular UX specialist mentioned in the email.
no
yes, and the cooperation was successful
yes, but the cooperation was unsuccessful. Please describe:_______________
4. Which of the following did you participate in or do this week in the project? Was the UX specialist involved? Please select all that apply.
(For each activity, mark whether you did it without the UX specialist or with the UX specialist:)
Created concepts
Clarified user requirements
Clarified end user definitions or target user groups
Planned user studies or user tests
Conducted a user study
Conducted user testing
Created UI designs
Reviewed UI designs
Created architecture designs
Created or groomed product backlog
Planned a feature
Shared understanding of the UI design
Shared understanding of the technology
Determined how to implement UI design details
Made changes to the UI design
Reviewed the implementation
Had a demo session
Other UX-related activity, please specify ________
Other UX-related activity, please specify _______
5. How many hours did you work for the project this week? ________________________________ 6. This week, I had to do rework because of UX Here, by UX we mean any issues that are related to user experience or usability. Rework is any work where you redo or correct something already made (excluding normal iterative work). Please answer concerning the particular project.
not at all 1 2 3 4 5 6 7 very much
7. User interface design was Please answer only if you worked with UI design in the project this week. By UI design we mean any documentation or guidance on how to implement user flow, UI structure or such. Usually UI pictures, prototypes, UI specification, animation etc.
not understandable 1 2 3 4 5 6 7 completely understandable
not implementable 1 2 3 4 5 6 7 completely implementable
8. Considering the project work this week, how do you feel? (each on a 1 to 7 scale)
frustrated 1 2 3 4 5 6 7 motivated
sad 1 2 3 4 5 6 7 happy
9. Any comments: Please use this field to tell us anything you want; e.g. clarification to an answer (please provide the number of the question), something that bothered you this week, suggestions for improvement etc.
________________________________________________________________ ________________________________________________________________
APPENDIX 10 - Study II Weekly Survey for UX Specialists Weekly survey about user experience development in agile software projects
1. The big picture of the project is clear enough for me Please answer all the questions concerning only the particular project mentioned in the email.
no 1 2 3 4 5 6 7 yes
2. I think that the user experience of the software we are currently developing will be
bad 1 2 3 4 5 6 7 good
3. Which of the following did you participate in or do in the project this week? Who were involved? Use, for instance, the following roles: Project members: product owner (PO), project manager (PM), scrum master (SM), developer (dev), architect (arc), etc. Others: other UX specialist (UX), management (MG), sales (SA), customer (CU), user (US), etc. You can use the above-mentioned abbreviations for these roles. If you work with other roles or projects, please include those too.
(For each activity, mark whether you did it alone or with someone; if with someone, write who were involved:)
Created concepts _____________________
Clarified user requirements _____________________
Clarified end user definitions or target user groups _____________________
Planned user studies or user tests _____________________
Conducted a user study _____________________
Conducted user testing _____________________
Created UI designs _____________________
Reviewed UI designs _____________________
Created architecture designs _____________________
Created or groomed product backlog _____________________
Planned a feature _____________________
Shared understanding of the UI design _____________________
Implemented UI _____________________
Determined how to implement UI design details _____________________
Made changes to the UI design _____________________
Reviewed the implementation _____________________
Had a demo session _____________________
Other UX-related activity, please specify _______ _____________________
Other UX-related activity, please specify ________ _____________________
4. How many hours did you work for the project this week? ________________________________
5. How many of your project hours were UX/UI related work?
all or almost all
most of them
about half
some of them
none or almost none
6. The UX work I did this week was (each on a 1 to 7 scale):
too late 1 2 3 4 5 6 7 too early
concentrating on insignificant issues 1 2 3 4 5 6 7 concentrating on significant issues
unsuccessful 1 2 3 4 5 6 7 successful
not agile 1 2 3 4 5 6 7 agile
7. Considering the project work this week, how do you feel?
frustrated 1 2 3 4 5 6 7 motivated
sad 1 2 3 4 5 6 7 happy
8. Briefly describe this week: E.g. your tasks in general, the overall situation of the project, any bothering issues, what should have been done differently, what was good this week. Please ensure your answer is unambiguous (e.g. so that we can understand whether you mean something was good or bad, or a general description, etc.).
________________________________________________________________ ________________________________________________________________
9. Any comments: Please use this field to tell us anything you want; e.g. clarification to an answer (please provide the number of the question), suggestions for improvement etc.
________________________________________________________________ ________________________________________________________________
APPENDIX 11 – Study II Weekly Survey for Team Members of Teams without a UX Specialist Weekly survey about user experience development in agile software projects (without a UX specialist)
1. The big picture of the project is clear enough for me Please answer all the questions concerning the particular project mentioned in the email.
no 1 2 3 4 5 6 7 yes
2. I think that the user experience of the software we are currently developing will be
bad 1 2 3 4 5 6 7 good
3. In your opinion, would the project have needed UX competence this week? Please select one or more of the 'no' OR 'yes' options (not both 'no' and 'yes').
No, we did not do/plan anything that affects the user experience in the project this week
No, we had the needed competence in the project
No, the user experience does not need to be that good
No, other reason, please describe:______________________________
Yes, to better understand user needs or goals
Yes, to make better design decisions considering the user flow and UI
Yes, to get feedback of the current design of user flow and UI
Yes, other reason, please describe:_____________________________
4. Which of the following did you participate in or do this week in the project? Please report your own tasks truthfully as they occurred this week.
Created concepts
Clarified user requirements
Clarified end user definitions or target user groups
Planned user studies or user tests
Conducted a user study
Conducted user testing
Created UI designs
Reviewed UI designs
Created architecture designs
Created or groomed product backlog
Planned a feature
Determined how to implement UI design details
Made changes to UI design
Reviewed the implementation
Had a demo session
Other UX-related activity, please specify: ____________________________
Other UX-related activity, please specify: ____________________________
5. Let's imagine you had a UX specialist working for the project. Which of the following tasks would he or she have done or participated in this week to be beneficial for the project? If there are no such tasks, please select the last row "none".
Created concepts
Clarified user requirements
Clarified end user definitions or target user groups
Planned user studies or user tests
Conducted a user study
Conducted user testing
Created UI designs
Reviewed UI designs
Created architecture designs
Created or groomed product backlog
Planned a feature
Determined how to implement UI design details
Made changes to UI design
Reviewed the implementation
Had a demo session
Other UX-related activity, please specify: ____________________________
Other UX-related activity, please specify:_____________________________
None
6. How many hours did you work for the project this week? ________________________________ 7. This week, I had to do rework because of UX Here, by UX we mean any issues that are related to user experience or usability. Rework is any work where you redo or correct something already made (excluding normal iterative work). Please answer concerning the particular project.
not at all 1 2 3 4 5 6 7 very much
8. User interface design was Please answer only if you worked with UI design in the project this week. By UI design we mean any documentation or guidance on how to implement user flow, UI structure or such. Usually UI pictures, prototypes, UI specification, animation etc.
not understandable 1 2 3 4 5 6 7 completely understandable
not implementable 1 2 3 4 5 6 7 completely implementable
9. Considering the project work this week, how do you feel?
frustrated 1 2 3 4 5 6 7 motivated
sad 1 2 3 4 5 6 7 happy
10. Any comments: Please use this field to tell us anything you want; e.g. clarification to an answer (please provide the number of the question), something that bothered you this week, suggestions for improvement etc.
________________________________________________________________
APPENDIX 12 – Study II Retrospect Survey for Product Owners Project Retrospect Survey - Product Owner
The survey contains 24 questions on four pages: 1. your opinion about the project, 2. communication in the project, 3. budget and schedule of the project, and 4. some background information about the participant. Please respond to all the following questions considering your opinion about the project that you have been reporting on in the weekly survey.
1. In your opinion, how successful has the project been in general? *
Failure 1 2 3 4 5 6 7 Success
2. How satisfied are you with the UX related work in the project? * By UX work we mean any work that contributes to understanding or defining user needs, designing and developing to meet the user needs, and evaluating or ensuring that the needs are being met. The work can be conducted by any project member; we do not limit the definition of UX work to a certain role (UX specialist).
Not at all 1 2 3 4 5 6 7 Completely
3. What has been good in the work related to user experience in the project? How? ________________________________________________________________ ________________________________________________________________
4. How would you improve the work related to user experience in the project (or similar projects)? You can consider e.g. the focus and timing of the work, working methods, results, communication, agility, task allocation etc.
________________________________________________________________ ________________________________________________________________
5. Please position the following issues on the fourfold table according to 1. how well the project has succeeded in the issue (horizontal), and 2. how important the issue is for project success (vertical). E.g. if you consider that a certain issue is highly important for project success but your project has performed poorly on the issue, place that issue at the top-left corner.
To place a point (in order from A to K): just click on the table, do not try to drag&drop. To replace: select the item you want to move from the list on the left and click on the table again.
A. Maintaining the big picture of the project
B. Agility of UX work
C. Timing of UX work
D. Cooperation between the UX specialist and developers
E. Cooperation between the UX specialist and the product owner
F. Understanding user needs
G. Fulfilling user needs
H. Getting user feedback during development
I. Welcoming late change
J. Project team competence
K. Ensuring the quality of UX during implementation
[Fourfold table axes: vertical = Importance for project success (Insignificant to Significant); horizontal = Project performance (Poor to Excellent)]
6. Please write here if you have comments on the fourfold question (question 5.), e.g. what in particular has been good or could be improved. Please refer to the letter of the issue (A-K) to make clear what you are commenting on.
________________________________________________________________ ________________________________________________________________
7. In your opinion, do/did you have a UX specialist participating as a project team member in your project? Please refer to the UX specialist (if applicable) mentioned in the invitation email. Respond as it feels for you, not as it is defined in the project setup.
Yes, the UX specialist is our development team member
Yes, the UX specialist is our project team member but not a development team member
No, the UX specialist is working for the project externally on a steady basis
No, but we can consult an external UX specialist when needed
No, we do not have a UX specialist working for the project at all
8. Considering the previous question (question 7), how satisfied are you with the situation? (1 = Not at all … 7 = Completely)
Please write on the text field if you want to comment on your answers on questions 7 or 8. E.g. why are you (not) satisfied with the situation?
________________________________
9. How often do you meet face-to-face with the following?
A = we do not have this role in our project, B = I am the sole person with this role in our project
1 = never, 2 = once or twice during a project, 3 = once or twice in a release cycle, 4 = once or twice in an iteration (development cycle), 5 = several times in an iteration, 6 = daily
Rated on the scale A 1 2 3 4 5 6 B: the project's developers, the project's project manager(s), the project's product owner(s), the project's architect(s), the project's UX specialist(s), the project's customer(s), the user(s) of the project outcome
10. How often do you communicate with the following by means other than meeting face-to-face (phone calls, email messages, instant messages, online meetings etc.)?
A = we do not have this role in our project, B = I am the sole person with this role in our project
1 = never, 2 = once or twice during a project, 3 = once or twice in a release cycle, 4 = once or twice in an iteration (development cycle), 5 = several times in an iteration, 6 = daily
Rated on the scale A 1 2 3 4 5 6 B: the project's developers, the project's project manager(s), the project's product owner(s), the project's architect(s), the project's UX specialist(s), the project's customer(s), the user(s) of the project outcome
11. Which of the following methods do you use in the project for communicating the UX or UI design to developers?
Please note that the examples of communication tools are provided only to make the options more understandable. Some of the mentioned tools can be used for multiple purposes (e.g. MS Lync can be used for voice-only conversations or instant messaging as well).
(Scale: never … always)
Face-to-face communication
Electronic voice conversation (such as phone or Skype)
Web conferencing including video and audio (such as MS Lync, Adobe Connect, Skype)
Sharing tasks in a software backlog tool (such as Jira)
Sharing tasks in a physical backlog tool (such as a paper board on a wall)
Instant messaging
Email messaging
File sharing (such as intranet or cloud storage services)
Other, please specify: ________________________________
12. In which forms is the UX or UI design communicated to developers in the project? How often have the following means been used in the project? Please note that the examples of tools are provided only to make the options more understandable. Some of the mentioned tools can be used for multiple purposes (e.g. producing wireframe or high-fidelity images).
(Scale: never … always)
High-fidelity or photorealistic images of UI screens (such as Adobe Photoshop)
Wireframe images with explanation texts (interaction and navigation presented with e.g. text and arrows)
References to style guides (guidance for selecting the right UI components and styles)
Paper prototypes (physical low-fidelity models, e.g. cardboard, post-it notes)
Low-fidelity software prototypes (sketches of screens, mockups, storyboards, e.g. Balsamiq)
Mid-fidelity software prototypes (fairly detailed but simple and approximate models with simulated functionality, e.g. MS PowerPoint, MS Visio, Axure RP, OmniGraffle)
High-fidelity software prototypes (detailed graphics with some actual functionality, often a simulated back-end, such as AppSketcher, FluidUI, Adobe Flash Catalyst)
Working software, source code (such as HTML and CSS)
Other, please specify: ______________________________
13. Timing: When the UX design has been communicated for development, it has been... Please read through all the options before answering and select the one that most appropriately describes the project.
· When the design is "too late", development has to wait for the design or implement without getting the design.
· When the design is "too early", it leads to excess unimplemented design inventory.
mostly too late
sometimes too late, mostly in time
mostly in time
sometimes too early, mostly in time
mostly too early
inconsistent: sometimes too early, sometimes too late and sometimes in time
14. Readiness for development: When the UX design has been communicated for development, it has been... Please read through all the options before answering and select the one that most appropriately describes the project.
· By "too ready-made" we refer to design that has not been made iteratively enough: there has not been proper discussion between developers and the UX designer, and therefore the project has had to rework the design, make costly development decisions, or discard the UX design or parts of it.
· By "convenient readiness" we refer to design that is suitable for implementation.
· By "too undefined" we refer to design that is hard to implement because it lacks significant information or is hard to understand.
mostly too ready-made
sometimes too ready-made, mostly with convenient readiness
mostly with convenient readiness
sometimes too undefined, mostly with convenient readiness
mostly too undefined
inconsistent: sometimes too ready-made, sometimes too undefined and sometimes with convenient readiness
15. Schedule and budget of the project: The project was...
-- = clearly under budget / clearly ahead of schedule; 0 = on budget / on schedule; ++ = clearly over budget / clearly behind schedule
Budget: -- - 0 + ++ (from under budget to over budget)
Schedule: -- - 0 + ++ (from ahead of schedule to behind schedule)
16. Were there any UX-related reasons for being on/under/over the budget? Please describe: ________________________________________________________________ ________________________________________________________________
17. Were there any UX-related reasons for being on/ahead of/behind schedule? Please describe: ________________________________________________________________ ________________________________________________________________
Some information about you This data will be used to give an overview of the participant population in academic publications. Some of the data can be used also as background variables during analysis. This data will not be reported to the participating projects or companies as such.
18. Gender * Female Male Prefer not to respond
19. Year of birth * (combo box: >1995, 1995, …, 1943)

… .496, and in measurement U (N = 29, df = 27): r > .471.

Principal component analysis (PCA). PCA is a multivariate statistical method used for extracting the important information from data and compressing the data set size by discarding other information, thus analyzing the structure of the data (Abdi et al. 2010). Principal components are obtained as linear combinations of the original variables, and each component has the largest possible variance under the constraint that it must be orthogonal to the preceding components (Abdi et al. 2010). We conducted PCA with SPSS to detect structure in the data and to reduce the correlated observed variables to a smaller set of UX items. We used Varimax with Kaiser normalization as the rotation method. The number of extracted principal components was selected based on eigenvalue (>1), and coefficients with absolute value less than 0.5 were suppressed in the analysis.

Scale reliability / internal consistency. We calculated Cronbach's Alpha coefficients for the created principal components to measure the internal consistency of the items loaded on each component. We interpret the alpha according to Nunnally (1978) and use 0.70 as the threshold of acceptable consistency. Generally, a correlation coefficient of 0.7 – 0.9 indicates high correlation, whereas 0.5 – 0.7 indicates moderate correlation.
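The extraction procedure described above (PCA on the item correlation matrix, the eigenvalue > 1 criterion, and Varimax rotation) can be sketched outside SPSS roughly as follows. This is a minimal NumPy illustration under our own function names, not the exact SPSS computation:

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Classic SVD-based Varimax rotation of a loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    total = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3
                          - (1.0 / p) * rotated @ np.diag((rotated ** 2).sum(axis=0))))
        rotation = u @ vt
        if s.sum() < total * (1 + tol):   # no further improvement
            break
        total = s.sum()
    return loadings @ rotation

def pca_loadings(data):
    """PCA on the item correlation matrix, keeping components with eigenvalue > 1."""
    corr = np.corrcoef(data, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]              # sort eigenvalues descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    keep = eigvals > 1.0                           # Kaiser criterion
    loadings = eigvecs[:, keep] * np.sqrt(eigvals[keep])
    return eigvals, varimax(loadings)
```

Loadings with absolute value below 0.5 would then be suppressed when reporting, as was done for Table 5.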
4 Results
We start by presenting results of the principal component analysis and continue by presenting results of the assessments of agile team members and users.
5.1 Principal component analysis (PCA)
The 16 measured items loaded into four components in PCA (Table 5). Item scores in Table 5 indicate the strength of correlation between the item and the component. The first four principal components account for 69.12% of the variation (Figure 1). Table 6 presents the internal consistency of each component indicating the extent to which items in the component measure the same dimension of UX.
Table 5 Rotated component matrix presents significant component loadings of PCA. Rotation converged in 9 iterations using Varimax with Kaiser normalization in SPSS. The data consists of measurements A and U, N = 55.

Item | Component 1 | Component 2 | Component 3 | Component 4
Motivating – Discouraging | .811 | | |
Fun – Dull | .794 | | |
Promotes creativity – Suppresses creativity | .767 | | |
Presentable – Unpresentable | .609 | | |
Aesthetic – Unaesthetic | .567 | | |
Innovative – Conservative | .556 | | |
Easy to use – Difficult to use | | .797 | |
Easy to learn – Hard to learn | | .774 | |
Fast to use – Slow to use | | .736 | |
Desirable – Undesirable | | .533 | |
Good – Bad | | .635 | .515 |
Useful – Useless | | | .714 |
Recommendable – Not Recommendable | | | .583 |
Professional – Amateurish | | | | .845
Convincing – Unconvincing | | | | .673
Reliable – Unreliable | | | | .531
Figure 1 Scree plot for the variables. The cumulative percentage of variance for the first four components is 69.12. The first principal component explains 44.59% of the variance, the second 10.42%, the third 7.79%, and the fourth 6.31%.

Table 6 Internal consistency of principal components

Component name | Cronbach's Alpha | N of items in component
Motivation | .873 (good) | 6
Productivity | .813 (good) | 5
Usefulness | .749 (acceptable) | 3
Professionalism | .687 (questionable) | 3
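The alpha values reported for each component follow Cronbach's formula over the item variances and the variance of the summed scale; a minimal sketch (illustrative, not the SPSS implementation):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items matrix of one component."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                                # number of items
    item_vars = items.var(axis=0, ddof=1).sum()       # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)         # variance of the summed scale
    return k / (k - 1) * (1 - item_vars / total_var)
```

Three perfectly correlated items give alpha = 1.0, while unrelated items give a value near zero; adding items to a consistent component raises alpha, which is why the three-item components above score lower.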
Based on the strongest correlations of each component, we named the generated components as follows: 1. Motivation, 2. Usability and willingness to use, 3. Usefulness, 4. Professionalism. Items in each component vary similarly.

The first component explains the system's ability to motivate the user via positive affect. It consists of the following items: motivating, fun, promotes creativity, presentable, aesthetic, and innovative. It contains items from the categories of affective and aesthetic quality and stimulation defined during the review. This component holds many items related to traditional hedonic quality and is also in line with stimulation as defined by Hassenzahl (2005).

The second principal component measures usability and is connected with the user's willingness to use the system. The following items loaded on the second component: easy to use, easy to learn, fast to use, and desirable. In addition, the item good partially loaded on this component. Based on the presence of the items desirable and good alongside traditional usability metrics, this component can be interpreted as indicating that if the perceived usability of the system is low, users in general are not willing to use the system. The second component contains items from the productivity, interaction quality, appeal, and overall system quality categories defined in (Sundberg 2015).

The third component measures the scope of the system: how well it fits its purpose and how useful it is. It is correlated with overall satisfaction and recommendability. The fourth principal component seems to relate to work-related use itself and to the system's appropriateness for professional use. It contains the items professional, convincing, and reliable. The component can also be associated with the plausibility of the system's ability to complete required tasks.

These results from our work-related sample indicate that in a work context the dimensions of UX might not be the same as in leisure systems, and UX items might measure different aspects in work-related and leisure systems. For instance, professional has been connected with aesthetic quality in leisure systems – the system looks professional instead of amateurish. In our study it was connected with the items convincing and reliable. Still, the basic dimensions of hedonic and pragmatic quality were clearly present in our study. The first principal component explained the majority of traditional hedonic UX aspects, whereas the second one explained the majority of traditional instrumental qualities of UX.
4.1 Estimating and Predicting UX
In this section, we present results of the empirical study concerning how users and team members assessed UX.
4.1.1 Users' Evaluation of Projects' UX Goals

We asked team members to list the one to three most and least important UX goals for the project. In all projects, team members emphasized the importance of pragmatic aspects of UX. The most often mentioned UX goals were the following: easy to use (18 of the 26 participants mentioned this), easy to learn (13 mentions), and fast to use (13 mentions). Each of these goals was mentioned in all six projects by at least one team member. Fun (16 mentions) and promoting creativity (13 mentions) were named the least important UX goals in every project. This result was expected, since pragmatic aspects, productivity in particular, are often emphasized in enterprise system development (Innes 2011).

We analyzed how users evaluated the items teams considered the most and least important UX goals compared to other UX items. Users did not give higher assessments for these dimensions compared to other dimensions; fast to use was in fact amongst the lowest-scored items. Users gave the highest evaluations for the following dimensions: good (6.10), useful (6.10), and recommendable (6.07), while the lowest were the following: fun (4.52), promotes creativity (5.00), aesthetic (5.14), and fast to use (5.14). The mean of users' overall UX evaluation was 5.69, while the mean over all the UX dimensions was 5.60.

4.1.2 Differences between Measurements
When evaluating the UX of the outcome, team members were more critical when they were asked to evaluate as they thought a member of a particular user group would (measurement B) than when they responded as themselves (measurement A). The mean evaluations were systematically lower in measurement B than in measurement A. We compared the mean values of each item separately per project and found that in 59.5% of the cases the mean value in measurement A was higher than in measurement B, while the value of measurement B was higher in only 13.5% of the cases. All the roles (developers, POs, and UXSs) systematically gave lower assessments in measurement B than in measurement A. However, when comparing team members' assessments (A and B) to users' assessments (U), only for UXSs and POs did putting themselves in the users' role bring their UX assessments closer to users' (measurement B was closer to measurement U). However, the number of UXSs and POs in our sample is too small for generalizations.

There was a statistically significant difference between team members' and users' responses in six UX items when team members were asked to respond as they thought users would (comparison of measurements B and U) (Table 8). The equity of distribution across users' (U) and team members' responses was greater when team members were asked to respond as themselves (measurement A); in that comparison (measurements A and U), the null hypothesis remained for all items. The distribution of cases where user evaluation was higher than team evaluation and vice versa was relatively even when comparing measurement U with measurement A (U is higher in 46.88% and lower in 42.71% of the cases, Table 7), whereas when comparing measurement U with measurement B, cases where user evaluation was higher than team evaluation were overrepresented: user evaluation was higher in 64.58% and lower in 28.13% of the cases.
Based on the above, developers were overly critical in their responses in measurement B: developers' evaluations corresponded to users' evaluations better when they were not trying to predict the user assessment. In contrast, both POs' and UXSs' assessments were closer to users' assessments when they put themselves in the users' place. On average, developers assessed UX items 0.29 points lower than users when assessing as themselves (measurement A) and 0.54 points lower than users when they tried to predict users' assessments (measurement B) (on a seven-level scale). POs' assessments in measurement A were 0.21 points higher than users' and in measurement B 0.06 points lower than users', on average. UXSs' assessments were on average 0.17 points higher than users' in measurement A and 0.1 points lower than users' in measurement B. We consider POs' and UXSs' assessments quite consistent with users' assessments, while developers' assessments differed from those of users. Given that in the participating projects UXSs and POs handled communication with users while developers' understanding of users and their needs remained shallow (Kuusinen 2015), we conclude that trying to empathize with users seems to be unsuccessful when understanding of the user is lacking. However, our sample included responses from only three UXSs and six POs, and thus we want to be cautious with our conclusions.
Table 7 Distribution of differences between users' (measurement U) and team members' (measurements A and B) mean evaluations, grouped by the direction of the difference. The mean difference between measurement U and measurement A or B is presented in brackets.

 | Measurement A | Measurement B
U is higher (mean difference) | 46.88% (0.58) | 64.58% (0.70)
U is lower (mean difference) | 42.71% (0.55) | 28.13% (0.59)
U and A or B are equal | 10.42% | 7.29%
In general, team members' evaluations varied more between measurements A and B for items measuring non-instrumental quality. We compared team members' responses between measurements A and B with the Wilcoxon test using the following null hypothesis: "the median of differences between measurements A and B for each UX item separately is zero"; that is, there is no difference between measurements A and B item-wise. The null hypothesis was rejected for the following items:
· good (Z = -2.828, p = .005)
· motivating (Z = -2.942, p = .003)
· fun (Z = -2.503, p = .012), and
· innovative (Z = -2.183, p = .029).
Thus, team members changed their evaluation most for the abovementioned dimensions.
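A paired comparison of this kind can be reproduced with SciPy's Wilcoxon signed-rank test. The ratings below are synthetic and purely illustrative, not the study data:

```python
import numpy as np
from scipy.stats import wilcoxon

# Paired 7-point ratings for 26 hypothetical team members: measurement B is
# at or below measurement A, mimicking the reported pattern of lower
# "as the user" assessments.
a = np.array([5, 6, 4, 7, 5, 6, 5, 4, 6, 7, 5, 5, 6,
              4, 5, 6, 7, 5, 4, 6, 5, 6, 5, 7, 6, 5], dtype=float)
b = a - np.tile([0.0, 1.0, 2.0], 9)[:26]   # zero differences are dropped by the test

stat, p = wilcoxon(a, b)   # H0: the median of the paired differences is zero
```

Here every nonzero difference is positive, so the smaller rank sum (the reported statistic) is 0 and the null hypothesis is rejected.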
Table 8 Results of tests of equity between user and team responses when team members were asked to respond as they think users would. The test statistic's grouping variable is respondent type (user or team member). Rejection of the null hypothesis (p < .05) is marked with an asterisk.

UX item | Mann-Whitney U | Z | Asymp. Sig. (2-tailed)
Easy to learn – Hard to learn | 229.000 | -2.667 | .008 *
Fast to use – Slow to use | 362.500 | -.259 | .795
Easy to use – Difficult to use | 271.000 | -1.874 | .061
Reliable – Unreliable | 328.500 | -.856 | .392
Desirable – Undesirable | 255.500 | -2.126 | .033 *
Recommendable – Not Recommendable | 258.500 | -2.171 | .030 *
Good – Bad | 217.500 | -2.883 | .004 *
Useful – Useless | 330.500 | -.855 | .392
Motivating – Discouraging | 231.500 | -2.602 | .009 *
Fun – Dull | 332.500 | -.769 | .021 *
Aesthetic – Unaesthetic | 321.000 | -.983 | .458
Professional – Amateurish | 334.000 | -.769 | .442
Convincing – Unconvincing | 277.500 | -1.798 | .072
Presentable – Unpresentable | 316.000 | -1.078 | .281
Promotes creativity – Suppresses creativity | 307.500 | -1.213 | .225
Innovative – Conservative | 287.500 | -1.561 | .119
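The user-versus-team comparisons above use the Mann-Whitney U test for independent samples; with SciPy and synthetic illustrative ratings (not the study data):

```python
from scipy.stats import mannwhitneyu

# Hypothetical 1-7 ratings of one UX item from users and team members
# (synthetic values, not the study data).
users = [6, 7, 6, 5, 7, 6, 6, 5, 7, 6]
team = [3, 4, 3, 2, 4, 3, 4, 3, 2, 4]

u_stat, p = mannwhitneyu(users, team, alternative='two-sided')
```

In this extreme example every user rating exceeds every team rating, so U equals n1 x n2 = 100 and the difference is significant.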
4.1.3 Assessments of Overall UX and Need Fulfillment
We compared evaluations of measured UX dimensions to the evaluation of overall UX with Pearson's product-moment correlation (Table 9).
Table 9 Significant correlations (Pearson's r, p < .1) between overall UX evaluation scores and measured UX items per measurement. N = 26 in measurements A and B and N = 29 in measurement U. Correlations found only in measurement U are in italics.

Measurement A | Measurement B | Measurement U
Good (.729) | Easy to use (.604) | Presentable (.709)
Desirable (.655) | Useful (.587) | Innovative (.707)
Innovative (.608) | Easy to learn (.556) | Convincing (.684)
Recommendable (.516) | Convincing (.551) | Easy to use (.680)
 | Good (.527) | Good (.626)
 | Professional (.517) | Aesthetic (.620)
 | Innovative (.500) | Reliable (.606)
 | | Desirable (.602)
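The correlations in Table 9 are plain Pearson product-moment correlations between each item's scores and the overall UX rating; with SciPy and synthetic illustrative data (not the study data):

```python
from scipy.stats import pearsonr

# Hypothetical per-respondent scores on one UX item and the overall UX
# rating (synthetic values, not the study data).
item_scores = [4, 5, 6, 5, 7, 6, 5, 4, 6, 7]
overall_ux = [4, 5, 5, 5, 7, 6, 5, 4, 6, 7]

r, p = pearsonr(item_scores, overall_ux)
```

Since the two rating vectors move together almost perfectly, r is close to 1 and the two-tailed p-value is far below the .1 threshold used in the table.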
The following correlations were found only in measurement U: presentable, aesthetic, and reliable. Desirable was found in measurement U but not in measurement B, and convincing was found in measurement U but not in measurement A. Of the correlated items in measurement U, only easy to use measures instrumental quality. Thus, non-instrumental aspects correlated with the overall UX assessment clearly more than pragmatic ones. None of the items measuring instrumental quality correlated with the overall UX assessment in measurement A (team members as themselves). In general, Pearson's r values grew smaller in measurement B compared to measurement A, which might indicate that the team members were less confident with their responses in measurement B.

The following items had a strong and statistically significant correlation with the users' assessment of how well the system fulfills their needs: recommendable, useful, motivating, aesthetic, convincing, presentable, and innovative. They all belong to hedonic UX dimensions except useful, which is considered to measure the overall quality of the system.
Table 10 Strong and significant correlations of measured items with the user assessment of the system's ability to fulfill user needs.

Item pair | Pearson's r | 2-tailed significance (p)
Presentable – Unpresentable | .779 | .000
Innovative – Conservative | .718 | .000
Useful – Useless | .611 | .000
Recommendable – Not recommendable | .600 | .001
Motivating – Discouraging | .586 | .001
Aesthetic – Unaesthetic | .578 | .001
Convincing – Unconvincing | .537 | .003

5 Limitations
Threats to external validity: We have studied only a restricted set of companies, all operating in Finland, which threatens population validity. The number of studied companies was limited to five, and as the data was collected from development projects, the sample is clustered; projects, their outcomes, and users are unique and thus not directly comparable. We utilized the same team population in another study before, which subjects the study to multiple-treatment interference. As the sequence of measurements A and B was fixed, the study is prone to order bias. The sample was small (55 participants) for PCA; it would have been beneficial to double the number of participants. We based our sampling on (Preacher 2002), where the writers argue for smaller sample sizes, even for samples of 20. Thus, we consider our sample size sufficient but admit that a larger one would have been beneficial. For instance, Gorsuch (1983) argues there should always be at least one hundred participants even for a small number of variables. Comrey and Lee (1992) consider a sample of 100 participants poor and one of 500 good.
Threats to internal validity: Selection bias always exists when comparing groups; in this particular setting, utilizing randomized groups was impossible. Measurements A and B might be affected by a learning effect, as participants answered the same questions twice (as themselves and as they thought users would answer). We did not select the user participants ourselves and thus are unaware of the possible level of implementation bias. Although we guided the contact persons in selecting user participants, some of them might have selected, for instance, users they knew to be positive towards the software. Moreover, we did not control whether a user answered the survey twice. Using semantic differentials is prone to several types of evaluation bias, including the following: Central tendency bias occurs when respondents tend to favor the middle levels of a scale (Yu 2003); this was also observed in our study. Position bias concerns the order of evaluated items; users tend to treat the middle items differently than those in the beginning and in the end (Blunch 1984). We did not utilize counterbalancing, which can lead to position bias. PCA is prone to this bias since it can have an impact on the correlations between variables.

6 Discussion
6.1 UX Scale
The scale we utilized showed strong internal consistency in measuring 1. hedonic qualities of UX and 2. instrumental qualities of UX. Internal consistency was acceptable for measuring 3. the scope or overall quality of the system and questionable for measuring 4. fitness for professional use. However, internal consistency can be improved by increasing the number of items in a category (Cronbach 1951), which is possible in this case since there were only three items in components 3 and 4. Thus, confirmatory studies should be conducted for further validation of the dimensions of enterprise software UX. Also, different phrasings of items could be tested for improved fitness. In addition, the determined dimensions and their interpretation should be further analyzed.

It seems that some items behave differently when measuring enterprise and leisure software. In leisure software, items such as presentable, professional, and innovative (design) have been used for measuring aesthetic quality. For instance, Lavie et al. (2004) understand aesthetics in a broad sense and divide it into dimensions of classical and expressive aesthetics. They describe the latter as follows: "The expressive aesthetics dimension is reflected by the designers' creativity and originality and by the ability to break design conventions". Especially in enterprise software, UX design often concentrates on user interaction or UI design. Thus, "breaking design conventions" most probably indicates bad design decisions, since design conventions, for instance in style guides, have been created to instruct on developing fluent user interaction (Kuusinen et al. 2014). In addition, the phrasing of the question in our study asked the participants to evaluate the system (as a whole) and not just its design or appeal. Thus, the results cannot be directly compared to studies where UI designs as such have been evaluated.
6.2 UX as Assessed by Team Members and Users
Based on our findings, it seems likely that developers are able to understand the pros and cons of the developed enterprise software quite well. However, they tend to focus on pragmatic aspects of the system, neglecting the non-instrumental ones that in fact seem to be more important to users in terms of their UX. As enterprise software typically consists of tools used to perform practical tasks, instrumental quality naturally should be sufficient. However, non-instrumental quality contributes to user satisfaction and thus to human productivity, which might be an important organizational goal. The first principal component revealed in our analysis measured mainly the system's ability to motivate the user, while the second one measured usability and correlated with the user's willingness to use the system. Both these qualities of enterprise software are important for productivity and job satisfaction (Calisir et al. 2004, Hafeez-Baig et al. 2013).

Developers seemed to think that users would appreciate especially qualities related to efficiency and productivity. They emphasized instrumental qualities even more when they were asked to assess the system as they thought users would. This finding is in line with Hertzum et al. (2012), who found that usability professionals have a tendency towards utilitarian dimensions of usability. Also Innes (2011) argued that developers of ERP systems tend to neglect the hedonic. In our study, comparing measurements A and B (team members assessing UX in their own role (A) vs. placing themselves in the users' role (B)), the tendency towards the instrumental seemed to increase when developers were to think how users would assess the UX. However, in users' assessments the hedonic correlated most with their overall UX evaluation, and in PCA it was the first component. Clemmensen et al. (2013) did not find many differences between users' and developers' perceptions of usability.
On the other hand, usability professionals construed usability differently from developers and users. Given that only three UXSs participated in our study, we want to be cautious in making generalizations about differences between developers' and UXSs' assessments. However, in our study, UXSs and POs were the best at predicting user evaluations of both the pragmatic and the hedonic, while developers tended to be overly negative in their evaluations. This finding might be explained by the fact that UXSs and POs were the most involved with users and thus have the best understanding of user needs. However, the frequency of user communication (Kuusinen 2015) seemed not to improve the ability of POs to predict the UX as assessed by users. Altogether, the small number of POs and UXSs makes this finding questionable, and it definitely requires more research.
Developers were more critical towards the UX when they were asked to evaluate the software from the users' point of view, compared both to their own evaluation and to users' evaluation. This finding is interesting considering the common practice amongst developers of trying to think as they believe users would. The result might indicate that if developers do not have a proper understanding of the user, putting oneself in an imaginary user's place lowers the ability to predict the actual user evaluation. Again, the small number of UXSs and POs in our study allows only cautious conclusions; however, in our population, putting oneself in the user's place seemed to improve the accuracy of UXSs' and POs' evaluations. This finding provides an interesting opportunity for future work: does, for instance, utilizing personas or exposing developers to users improve the developers' ability to put themselves in the users' place and thus improve their ability to predict UX? Another question is whether it makes a difference if the UI is designed by the developers themselves or by a UXS; how closely the developers work with the UXS may also have an impact.

Neither users nor teams gave better evaluations to those UX dimensions that the teams considered the most important ones. Teams focused on usability and productivity, whereas affective and aesthetic qualities seemed to better predict the users' overall UX. Thus, we hypothesize that setting clearer UX goals, informed by user preference and shared amongst the whole project team, might improve both the overall UX and the rating of the most important aspects.
6.3 Using the Scale to Focus UX Goals
A plain list of UX dimensions can be useful when selecting UX goals for a project. The list itself can act as a constant reminder for developers of the multidimensionality of UX, in a similar way as the personas method is often used: in the personas method, archetypes of users are created based on user data, and descriptions of personas are often hung on walls to remind developers for whom they are developing. In our study, agile teams considered productivity items the most important UX goals. This finding is in line with Innes (2011).

To be able to guide the UX implementation during development, the team needs information on how users perceive the software being developed. In our study, we found that teams considered fast to use one of the most important UX goals, while users considered it one of the poorest-performing dimensions. The team can use this information to focus their improvement work on the experienced speed of use. It would be interesting to measure whether improving a quality with a low evaluation score would improve the overall UX score. On the other hand, another hypothesis could be that improving performance on the dimensions with the strongest correlation to the overall UX score would increase the overall UX score. Users might also expect items related to productivity and efficiency to be at a sufficient level not to lower the UX; after that, other qualities become more important predictors of the perceived UX. Thus, a third hypothesis is that concentrating on items belonging to the first principal component would increase the overall UX assessment of users.
7 Conclusions
We compared UX assessments of members of agile teams and users of the software systems under development. Our results indicate that developers concentrate on instrumental aspects of UX, whereas for users non-instrumental aspects may be more important predictors of their perception of the overall UX. Moreover, it seems to be difficult for developers to place themselves in the user's position, and trying to do so can even be harmful when the team member does not have sufficient understanding of the user. These findings contribute towards understanding development team members' ability to understand UX, which can help in allocating UX tasks between team members and thus in focusing the limited UXS resource on those tasks that developers cannot handle by themselves.
8 Acknowledgment
We thank Timo Partala for advising us on the data analysis methods. We are grateful to all the study participants. Our research has been supported by TEKES as part of the Cloud Software and Need for Speed research programs of DIGILE (the Finnish Strategic Centre for Science, Technology and Innovation in the field of ICT and digital business).
9 References
1. Abdi, H. and Williams, L. J. Principal component analysis. Wiley Interdisciplinary Reviews: Computational Statistics, 2(4), 2010, pp. 433-459. John Wiley & Sons.
2. Ardito, C., Buono, P., Caivano, D., Costabile, M. F., Lanzilotti, R., Bruun, A., Stage, J. Usability evaluation: a survey of software development organizations. Proc. of the International Conference on Software Engineering and Knowledge Engineering (SEKE '11), 2011, pp. 282-287. Knowledge Systems Institute, Skokie.
3. Bargas-Avila, J., Hornbæk, K. Old wine in new bottles or novel challenges? A critical analysis of empirical studies of user experience. Proc. of the SIGCHI Conference on Human Factors in Computing Systems (CHI), 2011, pp. 2689-2698. ACM.
4. Bastien, J. M. C. Usability testing: a review of some methodological and technical aspects of the method. Int. J. of Medical Informatics, 79(4), 2010, pp. e18-e23. Elsevier.
5. Blunch, N. J. Position bias in multiple-choice questions. J. of Marketing Research, 21(2), 1984, pp. 216-220.
6. Bradley, M. M., Lang, P. J. Measuring emotion: the self-assessment manikin and the semantic differential. J. of Behavior Therapy and Experimental Psychiatry, 25, 1994, pp. 49-59.
7. Bruun, A., Gull, P., Hofmeister, L., and Stage, J. Let your users do the testing: a comparison of three remote asynchronous usability testing methods. Proc. of the SIGCHI Conference on Human Factors in Computing Systems (CHI), 2009, pp. 1619-1628. ACM.
8. Bruun, A., Stage, J. The effect of task assignments and instruction types on remote asynchronous usability testing. Proc. of the SIGCHI Conference on Human Factors in Computing Systems (CHI), 2012, pp. 2117-2126. ACM.
9. Calisir, F. and Calisir, F. The relation of interface usability characteristics, perceived usefulness, and perceived ease of use to end-user satisfaction with enterprise resource planning (ERP) systems. Computers in Human Behavior, 20(4), 2004, pp. 505-515. Elsevier.
10. Clemmensen, T., Hertzum, M., Yang, J., and Chen, Y. Do usability professionals think about user experience in the same way as users and developers do? Interact 2013, Part II, LNCS 8118, 2013, pp. 461-478.
11. Costello, B., and Edmonds, E. A study in play, pleasure and interaction design. Proc. of the 2007 Conference on Designing Pleasurable Products and Interfaces (DPPI '07), pp. 76-91. ACM.
12. Comrey, A. L. and Lee, H. B. A First Course in Factor Analysis. Hillsdale, NJ: Erlbaum.
13. Cronbach, L. J. Coefficient alpha and the internal structure of tests. Psychometrika, 16(3), 1951, pp. 297-334. Springer.
14. da Silva, T. S., Martin, A., Maurer, F., Silveira, M. User-centered design and agile methods: a systematic review. Proc. of the International Conference on Agile Methods in Software Development (AGILE 2011). IEEE.
15. Davis, F. D. A technology acceptance model for empirically testing new end-user information systems: theory and results. MIS Quarterly, 13(3), 1989, pp. 319-340.
16. Desmet, P., Overbeeke, C., Tax, S. Designing products with added emotional value: development and application of an approach for research through design. The Design Journal, 4(1), 2001, pp. 32-47.
17. Diefenbach, S., Kolb, N., Hassenzahl, M. The 'hedonic' in human-computer interaction. Proc. of the 2014 Conference on Designing Interactive Systems (DIS), pp. 305-314. ACM.
18. Dray, S., Siegel, D. Remote possibilities?: international usability testing at a distance. Interactions, 11(2), 2004, pp. 10-17. ACM.
19. Fitzgerald, B. & Stol, K.-J. Continuous software engineering and beyond: trends and challenges. Proc. of the 1st International Workshop on Rapid Continuous Software Engineering (RCoSE 2014), pp. 1-9. ACM, New York, NY, USA.
20. Goodhue, D. L. & Thompson, R. L. Task-technology fit and individual performance. MIS Quarterly, 19(2), 1995, pp. 213-236.
21. Gorsuch, R. L. Factor Analysis (2nd ed.). Hillsdale, NJ: Erlbaum, 1983.
22. Hafeez-Baig, A. and Gururajan, R. Expectations, usability, and job satisfaction as determinants for the perceived benefits for the use of wireless technology in healthcare. Pervasive Health Knowledge Management, pp. 305-316. Springer.
23. Hassenzahl, M. The interplay of beauty, goodness and usability in interactive products. Human-Computer Interaction, 19(4), 2004, pp. 319-349. Lawrence Erlbaum Associates.
24. Hassenzahl, M. The thing and I: understanding the relationship between user and product. In Blythe, M., Overbeeke, K., Monk, A., and Wright, P. (Eds.), Funology: From Usability to Enjoyment. Kluwer Academic Publishers, 2005, pp. 31-42.
25. Hassenzahl, M. User experience (UX): towards an experiential perspective on product quality. Proc. of the 20th International Conference of the Association Francophone d'Interaction Homme-Machine, 2008, pp. 11-15. ACM.
26. Hassenzahl, M. and Tractinsky, N. User experience - a research agenda. Behaviour & Information Technology, 25(2), 2006, pp. 91-97.
27. Hertzum, M., Clemmensen, T. How do usability professionals construe usability? Int. J. of Human-Computer Studies, 70, 2012, pp. 26-42.
28. Hertzum, M., Clemmensen, T., Hornbæk, K., Kumar, J., Shi, Q., and Yammiyavar, P. Personal usability constructs: how people construe usability across nationalities and stakeholder groups. Int. J. of Human-Computer Interaction, 27(8), 2011, pp. 729-761.
29. Holzinger, A. Usability engineering for software developers. Communications of the ACM, 48(1), 2005, pp. 71-74.
30. Innes, J. Why enterprises can't innovate: helping companies learn design thinking. HCII 2011, LNCS 6769, pp. 442-448. Springer, Berlin/Heidelberg.
31. ISO 9241. Ergonomic requirements for office work with visual display terminals (VDTs) - Part 11: Guidance on usability. Genève, CH: International Organization for Standardization, 1998.
32. Ivory, M. Y. and Hearst, M. A. The state of the art in automating usability evaluation of user interfaces. ACM Computing Surveys, 33(4), 2001, pp. 470-516. ACM.
33. Jackson, S., Marsh, H. Development and validation of a scale to measure optimal experience: the Flow State Scale. J. of Sport and Exercise Psychology, 18, 1996, pp. 17-35.
34. Kirakowski, J. The software usability measurement inventory: background and usage. In P. W. Jordan et al. (Eds.), Usability Evaluation in Industry. Taylor & Francis, 1996, pp. 169-178.
35. Kuusinen, K. Task allocation between UX specialists and developers in agile software development projects. Accepted to Interact 2015, to appear.
36. Kuusinen, K. and Mikkonen, T. On designing UX for mobile enterprise apps. Proc. Software Engineering and Advanced Applications (SEAA), 2014, pp. 221-228.
37. Larusdottir, M. K., Cajander, A., Gulliksen, J. Informal feedback rather than performance measurements - user centred evaluation in Scrum projects. Behaviour and Information Technology, 33(11), 2013, pp. 1118-1135.
38. Lallemand, C., Gronier, G., & Koenig, V. User experience: a concept without consensus? Exploring practitioners' perspectives through an international survey. Computers in Human Behavior, 43, 2015, pp. 35-48.
39. Lavie, T., Tractinsky, N. Assessing dimensions of perceived visual aesthetics of web sites. Int. J. of Human-Computer Studies, 60(3), 2004, pp. 269-298.
40. Law, E., Roto, V., Hassenzahl, M., Vermeeren, A. and Kort, J. Understanding, scoping and defining user experience: a survey approach. Proc. CHI '09, 2009, pp. 719-728. ACM.
41. Law, E. L. C., Hassenzahl, M., Karapanos, E., Obrist, M., & Roto, V. Tracing links between UX frameworks and design practices: dual carriageway. Proc. HCI Korea, 2015, pp. 188-195. Hanbit Media, Inc.
42. Lee, T. M. & Park, C. Mobile technology usage and B2B market performance under mandatory adoption. Industrial Marketing Management, 37(7), 2008, pp. 833-840.
43. Lindgaard, G., & Kirakowski, J. Introduction to the special issue: the tricky landscape of developing rating scales in HCI. Interacting with Computers, 25(4), 2013, pp. 271-277.
44. McCarthy, J., & Wright, P. Technology as experience. Interactions, 11(5), 2004, pp. 42-43.
45. Mehrabian, A. Pleasure-arousal-dominance: a general framework for describing and measuring individual differences in temperament. Current Psychology, 14(4), 1996, pp. 261-292.
46. Nunnally, J. Psychometric Theory (2nd ed.). New York: McGraw-Hill, 1978.
47. Panwar, M. Application performance management emerging trends. Proc. Cloud & Ubiquitous Computing & Emerging Technologies (CUBE), 2013, pp. 178-182.
48. Poppendieck, M. & Poppendieck, T. Lean Software Development: An Agile Toolkit. Addison-Wesley Professional, 2003, 203 p.
49. Preacher, K. J., & MacCallum, R. C. Exploratory factor analysis in behavior genetics research: factor recovery with small sample sizes. Behavior Genetics, 32, 2002, pp. 153-161.
50. Saiedian, H., Dale, R. Requirements engineering: making the connection between the software developer and customer. Information and Software Technology, 42(6), 2000, pp. 419-428.
51. Salah, D., Paige, R. and Cairns, P. A systematic literature review on agile development processes and user centred design integration. Proc. of the 18th International Conference on Evaluation and Assessment in Software Engineering (EASE '14), Article 5, 10 p. ACM, 2014.
52. Salvador, C., Nakasone, A. and Pow-Sang, J. A. A systematic review of usability techniques in agile methodologies. Proc. of the 7th Euro American Conference on Telematics and Information Systems (EATIS '14), Article 17, 6 p. ACM, 2014.
53. Shackel, B. Usability - context, framework, definition, design and evaluation. Interacting with Computers, 21(5-6), 2009, pp. 339-346.
54. Schwaber, K. Agile Project Management with Scrum (Microsoft Professional), 1st ed. Microsoft Press, 2004.
55. Sundberg, H.-R. The importance of user experience related factors in new product development - comparing the views of designers and users of industrial products. 23rd Nordic Academy of Management Conference, 12-14 August 2015, Copenhagen, Denmark.
56. Sy, D. Adapting usability investigations for agile user-centered design. J. of Usability Studies, 2(3), 2007, pp. 112-132.
57. Väänänen-Vainio-Mattila, K., Roto, V. & Hassenzahl, M. Towards practical user experience evaluation methods. In E. L.-C. Law, N. Bevan, G. Christou, M. Springett & M. Lárusdóttir (Eds.), Meaningful Measures: Valid Useful User Experience Measurement (VUUM), 2008, pp. 19-22.
58. Väätäjä, H., Koponen, T. & Roto, V. Developing practical tools for user experience evaluation: a case from mobile news journalism. European Conference on Cognitive Ergonomics (ECCE '09), pp. 240-247. VTT Technical Research Centre of Finland, 2009.
59. Yu, J. H., Albaum, G., and Swenson, M. Is a central tendency error inherent in the use of semantic differential scales in different cultures? International Journal of Market Research, 45(2), 2003, pp. 213-228.
60. Zhang, P., & Li, N. The importance of affective quality. Communications of the ACM, 48(9), 2005, pp. 105-108.
61. Zijlstra, R. Efficiency in Work Behaviour: A Design Approach for Modern Tools. Delft University Press, 1993.