feature
requirements engineering
Implementing Requirements Engineering Processes:
Using Cooperative Self-Assessment and Improvement
Jörg Dörr, Sebastian Adam, and Michael Eisenbarth, Fraunhofer Institute for Experimental Software Engineering; Michael Ehresmann, Insiders Technologies
A process improvement approach for small organizations enlists employees to make improvement decisions, leading to positive stakeholder participation and high acceptance of the improvements.
Requirements engineering can help assure success for projects and their products. Small-to-medium enterprises (SMEs) demand effective, efficient RE practice because they must adhere to delivery dates and provide the same quality as large organizations, but with a smaller budget and staff. Moreover, these companies are highly affected by each customer's specific needs, and they must react accordingly.
Although an increasing number of smaller enterprises are receiving positive appraisals according to Spice (Software Process Improvement and Capability Determination, www.sqi.gu.edu.au/spice) or CMMI (CMM Integration, www.sei.cmu.edu/cmmi), many of them still find RE a challenge. They have difficulty improving their RE practice,1–4 even if they're well aware of the necessity. This problem has two main causes. First, smaller enterprises often lack specialists with significant experience in tailoring state-of-the-art software development to the organization's needs. Second, they typically lack the time and money for improvement activities because they invest primarily in customer satisfaction.
The ReqMan research project aimed to facilitate RE process improvement and its acceptance in SMEs (see the sidebar "RE Process Improvement at a German Small-to-Medium Enterprise"). Our approach considers an SME's individual needs, realizing that such enterprises often know their domain and context well.5 We try to integrate the employees into the improvement process, enabling them to make improvement decisions rather than having external consultants make the decisions for them. We've successfully applied our approach several times during the last two years. We observed strong, positive employee participation in the assessment and improvement discussion as well as high acceptance of the introduced changes—even by persons who are normally skeptical toward new development activities.
RE Process Improvement at a German Small-to-Medium Enterprise
Insiders Technologies GmbH is an SME with around 60 employees in Kaiserslautern, Germany. Insiders specializes in intelligent content recognition and processing systems for all incoming content. The company also provides business process automation by applying related technologies. Owing to an increasing number of customers and our software's criticality, we started an extensive process improvement project to retain and enhance our products' quality.
While trying to implement large-scale process models, we soon recognized that despite the external consulting, many activities required too much overhead. It was difficult to spot what was necessary, so we had problems implementing improvements. Next, we tried agile methods, such as Extreme Programming (XP), which were easy to understand. This mostly worked well, but our lack of knowledge and experience in this area made it difficult to determine our status quo and discover which XP practices we should implement. ReqMan let us implement systematic process improvement for the first time, with several benefits:

■ The practices were easy to understand, so team members who weren't familiar with requirements engineering were integrated quickly.
■ We didn't have discussions leading off the target process because we focused strongly on practices and improvements.
■ Classifying the practices made it easy to focus on the most important tasks by checking whether they were implemented sufficiently.

Through ReqMan, we could focus on the most important improvements. We achieved a standard that we wouldn't have been able to reach by ourselves. We also improved our requirements management knowledge and could determine the current state of our practice. So, we

■ failed fewer acceptance tests by systematically deriving tests and designs from requirements,
■ improved communication between the product owner and developers to reduce defects in early phases, and
■ simplified requirements management by structuring the requirements.

Process improvement requirements in SMEs
SMEs are so varied that we can't make general assumptions about their contexts. However, from cooperating with many smaller companies, we've learned that process improvement methods' applicability and acceptance depend mainly on five factors:3,4,6

■ Ease and inexpensiveness. Improvement methods should allow fast, inexpensive assessments to identify and establish improvements. SMEs should be able to identify practices that they can introduce in one step with limited effort and money. Instead of a few big iterations with unpredictable consequences, they should be able to run many small iterations that each focus on one problem.
■ Understandability and predictability. All stakeholders should be able to easily understand the problems and the rationale behind chosen improvements. Because smaller enterprises are at a high risk for inappropriate changes, the benefits and challenges must be as predictable as possible.
■ Flexibility. Improvement methods should provide suggestions that are flexible and easy to adapt to the organization's development context, philosophy (such as agility), and RE processes.
■ Enabling participation. Improvement methods should enable all relevant stakeholders to participate. The methods should motivate stakeholders to make changes and should profit from stakeholders' knowledge about the context (such as which suggestion will work).
■ Concreteness. Improvement methods should provide concrete improvements, including information on implementing changes and performing suggested activities. Otherwise, organizations with limited knowledge of state-of-the-art methods will have trouble establishing improvements.

These requirements could also apply to larger companies, but they're especially important to smaller ones.

May/June 2008 IEEE Software
Why yet another improvement method?
Existing process improvement frameworks have proven their usefulness and capabilities in industry. Two prominent improvement frameworks are the PDCA (Plan-Do-Check-Act) cycle7 and the Ideal (Initiating, Diagnosing, Establishing, Acting, and Learning) model.8 Several derivations and evolutions of these frameworks are available, such as the Six Sigma method, which is derived from the PDCA cycle (www.tqm.com/consulting/six-sigma/six-sigma/?searchterm=six%20sigma). Those approaches cluster the process improvement cycle into several iterative steps. They usually include planning and analysis, in which the current process situation is identified and analyzed. Then, the stakeholders discuss possible improvement activities and create solutions. After implementing the solutions, the stakeholders usually conduct a second analysis to evaluate the results and define actions for subsequent iterative steps.
Unfortunately, these approaches don't provide precise descriptions or detailed guidance on enacting a specific phase. Our approach aims to close the gap between general improvement principles and specific recommendations for executing the specific improvement steps. So, we don't propose a new improvement framework or an assessment approach such as CMMI or Spice, which aim at objectively and comparably assessing an organization's process capabilities. Our approach addresses the analysis and solution-creation steps of process improvement frameworks and can provide guidance for process assessments such as CMMI or Spice. On the basis of the results of such an assessment, our approach focuses on the identified (RE) problem areas and provides detailed practical advice and effective practices for improving concrete RE activities.

Concepts of our improvement approach
The core of our approach comprises

■ a framework of important RE practices and
■ a systematic method that integrates the staff into the improvement process.

Practices-based improvement
This technique focuses on process improvement using a set of abstract practices, concrete techniques, tools, and templates instead of complex reference processes that have to be tailored to the organization. This abstraction from rigid processes makes our approach flexible and appropriate for agile contexts (addressing the flexibility requirement). Moreover, practices-based improvement focuses on single activities instead of understanding the whole process. Therefore, during the assessment, the stakeholders can easily discuss and analyze the performed activities, their weaknesses, and possible improvements using only a visualized set of best practices. This lets stakeholders identify and introduce changes with less effort (addressing ease and inexpensiveness).
To make practices-based improvement possible, our approach provides a set of 36 practices and related techniques that we gathered from the established RE literature and other approaches such as CMMI or Spice. Industrial and scientific RE experts then evaluated the practices and techniques. So, we can consider the set a stable baseline for RE process improvement. Our justification for providing yet another framework is the need to bring the practices to a common abstraction level and classify them according to their importance and objectives.
Our framework (see figure 1) distinguishes phases, practices, and techniques. Phases are abstract activities, such as requirements elicitation. They let us focus on process parts but are too abstract with regard to assessing whether they're sufficiently implemented. So, each phase includes a set of practices that concretize which activities must be done in that phase to establish state-of-the-art RE, such as "elicit nonfunctional requirements." However, practices are still abstract, focusing on the what and not on the how. In contrast, techniques give clear and specific information on how to implement a practice. So, our method provides both abstract and concrete improvements (addressing concreteness).
Abstracting from the realization details facilitates easy assessment by focusing on the what and why but not the how during the first improvement step. So, the approach increases understandability and predictability and supports finding suitable solutions by simply considering the set of practices. This facilitates fruitful discussions about the current state of the RE process and helps employees with less knowledge of state-of-the-art methods actively participate in assessment and improvement decisions (addressing enabling participation). Finally, using these practices lets you focus on a few changes at a time. Many small and inexpensive iterative improvements—implemented over a period of time—replace one big change that would have a large impact on the organization and its employees (which also addresses ease and inexpensiveness). The corresponding prioritization can be done according to the current need, the general importance of an RE practice, or the phase in which a practice is grouped.

Figure 1. Practices-based improvement. The diagrams show (a) the metamodel of our approach (a phase contains practices in a 1:n relationship; practices and techniques are related n:m) and (b) an example of the relationships between phases, practices, and techniques (the phase "requirements elicitation" contains the practices "elicit functional requirements" and "elicit nonfunctional requirements"; the latter can be implemented with the EMPRESS nonfunctional requirements technique).
Figure 2. The set of classified practices. Practices are classified according to the requirements phase and the practices' importance for successful requirements engineering.
[The figure arranges the practices under the phases product planning, elicitation, analysis, specification, verification and validation, and management; its legend marks each practice as a basic, advanced, optimizing, or context-dependent practice. The practices shown include: determine scope; elicit goals; identify stakeholders and sources; elicit functional requirements; elicit nonfunctional requirements; elicit tasks and business processes; model domain; model data; model GUI; model user behavior; model interaction; formal modeling; determine feasibility; prototyping; document customer requirements; document developer requirements; document rationales; describe measurable and testable requirements; use standards and document structure; view-based documentation; assure traceability; review requirements; formal requirements verification; validate usability; prepare tests for requirements; requirements impact analysis; manage requirements changes; estimate cost and time; evaluate risk; reuse requirements; prioritize and negotiate requirements; determine roles and responsibilities; improve RE process; choose technology; manage variability.]
Phases and practices have a 1-to-n relationship, meaning that a practice is classified into exactly one phase. Practices and techniques have an n-to-m relationship: a practice can be implemented by one or more techniques, and at the same time, a technique can implement one or more practices. Our set of practices is classified along two dimensions: the associated requirements phase and the practices' importance to the RE process (see figure 2). There are four levels of importance. Basic practices include fundamental RE tasks, thereby forming the RE process foundation. So, our approach recommends addressing all basic practices during the first improvement step. Advanced practices aim at improving specific RE aspects. Optimizing practices aren't necessary but can further improve the processes by reflecting on and self-optimizing them. Context-dependent practices provide a benefit only in certain contexts. For instance, the practice "analyzing business processes" could be useful in organizations developing business support software but probably has no benefit in embedded systems. So, we initially separate these practices from the
other three importance levels. However, when applying our approach to an organization, we map these practices to the other three categories before the assessment starts (see the arrows in figure 2).
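The phase, practice, and technique relationships described above can be sketched as plain data structures. This is an illustrative sketch, not the ReqMan tooling: the type names and the remapping helper are our own, and only the phase, practice, and technique names come from figure 1.

```python
from dataclasses import dataclass, field
from enum import Enum

class Importance(Enum):
    BASIC = "basic"
    ADVANCED = "advanced"
    OPTIMIZING = "optimizing"
    CONTEXT_DEPENDENT = "context-dependent"

@dataclass(frozen=True)
class Technique:
    """A concrete way to implement one or more practices (n:m)."""
    name: str

@dataclass
class Practice:
    name: str
    importance: Importance
    techniques: list  # a technique here may also serve other practices

@dataclass
class Phase:
    name: str
    practices: list = field(default_factory=list)  # 1:n — each practice belongs to exactly one phase

# Example from figure 1: the EMPRESS technique implements one elicitation practice.
empress = Technique("EMPRESS nonfunctional requirements")
elicitation = Phase("Requirements elicitation", [
    Practice("Elicit functional requirements", Importance.BASIC, []),
    Practice("Elicit nonfunctional requirements", Importance.BASIC, [empress]),
])

def map_context_practice(practice, new_importance):
    """Before an assessment, map a context-dependent practice onto one of
    the other three importance levels (the arrows in figure 2)."""
    assert practice.importance is Importance.CONTEXT_DEPENDENT
    practice.importance = new_importance
```

The n:m direction is captured simply by letting the same `Technique` object appear in several practices' lists; no separate join structure is needed for a set of only 36 practices.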
Cooperative self-assessment and improvement
In our approach, all affected stakeholders identify, select, and agree on the improvements. We divide stakeholders into two main groups:

■ Domain experts know and understand the organizational context and processes.
■ Improvement experts have the theoretical understanding of RE and the improvement approach.

Figure 3a shows the usual process improvement cycle. In step 1, the domain experts explain in detail the context, current problems, goals, and processes to the improvement experts (visualized by the blue arrow). In step 2, the improvement experts analyze the situation and select solutions on the basis of industrial experience and theoretical knowledge. In step 3, the improvement experts give recommendations for improving the organization or processes. The domain experts can then accept or reject the recommendations.
In contrast, figure 3b depicts our approach's cooperative style. In step 1, the improvement experts explain in general terms how this approach can solve problems or improve processes. They present applicable practices, techniques, their dependencies, and theoretical benefits. Using the given information, the domain experts assess the current situation and decide what's appropriate. Because they know their context and what will work, they're most qualified to analyze and select the areas to improve in step 2. In the last step, they inform the improvement experts which areas will be improved, and the improvement experts support the introduction of new techniques. Thus, the direction of the information flow between domain and improvement experts is reversed. Finally, both groups use their combined knowledge and experience to arrive at the improvement suggestions.
This cooperative style has two main benefits. First, it lets participants find the right solutions by better understanding the context (addressing understandability and predictability). Second, it fosters increased acceptance due to the stakeholders' active participation in decision making (addressing enabling participation).
The improvement process
Our approach integrates those core concepts in a common PDCA cycle. We adopted existing frameworks' principles, especially for planning and analysis, and adapted them to our cooperative style. Our approach's innovation lies in how the single steps of the improvement process are performed, rather than in the improvement process in general. We've implemented our approach in three half-day workshops comprising seven steps.

Step 1 is performing analysis and selection. The first workshop aims to elicit the current situation and problems. Having representatives from all RE-related stakeholder groups fosters a common understanding of the situation. The participants analyze the situation using the visualized practices (see figure 2), assessing how well the practices are fulfilled at the company. The improvement expert acts as the moderator. The emphasis is on the domain experts sharing their knowledge and opinions regarding fulfilling the practices. Naturally, if some basic practices aren't fulfilled, participants should focus on those practices first. Likewise, they should address the advanced practices before the optimizing practices. At the workshop's end, stakeholders should agree on which practices to improve.
[Figure 3 diagram: (a) the usual style, in which the domain expert provides (1) context and processes, and the improvement expert performs (2) analysis and selection and returns (3) recommendations; (b) the cooperative style, in which the improvement expert presents (1) possibilities, and the domain expert performs (2) analysis and selection and communicates (3) the decision.]
Step 2 is documenting the assessment and identifying techniques. After the first workshop, the improvement experts document the assessment and prepare a first set of applicable techniques. The set is based on the candidate practices to be improved from the first workshop.

Step 3 is selecting the techniques. The second workshop focuses on identifying improvements for the current process and solutions to the problems. The discussion is based on the results of steps 1 and 2. Again, all stakeholders should be present at the workshop, and everyone has a say on which techniques to introduce. This workshop selects one to four techniques.

Step 4 is introducing techniques. After the second workshop, the domain experts and the new techniques' first users decide which candidates to introduce first.

Step 5 is establishing evaluation criteria. In parallel, the improvement experts prepare criteria for evaluating the changes' effectiveness.

Step 6 is collecting data. Once the stakeholders use the techniques in everyday project work, they collect the data necessary to evaluate them.

Step 7 is performing evaluation and iteration. During the third workshop (typically six months after introducing the new techniques), the stakeholders discuss the changes' effects and begin the next improvement iteration. Because each iteration is small and inexpensive, improvement can become continuous. In this meeting, the stakeholders also discuss the collected data and assess the practices' fulfillment to provide an overview of the changes' effects. The rating of practice fulfillment can change over time, so workshop participants should perform a complete assessment each time. This serves as the bridge to the next improvement iteration and is only feasible if you can perform the assessments in a short time frame, as in our approach.
Figure 3. Cooperation styles in process improvement. (a) In the usual style, outside consultants (improvement experts) analyze the situation and recommend solutions. (b) In a more cooperative style, the enterprise’s domain experts analyze the situation and identify areas for improvements.
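The selection rule running through steps 1 and 3 (address unfulfilled basic practices before advanced ones, advanced before optimizing, and introduce at most one to four techniques per iteration) can be sketched in a few lines. The 1-to-5 fulfillment scale, the threshold, and the function names are our own illustration; the article only says to assess how well the practices are fulfilled.

```python
# Order in which practice categories are addressed; context-dependent
# practices are assumed to have been remapped before the assessment.
PRIORITY = ["basic", "advanced", "optimizing"]

def select_focus_practices(ratings, threshold=3):
    """Pick the practices to improve in the next iteration.

    `ratings` maps a practice name to (importance, fulfillment score 1..5).
    Returns the unfulfilled practices of the most important category that
    still has gaps, so each iteration stays small and focused.
    """
    for level in PRIORITY:
        gaps = [name for name, (imp, score) in ratings.items()
                if imp == level and score < threshold]
        if gaps:
            return level, gaps
    return None, []

def limit_techniques(candidates):
    """Step 3: the second workshop selects one to four techniques."""
    return candidates[:4]
```

For example, if one basic practice is rated below the threshold, the iteration focuses on it alone, and advanced gaps wait for a later cycle; this mirrors the article's many-small-iterations principle.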
Figure 4. Assessment results of our method from a survey with workshop participants, rated on a scale from 1 (totally disagree) to 5 (totally agree). The thin black line indicates the mean; the yellow bars show where 50 percent of all answers fell. (The categories listed here were later consolidated into the improvement requirements we mention in the article.)

Experience from industrial application
Six companies have tested our approach. All the cases are from real-life projects developing products for customers. The experience has been overwhelmingly positive. In all cases, our approach achieved a high acceptance level according to a post-study survey. In this survey, the respondents rated 58 statements about our approach using a five-point Likert scale, where 1 indicated "totally disagree" and 5 indicated "totally agree." We surveyed all workshop participants in the six companies and received 18 responses. The results were almost the same in all case-study companies, even though the companies were of different sizes (20 to 200 employees) and had different domains (financial, data mining, automotive, and so forth), process maturity (low versus medium), and degrees of agility (Extreme Programming-inspired versus rigid). So, we consider the results fairly reliable. Our approach was assessed according to the process improvement requirements we mentioned earlier (see figure 4):

■ Ease and inexpensiveness. In all case studies, the method was easy and inexpensive to apply. The process stakeholders didn't have trouble performing the method because no workshop took longer than a half day. So, the method is appropriate for organizations with little time or money.
■ Understandability and predictability. Most workshop participants understood the purpose of each discussed practice and could estimate its impact on their organization. Accordingly, the process stakeholders were largely convinced that they chose appropriate improvements, and they largely accepted these changes. Even those persons who previously were skeptical about extending the requirements process accepted the changes.
■ Flexibility. Most process stakeholders were convinced that the improvements were suitable for their context and easy to apply. There was little need to tailor any technique. Even the fact that two companies used XP didn't seem to hamper acceptance of the approach. Instead, the stakeholders adapted the practices for their context and accepted them as applicable even in an agile environment.
■ Enabling participation. The process stakeholders were motivated because they felt a part of the improvement decisions. Even though the groups were heterogeneous—ranging from programmers and project managers to testers and documentation specialists—almost everyone contributed views. The discussion was particularly active and constructive even in the workshops with many participants. The set of practices was an excellent catalyst for the discussion, which resulted in a fruitful exchange of opinions.
■ Concreteness. The workshop participants found the improvement suggestions concrete enough to know what changes to implement and how to do so. Thus, they believed that they would be able to run the improvements successfully. However, many participants didn't expect that they could implement the changes without external support because the case-study organizations lacked sufficient RE know-how.

It also seemed that the actual changed processes were being performed closer to the intended ones and that the changes were sustained. However, we haven't yet evaluated the long-term effects because the changes were only recently implemented. These case studies revealed three weaknesses in our approach:

■ Limited approach. Because our approach concentrates on RE, the set of practices isn't sufficient for general software process improvement. We will need to extend it to further disciplines.
■ Weak support for implementing process changes. Implementing the changes—for example, changing established processes and teaching new techniques—is still a challenge our approach doesn't sufficiently address.
■ The centrality of the moderator. Because our approach is people-oriented, the moderator has a central and difficult role in balancing the requirements know-how with facilitating communication. The moderator needs a good understanding of requirements and, even better, the domain. We initially hoped that the moderator wouldn't need much requirements knowledge, but such knowledge turned out to be essential.
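Figure 4's per-category summary (the mean plus the band containing the middle 50 percent of answers) is a quartile summary of the Likert ratings, which a few lines of Python can reproduce. The function name and the rounding are our own; the survey data itself is not published in the article.

```python
from statistics import mean, quantiles

def summarize_likert(answers):
    """Summarize one survey category the way figure 4 plots it:
    the mean (thin black line) and the first-to-third-quartile band
    holding the middle 50 percent of answers (yellow bar).
    `answers` is a list of 1..5 Likert ratings."""
    q1, _, q3 = quantiles(answers, n=4)  # the three quartile cut points
    return {"mean": round(mean(answers), 2), "middle_50": (q1, q3)}
```

Note that `statistics.quantiles` defaults to the exclusive method, so with very small samples the band can extend slightly beyond the observed ratings; with 18 respondents per statement, as in the survey, this effect is minor.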
The positive responses from the case studies indicate that our central hypothesis—practice-based improvement using self-assessment is appropriate for smaller enterprises—might be true. Although the case studies show that our process improvement approach satisfactorily fulfilled the requirements, we still have work to do. We must further evaluate and improve the classification. We especially need to focus on tailoring the context-dependent practices to one of the other three classes. We aim to identify the main factors affecting the context practices. Other important questions concern moving on to more comprehensive or complicated improvement approaches and reacting when formal certification is necessary. In this context, we must bridge the gap to other approaches. We're working on integrating our method with CMMI and Spice—for example, by considering our practices as refinements of CMMI specific goals or Spice processes. Although we specifically developed our approach to support requirements process improvements, the concept might be applicable to other parts of the software engineering process. We're looking at how we can similarly handle project management, reuse, and quality management.

Acknowledgments
The German Federal Ministry of Education and Research funded ReqMan under project 01ISC02A-D.

References
1. C. Gresse von Wangenheim, A. Anacleto, and C.F. Salviano, "Helping Small Companies Assess Software Processes," IEEE Software, vol. 23, no. 1, 2006, pp. 91–98.
2. N. Juristo, A.M. Moreno, and A. Silva, "Is the European Industry Moving Toward Solving Requirements Problems?" IEEE Software, vol. 19, no. 6, 2002, pp. 70–77.
3. M. Niazi, D. Wilson, and D. Zowghi, "Critical Success Factors for Software Process Improvement Implementation: An Empirical Study," Software Process Improvement and Practice, vol. 11, no. 2, 2006, pp. 193–211.
4. T. Olsson et al., "A Flexible and Pragmatic Requirements Engineering Framework for SME," Proc. 1st Int'l Workshop Situational Requirements Eng. Processes, Univ. of Limerick, 2005, pp. 1–12; http://cui.unige.ch/db-research/SREP05/Papers/01.pdf.
5. A. Rainer, T. Hall, and N. Baddoo, "Persuading Developers to 'Buy into' Software Process Improvement: Local Opinion and Empirical Evidence," Proc. Int'l Symp. Empirical Software Eng., IEEE CS Press, 2003, pp. 326–335.
6. I. Richardson, "SPI Models: What Characteristics Are Required for Small Software Development Companies?" Software Quality J., vol. 10, no. 2, 2002, pp. 101–114.
7. W.E. Deming, Out of the Crisis, MIT Press, 1989.
8. J. Gremba and C. Myers, "The Ideal Model: A Practical Guide for Improvement," Bridge, no. 3, 1997; www.sei.cmu.edu/ideal/ideal.bridge.html.

About the Authors
Jörg Dörr is head of the Fraunhofer Institute for Experimental Software Engineering's requirements and usability engineering department. His research interests include requirements engineering, focusing on nonfunctional requirements. He received his MS in computer science from the University of Kaiserslautern. He's a member of the German Computer Science Society and is active in the Requirements Engineering special interest group. Contact him at Fraunhofer IESE, Fraunhofer-Platz 1, 67663 Kaiserslautern, Germany; joerg.doerr@iese.fraunhofer.de.

Sebastian Adam is a requirements engineering researcher at the Fraunhofer Institute for Experimental Software Engineering in Kaiserslautern. His research interests include process improvement issues. He received his MS in computer science from the University of Kaiserslautern. Contact him at Fraunhofer IESE, Fraunhofer-Platz 1, 67663 Kaiserslautern, Germany; [email protected].

Michael Eisenbarth is a project manager in the Fraunhofer Institute for Experimental Software Engineering's requirements and usability engineering department. His research interests include requirements engineering and software product lines, focusing on requirements management. He received his MS in computer science from the University of Kaiserslautern. Contact him at Fraunhofer IESE, Fraunhofer-Platz 1, 67663 Kaiserslautern, Germany; [email protected].

Michael Ehresmann is a development manager at Insiders Technologies. His research interests include process improvement in an agile context, with a requirements-engineering focus. He received his MS in information engineering from the University of Kaiserslautern. Contact him at Insiders Technologies GmbH, Brüsseler Straße 1, 67657 Kaiserslautern, Germany; [email protected].