SOFTWARE PROCESS IMPROVEMENT AND PRACTICE Softw. Process Improve. Pract. 2002; 7: 3–15 (DOI: 10.1002/spip.150)
Implementing Software Process Improvement: An Empirical Study Tracy Hall,*,† Austen Rainer and Nathan Baddoo Computer Science Department, University of Hertfordshire, Hatfield, AL109AB, UK
Research Section
In this paper we present survey data characterizing the implementation of SPI in 85 UK companies. We aim to provide SPI managers with more understanding of the critical success factors of implementing SPI. We present an analysis of the critical implementation factors identified in published case studies. We use a questionnaire to measure the use of these factors in 'typical' software companies. We found that many companies use SPI but the effectiveness of SPI implementation is variable. Many companies inadequately resource SPI and fail to evaluate the impact of SPI. On the other hand, companies show a good appreciation of the human factors associated with implementing SPI. Copyright 2002 John Wiley & Sons, Ltd.

KEY WORDS: software process improvement; implementation; empirical; survey
* Correspondence to: Tracy Hall, Computer Science Department, University of Hertfordshire, Hatfield, AL10 9AB, UK
† E-mail: [email protected]
Contract/grant sponsor: UK Engineering and Physical Science Research Council; contract/grant number: EPSRC GR/L91962.

1. INTRODUCTION

In this paper we present survey data characterizing the implementation of software process improvement (SPI) in 85 UK companies. Our data identifies the critical factors that managers can use to control the implementation of SPI more effectively. The aim of our work is to develop a maturity-based framework that SPI managers can use to assess the readiness of companies to accelerate the implementation of SPI (Wilson et al. 2001).

SPI has become a popular approach to delivering improvements in software products (Humphrey 1989). Many companies have either a formal or informal SPI programme based on one of the fashionable SPI models. Generally, SPI is reported to deliver significant benefits (Humphrey et al. 1991, Johnson 1994). Research shows high maturity companies benefit from increased product quality, improved customer satisfaction and reduced risk (Herbsleb and Goldenson 1996, Humphrey et al. 1991). Some companies report impressive SPI returns: 'The [Space Shuttle] project reported a 300% improvement in productivity and two orders of magnitude reduction in defect rates.' (Paulk et al. 1994, p. 100). Despite this, many companies continue to report:
• low process maturity (Paulk and Chrissis 2000);
• difficulty getting SPI off the ground (Wilson et al. 2000);
• failure to maintain SPI in the longer term (Paulish and Carleton 1994).

In this paper we explore how 'typical' software companies are meeting the challenge of implementing SPI. We particularly explore the non-technical, people-management factors that may explain why
many companies are not achieving high process maturity. Indeed, as DeMarco and Lister say: 'If you find yourself concentrating on the technology rather than the sociology, you are like the vaudeville character who loses his keys on a dark street and looks for them on the adjacent street because, as he explains, "The light is better there".' (DeMarco and Lister 1987, p. 6).

Other empirical studies on SPI confirm the importance of people factors. Kaltio and Kinnula's (2000) work on deploying defined software processes found that people factors were most important, and a recent study of process assessment found a relationship between high process maturity and high staff morale (Herbsleb and Goldenson 1996). We focus on the organizational, management and human factors associated with implementing SPI in industry. These are factors that have been said to be neglected by software engineering research (McDermid and Bennett 1999).

Our work complements the extensive SPI work undertaken by the Software Engineering Institute (SEI), as exemplified by the CMM (Humphrey 1989). The CMM identifies the software engineering activities that companies should have in place at various stages in their maturity. By contrast, our work focuses on how companies can best implement these activities. This is an area increasingly identified as critical to SPI success (Humphrey 1998), as Kaltio and Kinnula say: '. . . capitalisation of organisational learning is through process deployment; without deployment process assets are little more than shelfware.' (Kaltio and Kinnula 2000). Indeed, studies show that 67% of SPI managers want guidance on how to implement SPI activities, rather than on what SPI activities to implement (Herbsleb and Goldenson 1996).

The work we present here is unusual. Most SPI work is based on single case studies. Such work has been criticized for being company-specific and therefore potentially unrepresentative (El Emam and Briand 1997, Herbsleb and Goldenson 1996).
We not only present data from 85 companies, but also data collected by us as an impartial third party.
In Section 2 of the paper we identify the critical implementation factors that have been presented in the literature. In Section 3 we describe more fully our study methods and outline the companies in the study. In Section 4 we present our survey results. We go on to discuss these results in Section 5 and summarize and conclude in Section 6. We briefly explain our plans for future work in Section 7.
2. BACKGROUND

Appendix 1 shows the tremendous increase in SPI publications over the ten years since the inception of the CMM. We present the key implementation factors that have been reported in this literature.

2.1. Human Factors

'all personnel must be interested in process, as the success of software projects depends on process' (Komiyama et al. 2000). This slightly unconventional angle emphasizes the inextricable relationship between SPI and people. Indeed, many of the case studies reported in the literature consider human factors critical to SPI success. Small companies are reported to be particularly vulnerable to human factors, as they usually depend heavily on key individuals (Horvat et al. 2000). Reports identify a variety of human factors that must be handled effectively.

2.1.1. SPI Leaders

Companies report that SPI managers contribute significantly to SPI success. Stelzer and Mellis (1998) analysed organizational factors in SPI and identified the importance of change agents and opinion leaders in managing SPI. Herbsleb and Goldenson (1996) identified the need to appoint highly respected people to SPI. This view is supported by the experience of Hughes Aircraft, where software process expertise is considered an essential precursor to SPI success (Humphrey et al. 1991).

2.1.2. Management Commitment

Almost every SPI publication emphasizes the criticality of gaining and maintaining management support for SPI. Indeed, as long ago as 1991, management commitment was reported to have played
a major role in the success of SPI at Hughes Aircraft (Humphrey et al. 1991). However, management commitment is reported to have many strands. As well as being generally important (Paulish and Carleton 1994), commitment must be visible (Pitterman 2000) and consistent (Butler 1997). Other reports also comment on the importance of 'managing' (Stelzer and Mellis 1998) and 'monitoring' (Goldenson and Herbsleb 1996) the improvement project itself. Reports of SPI at Schlumberger demonstrate the value of good management. They found that when a manager with high SPI commitment was replaced by one with less commitment, previous process improvements were lost (Wohlwend and Rosenbaum 1994).

2.1.3. Staff Involvement

Most SPI publications report on the importance of involving development staff in SPI (Stelzer and Mellis 1998, Goldenson and Herbsleb 1995). Case studies report that without developer involvement, cynicism can develop (Herbsleb and Goldenson 1996). Generally, SPI publications support existing ideas on developer empowerment and involvement (Bach 1995). For example, the need to generate a culture of process ownership is emphasized (Pitterman 2000), as is the need to value SPI as 'real work' (Butler 1997). The success of process improvement in the Space Shuttle Project is largely attributed to driving SPI from the bottom up: 'staff are the most effective locus of process management, since they are the closest to it and are best able to interpret process data' (Paulk et al. 1994).

2.2. Organizational Factors

A variety of organizational factors are reported to impact on SPI. Even in 1993, Kitson and Masters' (1993) study of 59 companies revealed that half faced important organizational problems which were outside the scope of SPI. For example, they found that ineffective organizational structures and the after-effects of previous failed initiatives damaged SPI. Furthermore, cultural problems feature strongly as barriers to SPI success.
For example, organizational politics and 'turf guarding' are said to have a negative impact on SPI (Herbsleb and Goldenson 1996).
2.2.1. Communication

The theme of effective communication recurs in the SPI literature (Stelzer and Mellis 1998). Effective lines of communication are said to benefit SPI; for example, communication gives opportunities for sharing best practice (Stelzer and Mellis 1998). One of the features of successful SPI is said to be multichannelled communication. For example, in the Space Shuttle Project (Paulk et al. 1994) feedback and communication points are designed into many aspects of SPI. Furthermore, Schlumberger reports that SPI actually generates better communication between and within departments (Wohlwend and Rosenbaum 1994).

2.2.2. Resources

All case studies emphasize the importance, and difficulty, of adequately resourcing SPI. Nokia reports that it is necessary to assign resources to the deployment of new processes (Kaltio and Kinnula 2000). Siemens reports that dedicated SPI resources are essential (Paulish and Carleton 1994). Many SPI commentators say that insufficient staff time and resources are a major impediment to SPI success. Herbsleb and Goldenson (1996) found that 77% of companies consider that SPI suffers from a lack of resources. Furthermore, SPI costs are reported to be a major disincentive to small companies, as costs are disproportionate to the size of the organization (Brodman and Johnson 1994).

2.3. Implementation Factors

Nokia describes a process that was never performed because of poor implementation – its documentation was difficult to access, outdated and lacked coverage (Kaltio and Kinnula 2000). This experience emphasizes the importance of getting implementation right. Many implementation factors are reported in the literature as critical to SPI success.

2.3.1. SPI Infrastructure

Case studies report a variety of ways to deploy SPI effort. Curtis (2000) found that each of his three case study companies had central SPI steering groups supported by local improvement staff.
The Space Shuttle Project has a sophisticated approach to organizing SPI (Paulk et al. 1994). 'Control boards' consisting of a multi-functional team provide points of co-ordination on crucial issues. Management responsibilities have also been delegated to control
boards, and a determined effort has been made to develop managers from within the project's existing staff.

2.3.2. Setting Objectives

Another recurring theme relates to companies setting explicit goals and objectives for SPI. Curtis (2000) found that each of the three high maturity companies he analysed 'had quantifiable business targets and managed their pursuit of these targets empirically'. Other commentators emphasize the need to set relevant and realistic objectives (Stelzer and Mellis 1998) that are clearly stated and well understood (Herbsleb and Goldenson 1996). Furthermore, the Oklahoma City Air Logistics Centre found that priorities and goals must be highly tailored to the needs of the particular organization (Butler 1997). Reports also comment on the need to relate goals and objectives to formal process definitions (Pitterman 2000) and action plans (Humphrey et al. 1991).

2.3.3. Tailoring SPI

A recurring theme in the SPI literature is the need to tailor SPI to particular company requirements (Stelzer and Mellis 1998, Butler 1997, Kaltio and Kinnula 2000). Related to this, SPI must also be tailored to the appropriate maturity level – high maturity processes must not be implemented in low maturity projects (Kaltio and Kinnula 2000).

2.3.4. Evaluation

Companies also report on the importance of evaluating SPI progress. Curtis (2000) found that each of his three case studies 'demonstrated improvements on multiple criteria', including 'time to market'. Reports from the Space Shuttle Project also say that 'process improvement must be applied with constant attention to its impact on results' (Paulk et al. 1994). Corning Inc. measured SPI success using product quality and project slippage (Johnson 1994).
3. THE STUDY METHODS

3.1. Survey Design

We used the implementation factors reported in the SPI literature to design a questionnaire measuring the use of these factors in the UK software industry. The questionnaire was designed according to classical questionnaire design principles (Berdie and Anderson 1974) and can be viewed
at [http://homepages.feis.herts.ac.uk/~pppgroup/questionnaire.html]. The aim was to gather quantitative and qualitative data characterizing the implementation of process improvement in software companies.

We identified a target sample of software companies using public domain information, e.g. relevant mailing lists and conference attendance lists. Between November 1999 and February 2000 we posted questionnaires to software process improvement managers at 1000 software companies. We used standard approaches to following up non-respondents, which generated 200 responses. We used only those questionnaires where companies had a software development function and where an attempt had been made at SPI. This left 85 fully relevant questionnaires. Although our response rate may appear low, a response rate of 20 per cent is considered acceptable. Furthermore, we know of no other detailed data characterizing SPI in the UK software industry, and so we consider responses from 85 companies to be good.

3.2. Companies in the Study

The tables in Appendix 2 provide some basic demographic data characterizing the companies who responded to the questionnaire. The tables show that a range of companies are represented in our sample. The sample has the following main characteristics:
• a balance of UK and multinational companies;
• a range of software development function sizes;
• a varied profile of company ages, with most companies having been established over ten years ago and many over 20 years ago;
• a very high proportion of ISO 9001 certified companies (this is probably a reflection of our targeting techniques rather than being representative of the industry as a whole);
• a balanced representation of application areas, with a well balanced split between bespoke and commercial development activities;
• a broad focus in software development effort, with development, maintenance, support and operations complementing a small amount of consultancy.
Only 15 of the 85 companies in our sample said they had undergone formal CMM assessment. Maturity is one characteristic of the companies that we are
interested in and so we emulated Herbsleb and Goldenson (1996) and asked the other 70 companies to informally estimate their CMM level (see Table A7 in Appendix 2). Our sample contains very few companies beyond level 2. This is probably typical as Paulk and Chrissis (2000), in their survey of high maturity organizations, refer to only 44 level 4 organizations and 27 level 5 organizations in the world (though they say that there are probably more).
4. FINDINGS

4.1. Initiating SPI

Table 1 shows that a large proportion of companies in our sample have attempted SPI. Furthermore, Table 2 shows that the vast majority of these have been formal rather than informal approaches to SPI (in the questionnaire we defined formal as SPI supported by documentation). Table 3 shows that SPI is firmly established in many companies. Only a fifth of companies say SPI is less than two years old and 35% say it has been in place for more than five years. Table 3 also suggests that the initiation of SPI may have started to decline, as further analysis of the data shows that only 9% of SPI programmes have been set up in the last year.

In total, 72% of respondents told us that SPI in their company had clearly stated objectives. Table 4 identifies particular SPI objectives and shows that the most popular reasons for introducing SPI are:
• improving software quality;
• reducing costs;
• reducing timescales.
Table 4 also shows that relatively few companies introduced SPI because of client requirements or for marketing purposes.

Table 1. SPI across companies
Has your company tried to improve its software development process?
       Frequency   Percentage
Yes    79          93
No     6           7
Total  85          100

Table 2. Formality of SPI
How formal is your company's approach to software process improvement?
          Frequency   Percentage
Formal    65          83
Informal  13          17
Total     78          100
Missing   7

Table 3. Age of SPI
How long has your company's process improvement programme been in operation?
                   Frequency   Percentage
0–2 years          16          21
3–5 years          34          44
More than 5 years  27          35
Total              78          100
Missing            7

Table 4. Objectives for SPI
Why did your company embark on process improvement? (Respondents could select more than one motivation)
                               Frequency   Percentage
Improve software quality       67          79
Reduce development costs       61          72
Shorten development times      61          72
Increase productivity          47          55
Improve management visibility  38          45
Meet client requirement        22          26
For marketing purposes         18          21
4.2. The Design of SPI Programmes

To understand the way companies have approached SPI, we asked respondents to comment on various aspects of SPI design promoted in the SPI literature. First we asked whether respondents agreed that particular aspects of SPI design were important, and second, we asked them to assess whether their company uses these SPI design principles. Table 5 shows that almost all respondents believed it was important to:
• gain senior management commitment to SPI;
• tailor SPI to the needs of the particular organization;
• align SPI goals with organizational goals.
Table 5. Essential components of SPI
It is important. . .                                    Agreement (%)   Median   Mode
for senior management to be committed to SPI            93              4        5
to tailor SPI to the needs of particular companies      92              4        4
that the goals of SPI are congruent with company goals  89              4        4
to make realistic assessments of SPI paybacks           63              2        1
to research SPI before developing an SPI programme      60              3        2
(Median and mode are scores out of five on company implementation: 1 = low, 5 = high.)
Furthermore, Table 5 shows that respondents judged their organizations relatively highly for achieving these aims. Respondents scored organizations lower for making realistic assessments of paybacks – despite this factor being valued as important by 63% of respondents. Table 5 also shows mixed views and scores on the importance of doing research on SPI.

Table 6 shows that fewer than half of the companies had published an SPI implementation plan. Of the companies with no implementation plan, further analysis shows that many had an informal approach to SPI. However, of the 65 companies identified in Table 2 as having a formal approach to SPI, 30 had no implementation plan.

Table 6. SPI implementation plan
Did your company publish an SPI implementation plan?
            Frequency   Percentage
Yes         35          46
No          39          51
Don't know  2           3
Total       76          100
Missing     9
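The median and mode columns in Tables 5 and 10 summarize respondents' five-point implementation scores. A minimal sketch of how such summaries are computed (Python; the score values below are illustrative only, not the survey's raw data, which is not published):

```python
import statistics
from collections import Counter

# Hypothetical five-point scores for one statement (1 = low, 5 = high).
# Illustrative values only; the survey's raw responses are not published.
scores = [5, 4, 5, 3, 4, 5, 2, 5, 4, 5]

median = statistics.median(scores)           # middle value of the sorted scores
mode = Counter(scores).most_common(1)[0][0]  # most frequently chosen score

print(f"median = {median}, mode = {mode}")
```

Using `Counter` rather than `statistics.mode` makes the tie-breaking explicit when two scores are equally common.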
Table 7 shows variable approaches to implementing SPI. The most popular implementation was to phase the introduction of SPI through projects. This is, perhaps, no surprise given the project-based nature of most companies. However, a large proportion of companies phased SPI via SPI issues. A fifth of companies implemented SPI throughout the company in one go.

Table 7. Introduction of SPI
How was the improvement programme introduced?
                            Frequency   Percentage
Phased, through projects    23          30
Phased, through teams       4           5
Phased, through SPI issues  28          36
Throughout the company      16          21
Don't know                  6           7
Total                       77          100
Missing                     8

4.3. SPI Personnel

Table 8 outlines the number of SPI staff in our sample of companies. On average, each company has three full-time SPI staff. However, two companies report 15 SPI staff. Further analysis shows that there is a relationship between software engineering effort and SPI effort. For example, all of the companies who have no dedicated SPI staff are small companies (fewer than 100 software staff). Furthermore, 85% of all the small companies in this sample report fewer than five dedicated SPI staff; 66% of companies employing more than 500 software staff report between five and 15 SPI staff.

Table 8. SPI staff effort
How many people are dedicated to SPI in your company?
Full-time equivalents   Frequency   Percentage
0                       18          26
1–4                     37          53
5–10                    12          17
More than 10            3           4
Total                   70          100
Missing                 15

These results suggest that the direct costs of SPI are relatively small. However, this relatively small
dedicated SPI resource may explain the results presented in Table 10, where respondents say that SPI is inadequately funded.

SPI infrastructure varied between companies. Only 25 companies had a central SPI resource (a Software Engineering Process Group (SEPG) or a Software Process Action Team (SPAT)). Only 39% of SPI teams were independent of the development function. However, 51% of respondents said that clear responsibilities were assigned to SPI teams.

4.4. People Issues in SPI

We asked companies to comment on various human issues in SPI. We first asked about the use of specific human factors, and then about the impact of their use. Table 9 summarizes the responses. It shows that the use of experienced staff in SPI seems to be the most important factor. Other important factors include:
• internal leadership;
• process ownership;
• executive support.
Table 9 also shows that 59% of companies use external consultants in SPI, with only 25% reporting high benefit from them.

Table 9. Human factors in SPI
                       Use of factor (%)   Benefit of using factor (%)
                       No    Yes           DK    None   Low   High
Experienced staff      1     99            4     1      24    71
Internal leadership    10    90            7     10     30    53
Process ownership      10    89            8     10     34    48
Executive support      16    84            12    6      38    44
External consultants   41    59            6     39     31    25

Table 10 shows the factors that respondents considered important to establishing buy-in to SPI. How well respected SPI staff were emerged as an important factor, and one that companies also scored highly on (this may be a biased answer, as respondents were mostly SPI managers). Participation by developers in SPI also emerged as important, and was another factor on which companies scored highly. However, companies scored less well on adequately resourcing SPI and on selling SPI to developers.

Table 10. Establishing buy-in to SPI
It is important that SPI. . .               Agreement (%)   Median   Mode
teams consist of highly respected staff     95              4        4
involves developers                         91              4        5
is properly resourced                       89              3        3
is 'sold' to developers from the start      86              3        3
training is provided                        76              3        4
(Median and mode are scores on company implementation: 1 = low, 5 = high.)

Table 11. Developers' response to SPI
How have developers generally responded to SPI?
                   Frequency   Percentage
Very enthusiastic  9           12
Satisfactory       45          59
Some interest      12          16
Indifference       8           10
Don't know         2           3
Total              76          100
Missing            9

Table 11 shows that respondents believe that, on the whole, developers have responded positively to SPI. Only 26% of companies report a less than satisfactory reception from developers. This indicates that developers are more receptive to SPI than conventional wisdom suggests. Furthermore, we found
significant relationships (using the chi-square statistic) between positive responses from developers and the following factors: feedback being provided to them; highly respected SPI staff; and SPI training being provided. We found no relationship between developer enthusiasm and company maturity.

4.5. Evaluating SPI

Table 12 shows that while 68% of companies consider SPI to have been successful, 23% say it has been less than successful. In fact, only 16% of companies chose the highest success rating. On the other hand, Table 13 shows that SPI has delivered benefit to management in most companies. Our further
analysis of the companies reporting success with SPI showed that resourcing was the only factor with a significant relationship (using the chi-square statistic) to SPI success. We found no other significant relationships with SPI success.

Table 14 shows that less than a third of companies have evaluated the impact of SPI. However, Table 15 shows that more than half of the companies measure SPI effort. This suggests that while companies are keen to measure costs, they are less keen to measure other aspects of SPI. Furthermore, given the relatively small amount of evaluation data available, it is difficult to understand how companies are assessing SPI success. Of the 22 respondents (26%) who said their company did measure the impact of SPI, Table 16 shows
Table 12. Success of SPI
How successful has SPI been in your company?
                       Frequency   Percentage
Not successful         2           3
Marginally successful  15          20
Successful             40          52
Very successful        12          16
Not able to assess     8           10
Total                  77          100
Missing                8

Table 13. Management value
Has SPI delivered benefits to management?
            Frequency   Percentage
Yes         54          70
No          9           12
Don't know  14          18
Total       77          100
Missing     8

Table 14. Evaluation of SPI
Has your company evaluated the impact of SPI?
            Frequency   Percentage
Yes         22          29
No          46          63
Don't know  7           9
Total       75          100
Missing     10

Table 15. Effort in SPI
Does your company measure SPI effort?
            Frequency   Percentage
Yes         42          54
No          35          45
Don't know  1           1
Total       78          100
Missing     7

Table 16. Evaluation factors
(Percentages are of the 22 companies collecting evaluation data)
                       Use of measure (%)   Benefit of use (%)
Evaluation measures    No    Yes            None   Low   High   DK
Development time       4     96             0      27    64     9
Development costs      5     95             0      22    72     6
Customer satisfaction  7     93             4      20    68     8
Post-delivery defects  11    89             8      28    64     0
Defect density         22    78             15     30    55     0
the factors that were tracked. These factors generally correspond to the motivators for SPI that are presented in Table 4.

4.6. SPI and CMM Maturity

Table 17 indicates a relationship between maturity and SPI success (significant at the 5% level). Higher maturity companies are more likely to consider SPI successful than low maturity companies. This may be rather a tautology, but it nevertheless reconfirms the importance of process maturity. Table 18 shows that there is no clear relationship between maturity and the length of time companies have been using SPI. This is a useful result, as it suggests that the acceleration of maturity is not necessarily linear.
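The maturity/success association in Table 17 is tested with the chi-square statistic at the 5% level. A minimal sketch of such a test (Python, standard library only): the cell counts below are adapted from Table 17 with levels 3–5 collapsed into one band and 'don't know' responses dropped, so that expected cell frequencies are not too small. This collapsing is our assumption, not necessarily the authors' exact procedure.

```python
import math

# Counts adapted from Table 17: rows are CMM levels 1, 2 and 3+ (an
# illustrative collapsing, not necessarily the authors' procedure);
# columns are 'less successful' and 'more successful' companies.
observed = [[9, 3], [9, 19], [2, 10]]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Pearson chi-square statistic: sum over cells of (observed - expected)^2 / expected
chi2 = sum(
    (obs - row_totals[i] * col_totals[j] / n) ** 2 / (row_totals[i] * col_totals[j] / n)
    for i, row in enumerate(observed)
    for j, obs in enumerate(row)
)

dof = (len(observed) - 1) * (len(observed[0]) - 1)  # 2 degrees of freedom
# With 2 degrees of freedom the chi-square survival function is exactly exp(-x/2)
p_value = math.exp(-chi2 / 2)

print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
```

On these collapsed counts the association is significant well below the 5% level, consistent with the relationship the paper reports.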
5. DISCUSSION OF RESULTS

Generally, our results reveal that companies have been using SPI over a relatively long period of time. Despite this, comparatively few companies in
our study report high process maturity. Companies in our sample do not seem to be accelerating SPI as quickly as has been reported elsewhere (Hayes and Zubrow 1995). This is probably because the case studies reported in the literature describe leading edge SPI efforts. Companies in our study are probably more typical of the industry overall. Our findings substantiate Curtis' view (Curtis 2000) that the most successful companies have now mastered SPI, but the rest of the industry is yet to catch up. Furthermore, our results suggest that there is still a long way to go for the majority of companies. On the other hand, our results may be linked to Paulk et al.'s findings on the time lapse between implementing SPI and generating benefit – 'In some cases process changes took up to two years to demonstrate results.' (Paulk et al. 1994, p. 98).

Our findings suggest that the biggest impediment to SPI success is inadequate resourcing. Respondents told us that resourcing was important to SPI success, but that companies were failing to provide adequate resources. Furthermore, we found that resources were the only single factor directly
Table 17. CMM maturity and SPI success
                       How successful has SPI been?
CMM maturity level     Less successful          More successful
(formal and informal)  Frequency   Percentage   Frequency   Percentage
1                      9           41           3           8
2                      9           41           19          50
3                      2           9            8           21
4                      0           0            2           5
5                      0           0            0           0
DK                     2           9            6           13
Total                  22          100          38          100
Table 18. CMM maturity and SPI age
                       How long has SPI been in operation?
CMM level              Less than 2 years        3–5 years                More than 5 years
(formal and informal)  Frequency   Percentage   Frequency   Percentage   Frequency   Percentage
1                      4           33           6           23           2           9
2                      6           50           12          46           10          43
3                      2           22           2           8            6           27
4                      0           0            1           4            1           4
5                      0           0            0           0            0           0
DK                     0           0            5           19           3           13
Total                  12          100          26          100          22          100
related to SPI success. Perversely, resources were the only measurement data consistently collected. This indicates that companies were very interested in monitoring resources, which may explain why respondents said that inadequate resources were provided.

Companies were also generally ineffective at evaluating the impact of SPI. SPI effort was generally not focused and performance was not systematically assessed. The only aspect of SPI that seemed to be evaluated was costs. However, this result may be related to the overall low maturity of the sample.

Companies did not value background SPI research particularly highly. Linked to this is our finding that, on the whole, companies did not particularly value the input of external consultants. Companies seemed happiest relying on internal process expertise. This may be linked to the importance companies place on tailoring SPI to the particular needs of the company.

The quality of internal SPI staff emerged as a fundamentally important aspect of SPI success. Respondents felt that the use of experienced people in SPI teams was critically important, and that developer buy-in to SPI depended on respect for SPI staff. Our findings support experiences from the Space Shuttle case study (Paulk et al. 1994), where SPI was found to be implemented most successfully by experienced people.

Overall, our findings indicate that companies had a good understanding of the human factors associated with SPI. Most companies involved developers in SPI and understood the value of communicating to developers about SPI. Furthermore, companies also seemed to understand the importance of management commitment to SPI, and most seemed to be successfully demonstrating such commitment. Overall, our findings show that developers were responding relatively positively to SPI, and that developers were most positive about SPI when they received plenty of feedback on it.
6. SUMMARY AND CONCLUSIONS

Our findings generally indicate that SPI is progressing in the UK software industry. We show that SPI activity is widespread and that management benefit is generated by SPI. Furthermore, companies generally consider SPI reasonably successful. On the other hand, companies in our sample are not maturing as quickly as some of the experiences described in the literature.

Overall, companies show a good understanding of the human issues related to SPI implementation. Many of the implementation factors cited in the literature are being addressed. In particular, companies show a good appreciation of the importance of involving developers in SPI and demonstrating senior management commitment to SPI. Companies report relatively high developer enthusiasm for SPI.

Our results show that companies are performing less well on adequately resourcing SPI. We show a statistically significant relationship between resources and SPI success, which confirms the experiences of companies cited in the literature. In addition, although most companies claim to have objectives for SPI, many are failing to adequately track and evaluate SPI. We also confirm the importance of high-quality SPI staff: our study provides corroborating evidence of the relationship between highly respected SPI staff and SPI success, a relationship identified in a number of the SPI case studies published in the literature.

Overall, our findings show steady SPI progress in the UK software industry, progress that may be accelerated by improvement in the few weak implementation areas we identify.

7. FUTURE WORK

In conjunction with the questionnaire data we present here, we have also collected detailed case study data from 13 UK software companies. We plan to analyse this data to further investigate some of the issues uncovered by the questionnaire data. Our ultimate aim is to construct and validate a maturity-based model of SPI implementation factors.

APPENDIX 1: A TREND ANALYSIS OF PROCESS IMPROVEMENT PUBLICATIONS

Figure A1 shows a plot of the total number of papers published against the number of papers published with the phrase 'process improvement' in their title or abstract. The software engineering publications we analysed were: IEEE Software; IEEE Transactions on Software Engineering; Communications of the ACM; Journal of Systems and Software; Proceedings of the IEEE International Conference on Software Engineering; Empirical Software Engineering (began publishing in 1996); and Software Process Improvement and Practice (began publishing in 1996).

Figure A1. Number of papers published versus number of papers published with the phrase 'process improvement' in their title or abstract. (We were unable to obtain various publications for 1995. This partially explains the dip in SPI publications for that year.)
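The counting procedure behind this trend analysis can be sketched as follows. This is a hypothetical illustration only, not the authors' actual tooling; the `count_spi_papers` function, its record format, and the sample records are all invented for the example.

```python
from collections import Counter

def count_spi_papers(records, phrase="process improvement"):
    """Count, per year, total papers and papers mentioning the phrase.

    records: iterable of (year, title, abstract) tuples.
    Returns (totals, matches), both Counter objects keyed by year.
    """
    totals = Counter()
    matches = Counter()
    for year, title, abstract in records:
        totals[year] += 1
        # Case-insensitive match against title and abstract combined.
        if phrase in (title + " " + abstract).lower():
            matches[year] += 1
    return totals, matches

# Invented sample records, for illustration only.
sample = [
    (1996, "A Survey of Process Improvement Models", ""),
    (1996, "Formal Verification of Protocols", ""),
    (1997, "Measuring Software Quality",
     "We report on a process improvement programme."),
]

totals, matches = count_spi_papers(sample)
print(totals[1996], matches[1996], matches[1997])  # 2 1 1
```

Plotting `totals` against `matches` per year would reproduce the shape of Figure A1 for a given corpus.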
APPENDIX 2: DEMOGRAPHIC DATA FROM QUESTIONNAIRE RESPONSES

Table A1. Scope of company

                Frequency   Percentage
Multinational      46          55
UK-based           38          45
Total              84         100
Missing             1

Table A2. Size of development effort

Staff numbers   Frequency   Percentage
0-25               38          45
26-100             22          26
101+               23          27
DK                  1           1
Total              84         100
Missing             1

Table A3. Age of company

Years           Frequency   Percentage
0-5                 2           2
6-10               12          14
11-20              34          40
20+                37          43
Total              85         100
Missing             0

Table A4. ISO certification status

                Frequency   Percentage
Yes                77          95
No                  4           5
Total              81         100
Missing             4

Table A5. Effort areas (respondents could choose more than one effort area)

                      Frequency*  Percentage
System development        58          68
System maintenance        32          38
User support              33          39
Computer operations        5           6
Consultancy               23          27
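The arithmetic behind these tables can be made explicit with a short sketch. This is an illustrative reconstruction, not the authors' code: the `tabulate` helper is invented, percentages are taken over valid responses, and non-responses are reported as Missing.

```python
def tabulate(responses, n_questionnaires):
    """Compute frequency/percentage rows, Total and Missing for one question.

    responses: dict mapping category -> frequency of valid answers.
    Returns (rows, total, missing), where rows maps each category to a
    (frequency, rounded percentage of valid responses) pair.
    """
    total = sum(responses.values())
    rows = {cat: (freq, round(100 * freq / total))
            for cat, freq in responses.items()}
    # Missing = questionnaires returned without a valid answer.
    return rows, total, n_questionnaires - total

# Table A1 (scope of company): 85 questionnaires, 84 valid answers.
rows, total, missing = tabulate({"Multinational": 46, "UK-based": 38}, 85)
print(rows["Multinational"], total, missing)  # (46, 55) 84 1
```

The same calculation reproduces Tables A2 to A4; Tables A5 and A6 differ in that respondents could select several categories, so their percentages are taken over respondents rather than summing to 100.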
Table A6. Application areas (respondents could choose more than one application)

                      Frequency   Percentage
Bespoke systems           64          75
Commercial packages       44          52
Safety critical           24          28
Data processing           45          53
Business systems          54          63
Systems software          37          44
Telecommunications        34          40

Table A7. CMM maturity

CMM maturity     Formal assessment         Informal assessment
levels         Frequency   Percentage    Frequency   Percentage
1                  5           33            13          21
2                  3           20            28          46
3                  5           33            10          16
4                  2           13             2           3
5                  0            0             0           0
DK                 0            0             8          13
Total             15          100            61         100
ACKNOWLEDGEMENTS We are sincerely grateful to all the companies and practitioners (who, for reasons of confidentiality, must remain anonymous) for their participation in this project. We are also grateful to David Wilson from the University of Technology, Sydney, for his contribution to this project. The project is funded by the UK’s Engineering and Physical Science Research Council, under grant number EPSRC GR/L91962.
REFERENCES

Bach J. 1995. Enough about process: what we need are heroes. IEEE Software 12(2): 96–98.

Berdie DR, Anderson JF. 1974. Questionnaires: Design and Use. The Scarecrow Press: Metuchen.

Brodman JG, Johnson DL. 1994. What small businesses and small organizations say about the CMM. In Proceedings of the 16th International Conference on Software Engineering, May 16–21.

Butler KL. 1997. Process lessons learned while reaching level 4. CrossTalk May: 1–6.

Curtis B. 2000. The global pursuit of process maturity. IEEE Software July/August: 76–78.

DeMarco T, Lister T. 1987. Peopleware: Productive Projects and Teams. Dorset House Publishing.

El Emam K, Briand L. 1997. Costs and benefits of software process improvement. Technical report ISERN-97-12, Fraunhofer Institute for Experimental Software Engineering.

Goldenson DR, Herbsleb JD. 1995. After the appraisal: a systematic survey of process improvement. CMU/SEI-95-TR-009, Software Engineering Institute, Carnegie Mellon University.

Hayes W, Zubrow D. 1995. Moving on up: data and experience doing CMM-based process improvement. Technical report CMU/SEI-95-TR-008, Software Engineering Institute.

Herbsleb JD, Goldenson DR. 1996. A systematic survey of CMM experience and results. In Proceedings of the 18th International Conference on Software Engineering (ICSE), Berlin, Germany, 25–29 March, 323–330.

Humphrey WS. 1989. Managing the Software Process. Addison-Wesley.

Humphrey WS, Snyder TR, Willis RR. 1991. Software process improvement at Hughes Aircraft. IEEE Software 8(4): 11–23.

Humphrey WS. 1998. Why don't they practice what we preach? Annals of Software Engineering 6: 201–222.

Horvat RV, Rozeman I, Gyorkos J. 2000. Managing the complexity of SPI in small companies. Software Process Improvement and Practice 5: 45–54.

Johnson A. 1994. Software process improvement experience in the DP/MIS function. In Proceedings of the IEEE International Conference on Software Engineering.

Kaltio T, Kinnula A. 2000. Deploying the defined software process. Software Process Improvement and Practice 5: 65–83.

Kitson DH, Masters SM. 1993. An analysis of SEI software assessment results. In Proceedings of the IEEE International Conference on Software Engineering.

Komiyama T, Sunazuka T, Koyama S. 2000. Software process assessment and improvement in NEC: current status and future direction. Software Process Improvement and Practice 5: 31–43.

McDermid JA, Bennett KH. 1999. Software engineering research: a critical appraisal. IEE Proceedings – Software 146(4): 179–186.

Paulish DJ, Carleton AD. 1994. Case studies of software-process-improvement measurement. Computer 27(9): 50–57.

Paulk MC, Chrissis MB. 2000. The November 1999 High Maturity Workshop. Software Engineering Institute, Carnegie Mellon University.

Paulk MC, Weber CV, Curtis B, Chrissis MB (eds). 1994. A high-maturity example: Space Shuttle onboard software. In The Capability Maturity Model: Guidelines for Improving the Software Process. Addison-Wesley: Harlow, UK.

Pitterman B. 2000. Telcordia Technologies: the journey to high maturity. IEEE Software 17(4): 89–96.

Stelzer D, Mellis W. 1998. Success factors of organizational change in software process improvement. Software Process Improvement and Practice 4(4): 227–250.

Wilson D, Hall T, Baddoo N. 2000. The software process improvement paradox. In Approaches to Quality Management, Chadwick D, Hawkins C, King G, Ross M, Staples G (eds). British Computer Society: UK; 97–107.

Wilson D, Hall T, Baddoo N. 2002. A framework for evaluation and prediction of software process improvement success. Journal of Systems and Software (to appear).

Wohlwend H, Rosenbaum S. 1994. Schlumberger's software improvement program. IEEE Transactions on Software Engineering 20(11): 833–839.