Information & Management 43 (2006) 29–49 www.elsevier.com/locate/dsw

Deployment of systems development methodologies: Perceptual congruence between IS managers and systems developers

Magda Huisman a,*, Juhani Iivari b

a School for Computer Science, Statistics and Mathematics, North-West University (Potchefstroom Campus), Private Bag X6001, Potchefstroom 2520, South Africa
b Department of Information Processing Science, University of Oulu, Finland

Received 21 April 2002; received in revised form 21 July 2003; accepted 9 January 2005. Available online 25 April 2005.

Abstract

We studied the differences in perception between IS managers and developers about the deployment of systems development methodologies. The results indicated that IS managers were generally more positive about systems development methodologies than were developers. IS managers perceived methodology support for organizational alignment, and methodology impact on the productivity and quality of the development process, to be significantly more important than did system developers, who, in turn, rated methodology support for verification and validation significantly higher than did IS managers. These differences can be explained by the relevance and importance of the support to the task that each stakeholder group performs.
© 2005 Elsevier B.V. All rights reserved.

Keywords: Information system development; Methodologies; Methodology deployment; Managers; Developers; Perceptual congruence

1. Introduction

Systems development methodologies have been one of the most intensively studied topics in IS and Software Engineering research. Jayaratna [41] estimated the number of methodologies at about 1000. Furthermore, organisations face pressure to use these methodologies [17]. Despite the high investment in their development, their value is still a controversial issue [21,40,49,77]. Recent surveys indicated that many organizations claim that they do not use any methodologies [9,18,26,31,62].

Systems development is an activity involving and affecting many different stakeholder groups. Methodologies reflect these groups' viewpoints and interests differently, so the groups perceive the benefits and problems of systems development methodologies differently. Unfortunately, there is little comparative analysis of the perceptions of these groups. Markus and Bjørn-Andersen [47] discussed the types of control (power) exercised by an IS department over users: they identified the technical, structural, conceptual and symbolic exercise of power, which may be related to systems development methodologies. Kraft [45] applied labor process theory to argue that IS managers use methods as a way to control programmers and to formalize their work. Bansler and Havn [3] also contended that systems development should be studied from a labor process perspective. In our work, we focused on the differences in perception between IS managers and systems developers.

* Corresponding author. Tel.: +27 18 299 2537; fax: +27 18 299 2557. E-mail address: [email protected] (M. Huisman).

0378-7206/$ – see front matter © 2005 Elsevier B.V. All rights reserved. doi:10.1016/j.im.2005.01.005

2. Perceptual congruence

2.1. Technological frames

Orlikowski and Gash [55] introduced the concept of a ''technological frame'', originating from the Social Construction of Technology [6], to identify the different assumptions, expectations, and knowledge that members of an organization use in understanding technology. They argued that technological frames have a powerful effect on technology, as people's assumptions, expectations, and knowledge about the purpose, context, importance, and role of technology strongly influence their decisions during the design and use of those technologies. They identified managers, systems developers and users, at a minimum, as key actors whose actions significantly influence the process and outcome of technological change due to IT. These stakeholder groups have different technological frames because of their different roles and relationships. Orlikowski and Gash defined congruence in technological frames as alignment on key elements or categories. Congruence was not intended to suggest that frames were identical, but that they were related in structure and content. Congruent frames implied similar expectations about the role of the technology in the business, the nature of technological use, and the type and frequency of support and maintenance. Incongruent technological frames, on the other hand, implied important differences in expectations, assumptions or knowledge about some key aspects of the technology. They also claimed that the existence

of incongruent frames would cause difficulties and conflicts in the development, implementation and use of technologies. They also proposed a conceptual framework for examining the interpretations that people develop around technology, covering the nature of the technology, its strategy and its use. They tested this framework empirically and found that differences in expectations and actions between users and technologists could be traced to differences in their respective technological frames. Different frames implied different ways of knowing and making sense of technology, which in turn could result in misaligned expectations, contradictory actions, and unanticipated organisational consequences. They concluded that the concept of technological frames is particularly useful for examining and explaining the development, use, and change of IT in organisations; in particular, it explained and anticipated outcomes not captured by other perspectives.

2.2. Perceptual congruence

Research on technological frames has been qualitative in nature. We applied a related concept, ''perceptual congruence'', that allows a quantitative research perspective. Perceptual congruence can be defined as ''the degree to which individuals view matters similarly'' [71]. Perceptual incongruence can be attributed mainly to the division of labor: different job characteristics may lead to a greater range of perception, because of the different responsibilities, roles, and objectives associated with each task [54,76]. Research has indicated that greater perceptual congruence has a positive effect on an organization [66], which can be attributed to the fact that perceptual congruence reduces uncertainty and ambiguity between individuals [79]. Conversely, perceptual incongruence may prove to be a problem. Lederer and Prasad [46] found that different perceptions among cost estimators and non-estimators make it difficult to produce accurate cost estimates.
Deming [14] argued that barriers between workers with different responsibilities could obstruct teamwork and reduce productivity. It has also been reported that differences between users and systems professionals regarding development goals resulted in the delay of systems development, causing cost and schedule overruns [70]. In a discussion of CASE implementation, [12] listed a ''common view among managers and workers'' as critical for successful technology implementation. Even though research indicated that greater perceptual congruence has positive consequences, few researchers have applied the concept in the context of IS [20,53]. Empirical studies in which technological frames and perceptual congruence have been applied in the context of systems development are summarized in Table 1.

Table 1
Summary of empirical research on perceptual congruence in systems development

Study: Jiang et al. [43]
Stakeholders: IS users, IS staff
Perceptions studied: Performance ratings and importance of performance measures for IS staff.
Conclusions: Found significant differences between IS users and IS staff regarding performance ratings and the importance of performance measures for IS staff.

Study: Roberts et al. [60]
Stakeholders: Functional managers, IS managers, systems personnel, external consultants
Perceptions studied: Factors critical to systems development methodology implementation.
Conclusions: No significant differences were found between the perceptions of the stakeholder groups when the critical factors of systems development methodology implementation were studied.

Study: Jiang et al. [42]
Stakeholders: Systems developers, systems users
Perceptions studied: Frequency of systems development problems.
Conclusions: Systems developers and systems users perceived failures associated with IS development differently.

Study: Teo and King [69]
Stakeholders: Business planners, IS executives
Perceptions studied: Business planning and information systems planning integration.
Conclusions: In most organizations (62.4%) IS executives and business planners had congruent perceptions regarding the degree of integration between business planning and information systems planning.

Study: Von Hellens [75]
Stakeholders: Managerial, organizational and engineering viewpoints
Perceptions studied: Information systems quality versus software quality.
Conclusions: Perceptions of software quality by stakeholders are not mutually exclusive, but each group emphasizes different activities.

Study: Verner and Cerpa [73]
Stakeholders: Managers, analysts, software engineers
Perceptions studied: Advantages and disadvantages of the waterfall and prototyping approaches.
Conclusions: Perceptions of the advantages of the waterfall process model and prototyping differ significantly between managers, analysts and software engineers. The rating of an attribute depends on its relevance and importance to the job of the stakeholder.

Study: Lederer and Prasad [46]
Stakeholders: Cost estimators, non-estimators
Perceptions studied: Systems development cost estimation.
Conclusions: Identified congruent and incongruent perceptions in systems development cost estimation among the stakeholders.

Study: Finlay and Mitchell [16]
Stakeholders: Developers, customers
Perceptions studied: Tangible/intangible systems outcomes associated with the introduction of IE.
Conclusions: Developers and customers had different perceptions regarding the achieved benefits associated with the introduction of IE.

Study: Nelson [50]
Stakeholders: IS personnel, end-user personnel
Perceptions studied: Educational needs.
Conclusions: Identified congruent and incongruent perceptions of educational needs among stakeholders.

3. Systems development methodology deployment

3.1. Systems development methodologies

Trying to define a systems development methodology is not easy: there is no universally accepted, rigorous, and concise definition [38]. Some argue that the term ''methodology'' has no place in IS, because it literally means a ''science of methods'' [5,65]. Others argue that the terms can be applied interchangeably [11,29,64]. Still others state that methodologies encompass methods [30] or, conversely, that methods encompass methodologies [56]. Iivari and Maansaari [36] discussed a number of conceptual problems related to the use of the term systems development method, classifying them into two types of inconsistency: scope problems and category problems. Avison and Fitzgerald [2] argued that methodology is a wider concept than method, as it has certain characteristics that are not implied by method, notably the inclusion of a philosophical view. Therefore, for use here we define a systems development methodology as a combination of the following:

- A systems development approach: the philosophical view on which the methodology is built. It is the set of goals, guiding principles and beliefs, fundamental concepts, and principles of the systems development process that drive interpretations and actions [37]. Examples are the structured, object-oriented and information modelling approaches.
- A systems development process model: a representation of the sequence of stages through which a system evolves. Examples are the linear life-cycle model and the spiral model.
- A systems development method: a systematic way of conducting at least one complete phase of systems development, consisting of a set of guidelines, activities, techniques, and tools, based on a particular philosophy and the target system [81]. Examples include OMT and IE.
- A systems development technique: a procedure, possibly with a prescribed notation, to perform a development activity [7], for example the construction of entity-relationship diagrams.

3.2. Deployment

The deployment of systems development methodologies can be assessed from several perspectives. Previous studies focused on the level of use and acceptance [80], the aspects acquired and degree of use [57], horizontal and vertical use, satisfaction, and improved performance [48], and the delay until the technology first reached 25% of all new development [15]. Deployment encapsulates the post-implementation stages of the innovation diffusion process, in which the innovation is actually being used in the organisation.

3.2.1. The perceived support that systems development methodologies provide

To measure the perceived support that systems development methodologies provide, we based our measurement on the work of Henderson and Cooprider [28], who developed and empirically tested a functional model for IS planning and design aids. The model consists of two major categories: production and co-ordination technology. They defined production technology as ''functionality that directly impacts the capacity of an individual(s) to generate planning or design decisions and subsequent artefacts or products.'' Production technology consists of representation, analysis, and transformation functionality. It therefore:

- Allows the user to define, describe, or change a definition or description of an object, relationship, or process.
- Allows the user to explore, simulate, or evaluate alternate representations or models of objects, relationships, or processes.
- Executes a significant planning or design task, thereby replacing or substituting for a human designer or planner.

Co-ordination technology is ''functionality that enables or supports the interactions of multiple agents in the execution of a planning or design task.'' It consists of control and co-operative functionality. The first ''enables the user to plan for and enforce rules, policies or priorities that will govern or restrict the activities of team members during the planning or design process.'' The second ''enables the user to exchange information with other individual(s) for the purpose of influencing (affecting) the concept, process or product of the planning/design team.'' Henderson and Cooprider stated that organisational technology consisted of support functionality, ''to help an individual user understand and use a planning and design aid effectively'', and infrastructure, ''standards that enable portability of skills, knowledge, procedures, or methods across planning or design processes.'' Support functionality is a ''meta-functionality'', as it supports the utilisation of all basic functionalities. Henderson and Cooprider found that it was difficult for respondents to discriminate between types of support functionality. The infrastructure component resulted from feedback during their study and was not tested empirically by them. We see infrastructure functionalities as standards to support co-operation. To emphasize the significance of shared knowledge of systems development methodologies, we extended co-operative functionality to cover aspects of shared knowledge.

3.2.2. The impact of systems development methodologies

When the impact of systems development methodologies is studied, it is necessary to focus on two aspects [8,27,63]:

- the impact on the quality of the product (the developed system); and
- the impact on the quality and productivity of the systems development process.

These are the ultimate criteria by which a systems development methodology (and CASE technology) will be assessed, and they were also among the measures proposed for measuring the effectiveness of systems development methodologies [82]. We selected five perspectives for studying the deployment of systems development methodologies. They involved the perception of how the methodology:

- provided support as production technology;
- provided support as control technology;
- provided support as cognitive and co-operative technology;
- impacted the quality of the developed systems;
- impacted the productivity and quality of the development process.

These can be mapped to the domains of technological frames of Orlikowski and Gash. The perceived support that systems development methodologies provide can be linked to people's understanding of the technology's capabilities and functionalities and its probable value to the organization. Furthermore, the perceived impact of systems development methodologies can be linked to the probable or actual consequences of the technology's use.

4. Perceptual congruence and deployment of systems development methodologies

Software development has undergone a process of industrialization. This transformation included:

- The substitution of software and machines for human labor through the use of CASE tools [74].
- Division of labor, fragmentation of work, and stratification of jobs, including hierarchies of authority in software development.
- Introduction of management-defined production and documentation standards and quality control measures.

Reflecting the labor process perspective, some researchers have claimed that systems development techniques are primarily introduced as a means of improving management control [23]. The labor process perspective argues that this transformation has not been the result of a technological imperative, but of managers wanting to restructure the process to increase profits. Even though all of these transformations, especially the division of labor into more fragmented jobs, have not been realized [19], we believe that management's agenda has been to introduce development methodologies as a way to achieve change in the systems development process. This is partly supported by empirical findings that management's role is vital for the acceptance of process innovations [13,25,34,68,72]. This led to our general hypothesis:

H1. The perceptions of IS managers of systems development methodologies are more positive than those of the developers.

IS managers are concerned with the effectiveness of the organization, the production process, and how IS contributes to the profitability of the organization. Systems developers, on the other hand, are mainly concerned with the production of the final product. Since both are involved with the production of an IS, one would expect that their perceptions would not be mutually exclusive and that some congruent perceptions exist. However, the responsibilities of IS managers and developers differ. Besides being responsible for delivering the final system, IS managers are responsible for an effective production process. This led to our first sub-hypothesis:

H1.1. Managers perceive the support provided by systems development methodologies as production technology higher than developers do.

IS managers have to manage the resources used during development and maintenance: people, budget, and time. Therefore, they should emphasize this aspect more than the developers do. One of the conceptual underpinnings of systems development methodologies is the facilitation of project management and control [78]. We formulated the second sub-hypothesis as:

H1.2. Managers perceive the support provided by systems development methodologies as control technology higher than developers do.

Systems development is a collective work process involving systems professionals and customer representatives with different educational backgrounds and experience. IS managers must organize the different team members so that co-operation and communication problems are kept to a minimum. This led to our third sub-hypothesis:

H1.3. Managers perceive the support provided by systems development methodologies as cognitive and co-operative technology higher than developers do.

The developed system is of concern to both parties, but managers are also responsible for an effective development process. The development of an information architecture and the improvement of the software development process are listed among the top ten concerns of senior IS managers [51]. We therefore formulated the fourth and fifth sub-hypotheses as:

H1.4. Managers perceive the impact of systems development methodologies on the quality of developed systems higher than developers do.

H1.5. Managers perceive the impact of systems development methodologies on the productivity and quality of the development process higher than developers do.

5. Research design

5.1. Survey

A multiple-respondent mail survey was used as the research method. Two questionnaires were developed, one to be answered by IS managers and the other by systems developers. Both consisted mainly of closed-ended questions using a five-point Likert scale. The appropriateness and validity of the questionnaires were tested in two stages. First, six lecturers from the Computer Science and Information Systems Department at the Potchefstroom University for CHE, South Africa, tested the questionnaires. After some changes, they were pilot tested in the IS department of an organisation in Gauteng, South Africa.

This study formed part of a larger survey of systems development methodologies conducted between July and October 1999 in South Africa. The 1999 IT Users Handbook (the most comprehensive reference guide to the IT industry in South Africa) was used to determine the population, and 443 listed organizations were contacted by telephone to determine whether they were willing to participate in our study; 213 organizations agreed to take part. A package of questionnaires was sent to a contact person in each organization for distribution. The package consisted of one questionnaire to be answered by the IS manager and a number of questionnaires to be answered by individual systems developers; the number of developer questionnaires needed was determined for each organization during the telephone contact. The response rate of the survey is presented in Table 2.

Table 2
Response rate of survey

              | Number distributed | Number returned | Response rate (%)
Organisations | 213                | 80              | 37.6
Developers    | 893                | 234             | 26.2
Managers      | 213                | 73              | 34.3

Due to problems with the mailing service, it was not possible to analyze non-response bias based on a comparison of the latest responses: one could not be sure that responses that arrived last were not answered first. However, when we compared the sectoral composition of businesses in South Africa at the time of the survey [1] with the sectoral composition of businesses in our sample, the z-test for differences between two proportions showed no significant differences. This is shown in Table 3. The profiles of the participating organisations and the individual developers are shown in Tables 4 and 5, respectively.

Table 3
Comparison of sectoral composition of businesses in sample and businesses in South Africa at the time the survey was performed, using the z-test for differences between two proportions

Business sector                 | Sample (%) | South Africa (%) | p-value
Primary sector
  Agriculture                   | 1.4        | 3.8              | 0.28
  Mining                        | 6.8        | 6.6              | 0.95
Secondary sector
  Manufacturing                 | 39.7       | 32.2             | 0.17
  Electricity and water supply  | 1.4        | 3.7              | 0.30
  Construction                  | 4.1        | 5.4              | 0.62
Tertiary sector
  Trade                         | 11.0       | 13.9             | 0.47
  Transport and communication   | 8.2        | 10.4             | 0.54
  Community services            | 9.6        | 6.5              | 0.28
  Financial and business        | 17.8       | 17.5             | 0.95

Table 4
Profile of responding organisations (n = 73)

                                | N  | %
Business sector
  Agriculture                   | 1  | 1.4
  Mining                        | 5  | 6.8
  Manufacturing                 | 29 | 39.7
  Electricity and water supply  | 1  | 1.4
  Construction                  | 3  | 4.1
  Trade                         | 8  | 11.0
  Transport and communication   | 6  | 8.2
  Community services            | 7  | 9.6
  Financial and business        | 13 | 17.8
Organisation size
  1–50 employees                | 5  | 6.8
  51–200 employees              | 8  | 11.0
  More than 200 employees       | 60 | 82.2
IS department size
  1–5 employees                 | 17 | 23.3
  6–20 employees                | 23 | 31.5
  20–50 employees               | 12 | 16.4
  More than 50 employees        | 21 | 28.8

Table 5
Profile of responding IS developers (n = 234)

                                   | N  | %
Education
  Senior certificate (high school) | 39 | 16.7
  Certificate or diploma           | 79 | 33.8
  University or technikon degree   | 75 | 32.1
  Honours or Master's degree       | 36 | 15.4
  Ph.D. degree                     | 0  | 0.0
  Other                            | 3  | 1.3
  Missing                          | 2  | 0.9
Title
  IS manager                       | 11 | 4.7
  Project manager                  | 25 | 10.7
  Team leader                      | 13 | 5.6
  Systems analyst                  | 24 | 10.3
  Analyst/programmer               | 93 | 39.7
  Programmer                       | 36 | 15.4
  Other                            | 29 | 12.4
  Missing                          | 3  | 1.3
Experience in systems development
  None                             | 4  | 1.7
  Less than 1 year                 | 14 | 6.0
  1–2 years                        | 21 | 9.0
  3–5 years                        | 51 | 21.8
  5–10 years                       | 53 | 22.6
  More than 10 years               | 89 | 38.0
  Missing                          | 2  | 0.9
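The representativeness check behind Table 3 uses a z-test for differences between two proportions. A minimal sketch of that test is below; the sample side uses the paper's counts (29 of 73 organisations in manufacturing), while the reference side's counts are assumed for illustration, since only the national percentages are reported:

```python
# Sketch of the z-test for differences between two proportions used for
# Table 3's representativeness check. The reference sample size (1000)
# is an ASSUMPTION for illustration; the paper reports only percentages.
import math
from statistics import NormalDist

def two_proportion_z_test(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Return (z, two-sided p-value) for H0: the two proportions are equal."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                # pooled proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided normal tail
    return z, p_value

# Manufacturing: 29 of 73 sample organisations (39.7%) versus an assumed
# reference sample of 1000 businesses, 322 of them manufacturers (32.2%).
z, p = two_proportion_z_test(29, 73, 322, 1000)
```

A non-significant p-value means the sample's sectoral share is treated as consistent with the national composition, which is the reading applied to every row of Table 3.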


5.2. Measurement

All questions, as shown in Appendix A, were addressed to both developers and managers. In most cases, multiple items were used to measure the research variables, resulting in a large data set. To reduce it, we performed factor analysis using the principal-components method with varimax normalized rotation. To determine the number of factors to retain, we used the widely applied Kaiser criterion: only factors with eigenvalues greater than 1 are retained. For each item, the factor loadings were analysed and the item was grouped into the factor on which it had the highest loading. The results of the factor analysis are reported in Appendix B. We also performed reliability analysis on the items of each identified factor, using Cronbach's alpha as the index of reliability. There is no exact threshold for the required reliability, but 0.5–0.6 has been recommended as the minimum criterion in exploratory research [52,67]. In view of the lack of empirical research, we used a threshold value of 0.6. The items of each factor were then studied and an explanatory name was created to represent it. The value for each factor was computed as the average of its items.

Perceived methodology support provided as production technology was measured using 11 five-point Likert scale items, varying from 1 (totally disagree) to 5 (totally agree). Factor analysis using the developer data gave only one factor, while factor analysis using the manager data gave three factors: ''Support for organizational alignment'', ''Support for technical design'' and ''Support for verification and validation''. The following analysis uses both factor structures. The reliability of the first factor was 0.90/0.91 (footnote 1), of the second factor 0.85/0.82, of the third factor 0.83/0.86, and of the total factor (including all items) 0.94/0.91.

Perceived methodology support provided as control technology was measured using nine five-point Likert scale items. Factor analysis using the developer data and the manager data separately resulted in only one factor, whose reliability was 0.94/0.92.

Perceived methodology support provided as cognitive and co-operative technology was measured using 11 five-point Likert scale items. Factor analysis using the developer and manager data separately resulted in both instances in two similar factors: ''Support for the common conception of systems development practice'' and ''Support for the evaluation of systems development practice''. The reliability of the first factor was 0.92/0.92 and that of the second factor 0.79/0.92.

Perceived methodology impact on the quality of developed systems was measured using eight five-point Likert scale items, adopted from the ISO 9126 standard [39]. Factor analysis using the developer data and the manager data separately resulted in only one factor, whose reliability was 0.95/0.93.

Perceived methodology impact on the quality and productivity of the development process was measured using 10 five-point Likert scale items. Factor analysis using the developer data resulted in only one factor, while the factor analysis on the manager data resulted in two factors: ''Productivity effects and morale'' and ''Quality effects, goal achievement and reputation''. The following analysis uses both factor structures. The reliability of the first factor was 0.89/0.90, of the second factor 0.88/0.86, and of the total factor (including all items) 0.94/0.92.

Two background variables, ''Perceived performance of the IS department'' and ''Maximum intensity of method use'', were also measured. Perceived performance of the IS department was measured using 10 five-point Likert scale items. Factor analysis of the merged developer and manager data (n = 307) gave three factors (footnote 2): ''Productivity and quality'', ''Cost of development and maintenance'' and ''Organizational health''. The reliability of the first factor was 0.80/0.76, of the second factor 0.72/0.73, and of the third factor 0.79/0.77 after deleting one item. Maximum intensity of method use was measured as the maximum of the organizational usage, varying from 1 (nominally) to 5 (intensively), of 29 listed methods, possible other standard (commercial) methods, and possible in-house developed methods.

Footnote 1: The figure before the slash refers to the developer data and the figure after the slash to the manager data.
Footnote 2: Factor analysis was also performed separately for the developer data and the manager data. Both resulted in three very similar factor structures, with the exception of two items, which loaded on different factors. The factor loadings were very similar, and therefore the merged data was used.
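The reliability index and factor scoring described in Section 5.2 can be sketched as follows. Cronbach's alpha is computed over the items of one factor, and the factor value is the average of its items; the Likert responses below are made up for illustration (each inner list holds one item's responses across respondents):

```python
# Sketch of Section 5.2's reliability and scoring computations.
# The response matrix is an illustrative ASSUMPTION, not the survey data.
from statistics import pvariance

def cronbach_alpha(items: list[list[float]]) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(items)
    item_variances = sum(pvariance(item) for item in items)
    totals = [sum(resp) for resp in zip(*items)]  # per-respondent total score
    return k / (k - 1) * (1 - item_variances / pvariance(totals))

def factor_score(items: list[list[float]]) -> list[float]:
    """Per-respondent factor value: the average of the factor's items."""
    return [sum(resp) / len(resp) for resp in zip(*items)]

likert_items = [
    [4, 3, 5, 2, 4],  # item 1, five respondents
    [4, 2, 5, 3, 4],  # item 2
    [5, 3, 4, 2, 5],  # item 3
]
alpha = cronbach_alpha(likert_items)  # compared against the 0.6 threshold
scores = factor_score(likert_items)
```

Perfectly parallel items yield an alpha of 1.0; the 0.6 threshold used in the paper sits above the 0.5–0.6 minimum recommended for exploratory research.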


Table 6
Results of analysis for the background variables

                                      | Individual level                       | Organizational level
                                      | Manager  Developer  t-value  p-value   | Manager  Developer  t-value  p-value
Performance of the IS department
  Productivity and quality            | 3.63     3.57       0.57     0.57      | 3.64     3.57       0.73     0.47
  Cost of development and maintenance | 3.35     3.53       1.48     0.14      | 3.31     3.44       1.33     0.19
  Organizational health               | 3.65     3.52       1.30     0.19      | 3.64     3.57       0.90     0.37
Maximum intensity of method use
  Max method use                      | 2.85     2.74       0.40     0.69      | 2.93     2.56       1.51     0.14

# p ≤ 0.10; * p ≤ 0.05; ** p ≤ 0.01; *** p ≤ 0.001.

5.3. Data analysis

Data analysis was performed using the Statistica (version 5) software. Two analyses were conducted. In the first analysis, the group of developers (n = 223)3 and the group of managers (n = 73) were treated as independent samples. The multivariate Hotelling t-test for independent samples was used to analyze the differences between the perceptions of the developers and managers at the individual level. This test (a) indicates the difference between vectors of elements, where the number of elements is greater than or equal to one, and (b) performs post hoc tests that indicate the difference for each vector element individually when the number of vector elements is greater than one [24]. The differences tested for were the performance of the IS department, the maximum intensity of method use, the perceptions of methodology support, and the impact of systems development methodologies on the developed systems and the development process.

However, responses from managers and developers from the same organizations were also available. A second analysis was therefore conducted to analyze the difference between the perceptions of developers and managers at the organizational level. Using the t-test for dependent samples (paired-sample t-test), the average of the perceptions of the developers in an IS department was compared with the perception of its IS manager. The t-test for dependent samples only reports the differences between individual variables, and not between a vector of elements.

3 Eleven questionnaires of developers who indicated their title as IS managers were not included in the analysis.
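The two analyses can be sketched as follows. This is a minimal illustration using NumPy and SciPy; the helper function and the toy data are our own, not the study's:

```python
import numpy as np
from scipy import stats

def hotelling_t2_independent(x, y):
    """Two-sample Hotelling T^2 test on a vector of p elements.

    x, y: (n1, p) and (n2, p) arrays of individual-level scores.
    Returns T^2, the equivalent F statistic, its degrees of
    freedom, and the p-value.
    """
    n1, p = x.shape
    n2 = y.shape[0]
    diff = x.mean(axis=0) - y.mean(axis=0)
    # Pooled sample covariance matrix of the two groups
    s_pooled = ((n1 - 1) * np.cov(x, rowvar=False) +
                (n2 - 1) * np.cov(y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(s_pooled, diff)
    # Convert T^2 to an F statistic with (p, n1 + n2 - p - 1) df
    df1, df2 = p, n1 + n2 - p - 1
    f = df2 / ((n1 + n2 - 2) * p) * t2
    p_value = stats.f.sf(f, df1, df2)
    return t2, f, (df1, df2), p_value

# Individual level: toy scores on 3 perception factors
rng = np.random.default_rng(0)
developers = rng.normal(3.4, 0.6, size=(223, 3))
managers = rng.normal(3.6, 0.6, size=(73, 3))
t2, f, df, p = hotelling_t2_independent(managers, developers)

# Organizational level: paired t-test of each manager's score
# against the average score of that manager's developers
mgr = np.array([3.6, 3.5, 3.7, 3.4])
dev_avg = np.array([3.4, 3.5, 3.5, 3.3])
t_stat, p_paired = stats.ttest_rel(mgr, dev_avg)
```

Note that the paired test compares one number per organization, which is why it cannot test a whole vector of factors at once.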

6. Results

6.1. Background variables

Table 6 gives a summary of the results for the analysis of the two background variables. The first part of Table 6 contains the results for the perceived performance of the IS department. At the individual level, IS managers were slightly more positive about the perceived performance of the IS department: they reported the highest values for "Productivity and quality" and "Organizational health" and the lowest value for "Cost of development and maintenance". However, these differences were not statistically significant. When the vector consisting of the above three factors was analyzed, Hotelling T2 = 2.89 at the level of F(3, 270) = 0.96 and p = 0.41 (see footnote 4), indicating that no significant differences exist between the perceptions of managers and developers regarding the vector. A paired-sample t-test showed that at the organizational level no significant differences exist between the perceptions of managers and developers regarding the three factors describing the perceived performance of the IS department. The perceptions regarding the maximum intensity of method use are described in the second part of

4 In F(x, y) the values in brackets represent the degrees of freedom, where x equals the number of groups minus 1, and y equals the total number of observations in the groups minus the number of groups. In our analysis the degrees of freedom may vary because of missing values.


M. Huisman, J. Iivari / Information & Management 43 (2006) 29–49

Table 7
Results for perceived support provided by systems development methodologies

                                           Individual level                         Organizational level
                                           Manager  Developer  t-value  p-value    Manager  Developer  t-value  p-value
Production technology
  Organizational alignment                 3.64     3.33       2.31     0.02*      3.68     3.48       1.54     0.13
  Technical design                         3.38     3.28       0.74     0.46       3.47     3.42       0.44     0.66
  Verification and validation              2.93     3.13       1.24     0.21       2.92     3.25       2.29     0.03*
  All items                                3.43     3.26       1.28     0.20       3.47     3.43       0.41     0.68
Control technology
  All items                                3.35     3.23       0.95     0.34       3.36     3.31       0.51     0.61
Cognitive and co-operative technology
  Common conception of SD practice         3.36     3.15       1.74     0.08#      3.45     3.25       1.90     0.06#
  Evaluation of SD practice                3.24     3.10       0.84     0.40       3.42     3.24       1.04     0.30

# p ≤ 0.10, * p ≤ 0.05, ** p ≤ 0.01, *** p ≤ 0.001.

Table 6. Both at the individual and the organizational level managers reported higher values for method use, but these differences were not statistically significant.

6.2. Support provided by systems development methodologies

Table 7 contains the results of the perceptions regarding the support provided by systems development methodologies. They show that both IS managers and developers perceived the support slightly positively (mean > 3.00), the only exception being the managers' perception of support for verification and validation. In the following we analyze the differences between managers and developers.

6.2.1. Support provided as production technology

The first part of Table 7 shows the perceived methodology support provided as production technology. At the individual level the perceptions of managers and developers differed significantly for the vector consisting of "Support for organizational alignment", "Support for technical design" and "Support for verification and validation", with Hotelling T2 = 21.56 at the level of F(3, 229) = 7.12 and p = 0.00014. Managers reported the highest values for "Support for organizational alignment", generally the same values were reported for "Support for technical design", and developers reported the highest values for "Support for verification and validation". The post hoc tests showed that only

"Support for organizational alignment" differed statistically significantly between managers and developers when each of the individual factors and the factor consisting of all items were considered separately. At the organizational level, only "Support for verification and validation" differed statistically significantly; the factor consisting of all the items did not differ at the organizational level.

6.2.2. Support provided as control technology

The second part of Table 7 summarizes the perceived methodology support provided as control technology. In contrast to our hypothesis, no significant differences were present between the perceptions of the managers and developers, at either the individual or the organizational level.

6.2.3. Support provided as cognitive and co-operative technology

The perceptions regarding methodology support provided as cognitive and co-operative technology are described in the last part of Table 7. At the individual level managers reported slightly higher values for "Support for the common conception of systems development practice" and "Support for the evaluation of systems development practice", but the vector consisting of the two factors did not differ significantly, with Hotelling T2 = 3.03 at the level of F(2, 228) = 1.51 and p = 0.22.


Table 8
Results for impact of systems development methodologies

                                                  Individual level                          Organizational level
                                                  Manager  Developer  t-value  p-value     Manager  Developer  t-value  p-value
Impact on developed system
  All items                                       3.53     3.32       1.64     0.10#       3.55     3.45       0.85     0.40
Impact on development process
  Productivity effects and morale                 3.41     3.13       2.11     0.04*       3.43     3.24       1.59     0.12
  Quality effects, goal achievement, reputation   3.58     3.21       2.84     0.005**     3.62     3.37       2.22     0.03*
  All items                                       3.49     3.17       2.59     0.01*       3.52     3.31       2.13     0.04*

# p ≤ 0.10, * p ≤ 0.05, ** p ≤ 0.01, *** p ≤ 0.001.

A paired-sample t-test showed that at the organizational level only "Support for the common conception of systems development practice" differed statistically significantly (p = 0.06), in the sense that managers reported higher values than developers.

6.3. Impact of systems development methodologies

Table 8 reports the results of the analysis of the perceived impact of systems development methodologies on the developed system and the development process. They show that both IS managers and developers perceived the impact slightly positively (mean > 3.00). In the following we analyze the differences between managers and developers.

6.3.1. Impact on the quality of the developed system

The first part of Table 8 shows the perceived impact of systems development methodologies on the quality of the developed system. Although managers reported slightly higher values than developers, no statistically significant differences were present at the individual or organizational level.

6.3.2. Impact on the quality and productivity of the development process

The perceived methodology impact on the quality and the productivity of the development process is depicted in the last part of Table 8. At the individual level the vector consisting of "Productivity effects and morale" and "Quality effects, goal achievement and reputation" differed statistically significantly

with Hotelling T2 = 8.06 at the level of F(2, 232) = 4.01 and p = 0.02. The post hoc tests also indicated that the individual factors "Productivity effects and morale" and "Quality effects, goal achievement and reputation" differed statistically significantly, as did the factor consisting of all the items measuring the impact of systems development methodologies on the development process. Paired-sample t-tests indicated that at the organizational level "Quality effects, goal achievement, and reputation", as well as the factor consisting of all items, differed significantly.

6.4. General hypothesis

To test our general hypothesis that IS managers are more positive in their perception of systems development methodologies than developers, we performed a multivariate Hotelling t-test with all the factors of the research variables (except the total factors) as the vector elements. For the vector, Hotelling T2 was 37.6 at the level of F(9, 205) = 4.02 and p = 0.00009. This indicates that the perceptions of IS managers and developers differ significantly on the support provided by, and the impact of, systems development methodologies.
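Throughout, the reported F values follow from Hotelling's T2 via the standard two-sample conversion (our restatement, checked against the values reported above; n1 and n2 denote the two group sizes and p the number of vector elements):

```latex
F \;=\; \frac{n_1 + n_2 - p - 1}{(n_1 + n_2 - 2)\,p}\, T^2,
\qquad F \sim F\bigl(p,\; n_1 + n_2 - p - 1\bigr).
```

For the general hypothesis, for example, T2 = 37.6 with p = 9 and n1 + n2 - p - 1 = 205 gives F = (205 / (213 x 9)) x 37.6 ≈ 4.02, matching the reported F(9, 205).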

7. Discussion

Our findings suggest that the perceptions of managers and developers about systems development methodologies differ. A summary of the results is given in Table 9.

Table 9
Summary of the differences in perceptions

Perspective of deployment                                  Hypothesis  Research variable                                  Individual               Organizational
Methodology support provided as production technology      H1.1(a)     Support for organizational alignment               Managers + (p ≤ 0.05)    -
                                                           H1.1(b)     Support for technical design                       -                        -
                                                           H1.1(c)     Support for verification and validation            -                        Developers + (p ≤ 0.05)
Methodology support provided as control technology         H1.2        Support as control technology                      -                        -
Methodology support provided as cognitive and              H1.3(a)     Support for the common conception of SD practice   Managers + (p ≤ 0.10)    Managers + (p ≤ 0.10)
  co-operative technology                                  H1.3(b)     Support for the evaluation of SD practice          -                        -
Methodology impact on the quality of developed systems     H1.4        Impact on the quality of the system                Managers + (p ≤ 0.10)    -
Methodology impact on the productivity and quality         H1.5(a)     Productivity effects and morale                    Managers + (p ≤ 0.05)    -
  of the development process                               H1.5(b)     Quality effects, goal achievement and reputation   Managers + (p ≤ 0.01)    Managers + (p ≤ 0.05)

Overall, the perceptions of IS managers were more positive than those of developers. This is in accordance with previous findings [83] and supported our conjecture that systems development methodologies reflect management's agenda.

The perceptions of the IS managers and developers regarding the two background variables were congruent. Managers and developers viewed the performance of the IS department the same at the individual and organizational level, and the values reported for the maximum intensity of method use did not differ at either level.

For hypothesis H1.1, when all items measuring methodology support provided as production technology were considered as one factor, no significant differences were found between the perceptions of managers and developers. However, when studying the more detailed factors, managers perceived "Support for organizational alignment" significantly higher than developers at the individual level, whereas developers reported significantly higher values at the organizational level for "Support for verification and validation". In the case of "Support for technical design" no statistical differences were found. These results may confirm previous research suggesting that managers and developers rate a factor depending on its relevance and importance to their job. As developers perceived "Support for verification and validation" significantly more important than managers did, verification and validation may seem less important to managers. This is accentuated by the low support for it reported by both managers and developers (see Table 7).

Managers and developers perceived methodology support provided as control technology to the same extent, possibly because of the duality of the role of IS managers [4], who act both as supervisors and subordinates. It may well be that IS managers evaluated methodology support as control technology both as subordinates and as supervisors. The problems of keeping deadlines and budgets in systems development [10], despite all methodology support, may inflate the managers' perceptions of methodology support as control technology. Another explanation might be that project management forms one of the common spheres of experience between managers and developers, so their degree of "homophily" [61] is relatively high in that area. Homophily is defined by Rogers as the degree to which two or more individuals who interact are similar in certain aspects. This in turn may lead to congruent perceptions between the IS managers and developers.

Klenke [44] argued that employees with strong professional orientations, such as IS developers, typically cultivate horizontal rather than vertical relationships and give greater credence to peer


processes. As a result, a strong professional orientation potentially reduces the influence of the supervisor.

There was only a minor difference between the perceptions of IS managers and developers about the impact of systems development methodologies on the quality of the developed system. However, when we consider the impact of methodologies on the quality and productivity of the development process, IS managers reported significantly higher values at both the individual and the organizational level, an indication that stakeholders emphasize the activities that are most important to them. Since IS managers are responsible for the development process, they emphasize it more.

8. Conclusions

Systems development methodologies reflect management's agenda in the sense that IS managers saw the support provided by systems development methodologies, and their impact, more positively than did systems developers. The essential managerial implication of the study is that it is extremely important for management to be aware of this perceptual incongruence. Systems developers tended to view the support provided by development methodologies and their impact more critically than IS managers, which is probably one reason for the low degree of use of systems development methodologies (see Table 6).

It is also significant to realize that perceptual incongruences between IS managers and systems developers imply different expectations, assumptions, and norms. These differences may cause conflicts between managers and developers during systems development projects, and such conflicts jeopardize the projects. Problems in project performance may then lead to a vicious circle in which IS managers see increased adherence to a systems development methodology as a means to remedy the situation while developers see just the opposite, until the process collapses into a totally ad hoc style of development [33]. In order to avoid this worst-case scenario it is important for managers and systems developers to


decrease the perceptual incongruence through better communication and interaction. If management wishes to persuade systems developers to accept systems development methodologies, the question is how to do it. Closer communication and interaction between IS managers and systems developers is, of course, one means. Education and training in systems development methodologies is another. A case reported by Huysman [32] showed that, despite in-house training courses on IS design, most developers continued to work in their old way: training alone is not enough. Moreover, though the company hired new systems developers in the hope that old-timers might learn new ways from them, the newcomers gradually accepted the existing work practices of the old-timers.

Managers may also use their power to influence systems developers. Strong support has been found for the positive influence of management support, and for the negative influence of voluntariness, on CASE usage [35]. A strong negative influence of voluntariness on the use of the Personal Software Process innovation has been reported [22], and subjective norm and voluntariness have been found to be significant predictors of the intention to use a systems development methodology [59]. It has also been found that top management leadership is very instrumental to the adoption of quality management in systems development and to quality performance [58].

IS managers should also take into account the possibility that they overrate the support provided by systems development methodologies and their impact. Perhaps the critics of systems development methodologies are right in the sense that the methodologies are not as beneficial as they are claimed to be. This leads to the major weakness of this paper: we analysed systems development methodologies as a homogeneous phenomenon.
However, one might question whether there are differences in managers' and systems developers' perceptions of the support provided by, and the impact of, alternative systems development methodologies. Unfortunately, our data do not allow an analysis at this level of detail. The data from South African firms were dominated by the classical structured and information modelling approaches; newer ones, such as the object-oriented approach, were not well represented.


Appendix A

Items used to measure perceived methodology support provided as production technology (each item rated from 1 = totally disagree to 5 = totally agree)

1. Our systems development methodology helps to align the system to be developed with the business.
2. Our systems development methodology helps to capture requirements for the system to be developed.
3. Our systems development methodology helps to design the architecture of the system to be developed.
4. Our systems development methodology helps in system design.
5. Our systems development methodology helps in implementing developed systems.
6. Our systems development methodology helps in reviewing developed systems.
7. Our systems development methodology helps in testing developed systems.
8. Our systems development methodology helps to reuse earlier requirements, designs and code during systems development.
9. Our systems development methodology helps to involve end-users in systems development projects.
10. Our systems development methodology helps to build management commitment in our systems development projects.
11. Our systems development methodology helps to get the systems accepted.

Items used to measure perceived methodology support provided as control technology (each item rated from 1 = totally disagree to 5 = totally agree)

1. Our systems development methodology helps to decompose the system to be developed in workable parts.
2. Our systems development methodology helps to estimate the size of the system to be developed.
3. Our systems development methodology helps to estimate the time and effort required for the development of a planned system.
4. Our systems development methodology helps to plan systems development projects.
5. Our systems development methodology helps in defining useful milestones for our systems development projects.
6. Our systems development methodology helps to organize systems development projects.
7. Our systems development methodology helps to keep our systems development projects under control.
8. Our systems development methodology helps to estimate the project risks.
9. Overall, our systems development methodology helps us to manage our systems development projects.

Items used to measure perceived methodology support provided as cognitive and co-operative technology (each item rated from 1 = totally disagree to 5 = totally agree)

1. Our systems development methodology defines our desired systems development practice.
2. Our systems development methodology describes a sound way of developing systems.
3. Our systems development methodology forms a useful standard for our systems development.
4. Our systems development methodology reminds me about activities/tasks of systems development.
5. Our systems development methodology provides a useful list of possible systems development activities.
6. Our systems development methodology provides useful guidelines for conducting systems development.
7. Our systems development methodology provides a useful toolbox of techniques to be applied.
8. Our systems development methodology defines an ideal process of systems development that is useful, even though it is not followed in practice.
9. Without a systems development methodology one cannot estimate how systems development should be conducted.
10. Our systems development methodology allows us to learn from our systems development experience.
11. Without a systems development methodology it is impossible to evaluate our systems development practice.


Appendix A (Continued)

Items used to measure perceived methodology impact on the quality of the developed system (each item rated from 1 = totally disagree to 5 = totally agree)

1. Our systems development methodology helps to develop more functional systems.
2. Our systems development methodology helps to develop more reliable systems.
3. Our systems development methodology helps to develop more maintainable systems.
4. Our systems development methodology helps to develop more portable systems.
5. Our systems development methodology helps to develop more efficient systems.
6. Our systems development methodology helps to develop more usable systems.
7. Overall, our systems development methodology helps to develop better systems.
8. Overall, our systems development methodology helps to make users more satisfied with our systems.

Items used to measure perceived methodology impact on the quality and the productivity of the development process (each item rated from 1 = totally disagree to 5 = totally agree)

1. Our systems development methodology helps to develop new applications faster.
2. Our systems development methodology helps to improve the functionality of new applications.
3. Our systems development methodology helps to increase the productivity of the application developers.
4. Our systems development methodology helps to decrease the cost of systems development.
5. Our systems development methodology helps to improve the quality of the systems.
6. Our systems development methodology helps to decrease the cost of systems maintenance.
7. Our systems development methodology helps to improve the documentation of the systems.
8. Our systems development methodology improves the morale in my IS department.
9. Our systems development methodology helps to achieve the goals of my IS department.
10. Our systems development methodology helps to improve my IS department's reputation of excellent work.

Items used to measure perceived performance of the IS department (each item rated from 1 = totally disagree to 5 = totally agree)

1. Speed of developing new applications is high.
2. Functionality of new applications is high.
3. Productivity of the application developers is high.
4. Cost of systems development is high.
5. Quality of systems development products is high.
6. Cost of systems maintenance is high.
7. Documentation of systems development products is of high quality.
8. The morale in the IS department I work in, is high.
9. The IS department I work in, achieves its goals well.
10. The IS department I work in, has a reputation of excellent work.
11. There is high agreement on my IS department's problems.


Appendix A (Continued)

List of standard (commercial) systems development methods used in the IS department (use of each method rated from 1 = nominally to 5 = intensively)

Structured Analysis, Design and Implementation of Information Systems (STRADIS)
Yourdon Systems Method (YSM)
Information Engineering (IE)
Structured Systems Analysis and Design Method (SSADM)
Structured Analysis and Structured Design (SASD)
Structured Systems Analysis and Design (SSAD)
Systems Requirements Engineering Methodology (SREM)
Merise
Jackson Systems Development (JSD)
Information Systems work and Analysis of Change (ISAC)
Effective Technical and Human Implementation of Computer-based Systems (ETHICS)
Soft Systems Methodology (SSM)
Multiview
Process Innovation (business process re-engineering)
Rapid Application Development (RAD)
KADS
Euromethod
Navigator
Method/1
Object Modeling Technique by Rumbaugh (OMT)
Object Oriented Analysis and Design by Coad and Yourdon (OOA&D)
Object Oriented Software Engineering by Jacobson (OOSE)
Object Oriented Analysis by Shlaer and Mellor (OOA)
Object-oriented Systems Analysis (OSA)
Fusion
MOSES
Unified Modeling Language (UML)
Objectory
Booch
Other, please specify

In-house systems development methods used in the IS department (three open entries, each rated from 1 = nominally to 5 = intensively)


Appendix B

Factor loadings on the items that measured the perceived methodology support provided as production technology

Item number   Developer data   Manager data
              F0               F1      F2      F3
1             0.81             0.73    0.28    0.29
2             0.80             0.71    0.40    0.19
3             0.80             0.25    0.80    0.22
4             0.81             0.51    0.70    0.01
5             0.78             0.41    0.62    0.30
6             0.75             0.23    0.34    0.85
7             0.79             0.32    0.18    0.83
8             0.71             0.02    0.78    0.24
9             0.76             0.80    0.41    0.01
10            0.82             0.81    0.05    0.38
11            0.81             0.85    0.05    0.34
Expl. var.    6.79             3.69    2.69    2.00
Prp. total    0.62             0.34    0.24    0.18

Final factor structure: perceived support provided as production technology

Support as production technology (F0): all items.
Support for organisational alignment (F1): align system to be developed with the business; capture requirements for system to be developed; involve end-users in SD projects; build management commitment in SD projects; helps to get systems accepted.
Support for technical design (F2): helps to design architecture of system to be developed; helps in system design; helps in implementing developed systems; helps to reuse earlier requirements, design and code.
Support for verification and validation (F3): helps in reviewing developed systems; helps in testing developed systems.

Factor loadings on the items that measured the perceived methodology support provided as cognitive and co-operative technology

Item number   Developer data    Manager data
              F1      F2        F1      F2
1             0.76    0.27      0.78    0.02
2             0.84    0.24      0.90    0.14
3             0.85    0.15      0.80    0.24
4             0.82    0.05      0.75    0.24
5             0.82    0.10      0.83    0.18
6             0.85    0.13      0.91    0.20
7             0.68    0.21      0.81    0.13
8             0.52    0.45      0.39    0.27
9             0.16    0.88      0.23    0.93
10            0.68    0.35      0.64    0.31
11            0.12    0.88      0.11    0.95
Expl. var.    5.29    2.11      5.41    2.17
Prp. total    0.48    0.19      0.49    0.20


Appendix B (Continued)

Final factor structure: perceived support provided as cognitive and co-operative technology

Support for the common conception of SD practice (F1): defines our desired SD practice; describes a sound way of developing systems; forms a useful standard for our SD; reminds me about activities/tasks of SD; provides a useful list of possible SD activities; provides useful guidelines for conducting SD; provides a useful toolbox of techniques to be applied; defines an ideal process of SD that is useful, even though not followed in practice; allows us to learn from our SD experience.
Support for the evaluation of SD practice (F2): without an SDM one cannot estimate how SD should be conducted; without an SDM it is impossible to evaluate our SD practice.

Factor loadings on the items that measured the perceived methodology impact on the quality and productivity of the development process

Item number   Developer data   Manager data
              F0               F1      F2
1             0.78             0.78    0.31
2             0.84             0.84    0.28
3             0.85             0.85    0.28
4             0.76             0.77    0.21
5             0.87             0.36    0.79
6             0.80             0.43    0.66
7             0.63             0.14    0.62
8             0.78             0.70    0.47
9             0.84             0.32    0.80
10            0.85             0.27    0.88
Expl. var.    6.42             3.63    3.38
Prp. total    0.64             0.36    0.34

Final factor structure: perceived methodology impact on the quality and productivity of the development process

Methodology impact on the quality and productivity of development process (F0): all items.
Productivity effects and morale (F1): helps to develop new applications faster; helps to improve the functionality of new applications; helps to increase the productivity of application developers; helps to decrease the cost of SD; improves the morale in our IS department.
Quality effects, goal achievement and reputation (F2): helps to improve the quality of systems; helps to decrease the cost of systems maintenance; helps to improve the documentation of the system; helps to achieve the goals of our IS department; helps to improve our IS department's reputation of excellent work.


Appendix B (Continued)

Factor loadings on the items that measured the performance of an IS department

Item number   Merged data            Developer data         Manager data
              F1     F2     F3       F1     F2     F3       F1     F2     F3
1             0.78   0.06   0.18     0.77   0.05   0.19     0.84   0.12   0.12
2             0.81   0.13   0.08     0.80   0.20   0.10     0.81   0.08   0.10
3             0.75   0.08   0.26     0.74   0.06   0.26     0.77   0.22   0.26
4             0.05   0.87   0.10     0.08   0.87   0.07     0.02   0.82   0.19
5             0.69   0.06   0.26     0.74   0.05   0.22     0.31   0.01   0.68
6             0.11   0.83   0.11     0.09   0.84   0.08     0.05   0.79   0.03
7             0.19   0.35   0.59     0.17   0.26   0.62     0.06   0.32   0.76
8             0.11   0.12   0.79     0.11   0.12   0.78     0.07   0.46   0.63
9             0.31   0.20   0.78     0.30   0.19   0.79     0.37   0.51   0.54
10            0.25   0.18   0.74     0.25   0.15   0.76     0.35   0.55   0.35
Expl. var.    2.52   1.68   2.33     2.54   1.67   2.37     2.32   2.24   1.98
Prp. total    0.25   0.17   0.23     0.25   0.17   0.24     0.23   0.22   0.20

Final factor structure: perceived performance of IS department

Productivity and quality (F1): speed of developing new applications is high; functionality of new applications is high; productivity of application developers is high; quality of SD products is high.
Cost of development and maintenance (F2): cost of systems development is high; cost of systems maintenance is high.
Organisational health (F3): the morale in the IS department is high; the IS department achieves its goals well; the IS department has a reputation of excellent work.

References [1] ABSA Group Economic Research, Sectoral Prospects for the South African Economy 2000–2005, 2000, http://www.absa.co.za/ABSA/Content/PubRef_Content/Articles_Files/PubRefArtFile383_0.pdf. [2] D.E. Avison, G. Fitzgerald, Information Systems Development: Methodologies, Techniques and Tools, McGraw-Hill Publishing Company, Berkshire, England, 1995. [3] J.P. Bansler, E. Havn, The nature of software work: systems development as a labor process, in: P. van den Besselaar, A. Clement, P. Ja¨rvinen (Eds.), Information System, Work and Organization Design, Elsevier Science Publishers, North-Holland, 1991, pp. 145–153. [4] F. Bartolome, A. Laurent, Managers: torn between two roles, Personnel Journal (67) 1988, pp. 72–81. [5] R. Baskerville, J. Travis, D. Truex, systems without method: the impact of new technologies on information systems development projects, in: K.E. Kendall, K. Lyytinen, J.I. DeGross (Eds.), The Impact of Computer Supported Technologies on Information Systems Development, Elsevier Science Publishers B.V., North Holland, 1992, pp. 241–269. [6] W.E. Bijker, The social construction of bakelite: toward a theory of invention, in: W.E. Bijker, T.P. Hughes, T.J. Pich (Eds.), The Social Construction of Technological Systems, The MIT Press, Cambridge, MA, 1989, pp. 159–187.

[7] S. Brinkkemper, Method engineering: engineering of information systems development methods and tools, Information and Software Technology (38) 1996, pp. 275–280. [8] D.L. Burkhard. Examination of factors contributing to the success of the implementation of computer-aided software engineering technology, in: H.R. Sprague (Ed.), Proceedings of the 23 Hawaii International Conference on System Sciences, vol. 4, 1990. [9] P.D. Chatzoglou, L.A. Macaullay, Requirements capture and IS methodologies, Information Systems Journal (6) 1996, pp. 209–225. [10] S. Conger, The New Software Engineering, Wadsworth Publishing Company, Belmont, CA, 1994. [11] D.T. Connors, Software development methodologies and traditional and modern information systems, Software Engineering Notes 17(2), 1992, pp. 43–49. [12] G.F. Corbitt, R.J. Norman, CASE implementation: accounting for the human element, Information and Software Technology 33(9), 1991, pp. 637–640. [13] M.K. Daskalantonakis, Achieving higher SEI levels, IEEE Software 11(4), 1994, pp. 17–24. [14] E.W. Deming, Out of the crisis, Massachusetts Institute of Technology Center for Advanced Engineering Study, Cambridge, MA, 1986. [15] R.G. Fichman, C.F. Kemerer, The illusory diffusion of innovation: an examination of assimilation gaps, Information Systems Research 10(3), 1999, pp. 255–275.


M. Huisman, J. Iivari / Information & Management 43 (2006) 29–49

[16] P.N. Finlay, A.C. Mitchell, Perceptions of the benefits from the introduction of CASE: an empirical study, MIS Quarterly 18(4), 1994, pp. 353–370.
[17] B. Fitzgerald, Formalized systems development methodologies: a critical perspective, Information Systems Journal (6) 1996, pp. 3–23.
[18] B. Fitzgerald, An empirical investigation into the adoption of systems development methodologies, Information & Management (34) 1998, pp. 317–328.
[19] A.L. Friedman, D.S. Cornford, Computer Systems Development: History, Organization and Implementation, John Wiley & Sons, Chichester, England, 1989.
[20] A.C. Gillies, Humanization of the software factory, Information and Software Technology 33(9), 1991, pp. 641–646.
[21] R.L. Glass, Software Creativity, Prentice Hall, Englewood Cliffs, 1995.
[22] G.C. Green, A.R. Hevner, Perceived Control of Software Developers and Its Impact on the Successful Diffusion of Information Technology, CMU/SEI-98-SR-013, Carnegie Mellon Software Engineering Institute, Pittsburgh, PA, 1999, http://www.sei.cmu.edu/pub/documents/98.reports/pdf/98sr013.pdf.
[23] J.M. Greenbaum, In the Name of Efficiency: Management Theory and Shopfloor Practice in Data Processing Work, Temple University Press, Philadelphia, 1979.
[24] J.F. Hair, R.E. Anderson, R.L. Tatham, W.C. Black, Multivariate Data Analysis with Readings, Macmillan, New York, 1992.
[25] T.J. Haley, Software process improvement at Raytheon, IEEE Software 13(6), 1996, pp. 33–41.
[26] C.J. Hardy, J.B. Thompson, H.M. Edwards, The use, limitations and customization of structured systems development methods in the United Kingdom, Information and Software Technology 37(9), 1995, pp. 467–477.
[27] G.T. Heineman, J.E. Botsford, G. Caldiera, G.E. Kaiser, M.I. Kellner, N.H. Madhavji, Emerging technologies that support a software process life cycle, IBM Systems Journal 33(3), 1994, pp. 501–529.
[28] J.C. Henderson, J.G.
Cooprider, Dimensions of I/S planning and design aids: a functional model of CASE technology, Information Systems Research 1(3), 1990, pp. 227–254.
[29] M. Heym, H. Österle, A reference model for information systems development, in: K.E. Kendall, K. Lyytinen, J.I. DeGross (Eds.), The Impact of Computer Supported Technologies on Information Systems Development, Elsevier Science Publishers B.V., North Holland, 1992, pp. 215–239.
[30] R. Hirschheim, H.K. Klein, K. Lyytinen, Exploring the intellectual foundations of information systems, Accounting, Management and Information Technologies 6(1–2), 1996, pp. 1–64.
[31] H.M. Huisman, J. Iivari, Systems development methodology use in South Africa, in: D. Galletta, J. Ross (Eds.), Proceedings of the 9th Americas Conference on Information Systems, 4–6 August 2003, Tampa, Florida, USA, Association for Information Systems, http://aisel.isworld.org, 2003, pp. 1040–1052.
[32] M. Huysman, Rethinking organizational learning: analyzing learning processes of information system designers, Accounting, Management and Information Technology (10) 2000, pp. 81–99.
[33] W.S. Humphrey, Managing the Software Process, Addison-Wesley, Reading, MA, 1989.
[34] W.S. Humphrey, T.R. Snyder, R.R. Willis, Software process improvement at Hughes Aircraft, IEEE Software 8(4), 1991, pp. 11–23.
[35] J. Iivari, Why are CASE tools not used? Communications of the ACM 39(10), 1996, pp. 94–103.
[36] J. Iivari, J. Maansaari, The usage of systems development methods: are we stuck to old practices? Information and Software Technology (40) 1998, pp. 501–510.
[37] J. Iivari, R. Hirschheim, H.K. Klein, A paradigmatic analysis contrasting information systems development approaches and methodologies, Information Systems Research 9(2), 1998, pp. 164–193.
[38] J. Iivari, R. Hirschheim, H.K. Klein, A dynamic framework for classifying information systems development methodologies and approaches, Journal of Management Information Systems 17(3), 2000–2001, pp. 179–218.
[39] ISO, Information technology—Software product evaluation—Quality characteristics and guidelines for their use, ISO/IEC DIS 9126, International Organization for Standardization, 1990.
[40] L.D. Introna, E.A. Whitley, Against method-ism: exploring the limits of method, Information Technology & People 10(1), 1997, pp. 31–45.
[41] N. Jayaratna, Understanding and Evaluating Methodologies, NIMSAD: A Systemic Framework, McGraw-Hill, London, 1994.
[42] J.J. Jiang, G. Klein, J. Balloun, Perceptions of system development failures, Information and Software Technology (39) 1998, pp. 933–937.
[43] J.J. Jiang, M.G. Sobol, G. Klein, Performance ratings and importance of performance measures for IS staff: the different perceptions of IS users and IS staff, IEEE Transactions on Engineering Management 47(4), 2000, pp. 424–434.
[44] K. Klenke, Leadership processes in computer mediated work groups: implications of information systems development and leadership studies, in: K.E. Kendall et al. (Eds.), The Impact of Computer Supported Technologies on Information Systems Development, Elsevier Science Publishers B.V., North-Holland, 1992, pp. 115–132.
[45] P. Kraft, Programmers and Managers, Springer, New York, 1977.
[46] A.L. Lederer, J. Prasad, Perceptual congruence and systems development cost estimation, Information Resources Management Journal 8(4), 1995, pp. 16–27.
[47] M.L. Markus, N. Bjørn-Andersen, Power over users: its exercise by system professionals, Communications of the ACM 30(6), 1987, pp. 498–504.
[48] I.R. McChesney, D. Glass, Post-implementation management of CASE methodology, European Journal of Information Systems 2(3), 1993, pp. 201–209.
[49] J. Nandhakumar, D.E. Avison, The fiction of methodological development: a field study of information systems development, Information Technology & People 12(2), 1999, pp. 176–191.

[50] R.R. Nelson, Educational needs as perceived by IS and end-user personnel: a survey of knowledge and skill requirements, MIS Quarterly 15(4), 1991, pp. 503–525.
[51] F. Niederman, J.C. Brancheau, J.C. Wetherbe, Information systems management issues for the 1990s, MIS Quarterly 15(4), 1991, pp. 475–500.
[52] J.C. Nunnally, Psychometric Theory, McGraw-Hill, New York, 1978.
[53] B. Nuseibeh, A. Finkelstein, J. Kramer, Method engineering for multi-perspective software development, Information and Software Technology (38) 1996, pp. 267–274.
[54] C.A. O'Reilly, G.N. Parlette, J.R. Bloom, Perceptual measures of task characteristics: the biasing effects of differing frames of reference and job attitudes, Academy of Management Journal (23) 1980, pp. 118–131.
[55] W.J. Orlikowski, D.C. Gash, Technological frames: making sense of information technology in organizations, ACM Transactions on Information Systems 12(2), 1994, pp. 174–207.
[56] P. Palvia, J.T. Nosek, A field examination of system life cycle techniques and methodologies, Information & Management (25) 1993, pp. 73–84.
[57] A. Rai, G.S. Howard, An organizational context for CASE innovation, Information Resources Management Journal 6(3), 1993, pp. 21–34.
[58] T. Ravichandran, A. Rai, Quality management in systems development: an organizational system perspective, MIS Quarterly 24(3), 2000, pp. 381–414.
[59] C.K. Riemenschneider, B.C. Hardgrave, F.D. Davis, Explaining software developer acceptance of methodologies: a comparison of five theoretical models, IEEE Transactions on Software Engineering 28(12), 2002, pp. 1135–1145.
[60] T.L. Roberts, M.L. Gibson, K.T. Fields, R.K. Rainer, Factors that impact implementing a systems development methodology, IEEE Transactions on Software Engineering 24(8), 1998, pp. 640–649.
[61] E.M. Rogers, Diffusion of Innovations, fourth ed., The Free Press, New York, 1995.
[62] N.L. Russo, R. Hightower, J.M. Pearson, The failure of methodologies to meet the needs of current development environments, in: N. Jayaratna, B. Fitzgerald (Eds.), Lessons Learned from the Use of Methodologies: Fourth Conference on Information Systems Methodologies, 1996, pp. 387–394.
[63] T. Saarinen, M. Sääksjärvi, Process and product success in information systems development, Journal of Strategic Information Systems 1(5), 1992, pp. 266–277.
[64] M. Saeki, A meta-model for method integration, Information and Software Technology (39) 1998, pp. 925–932.
[65] S.R. Schach, Software Engineering with Java, Richard D. Irwin, USA, 1997.
[66] M.E. Schnake, M.P. Dumler, D.S. Cochran, T.R. Barnett, Effects of differences in superior and subordinate perceptions of superiors' communication practices, Journal of Business Communication 27(1), 1990, pp. 37–50.


[67] V. Sethi, W.R. King, Construct measurement in information systems research: an illustration in strategic systems, Decision Sciences 22(3), 1991, pp. 455–472.
[68] E.B. Swanson, Information systems innovation among organizations, Management Science 40(9), 1994, pp. 1069–1092.
[69] T.S.H. Teo, W.R. King, An assessment of perceptual differences between informants in information systems research, Omega 25(5), 1997, pp. 557–566.
[70] R.S. Tripp, Managing the political and cultural aspects of large-scale MIS projects: case study of participative systems development, Information Resources Management Journal 4(4), 1991, pp. 2–13.
[71] D.B. Turban, A.P. Jones, Supervisor–subordinate similarity: types, effects and mechanisms, Journal of Applied Psychology 73(2), 1988, pp. 228–234.
[72] R. Urwiler, N.K. Ramarapu, R.B. Wilkes, M.N. Frolick, Computer-aided software engineering: the determinants of an effective implementation strategy, Information & Management (29) 1995, pp. 215–225.
[73] J.M. Verner, N. Cerpa, Prototyping: does your view of its advantages depend on your job? Journal of Systems and Software (36) 1997, pp. 3–16.
[74] I. Vessey, S.L. Jarvenpaa, N. Tractinsky, Evaluation of vendor products: CASE tools as methodology companions, Communications of the ACM 35(4), 1992, pp. 90–105.
[75] L.A. von Hellens, Information systems quality versus software quality: a discussion from a managerial, an organizational and an engineering viewpoint, Information and Software Technology (39) 1997, pp. 801–808.
[76] A. von Meier, Occupational cultures as a challenge to technological innovation, IEEE Transactions on Engineering Management 46(1), 1999, pp. 101–114.
[77] D.G. Wastell, The fetish of technique: methodology as a social defence, Information Systems Journal 6(1), 1996, pp. 25–40.
[78] C. Westrup, Information systems methodologies in use, Journal of Information Technology (8) 1993, pp. 267–275.
[79] K.N. Wexley, E.D. Pulakos, The effects of perceptual congruence and sex on subordinates' performance appraisals of their managers, Academy of Management Journal (26) 1983, pp. 666–676.
[80] J.L. Wynekoop, J.A. Senn, S.A. Conger, The implementation of CASE tools: an innovation diffusion approach, in: K.E. Kendall et al. (Eds.), The Impact of Computer Supported Technologies on Information Systems Development, Elsevier Science Publishers, North Holland, 1992, pp. 25–41.
[81] J.L. Wynekoop, System development methodologies: unanswered questions and the research-practice gap, in: J.I. DeGross et al. (Eds.), Proceedings of the Fourteenth International Conference on Information Systems, Orlando, 1993, pp. 181–190.
[82] J.L. Wynekoop, N.L. Russo, Studying system development methodologies: an examination of research methods, Information Systems Journal (7) 1997, pp. 47–65.
[83] R.E. Yellen, What do users really think about CASE? Journal of Systems Management 34(2), 1992, pp. 16–39.