Assessing the IT Training and Development Climate: An Application of the Q-methodology Stephen C. Wingreen
J. Ellis Blanton
Sandra K. Newton
Thomas University 1501 Millpond Road Thomasville, GA 31792 229-226-1621, x226
University of South Florida 4202 E. Fowler Ave., CIS 1040 Tampa, FL 33620-7800 813-974-6757
University of South Florida 4202 E. Fowler Ave., CIS 1040 Tampa, FL 33620-7800 813-974-6767
[email protected]
[email protected]
[email protected]
Madeline Domino University of South Florida 4202 E. Fowler Ave., CIS 1040 Tampa, FL 33620-7800 813-974-6749
[email protected]

ABSTRACT
The Q-methodology was employed to address the managerial problem of deciding which of the firm's personnel development resources should be aimed at which personnel, or groups of personnel, and through which of the development venues available to the firm for the development of its IT workforce. The procedure identified six interpretable groups of IT professionals that seem to be associated with the development priorities of the IT management, analytical, development & programming, and operations functions. Opportunities are identified for both future research and the practical application of the Q-methodology, its associated instrumentation, and its analytical procedure as a managerial decision tool in the context of internally developing the firm's IT personnel to meet the firm's goals.

Categories and Subject Descriptors
K.7.3 [The Computing Profession]: Testing, Certification, and Licensing

General Terms
Management, Measurement, Human Factors.

Keywords
Q-methodology, q-sort, IT professionals, training assessment, professional development, decision support

1. INTRODUCTION
Whenever information technology (IT) managers are faced with goals that may not be met by the firm's current workforce of IT personnel, the problem arises of how to bridge the gap between the requirements of the goal and the ability of the firm's IT personnel to accomplish it [37, 27, 33]. The manager must decide between the options of internally developing the firm's existing IT personnel, hiring new IT personnel to bridge the gap, outsourcing the deficiency to a third party, or some combination of the three [42, 16, 22, 2]. In cases where the gap is narrow and time is of the essence, internal development of existing IT personnel often makes the most sense [12].

Once the decision is made to develop the firm's own IT personnel, managers are faced with the additional task of framing the problem of where (which personnel) to aim the firm's development resources, which elements of the firm's IT skill portfolio to develop, and how to accomplish the development through the many venues of development that are available (workshops, hands-on training, distance learning, etc.). Although there has been research that addresses these problems [42, 22], and support products have been developed [28], they have focused primarily on the characteristics common to the individuals involved rather than the commonality between individuals in any given situation.

For instance, research has addressed groups of common factors that affect skill development among IT professionals [25, 30, 9, 35], but not whether there are groups of IT professionals that share development factors in common. This distinction is of primary practical importance because most managers are far more interested in managing groups of people rather than groups of individual factors about people [12]. Researchers should also be interested in this distinction because little is known about whether there are groups of IT professionals that are naturally clustered according to their professional development priorities, and if there are groups, then what priorities distinguish them from one another.
The deficiency in the existing research may be due in part to the unavailability of methods to factor or cluster people into natural groups according to their own subjective relationship to a phenomenon. The statistical methods that have been widely employed thus far are known as "r-factorial" methods for their assumption that groups of variables may be correlated to form factors common to a population [15, 44]. However, there are also "q-factorial" methods that are available, but underutilized in the existing research, that assume people may be correlated to form factors common to a group of [uncorrelated] variables [14]. Q-factorial methods are capable of providing the necessary information to assist managers with the task of framing the problem of where and how to allocate training and development resources. The method itself is known as the "Q-methodology", its prescribed instrumentation is known as a "q-sort", and its analytical procedure is a form of centroid extraction known as "q-factor analysis" [43, 6, 14].

It is the goal of this paper to apply the Q-methodology to develop a tool for the assessment of the IT training and development climate, to identify and interpret natural groups of IT professionals according to their own professional development priorities to be used as a baseline for managerial decision making, and to make recommendations for its use as a managerial decision tool. The paper is organized as follows: a review of the existing IT and Q-methodology research is followed by a section describing how the Q-methodology was employed in this study and by an analysis of a sample of IT professionals, with the goal of discovering and interpreting natural groups of IT professionals according to their professional development priorities to be used as a baseline for subsequent managerial decision making. The analysis is followed by a discussion of the implications and recommendations both for IT skill portfolio research and for how the results may be applied as a managerial decision tool aimed at the problem of bridging the gap between the firm's goals and the ability of its current portfolio of IT personnel to meet those goals by internally developing the firm's current IT personnel.

2. LITERATURE REVIEW
In general, both the Q-methodology and assessment of the firm's IT development climate have been underutilized in IS research. This section reviews the existing literature on the assessment of IT training and development needs, provides a definition of the IT training and development climate, and discusses the Q-methodology with an eye towards its use in IS research.

2.1 Assessment of IT Development and Training Needs
IT personnel development is the subject of a growing stream of IS research that is being fed both by researchers' growing interest in how IT professionals obtain and maintain professional skills and competencies and by practitioners' interest in the management and development of the firm's internal IT skill portfolio. A recent ITAA survey of 500 hiring managers reports that there is both an increasing value placed on highly skilled IT professionals and an awareness among IT managers of the strategic significance of the firm's IT skill portfolio [20].

Previous work on the assessment problem has produced some useful tools for the assessment of the organizational technical updating climate [11, 25], frameworks for assessment [36, 29], and inventories of skills necessary for regional economic development [3]. Practitioners have focused on the management of employee skill inventories and their coordination with training and development programs [28, 27, 21, 29, 48].

2.2 The IT Development Decision
The decision to develop the IT workforce internally has taken on greater significance in light of recent developments in the labor market. Managers must proactively manage the firm's IT skill resource through the creation of formal skill inventories aimed at allowing the firm to be more agile in a highly volatile and competitive market [39, 28, 21, 29]. Although the staffing decision has addressed primarily the long-term interests of the firm [1, 2, 8, 4, 39, 21], some recent work has focused on the short-term needs of individual projects [42, 18] and the selection and coordination of individual employees [22, 17].

The literature has generally focused on the goals of increasing retention [1, 28], building competitive advantage [4, 33, 41], improving operational performance [22, 18, 40], or coping with scarce IT labor resources [2, 42]. Previous work has also addressed the impact of the manager's personnel selection decision on employee task assignments, and hence, on employee training and development opportunities [21]. However, very little, if any, research has addressed how to directly target and prioritize individual employees based on specific skill development and training needs, allocate specific development and training resources, and target specific venues for IT personnel development, or provide ample support for the decision to do so.

2.3 Q-methodology in the IS literature
The Q-methodology has been successfully employed in previous IS research, although seldom, and never as an assessment tool to assist decision makers with IT training and the development of the firm's internal IT workforce.

A recent tutorial on the use of q-sorting [45] used a q-sort of MIS faculty beliefs about how to train MIS Ph.D. students as an example for future IS research. The study employed methods developed in previous research on the Q-methodology itself [43, 6]. IS researchers have used the Q-methodology to establish the existence of organizational subcultures and their relationship to DSS user satisfaction [24], the competencies of software engineers [47], and metaphors in the language of IS that may be used to increase the effectiveness of systems development [23]. The Q-methodology has also been employed to investigate managerial decision making about the deployment of technology throughout the firm [46] and the relative importance of IT management issues [13, 34], and to compare the work climate of an IS organization to those in other industries [37].
2.4 Advantages of the Q-Methodology Over Other Assessment Methods
The primary advantage of the Q-methodology over other methods of assessment is the preservation of operant subjectivity through the requirement that respondents consider statements representing an entire domain before sorting the statements relative to each other [43]. This approach is in contrast to r-factorial methods, such as profile analysis, gap analysis, or skill inventories, which typically assume that respondents consider measurement items independently of one another. For this reason, the Q-methodology is preferred when the goal of research is the comparison of "apples and oranges", so to speak, in that the Q-methodology captures and preserves the respondents' relative preferences for "apples and oranges".

Another significant advantage of the Q-methodology is that it places minimal demands on sample size. This latter advantage is due partly to the statistical advantages of requiring respondents to sort q-statements into a normal distribution, and partly to the mathematical procedure employed by q-factor analysis [6].

2.5 Definition of the IT Training and Development Climate
The IT Training and Development Climate is defined by this study to be the shared perceptions of groups of individuals about IT training and development opportunities and resources that are available through the organization, which is consistent with prior definitions of organizational "climate" [35, 10]. The IT Training and Development Climate includes the general concepts of the Technology Updating Climate [25], such as supervisor support for training, information sharing between colleagues, and organizational support for training and development. It differs, however, by its inclusion of specific areas of training content, resources, and venues, such as "development tools/ applications", "adequate funding for training", or "distance-based/ Internet training", to name a few. The specific content, resources, and venues are the figurative "apples and oranges" of the IT Training and Development Climate that warrant the use of the Q-methodology for methodological support.

3. RESEARCH METHOD
The Q-methodology was employed in accordance with the goal of this study to discover whether there are naturally occurring groups of IT professionals who cluster around training and development structures in the organizational climate. Utilizing the Q-methodology entails the adoption of its guiding philosophy of preserving operant subjectivity, the guidelines for instrument development and measurement using the q-sort, and a specialized centroid factor extraction technique known as "q-factor analysis" [14, 6].
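To make the distinction between r-factorial and q-factorial analysis concrete, the following sketch (Python, with randomly generated q-sorts standing in for administered responses) correlates persons rather than statements; a principal-components extraction is used here purely as a stand-in for the centroid extraction performed by PQMethod.

```python
import numpy as np

# Each respondent's q-sort places the 27 statements into the forced distribution
# -3 (x2), -2 (x3), -1 (x5), 0 (x7), +1 (x5), +2 (x3), +3 (x2).
template = np.repeat([-3, -2, -1, 0, 1, 2, 3], [2, 3, 5, 7, 5, 3, 2])

rng = np.random.default_rng(42)
n_respondents = 30
# Rows = statements, columns = persons (toy data in place of administered q-sorts).
X = np.column_stack([rng.permutation(template) for _ in range(n_respondents)])

# An r-factorial analysis would correlate the 27 statements (rows); a q-factor
# analysis instead correlates the 30 persons (columns).
person_corr = np.corrcoef(X, rowvar=False)            # 30 x 30 person matrix

# Principal components of the person correlation matrix stand in for centroid
# extraction; each retained component is a candidate "person factor".
eigenvalues, eigenvectors = np.linalg.eigh(person_corr)
order = np.argsort(eigenvalues)[::-1]
explained = eigenvalues[order] / eigenvalues[order].sum()
print("Variance explained by the first six person-factors:", np.round(explained[:6], 3))
```

The columns of the resulting eigenvector matrix play the role of the person-factors that are rotated and interpreted in the analysis section.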
The preservation of operant subjectivity may best be described as the principle of allowing the subjects to speak for themselves in their own voice [43, 6]. The distinction is sometimes made between nomothetic and idiographic approaches to research design. Most empirical research is nomothetic in that it requires the a priori acceptance of a set of conditions that underlie the research, such as a set of variables and their inter-relationships or assumptions about what is to be expected through the course of the research. Idiographic research, on the other hand, preserves operant subjectivity because it begins with no a priori assumptions or expectations and proceeds by allowing the subjects to arrange their own version of a phenomenon, thus allowing them to "tell their own story", as it were [5]. In other words, idiographic research examines established groups of subjects through the use of a representative sample of statements about a phenomenon of interest with the goal of making generalizations about the phenomenon.

3.1 Pilot Study and Instrument Development
Pilot testing was accomplished in three phases: 1) the generation of a set of q-statements by an expert panel, 2) the refinement of the q-statement set through two separate cycles of focus group meetings, and 3) the collection of a small sample (n < 30) of data accompanied by a preliminary analysis. The recommended procedure for the development of a set of q-statements was followed [45]. Specifically, any given set of q-statements is assumed to be a representative sample of its target domain which, in this case, is the IT Training and Development Climate as described in section 2.5 of this paper. It should be noted that the set of q-statements is assumed to be representative of the construct domain rather than comprehensive in its representation of the construct, as is typical for instrumentation designed for r-factorial methods.

Following the recommendations of prior research [45], the statements were generated by a combination of expert input, literature in the domain of interest, and respondent feedback. In phase one, an expert panel generated an initial set of q-sort statements both through discussion and by culling the literature for training and development climate factors [25, 26, 38], specific content areas [19, 20, 3], and venues for training and development [26, 38, 20]. In phase two, the q-sort set was discussed in two cycles of focus group meetings that provided additional q-sort statements and feedback for the revision and refinement of the existing set. A small sample was drawn in phase three for the purpose of pilot testing both the administration of the q-sort set and the statistical analysis technique to be applied. The results of the pilot study indicated that the q-sort instrumentation was adequate for primary data collection.

3.2 Measurement
The measurement proceeded according to the guidelines of the q-sort methodology as developed in previous research [45, 3, 43], using the set of q-statements resulting from the pilot procedure (Appendix 1). Each respondent was asked to sort the statements into categories based on their perceived desirability. The respondents were additionally instructed, in accordance with established guidelines [6, 45], to sort "from the outside in", that is, to begin by sorting statements into the extreme categories and to work their way inwards to the less extreme categories.
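For illustration, a completed sort can be recorded for analysis as a 27-element score vector. The hypothetical sketch below (the particular placements are invented for the example; statement numbers refer to Appendix 1) encodes one respondent's placements and checks them against the forced distribution used by the instrument.

```python
from collections import Counter

# Forced distribution from the instrument (Appendix 1): sort value -> number of items.
# Respondents place 20 of the 27 statements; the 7 left unsorted default to neutral (0).
FORCED = Counter({3: 2, 2: 3, 1: 5, 0: 7, -1: 5, -2: 3, -3: 2})

def encode_qsort(placements: dict[int, int]) -> list[int]:
    """Turn {statement number: sort value} into a 27-element score vector and
    verify that the completed sort honours the forced distribution."""
    scores = [placements.get(statement, 0) for statement in range(1, 28)]
    if Counter(scores) != FORCED:
        raise ValueError("q-sort does not match the forced distribution")
    return scores

# Hypothetical respondent: statements 15 and 16 judged "very desirable",
# statements 9 and 14 "very undesirable", and so on.
example = encode_qsort({15: 3, 16: 3, 1: 2, 2: 2, 3: 2,
                        24: 1, 25: 1, 26: 1, 20: 1, 23: 1,
                        5: -1, 7: -1, 10: -1, 11: -1, 27: -1,
                        6: -2, 13: -2, 19: -2, 9: -3, 14: -3})
```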
3.3 Sample and Data Collection
The data was drawn from alumni lists from two universities in the Tampa Bay area, and from non-traditional graduate students in evening and weekend courses at three universities in the same geographic area. In all, there were 898 responses from an estimated sample frame of 6,763 potential respondents, which resulted in a 13% overall response rate. Of the 898 responses, 441 were not IT professionals, and 180 of the rest provided only partial responses, thus leaving a total of 277 respondents who completed the q-sort questionnaire. Non-response bias was checked by obtaining partial demographic information from n = 60 non-participants and comparing it to the demographic information provided by the study participants. The threat posed by non-response bias was judged in this manner to be minimal since the two groups appeared to be statistically indistinguishable. The sample was also analyzed based on respondent demographic and job characteristics and compared to previous cross-sectional studies of the IT workforce [19, 20]. Taken as a whole, the similarity of the sample drawn for this study to known characteristics of the IT workforce leads to its acceptance as a representative cross-section of the population of IT professionals.

4. ANALYSIS
The goal of the analysis is both to identify and interpret a minimum number of factors that explain a maximum of the variance among the respondents, to be used as a baseline for comparison in subsequent academic and managerial applications. Q-factor methods, like r-factor methods, are inherently indeterminate, and there are no hard rules established for q-factor methods that will help researchers determine exactly how many factors are adequate to explain the data.

Since ordinary statistical analysis packages do not include procedures for q-factoring, special software (PQMethod) was obtained to analyze the data. Although the output of the q-factor procedure shares much in common with the more familiar r-factor procedures, such as the extraction of factors and their associated variance, it also produces some information that is distinctive to q-factoring. In accordance with the goals of the analysis, the output examined by this study comprises the factors extracted and their associated variances, the report of "statements sorted by consensus versus disagreement", the "rank statement totals for each factor", which indicate the order in which statements represent a factor, and the "distinguishing statements for each factor", which offers information that may be used to judge whether a weak factor is sufficiently distinguishable from other, stronger factors, and therefore interpretable.

The factor procedure began by retaining six factors using a rule of thumb that the number of factors extracted should explain at least 50% of the sample variance. The six factors cumulatively explained 51% of the variation in the sample, with factors one through six individually explaining 15%, 12%, 9%, 6%, 5%, and 4% of the variance, respectively. By way of comparison to q-sorts in prior research, O'Reilly, Chatman, and Caldwell [32] reported 41% of the variance explained with an eight-factor solution, and 38% of the variance explained with a seven-factor solution, with the weakest factors explaining only three percent (3%) of the variance.

Although it is convenient that a six-factor solution explained more than 50% of the population variance, it also raises the question as to whether the additional complexity involved with the retention and interpretation of two additional factors is worth the 9% of the population variance cumulatively explained by those two factors. The answer to this question will be addressed by consulting the reports of "statements sorted by consensus versus disagreement" (Appendix 3) and "rank statement totals for each factor" (Appendix 2) that are provided as output of the statistical procedure used by the PQMethod software. If the output does not allow a meaningful differentiation to be made between factors five and six and the first four factors, then it will provide evidence that factors five and six are not necessary.

The report of "statements sorted by consensus versus disagreement" is valuable for its identification of the statements that differentiate the most between factors. In order, the statements most valuable for making distinctions between factors are professional development and training priorities about: programming languages, IT management skills, network administration, databases, development tools/ applications, certifications, distance-based/ Internet training, information sharing between peers, self-training/ tutorials/ manuals, formal education, and systems analysis and design. Interpretation of the factors will involve the definition of groups through both a comparison and discussion of their most extreme differences and by their respective highest training and development priorities.
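The logic of that report can be approximated by computing, for each statement, the variance of its normalized factor scores across the retained factors: low variance marks consensus statements, and high variance marks the statements that distinguish the factors. The sketch below illustrates this with three rows taken from Appendix 2; the per-statement variance shown is an approximation of, not the exact criterion behind, the PQMethod report.

```python
import numpy as np

# Normalized factor scores: one row per statement, one column per retained factor
# (three of the rows reported in Appendix 2).
factor_scores = np.array([
    [ 1.39, -2.02, -1.70,  1.77,  0.11,  0.44],   # 1. Programming languages
    [ 0.02, -0.14, -0.20,  0.36, -0.13,  0.17],   # 4. General technologies
    [ 1.96, -0.14,  1.68, -1.59,  0.84,  1.33],   # 15. IT management skills
])
statements = ["Programming languages", "General technologies", "IT management skills"]

# Statements whose scores vary little across factors are consensus statements;
# high variance marks statements that distinguish the factors (cf. Appendix 3).
spread = factor_scores.var(axis=1)
for idx in np.argsort(spread):
    print(f"{statements[idx]:<25s} variance across factors = {spread[idx]:.2f}")
```

Run on these rows, the calculation places general technologies at the consensus end and programming languages at the disagreement end, mirroring the ordering in Appendix 3.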
Factor 1 is characterized by professionals who highly prioritize training and development in IT management skills, systems analysis and design, development tools and applications, databases, and programming languages, in that order. The emphasis seems to be firmly placed on the content of training and development for those who comprise the first factor. This group may be very loosely and generally described as having the preferences of upwardly-mobile (based on their desire for IT management skills) system software professionals.

Factor 2 is characterized by professionals who prioritize distance-based learning, formal education, adequate funding for training, supervisor support for training, and peripheral maintenance and configuration, in that order. A cursory examination of the factor scores reveals that the respondents who were enrolled in non-traditional degree programs are over-represented in this group, which could explain the emphasis on distance-based learning, formal education, and financial and supervisor support for development; this is a common model for non-traditional students, who typically are reimbursed by their employers for maintaining a minimum grade point average. Those who comprise factor 2 appear to neither desire nor disdain the acquisition of IT management skills (ranked 15th out of 27). Because of their relationship to the segments of the sample drawn from currently enrolled students, factor 2 may be easily envisioned as technicians (based on their unique preference for peripheral maintenance and configuration) who are working their way through school.

Factor 3 is exemplified by professionals who prioritize training and development in network administration, the obtainment of certifications, IT management skills, information sharing between peers, and operating systems. As with those who comprise factor 1, the emphasis for those who comprise factor 3 is also firmly planted in training content, although this group finds a markedly different realm of content desirable. It is also noteworthy that those who comprise factors 1 and 3 seem to undervalue each other's priorities: factor 1 members find network administration, information sharing, and operating systems undesirable, while factor 3 members find programming languages, databases, and development tools and applications undesirable.
Factor 3 may be generally described as having the development and training preferences of upwardly-mobile network management and infrastructure development professionals.

Factor 4 is characterized by professionals who prioritize programming languages, databases, time off from work for training, development tools and applications, and adequate funding for training. Those who constitute factor 4 are seemingly unique in their disdain for IT management skills, which they rank 26th out of 27 in their sorting of the training and development statements. Except for their preferences for adequate funding and time off from work for training, factor 4 resembles factor 1, even to their rejection of the preferences of those who comprise factor 3. Factor 4 may be described as having the development and training preferences of career software developers with no management aspirations.

Factor 5 is characterized by professionals who find traditional instructor-led training, cross training with work colleagues, systems analysis and design, general user knowledge, IT management skills, and development tools and applications desirable. Although the professionals who characterize factor 5 are one of five groups to emphasize IT management skills, and one of three groups to emphasize systems analysis and design, they are unique in their preference for the venue-related priorities of instructor-led training and cross-training on the job, and for general user knowledge. Those who comprise factor 5 resemble those who comprise factor 1 except for the value they place on the highly social activities of information sharing (ranked 7th) and cross-training with work colleagues (ranked 2nd), which their counterparts in factor 1 rank 25th and 16th, respectively. Based on this, factor 5 may be very generally and loosely distinguished from their factor 1 counterparts by their social orientation to upwardly-mobile systems development training preferences.

Factor 6 is characterized by professionals who prioritize formal education, adequate funding for training, IT management skills, databases, systems analysis and design, and time off from work for training. A cursory examination also reveals that currently enrolled students are over-represented in factor 6. Factor 6 appears to differ from factor 2 by its rejection of distance-based learning (ranked 26th) in favor of formal education (ranked 1st); factor 2, on the other hand, embraces distance-based education (ranked 1st) as a priority. Factor 6 is somewhat similar to both factor 5 and factor 1, but is represented strongly by the non-traditional students in the sample, with a very strong preference for formal education. Based on this, factor 6 may be very generally characterized as having the training and development preferences of upwardly-mobile system development professionals who are working their way through college.

5. DISCUSSION AND RECOMMENDATIONS
Given that the six groups extracted by the q-factor procedure were interpretable without too much difficulty, they are judged both to be adequate and to be a reasonably defensible representation of the known IT workforce, and they are therefore retained for discussion. The results are of interest to researchers for their identification of naturally-occurring groups of IT professionals based on their own priorities about professional training and development. The result is a new perspective on IT professionals' training and development priorities that may be used effectively as a baseline for comparisons in future research. The results are of primary interest to practitioners for their preservation of operant subjectivity and the identification of groups with similar training and development preferences. This is important since, if given the choice, most managers would prefer information that allows them to manage the people themselves rather than traits or characteristics of the people. The following two decision scenarios illustrate practical applications of the results.

5.2 Decision Scenario 1: Finding the "Best Fit"
If the assumption is given that managers have knowledge of the staffing and skill requirements for any given goal or project, the use of the Q-methodology provides rich and specific information that may lead to the identification of professionals or groups of professionals that place a high priority on the same skills, training and development resources, and training venues. At this point, the decision resembles a matching problem that entails identifying the candidate whose q-sort most highly correlates with the "management profile" [37, 7], and hence provides the "best fit".

This scenario may be illustrated using data gathered from a department manager and three candidates for a project assignment at a large IT consulting firm (Appendix 4, Scenario 1). The manager has provided a q-sort of the firm's training and development priorities; the employees have each provided q-sorts of their personal training and development priorities. The correlation reported in column 2, computed using a version of the Spearman-Brown Prophecy Formula [31, 32], is the correlation of each job candidate's q-sort with the management profile. In this case, candidate 1's personal priorities are most closely aligned with the firm's priorities, as represented by the notably high 0.64 correlation.

Columns three through twelve report the q-sort statement numbers (ref. Appendix 1) of each candidate's responses for the five most important training and development priorities (columns 3 – 7) and the five least important priorities. Even a casual inspection of each candidate's q-sort reveals why candidate 1 is the best choice. Candidate 1 agrees with the management q-sort by placing high priorities on programming languages, general technologies, and data storage, and placing low priorities on adequate funding for training, time off from work for training, and supervisor support for training. Candidate 2 agrees with the management profile only in the high prioritization of programming languages and certifications. Candidate 3 agrees with the management profile only in the high prioritization of training in general technologies.

In summary, through the use of management – employee profile correlations, management may easily and effectively employ the Q-methodology to identify a candidate with the "best fit" of personal training and development priorities from a pool of job candidates.
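The matching step itself is straightforward to sketch. The snippet below (Python) ranks candidates by the Spearman rank correlation between each candidate's complete 27-statement q-sort and the manager's q-sort; it is an approximation of, rather than the exact profile-comparison computation behind, the correlations reported in Appendix 4 [31, 32].

```python
import numpy as np
from scipy.stats import spearmanr

def best_fit(manager: np.ndarray, candidates: dict[str, np.ndarray]) -> str:
    """Return the candidate whose full q-sort (27 values, -3..+3) correlates
    most highly with the management profile."""
    rho = {name: spearmanr(manager, sort)[0] for name, sort in candidates.items()}
    for name, r in sorted(rho.items(), key=lambda item: -item[1]):
        print(f"{name}: correlation with management profile = {r:.2f}")
    return max(rho, key=rho.get)
```

Applied to the q-sorts behind Appendix 4, Scenario 1, a computation of this kind singles out candidate 1, whose reported correlation with the management profile is 0.64.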
5.3 Decision Scenario 2: Training and Development Resource Allocation
Resource allocation, as is typical for highly unstructured problems, requires information to assist the decision maker with the task of framing the problem. The Q-methodology provides information that allows the identification of individual and group priorities, such as whether to aim training and development resources at off-site workshops, cross-training with work peers, distance-based or Internet training, or some combination of them all.

To illustrate this decision scenario, q-sorts were obtained from twelve employees from the IT department of a large financial firm (Appendix 4, Scenario 2). A q-factor analysis reveals that these twelve employees factor into three distinct groups based on their training and development priorities. Each factor individually explains 15% of the total response variance, for a cumulative total of 45% for all three groups, which is very good by comparative standards [32]. Factor 1 is comprised of employees 6 (Senior Programmer Analyst), 7 (Senior Programmer Analyst), 8 (Systems Analyst), and 9 (Systems Analyst). Factor 2 is comprised of employees 4 (Supervisor), 5 (Senior Programmer Analyst), 10 (Database Administrator), and to a lesser extent employee 12 (IT Support Technician). Factor 3 is comprised of employees 1 (Application Development Manager), 3 (Business Analyst), 11 (Programmer Analyst), and to a lesser extent employee 2 (Training Specialist).

The rank statement totals with normalized factor scores for each factor are also reported in Appendix 4, Scenario 2. The information in this table may be used to identify the specific training and development preferences of the three groups observed in this firm. Once the groups are identified and interpreted, the result allows management to administer training and development resources more efficiently on a group-by-group basis.

Factor 1 members in this firm are typical of the career system development professionals discussed under "Factor 4" in the analysis. They prioritize training and development in programming languages, development tools and applications, systems analysis and design, databases, time off from work for training, and adequate funding for training. They do not prioritize the development of management skills or certifications.

Factor 2 members in this firm display an affinity for the upwardly-mobile system software professionals discussed under "Factor 1" in the analysis. This particular group differs from their "Factor 1" peers in the larger IT profession in that they do not prioritize the development of management skills as strongly as do their "Factor 1" counterparts, and they place a much higher priority on information sharing between peers.

Factor 3 members in this firm are typical of the "Factor 6" IT professionals discussed in the analysis. This group of employees strongly prefers training and development in IT management skills, systems analysis and design, time off from work for training, self-training/ tutorials/ manuals, traditional instructor-led training, and formal education.

All three groups are interpretable both within the context of the firm and in the larger context of the IT profession. The resource allocation decision at this point simply follows the priorities common to each group. From the individual perspective, resources will be allocated according to each employee's own subjective preferences for training and development. From the management perspective, allocating resources on a group-by-group basis will be a more efficient means of achieving the desired objective than allocating resources on a person-by-person basis. It is a win-win scenario between the interests of management and each individual IT professional.

In summary, even when little advance knowledge is possessed about the firm's training and development priorities, the Q-methodology provides enough information for managers to frame the resource allocation problem, and from there to move to decisions about the allocation of specific resources. The rotated factor matrix that is output by a q-factor analysis supports the placement of IT professionals into naturally-occurring groups according to their own subjective training and development preferences, and the report of "rank statement totals for each factor" provides the specific information about each group's training and development priorities.
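One simple way to read that rotated factor matrix is sketched below (Python): each employee is assigned to the factor on which he or she loads most heavily, using a subset of the loadings reported in Appendix 4, Scenario 2. This is an illustrative grouping rule only, not PQMethod's own factor-flagging procedure.

```python
import numpy as np

# A subset of the rotated q-factor loadings from Appendix 4, Scenario 2.
loadings = {
    "Employee 1 (App. Dev. Manager)":       [0.2332, -0.0197, 0.9112],
    "Employee 4 (Supervisor)":              [-0.0557, 0.6015, 0.2240],
    "Employee 6 (Sr. Programmer Analyst)":  [0.7555, 0.2252, 0.2255],
    "Employee 10 (Database Administrator)": [0.2077, 0.7387, -0.0288],
    "Employee 12 (IT Support Technician)":  [0.1063, 0.3624, -0.0193],
}

# Assign each employee to the factor with the highest loading.
groups: dict[int, list[str]] = {1: [], 2: [], 3: []}
for name, row in loadings.items():
    groups[int(np.argmax(row)) + 1].append(name)

for factor, members in groups.items():
    print(f"Factor {factor}: {', '.join(members) or '(none)'}")
# Training resources can then be budgeted once per group, following each group's
# highest-ranked statements in the "rank statement totals" report.
```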
6. CONCLUSION
The Q-methodology is capable of providing rich and specific information to both researchers and practitioners. As a tool for research, the Q-methodology provides a means of discovering naturally-existing groups of IT professionals according to their own desires and preferences. The instrument developed in this study has demonstrated adequate properties to recommend its further use for the assessment of the IT development and training climate among IT professionals. In the hands of managers and decision makers, the instrument has demonstrated the capability to extract valuable information from the firm's internal climate that may be used to support decisions related to the internal development of the firm's existing IT workforce to fill the gap between organizational goals and its existing IT skill portfolio.

7. REFERENCES
[1] Agarwal, Ritu and Ferratt, Thomas (1999). Coping with labor scarcity in information technology: strategies and practices for effective recruitment and retention. Pinnaflex Educational Resources, Inc.: Cincinnati, OH.
[2] Agarwal, Ritu and Ferratt, Thomas W. (2001). Crafting an HR strategy to meet the need for IT workers. Communications of the ACM, 44(7), pp. 58 – 64.
[3] AlignMark (2000). Florida high-tech corridor competency model for the position of corridor technologist. A white paper, prepared by AlignMark; 258 Southhall Lane, Suite 400; Maitland, FL 32751.
[4] Bartlett, C. and Ghoshal, S. (2002). Building competitive advantage through people. Sloan Management Review, 43(2), pp. 34 – 41.
[5] Bem, D. and Allen, A. (1974). On predicting some of the people some of the time: the search for cross-situational consistencies in behavior. Psychological Review, 81, pp. 506 – 520.
[6] Brown, S. R. (1980). Political subjectivity: applications of Q Methodology in Political Science. New Haven, CT: Yale University Press.
[7] Chatman, Jennifer A. (1991). Matching people and organizations: selection and socialization in public accounting firms. Administrative Science Quarterly, 36(3), pp. 459 – 484.
[8] Collins, Rosann Webb and Birkin, Stanley J. (2004). Challenges of managing the global IS/IT workforce, strategies for managing IS/IT personnel. Hershey, PA: Idea Group Publishing.
[9] Colquitt, Jason A.; LePine, Jeffrey A.; and Noe, Raymond A. (2000). Toward an integrative theory of training motivation: a meta-analytic path analysis of 20 years of research. Journal of Applied Psychology, 85(5), pp. 678 – 707.
[10] Denison, Daniel R. (1996). What is the difference between organizational culture and organizational climate? A native's point of view on a decade of paradigm wars. Academy of Management Review, 21(3), pp. 619 – 654.
[11] Farr, J. L.; Enscore, J. E.; Dubin, S. S.; Cleveland, J. N. and Kozlowski, S. W. (1980). Relationships among individual motivation, work environment, and updating in engineers. Corte Madera, CA: Select Press.
[12] Frank, Frederic D. and Taylor, Craig R. (2004). Talent management: trends that will shape the future. Human Resource Planning, 27(1), pp. 33 – 40.
[13] Gottschalk, Petter (2001). Key issues in IS management in Norway: an empirical study based on Q-methodology. Information Resources Management Journal, 14(2), pp. 37 – 45.
[14] Green, P. E. (1978). Analyzing Multivariate Data. Hinsdale, IL: Dryden Press.
[15] Hair, Joseph F.; Anderson, Rolph E.; Tatham, Ronald L.; and Black, William C. (1998). Multivariate data analysis. New Jersey: Prentice Hall.
[16] Heinonen, Cheryl (2001). When hiring gurus makes more sense than fulltime employees. Manage, 52(3), pp. 20 – 21.
[17] Hopp, Wallace J. and Van Oyen, Mark P. (2004). Agile workforce evaluation: a framework for cross-training and coordination. IIE Transactions, 36(10), pp. 919 ff.
[18] Hopp, Wallace J.; Tekin, Eylem; and Van Oyen, Mark J. (2004). Benefits of skill chaining in serial production lines with cross-trained workers. Management Science, 50(1), pp. 83 – 98.
[19] ITAA (2002). Bouncing back: jobs, skills and the continuing demand for IT workers. Information Technology Association of America.
[20] ITAA (2004). Adding value… growing careers: the employment outlook in today's increasingly competitive job market. Information Technology Association of America (ITAA).
[21] Ito, Jack K. (1995). Current staff development and expectations as criteria in selection decisions. Human Resource Management International Digest, 3(1), pp. 27 – 30.
[22] Jordan, William C.; Inman, Robert R.; and Blumenfeld, Dennis E. (2004). Chained cross-training of workers for robust performance. IIE Transactions, 36(10), pp. 953 ff.
[23] Kendall, J. E. and Kendall, K. E. (1993). Metaphors and methodologies: living beyond the systems machine. MIS Quarterly, 17(2), pp. 149 – 171.
[24] Kendall, K. E.; Buffington, J. R.; and Kendall, J. E. (1987). The relationship of organizational subcultures to DSS user satisfaction. Human Systems Management, 7(1), pp. 31 – 39.
[25] Kozlowski, Steve W. and Hults, Brian M. (1987). An exploration of climates for technical updating and performance. Personnel Psychology, 40(3), Fall 1987, pp. 539 – 563.
[26] Leidner, D. E. and Jarvenpaa, S. L. (1995). The use of information technology to enhance management school education: a theoretical view. MIS Quarterly, 19(3), pp. 265 – 291.
[27] McDowell, Callie (1996). Aligning work force capabilities with business strategies. The Human Resource Professional, 9(5), pp. 3 – 5.
[28] Meade, Jim (2000). Self-assessment tool helps target training. HR Magazine, 45(5), pp. 167 – 170.
[29] Mirabile, Richard J. (1991). Pinpointing development needs: a simple approach to skills assessment. Training and Development, 45(12), pp. 19 – 23.
[30] Noe, Raymond A. and Wilk, Steffanie L. (1993). Investigation of factors that influence employees' participation in development activities. Journal of Applied Psychology, 78, pp. 291 – 302.
[31] Nunnally, Jum C. (1978). Psychometric theory. New York: McGraw-Hill Book Company.
[32] O'Reilly, Charles A.; Chatman, Jennifer; and Caldwell, David F. (1991). People and organizational culture: a profile comparison approach to assessing person-organization fit. Academy of Management Journal, 34(3), pp. 487 – 516.
[33] Pfeffer, Jeffrey (1994). Competitive advantage through people. California Management Review, 36(2), pp. 9 – 28.
[34] Pimchangthong, D.; Plaisent, M.; and Bernard, P. (2003). Key issues in information systems management: a comparative study of academics and practitioners in Thailand. Journal of Global Information Technology Management, 6(4), pp. 27 – 44.
[35] Reichers, Arnon E. and Schneider, Benjamin (1990). Climate and culture: an evolution of constructs, chapter 1 in Schneider, Benjamin, ed. Organizational Climate and Culture. New York: Jossey-Bass.
[36] Rowe, Christopher (1995). Clarifying the use of competence and competency models in recruitment, assessment and staff development. Industrial and Commercial Training, 27(11), pp. 12 – 17.
[37] Ryan, Ann Marie and Schmit, Mark J. (1996). An assessment of organizational climate and P-E fit: a tool for organizational change. International Journal of Organizational Analysis, 4(1), pp. 75 – 95.
[38] Schambach, T. and Blanton, J. E. (2002). The professional development challenge for IT professionals. Communications of the ACM, 45(4), pp. 83 – 87.
[39] Schwarzkopf, Albert B.; Mejias, Roberto J.; Jasperson, Jon; Saunders, Carol S.; and Gruenwald, Hermann (2004). Effective practices for IT skills staffing. Communications of the ACM, 47(1), pp. 83 – 88.
[40] Shaikh, Muzaffar A. (1998). A "peak shaving" approach to project staff reallocation. Computers & Industrial Engineering, 35(1,2), pp. 129 – 132.
[41] Sinclair, John and Collins, David (1991). The skills time bomb (Part 1). Leadership & Organization Development Journal, 12(1), pp. 4 – 6.
[42] Smith, Douglas; Nauss, Robert M.; Subramanian, Ashok; and Beck, Ron (2004). Decision support for staffing, outsourcing, and project scheduling in MIS strategic plans. INFOR, 42(1), pp. 79 – 100.
[43] Stephenson, W. (1953). The study of behavior: Q-technique and its methodology. Chicago: University of Chicago Press.
[44] Tabachnick, Barbara G. and Fidell, Linda S. (1989). Using multivariate statistics, second edition. New York: HarperCollins Publishers.
[45] Thomas, Dominic M. and Watson, Richard T. (2002). Q-sorting and MIS research: a primer. Communications of the AIS, 8, pp. 141 – 156.
[46] Tractinsky, N. and Jarvenpaa, S. L. (1995). Information systems design decisions in a global versus domestic context. MIS Quarterly, 19(4), pp. 28.
[47] Turley, Richard T. and Bieman, James M. (1995). Competencies of exceptional and nonexceptional software engineers. Journal of Systems Software, 28, pp. 19 – 38.
[48] Umpleby, Tom (1987). Development of IT staff. Information Age, 9(3), pp. 143 – 155.
APPENDIX 1. Q-sort Instrumentation
The questions in this section concern your beliefs about your current organization, job, and working environment. This section concerns your beliefs about IT training opportunities and resources within your current organization. For this section, you are asked to sort twenty items from the list of twenty-seven training-related items listed below according to your beliefs about the priorities of those items. The items themselves are about IT training content, training venues, and general organizational training resources. Sort the items according to your personal priorities. Sort the items "from the outside in", that is, decide on the two "very desirable" and two "very undesirable" items first, then select three items each for the "desirable" and "undesirable" categories, and five items each for "somewhat desirable" and "somewhat undesirable". The last seven items need not be sorted, and will be categorized as "neutral". Please pay attention to make sure that you enter each item only once.

Categories: Very Desirable (2 items); Desirable (3 items); Somewhat Desirable (5 items); Somewhat Undesirable (5 items); Undesirable (3 items); Very Undesirable (2 items)

1. Programming languages
2. Databases
3. Development tools/ applications
4. General technologies
5. Data storage
6. Certifications
7. General user knowledge
8. Operating systems
9. Freedom to choose own training content
10. Multimedia design
11. General hardware troubleshooting
12. Information sharing between peers
13. Network administration
14. Telecommunications
15. IT management skills
16. Systems analysis and design
17. Servers
18. Freedom to choose own training venue
19. Distance-based/ Internet training
20. Formal education
21. Self-training/ tutorials/ manuals
22. Cross training with work colleagues
23. "Traditional" instructor-led training
24. Adequate funding for training
25. Time off from work for training
26. Supervisor support for training
27. Peripherals (maintenance/ configuration)

Please double-check to make sure that you enter each item only once.
APPENDIX 2. Rank Statement Totals for Each Factor
Normalized factor scores are shown for each factor, with the statement's ranking within that factor in parentheses.

Statement                                      Factor 1      Factor 2      Factor 3      Factor 4      Factor 5      Factor 6
1. Programming languages                       1.39  (5)    -2.02 (27)    -1.70 (27)     1.77  (1)     0.11 (13)     0.44  (9)
2. Databases                                   1.41  (4)    -1.77 (26)    -0.35 (18)     1.49  (2)     0.18 (12)     1.16  (4)
3. Development tools/ applications             1.69  (3)    -1.05 (23)    -1.37 (25)     1.26  (4)     0.80  (6)     0.88  (7)
4. General technologies                        0.02 (13)    -0.14 (14)    -0.20 (15)     0.36 (11)    -0.13 (15)     0.17 (11)
5. Data storage                               -0.85 (23)    -1.22 (25)    -0.47 (19)     0.34 (13)    -0.59 (22)    -0.06 (14)
6. Certifications                              0.46  (6)     0.08 (13)     1.76  (2)     0.83  (7)    -1.50 (24)    -0.67 (22)
7. General user knowledge                     -0.70 (22)    -0.24 (16)    -0.96 (23)     0.05 (14)     0.84  (4)    -0.50 (20)
8. Operating systems                          -1.03 (24)    -1.21 (24)     1.13  (5)    -0.44 (16)    -0.18 (16)    -0.47 (19)
9. Freedom to choose own training content     -1.92 (27)    -0.70 (19)    -0.62 (21)    -1.12 (24)    -0.22 (18)     0.02 (13)
10. Multimedia design                          0.15 (11)    -0.34 (17)    -1.37 (26)    -0.57 (19)    -2.23 (27)    -0.81 (23)
11. General hardware troubleshooting          -1.74 (26)    -0.71 (20)    -0.22 (16)    -1.08 (23)    -0.29 (19)    -0.88 (24)
12. Information sharing between peers         -1.24 (25)    -0.79 (22)     1.60  (4)    -1.03 (22)     0.75  (7)    -0.15 (15)
13. Network administration                    -0.66 (21)    -0.74 (21)     2.06  (1)    -1.38 (25)    -1.56 (26)    -0.40 (17)
14. Telecommunications                        -0.50 (20)     0.22 (12)    -1.08 (24)    -1.76 (27)    -1.30 (23)    -0.51 (21)
15. IT management skills                       1.96  (1)    -0.14 (15)     1.68  (3)    -1.59 (26)     0.84  (5)     1.33  (3)
16. Systems analysis and design                1.79  (2)    -0.42 (18)    -0.19 (14)    -0.44 (17)     0.98  (3)     1.00  (5)
17. Servers                                   -0.21 (15)     0.51 (11)     0.05 (11)     0.49  (9)    -1.53 (25)     0.33 (10)
18. Freedom to choose own training venue       0.27  (9)     0.91  (7)    -0.02 (13)     0.49  (8)     0.47 (11)    -0.39 (16)
19. Distance-based/ Internet training          0.18 (10)     1.39  (1)    -0.80 (22)    -0.61 (20)    -0.21 (17)    -1.99 (26)
20. Formal education                           0.35  (7)     1.33  (2)     0.34  (7)    -0.90 (21)     0.10 (14)     1.94  (1)
21. Self-training/ tutorials/ manuals          0.01 (14)     0.72 (10)    -0.30 (17)    -0.47 (18)     0.63 (10)    -2.05 (27)
22. Cross training with work colleagues       -0.21 (16)     0.84  (9)     0.19  (9)     0.03 (15)     1.75  (2)    -0.45 (18)
23. "Traditional" instructor-led training      0.31  (8)     0.91  (8)     0.04 (12)     0.44 (10)     1.78  (1)     0.10 (12)
24. Adequate funding for training              0.04 (12)     1.27  (3)     1.06  (6)     1.26  (5)     0.67  (8)     1.50  (2)
25. Time off from work for training           -0.28 (18)     1.09  (6)     0.23  (8)     1.26  (3)    -0.36 (20)     0.94  (6)
26. Supervisor support for training           -0.24 (17)     1.14  (4)     0.09 (10)     1.01  (6)     0.66  (9)     0.87  (8)
27. Peripherals (maintenance/ configuration)  -0.47 (19)     1.10  (5)    -0.58 (20)     0.34 (12)    -0.47 (21)    -1.35 (25)
APPENDIX 3. Factor Q-Sort Values for Statements Sorted by Consensus vs. Disagreement
(Variance across normalized factor scores; statements are listed from greatest consensus to greatest disagreement.)

Statement                                      F1   F2   F3   F4   F5   F6
4. General technologies                         0    0    0    0    0    0
18. Freedom to choose own training venue        1    1    0    1    0    0
24. Adequate funding for training               0    2    1    2    1    3
26. Supervisor support for training             0    2    1    1    1    1
5. Data storage                                -2   -2   -1    0   -1    0
11. General hardware troubleshooting           -3   -1    0   -2   -1   -2
7. General user knowledge                      -1    0   -2    0    2   -1
23. "Traditional" instructor-led training       1    1    0    1    3    0
9. Freedom to choose own training content      -3   -1   -1   -2   -1    0
14. Telecommunications                         -1    0   -2   -3   -2   -1
25. Time off from work for training            -1    1    1    2   -1    1
17. Servers                                     0    0    0    1   -2    1
22. Cross training with work colleagues         0    1    1    0    3   -1
8. Operating systems                           -2   -2    2    0    0   -1
10. Multimedia design                           0    0   -3   -1   -3   -2
27. Peripherals (maintenance/ configuration)   -1    2   -1    0   -1   -2
16. Systems analysis and design                 3   -1    0    0    2    2
20. Formal education                            1    3    1   -1    0    3
21. Self-training/ tutorials/ manuals           0    1    0   -1    1   -3
12. Information sharing between peers          -2   -1    2   -1    1    0
19. Distance-based/ Internet training           1    3   -1   -1    0   -3
6. Certifications                               1    0    3    1   -2   -1
3. Development tools/ applications              2   -2   -2    2    1    1
2. Databases                                    2   -3   -1    3    0    2
13. Network administration                     -1   -1    3   -2   -3    0
15. IT management skills                        3    0    2   -3    2    2
1. Programming languages                        2   -3   -3    3    0    1
APPENDIX 4. Decision Scenarios Supported by the Q-methodology

Scenario 1: Finding the "Best Fit" Between Managerial and Individual Priorities

                ρ       Five most important statements    Five least important statements
                        (sort values 3, 3, 2, 2, 2)       (sort values -2, -2, -2, -3, -3)
Manager          -      1,  6,  4,  5, 19                 22, 23, 26, 24, 25
Candidate 1     0.64    1,  8,  2,  4,  5                 24, 25, 26, 20, 27
Candidate 2    -0.01    2,  3,  1,  6, 15                  9, 10, 13,  8, 11
Candidate 3    -0.39    4, 13, 17, 25, 26                  9, 16, 21,  1, 12

Appendix 4, Scenario 1: Correlations between q-sorts of management priorities and those of three internal job candidates. Note: the candidate whose priorities are most similar to the priorities of management will exhibit the highest correlation. In this case, candidate 1 (correlation of 0.64) is the best fit with management priorities.

Scenario 2: Training and Development Resource Allocation

              Job Title                          Factor 1    Factor 2    Factor 3
Employee 1    Application Development Manager      0.2332     -0.0197      0.9112
Employee 2    Training Specialist                  0.0742     -0.0222      0.3365
Employee 3    Business Analyst                    -0.0974      0.2371      0.5609
Employee 4    Supervisor                          -0.0557      0.6015      0.2240
Employee 5    Senior Programmer Analyst            0.3024      0.7053      0.1285
Employee 6    Senior Programmer Analyst            0.7555      0.2252      0.2255
Employee 7    Senior Programmer Analyst            0.4382      0.1014      0.2994
Employee 8    Systems Analyst                      0.6579      0.0601     -0.0975
Employee 9    Systems Analyst                      0.5067      0.3164      0.2328
Employee 10   Database Administrator               0.2077      0.7387     -0.0288
Employee 11   Programmer Analyst                   0.3769      0.2023      0.4703
Employee 12   IT Support Technician                0.1063      0.3624     -0.0193
Variance explained                                    15%         15%         15%

Appendix 4, Scenario 2: Q-factor analysis of twelve employees to receive allocations of training and development resources. Note: The q-factor analysis has identified three groups of employees that share similar training and development priorities. Factor 1 is comprised of employees 6 – 9; factor 2 is comprised of employees 4, 5, 10, and 12; factor 3 is comprised of employees 1 – 3, and 11.
Statement                                      Factor 1      Factor 2      Factor 3
1. Programming languages                       2.12  (1)     1.98  (2)     0.03 (14)
2. Databases                                   1.01  (6)     1.48  (3)     0.22 (11)
3. Development tools/ applications             1.24  (4)     1.35  (4)     0.12 (13)
4. General technologies                       -0.39 (17)    -0.41 (18)    -0.55 (19)
5. Data storage                               -0.13 (15)    -0.96 (23)    -0.63 (21)
6. Certifications                             -0.80 (23)     2.12  (1)    -1.24 (23)
7. General user knowledge                     -0.12 (14)    -1.09 (24)     0.64  (7)
8. Operating systems                          -0.80 (22)     0.18 (10)    -0.39 (18)
9. Freedom to choose own training content     -2.12 (27)    -1.59 (26)    -1.34 (24)
10. Multimedia design                         -0.52 (19)    -0.65 (22)    -0.71 (22)
11. General hardware troubleshooting          -0.83 (24)     0.12 (12)    -0.57 (20)
12. Information sharing between peers         -1.42 (25)     0.40  (7)    -0.06 (15)
13. Network administration                    -1.65 (26)    -0.15 (16)    -1.57 (26)
14. Telecommunications                        -0.62 (20)    -2.04 (27)    -1.89 (27)
15. IT management skills                      -0.65 (21)     0.26  (9)     2.00  (1)
16. Systems analysis and design                1.21  (5)     1.03  (5)     1.92  (2)
17. Servers                                    0.01 (12)     0.08 (13)     0.16 (12)
18. Freedom to choose own training venue       0.51  (8)    -0.32 (17)     0.47  (9)
19. Distance-based/ Internet training         -0.42 (18)     0.17 (11)     0.24 (10)
20. Formal education                           0.00 (13)    -0.10 (15)     0.77  (6)
21. Self-training/ tutorials/ manuals         -0.16 (16)    -0.64 (21)     1.11  (4)
22. Cross training with work colleagues        0.88  (7)    -1.10 (25)    -0.08 (17)
23. "Traditional" instructor-led training      0.42  (9)    -0.56 (20)     1.05  (5)
24. Adequate funding for training              1.27  (3)     0.30  (8)    -0.08 (17)
25. Time off from work for training            1.34  (2)     0.04 (14)     1.13  (3)
26. Supervisor support for training            0.28 (11)     0.62  (6)     0.61  (8)
27. Peripherals (maintenance/ configuration)      -             -             -

Appendix 4, Scenario 2 (continued): Rank statement totals with normalized factor scores (statement rankings in parentheses) for the three factors extracted by the q-factor analysis of the same twelve employees.