Development of a Measure for the Organizational Learning Construct

Journal of Management Information Systems / Fall 2002, Vol. 19, No. 2, pp. 175–218.
Development of a Measure for the Organizational Learning Construct

GARY F. TEMPLETON, BRUCE R. LEWIS, AND CHARLES A. SNYDER

GARY F. TEMPLETON is an Assistant Professor of MIS in the College of Administrative Sciences at the University of Alabama in Huntsville. He has previously taught MIS courses at Athens State University, Syracuse University, and Auburn University. He has a B.S. in Business Administration (finance major), an M.S. in Business Administration, a Masters in MIS, and a Ph.D. in MIS. He has published in the area of organizational learning and his research also focuses on large information systems development.

BRUCE R. LEWIS is an Assistant Professor of MIS in the Wayne Calloway School of Business and Accountancy at Wake Forest University. He spent 25 years as a practicing IT professional, including serving as the Executive Director of Computing at Auburn University and as a member of the board of the Alabama Supercomputer Authority. He holds a B.S. in Mathematics and an M.S. in Statistics; he received his Ph.D. in MIS from Auburn University. His research interests include issues relating to the management of information technology and business intelligence systems. He has published in the Journal of Management Information Systems, Communications of the AIS, Journal of Computer Information Systems, and Expert Systems with Applications.

CHARLES A. SNYDER is the Woodruff Endowed Professor of MIS in the Department of Management at Auburn University. He received a Ph.D. in Management from the University of Nebraska and he holds an M.S. in Economics from South Dakota State University, an MBA from the Ohio State University, and a BFA from the University of Georgia. His more than 200 refereed publications have appeared in journals such as Journal of Management Information Systems, Information & Management, Academy of Management Review, Academy of Management Executive, California Management Review, Data Management, IEEE Transactions on Engineering Management, and Decision Support Systems. He is coauthor of The Management of Telecommunications, published by Irwin McGraw-Hill. His research interests include knowledge management, information resource management, expert systems, computer-integrated manufacturing, systems analysis and design, and telecommunications management.

ABSTRACT: The concept of organizational learning (OL) is receiving an increasing amount of attention in the research and practice of management information systems (MIS) due to its potential for affecting organizational outcomes, including control and intelligence, competitive advantage, and the exploitation of knowledge and technology. As such, further development of the salient issues related to OL is warranted, especially measurement of the construct. Based on a domain definition grounded in the literature, this research represents the initial work in developing an empirically reliable and valid measure of organizational learning. The rigorous method utilized in the derivation of this measure, which integrates two methodological frameworks for instrument development, is the main strength of this work. The result is an eight-factor, 28-item instrument for assessing OL, derived from a sample of 119 knowledge-based firms. The empirically derived factors are awareness, communication, performance assessment, intellectual cultivation, environmental adaptability, social learning, intellectual capital management, and organizational grafting. MIS function managers can use these factors to gauge organizational or subunit success in the creation and diffusion of new applications of information technology.

KEYWORDS AND PHRASES: innovation, organizational change, organizational intelligence, organizational learning, scale development, technology adoption.

Journal of Management Information Systems / Fall 2002, Vol. 19, No. 2, pp. 175–218. © 2002 M.E. Sharpe, Inc. 0742–1222 / 2002 $9.50 + 0.00.

PROMINENT ORGANIZATIONAL THEORISTS have predicted that the amount of information and knowledge that organizations must process will continue to increase [46, 70]. Several authors have responded to this new era by prescribing learning models for the design of organizations that are more responsive to turbulent environments [8, 39, 91, 99, 112, 128, 134]. In such conceptualizations, organizational learning (OL) is depicted as having great potential for affecting organizational outcomes, such as organizational control and intelligence, competitive advantage, and the exploitation of knowledge and technology. Since interest in applying OL designs has increased over the past several years [134], further development of the salient issues related to OL is warranted, especially measurement of the construct.

OL theory has profound relevance to the science and practice of management information systems (MIS). Past research indicates that MIS is useful in the facilitation and exploitation of the three modes of OL espoused by Argyris and Schön [5], who drew upon the work of Gregory Bateson [12] in the behavioral sciences. First, MIS can translate into superior single-loop learning (SLL), the mode corresponding with incremental organizational change initiatives. Stein and Zwass [128] proposed that successful SLL is better facilitated by the existence of organizational memory performance standards, which often accompany the adoption of organizational memory information systems (IS). Second, MIS can be used to exploit double-loop learning (DLL), information processing intended to translate into radical organizational change. Stein and Vandenbosch [127] discovered five critical success factors that contribute to higher-order learning (DLL) throughout the system development life cycle. They suggested that advanced IS, such as expert and executive information systems, provide unique opportunities for learning that has strategic implications.
Finally, MIS can impact deutero learning, whereby organizations and their members learn how to learn [12]. Alavi [1] surveyed 127 MBA students to determine if group decision support system (GDSS) usage enhances collaborative learning. She found that students using GDSS in collaborative learning experienced higher levels of perceived skill development, self-reported learning, positive classroom experience, and individual learning performance (measured by course grades). Alavi et al. [2], who conducted a longitudinal field study to investigate the efficacy of desktop videoconferencing (DVC) in supporting collaborative telelearning, corroborated these findings. Their findings suggested that higher measures of critical-thinking skill development, group commitment, and group attraction were attributable to the DVC-supported distance-learning environment.

The nature of the knowledge or innovations subject to adoption during OL endeavors is also a significant topic in studies that incorporate OL and MIS. Premkumar et al. [105] examined the relationship between innovation (complexity, compatibility, costs, relative advantage, and communicability) and diffusion (adaptation, internal and external diffusion, and implementation success) characteristics. Perceptions of technical and organizational compatibility of electronic data interchange (EDI) were found to predict its implementation success. Nelson [95] assessed the knowledge and skill requirements of IS and end-user personnel. He found that IS personnel need more organizational knowledge, end users need more IS-related skills, and both personnel categories were deficient in general IS knowledge. Vessey and Conger [142] used process-tracing methods to investigate the effectiveness of process-, data-, and object-oriented methodologies in specifying information requirements during system development. They found that process methodologies facilitated superior learning, and that this was the only methodology type that significantly affected learning over the three trials.

Research also shows the impact of OL within the MIS function. Vandenbosch and Higgins [138] provided evidence that MIS success is dependent upon member learning style. They surveyed 73 executives in nine companies and found that the success of an executive support system (ESS) depends on the type of executive learning. Interestingly, consistent with prior research in MIS, they found that individual differences were not significant precursors to learning behavior.
Larsen [78] investigated whether the implementation of an information technology (IT) innovation is best explained by the ability of middle managers to innovate using business or IT experience. He found that business and IT knowledge could each stimulate both innovativeness and IT adoption success.

The research described herein represents the most rigorous attempt to date at creating an instrument to assess OL-related behaviors in organizations through the empirical development of a valid and reliable measure. In so doing, it provides a systematic technique for collecting, analyzing, and interpreting data about OL for application in organizational research. This undertaking is significant because: (1) the concept of OL is a paradigm for organizational thought, (2) without a measure it is difficult to assess the extent of OL in organizations, (3) a better understanding of OL is important for management, and (4) empirical research in OL will benefit from a quantitative means of measuring the concept.

Review of Organizational Learning Definitions

DUE TO THE EXPANSIVE IMPLICATIONS of OL theory, it is virtually impossible to attribute its genesis to a single theorist, work, or even discipline. Perhaps psychologists would attribute its development to pioneers of individual learning research such as Thorndike [136], Watson [146, 147], Pavlov [103], and Skinner [120]. OL theorists would attribute its beginnings to many important cumulative developments made in
organizational theory that help explain its nature. For instance, the tenets of Adam Smith's [122] Wealth of Nations embodied the organizational adoption of techniques and technologies. Frederick Taylor's [131] views on scientific management explained how work associated with the management of organizational operations could be objectified and improved. In his seminal report on learning curves, T.P. Wright [149] described the outcome patterns associated with what many would describe as collective learning behavior. Credit should also be given to those responsible for articulating contrasting organizational design alternatives, such as bureaucracy [148] and organizational functionalism [49]. Each of these historical works has had a profound impact on the thought that led to the need for OL research. Cyert and March [37] were the first to coin the phrase organizational learning and to articulate learning as an organizational phenomenon.

The cumulative tradition of OL research has resulted in the development of several key tenets, involving: (1) organizational mode, (2) organizational environment, (3) member behavior and cognition, and (4) information content. Organizational mode refers to the extent to which the organization is seeking intended change, whether through single-loop [5], double-loop [5], or deutero [12] learning. SLL can be said to embody the philosophies and prescriptions associated with managing incremental change, such as those espoused by W. Edwards Deming [41]. DLL relates to the radical view of change articulated by Hammer and Champy [61]. Deutero learning is defined as "learning to learn," and is perhaps the most intelligent behavior organizations can exhibit. OL also represents a body of thought that (relative to competing paradigms) pronounces the significance of the organizational environment in inducing organizational self-design [107]. More precise classifications of member behavior and cognition have been offered by Huber [71] and Bandura [9].
Senge [114] has articulated a series of five organizational disciplines that organizational stakeholders should exhibit in their work. Past attention on the explicit notion of OL has been placed on its conceptualization [96], management [92], development [55], and exploitation [126]. There have been very few serious previous attempts at providing an acceptable domain definition of OL [10, 132]. Barriers to defining OL have plagued its development, chief among them the challenge of reaching consensus on so complex a construct in a time of unprecedented popularity. Few have attempted to derive psychometrically acceptable measures of the varying perspectives on the OL construct [57]. Given the multitude of perspectives regarding its definition and use, it is hoped that the results of the current study contribute at least a partial solution to the difficulties inherent in OL research.

Progression of Study

THIS RESEARCH PROGRESSED THROUGH THREE STAGES. First, a conceptual definition of the OL construct was derived from a content analysis of the literature. Next, a set of items was generated and a measurement instrument was designed, evaluated, and refined through several steps. Finally, data from an administration of the instrument were summarized to provide a statistical profile of the extent to which organizations engage in OL.

This methodology is based on the paradigm for measurement development proposed by Churchill [28], which has been utilized in various MIS studies [80, 86, 108, 116, 117]. Each of the four instrument development phases enumerated by Churchill [28] focuses on satisfying validity and reliability concerns about the construct through iterative development and testing. These phases are: (1) construct domain specification, (2) construction of items, (3) data collection, and (4) measure purification. In this study, particular attention was paid to the data collection and measure purification steps by subjecting the instrument to repeated administrations and tuning.

The successful tradition of the Churchill [28] paradigm was augmented in this study through consideration of Malhotra and Grover's [90] Ideal Survey Attributes (ISA). These attributes, denoted as ISA-n in this research, are displayed in Table 1. Conveniently for the Churchill [28] method, the ISA items relate to key success factors in instrument development and quality improvement [7]. This paper notes which of the ISA attributes were satisfied in each step of the instrument development process.
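This kind of bookkeeping can be sketched programmatically. The mapping below is a hypothetical illustration only (the authors did not publish code): the step names are informal labels, but the ISA numbers attached to each step follow the study's own reporting in Stage II.

```python
# Hypothetical sketch (not part of the original method): track which of
# Malhotra and Grover's Ideal Survey Attributes (ISA) each instrument
# development step addressed, per the study's Stage II reporting.
ISA_BY_STEP = {
    "item construction": ["ISA-1", "ISA-2", "ISA-5"],  # unit of analysis, instrument fit, multi-item variables
    "pretest": ["ISA-7"],                              # field-based pretesting
    "pilot test": ["ISA-10"],                          # pilot data used for purification
}

def attributes_satisfied(steps_completed):
    """Return the set of ISA attributes covered by the completed steps."""
    covered = set()
    for step in steps_completed:
        covered.update(ISA_BY_STEP.get(step, []))
    return covered
```

Such a table makes it easy to audit, at any point in an instrument-development project, which quality attributes remain unaddressed.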

Results

Stage I: Conceptual Definition of OL

FOR THIS PROJECT, A CONCEPTUAL DEFINITION of OL was determined by conducting a content analysis of selected OL literature. Content analysis involves any of several techniques used to systematically analyze and concisely describe the content of written, spoken, or image communications [4, 23, 25]. It was employed in this research to support existing theory defining OL subconstructs (for example, Huber's [71] four OL subprocesses) and to extend the theory by uncovering additional important activities related to OL.

The selected literature included academic and practitioner articles and books concerned with OL in several disciplines. The ProQuest Direct database was searched; articles and books were chosen if the phrase organizational learning appeared in the title or in the key word list. Bibliographies of the selected articles were reviewed to further explore important concepts. Table 2 presents the authors and works discovered and utilized in the content analysis, along with each work's specific contribution in articulating the OL construct.

For the purposes of defining OL, it is important to distinguish between what the concept is and what it is not. The myriad of descriptive research on OL can be parsed into (1) works that explicitly discuss OL and (2) works that refer to OL in relation to other organizational issues. Works that discuss the OL construct (see Table 2) are useful in its definition, whereas the latter are useful in describing its relationship with other concepts [87, 135]. This research does not attempt to define and measure non-OL aspects of organizations, such as performance outcomes, context, and resource availability, which might correlate with OL. Even so, uniquely defining the OL construct can be an extremely arduous undertaking.
Table 1. Malhotra and Grover's [90] Ideal Survey Attributes

General
ISA-1 Is the unit of analysis clearly defined for the study?
ISA-2 Does the instrumentation consistently reflect that unit of analysis?
ISA-3 Is the respondent(s) chosen appropriate for the research question?
ISA-4 Is any form of triangulation used to cross-validate results?

Measurement error
ISA-5 Are multi-item variables used?
ISA-6 Is content validity assessed?
ISA-7 Is field-based pretesting of measures performed?
ISA-8 Is reliability assessed?
ISA-9 Is construct validity assessed?
ISA-10 Is pilot data used for purifying measures?
ISA-11 Are confirmatory methods used?

Sampling error
ISA-12 Is the sample frame defined and justified?
ISA-13 Is random sampling used from the sample frame?
ISA-14 Is the response rate over 20 percent?
ISA-15 Is nonresponse bias estimated?

Internal validity error
ISA-16 Are attempts made to establish internal validity of the findings?

Statistical conclusion error
ISA-17 Is statistical power sufficient?

First, a belief prevails among theorists that various perspectives on OL not only exist but also are appropriate [29]. For instance, in academia, there are many reference disciplines found to be influential on OL theory, including organizational sociology; organizational behavior and psychology [119]; organization theory; industrial economics; economic history; and business, management, and innovation studies [44]. Second, within organizations, members of the various functional areas (human resources, information technology, strategic management, process development, finance and accounting, and so on) view OL differently. Templeton [132] and Shrivastava [118] articulated six and three perspectives, respectively, on OL from which definitions might be drawn. Third, subjects that learn do so in widely varying style patterns [32, 44] that are largely dependent upon environmental context [94, 135]. Finally, researchers are also apt to disagree as to the level of analysis at which OL is enacted [71, 113]. For these reasons, it is important to respect the varying potential operational definitions that can be useful in practice and research.

Whereas a well-rounded view of OL can be very useful in helping researchers deliberate and decide what areas to investigate, any effort to satisfy all perspectives with one omnibus measurement instrument is futile. Arriving at an acceptable definition and measure of the concept will only serve as a temporary solution in such a complex and emerging discipline. Even though the use of definitions in emerging disciplines such as OL can be inefficient [29], we believe that development efforts such as the research reported here are a fruitful way for the field to make progress toward the goal of "normal science," as advocated by Kuhn [77]. Such efforts will serve as the impetus for discussion and refinement of ideas, tools, methods, and goals, which will serve

Table 2. Attributes of Organizational Learning Supported by the Literature

[The original table, spanning pp. 181–188, marks each work against two groups of columns: Perspective (individual; organizational and societal; demographic; social action; outcome) and Contribution (knowledge acquisition; information distribution; information interpretation; organizational memory; structural change; deutero learning; intelligence; information content validity; control; organizational consequences). The per-work checkmarks are not recoverable from this copy. The works covered are: Albert (1992) [3]; Argyris and Schön (1978) [5]; Argyris and Schön (1996) [6]; Bahlmann (1990) [8]; Barnsley et al. (1998) [11]; Bechtold (2000) [13]; Bedeian (1986) [14]; Bell and Scott-Kemmis (1990) [15]; Berg (1993) [16]; Bouwen and Fry (1991) [18]; Bowman and Hurry (1993) [19]; Bowonder and Miyake (1994) [20]; Brown and Starkey (2000) [21]; Cavaleri (1994) [26]; Chalofsky (1996) [27]; Cohen and Levinthal (1990) [30]; Cook and Yanow (1993) [31]; Cross and Baird (2000) [35]; Cyert and March (1963) [37]; Daft and Huber (1987) [38]; Daft and Weick (1984) [39]; De Geus (1988) [40]; DiBella et al. (1996) [42]; Dixon (1992) [43]; Dodgson (1993) [44]; Dowd (2000) [45]; Dutton and Dukerich (1991) [47]; Engeström (1999) [48]; Fiol and Lyles (1985) [50]; Fisher and White (2000) [51]; Foy (1980) [52]; Friedlander (1983) [53]; Gardner (1983) [54]; Garvin (1993) [55]; Gioia and Thomas (1996) [56]; Goldhar and Lei (1995) [58]; Goodman and Darr (1998) [59]; Hannabuss (1984) [62]; Hedberg (1981) [65]; Hedberg et al. (1976) [66]; Herbert (2000) [67]; Hines and Goul (1998) [68]; Hobday (1990) [69]; Huber (1991) [71]; Kiernan (1993) [74]; Kuchinke (1995) [76]; Lee et al. (1992) [81]; Leonard-Barton (1992) [82]; Levinthal and March (1993) [83]; Levitt and March (1995) [84]; Lukas et al. (1996) [85]; Lyles and Schwenk (1992) [88]; Mahoney (1995) [89]; March (1991) [89]; McGill et al. (1992) [92]; Miller (1996) [93]; Nicolini and Meznar (1995) [97]; Nonaka (1991) [98]; Poell et al. (2000) [104]; Pucik (1988) [106]; Rothwell (1993) [110]; Sackmann (1991) [111]; Schein (1996) [112]; Schein (1996) [113]; Senge (1990) [114]; Senge and Sterman (1993) [115]; Sinkula (1994) [119]; Slater and Narver (1995) [121]; Spender (1989) [124]; Sproull (1981) [125]; Stata (1989) [126]; Templeton and Snyder (1999); Van de Ven (1986) [139]; Ventriss (1990) [140]; Ventriss and Luke (1988) [141]; Walsh and Ungson (1991) [143]; Watkins and Marsick (1993) [144]; Watkins and Marsick (1995) [145].]
as standards of practice in OL research. It would be most useful for the field to concurrently develop, test, and deliberate the utility of a working sample of measures that are adequate representations of meaningful perspectives. Due to the numerous definitional choices available to researchers, it is necessary to explain and justify the point of view employed in attempts at operationalizing definitions into measures.

In consideration of the varying perspectives regarding the subject, this research focused on three paradigmatic views (see Table 2) that theorists use when defining the OL concept: the demographic, social action, and outcome perspectives. Templeton [132] discusses four justifications for focusing operationalization efforts on the social action perspective: (1) it has an existing cumulative tradition of acceptance in the field, (2) it has the most potential for utility in OL research, (3) it facilitates the examination of ways in which organizational members enact pieces of OL, and (4) it addresses the levels of analysis that are active during OL. Each of these justifications is aimed at supporting managerial practice in learning organizations. In addition, the social action perspective allows us to inquire about aspects of organizational structure that are more dynamic than demographics and outcomes, and that are therefore more subject to managerial decision making and control. For instance, environmental context may directly alter learning style choices but not the demographic state of the organization, and differing learning mechanisms may result in the same outcome [83].

In the review of literature for the content analysis in this study, all relevant articles were processed using ontological specification, as described by Templeton and Snyder [133].
The ontological specification procedure involves four steps: (1) selection of the topic area, (2) delineation of concepts that describe the overall construct, (3) transfer to a reusable medium, and (4) use of the concepts in labeling sources. This effort led to the establishment of several search attributes related to OL and involved multiple passes through the literature. As a result, 78 explicit definitions of OL were discovered and synthesized into the following conceptual definition for this study:

Organizational learning is the set of actions (knowledge acquisition, information distribution, information interpretation, and organizational memory) within the organization that intentionally and unintentionally influence positive organizational change.

Our analysis of the literature uncovered widespread support for the OL components contained within the Huber [71] taxonomy. For that reason, this definition is very similar to Huber's. The methodology produced a definition of the social-action perspective of OL, and not of its hypothetical correlates. Further, this definition depicts OL as an organizational-level construct and an ongoing process.
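The core of such a synthesis is a simple tally: for each collected definition, record which of Huber's [71] four subprocesses it mentions. The sketch below is purely illustrative (the study's actual ontological-specification coding was richer and manual); the function name, term list, and sample definitions are all assumptions for demonstration.

```python
from collections import Counter

# Illustrative sketch only: tally which of Huber's [71] four OL subprocesses
# are mentioned across a set of collected OL definitions. This is not the
# authors' actual coding procedure, merely the idea of the tally.
SUBPROCESSES = (
    "knowledge acquisition",
    "information distribution",
    "information interpretation",
    "organizational memory",
)

def tally_subprocesses(definitions):
    """Count, per subprocess, how many definitions mention it."""
    counts = Counter({s: 0 for s in SUBPROCESSES})
    for definition in definitions:
        text = definition.lower()
        for s in SUBPROCESSES:
            if s in text:
                counts[s] += 1
    return counts

# Two invented sample definitions, for illustration:
defs = [
    "OL is knowledge acquisition followed by information distribution.",
    "OL builds organizational memory through information interpretation.",
]
```

A tally of this kind makes the claim of "widespread support" for a taxonomy auditable: each component's count can be reported alongside the synthesized definition.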

Stage II: OL Measurement Instrument

Stage II of this project involved developing and perfecting an instrument based on the conceptual definition of OL presented above. In addition to the definition, the content analysis of the literature from Stage I produced a sample of item stems that depicted
OL activities in organizations (Table 3). Again, these items were largely derived from Huber's [71] work, with extensions made in numerous areas. These items were used to generate the original statements on the instrument. The methodology for completing Stage II involved several steps to establish content validity throughout the instrument development process.

The original draft of the questionnaire included a total of 46 questions, each derived from an item stem (Table 3) representing a distinct aspect of OL. The questions characterized the respondent's perceptions about the presence of specific OL behaviors in their organization. Since organizations cannot perceive phenomena, individuals may be (and commonly are) surveyed as their proxy [90]. Thus, the OL questionnaire was designed to elicit the respondent's professional judgment about the appearance of OL activities in their firm. Scale response categories for each item on the instrument were: (1) strongly disagree, (2) moderately disagree, (3) undecided, (4) moderately agree, and (5) strongly agree. The development of these questions addressed the ISA-1 (unit of analysis clearly defined), ISA-2 (instrument reflects unit of analysis), and ISA-5 (variables include multiple items) quality attributes. In addition to the questions about OL on the initial questionnaire, data on individual-level (job function and top management experience) and organizational-level (industry and firm size) demographic variables were solicited.

Next, a pretest of the questionnaire was conducted. Four categories of respondents were selected for the pretest, based on their expertise: MIS faculty, MIS practitioners, survey instrumentation experts, and organizational behavior theorists. The questionnaire, sent by facsimile, included a separate evaluation form (SEF).
The SEF offered each respondent an opportunity to critique the instrument on matters important for good questionnaire design, such as format, content, understandability, terminology, and ease and speed of completion. In addition, the respondents were asked to identify specific questions they felt should be added to or deleted from the questionnaire. Finally, the respondents were asked to make suggestions for enhancement. A total of nine pretest packets were administered and returned. All responses were reviewed and adjustments made based on the feedback. The pretest step in the development of the instrument addressed ISA-7 (pretesting), and began the cyclical process of data collection and instrument purification that continued throughout Stage II.

Following revisions from the pretest, a pilot test was undertaken to appraise and further purify the instrument. A cover letter and the revised questionnaire were administered to 24 IT management professionals from 10 different industries. The cover letter explained the purpose of the research and asked the respondents to complete the questionnaire and offer suggestions for improvement. This step utilized an electronic interface with a web-based form for data collection. Again, the questionnaire was revised based on the feedback. The pilot test step in the development of the instrument addressed ISA-10 (pilot testing).

The content validity of the measurement instrument was then directly investigated by executing a variation on the procedure developed by Lawshe [79] for quantitatively assessing content validity. This technique employed a content evaluation panel of individuals knowledgeable about the concept being measured. The panel consisted

Information distribution

Congenital learning

Knowledge acquisition

Knowledge dissemination*

Knowledge logistics*

Searching and noticing

Grafting

Vicarious learning

Experiential learning

Criterion

Subconstruct KA-a KA-b KA-c KA-d KA-e KA-f KA-g KA-h KA-I KA-j KA-k KA-l KA-m KA-n KA-o KA-p KA-q KA-r KA-s ID-a ID-b ID-c ID-d ID-e ID-f ID-g

Item code

Table 3. Original Organizational Learning Subconstructs, Criteria, and Item Stems

New member learning* Member learning from organizational creation* Organizational experimenting Organizational self appraisal Experimenting organizations Unintentional or unsystematic learning Experience-based learning Imitating competitors* Imitating interindustrial organizational practices* Imitating alliance organizational practices* Practicing corporate intelligence* Boundary spanning* Adopting new members* Adopting organizational forms* Adopting intelligence* Scanning Focused searching Monitoring performance Noticing Understanding knowledge sources* Understanding knowledge content* Understanding information needs* Sharing* Educating and training* Technology-based disseminating* Integrating disparate knowledge* (continued)

Item stems (n = 46)

DEVELOPMENT OF A MEASURE FOR THE ORGANIZATIONAL LEARNING CONSTRUCT 191

Human memory* Other memory*

Computer-based OM

Storing and retrieving information

Unlearning

Information overload

Note: * Extension of Huber’s [71] typology.

Organizational memory

Cognitive maps and framing

Information interpretation

Media richness

Criterion

Subconstruct II-a II-b II-c II-d II-e II-f II-g II-h II-i II-j II-k II-l OM-a OM-b OM-c OM-d OM-e OM-f OM-g OM-h

Item code

Reframing* Homogenous interpreting* Cognitive map influence* Language framing* Communications media capability* Media richness* Media choice* Exceeding information processing limitations* Resolving information overload* Informational unlearning* Behavioral unlearning* Structural unlearning* Storing* Retrieving* Managing data* Strategic human resources turnover* Electronic storing* Electronic documenting* Human memory* Other memory*

Item stems (n = 46)

Table 3. Original Organizational Learning Subconstructs, Criteria, and Item Stems (continued)

192 TEMPLETON, LEWIS, AND SNYDER

DEVELOPMENT OF A MEASURE FOR THE ORGANIZATIONAL LEARNING CONSTRUCT

193

of 18 leading OL experts, primarily from academia. The panelists were sent a copy of the revised instrument and were asked to respond to each activity’s relevance to OL on a three-point scale: 1 = not relevant, 2 = important (but not essential), 3 = essential. All 20 of the experts responded, and from these data, a content validity ratio (CVR) was computed for each item from the following formula: CVR = (n – N/2)/(N/2), where n is the frequency count of the number of panelists rating the item as either 3 = essential or 2 = important (but not essential), and N is the total number of respondents. Lawshe [79] only utilized the “essential” response category in computing the CVR. In this study, a less stringent criterion was employed [86]. Responses of both “important (but not essential)” and “essential” were utilized because they were positive indicators of the items’ relevance to OL. Respondents that did not provide a rating on a given item were not used in the calculation of the CVR for that item. Table 4 presents the means and CVRs of the items from the Lawshe procedure. The CVR for each item was evaluated for statistical significance at the 0.05 level, using the table published by Lawshe [79]. Statistical significance meant that more than 50 percent of the panelists rated the item as either “essential” or “important.” According to Lawshe [79], this majority vote indicated some content validity for the item. Of the 46 items, 31 were found to be significantly content valid, and each remained on the final version of the questionnaire. The 15 statistically insignificant items were dropped from the study at this point, resulting in the final version of the questionnaire shown in the Appendix. The Lawshe procedure responded to ISA-6 (content validity assessment). The final version of the questionnaire was administered to top managers of companies in Huntsville, Alabama, a research and science-based community. 
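The CVR computation lends itself to a short sketch. The function name and the example panel counts below are ours, for illustration only; they are not the study's data:

```python
def content_validity_ratio(n_relevant, n_panelists):
    """Lawshe's CVR = (n - N/2) / (N/2), where n is the number of
    panelists rating the item as relevant (here, 'essential' or
    'important') and N is the number who rated the item at all."""
    half = n_panelists / 2
    return (n_relevant - half) / half

# 17 of 18 panelists rate an item as relevant:
print(round(content_validity_ratio(17, 18), 2))  # prints 0.89
```

A CVR of 0.0 corresponds to exactly half of the panel rating the item as relevant, which is why significance amounts to a majority-vote test.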
Table 4. Lawshe Procedure Results
(Each item is followed by its CVR and the number of panelists rating it; * CVR significant at the 0.05 level.)

Knowledge acquisition
KA-a. New employees ignore the knowledge of existing employees. (0.11; n = 18)
KA-b. The company is still highly influenced by the vision of the founder(s). (0.11; n = 18)
KA-c. Management uses feedback from company experiments (such as trials of new methods and surveys). (1.00*; n = 18)
KA-d. Management monitors important organizational performance variables. (0.88*; n = 17)
KA-e. Employees are discouraged from recommending new work ideas. (0.89*; n = 18)
KA-f. Employees learn about the company's recent developments through informal means (such as news stories and gossip). (1.00*; n = 17)
KA-g. Overall, the company is losing personnel experience. (0.06; n = 17)
KA-h. The company imitates competitors (that is, products, strategies, and practices). (0.11; n = 18)
KA-i. Management ignores the practices of organizations outside our industry. (0.29; n = 17)
KA-j. Management learns from the company's partners (such as customers, suppliers, allies). (0.65*; n = 17)
KA-k. Management ignores the strategies of competitors' top management. (0.76*; n = 17)
KA-l. Managers ignore information about industry events. (0.07; n = 15)
KA-m. The company hires highly specialized or knowledgeable personnel. (0.89*; n = 18)
KA-n. The company acquires subunits (such as organizations, functions, departments) based on short-term financial gain. (0.53*; n = 17)
KA-o. When internal capabilities are deficient, we acquire them from the outside. (0.67*; n = 18)
KA-p. Management monitors the fit between company strategy and competitive environment. (0.44; n = 18)
KA-q. Management proactively addresses problems. (1.00*; n = 18)
KA-r. The company collects data on all facets of performance. (0.67*; n = 18)
KA-s. Management learns new things about the company by direct observation. (0.88*; n = 17)

Information distribution
ID-a. When employees need specific information, they know who will have it. (0.89*; n = 18)
ID-b. Employees have difficulty finding needed work-related information. (0.44; n = 18)
ID-c. Employees are keenly aware of where their knowledge can serve the company. (0.88*; n = 16)
ID-d. Employees keep information (such as numbers, plans, ideas) from other employees. (0.56*; n = 18)
ID-e. Employees make extensive use of IS to support their work. (1.00*; n = 17)
ID-f. Management assigns employees to other parts of the organization for cross training. (1.00*; n = 18)
ID-g. Top management integrates information from different organizational areas. (1.00*; n = 18)

Information interpretation
II-a. Managers consistently scan and update their views of the competitive environment. (0.06; n = 17)
II-b. Employees' interpretations about company events differ widely. (0.44; n = 18)
II-c. Management encourages the use of frameworks and models to assist in decision-making. (0.78*; n = 18)
II-d. Employees are encouraged to communicate clearly. (0.65*; n = 17)
II-e. The communications tools used in the company are deficient. (0.13; n = 16)
II-f. The company's communications tools (telephone, e-mail, and so on) are capable of rich information content. (0.38; n = 16)
II-g. Employees have a large variety of communications tools (telephone, e-mail, Internet, and so on) from which to choose. (0.67*; n = 18)
II-h. There is too much information available in the company. (0.47; n = 15)
II-i. Before final decisions are made, options are evaluated rigorously. (0.44; n = 18)
II-j. Management removes obsolete information from employee access. (0.89*; n = 18)
II-k. Our employees resist changing to new ways of doing things. (0.88*; n = 17)
II-l. The company is slow to react to technological change. (0.56*; n = 18)

Organizational memory
OM-a. The company stores detailed information for guiding operations. (0.76*; n = 17)
OM-b. Employees retrieve archived information when making decisions. (1.00*; n = 18)
OM-c. There is a formal data management function in the company. (0.53*; n = 17)
OM-d. The company maintains a certain mix of skills among its pool of employees. (0.67*; n = 18)
OM-e. The company makes extensive use of electronic storage (such as databases, data warehousing, scanned documents). (0.76*; n = 17)
OM-f. Employees use electronic means to communicate. (1.00*; n = 18)
OM-g. The company develops experts from within. (0.67*; n = 18)
OM-h. The company makes extensive use of information from other firms (suppliers, partners, customers, and so on). (0.38; n = 16)

This population of firms was targeted due to the expectation that organizational learning is more likely to be evident in knowledge-based or information technology-dependent firms. Although OL takes place in all organizations, some are perceived to be more knowledge-intensive than others. As Huntsville consistently ranks among the top U.S. metropolitan areas in software employment density [64] and was listed as one of ten new "high tech havens" in the United States [109], the companies in this study are generally in this latter group. The 1999-2000 Industrial Directory for the Chamber of Commerce of Huntsville/Madison County was used for selection of the sample. Of the 1,259 high tech and knowledge-based firms listed in the directory, 383 were randomly selected for the study. The cover letter asked the heads of these commercial firms to serve as proxy respondents for their organizations. The respondents chosen for this study addressed the ISA-3 (appropriate sample respondents), ISA-12 (sample frame justified), and ISA-13 (random sample) quality attributes.

Three aspects of response quality were assessed. First, 119 of the 383 sample frame members responded, a 31.1 percent response rate. This satisfied ISA-14, that the response rate should be over 20 percent. On average, these respondents had approximately 10 years of experience with their company and had served in their current position for 6.5 years. The average age of their companies was nearly 13 years. Second, ISA-17 is concerned with the sufficiency of statistical power in reducing statistical conclusion error, or the accuracy of conclusions about covariation made on the basis of statistical evidence. Malhotra and Grover [90] state that statistical conclusion error depends on the statistical power of a test (its ability to detect effects of a specific size given the particular variances and sample sizes of the study). In this study, the item-to-subject ratio of 3.83 (119/31) translated to adequate statistical power for exploratory factor analysis. Third, to assess nonresponse bias (ISA-15), a chi-square test for differences between the industry distributions of the respondent group and the population was employed. Using the four industry categories (IT, research, knowledge application, and engineering/design), the chi-square test resulted in a p-value of 0.451, implying no difference between the population and sample groups with respect to industry affiliation.

The psychometric properties of the instrument were evaluated next using the data from the administration of the questionnaire. Both construct validity (ISA-9) and reliability (ISA-8) were addressed. Construct validity is concerned with the appropriateness of the underlying structure of the OL construct [24, 129]. This study utilized two methods for assessing construct validity: determining the empirical dimensions of OL through principal components factor analysis, and checking the reasonableness of these dimensions through known groups analysis. Factor analysis was employed in this research to empirically select the most important items to represent OL [63] and to provide a statistical grouping of items with similar theoretical meanings [75]. Categorizing items using this method resulted in the satisfaction of ISA-5 (variables include multiple items).
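The nonresponse-bias check described above is a standard goodness-of-fit test. A minimal sketch follows; the respondent counts and population proportions are hypothetical stand-ins, not the study's data:

```python
# Hypothetical respondent counts by industry category
# (IT, research, knowledge application, engineering/design).
observed = [40, 21, 24, 34]
population_proportions = [0.32, 0.18, 0.21, 0.29]  # assumed frame shares

total = sum(observed)
expected = [p * total for p in population_proportions]
chi_sq = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# The chi-square critical value for df = 3 at the 0.05 level is 7.815;
# a statistic below it implies no detectable bias by industry.
biased = chi_sq > 7.815
```

With these illustrative counts the statistic is far below the critical value, mirroring the study's conclusion of no industry-related nonresponse bias.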
Although the current study started with the creation of item stems from four prominent theoretical subconstructs, exploratory (nonhypothetical) methods were used to establish empirically derived factors from the data. Exploratory methods were appropriate because (1) no theory exists based on testing the coexistence of all four factors in a cohesive model, and (2) this research represents the initial empirical work done on the proposed factors.

Before construct validity was assessed via factor analysis, two tests were performed to determine whether these data were appropriate for factor analysis. The Kaiser-Meyer-Olkin statistic (KMO = 0.78) exceeded 0.70, which is in the middling range [60]. The Bartlett sphericity test (chi-square = 1998.18, df = 465, p = 0.00) was significant at the 0.001 level. Thus, the item pool from the response data was amenable to factor analysis.

Exploratory principal components factor analysis was conducted to extract factors with eigenvalues of one or greater [17, 101, 102, 130]. A scree plot was used to further verify the number of factors to be included in the final solution. The sequential application of these two procedures resulted in the inclusion of eight factors in the measure of OL.

Several rotation techniques were tested on the original 31 items. The rotated factor solutions were judged on simplicity [75, 116], interpretability [72, 80], and the percent of variance explained [16, 130]. The rotation method that best satisfied these criteria was equamax, a combination of two orthogonal rotation strategies: quartimax (which simplifies the variables) and varimax (which simplifies the factors). The strength of orthogonal rotation methods like equamax is that their results are more likely to be replicated in future studies.

The factors were statistically formed based on the item factor loadings. An item was assigned to a factor if its loading on that factor exceeded 0.50, which is at the top of the 0.35 [80, 116] to 0.50 [130] range used in previous exploratory studies. Items with no loading exceeding 0.50 were dropped from further analysis. As a result, a total of three items were dropped: II-j (informational unlearning), KA-c (organizational experiments), and KA-e (experimenting organizations). The remaining 28-item solution explained 68.4 percent of the systematic covariance among the items. No items loaded on multiple factors. Finally, labels were given to the empirically derived factors of OL, as reported in Table 5. The exploratory factor analysis procedure resulted in the establishment of eight reasonable dimensions to describe OL. These factors provided evidence of the construct validity of the derived measure.

The first factor was labeled awareness, and accounted for 10.6 percent of the overall covariance. The five items contained in the awareness factor had loadings ranging from 0.55 to 0.69, and represented the extent to which organizational members are aware of the sources of key organizational information and its applicability to existing problem areas. The second factor, labeled communication, accounted for 9.5 percent of covariance. Factor loadings ranged from 0.51 to 0.84 among the three items, which represented the extent of communication that exists between organizational members. This factor included consideration for the use of, and accessibility to, communications technologies. The third factor was labeled performance assessment, and accounted for 9.4 percent of the total covariance.
Factor loadings ranged from 0.58 to 0.81 among the four items, which represented the comparison of process- and outcome-related performance to organizational goals. The fourth factor, intellectual cultivation, accounted for 8.8 percent of overall covariance. The factor loadings of this construct ranged from 0.51 to 0.68 among the four items, which represented the development of experience, expertise, and skill among existing employees. The fifth factor was environmental adaptability, which accounted for 8.1 percent of total covariance. The four items contained in this factor had loadings ranging from 0.60 to 0.66, and represented mostly technology-related items pertaining to organizational responses to environmental change. The sixth factor, social learning, accounted for 8.1 percent of total covariance. Factor loadings in the three-item construct ranged from 0.63 to 0.74. The items represented the extent to which organizational members learn through social channels about organizational concerns. The seventh factor was intellectual capital management, and accounted for 7.4 percent of covariance. The loadings for the three items in this factor ranged from 0.56 to 0.68. The intellectual capital management construct represented the extent to which the organization manages knowledge, skill, and other intellectual capital for long-term strategic gain.
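The extraction rule (retain factors with eigenvalues of one or greater) and the assignment rule (keep an item only if its loading exceeds 0.50) can be sketched as follows. The function names and the small loading matrix are illustrative, not the study's data:

```python
import numpy as np

def kaiser_retained(data):
    """Count principal components of the correlation matrix with
    eigenvalues of one or greater (the Kaiser criterion)."""
    eigenvalues = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))
    return int((eigenvalues >= 1.0).sum())

def assign_items(loadings, threshold=0.50):
    """Assign each item to its highest-loading factor when that
    loading exceeds the threshold; otherwise drop the item."""
    assignments = {}
    for item, row in enumerate(loadings):
        factor = int(np.argmax(np.abs(row)))
        if abs(row[factor]) > threshold:
            assignments[item] = factor
    return assignments

# Illustrative rotated loadings for four items on two factors:
loadings = np.array([[0.69, 0.12],
                     [0.10, 0.55],
                     [0.58, 0.05],
                     [0.35, 0.40]])
print(assign_items(loadings))  # the item at index 3 is dropped
```

In the study itself the scree plot served as a second check on the Kaiser count before the eight-factor solution was accepted.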

Table 5. Underlying Dimensions of OL
(Factor loadings in parentheses; (–) indicates a reverse-worded item.)

Awareness (alpha = 0.86; 10.6 percent covariance)
ID-a. When employees need specific information, they know who will have it. (0.69)
KA-d. Management monitors important organizational performance variables. (0.69)
KA-q. Management proactively addresses problems. (0.60)
ID-g. Top management integrates information from different organizational areas. (0.58)
ID-c. Employees are keenly aware of where their knowledge can serve the company. (0.55)

Communication (alpha = 0.85; 9.5 percent covariance)
OM-f. Employees use electronic means to communicate. (0.84)
II-g. Employees have a large variety of communications tools (telephone, e-mail, Internet, and so on) from which to choose. (0.79)
II-d. Employees are encouraged to communicate clearly. (0.51)

Performance assessment (alpha = 0.76; 9.4 percent covariance)
KA-r. The company collects data on all facets of performance. (0.81)
OM-a. The company stores detailed information for guiding operations. (0.78)
OM-c. There is a formal data management function in the company. (0.63)
II-c. Management encourages the use of frameworks and models to assist in decision-making. (0.58)

Intellectual cultivation (alpha = 0.69; 8.8 percent covariance)
OM-g. The company develops experts from within. (0.68)
KA-j. Management learns from the company's partners (such as customers, suppliers, allies). (0.66)
ID-f. Management assigns employees to other parts of the organization for cross training. (0.61)
KA-s. Management learns new things about the company by direct observation. (0.51)

Environmental adaptability (alpha = 0.74; 8.1 percent covariance)
ID-e. Employees make extensive use of IS to support their work. (0.66)
OM-e. The company makes extensive use of electronic storage (such as databases, data warehousing, scanned documents). (0.65)
II-l. The company is slow to react to technological change. (–) (0.65)
OM-b. Employees retrieve archived information when making decisions. (0.60)

Social learning (alpha = 0.66; 8.1 percent covariance)
ID-d. Employees keep information (such as numbers, plans, ideas) from other employees. (–) (0.74)
II-k. Our employees resist changing to new ways of doing things. (–) (0.73)
KA-f. Employees learn about the company's recent developments through informal means (such as news stories and gossip). (–) (0.63)

Intellectual capital management (alpha = 0.52; 7.4 percent covariance)
KA-n. The company acquires subunits (such as organizations, functions, departments) based on short-term financial gain. (–) (0.68)
OM-d. The company maintains a certain mix of skills among its pool of employees. (0.60)
KA-m. The company hires highly specialized or knowledgeable personnel. (0.56)

Organizational grafting (alpha = 0.46; 6.5 percent covariance)
KA-k. Management ignores the strategies of competitors' top management. (–) (0.82)
KA-o. When internal capabilities are deficient, we acquire them from the outside. (0.56)

The eighth factor was organizational grafting, which accounted for 6.5 percent of total covariance. The two items contained in organizational grafting had loadings of 0.56 and 0.82. This construct represented the extent to which the organization capitalizes on the knowledge, practices, and internal capabilities of other organizations.

Known groups analysis is a method for investigating construct validity [33, 34] by testing for differences in scores between classes of respondents that are expected to differ. If significant differences occur as expected, known groups analysis can support the notion that the instrument has construct validity. Firm age and size are attributes known to differentiate respondents and influence OL scale or subscale scores [39, 119]. In the current study, known groups analysis was employed by calculating Pearson's correlation coefficient between the summed item scores for each OL dimension and firm age and size. Table 6 shows that the intellectual cultivation dimension is significantly correlated with both the age and size of the firm. The only other significant correlation is between intellectual capital management and size. This shows a partial association between OL and firm age and size among high tech and knowledge-based companies. These results suggest that propositions about the relationship between organizational learning and firm age and size have been generated by researchers whose conceptualizations of OL emphasize the organizational memory component. In addition, they suggest that dimensions heavily grounded in intellectual capital management are related to firm age and size, while the other OL dimensions are not. We conclude that the results of this test provide partial evidence that the derived measure exhibits construct validity.
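The known-groups correlations above are ordinary Pearson coefficients. A minimal sketch, with hypothetical dimension scores and firm ages standing in for the study's data:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation between two variables."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xd, yd = x - x.mean(), y - y.mean()
    return float((xd * yd).sum() / np.sqrt((xd ** 2).sum() * (yd ** 2).sum()))

# Illustrative: summed dimension scores against firm age (years).
scores = [18, 22, 15, 20, 17, 24]   # hypothetical summed item scores
ages = [30, 5, 40, 12, 25, 3]       # hypothetical firm ages
r = pearson_r(scores, ages)         # negative, as in Table 6's age column
```

A negative coefficient here would mirror the mostly negative age correlations reported in Table 6.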
The partial results are just as likely to result from the lack of collective experience in the field regarding the likely behavior of the OL construct in various organizational contexts [137]. The combination of the two methods, factor analysis and known groups analysis, provides empirical evidence that the OL measure exhibited acceptable construct validity (ISA-9). These two procedures were used sequentially in this research and, in combination with the Lawshe procedure, satisfied ISA-4, which calls for triangulation in validating the measure.

Tests of reliability, a further requirement for construct validity [100], satisfy ISA-8 and are used to assess the extent to which random error (that is, variation or unreliability) exists in an instrument. In this study, reliability was determined by calculating Cronbach's alpha for each of the factors [28, 80], as shown in Table 5. An alpha statistic of 0.5 to 0.6 is sufficient in exploratory research, but 0.8 is ultimately more desirable [101]. Only organizational grafting (alpha = 0.46) had an internal reliability score indicating a possible concern. The other seven factors exhibited alphas greater than 0.5, six were greater than 0.6, four were greater than 0.7, and two were greater than 0.8. Using a Cronbach's alpha of 0.7 as the optimum level that maximizes reliability and minimizes the number of items per dimension [123], the instrument is reasonably reliable.
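Cronbach's alpha for a dimension can be computed directly from the respondents-by-items score matrix. A minimal sketch (the function and the toy matrix are ours, for illustration):

```python
import numpy as np

def cronbach_alpha(item_scores):
    """item_scores: respondents x items array for one OL dimension.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]
    sum_item_var = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

# Perfectly consistent items yield the maximum alpha:
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # prints 1.0
```

Low alphas on short scales, such as the two-item organizational grafting dimension, are partly a mechanical consequence of the k/(k-1) term and the small item pool.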

Table 6. Correlations Between OL Dimensions, Age, and Size of Local Operations
(Cell entries are Pearson correlation (significance).)

OL dimension                         Age              Size
Awareness                            -0.12 (0.26)     -0.20 (0.06)
Communication                        -0.05 (0.63)     -0.05 (0.65)
Performance assessment                0.03 (0.75)      0.01 (0.94)
Intellectual cultivation             -0.22 (0.04*)    -0.36 (0.00**)
Environmental adaptability           -0.15 (0.17)      0.06 (0.54)
Social learning                      -0.18 (0.09)     -0.01 (0.91)
Intellectual capital management      -0.10 (0.35)     -0.31 (0.00**)
Organizational grafting              -0.06 (0.56)      0.04 (0.69)

Notes: ** Correlation is significant at the 0.01 level; * correlation is significant at the 0.05 level.

Stage III: Statistical Profile of OL

Stage III involved computing a statistical profile of the population of interest. As can be seen from the means in Table 7, all 28 items that made up the eight OL dimensions received average ratings greater than three, indicating that they were implemented to some extent within companies in the sample. The top three items all dealt with communications. The most implemented OL activity was employee communications tools (item II-g), with a mean rating of 4.64 on the five-point scale. This was followed closely by electronic means of communication (item OM-f) with a mean of 4.50, and encouragement of clear employee communication (item II-d) with a mean of 4.45. This finding was to be expected, given that the population of interest consisted of knowledge-based organizations, which typically emphasize communications.

Table 8 depicts normative statistics (dimension means and standard deviations) for five categories of the sample respondents: CEO, CIO, functional manager, project manager, and other. Among the four well-defined respondent categories, the CEOs perceived higher levels of OL in their companies than the other positions did. On the other hand, the CIOs rated their companies lower on OL than the other positions did. These statistics indicated that perceptions about the extent of OL implementation vary among management groups. Norming statistics were also computed for the industry groups, as reported in Table 9. According to the OL dimension means in Table 9, engineering and design firms exhibited slightly more OL activity than the others did. Although there were some differences on the individual OL dimensions, these industries were very similar in their overall implementation of OL.

Conclusions

MANAGERS OFTEN SEEK ALTERNATIVE organizational forms in order to meet ongoing environmental demands for change. Successful models, like the organiza-

Table 7. Statistical Profile from Final Instrument Administration
(Items are ordered by mean rating on the five-point scale; standard deviations in parentheses; (–) indicates a reverse-worded item.)

II-g. Employees have a large variety of communications tools (telephone, e-mail, Internet, and so on) from which to choose. 4.64 (0.77)
OM-f. Employees use electronic means to communicate. 4.50 (0.81)
II-d. Employees are encouraged to communicate clearly. 4.45 (0.77)
OM-g. The company develops experts from within. 4.20 (0.85)
OM-d. The company maintains a certain mix of skills among its pool of employees. 4.12 (0.95)
KA-m. The company hires highly specialized or knowledgeable personnel. 4.10 (1.15)
KA-j. Management learns from the company's partners (such as customers, suppliers, allies). 4.08 (0.88)
II-l. The company is slow to react to technological change. (–) 4.03 (1.07)
KA-q. Management proactively addresses problems. 4.02 (0.94)
ID-d. Employees keep information (such as numbers, plans, ideas) from other employees. (–) 3.99 (0.97)
OM-e. The company makes extensive use of electronic storage (such as databases, data warehousing, scanned documents). 3.97 (1.22)
KA-s. Management learns new things about the company by direct observation. 3.96 (0.85)
ID-a. When employees need specific information, they know who will have it. 3.91 (0.89)
KA-k. Management ignores the strategies of competitors' top management. 3.90 (1.12)
II-k. Our employees resist changing to new ways of doing things. 3.89 (1.12)
KA-d. Management monitors important organizational performance variables. 3.87 (0.99)
ID-c. Employees are keenly aware of where their knowledge can serve the company. 3.87 (0.95)
ID-g. Top management integrates information from different organizational areas. 3.86 (0.94)
KA-n. The company acquires subunits (such as organizations, functions, departments) based on short-term financial gain. 3.82 (1.19)
KA-f. Employees learn about the company's recent developments through informal means (such as news stories and gossip). 3.81 (1.12)
ID-e. Employees make extensive use of IS to support their work. 3.76 (1.18)
KA-o. When internal capabilities are deficient, we acquire them from the outside. 3.73 (1.11)
OM-a. The company stores detailed information for guiding operations. 3.72 (1.00)
OM-c. There is a formal data management function in the company. 3.67 (1.14)
OM-b. Employees retrieve archived information when making decisions. 3.55 (1.03)
ID-f. Management assigns employees to other parts of the organization for cross training. 3.51 (1.10)
II-c. Management encourages the use of frameworks and models to assist in decision-making. 3.47 (1.02)
KA-r. The company collects data on all facets of performance. 3.34 (1.11)

Table 8. Norms for OL Dimensions Based on Respondent Position
(Entries are dimension mean (standard deviation).)

CEO (n = 52): awareness 4.14 (0.49); communication 4.72 (0.34); performance assessment 3.71 (0.74); intellectual cultivation 4.17 (0.46); environmental adaptability 3.97 (0.71); social learning 3.97 (0.68); intellectual capital management 4.06 (0.57); organizational grafting 3.98 (0.85)

CIO (n = 12): awareness 3.50 (0.91); communication 3.94 (1.38); performance assessment 3.13 (1.05); intellectual cultivation 3.56 (0.64); environmental adaptability 3.15 (1.00); social learning 3.64 (1.04); intellectual capital management 3.78 (1.12); organizational grafting 3.29 (1.01)

Functional manager (n = 9): awareness 3.58 (0.86); communication 4.59 (0.68); performance assessment 3.03 (0.91); intellectual cultivation 3.72 (0.69); environmental adaptability 3.53 (1.03); social learning 3.85 (0.77); intellectual capital management 4.30 (0.56); organizational grafting 3.61 (1.11)

Project manager (n = 13): awareness 3.89 (0.46); communication 4.79 (0.22); performance assessment 3.63 (0.86); intellectual cultivation 3.71 (0.72); environmental adaptability 3.79 (0.87); social learning 3.92 (0.75); intellectual capital management 4.03 (0.44); organizational grafting 3.62 (0.71)

Other (n = 33): awareness 3.78 (0.99); communication 4.33 (0.71); performance assessment 3.57 (0.73); intellectual cultivation 3.86 (0.82); environmental adaptability 3.95 (0.84); social learning 3.87 (1.02); intellectual capital management 3.95 (1.06); organizational grafting 3.88 (0.88)

Total (n = 119): awareness 3.90 (0.76); communication 4.53 (0.69); performance assessment 3.55 (0.81); intellectual cultivation 3.94 (0.67); environmental adaptability 3.83 (0.85); social learning 3.90 (0.83); intellectual capital management 4.01 (0.79); organizational grafting 3.82 (0.90)

Table 9. Norms for OL Dimensions Based on Industry Classification
(Entries are dimension mean (standard deviation).)

IT (n = 28): awareness 3.99 (0.68); communication 4.65 (0.48); performance assessment 3.64 (0.94); intellectual cultivation 3.92 (0.61); environmental adaptability 4.00 (0.60); social learning 3.82 (0.74); intellectual capital management 4.15 (0.53); organizational grafting 3.91 (0.86)

Engineering and design (n = 13): awareness 4.02 (0.44); communication 4.79 (0.29); performance assessment 3.60 (0.75); intellectual cultivation 3.98 (0.59); environmental adaptability 4.08 (0.62); social learning 4.13 (0.59); intellectual capital management 4.26 (0.47); organizational grafting 3.96 (0.99)

Knowledge-based applications (n = 19): awareness 3.66 (0.80); communication 4.40 (0.90); performance assessment 3.64 (0.88); intellectual cultivation 3.99 (0.64); environmental adaptability 3.33 (1.12); social learning 3.72 (1.03); intellectual capital management 3.89 (0.83); organizational grafting 3.50 (1.07)

Research (n = 13): awareness 4.05 (0.63); communication 4.77 (0.32); performance assessment 3.42 (0.68); intellectual cultivation 3.96 (0.55); environmental adaptability 3.96 (0.71); social learning 4.00 (0.75); intellectual capital management 4.33 (0.43); organizational grafting 3.58 (0.79)

Other (n = 46): awareness 3.88 (0.88); communication 4.37 (0.80); performance assessment 3.48 (0.78); intellectual cultivation 3.91 (0.77); environmental adaptability 3.83 (0.88); social learning 3.92 (0.88); intellectual capital management 3.82 (0.98); organizational grafting 3.91 (0.84)

Total (n = 119): awareness 3.90 (0.76); communication 4.53 (0.69); performance assessment 3.55 (0.81); intellectual cultivation 3.94 (0.67); environmental adaptability 3.83 (0.85); social learning 3.90 (0.83); intellectual capital management 4.01 (0.79); organizational grafting 3.82 (0.90)

tional learning paradigm, can help firms assimilate new technologies, achieve competitive advantage, and process knowledge better in the pursuit of ongoing realignment in today’s high tech, competitive environments. The iterative methodology employed in this research integrated two instrument development frameworks used in prior organizational studies [85, 90]. Thorough content and construct validity assessments and reliability tests were performed in order to enhance the internal and external validity of the resultant measure. All but two of Malhotra and Grover’s [90] ISAs were applied in this research methodology, thus assuring that adequate rigor was present in developing a quality measure of organizational learning. This research offers three contributions to the existing body of knowledge about the organizational learning concept: (1) a conceptual definition, (2) an empirically reliable and valid measure, and (3) norms for benchmarking. These contributions, particularly the OL measure, are important for facilitating the assessment of OL in organizations and enabling future empirical research on OL. The first contribution of this study is a consensus definition of the OL concept. Three views of OL were apparent in the literature: the demographic, social action, and outcome perspectives. The social action view of OL was explored further due to its implications for explaining organizational phenomena and was adopted in this study. The following social action definition of the OL construct resulted from this analysis: OL is the set of actions (knowledge acquisition, information distribution, information interpretation, and organizational memory) within the organization that intentionally and unintentionally influence positive organizational change. The second contribution of this study is an empirically derived measure of OL that exhibited acceptable levels of validity and reliability. 
These results indicated that OL is a multidimensional construct consisting of eight distinct components. The eight underlying dimensions of OL were determined by applying factor analysis to survey data from a high-tech community:

• the extent to which organizational members are aware of the sources of key organizational information and its applicability to existing problem areas (awareness);
• the extent of communication that exists between organizational members (communication);
• the comparison of process- and outcome-related performance to organizational goals (performance assessment);
• the development of experience, expertise, and skill among existing employees (intellectual cultivation);
• technology-related items pertaining to organizational responses to environmental change (environmental adaptability);
• the extent to which organizational members learn through social channels about organizational concerns (social learning);
• the extent to which the organization manages knowledge, skill, and other intellectual capital for long-term strategic gain (intellectual capital management); and
• the extent to which the organization capitalizes on the knowledge, practices, and internal capabilities of other organizations (organizational grafting).
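The extraction procedure above can be illustrated on synthetic data. The sketch below is not the authors' analysis (their sample, item pool, and software are not reproduced here); it only shows, on invented Likert-style responses, how a varimax-rotated factor analysis yields an items-by-factors loading matrix from which dimensions like the eight above would be read off.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Hypothetical data: 300 respondents answering 31 items, driven by
# 8 latent dimensions (mirroring the structure reported in the paper).
n_respondents, n_items, n_factors = 300, 31, 8
latent = rng.normal(size=(n_respondents, n_factors))
true_loadings = rng.normal(scale=0.8, size=(n_factors, n_items))
responses = latent @ true_loadings + rng.normal(scale=0.5, size=(n_respondents, n_items))

# Varimax-rotated factor analysis, as is conventional in survey research.
fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
fa.fit(responses)

# Rows of components_ are rotated factors; transposed, each row of
# `loadings` gives one item's loading on each of the 8 factors. Items
# with absolute loadings above a cutoff (e.g., 0.4) define a dimension.
loadings = fa.components_.T
print(loadings.shape)  # (31, 8)
```

In practice one would also inspect eigenvalues (or a scree plot) to justify retaining eight factors rather than fixing the number in advance.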


The third contribution of this study is the establishment of norming data: means for the items in the OL construct, and dimension means based on the position of the respondent and industry classification. These data may be used to benchmark organizational assessment results obtained with the OL instrument. Organizations scoring above these standards may be considered learning organizations. Organizations scoring below these levels might want to dedicate more resources to the areas indicated by the OL dimensions.

The tenets of OL offer rich insights into how MIS researchers and practitioners may inquire into improving the field. Indications from preliminary research show that OL can greatly enhance traditional MIS functions such as IT development, deployment, support, and training. In many innovative ways, MIS can stimulate and facilitate all three change-relevant modes of learning: single-loop, double-loop, and deutero learning. In addition, several important research topics could be studied at the convergence of the fields of OL and MIS, such as the nature of technological innovations and the adoption of knowledge management within organizations.
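As a minimal sketch of such benchmarking, the snippet below scores one respondent's dimension means against norm values. The item groupings and norm means shown are invented placeholders for illustration, not the paper's published norms.

```python
import numpy as np

# Hypothetical item-to-dimension groupings and norm means (invented).
dimension_items = {
    "awareness": [17, 19],
    "communication": [11, 12, 20],
}
norm_means = {"awareness": 3.6, "communication": 3.9}

# One respondent's answers on the 1-5 scale, keyed by item number.
responses = {11: 4, 12: 5, 17: 3, 19: 4, 20: 4}

scores = {}
for dim, items in dimension_items.items():
    scores[dim] = np.mean([responses[i] for i in items])
    verdict = "above" if scores[dim] > norm_means[dim] else "at or below"
    print(f"{dim}: {scores[dim]:.2f} ({verdict} the norm of {norm_means[dim]})")
```

A real application would aggregate multiple respondents per organization and use the published norms stratified by respondent position and industry.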

Limitations and Opportunities for Future Research

The limitations of the study include the nature of the OL discipline and methodological issues. Regarding the discipline, one problem is that OL theory is still emerging in a widespread effort to conceptually explain its structure and function [94]. Although a concentrated effort on providing acceptable definitions, measures, and methods is paramount to advancing the field, we should expect findings to have a relatively short life span. This research contributes to the goal of advancing OL theory by providing an acceptable measurement instrument at a time of unprecedented levels of theoretical and empirical inquiry into the construct [36]. This research represents the first generation of attempts at measuring the OL construct, which will serve as the foundation for subsequent advancements made by the multiple disciplines attempting to take the field toward the "normal science" state.

Another problem with advancing the construct is its complexity. It consists of varying perspectives that derive from a multitude of disciplines. We focus on the social action perspective of OL, at the expense of excluding others. For instance, we have defined OL as those actions that precede and influence positive organizational change. Therefore, the resulting measure does not assess change outcomes, an aspect of many OL definitions. Subsequent research should examine positive organizational changes that are intended and unintended consequences of OL as measured here. Only then will researchers be able to discover how specific organizational changes can be attributed to specific OL behaviors.

In addition, there is a myriad of paradigms used in conceptualizing how organizations are structured and how they work. The research reported herein is based on the view of organizations as a collection of individuals serving as agents that act on behalf of the interests of the firm.
Adherence to other, widely diverging conceptualizations of organizations, such as the "collective of communalities" design articulated by Brown and Duguid [22, p. 54], may render the findings of this study less generalizable.


Regarding methodology, this research is limited by the use of top managers as proxies for organizational members engaged in collective action. However, we justified the use of top managers as proxy respondents by citing the commonality [90] and economy [73] of this technique. This issue highlights the numerous complexities associated with acquiring data on multilevel constructs such as OL [94]. Another potential methodological limitation is the disproportionate number of CEOs in the sample. Yet the dispersion of respondents was largely incidental to the goal of reaching a senior local executive respondent, a priority for obtaining the most knowledgeable proxy available for questioning. In addition, the collection of data in a confined geographical area (Huntsville, Alabama) limits the generalizability of the study. Finally, although only one factor was below acceptable levels for exploratory research, the reliability scores of some factors (social learning, intellectual capital management, and organizational grafting) indicated a need for follow-up research.

The measurement instrument developed in this study for OL should be considered a first iteration and needs to undergo further empirical testing in order to improve its efficacy in organizational studies. Based on the eight factors extracted from the sample data in this study, new items should be derived from the literature and tested in the presence of the items promoted in this research. It would be especially relevant to generate new items within the definitional meaning of the underlying constructs containing a small number (two to three) of items. Finally, the addition of new dimensions to the OL instrument should be contemplated in future research, based on evolving notions of the concept.
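Reliability scores of this kind are conventionally computed as Cronbach's alpha. As a toy illustration (invented scores, not the study's sample), alpha for a small item set follows directly from the item variances and the variance of the total score:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    k = items.shape[1]
    sum_item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - sum_item_vars / total_var)

# Toy data: five respondents answering a three-item factor consistently.
scores = np.array([
    [4, 5, 4],
    [2, 2, 3],
    [5, 5, 5],
    [3, 3, 4],
    [1, 2, 1],
])
alpha = cronbach_alpha(scores)
print(round(alpha, 2))  # → 0.96
```

This also makes concrete why the two- and three-item factors flagged above are fragile: with small k, alpha is sharply limited by the number of items, which is one reason the text recommends generating additional items for those constructs.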
In addition, the instrument could be used in a longitudinal study to investigate differences in levels of OL over time, between industries, between sectors (private and public), and between organizational subunits (R&D, operations, finance, and so on). Further, examining the relationship between OL and its proposed precursors, contexts, and consequences [135] would contribute greatly to the current body of knowledge on OL.

The most important contribution of this research is the potential for establishing a quantitative appraisal of the OL construct. In this vein, it would be appropriate to determine the relationship between OL and organizational effectiveness and other outcome measures. The link between OL and organizational sustainability and prosperity has been commonly suggested, and can be inferred from its popularity in established academic journals in a broad range of reference disciplines. This relationship can be empirically tested using the wealth of objective financial data on corporations provided in the Securities and Exchange Commission's (SEC) EDGAR database. Researchers should test the relationship between OL and quality-based measures such as time to market, total cycle time, defects per unit, and technology transfer rates. Finally, researchers should test the relationship between OL and measures of success related to knowledge management concerns such as creativity, innovativeness, and strategic planning and decision-making success.

Given these potential areas of inquiry, it is easy to gauge the potential impact of this research on the economic progress of modern organizations and societies. This project contributes to the cumulative tradition and provides the basis and direction for future research on the OL construct. In addition, the instrument developed


in this study may be employed as a diagnostic tool to determine the success of OL implementation in practice. Understanding how information technology can support OL will be of paramount importance in designing effective organizational structures and cultures for the future. Given the complexity of the OL construct, the nature of OL documented in this research will provide tangible benefits in these endeavors.

Acknowledgments: An earlier version of this paper was presented at the Annual Meeting of the Decision Sciences Institute on November 21, 2000. The authors thank the following for assisting in the early development of the research instrument: Karen Ayas, Art Bedeian, Terry Byrd, Lt. Col. Chester Carter III, Richard Daft, Jim Davis, Bill Deery, Anthony DiBella, Nancy Dixon, Frazier Douglass, Hubert Feild, Nelson Ford, David Garvin, Stan Harris, Robert Hirschfield, Dorothy Leonard, Bryan Lukas, Michael Marquardt, David Nye, Paul Nystrom, Susan Owen, Carl Pegals, George Roth, Stan Slater, John Slocum, Ray Stata, Gerardo Ungson, Andy Van de Ven, Curtis Ventriss, and Robert Zmud.

REFERENCES

1. Alavi, M. Computer-mediated collaborative learning: An empirical evaluation. MIS Quarterly, 18, 2 (1994), 159–174.
2. Alavi, M.; Wheeler, B.C.; and Valacich, J.S. Using IT to reengineer business education: An exploratory investigation of collaborative telelearning. MIS Quarterly, 19, 3 (1995), 293–312.
3. Albert, S. The algebra of change. In B.M. Staw and L. Cummings (eds.), Research in Organizational Behavior. Greenwich, CT: JAI Press, 1992, pp. 179–229.
4. Allen, M.J., and Yen, W.M. Introduction to Measurement Theory. Monterey, CA: Brooks/Cole, 1979.
5. Argyris, C., and Schön, D.A. Organizational Learning: A Theory of Action Perspective. Boston: Addison-Wesley, 1978.
6. Argyris, C., and Schön, D.A. Organizational Learning II. Boston: Addison-Wesley, 1996.
7. Bagozzi, R.P., and Baumgartner, H. The evaluation of structural equation models and hypothesis testing. In R.P. Bagozzi (ed.), Principles of Marketing Research. Oxford, UK: Blackwell, 1994, pp. 386–422.
8. Bahlmann, T. The learning organization in a turbulent environment. Human Systems Management, 9, 4 (1990), 249–256.
9. Bandura, A. Social Learning Theory. Upper Saddle River, NJ: Prentice Hall, 1977.
10. Barnett, C.K. Organizational learning theories: A review and synthesis of the literature. Unpublished manuscript, University of New Hampshire, Durham, 2001.
11. Barnsley, J.; Lemieux-Charles, L.; and McKinney, M.M. Integrating learning into integrated delivery systems. Health Care Management Review, 23, 1 (1998), 18–28.
12. Bateson, G. Steps to an Ecology of Mind: Collected Essays in Anthropology, Psychiatry, Evolution, and Epistemology. London: Intertext Books, 1972.
13. Bechtold, B.L. Evolving to organizational learning. Hospital Materiel Management Quarterly, 21, 3 (2000), 11–25.
14. Bedeian, A.G. Contemporary challenges in the study of organizations. Journal of Management, 12, 2 (1986), 185–201.
15. Bell, M., and Scott-Kemmis, D. The mythology of learning-by-doing in World War II airframe and ship production. Science Policy Research Unit, University of Sussex, 1990.
16. Berg, D. Expanding perceptions, possibilities and profits. Journal for Quality and Participation, 16, 7 (1993), 6–10.
17. Bernstein, I.H. Applied Multivariate Analysis. New York: Springer-Verlag, 1988.
18. Bouwen, R., and Fry, R. Organizational innovation and learning: Four patterns of dialog between the dominant logic and the new logic. International Studies of Management and Organizations, 21, 4 (1991), 37–51.


19. Bowman, E.H., and Hurry, D. Strategy through the option lens: An integrated view of resource investments and the incremental-choice process. Academy of Management Review, 18, 4 (1993), 760–782.
20. Bowonder, B., and Miyake, T. Innovations and strategic management: A case study of Hitachi Ltd. Technology Analysis and Strategic Management, 6, 1 (1994), 55–81.
21. Brown, A.D., and Starkey, K. Organizational identity and learning: A psychodynamic perspective. Academy of Management Review, 25, 1 (2000), 102–120.
22. Brown, J.S., and Duguid, P. Organizational learning and communities-of-practice: Toward a unified view of working, learning, and innovation. Organization Science, 2, 1 (1991), 40–57.
23. Budd, R.; Thorp, R.; and Donohew, L. Content Analysis of Communications. New York: Macmillan, 1967.
24. Carmines, E.G., and Zeller, R.A. Reliability and Validity Assessment. Thousand Oaks, CA: Sage, 1979.
25. Carney, T.F. Content Analysis. London: B.T. Batsford, 1972.
26. Cavaleri, S.A. "Soft" systems thinking: A pre-condition for organizational learning. Human Systems Management, 13, 4 (1994), 259–267.
27. Chalofsky, N.E. A new paradigm for learning in organizations. Human Resource Development Quarterly, 7, 3 (1996), 287–293.
28. Churchill, G.A., Jr. A paradigm for developing better measures of marketing constructs. Journal of Marketing Research, 16, 1 (February 1979), 64–73.
29. Cohen, M.D., and Sproull, L.S. (eds.). Organizational Learning. Thousand Oaks, CA: Sage, 1995.
30. Cohen, W., and Levinthal, D. Absorptive capacity: A new perspective on learning and innovation. Administrative Science Quarterly, 35, 1 (1990), 128–152.
31. Cook, S.D.N., and Yanow, D. Culture and organizational learning. Journal of Management Inquiry, 2, 4 (1993), 373–390.
32. Corsini, R. Concise Encyclopedia of Psychology. New York: Wiley, 1987.
33. Cronbach, L.J. Test validation. In R.L. Thorndike (ed.), Educational Measurement, 2d ed. Washington, DC: American Council on Education, 1971, pp. 443–507.
34. Cronbach, L.J., and Meehl, P.E. Construct validity in psychological tests. Psychological Bulletin, 52, 4 (1955), 281–302.
35. Cross, R., and Baird, L. Technology is not enough: Improving performance by building organizational memory. Sloan Management Review, 41, 3 (2000), 69–78.
36. Crossan, M.M. An organizational learning framework: From intuition to institution. Academy of Management Review, 24, 3 (1999), 522–537.
37. Cyert, R.M., and March, J.G. A Behavioral Theory of the Firm. Upper Saddle River, NJ: Prentice Hall, 1963.
38. Daft, R.L., and Huber, G.P. How organizations learn: A communications framework. In N. Ditomaso and S.B. Bacharach (eds.), Research in the Sociology of Organizations. Greenwich, CT: JAI Press, 1987, pp. 1–36.
39. Daft, R.L., and Weick, K.E. Toward a model of organizations as interpretation systems. Academy of Management Review, 9, 2 (1984), 284–295.
40. De Geus, A.P. Planning as learning. Harvard Business Review, 66, 3 (1988), 70–74.
41. Deming, W.E. Out of the Crisis. Cambridge, MA: MIT Press, 1986.
42. DiBella, A.J.; Nevis, E.C.; and Gould, J.M. Understanding organizational learning capability. Journal of Management Studies, 33, 3 (1996), 361–379.
43. Dixon, N.M. Organizational learning: A review of the literature with implications for HRD professionals. Human Resource Development Quarterly, 3, 1 (1992), 29–49.
44. Dodgson, M. Organizational learning: A review of some literatures. Organization Studies, 14, 3 (1993), 375–394.
45. Dowd, S.B. Organizational learning and the learning organization in health care. Hospital Materiel Management Quarterly, 12, 3 (2000), 1–3.
46. Drucker, P.F. The coming of the new organization. Harvard Business Review, 66, 1 (1988), 45–53.
47. Dutton, J.E., and Dukerich, J.M. Keeping an eye on the mirror: Image and identity in organizational adaptation. Academy of Management Journal, 34, 3 (1991), 517–554.


48. Engeström, Y. Innovative learning in work teams: Analyzing cycles of knowledge creation in practice. In Y. Engeström, R. Miettinen, and R.-L. Punamaki (eds.), Perspectives in Activity Theory. Cambridge: Cambridge University Press, 1999, pp. 377–404.
49. Fayol, H. General and Industrial Management, C. Storrs, trans. London: Pitman, 1949.
50. Fiol, C.M., and Lyles, M.A. Organizational learning. Academy of Management Review, 10, 4 (1985), 803–813.
51. Fisher, S.R., and White, M.A. Downsizing in a learning organization: Are there hidden costs? Academy of Management Review, 25, 1 (2000), 244–251.
52. Foy, N. The Yin and Yang of Organizations. New York: Morrow, 1980.
53. Friedlander, F. Patterns of individual and organizational learning. In S. Shrivastiva and Associates (eds.), The Executive Mind: New Insights on Managerial Thought and Action. San Francisco: Jossey-Bass, 1983, pp. 192–220.
54. Gardner, H. Frames of Mind. New York: BasicBooks, 1983.
55. Garvin, D. Building learning organizations. Harvard Business Review, 71, 7 (July–August 1993), 78–91.
56. Gioia, D.A., and Thomas, J.B. Identity, image, and issue interpretation: Sensemaking during strategic change in academia. Administrative Science Quarterly, 41, 3 (1996), 370–403.
57. Goh, S.C., and Richards, G. Benchmarking the learning capability of organizations. European Management Journal, 15, 5 (1997), 575–583.
58. Goldhar, J.D., and Lei, D. Variety is free: Manufacturing in the twenty-first century. Academy of Management Executive, 9, 4 (1995), 73–86.
59. Goodman, P.S., and Darr, E.D. Computer-aided systems and communities: Mechanisms for organizational learning in distributed environments. MIS Quarterly, 22, 4 (1998), 417–440.
60. Hair, J.F., Jr.; Anderson, R.E.; Tatham, R.L.; and Black, W.C. Multivariate Data Analysis with Readings, 4th ed. Upper Saddle River, NJ: Prentice Hall, 1995.
61. Hammer, M., and Champy, J. Reengineering the Corporation. New York: Harper Collins, 1993.
62. Hannabuss, S. Learning and information. Information and Library Manager, 3, 4 (1984), 38–45.
63. Harman, H.H. Modern Factor Analysis. Chicago: University of Chicago Press, 1976.
64. Haynes, R. SIIA names top 25 metro areas for software employment. Software and Information Industry Association (SIIA) press release, June 6, 2000 (www.siia.net/sharedcontent/press/2000/6-6-00.html).
65. Hedberg, B. How organizations learn and unlearn. In P.C. Nystrom and W.H. Starbuck (eds.), Handbook of Organizational Design. London: Oxford University Press, 1981, pp. 8–27.
66. Hedberg, B.; Nystrom, P.; and Starbuck, W.H. Camping on seesaws: Prescriptions for a self-designing organization. Administrative Science Quarterly, 21, 1 (1976), 41–65.
67. Herbert, I. Knowledge is a noun, learning is a verb. Management Accounting, 78, 2 (2000), 68–69.
68. Hines, M.J., and Goul, M. The design, development, and validation of a knowledge-based organizational learning support system. Journal of Management Information Systems, 15, 2 (Fall 1998), 119–152.
69. Hobday, M. Telecommunications in Developing Countries: The Challenge from Brazil. London: Routledge, 1990.
70. Huber, G.P. The nature and design of post-industrial organizations. Management Science, 30, 8 (1984), 928–951.
71. Huber, G.P. Organizational learning: The contributing processes and the literatures. Organization Science, 2, 1 (1991), 88–115.
72. Kachigan, S.K. Multivariate Statistical Analysis. New York: Radius Press, 1982.
73. Kerlinger, F.N., and Lee, H.B. Foundations of Behavioral Research, 4th ed. New York: Harcourt College Publishers, 1999.
74. Kiernan, J.M. The new strategic architecture: Learning to compete in the twenty-first century. Academy of Management Executive, 7, 1 (1993), 7–21.
75. Kim, J.O., and Mueller, C.W. Introduction to Factor Analysis. Thousand Oaks, CA: Sage, 1982.


76. Kuchinke, K.P. Managing learning for performance. Human Resource Development Quarterly, 6, 3 (1995), 307–316.
77. Kuhn, T.S. The Structure of Scientific Revolutions. Chicago: University of Chicago Press, 1962.
78. Larsen, T.J. Middle managers' contribution to implemented information technology innovation. Journal of Management Information Systems, 10, 2 (Fall 1993), 155–176.
79. Lawshe, C.H. A quantitative approach to content validity. Personnel Psychology, 28, 4 (1975), 563–575.
80. Lederer, A.L., and Sethi, V. Root causes of strategic information systems planning implementation problems. Journal of Management Information Systems, 9, 1 (Summer 1992), 25–45.
81. Lee, S.; Courtney, J.F., Jr.; and O'Keefe, R.M. A system for organizational learning using cognitive maps. OMEGA International Journal of Management Science, 20, 1 (1992), 23–36.
82. Leonard-Barton, D. The factory as a learning laboratory. Sloan Management Review, 34, 1 (1992), 23–38.
83. Levinthal, D.A., and March, J.G. The myopia of learning. Strategic Management Journal, 14, Special issue (Winter 1993), 95–112.
84. Levitt, B., and March, J.G. Organizational learning. In M.D. Cohen and L.S. Sproull (eds.), Organizational Learning. Thousand Oaks, CA: Sage, 1995, pp. 516–540.
85. Lewis, B.R. The information resource management concept: Domain, measurement and implementation status. Ph.D. dissertation, Auburn University, 1993.
86. Lewis, B.R.; Snyder, C.A.; and Rainer, R.K., Jr. An empirical assessment of the information resource management construct. Journal of Management Information Systems, 12, 1 (Summer 1995), 199–223.
87. Lukas, B.A.; Tomas, G.; Hult, M.; and Ferrell, O.C. A theoretical perspective of the antecedents and consequences of organizational learning in marketing channels. Journal of Business Research, 36, 3 (1996), 233–244.
88. Lyles, M.A., and Schwenk, C.R. Top management, strategy and organizational knowledge structures. Journal of Management Studies, 29, 2 (1992), 153–174.
89. Mahoney, J.T. The management of resources and the resource of management. Journal of Business Research, 33, 2 (1995), 91–101.
90. Malhotra, M.K., and Grover, V. An assessment of survey research in POM: From constructs to theory. Journal of Operations Management, 16, 4 (1998), 403–423.
91. March, J. Exploration and exploitation in organizational learning. Organization Science, 2, 1 (1991), 71–87.
92. McGill, M.; Slocum, J., Jr.; and Lei, D. Management practices in learning organizations. Organizational Dynamics, 21, 1 (1992), 67–79.
93. Miller, D. A preliminary typology of organizational learning: Synthesizing the literature. Journal of Management, 22, 3 (1996), 485–505.
94. Morgeson, F.P. The structure and function of collective constructs: Implications for multilevel research and theory development. Academy of Management Review, 24, 2 (1999), 249–265.
95. Nelson, R.R. Educational needs as perceived by IS and end-user personnel: A survey of knowledge and skill requirements. MIS Quarterly, 15, 4 (1991), 503–525.
96. Nevis, E.C.; DiBella, A.J.; and Gould, J.M. Understanding organizations as learning systems. Sloan Management Review, 36, 2 (1995), 73–85.
97. Nicolini, D., and Meznar, A. The social construction of organizational learning: Conceptual and practical issues in the field. Human Relations, 48, 7 (1995), 727–746.
98. Nonaka, I. The knowledge-creating company. Harvard Business Review, 69, 6 (1991), 96–104.
99. Nonaka, I. A dynamic theory of organizational knowledge creation. Organization Science, 5, 1 (1994), 14–37.
100. Nunnally, J.C. Psychometric Theory. New York: McGraw-Hill, 1967.
101. Nunnally, J.C. Psychometric Theory, 2d ed. New York: McGraw-Hill, 1978.
102. Overall, J.E., and Klett, C.J. Applied Multivariate Analysis. New York: McGraw-Hill, 1972.


103. Pavlov, I.P. Conditioned Reflexes. New York: Oxford University Press, 1927.
104. Poell, R.F.; Chivers, G.E.; Van der Krogt, F.J.; and Wildemeersch, D.A. Learning-network theory. Management Learning, 31, 1 (2000), 25–49.
105. Premkumar, G.; Ramamurthy, K.; and Nilakanta, S. Implementation of electronic data interchange: An innovation diffusion perspective. Journal of Management Information Systems, 11, 2 (Fall 1994), 157–186.
106. Pucik, V. Strategic alliances with the Japanese: Implications for human resource management. In F. Contractor and P. Lorange (eds.), Cooperative Strategies in International Business. Lexington, MA: Lexington Books, 1988, pp. 487–498.
107. Pugh, D.S., and Hickson, D.J. Writers on Organizations. Thousand Oaks, CA: Sage, 1989.
108. Rainer, R.K., Jr., and Harrison, A.W. Toward development of the end user computing construct in a university setting. Decision Sciences, 24, 6 (1993), 1187–1202.
109. Rogers, A. A new brand of tech cities. Newsweek, 137, 18 (April 30, 2001), 44–51.
110. Rothwell, S. Managing organizational learning. Manager Update, 4, 3 (1993), 221–232.
111. Sackmann, S.A. Cultural Knowledge in Organizations. Thousand Oaks, CA: Sage, 1991.
112. Schein, E.H. Culture: The missing concept in organization studies. Administrative Science Quarterly, 41, 2 (1996), 229–240.
113. Schein, E.H. Three cultures of management: The key to organizational learning. Sloan Management Review, 38, 1 (1996), 9–20.
114. Senge, P. The Fifth Discipline: The Art and Practice of the Learning Organization. New York: Doubleday/Currency, 1990.
115. Senge, P., and Sterman, J. Systems thinking and organizational learning: Acting locally and thinking globally in the organization of the future. European Journal of Operational Research, 59, 1 (1993), 137–150.
116. Sethi, V., and King, W.R. Construct measurement in information systems research: An illustration in strategic systems. Decision Sciences, 22, 4 (1991), 455–472.
117. Sethi, V., and King, W.R. Development of measures to assess the extent to which an information technology application provides competitive advantage. Management Science, 40, 12 (1994), 1601–1627.
118. Shrivastava, P. A typology of organizational learning systems. Journal of Management Studies, 20, 1 (1983), 7–28.
119. Sinkula, J.M. Market information processing and organizational learning. Journal of Marketing, 58, 1 (1994), 35–45.
120. Skinner, B.F. The Behavior of Organisms. Upper Saddle River, NJ: Prentice Hall, 1938.
121. Slater, S.F., and Narver, J.C. Does competitive environment moderate the market orientation-performance relationship? Journal of Marketing, 58, 1 (1995), 46–55.
122. Smith, A. An Inquiry into the Nature and Causes of the Wealth of Nations. London: W. Strahan and T. Cadell, 1776.
123. Spector, P.E. Summated Rating Scale Construction: An Introduction. Thousand Oaks, CA: Sage, 1992.
124. Spender, J.-C. Industry Recipes: An Inquiry into the Nature and Sources of Managerial Judgment. Oxford, UK: Basil Blackwell, 1989.
125. Sproull, L.S. Beliefs in organizations. In P.C. Nystrom and W.H. Starbuck (eds.), Handbook of Organizational Design, vol. 2. New York: Oxford University Press, 1981, pp. 167–202.
126. Stata, R. Organizational learning: The key to management innovation. Sloan Management Review, 30, 3 (1989), 63–74.
127. Stein, E.W., and Vandenbosch, B. Organizational learning during advanced system development: Opportunities and obstacles. Journal of Management Information Systems, 13, 2 (Fall 1996), 115–136.
128. Stein, E.W., and Zwass, V. Actualizing organizational memory with information systems. Information Systems Research, 6, 2 (1995), 85–117.
129. Stone, E. Research Methods in Organizational Behavior. Santa Monica, CA: Goodyear Publishing, 1978.
130. Straub, D.W. Validating instruments in IS research. MIS Quarterly, 13, 1 (1989), 147–169.


131. Taylor, F.W. The Principles of Scientific Management. New York: Harper Brothers, 1911.
132. Templeton, G.F. Review: Defining organizational learning—A focus on the social action perspective. Submitted to MIS Quarterly, 2002.
133. Templeton, G., and Snyder, C.A. Toward a method for providing database structures derived from an ontological specification process: The example of knowledge management. In A. Abecker, S. Decker, K. Kinkelmann, and U. Reimer (eds.), Proceedings of the Workshop "Knowledge-Based Systems for Knowledge Management in Enterprises." Kaiserslautern, Germany: DFKI GmbH, 1997, pp. 121–131.
134. Templeton, G., and Snyder, C. A model of organizational learning based on control. International Journal of Technology Management, 18, 5–8 (1999), 705–719.
135. Templeton, G.F., and Snyder, C.A. Precursors, contexts, and consequences of organizational learning. International Journal of Technology Management, 20, 5–8 (2000), 765–781.
136. Thorndike, E.L. Animal Intelligence. New York: Macmillan, 1911.
137. Tsang, E.W.K., and Kwan, K.-M. Replication and theory development in organizational science: A critical realist perspective. Academy of Management Review, 24, 4 (1999), 759–780.
138. Vandenbosch, B., and Higgins, C.A. Executive support systems and learning: A model and empirical test. Journal of Management Information Systems, 12, 2 (Fall 1995), 99–130.
139. Van de Ven, A.H. Central problems in the management of innovation. Management Science, 32, 5 (1986), 590–607.
140. Ventriss, C. Organizational theory and structure: An analysis of three perspectives. International Journal of Public Administration, 13, 6 (1990), 777–798.
141. Ventriss, C., and Luke, J. Organizational learning and public policy. American Review of Public Administration, 18, 4 (1988), 346–347.
142. Vessey, I., and Conger, S.A. Requirements specification: Learning object, process, and data methodologies. Communications of the ACM, 37, 5 (1994), 102–113.
143. Walsh, J.P., and Ungson, G.R. Organizational memory. Academy of Management Review, 16, 1 (1991), 57–91.
144. Watkins, K., and Marsick, V. Sculpting the Learning Organization: Lessons in the Art and Science of Systematic Change. San Francisco: Jossey-Bass, 1993.
145. Watkins, K., and Marsick, V. The case for learning. In E.F. Holton III (ed.), Proceedings of the 1995 Academy of Human Resource Development Annual Conference. Baton Rouge: Academy of Human Resource Development, 1995, pp. 1–7.
146. Watson, J.B. Psychology as the behaviorist views it. Psychological Review, 20 (1913), 158–177.
147. Watson, J.B. Behaviorism. Chicago: University of Chicago Press, 1924.
148. Weber, M. Bureaucracy. In S.A. Theodoulou and M.A. Cahn (eds.), Public Policy: The Essential Readings. Upper Saddle River, NJ: Prentice Hall, 1995, pp. 259–265.
149. Wright, T.P. Factors affecting the cost of airplanes. Journal of the Aeronautical Sciences, 3, 4 (1936), 122–128.


Appendix: Final Questionnaire Version

Instructions: The following questions pertain to your company's local operations, employees, and management. Please respond to each question using the following scale:

1 = Strongly Disagree
2 = Moderately Disagree
3 = Undecided
4 = Moderately Agree
5 = Strongly Agree

The following questions relate to your company's local operations:

1. The company develops experts from within.
2. The company stores detailed information for guiding operations.
3. There is a formal data management function in the company.
4. The company is slow to react to technological change.
5. The company maintains a certain mix of skills among its pool of employees.
6. The company hires highly specialized or knowledgeable personnel.
7. The company makes extensive use of electronic storage (such as databases, data warehousing, scanned documents).
8. The company collects data on all facets of performance.
9. The company acquires subunits (such as organizations, functions, departments) based on short-term financial gain.
10. When internal capabilities are deficient, we acquire them from the outside.

The following questions relate to your company's local employees:

11. Employees use electronic means to communicate.
12. Employees have a large variety of communications tools (telephone, e-mail, Internet, and so on) from which to choose.

DEVELOPMENT OF A MEASURE FOR THE ORGANIZATIONAL LEARNING CONSTRUCT

13. Our employees resist changing to new ways of doing things.
14. Employees learn about the company's recent developments through informal means (such as news stories and gossip).
15. Employees retrieve archived information when making decisions.
16. Employees make extensive use of IS to support their work.
17. Employees are keenly aware of where their knowledge can serve the company.
18. Employees keep information (such as numbers, plans, ideas) away from other employees.
19. When employees need specific information, they know who will have it.
20. Employees are encouraged to communicate clearly.

The following questions relate to your company's local management:

21. Management proactively addresses problems.
22. Management monitors important organizational performance variables.
23. Management removes obsolete information from employee access.
24. Management assigns employees to other parts of the organization for cross training.
25. Top management integrates information from different organizational areas.
26. Management learns from the company's partners (such as customers, suppliers, allies).
27. Management ignores the strategies of competitors' top management.
28. Management learns new things about the company by direct observation.
29. Management encourages the use of frameworks and models to assist in decision-making.
30. Management uses feedback from company experiments (such as surveys and trials of new methods).
31. Employees are discouraged from recommending new work ideas.
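Responses to the 31 Likert items above can be aggregated by section (operations, items 1-10; employees, items 11-20; management, items 21-31). The sketch below shows one plausible scoring scheme; note that the authors' actual scoring procedure is not given in this appendix, and the reverse-coding of negatively worded items (here assumed to be 4, 13, 18, 27, and 31) is an assumption for illustration only.

```python
# Hypothetical scoring sketch for the 31 Likert items (1-5 scale).
# Section groupings follow the questionnaire; which items are
# reverse-coded is an assumption, not stated in the appendix.

SECTIONS = {
    "operations": range(1, 11),   # items 1-10
    "employees":  range(11, 21),  # items 11-20
    "management": range(21, 32),  # items 21-31
}
REVERSED = {4, 13, 18, 27, 31}  # assumed negatively worded items

def score(responses):
    """responses: dict mapping item number (1-31) to a rating in 1..5.
    Returns the mean score per section after reverse-coding."""
    adjusted = {i: (6 - r if i in REVERSED else r)
                for i, r in responses.items()}
    return {name: sum(adjusted[i] for i in items) / len(items)
            for name, items in SECTIONS.items()}

# Example: a respondent who answers 4 (Moderately Agree) on every item.
print(score({i: 4 for i in range(1, 32)}))
```

Reverse-coding maps a rating r to 6 - r, so agreement with a negatively worded item (e.g., "Our employees resist changing to new ways of doing things") lowers rather than raises the section score.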



32. Which of the following best describes your job position (check one)?
_ Chief Executive Officer (CEO)
_ Chief Information Officer (CIO)
_ Vice President of Information Technology
_ Technology Director
_ Data Center Director
_ Project Manager
_ Other ____________________________

33. Number of years you have worked in this company ______________________
34. Number of years worked in your current position in this company __________
35. Number of employees in your local company operations _________________
36. Age (in years) of your local company operations ________________________

To receive the results of this study, please write your name and the appropriate contact information below:
__________________________________________________________________
