Journal of Computer Information Systems

ISSN: 0887-4417 (Print) 2380-2057 (Online) Journal homepage: http://www.tandfonline.com/loi/ucis20

Developing and Assessing ERP Competencies: Basic and Complex Knowledge

Patrick Charland, Pierre-Majorique Léger, Timothy Paul Cronan & Jacques Robert

To cite this article: Patrick Charland, Pierre-Majorique Léger, Timothy Paul Cronan & Jacques Robert (2016) Developing and Assessing ERP Competencies: Basic and Complex Knowledge, Journal of Computer Information Systems, 56:1, 31-39

To link to this article: http://dx.doi.org/10.1080/08874417.2015.11645798

Published online: 07 Dec 2015.


DEVELOPING AND ASSESSING ERP COMPETENCIES: BASIC AND COMPLEX KNOWLEDGE PATRICK CHARLAND Université du Québec à Montréal Montréal (Québec), Canada

PIERRE-MAJORIQUE LÉGER HEC Montréal Montréal (Québec), Canada

TIMOTHY PAUL CRONAN University of Arkansas Fayetteville, Arkansas 72707, USA

JACQUES ROBERT HEC Montréal Montréal (Québec), Canada

Downloaded by [Université du Québec à Montréal] at 05:22 11 March 2016

ABSTRACT

This research studies the influence of individual knowledge mastery on the competency task performance of Enterprise Resource Planning (ERP) learners. The research design involved the assessment of participants' ERP competency; each participant played four games of a computer-based simulation, ERPsim. ERP knowledge was assessed using a validated questionnaire that included questions of different complexity levels. Results indicate that although reported student grade point average is not a predictor of ERP competency, ERP knowledge mastery (particularly complex knowledge) does predict ERP competency; mastering basic ERP knowledge alone does not. These results can provide useful guidelines with respect to teaching and assessment practices, as well as the development of ERP curricula. To effectively prepare learners to perform in authentic learning contexts, instructors could emphasize the mastery of complex knowledge and consequently use complex knowledge test questions as a component of the instruction.

Keywords: ERP Competencies, ERP Knowledge, Competency Assessment, End-User Training, Pedagogical Issues.

INTRODUCTION AND PURPOSE

The purpose of this article is to present new evidence identifying how knowledge influences end-user competency. Specifically, the present paper focuses on a competency related to the use of a large-scale information system (IS) application: enterprise resource planning (ERP) systems. ERP systems are comprehensive, usually company-wide business software packages that companies use to integrate the management of their finance and operations [48]. These systems integrate all the business processes of the company. Currently, most large organizations have adopted ERP software to manage their operations, and smaller organizations are increasingly adopting them [12].
Training future business graduates to understand how to use, and how to make business decisions using, ERP systems is argued to be critically important in today's economy [9]. The IS literature calls on educators to foster the learning of these systems by future managers in higher education, to adequately prepare them for the marketplace [29]. Therefore, the purpose of this research is to investigate specifically the influence and applicability of basic and complex knowledge in the development of competent ERP end-users.

This paper first reviews the education and information systems literature with respect to the notions of competency and knowledge. Next, the method section presents a training simulation called ERPsim that is used to test the effect of trainees' knowledge on their competency in managing a simulated company using an ERP system. The results of this study highlight the critical role of complex knowledge in predicting ERP competency. The paper concludes by discussing the impact of those results on the development of ERP curricula.

LITERATURE REVIEW

The Emergence of a Competency-Based Approach for Teaching Complexity and for Employability

Challenges in university graduates' employability have made it necessary to update higher education teaching strategies and to better align industry needs with classroom teaching [59]. Consequently, higher education has entered a period of transformation that involves the implementation of new instructional approaches designed mainly to better prepare students to put acquired knowledge into action. Today, the "(...) basic objectives of higher education consist of providing university graduates with a system of essential knowledge, abilities, and skills, and also of developing their capacity and readiness to put their knowledge to work in professional activity" [53, p.70]. Additionally, the intent of these changes is to provide students with a better understanding of complexity, meaning "thinking about multiple interdependent levels, nonlinear causality, and emergence" [46, p.1023]. To achieve the goal of teaching complex knowledge as well as the ability to put knowledge into practice, the competency-based approach has emerged as the new educational paradigm [15, 53, 55]. This emergent approach presents a consequent challenge for professors regarding how to teach and how to assess competencies [28, 52, 56, 65]. In fact, "(...) the development currently labeled as 'competency based learning' have been growing in momentum over the last thirty years. It has its origins in profound concerns that traditional education programs were failing to address the needs of both learners and industry" [54, pp.187-188]. Educational trend researchers call for student-centered and competency-driven programs [27].
A fundamental debate is currently in process, focused on the assessment of competency¹ as well as on new educational approaches regarding the role of knowledge mastery [24, 30]. Given that, Brinke, Sluijsmans, and Jochems [5, p.107] indicate that "competency-based university education demands a renewed vision of assessment". According to Wang and Haggerty [62], there is a lack of quantitative empirical studies that establish the impact of knowledge on one's competency. As Phelps et al. [56, p.67] point out, "new discourses in educational theory and practice, which are founded on non-linear approaches to learning and teaching, provide added impetus to engage in the competency/capability debate, and re-examine our approaches to computer education". Because of the importance of competency in end-user training in information technology, a clear understanding of its role is needed. Answers to these questions will address the "need for more sophisticated studies and more systematic approach to the evaluation of competency-based training" [14, p.36].

¹ In the absence of a generally agreed definition of competence or competency, in this article, we use "competency" with respect to the naming of "Competency-based training" or "Competency-based learning" approaches.


Competency-Based Training, Learning, and Knowledge

The earliest competency-based training (CBT) literature emerged in the 1920s in the United States [19, 61]; at that time, there was a perceived requirement to meet the ever-changing needs of industry and the new economy [14]. CBT theory originates from the behaviorist movement, since the approach focuses on observable behaviors and an expected result [56], which is performance. Jonnaert [25] has produced one of the most intelligible conceptual clarifications of the concept of competency. He claims that confusion in the definition of competency comes mostly from the different ways competency is used in four domains: linguistics, cognitive psychology, education, and work sciences. Competency for linguists is an individual potential not yet demonstrated, clearly different from performance, which has a social component. This distinction between competency and performance is generally accepted by cognitive psychologists, who share the view that competency is latent or virtual, whereas performance is effective and observable. Thus, many authors merge and mix the competency and performance concepts. They clearly define a competency as a contextualized, situated, and 'hands-on' concept. Unlike qualification, competency necessarily includes evaluation and has a synthetic rather than analytic approach [38]. Stated differently, the concept of qualification comes from a more Taylorian conception of professions; work sciences have since moved to the concept of competency, whose aim is to reflect the new complexity of occupational situations. In the field of education, competency refers mostly to the effective mobilization of an individual's resources in a given situation. Resources refer here to personal knowledge, skills, and attitudes. Moreover, work sciences specialists have developed their own appropriation of competency, inspired by the former concept of qualification, a static enumeration of the qualities required to successfully realize a task. In brief, a competency represents the way individuals manage their cognitive and social resources in action, resulting in a certain level of performance.

TABLE 1. Revised Cognitive Domain: Bloom's Taxonomy of Learning Objectives (Adapted from [31])

Learning Objective | Description of Learning Process | Assessment
1. Remembering | Student recalls and recognizes information | Answering direct questions/tests
2. Understanding | Student changes information into a different symbolic form | Ability to act on or process information by restating it in his or her own terms
3. Applying | Student discovers relationships, generalizations, and skills | Application of knowledge to simulated problems
4. Analyzing | Student solves problems in light of conscious knowledge of relationships between components and the principle that organizes the system | Identification of critical assumptions, alternatives, and constraints in a problem situation
5. Evaluating | Student develops the ability to create standards of judgment, weigh, and analyze | Logical consistency and attention to detail
6. Creating | Student goes beyond what is known, providing new insights | Solution of a problem that requires original, creative thinking

Role of Knowledge in Competency-Based Training

To test a theory of competency development, the concept of knowledge and its typology should be clarified. To this end, "conceiving of knowledge as a randomly arranged store cupboard full of facts completely violates what we know about the structure of the human mind" [65, p.41]. The widely accepted revised Bloom's taxonomy of learning objectives (Table 1) has been used by researchers investigating knowledge and learning outcomes. Also inspired by the work of Bloom, the present study considers basic knowledge as the first level of the taxonomy; that is, the recognition of information. Complex knowledge is viewed as the second, third, and fourth levels of Bloom's taxonomy; that is, restating information (comprehension), applying knowledge to a problem (application), and identifying constraints within a problem (analysis). In a conceptual synthesis of recently reformed educational systems curricula, Jonnaert et al. [28] observe that knowledge is mostly defined as a "personal resource," while the concept of competency is mainly understood as a "capacity to mobilize and use effectively a whole of resources" (p.281), including knowledge. From that perspective, competency and knowledge appear to be hierarchically related, suggesting that knowledge mastery is essential to competency development. For instance, Basselier, Horner Reich, and Benbasat [2] argue for the essential effect of explicit and tacit knowledge in developing a manager's competency in IS. Importantly, much evidence supports this epistemological posture. Findings from cognitive psychology suggest that learners need basic knowledge to develop higher-order thinking skills [10, 13, 21, 22, 52, 64]. Hirsch [20, p.144] asserts that "there is a great deal of evidence, indeed a consensus in cognitive psychology, that people who are able to think independently about unfamiliar problems and who are broad-gauged problem solvers (...) are, without exception, well-informed people." This suggests that domain-specific information, called knowledge in the present study, is necessary for higher-order skills, such as competency. From a global perspective, Hirsch [20] explains that basic knowledge acts as a matrix to orient learners in their mental space. He adds that even though it is evident that useful background knowledge changes over time, there is currently no reasonable argument for the widespread assertion that basic knowledge is changing at such a rapid pace that it is useless to learn it. In short, "higher-order skills are invariably and necessarily conjoined with a great deal of relevant, domain-specific information. Hence, there is no way to gain the skills without gaining the associated information" [21, p.8].


ERP Competency

Of recent, business school enrollment is growing as businesses desire and need knowledgeable, competent employees. The central objective of business school education is to teach application and decision-making knowledge, or "the acquisition of the art of utilization of knowledge" [63]. Everwijn et al. [13] indicate that Whitehead's definition addresses the primary mission of business schools as well as that of other professional development programs. They state that a business school's mission is "how to make certain that knowledge acquired gets transformed into ability to apply" (p. 425).

With regard to the role of IS in business education, end-user training (EUT) and competency-based training (CBT) have also been the focus of much research. Gupta, Bostrom, and Huber [16] point out that businesses are spending increasing amounts on training end-users. They suggest that this is largely due to the critical role of IS and technology in today's business environment, the ever-growing pace of change in technology, and the dynamic nature of business. Consequently, training methods are changing as well, but with little research to support the methods used. A scientific literature analysis in business education brings forward the multiplicity of concepts referring to an IS competency. Table 2 presents key articles which help to define IS competency. These definitions are in relation to a general definition of a competency, presented previously by Jonnaert [25]: efficient resource mobilization in context.

TABLE 2. Summary of IT competency in the IS literature (Adapted from Wang & Haggerty [62])

Construct | Article | Definition
User competence | Munro et al. [49] | The end user tools, skills and knowledge that an individual possesses and can bring to bear on his/her job
User competence | Marcolin, Compeau, Munro, & Huff [47] | The user's potential to apply technology to its fullest possible extent so as to maximize performance of specific job tasks
IS competence | Bassellier et al. [2] | The set of IS-related explicit and tacit knowledge that a business manager possesses that enables him or her to exhibit IS leadership in his or her area of business
End-user competence | Blili, Raymond, & Rivard [3] | The ability to adopt and utilize end-user computing

In terms of lead studies in the IS field, Seetharamju [66] explored learners' perceived knowledge gain after incorporating ERP (based on SAP) instructional strategies into a business curriculum (readings, examples, exercises, and classroom lectures). His analysis revealed that students perceived that they had gained a significantly high level of knowledge (for example, implementation and SAP software skills). Basselier, et al. [2] suggest the components of both explicit and tacit knowledge are necessary for a manager to be competent in IT. They define IT competency as a set of IT-related explicit and tacit knowledge that enables one to act proactively in order to successfully perform the job. They also suggest that explicit IT knowledge covers knowledge of technologies, applications, system development, management of IT, and access to IT knowledge. Tacit knowledge includes the experience of the manager (personal use of computers, IT project experience, and management of IT) as well as cognition (process adaptation and vision about the role of IT in the organization).

Other studies have focused on knowledge as a central aspect of systems integration. Kang and Santhanam [29] suggested the consideration of IT knowledge as a continuum. Focusing on collaborative applications such as an ERP system application, they suggest that a productive IT user should have knowledge of the following dimensions: (1) technical IT knowledge related to the commands and tools embedded in the IS applications, (2) business IT knowledge related to the contextual knowledge covered by the use of the IS application, and (3) collaborative IT knowledge related to the understanding of tasks interdependent with those of other users of the IS application. The latter dimension differentiates this particular framework from other previous contributions and more appropriately fits the context of an integrated enterprise system. While much of the current research has relied on attitudes, perceptions, and self-reports to assess learning, the present study reports objective measures and compares these results to traditional affective measures of learning.

Initially based on items used by Seethamraju [66] and Kang and Santhanam's [29] categories of knowledge, Cronan et al. [8, 9] further developed and expanded tools to measure ERP learning. In this study, ERP learning factors (using sixteen items) measured enterprise systems management knowledge, business process knowledge, and SAP transaction skills. Enterprise Systems (ES) Management Knowledge is defined as the extent to which an individual understands the impact of an ERP (as well as the integrated information it provides) on the organization as a whole, including impacts on organizational structures and responsibilities, business processes, reporting, control (or insurance), and decision making. ES management knowledge reflects the individual's knowledge of how enterprise management utilizes an ERP and how the use of ERP affects the enterprise. Business Process Knowledge (BP) is the extent to which an individual has a general understanding of business terminology, key operations processes, and their interrelatedness. BP knowledge includes understanding the delineation of key business activities within and between functional areas such as financial accounting, procurement, manufacturing, and sales. SAP Transaction Skills represent the extent to which an individual has the information systems user skills required to utilize the SAP application to perform transactions supporting business operations as well as to set up and understand the associated master data.

The next section describes the general research design and measures developed to assess learners' knowledge and competency. Results and implications will then be presented and discussed.

METHOD OF ANALYSIS

ERPsim: A Computer-Based Simulation Used as a Research Tool to Assess a Learner's ERP Competency

The movement from more traditional training methods (such as lecture, lab exercises, and assignments) toward active learning in higher education, including competency-based training, has become a part of one instructional approach that has gained acceptance in IS, namely simulation-based training [39]. In order to apply a more active learning approach in IS teaching practices


and end-user training, as well as facilitate the assessment of learners' IS competency, a computer-based business simulation, ERPsim, was used (developed and validated in previous research activities [40, 41, 42, 43, 44]). Within a simulation experience, learners are placed in a situation whereby they make decisions and manage the operations of a fictitious company using an ERP software system. The simulation closely resembles the authentic nature of decisions required to operate and manage an on-going business [41]. Specifically, participants must monitor the profitability of their business and make tactical decisions by using information from an ERP system, namely SAP. The business simulation is enabled via a simulation engine called ERPsim; more information on the simulation is provided in Léger et al. [41]. ERPsim development began in 2004 and, as of 2014, consists of more than 630 classes and 136,500 lines of Java code. To date, ERPsim has been used as a pedagogical approach by 832 professors, lecturers, and professional trainers in over 377 universities worldwide. Between September 2009 and February 2014, more than 20,000 simulation games, with an average of 6 teams (4 students per team) per game, were played by students registered at universities that are members of the SAP University Alliance.

While other simulation games allow participants to take a strategic view of an enterprise [e.g. 37], the close coupling with real ERP software technology allows participants to apply their competency in an authentic context. More specifically, in the simulation, participants must interact directly with the ERP software itself as they would in an authentic business context; while participating in the simulation, all decisions must be entered by participants directly into the ERP system itself.
This simulation allows participants to learn, over a period of time, about the outcome and resultant consequences of their decisions and their effects on the company overall. Reports available within the ERP system are used to ensure the profitability of their subsequent operations. Participants must analyze their data and then use the system to make new decisions (price level, product quantity, etc.) regarding the next period. Previous studies have shown that profitable teams in the ERPsim game are more likely to have implemented more efficient communication strategies and better collaborative decision-making processes [6]. Therefore, the simulation provides an authentic, meaningful experience whereby learners are able to apply their competency. This creates a unique learning environment where participants can learn about the nature of integrated business systems and business processes using a hands-on approach. Research studying newly hired graduates indicates competency acquired in the simulation is transferable into the marketplace [9].

Research Participants and Research Design

Participants in this research are IS-major senior undergraduate students at universities that are members of the SAP University Alliance who had previously participated in the classroom ERP simulation game. An invitation to participate was sent to students who asked to be contacted for participation in this research protocol. One hundred twenty (120) business school students, organized in forty (40) teams of 3 participants, registered for this experiment. In this experiment, each team had to manage a fictitious company as part of the ERPsim business simulation. Teams were required to produce up to 3 products and competed with automated robots managed by the simulator. The scenario of the simulation research was the same for all teams; stated differently, every team

encountered the same market conditions for decision-making requirements. In addition, each team member had a specific role within the team: planner, scheduler, or seller. The planner was responsible for planning, for executing the material requirements planning, and for purchasing the raw materials needed. The scheduler was responsible for scheduling the production shop floor to manufacture the finished products from the raw materials. Finally, the seller set the price of each of the available finished products based on available stock as well as market demand. Details regarding these roles can be obtained in Léger et al. [41]. Prior to the experiment, participants had to consent to participate in the research and answered a background survey (demographic questions, prior learning experience with ERP systems) used to eventually establish an initial competency of each team member. As seen in Figure 1, each team participated in a total of four short business simulation games during one experimental session (30 minutes each). After initial instructions, teams had 10 minutes to prepare before each session.

FIGURE 1. Research design

Variables Used in the Study

ERP Competency (inferred from financial performance in the game)

In the context of this study, computer-based simulation has the benefit of quantifying performance in the experimental task. The profit generated in a simulation game is used as a direct proxy for competency. This is based on the assumption that participants who are better at solving the business problem (using their competency) should achieve better profit. Previous research suggests that the profit in this game is tied to other important IS concepts such as cognitive absorption and, more specifically, perceived control [42]. It should be noted that, in the context of this paper, each team is playing against the exact same simulated market (as opposed to one another) to allow for between-team comparisons.

During each game, usage and profitability data from the ERP system were collected. All the transactions performed by users are recorded for audit purposes (a component of the ERP system); it is therefore possible to map each subject to their specific role (seller, planner, or scheduler). Because the simulation is based on a competitive game, the financial profitability of each team can be precisely tracked and collected. As a consequence, we are able to measure each team member's role-based contribution to profitability.

The contribution to profit of the seller is best measured by the difference between the total selling price and the total standard accounting cost of the finished products sold during the game. If nothing is sold, then the contribution of the seller is zero. Adjustments are made to account for stock outs, as these are not a direct consequence of the seller's decisions. Therefore, if one or more finished products were stocked out, then the seller's contribution was adjusted.
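As a concrete illustration, the seller's base contribution just described (total selling price minus total standard accounting cost of the units sold, and zero when nothing is sold) can be sketched as a small function. This is a minimal sketch, not the authors' actual computation: the function name, data layout, and example figures are hypothetical, and the stock-out adjustment defined in the text is omitted here.

```python
def seller_contribution(units_sold, unit_prices, unit_std_costs):
    """Base contribution of the seller: total selling price minus the total
    standard accounting cost of the finished products sold during the game.
    With no sales, both totals are zero, so the contribution is zero."""
    revenue = sum(q * p for q, p in zip(units_sold, unit_prices))
    std_cost = sum(q * c for q, c in zip(units_sold, unit_std_costs))
    return revenue - std_cost

# Hypothetical example: three finished products, one of which did not sell.
contrib = seller_contribution(
    units_sold=[100, 50, 0],
    unit_prices=[4.0, 5.5, 6.0],
    unit_std_costs=[3.0, 4.0, 5.0],
)
# (100*4.0 + 50*5.5) - (100*3.0 + 50*4.0) = 675.0 - 500.0 = 175.0
```

The stock-out adjustment that the text goes on to define (the gap between average daily contribution with and without stock) would then be applied to this base figure.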
The size of the adjustment was equal to the difference between the average daily contribution of the sellers when there is stock and the average contribution when one or more finished products are stocked out. The contribution to profit of the scheduler responsible for production is the difference between the standard accounting cost of the finished products and the cost of raw materials. Furthermore, the stock out adjustment


calculated for the seller is subtracted from that of the scheduler, as the latter is partly responsible for these situations. Similarly, adjustments were made to account for missing raw materials, as these do not depend on the scheduler's task. Adjustments calculated for the scheduler were subtracted from the planner's contribution, as the planner is responsible for providing the raw materials. The size of the adjustment was equal to the difference between the average daily contribution of the sellers and the scheduler (sales revenues minus the cost of raw materials) when there is raw material stock and their average contributions when one or more raw materials are stocked out. If there are raw material stocks, the planner's score is zero. If there are stock outs, the planner is penalized by the average loss incurred to the others.

ERP Knowledge

Anderson and Lawton [1, p.204] point out that: "(…) to determine the student's level of knowledge, we should employ a test that measures their level of knowledge. However, if we are interested in whether students are performing well at a higher level learning, then tests of knowledge and comprehension alone are inadequate, and we need to use tests or other direct measures of learning that require analysis, synthesis, and evaluation."

A database of questions was developed to measure an individual user's ERP knowledge with respect to SAP. Sixty (60) objective questions were developed by a panel of 4 experts in the simulation game to cover various degrees of complexity. As discussed previously, "simple" questions were designed to measure basic knowledge as presented in Bloom's taxonomy, whereas "complex" questions were designed to measure the comprehension, application, and analysis learning objectives (see Table 1).
Hence, question objectives vary from the subject being able to recognize information (basic knowledge), to being able to process information by restating it in their own terms (comprehension), to applying knowledge to a problem (application), to identifying constraints in a problem (analysis). Generally, "simple/basic" questions correspond to Basselier et al.'s [2] explicit knowledge and to Cronan et al.'s [8, 9] business process knowledge and SAP transaction skills components. "Complex" questions generally correspond to Basselier et al.'s [2] tacit knowledge and to Cronan et al.'s [8, 9] enterprise systems management knowledge.

Over 100 faculty members were contacted and were asked 1) to initially assess the complexity of each question and 2) to classify questions (via their correct or incorrect responses) as basic or complex. First, faculty were asked to answer the objective questions and rate the perceived complexity of each question on a scale of 1 to 5. These faculty respondents were members of the SAP University Alliance. Of the 100 faculty contacted, fifty-nine (59) answered at least one of the sixty (60) questions presented. Each question was answered by at least twenty (20) faculty members. It should be noted that the faculty invited to participate have all been trained in the use of SAP and of the ERPsim software, and have previously used the ERPsim software as part of their respective classes. Hence, the simulation, the technology used to play it, and the business context of the game itself offer a common point of reference for all experts. Consequently, the questions submitted to the faculty refer to knowledge required by players in the game. Questions were displayed to the faculty experts in a random order. The objective questions that were answered correctly by at least 60% of the faculty were retained for the study.
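The retention rule just described, together with the median-complexity split and percentage scoring that the following paragraphs detail, amounts to a short filtering-and-scoring pipeline. The sketch below is an illustration under assumed data structures; the function names, dictionary fields, and any example data are hypothetical, not taken from the study.

```python
from statistics import median

def select_and_segment(questions, retain_threshold=0.60):
    """Keep questions answered correctly by at least 60% of faculty experts,
    then split the retained ones into 'basic' vs 'complex' at the median of
    their perceived-complexity ratings (scale: 1 = simple, 5 = complex)."""
    retained = [q for q in questions if q["expert_accuracy"] >= retain_threshold]
    cutoff = median(q["complexity"] for q in retained)
    basic = [q for q in retained if q["complexity"] <= cutoff]
    complex_ = [q for q in retained if q["complexity"] > cutoff]
    return basic, complex_

def knowledge_scores(answers, basic, complex_):
    """Per-participant scores: percentage of correct answers on each subset.
    `answers` maps a question id to True (correct) or False (incorrect)."""
    def pct(subset):
        return 100.0 * sum(answers[q["id"]] for q in subset) / len(subset)
    return pct(basic), pct(complex_)
```

In the study itself, this process retained 30 of the 60 candidate questions (20 basic, 10 complex), which all 120 participants then answered before the experimental task.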
Hence, if a question was incorrectly answered by more than 40% of the faculty, that question was considered to be ambiguous and was removed from the analysis. As an outcome of this process, thirty (30) questions were retained and used to test the basic or complex knowledge of each team member involved in this experimentation. Table 3 presents

the summary statistics of the perceived complexity (by the respondents) of the resulting questions. The median perceived complexity score (3.7) of the question was used to segment the question. As a result, twenty (20) questions were classified as basic ERP knowledge and ten (10) questions address complex ERP knowledge. TABLE 3. Expert Survey : Summary Statistics of the Perceived Complexity of Selected ERP Knowledge Questions (60 questions) Mean 3.57 Standard deviation 0.60 Median 3.62 Min 1.50 Max 4.66 n= 59 experts (using a scale of 1 – simple to 5 - complex). As a part of the experiment, all 120 participants answered the 30 ERP knowledge questions in random order before the experimental task. The knowledge scores for each participant were calculated as the percentage of valid answers on the 30 questions. Therefore, ERP complex knowledge corresponds to the percentage of valid answers on the 10 complex questions and ERP basic knowledge to the same percentage on the 20 basic questions. Grade Point Average (GPA) To assess academic performance and achievement of university students, the grade point average (GPA) is recognized by many as a robust indicator [18], perhaps even a predictor for eventual academic success [50]. Kuncel, Credé, and Thomas [33, p. 63] remind that: “high school GPA is one of the best predictors of college grades [57, 64], and college GPA is a robust predictor of performance in graduate school [36], pharmacy programs [34], business school [35], and law school [45].” To control for the participants general knowledge in the field of IS and business, we have used a self-reported GPA as a standardized measure to control for the individual effect in our model, presented in the next section. Model Development To discriminate individual student performance from team performance, for single game versus the four games played one after the others, research models were used to evaluate individual competency. 
These models assume that the measured competency of a member in a team depends on a series of observable characteristics, GPA, Basic Knowledge and Complex Knowledge, as well as some unobserved abilities. Using the data from the first game, we estimate the following Model 1.

Residuals of Model 1 above provide a proxy for the unobserved abilities that remain uncorrelated to the observable characteristics. This then is used to predict the measured competency in subsequent games (t=2, 3, and 4).
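A minimal sketch of this two-step estimation on simulated data (all coefficients, sample sizes, and noise levels below are illustrative assumptions, and plain OLS stands in for the paper's STATA estimation) might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
n, later_games = 120, 3  # 120 participants; games 2-4 give a 360-row panel

# Simulated observable characteristics (illustrative values only)
gpa = rng.normal(3.5, 0.6, n)
basic = rng.uniform(0.3, 0.9, n)      # share of basic questions correct
complex_k = rng.uniform(0.2, 0.8, n)  # share of complex questions correct
ability = rng.normal(0.0, 1.0, n)     # unobserved abilities

# Model 1: Competency_1 = b0 + b1*GPA + b2*Basic + b3*Complex + e_1
X = np.column_stack([np.ones(n), gpa, basic, complex_k])
comp_game1 = 0.4 * complex_k + 0.5 * ability + rng.normal(0, 0.1, n)
b1_hat, *_ = np.linalg.lstsq(X, comp_game1, rcond=None)

# The Model 1 residual proxies the unobserved abilities; by construction,
# OLS residuals are orthogonal to (uncorrelated with) the observables.
residual = comp_game1 - X @ b1_hat

# Model 2: competency in games 2-4 regressed on the observables plus the
# game-1 residual (pooled OLS here; the paper used random-effects XTreg).
Z = np.tile(np.column_stack([X, residual]), (later_games, 1))
y = Z @ np.array([0.2, 0.0, 0.0, 0.44, 0.54]) \
    + rng.normal(0, 0.3, n * later_games)
b2_hat, *_ = np.linalg.lstsq(Z, y, rcond=None)
```

Because the game-1 residual is orthogonal to the regressors, it can add explanatory power in Model 2 without being collinear with GPA or the knowledge scores.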

RESULTS

Appendix A presents descriptive statistics for the variables of this study. It is noted that the unobserved abilities (ε̂) were estimated using 120 valid observations (i.e., data from one game played by 40 teams with 3 participants). As previously mentioned, the unobserved abilities (the value of ε̂) correspond to the residual of the regression of Model 1. The mean and standard deviation of the unobserved abilities are calculated using the residual of Model 1 in game 1. In addition, Appendix A contains the correlations between the variables used in this study. It is important to observe that no correlation value is above the intercorrelation threshold of 0.7 [17].

A repeated-measures data set with 360 valid observations was used to estimate Model 2, since the lagged variable is not available for the first of the four games. Data analysis used STATA/SE 10.1 with the XTreg command for estimation. XTreg is used with longitudinal or panel data; it fits cross-sectional time-series (or panel data) regression models with random effects. Further information about this analysis procedure can be found in [60, p.1691]. Table 4 presents the model estimation for Model 2.

TABLE 4. The Effect of Basic and Complex ERP Knowledge on ERP Competency

Model 2 (ERP Competency)
                          Coefficient   S.E.    Sig.
(Constant)                 0.200        0.283   0.120
GPA                       -0.074        0.072   0.077
ERP Knowledge-Basic       -0.259        0.269   0.084
ERP Knowledge-Complex      0.442        0.278   0.028*
Residual (ε̂ t=1)           0.538        0.045   0.000*
R2                         0.3038
Wald chi2                154.95                 0.0000*
* p < 0.05. DV: Competency; panels, games 2 to 4; 120 subjects x 3 games; n = 360. Game 1 data is used to calculate the residual for the Game 2 estimation. Repeated measures (STATA XTreg); directional hypotheses (one-tail p values).

DISCUSSION

The objective of this study was to investigate the influence of individual knowledge mastery on the competency task performance of Enterprise Resource Planning (ERP) learners. Our findings highlight the key role of complex knowledge in the competency of the participants. The regression model (Table 4) indicates that ERP complex knowledge is a significant predictor of competency (β = 0.44, p < 0.05). However, we observe that ERP basic knowledge does not have a significant effect on competency. In other words, participants who had a higher level of complex knowledge linked to the performed tasks achieved a better performance and were judged to be more competent in the simulation they had to handle. We are reminded that Anderson and Lawton [1] insist that a simulation is an ineffective pedagogy for teaching terminology, factual knowledge, and basic notions. In this experiment, all participants had previously attended an ERP class; this basic knowledge was already acquired. The results indicate that once basic knowledge is acquired, its mastery is not sufficient to predict performance: to perform well in this simulation, the user must mobilize this previously acquired basic knowledge. This last result refers directly to the definitions of a competency discussed earlier. A competency was generally understood as a "capacity to mobilize and use effectively a whole of resources" [28, p.281] or by the idea [2] of using knowledge to act successfully to perform the job. As observed in our results, mastering simple knowledge is not necessarily linked to one's capacity of mobilization, usage, or performance in a complex situation, such as the ERPsim game.

Second, it appears that the self-declared GPA does not have a significant effect on the performance of the subjects in the simulation. As shown in Table 4, students with a high GPA did not perform significantly better. It is noted that GPA is considered a robust indicator and predictor of success and achievement. While most higher education institutions select student candidates using this metric, employers also widely bring a GPA criterion into their recruitment processes. According to the National Association of Colleges and Employers [51], 76.3% of the 244 employers responding to the survey reported plans to screen candidates by their GPA. Our results are consistent with studies criticizing candidate selection based only on a GPA criterion [11, 32]. Most of the studies using GPA as a predictor of academic success were performed in traditional/magisterial academic environments. Orienting and restructuring college and university programs around competency-based approaches, for a better transition into employment situations, could mean questioning the candidate selection process. It could also mean questioning the recruitment policies of the employers who expressed the need for a better alignment between industry needs and classroom teaching, as mentioned in the introduction [59].

We also find that the unobserved learning and subsequent abilities of the initial game remain uncorrelated to the observable characteristics but remain good predictors of subsequent game outcomes. Specifically, our results reveal the significant impact of the residual: unobserved abilities estimated from Game 1 data (residual ε̂ t=1) are strong predictors of the participants' competency (β = 0.538, p < 0.000). While this research design does not allow precise detection of the most influential factor in the residual, this effect could imply that ERP competency is a function of the ability of the learner to transfer (or carry forward) learning from one situation to the next. This finding relates to the concepts of knowledge or competency transfer. From an individual perspective, knowledge transfer could be defined as knowledge acquired in one situation being applied to another [58]. As a corollary, Hong, Horng, Lin, and ChanLin [23] insist that competencies can also be transferred to similar (yet different) contexts to solve similar problems. Future empirical research should be designed to specifically study the influence of knowledge/competency as a part of the residual factor.

Taken together, our results provide useful guidelines for teaching and assessment practices. To effectively train or prepare students (and end users) to perform better, instructors could prepare them using complex knowledge/learning, i.e., high-level questions in Bloom's taxonomy. In return, with the perspective of assessing a student's or user's competency, results from a test using complex questions seem to be an interesting indicator. However, we must bear in mind that the research model explains 30% of the variance of the competency in the experimentation. The results cannot be interpreted as a direct recommendation to abandon in-situation assessment of competency in favor of developing tests with complex questions. As Jonnaert et al. [26, p.8] write, "a competency can only be developed in a situation, according to available resources, under the basis of the performed actions realized."

CONCLUSION

The objective of this research was to use empirical data to study the effect of knowledge (simple and complex) on the performance of learners, as a direct measure of competency. According to the results presented, we can conclude that while basic knowledge is important for developing an ERP competency, mastering basic knowledge is not a significant predictor of the competency involved in solving problems using an information system. Rather, the results indicate that competency can be partly predicted by the ability to answer higher-level (complex) questions. Beyond these specific complex questions, it is also observed that the self-declared GPA is not a significant predictor of the ERP competency of users. Based on the analysis models, we note the importance of the residual factor (unobserved individual characteristics) as a strong predictor of performance in the later games.

Despite its scope, this research design presents certain limits that must be taken into account when interpreting the results above. First, it must be noted that game profit was chosen as the sole indicator of performance and used as a proxy for competency. Profit is considered a natural metric in business education and in operations management training. Admittedly, other metrics exist (e.g., quality of decision-making, quality of collaboration) and should be considered in future ERP competency research, although these qualitative metrics do present analysis challenges. Our results also rely on self-declared GPA; it was not possible to control for the variance due to the university attended.

In the field of IS training, the research results bring into question widespread assessment practices. Indeed, many current professional IS certifications rely on test performance based solely on simple knowledge questions. In many organizations, IS analysts are generally prescreened on the basis of their GPA.
However, new hiring initiatives have been reported; for instance, companies such as SAP and Microsoft have recently created problem-solving events where recruits can show their creativity in student competitions, thus demonstrating their competence to recruiters. Companies interested in employing, and faculties interested in training, highly qualified personnel should consider developing certifications in which more complex items (utilizing complex knowledge) could be used to predict competencies.

REFERENCES

[1] Anderson, P. H., & Lawton, L. "Business simulations and cognitive learning", Simulation & Gaming, (40:2), 2009, pp.193-216.
[2] Basselier, G., Horner Reich, B., & Benbasat, I. "Information technology competence of the business manager: A definition and research model", Journal of Management Information Systems, (17), 2001, pp.159-182.
[3] Blili, S., Raymond, L., & Rivard, S. "Impact of task uncertainty, end-user involvement, and competence on the success of end-user computing", Information & Management, (33), 1998, pp.137-153.
[4] Bloom, B. S., Englehart, M. D., Furst, E. D., Hill, W. H., & Krathwohl, D. R. Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook 1: The Cognitive Domain, David McKay, New York, 1959.
[5] Brinke, D. J.-T., Sluijsmans, D. M. A., & Jochems, W. M. G. "Self-assessment in university assessment of prior learning procedures", International Journal of Lifelong Education, (28:1), 2009, pp.107-122. doi: 10.1080/02601370802568697
[6] Caya, O., Léger, P.-M., Grebot, T., & Brunelle, É. "Integrating, sharing, and sourcing knowledge in an ERP usage context", Knowledge Management Research and Practice, doi: 10.1057/kmrp.2012.54, forthcoming.

[8] Cronan, P., Léger, P.-M., Robert, J., Babin, G., & Charland, P. "Comparing objective measures and perceptions of cognitive learning in an ERP simulation game: A research note", Simulation & Gaming, (43:4), 2012, pp.461-480.
[9] Cronan, T. P., & Douglas, D. E. "A student ERP simulation game: A longitudinal study", Journal of Computer Information Systems, Fall 2012, pp.3-13.
[10] De Corte, E. "Actief leren binnen krachtige leeromgevingen [Active learning in powerful learning environments]", Impuls, (26:3), 1996, pp.145-156.
[11] Deckro, R. F. "MBA admission criteria and academic success", Decision Sciences, (8), 1996, pp.765-769.
[12] Drobik, A. "IT Market Clock for ERP Platform Technology", Gartner, New York, 2012.
[13] Everwijn, S. E. M., Bomers, G. B. J., & Knubben, J. A. "Ability- or competence-based education: Bridging the gap between knowledge acquisition and ability to apply", Higher Education, (25:4), 1993, pp.425-438.
[14] Gonczi, A. "Review of international trends and developments in competency based education and training", in A. Argüelles & A. Gonczi (Eds.), Competency Based Education and Training: A World Perspective, Grupo Noriega, México, 2000, pp.15-39.
[15] Guillemette, F., & Gauthier, C. "Approche par compétences (APC) et formation pratique: Analyse documentaire et critique [The competency-based approach and practical training: A documentary analysis and critique]", Brock Education, (16:1), 2006, pp.112-133.
[16] Gupta, S., Bostrom, R. P., & Huber, M. "End-user training methods: What we know, need to know", SIGMIS Database, (41:4), 2010, pp.9-39. doi: 10.1145/1899639.1899641
[17] Hair, J. F., Anderson, R. E., Tatham, R. L., & Black, W. C. Multivariate Data Analysis with Readings, Prentice Hall, Englewood Cliffs, NJ, 1998.
[18] Hallfors, D., Vevea, J. L., Iritani, B., Cho, H., Khatapoush, S., & Saxe, L. "Truancy, grade point average, and sexual activity: A meta-analysis of risk indicators for youth substance use", Journal of School Health, (72), 2002, pp.205-211.
[19] Harris, R., Guthrie, H., Hobart, B., & Lundberg, D.
Competency-based Education: Between a Rock and a Whirlpool, Macmillan, Melbourne, 1995.
[20] Hirsch, E. D. Some Excerpts from The Schools We Need and Why We Don't Have Them, Doubleday, New York, 1999.
[21] Hirsch, E. D. The Schools We Need and Why We Don't Have Them, Random House, New York, 1999.
[22] Hirsch, E. D. "Curriculum and competence", in T. M. Moe (Ed.), A Primer on America's Schools, Hoover Institution Press, Stanford, 2001.
[23] Hong, J.-C., Horng, J.-S., Lin, C.-L., & ChanLin, L.-J. "Competency disparity between pre-service teacher education and in-service teaching requirements in Taiwan", International Journal of Educational Development, (28), 2008, pp.4-20.
[24] Hyland, T. "Competence, knowledge and education", Journal of Philosophy of Education, (27:1), 1993, pp.57-68. doi: 10.1111/j.1467-9752.1993.tb00297.x
[25] Jonnaert, P. Compétences et socioconstructivisme: Un cadre théorique [Competencies and socioconstructivism: A theoretical framework], De Boeck, Bruxelles, 2009.
[26] Jonnaert, P., Barrette, J., Boufrahi, S., & Masciotra, D. "Contribution critique au développement des programmes d'études: compétences, constructivisme et interdisciplinarité [A critical contribution to curriculum development: competencies, constructivism, and interdisciplinarity]", Revue des Sciences de l'éducation, (30:3), 2004, pp.667-696.
[27] Jonnaert, P., Ettayebi, M., & Lafortune, L. "Observer les réformes en éducation [Observing education reforms]", in L. Lafortune, M. Ettayebi & P. Jonnaert (Eds.), Observer les réformes en éducation, Presses de l'Université du Québec, Québec, 2007, pp.1-14.
[28] Jonnaert, P., Ettayebi, M., & Opertti, R. "Introduction: Dynamique des réformes éducatives contemporaines [Introduction: The dynamics of contemporary education reforms]", in
M. Ettayebi, R. Opertti & P. Jonnaert (Eds.), Logique de compétences et développement curriculaire [The logic of competencies and curriculum development], L'Harmattan, Paris, 2008, pp.17-25.
[29] Kang, D., & Santhanam, R. "A longitudinal field study of training practices in a collaborative application environment", Journal of Management Information Systems, (20:3), 2003, pp.257-281.
[30] Kouwenhoven, W. "Competence-based curriculum development in higher education: A globalised concept?", in A. Lazinica & C. Calafate (Eds.), Technology Education and Development, 2009. Available from: http://www.intechopen.com/books/technology-education-and-development/competence-based-curriculum-development-in-higher-education-a-globalised-concept.
[31] Krathwohl, D. R. "A revision of Bloom's taxonomy: An overview", Theory Into Practice, (41:4), 2002, pp.212-218.
[32] Kuncel, N. R., Credé, M., & Thomas, L. L. "A meta-analysis of the predictive validity of the Graduate Management Admission Test (GMAT) and undergraduate grade point average (UGPA) for graduate student academic performance", Academy of Management Learning & Education, (6:1), 2007, pp.51-68.
[33] Kuncel, N. R., Credé, M., & Thomas, L. L. "The validity of self-reported grade point averages, class ranks, and test scores: A meta-analysis and review of the literature", Review of Educational Research, (75), 2005, pp.63-82.
[34] Kuncel, N. R., Credé, M., Thomas, L. L., Klieger, D. M., Seiler, S. N., & Woo, S. E. "A meta-analysis of the Pharmacy College Admission Test (PCAT) and grade predictors of pharmacy student success", American Journal of Pharmaceutical Education, (69), 2005, pp.339-347.
[35] Kuncel, N. R., Credé, M., & Thomas, L. L. "The validity of the GMAT: A meta-analysis", paper presented at the Annual Conference of the Society for Industrial and Organizational Psychology, Chicago, IL, April 2004.
[36] Kuncel, N. R., Hezlett, S. A., & Ones, D. S.
"A comprehensive meta-analysis of the predictive validity of the Graduate Record Examinations: Implications for graduate student selection and performance", Psychological Bulletin, (127), 2001, pp.162-181.
[37] Larreche, J. C., & Gatignon, H. Markstrat: A Marketing Strategy Game: Participant's Manual, Scientific Press, 1977.
[38] Le Boterf, G. Construire les compétences individuelles et collectives [Building individual and collective competencies], Éditions d'Organisation, Paris, 2000.
[40] Léger, P.-M. "Using a simulation game approach to teach enterprise resource planning concepts", Journal of Information Systems Education, (17:4), 2006, pp.441-447.
[41] Léger, P.-M., Cronan, P., Charland, P., Pellerin, R., Babin, G., & Robert, J. "Authentic OM problem solving in an ERP context", International Journal of Operations & Production Management, (32:12), 2012, pp.1375-1394.
[42] Léger, P.-M., Davis, F. D., Cronan, P., & Perret, J. "Neurophysiological correlates of cognitive absorption in an enactive training context", Computers in Human Behavior, (34), May 2014, pp.273-283.
[43] Léger, P.-M., Feldstein, H. D., Babin, G., Charland, P., & Robert, J. "Business simulation training in information technology education: Guidelines for new approaches in IT training", Journal of Information Technology Education, (10), 2011, pp.37-51.
[44] Léger, P.-M., Robert, J., Babin, G., Pellerin, R., & Wagner, B. ERPsim, ERPsim Lab (erpsim.hec.ca), HEC Montreal, Montreal, Canada, 2007.
[45] Linn, R. L., & Hastings, C. N. "A meta-analysis of the validity of predictors of performance in law school", Journal of Educational Measurement, (21), 1984, pp.245-259.

[46] Liu, L., & Hmelo-Silver, C. E. "Promoting complex systems learning through the use of conceptual representations in hypermedia", Journal of Research in Science Teaching, (46:9), 2009, pp.1023-1040. doi: 10.1002/tea.20297
[47] Marcolin, B. L., Compeau, D. R., Munro, M. C., & Huff, S. L. "Assessing user competence: Conceptualization and measurement", Information Systems Research, (11:1), 2000, pp.37-60.
[48] Markus, M. L., Tanis, C., & Fenema, P. C. "Enterprise resource planning: Multisite ERP implementations", Communications of the ACM, (43:4), 2000, pp.42-46. doi: 10.1145/332051.332068
[49] Munro, M. C., Huff, S. L., Marcolin, B. L., & Compeau, D. R. "Understanding and measuring user competence", Information & Management, (33), 1997, pp.45-57.
[50] Murray, C., & Wren, C. T. "Cognitive, academic, and attitudinal predictors of the grade point averages of college students with learning disabilities", Journal of Learning Disabilities, (36), 2003, pp.407-415.
[51] National Association of Colleges and Employers. Job Outlook 2012, NACE Research, National Association of Colleges and Employers, Washington, 2011.
[52] Norris, N. "The trouble with competence", Cambridge Journal of Education, (21:3), 1991, pp.331-341. doi: 10.1080/0305764910210307
[53] Noskov, M. V. "The mathematics education of an engineer", Russian Education & Society, (49:11), 2007, pp.70-84. doi: 10.2753/res1060-9393491104
[54] Oates, T. "Emerging issues: The response of higher education to competency based approaches", in J. Burke (Ed.), Competency Based Education and Training, Falmer Press, London, 1989, pp.186-196.
[55] OECD (Organisation for Economic Co-operation and Development). "Definition and Selection of Competencies: Theoretical and Conceptual Foundations", DeSeCo Report.
[56] Phelps, R., Hase, S., & Ellis, A. "Competency, capability, complexity and computers: Exploring a new model for conceptualising end-user computer education", British Journal of Educational Technology, (36:1), 2005, pp.67-84.
[57] Ramist, L. "Predictive validity of the ATP tests", in T. F. Donlon (Ed.), The College Board Technical Handbook for the Scholastic Aptitude Test and Achievement Tests, College Entrance Examination Board, New York, 1984.
[58] Singley, M., & Anderson, J. R. Transfer of Cognitive Skill, Harvard University Press, Cambridge, MA, 1989.
[59] STRATA-ETAN Expert Group. "Higher Education and Research for the ERA: Current Trends and Challenges for the Near Future", European Commission, DG Recherche, Brussels, Belgium, 2002.
[60] StataCorp. Statistical Software: Release 10, StataCorp LP, College Station, TX, 2007.
[61] Tuxworth, E. "Competence based education and training: Background and origins", in J. W. Burke (Ed.), Competency Based Education and Training, Falmer Press, London, 1989, pp.10-25.
[62] Wang, Y., & Haggerty, N. "Knowledge transfer in virtual settings: The role of individual virtual competency", Information Systems Journal, (19), 2009, pp.571-593.
[63] Whitehead, A. N. The Aims of Education and Other Essays, The Free Press, New York, 1968.
[64] Willingham, W., & Breland, H. Personal Qualities and College Admissions, College Entrance Examination Board, New York, 1982.
[65] Wolf, A. "Can competence and knowledge mix?", in J. Burke (Ed.), Competency Based Education and Training, Falmer Press, London, 1989, pp.39-53.
[66] Seethamraju, R. "Enterprise systems (ES) software in
business school curriculum – Evaluation of design and delivery”, Journal of Information Systems Education, (18:1), 2007, p. 69.

Appendix A: Descriptive Statistics and Coefficients of Correlation

Variable                Mean    Std. Dev.
1- Competency           0.000   0.988
2- GPA                  3.479   0.618
3- Basic Knowledge      0.582   0.189
4- Complex Knowledge    0.468   0.183
5- ε̂ (t=1)              0.000   0.989

[Correlation matrix of variables 1-5 with p-values; as noted in the Results, no correlation coefficient exceeds the 0.7 intercorrelation threshold.]