Proceedings of the 34th Hawaii International Conference on System Sciences - 2001
Gaining Insight from the Data Warehouse: The Competence Maturity Model

M. Kathryn Brohman
Terry College of Business
University of Georgia
[email protected]

Michael Parent
Richard Ivey School of Business
University of Western Ontario
[email protected]

Abstract

This study develops a competency maturity model to describe the skills of a data analyst that positively influence data warehouse usage and insight generation in today's data warehouse environment. Exploratory case studies were used to identify critical competencies and develop new measures of data warehouse usage and success. Grounded in Triandis' [48] theory of motivation, a research model was tested using a survey of 169 data analysts from 7 organizations. This paper makes contributions to both managers and academia regarding the competencies required to extract value from data warehouses. It also introduces a new data analysis term called "the dance", a concept for future research.

1. Introduction

Data warehouses have presented decision-makers with far more information, in far more flexible form, than has been true in the past [14,41]. With such a sophisticated change in technology, users need to become more sophisticated in order to leverage the information in the data warehouse. It is clear that with access to more information, decision-making has also become more complex. However, to date, organizations have given little thought to human skill development in their data warehouse implementations. With the exception of training decision-makers on how to use new data warehouse access tools, organizations have typically invested little effort in identifying the skills required to support decision-making in this new environment. In fact, some organizations continue to outsource data analysis and only recently have identified the need to develop an in-house analytical team. The purpose of this study is to develop a competency maturity model that can be used to develop an in-house analytical team capable of effectively supporting better decision-making in data warehouse environments.

Managers typically engage in operational or strategic decision-making. In doing so, they use three different types of data warehouse tools: reporting, analytical, and data mining [8,39]. Operational decision-making flows from traditional decision support system usage. The decision-maker uses reporting tools to generate standard reports used to make decisions related to day-to-day operations. For example, a purchasing manager may depend on a daily inventory report to determine what products need to be purchased. In strategic decision-making, decision-makers are more focused on long-term business issues related to planning. In most cases, decision-makers work closely with an analyst who has the skills required to make use of analytical and data mining tools [5]. These individuals work as a dyad to identify data analysis requirements that will address the business problem. Typically, the analyst will conduct ad hoc data analysis and data mining to generate information related to the business problem. Ad hoc data analysis is deductive: the analyst applies paradigm- or model-specific knowledge to analyze the data. Data mining is inductive, enabled by multi-dimensional queries and intelligent algorithms used for bottom-up, discovery-driven data extraction and analysis [14,41]. The dyad works together to interpret the analysis and, in the end, results are used to support a more strategic decision. An example of strategic analysis is an airline quality control manager and analyst who conducted ad hoc analysis to determine the length of time it took contracted maintenance shops to repair a part. They compared the number of days across a variety of shops and used the information to renegotiate maintenance agreements. The competency maturity model developed in this study focuses specifically on analyst skills required to support strategic decision-making.
2. Literature Review According to Triandis’ [47] theory of motivation, individuals’ behavior is determined by what people would like to do (attitudes), what they think they should do (social norms), what they have usually done (habits) and by the expected consequences of their behavior. More
0-7695-0981-9/01 $10.00 (c) 2001 IEEE
comprehensive development of this theory identified individual expertise as a direct antecedent to behavior [48]. Triandis' theory has been used to study technology usage, and results indicated that this theoretical foundation was at least as powerful as the theory of reasoned action [10] in terms of prediction [4,44].
2.1. Individual Expertise

Three categories of information systems expertise have been studied in relation to information systems usage: technical, systems, and business [46]. Technical expertise covers specific skills related to hardware and software [25,37,35]. Systems expertise relates to problem-solving skills, including analytical and modeling skills. Business expertise is divided into three sub-categories: business, managerial, and social. Business skills relate to knowledge about the industry and functional areas within the organization. Managerial skills are defined as leadership, planning, and controlling. Finally, social knowledge includes communication, motivation, and interpersonal skills. Table 2.1 provides a detailed summary of skills relevant to each of these categories.

Technical Expertise
• Operating systems
• Programming languages
• Distributed systems
• Database
• CASE tools and application development
• Network administration
• Security
• Troubleshooting

System Expertise
Problem Solving
• General problem solving
• Qualitative and logical abilities
• Creativity
Development Methodology
• Programming logic
• Systems analysis and design
• Business reengineering and integration

Business Expertise
Business
• Industry experience
• IT and business strategy
• Cost benefit analysis
• Feasibility
Managerial
• Leadership
• Planning and organizing work
• Controlling
Social
• Communication skills
Table 2.1: Summary of individual competencies

To date, empirical validation of the relationship between expertise and information systems usage has been minimal. Early research found empirical support that greater technical competence positively influenced IS usage [17,29]. Snitkin and King [40] found that users with a business education made more use of spreadsheet applications. Nelson [28] surveyed 275 employees in 8 organizations and concluded that technology users had sufficient technical skills but were less competent in terms of their knowledge of IS policies and strategies. Research has also explored the importance of technical versus business competence as related to IS job performance. Two early descriptive studies found that managers and systems analysts ranked dealing with people higher than technical skills in terms of job performance [3,26]. In more recent studies, researchers surveyed MIS managers to identify the skills important specifically for system developers and IS job placement [20,32]. Results further supported the importance of business-related skills over technical skills. System competence has been studied mainly in the form of an individual's problem-solving style and how it influences the way they process and utilize information. Vandenbosch and Huff [51] found that an individual's tolerance for ambiguity in problem solving was strongly linked to their predisposition toward scanning. Alavi and Joachimsthaler [1] identified twenty-two studies that
focused on the influence of problem-solving style on DSS performance. Results were inconclusive. In the data warehouse literature, experts have stated that data warehouse users require a new "mind-set" in order to analyze data effectively in a data warehouse environment. However, they were remiss in not providing a clear definition of the components of this "mind-set" [34,38]. In general, the influence of individual expertise on data warehouse usage has not been studied.
2.2. Data Warehouse Usage

Data warehouse usage occurs at two levels: exploratory data analysis (EDA) and structured data analysis (SDA). EDA refers to actions such as searching for data, transforming data, and extracting data for analysis. SDA refers to actions such as trend analysis, mathematical analysis, and statistical analysis to generate analytical results. Brohman et al. [5] identified these levels in an exploratory study of data warehouse usage. Levels of usage have been defined in other studies, including Fuerst and Cheney [12], who defined general and specific levels of usage. Vandenbosch and Higgins [50] defined focused search and scanning as levels of executive support system usage. More related to data warehousing, Steiger [41] defined two levels of decision support system usage: inductive data analysis and deductive data analysis.
2.3. Data Warehouse Usage Success

Early IT implementation research examined usage as a surrogate for implementation success [7]. However, as the subjects of this study were regular data warehouse users, usage was not deemed to be an effective measure of success. This was consistent with the argument made by Leidner and Elam [19], who studied the influence of EIS usage on decision performance. Neither managerial practice nor academia has painted a clear picture of the relationship between usage and performance [7,36]. Nevertheless, several models have been introduced in an attempt to explain the complexity of this relationship [21,23,24,49]. To date, IS researchers have focused mainly on the relationship between IT usage and decision performance. Researchers predicted that decision-makers who accessed more information from a database would be more likely to make effective decisions based on the discovery of new information [9,43]. Empirical studies have reported mixed results: Vandenbosch and Huff [51] found that in an EIS environment, executives did not utilize EIS systems to enhance effective strategic decision-making, whereas Leidner and Elam [19] found that higher frequency and duration of usage positively influenced the extent of analysis and decision-making speed in an EIS environment. Brohman et al. [5] explored the relationship between usage and decision performance and identified insight gained as a mediating variable. Ninety-four percent of the decision-makers and analysts interviewed in their study stated that data warehouse usage enhanced the degree of insight gained, which in turn influenced decision performance. Leidner and Elam [19] also found that increased frequency and time spent using an executive information system (EIS) positively influenced the number of alternatives explored, the number of data sources examined, and the depth of analysis.
3. Exploratory Study

Prior to developing the research model, seven case studies were undertaken. A total of 18 semi-structured interviews were conducted with analysts and decision-makers. Multiple perspectives were obtained to capture a richer breadth of understanding related to how analysts viewed their skills, as well as how decision-makers defined important analyst expertise [15,31]. Both analysts and managers were asked open-ended questions related to data analyst competencies, levels of data warehouse usage, and the impact of usage on the organization. Each interview was taped, transcribed, and analyzed using a technique similar to the one used by Reich and Benbasat [33]. As the transcripts were verbatim replications of the actual interviews, a thorough content analysis was possible. Reliability and confidence of
exploratory results were enhanced through convergence of observations from multiple investigators. Inter-rater reliability was assessed through calculation of kappa coefficients for all constructs in the research model [6]. Kappa coefficients ranged from 0.74 to 0.79, all above the 0.65 acceptable standard [27,45]. Seventy-two percent of respondents stated that individual business and technical expertise were important in explaining data warehouse usage and insight gained. High business expertise would enable analysts to explore a problem from multiple business perspectives, hence resulting in more insight gained. Results related to technical expertise were not as consistent. Some analysts stated that more technical expertise would enable generation of fewer, more sophisticated queries, while other respondents explained that technical expertise would result in increased usage, as analysts had the capability to make more extensive use of the technical tools available. Ninety-four percent of respondents stated that system expertise was important in explaining data warehouse usage and insight gained. Ninety-four percent of respondents also asserted that more exploration and structured data analysis would enhance the degree of insight gained through analysis. Respondents defined four types of insight: description, recommendations, alternative generation and evaluation, and predictive model development.
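The kappa computation behind these reliability figures is straightforward to reproduce. The sketch below implements Cohen's [6] kappa for two raters; the interview codings shown are hypothetical, not the study's data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal proportions, summed.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum((counts_a[c] / n) * (counts_b[c] / n)
                   for c in set(rater_a) | set(rater_b))
    return (observed - expected) / (1 - expected)

# Hypothetical codings of ten interview passages by two investigators.
coder_1 = ["EDA", "SDA", "EDA", "EDA", "SDA", "EDA", "SDA", "SDA", "EDA", "EDA"]
coder_2 = ["EDA", "SDA", "EDA", "SDA", "SDA", "EDA", "SDA", "SDA", "EDA", "EDA"]
print(cohens_kappa(coder_1, coder_2))  # raw agreement .90, kappa approximately .80
```

Kappa discounts the agreement two raters would reach by chance alone, which is why it is preferred over raw percent agreement for coding reliability.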
4. Research Hypotheses

As a result of the literature review and the exploratory study, five hypotheses were developed and tested to determine the extent to which technical, business, and system expertise influence data warehouse usage behavior and insight gained. Consistent with Triandis' [48] theory of motivation, the research hypotheses predict that an analyst's expertise will positively influence their usage behavior. Specifically, analysts who have more technical expertise in the form of using query packages, creating SQL queries, using applications (e.g., SAS) to clean and transform data, using applications (e.g., Microsoft Excel) to conduct graphical and financial analysis, and using statistics applications to test relationships and build models are likely to exert more effort exploring and analyzing the data in the data warehouse. Hence,

H1a: Analysts with a high degree of technical expertise will conduct exploratory data analysis (EDA) at a higher degree of intensity.

H1b: Analysts with a high degree of technical expertise will conduct structured data analysis (SDA) at a higher degree of intensity.

Analysts who have expertise related to key business factors in the industry as well as general business strategy
are likely to exert more effort exploring a data analysis task from multiple business perspectives. Translated into data warehouse usage behavior, it is expected that the motivation to explore a task from multiple business perspectives will increase the intensity of exploratory and structured data analysis. Hence,

H2a: Analysts with a high degree of business expertise will conduct exploratory data analysis (EDA) at a higher degree of intensity.

H2b: Analysts with a high degree of business expertise will conduct structured data analysis (SDA) at a higher degree of intensity.

Data analysts who have more system expertise in the form of general problem solving will explore and analyze the data in the data warehouse at a higher degree of intensity. Hence,

H3a: Analysts with a high degree of system expertise will conduct exploratory data analysis (EDA) at a higher degree of intensity.

H3b: Analysts with a high degree of system expertise will conduct structured data analysis (SDA) at a higher degree of intensity.

The degree of insight gained relative to the task at hand will be higher when data analysts explore and analyze data at a higher degree of intensity. More exploration, as well as an increased number of mathematical and statistical tests using different variables, would generate more insight. Hence,

H4a: Exploratory data analysis (EDA) conducted at a higher degree of intensity will increase perceived insight gained.

H4b: Structured data analysis (SDA) conducted at a higher degree of intensity will increase perceived insight gained.

Insight gained from enhanced intensity in utilizing the data warehouse will likely result in better decision performance. Decision performance captures both decision effectiveness, related to better-quality decisions that create an advantage in the market, as well as decision efficiency, which relates to the time taken to reach the decision. Hence,

H5: More insight gained will positively influence decision-making performance.
5. Research Methodology

A survey was used to empirically test the research hypotheses. Three hundred eighty-six surveys were sent to
analysts in seven participating organizations. One hundred sixty-nine valid responses were received, for a response rate of 44 percent. Unpaired t-tests were used to test for non-response bias. Results indicated that there were no differences between the group means of early and late respondents. The mean age of respondents fell in the 31 to 35 range. Forty percent were female. Ninety-five percent were educated at the college/university level, the most common educational backgrounds being business (39%) and applied math/statistics (36%).
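A minimal version of this non-response bias check can be sketched as follows. The early/late scores are hypothetical, and the pooled-variance Student's t statistic stands in for whatever statistical package the study actually used.

```python
from math import sqrt
from statistics import mean, variance

def unpaired_t(x, y):
    """Student's two-sample t statistic with pooled (equal) variance."""
    nx, ny = len(x), len(y)
    pooled = ((nx - 1) * variance(x) + (ny - 1) * variance(y)) / (nx + ny - 2)
    return (mean(x) - mean(y)) / sqrt(pooled * (1 / nx + 1 / ny))

# Hypothetical usage-intensity scores for early vs. late respondents.
early = [4.1, 3.8, 5.0, 4.4, 3.9, 4.7, 4.2, 4.5]
late = [4.0, 4.3, 3.7, 4.6, 4.1, 4.4, 3.9, 4.8]

t = unpaired_t(early, late)
# df = 14; |t| < 2.145 (two-tailed .05 critical value) -> no evidence of bias.
print(abs(t) < 2.145)  # prints True for this sample
```

Comparing early and late respondents this way treats late respondents as a proxy for non-respondents, the standard extrapolation argument behind non-response bias tests.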
5.1. Measures

All measures in this survey were self-reports. The survey instructed the analyst to provide answers to expertise and usage questions related to a specific ad hoc analysis task they had completed in the past six months. As insight gained was a performance measure that could not be accurately self-reported by the analyst, insight gained questions were answered by the decision-maker involved in the task described. Appendix 1 describes the details of the dual respondent survey technique designed for this study. The majority of the instruments used were pre-validated measures. The instrument used to measure data warehouse usage was developed for this study.

5.1.1. Pre-validated Measures

Technical competence was measured using an instrument designed by Igbaria, Guimaraes and Davis [16]. The instrument asked respondents to indicate the extent (1 = no expertise and 7 = extremely extensive expertise) of technical expertise related to query applications, SQL, extraction tools, analysis applications, statistical applications, and intelligent agents. An instrument used by Goodhue [13] to study user evaluations of information systems measured business expertise. This business expertise instrument was composed of two items that measured how well the individual understood the business mission and its relation to corporate objectives. System expertise was measured by a single item that captured the number of years of analytical experience in relation to data analysis. Insight gained was measured by four items previously validated as a measure of IS effectiveness for end-user systems [18]. These items were modified to capture the decision-maker's perceptions of the effectiveness of data warehouse usage in generating insight, specifically in the forms of explanation, prediction, and interpretation of alternative explanations. Two dimensions defined decision-making performance: decision-making efficiency and decision-making effectiveness.
Efficiency and effectiveness were measured by a “level of agreement” scale that included
three and four items, respectively. Other researchers have used these items to measure system success [22,31].

5.1.2. Instrument Development: Data Warehouse Usage

Items that captured the intensity of EDA and SDA usage related to a defined task were created, based on an instrument used by Thompson et al. [44]. Analysts were asked to identify specific values related to usage intensity for a variety of EDA and SDA dimensions. The final instrument is included in Appendix B. Two decision-makers and five analysts from three organizations assessed the content validity throughout the creation of the usage instrument.
5.2. Measurement Model Assessment: Validity and Reliability

A factor loading matrix was created to assess measurement model validity. A total of three items were trimmed from the measurement model as their loadings were less than .70 [2]. As all remaining items explained more variation in the construct than unexplained (i.e., loadings > .70) and explained more variation in their own construct than in others (i.e., loadings > cross-loadings), the measurement model was deemed valid in terms of convergent and discriminant validity. Cronbach's alpha was used as a test for reliability. As illustrated in Table 5.1, reliability of all constructs was greater than .70, hence they represented reliable measures [30]. Table 5.1 also illustrates that variation in business expertise, insight gained, and decision performance was limited in the data collected. All scale items were recoded into high, average, and low categories, and variable measures were analyzed using mean, median, minimum, and maximum.

Construct             | Mean                         | Low (1 to 2.33) | Average (2.34 to 4.66) | High (4.67 to 7) | Cronbach Alpha
Technical Expertise   | 4.07                         | 23 (14%)        | 88 (52%)               | 58 (34%)         | 0.78
Business Expertise    | 5.71                         | 4 (2%)          | 24 (15%)               | 140 (83%)        | 0.74
System Expertise      | 7.3 years (median = 6 years) | < 1 year (min)  | -                      | 35 years (max)   | 1.00
EDA (in attributes)   | 38.62 (median = 10)          | 0 (min)         | -                      | 500 (max)        | 1.00
SDA (in attributes)   | 18.03 (median = 3.6)         | 0 (min)         | -                      | 680 (max)        | 0.81
Insight Gained        | 5.41                         | 4               | 26                     | 138              | 0.88
Decision Performance  | -                            | 9               | 55                     | 105              | 0.91

Table 5.1: Frequency Assessment of Constructs
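Cronbach's alpha, used for the reliability column in Table 5.1, can be computed directly from raw item scores. The sketch below uses hypothetical responses to a two-item scale, not the study's data.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a multi-item scale.

    items: one list of scores per item, all over the same respondents.
    """
    k = len(items)
    n = len(items[0])

    def pvar(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    total_scores = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(pvar(it) for it in items) / pvar(total_scores))

# Hypothetical 7-point responses from five respondents to a two-item scale.
business_items = [[5, 6, 4, 7, 5],
                  [6, 6, 5, 7, 4]]
print(round(cronbach_alpha(business_items), 2))  # 0.84
```

Alpha rises as item variances become small relative to the variance of the total score, i.e., as the items move together; a value above .70 is the conventional reliability threshold cited in the text [30].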
5.3. Structural Model: Correlation Assessment

Table 5.2 summarizes the correlations between latent variables in the research model.

                          Tech.  Bus.   Sys.   EDA    SDA    Insight  Decision
Technical Expertise       1.00
Business Expertise        .076   1.00
System Expertise          .033   .087   1.00
EDA                       .365   .033   .013   1.00
SDA                       .322   .047   .117   .546   1.00
Perceived Insight Gained  .203   -.043  .200   .183   .159   1.00
Perceived Decision Perf.  .010   .071   .212   .096   .035   .558     1.00

Table 5.2: Correlation of Latent Variables
5.4. Structural Model: Results of Hypothesis Testing

Partial Least Squares (PLS) was used to test the research model. PLS is a powerful technique used to understand the interaction between theory and data. It also has minimal data assumptions, allowing small sample sizes. Many IS researchers have made use of this technique [16,42,44,50]. Figure 5.1 illustrates significant path coefficients, at the .01 level of confidence, and R2 values for dependent variables in the research model.

[Figure 5.1: Main Results. Path coefficients: Technical Expertise -> EDA (γ = .365; EDA R2 = .134); Technical Expertise -> SDA (γ = .141); EDA -> SDA (γ = .500; SDA R2 = .315); EDA -> Perceived Insight Gained (γ = .151); SDA -> Perceived Insight Gained (γ = .057, significant at the .05 level); System Expertise -> Perceived Insight Gained (γ = .198; Insight R2 = .087); Perceived Insight Gained -> Perceived Decision Performance (γ = .558; R2 = .312).]

Overall, the predictive power of the model was substantial, especially considering that EDA, SDA, and insight gained were all new constructs. Empirical support was found for three of the five research hypotheses. Specifically, support was found for the positive relationship between technical expertise and both EDA (H1a) and SDA (H1b), as well as for the relationships between EDA (H4a) and SDA (H4b) and insight gained. Support was also found for a positive relationship between insight gained and decision performance (H5). No support was found between business expertise (H2) or system expertise (H3) and data warehouse usage. However, as illustrated in Figure 5.1, system expertise had a significant direct influence on insight gained. The relationship between business expertise and insight gained was not significant.
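Full PLS estimation is beyond the scope of a short example, but the structural (inner) model logic can be illustrated with least squares on standardized scores, where a single-predictor path coefficient equals the Pearson correlation. The data below are simulated, and the variable names and effect size are illustrative assumptions, not the study's.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 169  # matches the study's sample size

# Simulated standardized latent-variable scores with an assumed
# Technical Expertise -> EDA effect.
technical = rng.standard_normal(n)
eda = 0.4 * technical + rng.standard_normal(n)

def standardize(x):
    return (x - x.mean()) / x.std()

def path_coefficient(x, y):
    """Slope of y on x after standardizing both; with one predictor this
    equals the Pearson correlation, i.e., the structural path gamma."""
    xs, ys = standardize(x), standardize(y)
    return float((xs * ys).mean())

gamma = path_coefficient(technical, eda)
r_squared = gamma ** 2  # variance in EDA explained by this single path
print(round(gamma, 3), round(r_squared, 3))
```

In the actual model, PLS estimates all paths jointly from weighted item composites; this single-predictor case simply mirrors how a reported value such as γ = .365 relates to the underlying correlations.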
6. Discussion

Results from this study supported three major conclusions. First, insight gained is an effective measure of data warehouse success. Second, technical and system competence play significant roles in predicting data warehouse usage behavior and insight gained. Finally, data warehouse usage is a complex concept that includes two
distinct types of usage: exploratory data analysis and structured data analysis.
6.1. Perceived Insight Gained: A Measure of Data Warehouse Success

With close to nine percent of the variance explained in perceived insight gained, and significant relationships uncovered with both antecedent variables and decision performance, this study helps define data warehouse success. In order to enhance decision performance, organizations need to recognize the role of insight gained as a mediating variable between data warehouse usage and decision performance. The concept of new insight is broad, as it relates to the creation of new ideas through many forms of data interpretation. Data warehouse usage and system expertise (i.e., problem solving, logical ability, and creativity) represent two forms of data interpretation. It is likely that other forms exist.
6.2. The Role of Competence in Data Warehouse Usage and Success

As results indicated, different competencies have different influences on an analyst's ability to generate
insight related to a task. Analysts who are proficient in technical skills will generate insight from data, whereas analysts who are proficient in system skills (i.e., problem solving, logical ability, and creativity) are more likely to generate insight independent of the data warehouse as well. An ideal profile of an analyst highly effective at generating insight would be one that possesses both technical and system skills. Arguably, an analyst with both technical and system maturity (i.e., a maturity rating of 4:4) will generate the most insight related to the task at hand. This concept is illustrated by Figure 6.1. It is likely that analysts will grow through the maturity model toward the ideal, a 4:4. An analyst fresh out of college may be rated as a 2:1 technical/system competence depending on the expertise developed in college. Hence, it is not only the responsibility of organizations to provide training and experience to enable maturity; it is also the responsibility of academia to develop a curriculum that will enable maturity on both technical and system dimensions.

[Figure 6.1: Analyst Competency Profile Model. Technical competence and system competence are each rated on a 1 to 4 maturity scale; together, the two ratings determine an analyst's ability to generate insight.]
6.3. Data Warehouse Usage: A Complex Concept

This study made a significant first step in identifying two orthogonal levels that describe data warehouse usage. As the final results support, the level of intensity dedicated to exploratory data analysis positively influences the level of intensity dedicated to analyzing the data through the use of structured mathematical and statistical tools. Although both levels of usage have a significant positive influence on insight gained, the path coefficient from EDA to insight gained is higher.
7. Limitations

This study suffers from the usual limitations of survey-based research [11]. As respondents were asked to answer questions related to a task they had completed up to 6 months prior to the survey, it is possible that the data captured were less accurate than if gathered at the time of analysis. A second limitation of the study was the single-item instruments used to measure exploratory data analysis and system expertise. One final limitation was the low variation captured in the final data set. As analysts ranked themselves very high in all areas of competence, a social desirability bias is possible. Also, as analysts were given a choice in the task they described, the
majority chose to describe a task that was highly successful in terms of the decision performance.
8. Contribution to Managers and Research

8.1. Contribution to Managers

This study makes two main contributions to managers. First, it suggests that managers need to consider skills other than technical skills in recruiting analysts. System expertise, including problem solving, logical ability, and creativity, will generate insight related to the task at hand. It is likely that a highly effective analytical team will have maturity in both areas, enabling analysts to leverage each other. What does this mean for organizations? First, managers need to understand the competency maturity profiles of the analysts on the current team before recruiting additional analysts. Second, organizations need to ensure they encourage decision-makers to work closely with analysts; that is, create a team environment that encourages the generation of insight and enhances strategic decision performance. In some organizations, this may lead to a change in culture to enable the development of close working relationships between decision-makers and technical analysts. A second contribution relates to the significant relationship between data warehouse usage and insight gained. Organizations need to ensure that data analysts are provided with the necessary resources and given the proper incentives to encourage data exploration prior to analysis. These resources may include a shared database of past analysis results, detailed meta-data, and availability of external data sources. Incentives need to recognize and reward analysts for generating insight related to strategic decisions. Another recommendation is that organizations create a standard operating procedure that requires decision-makers to request information with longer time frames. This will give analysts the opportunity to take the time to explore more attributes in the data warehouse. In most current circumstances, decision-makers request information and demand delivery in a short time frame.
These resources and incentives may work to encourage data warehouse usage and in effect, increase the degree of insight gained and decision performance.
8.2. Contributions to Research

Methodologically, this study made two contributions. First, the dual respondent survey design proved to be an effective technique for capturing details related to behaviors involving more than one individual. Even with the added complexity of dual respondents, the response rate was average (44 percent), and 77 percent of analysts chose the dual response method over completing the survey from a single perspective. Only five surveys were
damaged as a result of the complexity involved in survey administration. The second contribution was the introduction and validation of new measures for data warehouse usage and insight gained. From a theoretical perspective, this study provided further support for Triandis’ [48] theory, specifically with respect to the influence of technical expertise on usage behavior and system expertise on interpretation behavior. Another contribution was the introduction of insight gained as an important construct in the multi-dimensional success typology for data warehousing. It is possible that this construct will be valuable in understanding success for other forms of decision support technology.
9. Conclusion

This study provided support for the assertion that the more sophisticated the data storage system, the more complex the process of data analysis and interpretation. Effective data analysis today requires organizations to design and create analytical teams that have the maturity to generate insight from both data warehouse usage and problem-solving skills. It also requires organizations to create an environment that encourages and enables decision-makers and analysts to work together in a relationship that is most effective at generating insight and enhancing decision performance. It is likely that the forms of these relationships will vary depending not only on the maturity of the analyst, but also on the skills of the decision-maker. That being said, managers need to first understand the expectations of the decision-makers. For example, if decision-makers only want analysts to generate insight from the data and have little expectation of further interpretation, it will be important to create an analytical team strong in technical competence. In this case, the decision-maker would need to provide a strong lead for the analyst to make sure the final analysis aligns with the business problem. Using the dance typology, this relationship may be similar to a waltz. In a waltz style, the decision-maker defines the necessary steps and concentrates on leading the analyst to a desired outcome. If the decision-maker prefers to work more closely with the analyst to take advantage of the insight gained through data analysis, as well as the analyst's problem-solving expertise, the decision-maker may need to share the lead role. This dance style is more similar to a tango. In this relationship, each partner has an understanding of their partner's next steps, but is responsible for making their own moves to contribute to the final performance. In order to achieve this level of dance, the analytical team needs to include analysts higher in system expertise maturity.
Analysts require system expertise to develop a problem-solving methodology, or a series of dance steps.
It is clear that enabling effective data analysis and interpretation in today's environment extends well beyond training. This paper highlights the importance of human process development, presented through the analogy of a "dance". Two individuals (i.e., the decision-maker and the analyst) start by identifying their own preferences and abilities. They then work together to understand how each will behave, and try to identify a form (i.e., a dance style) that best serves their ability to work together toward a goal. It is likely that different forms (e.g., a tango or a waltz) will have different impacts on insight gained and decision performance. This study succeeded in identifying the maturity model of the analyst; much more work is required to describe the components, processes, and outcomes of "the dance".
References

[1] Alavi, M. and E.A. Joachimsthaler. 1992. Revisiting DSS implementation research: A meta-analysis of the literature and suggestions for researchers. MIS Quarterly, March: 95-116.
[2] Barclay, D., C. Higgins, and R. Thompson. 1994. The partial least squares approach to causal modeling: Personal computer adoption and use as an illustration. Technology Studies: Special Issue on Research Methodology.
[3] Benbasat, I., A.S. Dexter and R.W. Mantha. 1980. Impact of organizational maturity on information systems skills needs. MIS Quarterly, 4(1): 21-34.
[4] Bergeron, F. and L. Raymond. 1992. Evaluation of EIS from a managerial perspective. Journal of Management Information Systems, 2: 45-60.
[5] Brohman, M.K., M. Parent, M.R. Pearce, and M. Wade. 2000. The business intelligence value chain: Data-driven decision support in a data warehouse environment: An exploratory study. Thirty-third Hawaii International Conference on System Sciences (HICSS), Maui.
[6] Cohen, J. 1960. A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20(1): 37-46.
[7] DeLone, W.H. and E.R. McLean. 1992. Information systems success: The quest for the dependent variable. Information Systems Research, 3(1): 60-95.
[8] Edelstein, H.A. 1998. Data mining: The state of practice. Proceedings of the Annual Leadership Conference, The Data Warehouse Institute: 63-97.
[9] El-Sawy, O.A. and T.C. Pauchant. 1988. Triggers, templates and twitches in the tracking of emerging strategic issues. Strategic Management Journal, 9(5): 455-473.
[10] Fishbein, M. and I. Ajzen. 1975. Belief, Attitude, Intention, and Behavior: An Introduction to Theory and Research. Reading, MA: Addison-Wesley.
[11] Franz, D.R., D. Robey, and R.R. Koeblitz. 1986. User response to an online I/S: A field experiment. MIS Quarterly, 10(1): 29-44.
[12] Fuerst, W. and P.H. Cheney. 1982. Factors affecting the perceived utilization of computer-based decision support systems in the oil industry. Decision Sciences, 13(4): 554-569.
0-7695-0981-9/01 $10.00 (c) 2001 IEEE
[13] Goodhue, D.L. 1995. Understanding user evaluations of information systems. Management Science, 41(12), December: 1827-1844.
[14] Gray, P. and H.J. Watson. 1998. Decision Support in the Data Warehouse. Upper Saddle River, NJ: Prentice Hall.
[15] Hamilton, S. and N.L. Chervany. 1981. Evaluating information system effectiveness – Part 1: Comparing evaluation approaches. MIS Quarterly, September: 55-69.
[16] Igbaria, M., T. Guimaraes, and G.B. Davis. 1995. Testing the determinants of microcomputer usage via a structural equation model. Journal of Management Information Systems, 11(4): 87-114.
[17] Lee, D.M.S. 1986. Usage patterns and sources of assistance for personal computer users. MIS Quarterly, 10(4): 293-304.
[18] Lee, S.M., Y.R. Kim, and J. Lee. 1995. An empirical study of the relationship among end-user information systems acceptance, training, and effectiveness. Journal of Management Information Systems, 12(2): 189-202.
[19] Leidner, D.E. and J.J. Elam. 1994. Executive information systems: Their impact on executive decision-making. Journal of Management Information Systems, 10(3): 139-155.
[20] Leitheiser, R.L. 1992. MIS skills for the 1990s: A survey of MIS managers' perceptions. Journal of Management Information Systems, 10(3): 139-155.
[21] Lucas, H.C. Jr. 1993. The business value of information technology: A historical perspective and thoughts for future research. In Strategic Information Technology Management: Perspectives on Organizational Growth and Competitive Advantage, R.D. Banker, R.J. Kauffman, and M.A. Mahmood (eds.), Harrisburg, PA: Idea Group Publishing: 359-374.
[22] Marcolin, B.L. 1994. The impact of users' expectations on the success of information technology implementation. Unpublished doctoral dissertation: University of Western Ontario.
[23] Markus, M.L. and C. Soh. 1993. Banking on information technology: Converting IT spending into firm performance. In Strategic Information Technology Management: Perspectives on Organizational Growth and Competitive Advantage, R.D. Banker, R.J. Kauffman, and M.A. Mahmood (eds.), Harrisburg, PA: Idea Group Publishing Inc.: 375-404.
[24] McKeen, J.D., H.A. Smith, and M. Parent. 1997. An integrative research approach to assess the business value of information technology. In Measuring Information Technology Payoff: Contemporary Approaches, M.A. Mahmood (ed.), Harrisburg, PA: Idea Group Publishing Inc.
[25] McLean, E.R. 1979. End users as application developers. MIS Quarterly, 3(4): 37-46.
[26] Meador, C.L., M.J. Guyote and P.G.W. Keen. 1984. Setting priorities for DSS development. MIS Quarterly, 8(2): 117-129.
[27] Moore, G.C. and I. Benbasat. 1991. Development of an instrument to measure the perceptions of adopting an information technology innovation. Information Systems Research, 2(3): 192-222.
[28] Nelson, R.R. 1991. Educational needs as perceived by IS and end-user personnel: A survey of knowledge and skill requirements. MIS Quarterly, December: 503-525.
[29] Nelson, R.R. and P.H. Cheney. 1987. Educating the CBIS user: A case analysis. Database, 18(2): 11-16.
[30] Nunnally, J. 1978. Psychometric Theory (2nd ed.). New York, NY: McGraw Hill.
[31] Rainer, R.K. Jr. and H.J. Watson. 1995. The keys to executive information system success. Journal of Management Information Systems, 12(2): 83-98.
[32] Reich, B.H. 1995. Entry-level jobs for MIS graduates and implications for academic programs. Working paper: Simon Fraser University.
[33] Reich, B.H. and I. Benbasat. 1990. An empirical investigation of factors influencing the success of customer-oriented strategic systems. Information Systems Research, 1(3): 325-347.
[34] Rist, R. 1997. Challenges faced by data warehousing pioneers. Journal of Data Warehousing, 2(1): 63-72.
[35] Rivard, S. and S.L. Huff. 1985. An empirical study of users as application developers. Information and Management, 8(2): 89-102.
[36] Robins, G. 1992. Productivity: A look at technology ROI. Stores, October: 48-52.
[37] Rockart, J.F. and L.S. Flannery. 1983. The management of end-user computing. Communications of the ACM, 26(10): 776-784.
[38] Sakaguchi, R. and M.N. Frolick. 1997. A review of the data warehouse literature. Journal of Data Warehousing, 2(1): 34-54.
[39] Silver, M.S. 1991. Systems that Support Decision Makers: Description and Analysis. New York: John Wiley & Sons.
[40] Snitkin, S.R. and W.R. King. 1986. Determinants of the effectiveness of personal decision support systems. Information and Management, 10(2): 83-89.
[41] Steiger, D.M. 1998. Enhancing user understanding in a decision support system: A theoretical basis and framework. Journal of Management Information Systems, 15(2): 199-220.
[42] Taylor, S. and P.A. Todd. 1995. Understanding information technology usage: A test of competing models. Information Systems Research, 6(2), June: 144-176.
[43] Thomas, J.B. and R.R. McDaniel. 1990. Interpretation of strategic issues: Effects of strategy and the information processing structure of top management teams. Academy of Management Journal, 33(2): 286-306.
[44] Thompson, R.L., C.H. Higgins and J.M. Howell. 1994. Influence of experience on personal computer utilization: Testing a conceptual model. Journal of Management Information Systems, 11(1): 167-187.
[45] Todd, P. and I. Benbasat. 1991. An empirical investigation of the impact of computer-based decision aids on decision-making strategies. Information Systems Research, 2(2): 87-115.
[46] Todd, P., J.D. McKeen, and R.B. Gallupe. 1992. The evolution of IS job skills: A content analysis of IS job advertisements from 1970-1990. Datamation, November: 14-16.
[47] Triandis, H.C. 1971. Attitude and Attitude Change. New York, NY: John Wiley and Sons.
[48] Triandis, H.C. 1980. Values, attitudes, and interpersonal behavior. Nebraska Symposium on Motivation, 1979: Beliefs, Attitudes and Values. Lincoln, NE: University of Nebraska Press: 195-259.
[49] Trice, A.W. and M.E. Treacy. 1986. Utilization as a dependent variable in MIS research. Proceedings of the Seventh International Conference on Information Systems, San Diego, California: 227-239.
[50] Vandenbosch, B. and C.A. Higgins. 1996. Information acquisition and mental models: An investigation into the relationship between behavior and learning. Information Systems Research, 7(2), June: 198-214.
[51] Vandenbosch, B. and S.L. Huff. 1997. Searching and scanning: How executives obtain information from executive information systems. MIS Quarterly, March: 81-105.
Appendix A: Dual-Respondent Survey Technique

A pilot study was used to test whether analysts could provide accurate responses to questions about the impact of the analysis on performance (i.e., new insight). Results showed that analysts overestimated actual performance. The dual-respondent survey technique addresses this inaccuracy by first requiring the analyst to choose a specific task and complete the non-performance questions in the survey. Once complete, the analyst wrote a description of the task at the top of the following page, which contained the performance-related questions (i.e., insight gained and decision performance). The analyst then sealed the pages containing the non-performance questions and forwarded the questionnaire to the decision-maker involved in the task described. The decision-maker answered the performance questions related to the specific task and returned the completed questionnaire to the researcher.

Appendix B: EDA, SDA, and Insight Gained Measurement Instrument

Exploratory Data Analysis (EDA)
Q27: How many attributes did you transform to prepare data for analysis?

Structured Data Analysis (SDA)
Q30: How many ad hoc queries did you execute to analyze the data?
Q31: How many attributes were compared and analyzed using graphs (pie charts, bar graphs, etc.) during analysis?
Q32: How many attributes were analyzed using mathematical formulas (e.g., ROI, margins) during analysis?
Q33: How many attributes were analyzed using statistical techniques (chi-square, regression, etc.) during analysis?
Q34: Once structured data analysis was complete, how many attributes did you further analyze by "drilling down" to a more detailed level of data?

Insight Gained
Q104: The analysis completed using the database and its tools was a valuable aid to me in gaining new insight related to the task at hand.
Q105: The analysis completed had a large, positive impact on the level of explanation reported related to the task at hand.
Q106: The analysis completed had a large, positive impact on the quality of recommendations reported related to the task at hand.
Q107: The analysis completed had a large, positive impact on my ability to interpret the problem by exploring alternative explanations related to the task.