CALL FOR ARTICLES

Journal of Cases on Information Technology
An official publication of the Information Resources Management Association
MISSION:
The Journal of Cases on Information Technology (JCIT) is an international refereed journal whose mission is to provide understanding of, and lessons learned from, all aspects of information technology utilization and management in modern organizations. Teaching cases published in JCIT are intended primarily for teaching purposes in both undergraduate and graduate information systems courses. JCIT also strives to publish teaching cases from which Information Systems managers can learn, drawing on the successes and pitfalls of other organizations' IT utilization and management.
COVERAGE/MAJOR TOPICS:
The Journal of Cases on Information Technology (JCIT) documents comprehensive, real-life teaching cases based on individual, organizational, and societal experiences related to the utilization and management of information technology. Teaching cases published in JCIT deal with a wide variety of organizations such as businesses, government organizations, educational institutions, libraries, non-profit organizations, and so forth. Additionally, teaching cases published in JCIT report not only successful utilization of IT applications, but also failures and mismanagement of IT resources and applications. Topics to be discussed in this journal include (but are not limited to) the following:
• Data management
• Distance learning
• E-business
• E-commerce technologies
• E-government
• E-learning technologies
• E-services
• End user computing
• Human side of IT
• Information security and ethics
• Internet technologies
• Issues of emerging technology
• IT in business
• IT in developing countries
ISSN 1548-7717 eISSN 1548-7725 Published quarterly
• IT in government
• IT in libraries
• IT in organizations
• IT in small and medium-sized enterprises (SMEs)
• IT in the classroom
• IT in the healthcare industry
• Legal issues of IT
• Multimedia in education
• Social networks
• Web-enabled technologies
All submissions should be sent via the online submission system: www.igi-global.com/authorseditors/titlesubmission/newproject.aspx

Ideas for Special Theme Issues may be submitted to the Editor-in-Chief at
[email protected]
Please recommend this publication to your librarian. For a convenient, easy-to-use library recommendation form, please visit: http://www.igi-global.com/JCIT
JOURNAL OF CASES ON INFORMATION TECHNOLOGY
January-March 2013, Vol. 15, No. 1
Table of Contents

Special Issue on Effective eLearning Practices iv
Anabela Mesquita, Polytechnic of Porto, Porto, Portugal, & Algoritmi Research Center, Minho University, Braga, Portugal Paula Peres, Polytechnic of Porto, Porto, Portugal
Research Articles
1
Assessing the Effectiveness of an E-Learning Framework: The Portuguese Insurance Academy Case Nuno Pena, ADVANCE Research Center - ISEG and UnYLeYa, Lisbon, Portugal Pedro Isaías, Universidade Aberta and ADVANCE Research Center-ISEG, Lisbon, Portugal
19
A B-Learning Methodology Case for Faculty at High Education Lina García-Cabrera, Computing School, Department of Computer Science, University of Jaén, Jaén, Spain Ildefonso Ruano-Ruano, Escuela Politécnica Superior de Linares, Department of Telecommunication Engineering, University of Jaén, Jaén, Spain José Ramón Balsas-Almagro, Computing School, Department of Computer Science, University of Jaén, Jaén, Spain
36
Developing Independent Learning Skills for Postgraduate Students through Blended Learning Environment Ing Liang Wong, School of Engineering and Built Environment, Glasgow Caledonian University, Glasgow, Scotland, UK
51
Education Portal on Climate Change with Web GIS Client Vilém Pechanec, Department of Geoinformatics, Palacký University, Olomouc, Czech Republic Aleš Vávra, Department of Geoinformatics, Palacký University, Olomouc, Czech Republic
69
Bridging the Knowledge Gap in Management and Operations of Transfusion Medicine: Planning, Policy and Leadership Issues
Cees Th. Smit Sibinga, ID Consulting for International Development of Transfusion Medicine (IDTM), University of Groningen, Groningen, The Netherlands
Maruff Akinwale Oladejo, Department of Educational Administration, Faculty of Education, University of Lagos, Akoka, Lagos State, Nigeria
83
Health Learning Practices in Adolescents Using Physical Activity
Kelly O’Hara, Department of Sport Science, Research Centre in Sport Science, Health Science and Human Development, University of Beira Interior, Vila Real, Portugal
Dulce Esteves, Department of Sport Science, Research Centre in Sport Science, Health Science and Human Development, University of Beira Interior, Vila Real, Portugal
Rui Brás, Department of Sport Science, Research Centre in Sport Science, Health Science and Human Development, University of Beira Interior, Vila Real, Portugal
Marco Rodrigues, Department of Sport Science, University of Beira Interior, Vila Real, Portugal
Ricardo Rodrigues, Department of Business and Economics, NECE Research Centre, University of Beira Interior, Vila Real, Portugal
Paulo Pinheiro, Department of Business and Economics, NECE Research Centre, University of Beira Interior, Vila Real, Portugal
Copyright
The Journal of Cases on Information Technology (ISSN 1548-7717; eISSN 1548-7725). Copyright © 2013 IGI Global. All rights, including translation into other languages, reserved by the publisher. No part of this journal may be reproduced or used in any form or by any means without written permission from the publisher, except for noncommercial, educational use including classroom teaching purposes. Product or company names used in this journal are for identification purposes only. Inclusion of the names of the products or companies does not indicate a claim of ownership by IGI Global of the trademark or registered trademark. The views expressed in this journal are those of the authors but not necessarily of IGI Global. JCIT is currently listed or indexed in: ABI/Inform; ACM Digital Library; Aluminium Industry Abstracts; Cabell's Directories; Ceramic Abstracts; Compendex (Elsevier Engineering Index); Computer & Information Systems Abstracts; Corrosion Abstracts; CSA Civil Engineering Abstracts; CSA Illumina; CSA Mechanical & Transportation Engineering Abstracts; DBLP; DEST Register of Refereed Journals; Electronics & Communications Abstracts; Engineered Materials Abstracts; Gale Directory of Publications & Broadcast Media; GetCited; Google Scholar; Information Science Abstracts; INSPEC; JournalTOCs; KnowledgeBoard; Library & Information Science Abstracts (LISA); Materials Business File - Steels Alerts; MediaFinder; Norwegian Social Science Data Services (NSD); PubList.com; SCOPUS; Solid State & Superconductivity Abstracts; The Index of Information Systems Journals; The Informed Librarian Online; The Standard Periodical Directory; Ulrich's Periodicals Directory
Journal of Cases on Information Technology, 15(1), 1-18, January-March 2013 1
Assessing the Effectiveness of an E-Learning Framework: The Portuguese Insurance Academy Case

Nuno Pena, ADVANCE Research Center - ISEG and UnYLeYa, Lisbon, Portugal
Pedro Isaías, Universidade Aberta and ADVANCE Research Center - ISEG, Lisbon, Portugal
EXECUTIVE SUMMARY
Effectiveness, a major concern in corporate e-Learning, is particularly decisive when projects face financial as well as time-to-market constraints. It is also important when projects target a range of attendees that is socio-demographically and geographically dispersed. This paper describes a case study on the assessment of the effectiveness of the IPTEACES framework, a new instructional design framework conceived to facilitate e-Learning by reducing diversity in programmes facing a non-homogeneous audience. It was applied to the insurance intermediaries' certification course in Portugal, whose attendees came from sixteen different corporations related to the insurance and banking industries.

Keywords: Banking Industry, E-Learning, IPTEACES, Portugal, Time-To-Market Constraints
ORGANIZATION BACKGROUND
The Portuguese Association of Insurers (PAI), founded in 1982, is a non-profit employers' association of the insurance and reinsurance companies operating in the Portuguese market, irrespective of their legal nature or country of origin. The members of PAI presently account for 99% of the insurance market in terms of business turnover and human resources employed by the sector. PAI's training organization, the "Portuguese Insurance Academy" (PIA), where this case study is based, serves nearly 12,000 insurance workers as well as nearly 35,000 insurance intermediaries and other direct and indirect insurance stakeholders. PIA's intervention in the field of professional training is intended to serve the needs of the market from a double standpoint: first, to meet the training needs of the professionals working in the market and, second, to meet the needs of all those who do not work in the insurance sector per se, but come into regular contact with it in the course of their work and therefore need to understand insurance mechanisms and be made aware of the sector's main products. With the publication of "Regulatory Rule 17/2006-R," specifically with regard to qualification courses for Insurance Intermediaries (resulting from an implementation of the EU
DOI: 10.4018/jcit.2013010101 Copyright © 2013, IGI Global. Copying or distributing in print or electronic forms without written permission of IGI Global is prohibited.
directive on insurance mediation), it became mandatory for all new insurance intermediaries to attend a certification course. This certification targeted a socio-demographically diverse and geographically dispersed range of attendees, which demanded a new approach to e-Learning instructional design. To develop this training and certification solution in an e-Learning format (having as a formal requirement a final face-to-face examination), it was considered vital to design a specific and proprietary e-Learning "framework" which could embody the relevant "learning principles" and fit, as far as possible, the diversity and heterogeneity of attendees in terms of age, gender, educational background, previous knowledge in the area, literacy, computer proficiency, organizational culture, motivations, values, and experience or inexperience in e-Learning. This framework was primarily inspired by a pedagogical benchmark (mainly Gagné's Nine Events of Instruction (1992), Merrill's Principles of Learning (2002, 2007), Keller's ARCS model (2008) and van Merriënboer's Ten Steps to Complex Learning (2007)), as well as by close observation of award-winning e-courses (e.g. Brandon Hall Excellence in Learning Awards, International E-Learning Association Awards) and corporate e-Learning best practices (e.g. Bersin & Associates reports). With this in mind, we conceived and designed an instructional design framework that could materialize, largely in a single approach, an appropriate learning strategy for different learners, fitting different learning preferences and respecting other specific differences.
Brief Overview of IPTEACES Framework

The IPTEACES framework is divided into the following phases (Pena & Isaias, 2010a, 2010b) (Figure 1):
SETTING THE STAGE

Due to the heterogeneity of the e-Course's target audience, it was important to create a new instructional design framework, conceived to facilitate e-Learning by reducing diversity in e-Learning programmes applied to a non-homogeneous audience.
• Front-End Procedures: In order to turn technological prerequisites (often a problem for users) into intuitive information for the learner (Boyd, 2004; Schrum & Hong, 2002a, 2002b), this phase is divided into two areas: "Browser Check" and "Help Desk". "Browser Check" is a functionality which automatically diagnoses the student's browser configuration and indicates the need for any particular software installation or configuration. The "Help Desk" was created to help students with any doubts along the course; with this in mind, students are invited throughout the course to contact the Help Desk team either by phone or by email.
• Student E-Learning Kit – Manuals, Quick Reference Guides and FAQs: It is important for the student to have access to information on how to log on and navigate both the Learning Management System and the course. Interactive tools are provided to the students to address this aspect.
• Pedagogical Strategies (Specification of the IPTEACES Framework):
◦ Involvement: This phase aims to immerse the student in the context of a real business or corporate scenario, where he is confronted with a problem (Merrill, 2002; 2007). From a pedagogical point of view, the goal is to gain the attention of the student (cf. Gagné's first event, "Gaining Attention"; Keller's first principle of ARCS: "Motivation to learn is promoted when a learner's curiosity is aroused due to a perceived gap in current knowledge").
◦ Preparation: This phase is divided into two complementary phases:
Figure 1. The IPTEACES framework
Presentation of "Program and Objectives" and "Contextualization and Activation":
▪ Program and Objectives: Presentation of the program, the objectives and what is expected of the student (cf. Gagné's second event, "Informing the Learner of the Objective"; Keller's second principle: "Motivation to learn is promoted when the knowledge to be learned is perceived to be meaningfully related to one's goals").
▪ Contextualization and Activation: The goal is to provide an introduction, a contextualization or a reminder of the subject so the student can activate prior knowledge (cf. Gagné's third event, "Stimulating Recall of Prerequisite Learned Capabilities"; Merrill's Activation principle).
◦ Transmission: This phase is divided into three complementary moments: Acquisition (learning content), Systematization and Formative Assessment.
Acquisition is the central phase, in which the course's learning content is presented to the learner (Gagné's fourth event, "Presenting the Stimulus Material"). After presenting part of the new learning content, it is advisable to carry out a systematization through a summary of the concepts and ideas covered. It is also advisable, at the end, to create a graphical representation of the connection between those concepts and ideas, i.e., the new learning material – for example, using "concept maps" or "dynamic diagrams". The student should also be able to know whether he has understood what was explained throughout the content; for this, there should be an exercise or a set of questions in a formative assessment before he can proceed with the course.
• Exemplification and Demonstration: This phase is mainly based on Merrill's (2002; 2007) "demonstration principle" and is divided into three sub-strategies: Real Case, Step-by-Step Demo and Ask the Expert.
◦ A Real Case is an exemplification based on real cases and real situations. It aims to confront learners with authentic real-life situations, illustrating the relevance of the content and demonstrating the application of what was learnt.
◦ The Step-by-Step Demo is a type of guided exemplification (cf. Gagné's fifth event, "Providing Learning Guidance") that seeks to illustrate the decomposition of a problem into phases and components; this requires a detailed, explained analysis of the parts that make up the complexity of a situation or problem.
◦ Ask the Expert addresses a more complex situation or problem: it is a structured example in which a student, in some areas of the course, may be faced with a problem and may ask the expert for advice on how the problem could be solved.
• Application and Transfer: This phase focuses on maximizing the transfer of learning, in order to foster the ability to easily apply what has been learned to new situations (cf. Gagné's sixth and seventh events, "Eliciting Performance" and "Providing Feedback"; Keller's third principle, Confidence; and Merrill's Application principle: learning is promoted when learners engage in the application of their newly acquired knowledge or skill in a way consistent with the type of content being taught).
• Connection: This phase focuses on mentoring, collaboration and tools.
◦ Asynchronous Mentoring: An email functionality integrated inside the course was developed so that the student can question their tutor.
Each screen in the course has a unique identification code. This helps the tutor know which screen the student was on when writing the email and, consequently, makes it easier to identify what the student is referring to or how the question arose.
◦ Collaboration: There are two types of discussion forums available: supervised discussion forums and peer discussion forums.
◦ Tools: Here students can find an alphabetical list of terms, job aids, documentation, worksheets, etc.
• Evaluation: Autoscopy and Summative Evaluation: At the end of each learning module, the student is invited to complete an autoscopy (self-assessment). The intention is to analyze whether, strictly from the student's point of view, he feels he has achieved the learning objectives or not.
Upon completing the modules, students must perform a final assessment. The intention of this test, a summative evaluation, is to objectively assess whether the student has achieved the specific objectives of each learning module. After completing it, the student obtains detailed feedback on the results of the summative assessment. Students are able to see their classification (score) and which of their answers were correct or incorrect; they can also compare their wrong answers with the correct answers and, in the end, the application creates a learning path directly connected to the contents related to the learning gap. Students only continue to the Simulation phase if they successfully pass this evaluation (usually, the minimum score set is 70%, meaning that students have to correctly answer 70% of the questions). This phase is based directly on Gagné's eighth event, "Assess Performance", as well as on Keller's fourth principle: "Motivation to learn is promoted when learners anticipate and experience satisfying outcomes to a learning task"
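As an illustration of the scoring rule just described, here is a minimal sketch of a summative-evaluation check. The function name, question identifiers and data layout are hypothetical; the case does not describe the actual assessment engine, only the 70% threshold and the per-question feedback.

```python
# Hypothetical sketch of the summative-evaluation logic described in the case:
# score the test, apply the (usual) 70% pass threshold, and report which
# answers were wrong so a remedial learning path can target the gaps.

PASS_THRESHOLD = 0.70  # "usually, the minimum score set is 70%"

def grade_summative(answers, answer_key):
    """Return (score, passed, wrong_question_ids)."""
    wrong = [qid for qid, correct in answer_key.items()
             if answers.get(qid) != correct]
    score = 1 - len(wrong) / len(answer_key)
    return score, score >= PASS_THRESHOLD, wrong

score, passed, gaps = grade_summative(
    {"q1": "a", "q2": "b", "q3": "c", "q4": "d", "q5": "a"},
    {"q1": "a", "q2": "b", "q3": "c", "q4": "a", "q5": "a"},
)
print(f"score={score:.0%} passed={passed} review={gaps}")
# → score=80% passed=True review=['q4']
```

The returned list of wrong question identifiers corresponds to the "learning path" the application builds from the student's learning gap.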
– which is represented in the ARCS model by Satisfaction. It is necessary for learners to have positive feelings about their learning experiences and to develop continuing motivation to learn.
• Simulation: A simulation exam was created, similar to the one candidates need to pass in the face-to-face examination after successfully completing all the e-Learning modules. This phase takes into account Gagné's ninth event ("Enhance Retention and Transfer") and especially Merrill's Integration principle: learning is promoted when learners integrate their new knowledge into their everyday life by being directed to reflect on, discuss, or defend their new knowledge or skill.
This framework was applied in the certification course for insurance intermediaries, with a total of 3,726 certified intermediaries from sixteen different corporations related to the insurance and banking industries. Table 1 presents a brief characterization of the students who attended the course.
Brief Characterization of Success Indicators: Approvals, Failures, Drop-Outs and Satisfaction Evaluation

Among the 3,726 learners who attended the intermediaries' certification course, 3,542 passed the course (an approval rate of 95.0%) and 184 failed (a failure rate of 4.9%). More precisely, across the three exam sessions, 3,100 learners (83.2%) passed the first time they took the exam, 382 learners passed in the second exam session (10.2%) and, finally, 60 learners in the third exam session (1.6%). The global average score of the learners was 82.5%, with a standard deviation of 11.0, which shows a high variability in the learners' results. Concerning dropouts, 25 learners (0.7%) did not conclude the educational process. Among the 3,726 learners, 1,770 answered the satisfaction survey, a response rate of 50.2%. The analysis of the answers showed that, in general, the learners were satisfied or very satisfied with the course, ranking their answers above 3 on a 4-point Likert scale. It is also important to mention that the majority (76.9%) of the students had no previous e-Learning experience in a professional context.
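The success indicators above follow directly from the raw counts; the sketch below recomputes them (small rounding differences from the figures reported in the case are possible):

```python
# Recompute the reported success indicators from the raw counts in the case.
total = 3726                                    # certified intermediaries enrolled
passed_by_session = {"1st": 3100, "2nd": 382, "3rd": 60}
failed, dropouts = 184, 25

passed = sum(passed_by_session.values())        # 3,542 approvals
print(f"approval rate: {passed / total:.1%}")   # ~95.1%
print(f"failure rate:  {failed / total:.1%}")   # ~4.9%
print(f"dropout rate:  {dropouts / total:.1%}") # ~0.7%
for session, n in passed_by_session.items():
    print(f"  passed on {session} attempt: {n / total:.1%}")
```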
Table 1. Brief characterization of the students' demographic indicators

Industry:
From a total of 3,726 learners, 1,614 (43.3%) came from the insurance industry and 2,112 (56.7%) belonged to the banking industry.
Gender:
Both genders were distributed in very similar numbers, although male learners have a slightly higher representation, totalling 1,953 learners (52.4%), in comparison with 1,773 (47.6%) female learners.
Age groups:
The average age of the learners was 34 years old, with a standard deviation of 8.8 years. The data showed high variability in the learners' ages, which ranged from 18 to 71 years old. That said, the age group between 24 and 34 years old contained the highest number of learners.
Academic Qualifications:
Data showed high variability due to the existence of learners with different education levels. Secondary education is the level that includes the most individuals, with a total of 1,607 learners (43.1%), followed by the undergraduate level with 1,447 learners (38.8%) and primary education with 522 learners (14.0%).
Residency:
The learners’ place of residence also showed a high variability, since learners came from different parts of the Portuguese territory – in this course, there were students from the 18 Portuguese regions.
CASE DESCRIPTION

Because of the number of trainees registered and the need to create a relevant and successful e-Course, finding "indicators" of e-Course success was of major importance. In the field of learning, education, and training and, specifically, e-learning, quality has become an issue of increasing importance in both the researchers' and practitioners' communities (Pawlowski, 2007). In the context of this e-Learning project (the Insurance Intermediaries certification e-Course through the application of the IPTEACES framework), the goal was to evaluate whether the service provided (e-Course and e-Learning system) met the needs and expectations of the customers (students, as the key stakeholders). Many scholars suggest that students' satisfaction with e-learning is an important factor in measuring the success or effectiveness of such a medium (Alavi, Wheeler, & Valacich, 1995; Bures, Abrami, & Amundsen, 2000; Hiltz & Johnson, 1990; Piccoli, Ahmad, & Ives, 2001; Swan, Shea, Fredericksen, Pickett, & Pelz, 2000). Although extensive research has been done on the effect of user satisfaction on information systems' effectiveness, research on the relationship between the value and satisfaction constructs for assessing a system's true effectiveness is lacking (Levy, 2006). In the context of e-Learning systems effectiveness assessment, Levy (2006; 2009) developed an investigation in which he asked students about the characteristics of e-Learning systems that they valued and considered important during their learning experience. This investigation also attempted to understand the relationship between the value that learners attribute to e-learning systems and the satisfaction learners experience with those systems. This author states that it is not the number of satisfied students or the level of satisfaction that suggests the system's effectiveness.
Rather, it is the extent to which students are satisfied by the system's performance on what they perceive as important. The Information Systems literature defines satisfaction as the perceived performance level
students find, at a post-experience point of time, with e-learning systems (Doll & Torkzadeh, 1991), whereas, according to Value Theory, value is defined as an enduring core belief about the level of importance students attribute to the e-learning system (Rokeach, 1969: 160). Levy (2006; 2009) proposed measures of learners' perceived value and learners' perceived satisfaction for assessing the true effectiveness of an e-learning system – here defined as the "entire technological, organizational, and management system that facilitates and enables students learning via the Internet" (Levy & Murphy, 2002). E-learning systems are considered effective when learners value their characteristics as highly important and are highly satisfied by those same characteristics. Levy (2006; 2009) also proposed a set of characteristics that learners found important, or valued, when using e-Learning systems. The list of e-Learning system characteristics was built primarily from an exhaustive review of the literature and subsequently through exploratory focus groups, as well as through a qualitative questionnaire. Levy (2006) developed an assessment of such characteristics using a survey instrument. The survey instrument item scales used were: Satisfaction – Extremely unsatisfied (1), Very unsatisfied (2), Unsatisfied (3), Satisfied (4), Very satisfied (5) and Extremely satisfied (6); Importance – Not Important (1), Not so Important (2), Slightly Important (3), Important (4), Very Important (5) and Extremely Important (6). This survey, based upon prior validated measures from the education and Information Systems literature, included satisfaction and value items for each of the 48 e-learning system characteristics, as well as learners' overall value measure, overall satisfaction measure with the e-learning system and an overall perceived learning measure.
Due to the heterogeneity of the 48 e-Learning system characteristics proposed, Levy grouped them according to the four dimensions proposed by Webster and Hackley (1997): technology and support (14 characteristics), course (12 characteristics), professor (7 characteristics), and learner (15 characteristics).
In order to determine the level of effectiveness of the e-Learning project applied at PIA, it was decided to apply Levy's proposed methodology. However, because of the specificity of this e-Learning course (asynchronous e-Learning with a strong component of self-learning), the Professor dimension was removed, and therefore the seven e-Learning system characteristics directly linked with this dimension were extracted – this dimension is more pertinent to e-Learning courses of a different nature, such as synchronous e-Learning courses (videoconferences) or blended learning courses. With this in mind, the online questionnaire was composed of the other three dimensions, covering a total of 41 e-Learning system characteristics: the technology and support dimension (14 characteristics), the course dimension (12 characteristics) and the learner dimension (15 characteristics). Levy (2006; 2009) proposed two complementary benchmark tools based on the outputs of the questionnaire: the "Value-Satisfaction grid" and the "Learners' Value Index of Satisfaction" (LeVIS index).
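The dimension bookkeeping just described can be sketched as follows. Only the per-dimension counts are modelled, since the case does not enumerate the individual characteristic items:

```python
# Sketch of Levy's 48 e-Learning system characteristics grouped into
# Webster & Hackley's four dimensions, with the Professor dimension
# dropped for this asynchronous, self-learning course. Counts come from
# the case; the characteristic items themselves are not listed here.

dimensions = {
    "technology_and_support": 14,
    "course": 12,
    "professor": 7,
    "learner": 15,
}
assert sum(dimensions.values()) == 48  # Levy's full instrument

adapted = {d: n for d, n in dimensions.items() if d != "professor"}
print(sum(adapted.values()))  # 41 characteristics in the adapted questionnaire
```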
Value-Satisfaction Grid

The goal of the "Value-Satisfaction grid" is to provide an indication of the priorities for action and improvement for the e-Learning system dimensions and the e-Learning system characteristics. The "Value-Satisfaction grid" was developed in a similar manner to the S.W.O.T. (Strengths, Weaknesses, Opportunities, and Threats) analysis used by many marketing scholars. In the adaptation to the e-Learning context, the foundation of the "Value-Satisfaction grid" is the aggregated student-perceived satisfaction as well as the aggregated student-perceived value of e-Learning system characteristics. The grid is constructed by positioning the e-Learning system characteristics of each dimension, with the mean characteristic satisfaction score on the horizontal axis and the mean characteristic value score on the vertical axis. A grid was developed for each of the three dimensions; similarly, the "Value-Satisfaction grid" for the overall system was constructed. In this study, the measure scale ranges from 1 to 6. No scores below 3 in satisfaction and below 3 in value were found, resulting in the use of 4.5 as the cut-off point between low and high on both axes of the grid (Figure 2). The "Value-Satisfaction grid" does not provide, however, a measure for the magnitude of e-Learning system effectiveness and, therefore, should be complemented with another tool, the "LeVIS index".

Learners' Value Index of Satisfaction (LeVIS Index)

The "LeVIS index" proposed by Levy (2006) provides an overall index of learners' perceived effectiveness of e-Learning systems by combining e-Learning system value measures and e-Learning system satisfaction measures. It is proposed as a benchmarking tool, combining the learners' perceived value and satisfaction in order to indicate learners' perceived e-Learning system effectiveness. The "Value-Satisfaction grid" suggests that it is not sufficient that only value or only satisfaction measures are high; rather, it is the combination of both that matters. Consequently, the "LeVIS index" was proposed by Levy as the multiplication of the overall satisfaction (S◦) by the overall value (V◦), providing a score of the overall magnitude of the effectiveness of the e-Learning system under study. The two items (S◦ and V◦) are measured on a scale of 1 to 6, and the "LeVIS index" is calculated as:
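The LeVIS formula itself did not survive extraction. From the surrounding description – the product of S◦ and V◦, each on a 1-to-6 scale, yielding an index that runs from near 0 to 1 – a consistent reconstruction (an assumption, not verified against Levy's original text) is:

```latex
% Reconstructed LeVIS formula: normalizing each 6-point measure before
% multiplying maps the product into (0, 1], matching the interpretation
% that values near 0 mean low and values near 1 mean high effectiveness.
\mathrm{LeVIS} = \frac{S^{\circ}}{6} \times \frac{V^{\circ}}{6}
```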
The results provide the assessment of the magnitude of learners’ perceived effectiveness integrating all the learner’s dimension value measures and dimension satisfaction measures of the e-Learning system under study. The magnitude of LeVIS provides that, when LeVIS is
Copyright © 2013, IGI Global. Copying or distributing in print or electronic forms without written permission of IGI Global is prohibited.
8 Journal of Cases on Information Technology, 15(1), 1-18, January-March 2013
Figure 2. The value-satisfaction grid (adapted from Levy (2006; 2009))
near 0, it indicates very low learner-perceived e-Learning system effectiveness; when LeVIS is near 1, it indicates very high learner-perceived effectiveness. This measure ensures that if only one of the two measures (S° or V°) is high, the overall LeVIS score is not necessarily high. This means, as Levy noted, that a clear limitation of LeVIS is the equal importance given to both value and satisfaction.
Assessing the Effectiveness of IPTEACES E-Learning Framework
The adapted version of the online questionnaire was administered from May 2009 to June 2009, targeting 2,531 students distributed across the insurance and banking industries. The response rate was 52.03%, i.e., 1,317 trainees; 59.6% of the respondents came from the banking industry and 40.4% from the insurance industry.

Dimension A: Technology and Support

Considering the responses concerning Dimension A - Technology and Support, the average global satisfaction score, as attested by characteristic A15 - “Overall, how would you rate your level of satisfaction with technology and support?”, was 5.03 (standard deviation 0.83). The average global value score for Dimension A, characteristic A16 - “Overall, how important are technology and support to you when learning online?”, was 5.35 (standard deviation 0.77), confirming that students considered this dimension important. Table 2 shows the detailed scores (satisfaction, value and standard deviation) of the 14 e-Learning system characteristics that compose this dimension. Figure 3 shows that none of the e-Learning system characteristics of this dimension was considered as having low value and/or low satisfaction, given the overall average satisfaction of 5.00 and value of 5.31. Given their location in the quadrant analysis, characteristics A9 and A10 stand out as being highly valued and ones with which the trainees are very or extremely satisfied. On the other hand, characteristics A1, A2 and A3 have lower scores. Although learners value these characteristics, their satisfaction averages of 4.55, 4.62 and 4.72 show that the three characteristics linked with the “Help Desk” are the ones learners are least satisfied with.

Value-Satisfaction Grid of Dimension A: Technology and Support

The “Value-Satisfaction grid” for Dimension A - Technology and Support illustrates that all the e-Learning system characteristics of this
Table 2. Summary of detailed scores of e-Learning system characteristics of Dimension A - Technology and Support (satisfaction, value and standard deviation)
dimension are concentrated in the quadrant of high satisfaction and high value, i.e., “Q2 – effective quadrant”, although the characteristics relating to the Help Desk (A1, A2 and A3) show slightly lower ratings in both value and satisfaction. The e-Learning system characteristics A9, A10, A13, A7, A14, A12 and A6 have High Effectiveness (LeVIS index above 0.75), and characteristics A11, A8, A5, A4, A3, A2 and A1 (especially the last three) have Moderate Effectiveness (LeVIS index below 0.75).
LeVIS Index of Dimension A: Technology and Support

Given the results of the LeVIS index for Dimension A - Technology and Support, the global score shows high effectiveness, with an average of 0.7569 (to better discriminate LeVIS scores, the authors use four decimal places). Figure 4 shows that the e-Learning system characteristic A9 – “Learning at any time of the day (schedule flexibility)” emerges as the highest scored (0.8468), followed by characteristics A10 – “Submit assignments from anywhere (via the Internet)” and A13 – “Taking quizzes remotely (off-campus)”, with averages of 0.8325 and 0.7902 respectively. The typical asynchronous “anytime, anywhere” characteristics were highly appreciated by the students. On the other hand, as noted before, characteristics A1, A2 and A3 have scores below 0.70, revealing that these are the characteristics students identified as being least effective (A1 with an average score of 0.6345, A2 with 0.6559, and A3 with 0.6799).
Dimension B: Course

Dimension B - Course shows once again that students are satisfied or very satisfied with this dimension and with the corresponding 12 e-Learning system characteristics, given that the mean satisfaction and value scores are above point 5 on the rating scale. The global characteristic B13 - “Overall, how would you rate your level of satisfaction with online content of courses?” had an average score of 5.08 (standard deviation 0.82), while
Figure 3. Value-satisfaction grid of e-Learning system characteristics of Dimension A - Technology and Support
characteristic B14 - “Overall, how important is online content of courses to you when learning online?” had an average score of 5.36 (standard deviation 0.78). None of the e-Learning system characteristics of this dimension was considered as having low value or low satisfaction, given the overall average satisfaction of 5.03 and value of 5.34. Characteristics B12, B8 and B4 stand out with high average satisfaction and value scores. Characteristics B3 and B6 present slightly lower averages, particularly for satisfaction, where the results are 4.82 and 4.92 respectively, although the average value scores are higher for both B3 (5.25) and B6 (5.21).
Value-Satisfaction Grid of Dimension B: Course

The “Value-Satisfaction grid” for Dimension B - Course reveals that all the e-Learning system characteristics of this dimension are concentrated in the quadrant of high satisfaction and high value, i.e., “Q2 – effective quadrant” (and highly condensed, unlike Dimension A, where the e-Learning system characteristics are more scattered). This grid also shows that characteristics B12, B8, B4, B11, B7, B9 and B2 have High Effectiveness (LeVIS index above 0.75) and characteristics B9, B1, B6, B5 and B3 have Moderate Effectiveness (LeVIS index below 0.75) (Figure 5).
LeVIS Index of Dimension B: Course

Given the results of the LeVIS index for Dimension B - Course, the global score reveals high effectiveness, with an average of 0.7674. More specifically, according to Figure 6, the highest score belongs to the e-Learning system characteristic B12 – “Taking practice tests prior to graded test”, with an average of 0.8024, followed by characteristics B8 – “Ease-to-use
Figure 4. LeVIS index of e-Learning system characteristics of Dimension A - Technology and Support
(with course content, navigation, interface, etc.)” and B4 – “Interesting subject matter”, with averages of 0.7755 and 0.7710 respectively, which shows high effectiveness. On the other hand, characteristics B3 – “Amount of material in courses” and B6 – “Availability of other content (syllabus, objectives, assignments, schedule)” reveal the lowest scores (0.7180 and 0.7288 respectively), therefore showing moderate effectiveness.
Dimension D: Learner
Considering Dimension D - Learner, the average global satisfaction score, characteristic D16 – “Overall, how would you rate your level of satisfaction with the above items when learning online?”, was 5.10 (standard deviation 0.83), while characteristic D17 – “Overall, how important are the above items to you when learning online?” had an average score of 5.26 (standard deviation 0.83). Table 4
Table 3. Summary of detailed scores of e-Learning system characteristics of Dimension B - Course (satisfaction, value and standard deviation)
Figure 5. Value-satisfaction grid of e-Learning system characteristics of Dimension B - Course
shows the detailed scores (satisfaction, value and standard deviation) of the 15 e-Learning system characteristics of this dimension. None of the e-Learning system characteristics of this dimension was considered as having low value or low satisfaction, given the overall average satisfaction of 5.01 and value of 5.23. This dimension shows greater dispersion in the global evaluations (almost similar to Dimension A) when compared with Dimension B. On the other hand, a number of characteristics are more concentrated and stand out with values closer to high satisfaction and high value. As an example,
characteristics D11 – “Reduced travel cost/time (to and from campus)” and D12 – “Ability to travel while taking online courses (for business or other)” had slightly higher averages for satisfaction (5.33 and 5.28) and for value (5.45 and 5.39). E-Learning system characteristics D4, D3, D2 and D5 are more scattered, moving away from the top quadrant, particularly regarding satisfaction. Among these, D2 and D3 can be highlighted: although considered important by students, they had lower levels of satisfaction compared with the other characteristics (4.55 and 4.59, respectively).
Figure 6. LeVIS index of e-Learning system characteristics of Dimension B - Course
Value-Satisfaction Grid of Dimension D: Learner

The “Value-Satisfaction grid” for Dimension D - Learner illustrates that all e-Learning system characteristics of this dimension are concentrated in the quadrant of high satisfaction and high value, i.e., “Q2 – effective quadrant”, although the characteristics relating to the “class”
(D1, D3, D2, D4 and D5) show slightly lower ratings. The e-Learning system characteristics D11, D12, D15, D9, D14, D13, D7, D6, D8 and D10 have High Effectiveness (LeVIS index above 0.75) and characteristics D1, D3, D2, D4 and D5 (especially D5, D2 and D3) have Moderate Effectiveness (LeVIS index below 0.75) (Figure 7).
Table 4. Summary of detailed scores of e-Learning system characteristics of Dimension D - Learner (satisfaction, value and standard deviation)
LeVIS Index of Dimension D: Learner

Given the results of the LeVIS index for Dimension D - Learner, the global score reveals high effectiveness, with an average of 0.7578. Figure 8 shows that the e-Learning system characteristic D11 – “Reduced travel cost/time (to and from campus)” had the highest score, 0.8201. Characteristics D12 – “Ability to travel while taking online courses (for business or other)” and D15 – “Family support” were also regarded as effective, with average scores of 0.8048 and 0.7912, respectively. In contrast, the characteristics with the lowest scores, D2 – “Amount of interaction with classmates”, D3 – “Quality of interaction with classmates” and D4 – “Classmates’ attitude (across all courses)”, still present scores above 0.70.
Overall LeVIS Index: The Effectiveness of the IPTEACES E-Learning Framework

Levy (2006; 2009) proposed the following categorization for LeVIS index overall scores (Figure 9):

• LeVIS overall score ≥ 0.9375 – Very high effectiveness;
• LeVIS overall score ≥ 0.75 and < 0.9375 – High effectiveness;
• LeVIS overall score ≥ 0.5625 and < 0.75 – Good effectiveness;
• LeVIS overall score ≥ 0.3750 and < 0.5625 – Moderate effectiveness;
• LeVIS overall score ≥ 0.1875 and < 0.3750 – Low effectiveness;
• LeVIS overall score < 0.1875 – Very low effectiveness.
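Levy's categorization can be sketched as a simple threshold lookup (the function name is illustrative; the thresholds are those listed above):

```python
def levis_category(score: float) -> str:
    """Map an overall LeVIS score to Levy's (2006; 2009) categories."""
    bands = [
        (0.9375, "Very high effectiveness"),
        (0.75, "High effectiveness"),
        (0.5625, "Good effectiveness"),
        (0.3750, "Moderate effectiveness"),
        (0.1875, "Low effectiveness"),
    ]
    # Bands are ordered from highest to lowest; return the first that fits.
    for threshold, label in bands:
        if score >= threshold:
            return label
    return "Very low effectiveness"

# The global score reported in this study:
print(levis_category(0.761))  # -> High effectiveness
```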
Results from the Global LeVIS index indicate that the overall e-Learning system under study reached a global score of 0.761
Figure 7. Value-satisfaction grid of e-Learning system characteristics of Dimension D - Learner
Figure 8. LeVIS index of e-Learning system characteristics of Dimension D - Learner
and therefore should be classified as being of “High Effectiveness”. All the dimensions are above 0.75 (Dimension A – 0.757; Dimension B – 0.767, Dimension D – 0.758), with a particular emphasis on Dimension B – Course, which had the highest score of all dimensions.
Overall Value-Satisfaction Grid of the 41 E-Learning System Characteristics (All Dimensions)
As Figure 10 shows, all 41 e-Learning system characteristics are situated in the Q2 quadrant of the Value-Satisfaction grid, i.e., in the effective quadrant; all the characteristics and dimensions are considered effective. However, taking “excellence” as a reference, the overall grid shows 7 e-Learning system characteristics (the least effective) that are somewhat apart from the other 34 and therefore call for a quality improvement plan. These 7 characteristics, positioned in the lowest corner of the Q2 quadrant, show that two groups (or sub-categories) should have priority in terms of quality improvement: Help Desk (A1, A2 and A3), originally corresponding to Dimension A - Technology and Support, and Class (D3, D2, D4 and D5), originally corresponding to Dimension D - Learner. The first priority for quality intervention should concern the sub-category designated as Help Desk (A1, A2 and A3). The value scores of these three items are higher than their satisfaction scores. In this case, a strategy should be implemented in order
Figure 9. Overview of LeVIS index of IPTEACES e-learning framework
Figure 10. Overall value-satisfaction grid of the 41 e-Learning system characteristics: Close-up view of Q2 quadrant
to obtain a higher level of satisfaction from the students concerning Help Desk services. The second priority relates to the sub-category designated as Class (D2, D3, D4 and D5): these characteristics were considered valuable, but there is a need to increase the level of satisfaction related to them.
CURRENT CHALLENGES/PROBLEMS FACING THE ORGANIZATION

This framework can be explored and applied in many e-Learning projects that face significant diversity among their attendees. The application of the framework has produced results that fulfill the typical main objectives of an e-Learning project: a high approval rate, a low dropout rate and a high level of satisfaction from the students; from a management point of view, it also achieved a high level of effectiveness based on international benchmarking tools.
In the context of quality assurance, the authors, inspired by Levy’s methodology (2006; 2009), propose that e-Learning quality can be measured through the assessment of the effectiveness of the e-Learning system. Using Levy’s methodology as a tool, this e-Learning project (the Insurance Intermediaries certification e-Course delivered through the IPTEACES framework) achieved the category of “High effectiveness” (score = 0.761), based on the assessments of 1,317 students of the satisfaction and value of 41 e-Learning system characteristics. However, taking excellence as a reference, the output of the “Overall Value-Satisfaction Grid of the 41 e-Learning System Characteristics” showed objectively which system characteristics and corresponding dimensions should have priority in an improvement plan. Combined, these tools give managers adequate information for improvement action in the context of e-Learning quality. The most significant limitation of this study is that, so far, this framework has only been implemented in one particular project (although the sample size was significant). Considering this, it should be applied in the future to other populations, other industries and other subjects. Concerning the measurement of effectiveness, future studies may also use different methodologies, e.g. confirmatory analysis, to provide further validity and reliability for the results.
REFERENCES

Alavi, M., Wheeler, B., & Valacich, J. (1995). Using IT to reengineer business education: An exploratory investigation of collaborative telelearning. Management Information Systems Quarterly, 19(3), 293–311. doi:10.2307/249597

Bures, E. M., Abrami, P. C., & Amundsen, C. (2000). Student motivation to learn via computer conferencing. Research in Higher Education, 41(5), 593–621. doi:10.1023/A:1007071415363

Doll, W. J., & Torkzadeh, G. (1991). The measurement of end-user computing satisfaction: Theoretical and methodological issues. Management Information Systems Quarterly, 15(1), 5–9. doi:10.2307/249429

Dondi, C., Moretti, M., & Nascimbeni, F. (2006). Quality of e-learning: Negotiating a strategy, implementing a policy. In U.-D. Ehlers & J. M. Pawlowski (Eds.), Handbook on quality and standardisation in e-learning. Berlin/Heidelberg, Germany: Springer. doi:10.1007/3-540-32788-6_3

Ehlers, U.-D., & Goertz, L. (2006). Quality evaluation for e-learning in Europe. In U.-D. Ehlers & J. M. Pawlowski (Eds.), Handbook on quality and standardisation in e-learning. Berlin/Heidelberg, Germany: Springer. doi:10.1007/3-540-32788-6_11

Gagne, R., Briggs, L., & Wager, W. (1992). Principles of instructional design (4th ed.). Englewood Cliffs, NJ: Prentice-Hall.

Hiltz, R. S., & Johnson, D. W. (1990). User satisfaction with computer-mediated communication systems. Management Science, 36(6), 739–765. doi:10.1287/mnsc.36.6.739

ISO EN 9000. (2005). Quality management systems: Fundamentals and vocabulary. International Organization for Standardization.

Juran, J. M. (1992). Juran on quality by design: The new steps for planning quality into goods and services. New York, NY: Free Press.

Keller, J. (2008). First principles of motivation to learn and e3-learning. Distance Education, 29(2), 175–185. doi:10.1080/01587910802154970

Keller, J. M. (2010). Motivational design for learning and performance: The ARCS model approach. Boston, MA: Springer US. doi:10.1007/978-1-4419-1250-3

Levy, Y. (2006). Assessing the value of e-learning systems. Hershey, PA: Information Science Publishing.

Levy, Y., & Murphy, K. (2002). Toward a value framework for online learning systems. In Proceedings of the Hawaii International Conference on System Sciences (HICSS-35) (pp. 1–9).

Levy, Y., Murphy, K., & Zanakis, S. (2009). A value-satisfaction taxonomy of IS effectiveness (VSTISE): A case study of user satisfaction with IS and user-perceived value of IS. International Journal of Information Systems in the Service Sector, 1(1), 93–118. doi:10.4018/jisss.2009010106

Merriënboer, J. J. G., & Kirschner, P. A. (2007). Ten steps to complex learning. Mahwah, NJ: Lawrence Erlbaum Associates.

Merrill, M. D. (2002). First principles of instruction. Educational Technology Research and Development, 50(3), 43–59. doi:10.1007/BF02505024

Merrill, M. D. (2007). First principles of instruction: A synthesis. In R. A. Reiser & J. V. Dempsey (Eds.), Trends and issues in instructional design and technology (2nd ed., Vol. 2, pp. 62–71). Upper Saddle River, NJ: Merrill/Prentice Hall.

Pawlowski, J. M. (2007). The quality adaptation model: Adaptation and adoption of the quality standard ISO/IEC 19796-1 for learning, education, and training. Journal of Educational Technology & Society, 10(2), 3–16.

Pena, N., & Isaias, P. (2010a). The IPTEACES e-learning framework: The analysis of success indicators and the impact on student social demographic characteristics. In Proceedings of the IADIS International Conference on Cognition and Exploratory Learning in Digital Age (CELDA), Timisoara, Romania.

Pena, N., & Isaias, P. (2010b). An approach to diversity: The effectiveness of IPTEACES e-learning framework. In Proceedings of the 9th European Conference on e-Learning (ECEL), Porto, Portugal.

Piccoli, G., Ahmad, R., & Ives, B. (2001). Web-based virtual learning environments: A research framework and a preliminary assessment of effectiveness in basic IT skills training. Management Information Systems Quarterly, 25(4), 401–426. doi:10.2307/3250989

Rokeach, M. (1969). Beliefs, attitudes, and values. San Francisco, CA: Jossey-Bass.

Swan, K., Shea, P., Fredericksen, E. E., Pickett, A. M., & Pelz, W. E. (2000). Course design factors influencing the success of online learning. In Proceedings of WebNet 2000 World Conference on the WWW and Internet, San Antonio, TX.

Webster, J., & Hackley, P. (1997). Teaching effectiveness in technology-mediated distance learning. Academy of Management Journal, 40(6), 1282–1309. doi:10.2307/257034
Nuno Pena is the Director of UnYLeYa (LeYa Group - LeYa, one of the largest publishing groups in the Portuguese-speaking world). Previously, he was the Chief Learning Officer of the Portuguese Association of Insurers and founder of the “Portuguese Insurance Academy”, and a Business Unit Manager in two consultancy companies (in the field of corporate education and e-Learning effectiveness). Nuno is also a postdoctoral researcher at the ADVANCE Research Center of ISEG - School of Economics and Management (Technical University of Lisbon), a university lecturer and a reviewer for several academic journals. He holds a Ph.D. in Management Information Systems (in the speciality of corporate e-Learning), a Master's degree in Multimedia Educational Communication from the Portuguese Open University (Universidade Aberta), a Postgraduate Certificate in Education (PGCE) and a Bachelor's degree in Philosophy from the New University of Lisbon (Universidade Nova de Lisboa).
Pedro Isaías is an associate professor at the Universidade Aberta (Portuguese Open University) in Lisbon, Portugal, responsible for several courses and director of the master's degree program in Electronic Commerce and Internet since its start in 2003. He holds a PhD in Information Management (in the speciality of information and decision systems) from the New University of Lisbon. Author of several books, book chapters, papers and research reports, all in the information systems area, he has chaired several conferences and workshops in the area and has been responsible for the scientific coordination of several EU-funded research projects. He is also a member of the editorial board of several journals and a program committee member of several conferences and workshops. At the moment, he conducts research related to information systems in general, e-Learning, e-Commerce and WWW-related areas.