SERVICE-ORIENTED REFERENCE ARCHITECTURE TO IMPROVE FUNCTIONALITY OF WEB-BASED E-LEARNING SYSTEMS
Raphael Angulu
A thesis submitted in partial fulfilment of the requirements for the degree of Master of Science in Information Technology of Masinde Muliro University of Science and Technology
May, 2014
DECLARATION
This thesis is my original work, prepared with no sources or support other than those indicated, and has not been presented elsewhere for any other award.
Signature……………………… Angulu Raphael SIT/G/23/11
Date……………………...
CERTIFICATION
The undersigned certify that they have read and hereby recommend for acceptance by Masinde Muliro University of Science and Technology a thesis entitled “Service-oriented reference architecture to improve functionality of web-based e-learning systems”.
Signed…………………………… Date…………………….. Mr. Anselemo Ikoha Peters School of Information Technology and Informatics Kibabii University College
Signed…………………………… Date…………………….. Prof. Amadalo Maurice Musasia Department of Science and Mathematics Education Masinde Muliro University of Science and Technology
COPYRIGHT
This thesis is copyright material protected under the Berne Convention, the Copyright Act 1999 and other international and national enactments in that behalf on intellectual property. It may not be reproduced by any means, in full or in part, except for short extracts in fair dealing for research or private study, critical scholarly review or discourse with acknowledgement, without written permission of the Dean, School of Graduate Studies, on behalf of both the author and Masinde Muliro University of Science and Technology.
DEDICATION
This thesis is dedicated to my dear grandmother Rosebellah Anjiri Shimuli.
ACKNOWLEDGEMENT
I thank the Almighty God for the good health and protection He has offered me throughout this work. I am obliged to Masinde Muliro University of Science and Technology for sponsoring my studies. I extend my sincere gratitude to my supervisors, Prof. Amadalo M. M. and Mr. Anselemo I. P., for their constant support, advice and guidance during the entire course of research and development of this thesis. I am grateful to my dear mother Jane Wakhu for her sponsorship, guidance, love and support, without which I would never have found my way to where I am. Thank you, mum! I am indebted to my lovely wife Volian, my dear sister Bethsheba and my only classmate Perez for their support throughout this undertaking. Last but not least, I thank the staff and students of Kenyatta University who honestly shared their experience with the web-based e-learning system, providing vital information that helped in the development of the proposed architecture.
Angulu Raphael
ABSTRACT
E-learning is the use of digital technologies to deliver learning resources to learners. It is the use of computers, electronic devices, and mobile and communication devices to provide training, just-in-time information, guidance from experts, and educational or learning material to learners. E-learning facilitates and enables learners to learn anytime and anywhere. Many universities and private corporations are investing significant capital in web-based e-learning systems. A number of Kenyan universities have adopted web-based e-learning technologies in their delivery of courses; however, these institutions of higher learning have little or no understanding of the factors that contribute to perceived learner effectiveness of these systems. This study attempted to understand the relationship between the value learners attribute to web-based e-learning systems and the satisfaction learners experience with these systems. Users of web-based e-learning systems are heterogeneous, each with different needs and expectations, and therefore a careful assessment of system functionality is required. This study analysed existing web-based e-learning system architectures, evaluated the functionality of web-based e-learning systems in Kenyan universities, identified factors that determine web-based e-learning systems’ user satisfaction in Kenyan universities, and eventually developed an integrated adaptive service-oriented reference architecture (IASORA) to improve functionality and user satisfaction of these systems in Kenyan universities. Using a case study research design, the study focused on the features incorporated in the web-based e-learning systems and mapped these features against learners’ and lecturers’ expectations of the system to find out whether web-based e-learning system users get exactly what they expect from the system.
The study was done at Kenyatta University, which has implemented the MOODLE web-based e-learning system, and a sample population was selected using both purposive sampling and simple random sampling techniques. The main instruments used for data collection were content analysis, questionnaires and interviews. Data was analysed using both descriptive and inferential statistics. This study evaluated the suitability, accurateness, interoperability, compliance to standards and security attributes of the web-based e-learning system in order to determine its functionality based on the ISO/IEC 9126 software quality model. The findings showed that existing web-based e-learning system architectures had a number of inherent limitations that affect functionality and user satisfaction of e-learning systems, that the e-learning system was not as functional as users expected, and that system satisfaction is greatly influenced by demographic factors as well as other factors such as network access speed, usability, accessibility and availability. The final deliverable of this study is an architecture that seeks to improve functionality and user satisfaction of these systems in Kenyan universities. The study recommends implementation of IASORA, user training, publishing of relevant content in the system and improvement of institutions’ ICT infrastructure in order to improve functionality and satisfaction of web-based e-learning systems.
TABLE OF CONTENTS
DECLARATION .......... ii
COPYRIGHT .......... iii
DEDICATION .......... iv
ACKNOWLEDGEMENT .......... v
ABSTRACT .......... vi
TABLE OF CONTENTS .......... vii
LIST OF TABLES .......... xii
LIST OF FIGURES .......... xiv
LIST OF ABBREVIATIONS AND ACRONYMS .......... xv
OPERATIONAL DEFINITION OF TERMS .......... xvii
CHAPTER ONE: INTRODUCTION .......... 1
1.1 Background to the study .......... 1
1.2 Statement of the Problem .......... 3
1.3 General objective .......... 4
1.4 Objectives of the Study .......... 4
1.5 Research questions .......... 5
1.6 Significance of the study .......... 5
1.8 Assumptions of the study .......... 6
1.9 Scope of the study .......... 6
1.10 Conceptual framework .......... 6
CHAPTER TWO: LITERATURE REVIEW .......... 8
2.1 Introduction .......... 8
2.2 E-learning and mobile learning .......... 8
2.3 Software architecture .......... 9
2.4 Existing e-learning system architectures .......... 10
2.5 Proposed E-learning system architecture .......... 12
2.6 Cloud computing and e-learning .......... 13
2.7 Adaptive e-learning .......... 15
2.8 Service oriented architecture .......... 16
2.9 Web services .......... 18
2.10 Reference architecture .......... 20
2.11 Functionality of E-learning system .......... 21
2.12 Educators perceptions of functionality .......... 24
2.13 Components of effective e-learning .......... 26
2.14 Need to evaluate e-learning systems .......... 27
2.15 Software quality model .......... 28
2.16 E-learning systems user satisfaction .......... 29
2.17 Technology evaluation models .......... 31
2.18 Evaluation of e-learning systems user satisfaction .......... 33
2.19 Summary .......... 34
CHAPTER THREE: RESEARCH METHODOLOGY .......... 35
3.1 Introduction .......... 35
3.2 Research design .......... 35
3.3 Location of study .......... 35
3.4 Target population .......... 36
3.5 Sampling techniques .......... 36
3.6 Sample population .......... 37
3.7 Instruments of data collection .......... 38
3.7.1 Content analysis .......... 38
3.7.2 Questionnaire .......... 38
3.8 Validation of the research instruments .......... 39
3.9 Reliability of the research instruments .......... 40
3.10 Data collection procedure .......... 42
3.11 Methods of data analysis .......... 42
3.12 Presentation of results .......... 43
3.13 Ethical consideration .......... 43
CHAPTER FOUR: DATA ANALYSIS AND PRESENTATION .......... 45
4.1 Introduction .......... 45
4.2 Quantitative Analysis .......... 45
4.2.1 Distribution of respondents by gender and age .......... 45
4.2.2 Distribution by level of study and year of study .......... 47
4.2.3 Distribution by mode of study .......... 49
4.2.4 Distribution by period of Internet and e-learning system usage .......... 49
4.2.5 Distribution by learning styles of students .......... 51
4.2.6 Web-based e-learning systems used by respondents .......... 52
4.3 Qualitative Analysis .......... 53
4.3.1 Analysis of existing e-learning system architectures .......... 54
4.3.2 Functionality of web-based e-learning systems .......... 56
4.3.2.1 System interactivity .......... 57
4.3.2.2 System response .......... 60
4.3.2.3 System accurateness .......... 63
4.3.2.4 System suitability .......... 66
4.3.2.5 System interoperability .......... 70
4.3.2.6 System security .......... 74
4.3.2.7 System perceived usefulness .......... 77
4.3.2.8 System perceived ease of use .......... 82
4.3.3 Web-based e-learning system user satisfaction .......... 87
4.3.3.1 Web-based e-learning system user satisfaction factors .......... 87
4.3.3.2 Supplementary factors affecting e-learning system user satisfaction .......... 88
4.3.3.3 Factors that determine user satisfaction inferential statistics .......... 90
4.3.3.4 Respondents suggestions for improving e-learning systems .......... 97
4.3.3.5 Analysis of variance for differences in satisfaction .......... 97
CHAPTER FIVE: DEVELOPMENT OF PROPOSED ARCHITECTURE .......... 103
5.1 Introduction .......... 103
5.2 Current web-based e-learning system architectures .......... 103
5.3 Limitations of the current e-learning systems architectures .......... 103
5.4 Integrated adaptive service-oriented reference architecture .......... 104
5.4.1 Services identified .......... 106
5.4.2 Presentation layer .......... 107
5.4.3 Services layer .......... 107
5.4.4 Application layer .......... 109
5.4.5 Storage layer .......... 110
5.5 Evaluation and validation of the developed architecture .......... 111
5.5.1 Demographic distribution of evaluators .......... 112
5.5.2 Response on validation constructs .......... 113
5.5.3 Inferential analysis of validation data .......... 115
5.5.4 Conclusion .......... 117
CHAPTER SIX: SUMMARY, CONCLUSION AND RECOMMENDATIONS .......... 118
6.1 Introduction .......... 118
6.2 Summary of chapters .......... 118
6.3 Summary of findings .......... 118
6.3.1 Analysis of existing e-learning system architectures .......... 118
6.3.2 Functionality of web-based e-learning systems .......... 119
6.3.3 Factors affecting e-learning system user satisfaction .......... 119
6.3.4 Integrated adaptive service-oriented reference architecture .......... 120
6.4 Recommendations .......... 120
6.5 Suggestions for further research .......... 121
REFERENCES .......... 122
APPENDICES .......... 136
LIST OF TABLES
Table 2.1: Cloud computing benefits in e-learning .......... 14
Table 2.2: Key success factors of e-learning .......... 23
Table 2.3: Functionality attributes .......... 29
Table 3.1: Sample population .......... 38
Table 3.2: Instrument validity analysis .......... 40
Table 3.3: Cronbach’s Coefficient Reliability .......... 41
Table 3.4: Summary of data analysis .......... 43
Table 4.1: Students respondents’ distribution by gender and age groups .......... 46
Table 4.2: Lecturers respondents’ distribution by gender and age groups .......... 46
Table 4.3: Duration of Internet and e-learning system usage .......... 50
Table 4.4: Lecturers Internet and Web-based e-learning system usage .......... 51
Table 4.5: Learning styles of students .......... 52
Table 4.6: Architecture of web-based e-learning system .......... 54
Table 4.7: System interactivity factors .......... 57
Table 4.8: Students’ opinion on system interactivity .......... 58
Table 4.9: Lecturers’ opinion on system interactivity factors .......... 59
Table 4.10: System response .......... 60
Table 4.11: Students’ opinion on system response factors .......... 61
Table 4.12: Lecturers’ opinion on system response factors .......... 62
Table 4.13: Students’ opinion on system accurateness factors .......... 64
Table 4.14: Lecturers’ opinion on system accurateness factors .......... 65
Table 4.15: Students’ opinion on system suitability factors .......... 68
Table 4.16: Lecturers’ opinion on system suitability factors .......... 69
Table 4.17: Students’ opinion on system interoperability factors .......... 72
Table 4.18: Lecturers’ opinion on system interoperability factors .......... 73
Table 4.19: System security mechanisms .......... 75
Table 4.20: Students’ opinion on system security factors .......... 76
Table 4.21: Lecturers’ opinion on system security factors .......... 77
Table 4.22: System perceived usefulness .......... 78
Table 4.23: Students’ opinion on system perceived usefulness factors .......... 80
Table 4.24: Lecturers’ opinion on system perceived usefulness factors .......... 81
Table 4.25: Perceived ease of use .......... 83
Table 4.26: Students’ opinion on system perceived ease of use factors .......... 84
Table 4.27: Lecturers’ opinion on system perceived ease of use factors .......... 86
Table 4.28: Pearson’s correlation of demographics and system satisfaction .......... 91
Table 4.29: Other factors affecting user satisfaction .......... 94
Table 4.30: ANOVA for students’ satisfaction .......... 99
Table 4.31: ANOVA for lecturers .......... 101
Table 5.1: Validators’ opinion on improved functionality .......... 114
Table 5.2: Pearson’s correlation for validation factors .......... 116
LIST OF FIGURES
Figure 1.1: Conceptual framework .......... 7
Figure 2.1: SORAPES .......... 12
Figure 2.2: Stages of adaptive e-learning .......... 15
Figure 2.3: Service-Oriented Architecture .......... 17
Figure 2.4: ISO 9126 Software quality model .......... 28
Figure 2.5: Technology acceptance model .......... 31
Figure 2.6: DeLone and McLean’s ISS Model .......... 32
Figure 4.1: Students respondents’ distribution by level and year of study .......... 47
Figure 4.2: Level of study for lecturers/content developers .......... 48
Figure 4.3: Distribution by mode of study .......... 49
Figure 4.4: Web-based e-learning system used .......... 53
Figure 4.5: System accurateness .......... 63
Figure 4.6: Lecturers response on system suitability .......... 67
Figure 4.7: System interoperability .......... 71
Figure 4.8: Lecturers opinion on usefulness .......... 79
Figure 4.9: Users’ level of satisfaction .......... 88
Figure 4.10: Other factors affecting user satisfaction .......... 89
Figure 5.1: Integrated Adaptive Service-Oriented Reference Architecture .......... 105
Figure 5.2: Distribution by occupation and level of study .......... 112
Figure 5.3: Academic qualification and profession of respondents .......... 113
LIST OF ABBREVIATIONS AND ACRONYMS
ACM – Association for Computing Machinery
ANOVA – Analysis of Variance
API – Application Programming Interface
CD – Compact Disk
CD-ROM – Compact Disk – Read Only Memory
CMS – Content Management System
E-learning – Electronic learning
FTP – File Transfer Protocol
IASORA – Integrated Adaptive Service Oriented Reference Architecture
IEC – International Electrotechnical Commission
IEEE – Institute of Electrical and Electronics Engineers
ICT – Information and Communication Technology
IMS – Instructional Management Systems
ISO – International Organization for Standardization
ISS – Information System Success
HTML – Hypertext Markup Language
HTTP – Hypertext Transfer Protocol
HTTPS – Hypertext Transfer Protocol Secure
LCMS – Learning Content Management System
LMS – Learning Management System
M-learning – Mobile learning
MOODLE – Modular Object-Oriented Dynamic Learning Environment
NIST – National Institute of Standards and Technology
ODL – Open and Distance Learning
PDA – Personal Digital Assistant
PES – Personalized E-learning System
SCORM – Sharable Content Object Reference Model
SMTP – Simple Mail Transfer Protocol
SOA – Service-Oriented Architecture
SOAP – Simple Object Access Protocol
SORAPES – Service-Oriented Reference Architecture for Personalized E-learning Systems
TAM – Technology Acceptance Model
TCP – Transmission Control Protocol
UDDI – Universal Description, Discovery and Integration
VLE – Virtual Learning Environment
WSDL – Web Service Description Language
WSN – Web Service Notifications
XML – eXtensible Markup Language
OPERATIONAL DEFINITION OF TERMS
Andragogy – Knowles (1975) defined andragogy as a process in which individuals take the initiative, with or without the help of others, in diagnosing their learning needs, formulating learning goals, identifying human and material resources for learning, choosing and implementing appropriate learning strategies, and evaluating learning outcomes.
Architecture – The fundamental organization of a system embodied in its components, their relationships to each other and to the environment, and the principles guiding its design and evolution (IEEE, 2000).
Cloud computing – A model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction (NIST, 2009).
E-learning – An innovative approach for delivering well-designed, learner-centred, interactive, and facilitated learning environments to anyone, anyplace, anytime by utilizing the attributes and resources of various digital technologies along with other forms of learning materials suited for open, flexible, and distributed learning environments (Khan, 2005).
Functionality – The capability of the software to provide functions which meet the stated and implied needs of users under specified conditions of usage (ISO/IEC 9126, 2001).
Learning style – The complex manner in which, and the conditions under which, learners most efficiently and most effectively perceive, process, store and recall what they are attempting to learn (Begam & Ganapathy, 2013).
M-learning – The exploitation of ubiquitous handheld technologies, together with wireless and mobile phone networks, to facilitate, support, enhance and extend the reach of teaching and learning (Keskin & Metcalf, 2011).
Pedagogy – The art and science of teaching the child that embodies a teacher-focused education (Knowles, 1984), in which the teacher dominates the classroom scenario and is seen as the source of knowledge.
Reference architecture – An architecture that models the abstract architectural elements in the domain of interest independent of the technologies, protocols, and products that are used to implement a specific solution for the domain (OASIS Open, 2012).
Service – A modular unit of business functionality (Rosen, Lublinsky, Smith, & Balcer, 2008), deployed on network-accessible platforms provided by the service provider (Kreger, 2001).
Service-Oriented Architecture – A basic architecture that represents the technically functional interfaces of software building blocks as reusable, distributed, loosely coupled, and standardized accessible services (Vogel, Arnold, Chughtai, & Kehrer, 2011).
System – A collection of components organized to accomplish a specific function or set of functions in order to achieve some specific objective (IEEE, 2000).
Web – A hypermedia-based system that provides a means of browsing information on the Internet in a non-sequential way using hyperlinks (Connolly & Begg, 2005).
Web Service – A software-powered functional module (Palanivel & Kuppuswami, 2011) that consists of a family of technologies, specifications, protocols, and industry-based standards that are used by heterogeneous applications to communicate, collaborate, and exchange information in a secure, reliable, and interoperable manner (Bieberstein, Bose, Fiammante, Jones, & Shah, 2006).
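The service-oriented terms defined above can be made concrete with a small illustrative sketch, which is not drawn from the thesis: a hypothetical "grade lookup" service is published to a registry (playing the UDDI-style discovery role) and a consumer locates and invokes it by name, never depending on the provider's concrete class. All names here (ServiceRegistry, GradeService) are invented for illustration.

```python
# Illustrative sketch of SOA concepts: a registry decouples service
# consumers from providers. All class and unit names are hypothetical.

class ServiceRegistry:
    """Plays the UDDI-style registry role: providers publish services,
    consumers discover them by name rather than by concrete class."""

    def __init__(self):
        self._services = {}

    def publish(self, name, service):
        self._services[name] = service

    def discover(self, name):
        return self._services[name]


class GradeService:
    """A 'service': a modular unit of business functionality."""

    def get_grade(self, student_id, unit_code):
        # A real implementation would query an LMS datastore; this
        # in-memory dict stands in for it.
        grades = {("SIT/G/23/11", "IT801"): "A"}
        return grades.get((student_id, unit_code), "N/A")


# The provider publishes; the consumer discovers and invokes through
# the registry, never referencing GradeService directly (loose coupling).
registry = ServiceRegistry()
registry.publish("grades", GradeService())

service = registry.discover("grades")
print(service.get_grade("SIT/G/23/11", "IT801"))  # prints: A
```

In a production SOA the registry lookup would return a WSDL-described endpoint and the invocation would travel over SOAP or HTTP, but the design point is the same: the consumer binds to a named, standardized interface, so the provider can be replaced without changing the consumer.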
CHAPTER ONE
INTRODUCTION
1.1 Background to the study
Several universities are currently using e-learning systems to provide valid teaching and learning solutions, but several problems related to e-learning activities still remain (Munirah, Issham, Azidah, & Hanysah, 2012). Use of web-based e-learning systems for course delivery is increasing rapidly, but the level of functionality of this learning experience has not been fully determined (Swierczek, Bechter, & Chankiew, 2012). As Stepanyan, Littlejohn, and Margaryan (2013) claim, educational institutions face challenges in ensuring effective teaching and learning in a rapidly changing society. This is evident in various Kenyan universities that have implemented or are implementing web-based e-learning systems without having determined their functionality. Properly designed and implemented e-learning can effectively and efficiently meet an institution's factual, theoretical, logical, and procedural training needs (Steen, 2008). However, no matter how well designed a new system is or how well planned the implementation, the system will succeed only with proper consideration of the human issues (Cadle & Yeates, 2004). Swierczek et al. (2012) suggest that e-learning should be open, flexible and distributed, as suits diverse learners. This requires guidelines and strategies for designing and developing effective and efficient web-based e-learning systems. However, there are no guidelines for analysing, designing, developing, supplying, and managing e-learning materials pedagogically. Evidently, e-learning cannot continue without pedagogical and andragogical techniques, and these should, if possible, be aimed at personalised teaching.
Another problem is that there is no one-size-fits-all approach to the design of e-learning, because each organization and each course design has its own unique constraints, challenges and objectives (Steen, 2008). Clark & Mayer (2008) noted that e-learning system designers need to balance the training goals, learner differences, and the training environment. These systems are not developed for a
particular learner or institution, making it hard for them to meet the needs of the users. The choice of education technologies should not be guided by a technologically deterministic approach; it should be guided by the contextual requirements related to a broad range of social, cultural, political and economic factors (Macleod, 2005). For a system to be functional it has to be suitable for its intended purpose, accurate, able to interoperate with other systems, compliant with standards, and secure (ISO/IEC, 2001). However, most institutions that have implemented web-based e-learning systems have not established whether these systems have those attributes.
Hameed, Atta, and Andrea (2008) claim that it is vital to investigate how effective and efficient e-learning is, and whether students would be sufficiently equipped to go out into the employment world after completing their education through an e-learning mode alone. However, most institutions of higher learning in Kenya rarely do any pre-implementation assessment to understand the functionality of the web-based e-learning systems they want to implement and whether students will be satisfied and acquire skills through that system. Understanding the critical factors that lead to user satisfaction and effective learning outcomes is fundamental for organizations to reap the benefits of e-learning (Ramayah, Ahmad, & Hong, 2012). Omondi (2009) states that fundamental to any evaluation, however, is the identification of key questions reflective of the needs of the stakeholders. That is, whatever type of evaluation is chosen, it is the needs of the stakeholders that must be addressed. The education sector is constantly adapting to external drivers, including societal and technological changes and quality standards. These transformations require educational systems to adapt to meet the needs and expectations of learners and other stakeholders (Stepanyan et al., 2013). No matter how good the e-learning environment is and what best technology is used to create it, if students are not satisfied then it is of little use (Hameed, Atta, & Andrea, 2008). Designers of web-based e-learning have no knowledge of the types of students who will use the system, nor of the pedagogical goals to be achieved by learners. It is important to design e-learning with multiple viewpoints and to be consistent with the cultural values of the learner
(Swierczek et al., 2012), but the web-based e-learning systems used in Kenya were not tailor-made for the Kenyan learner; they were simply imposed. The real value of web-based learning lies not in accessing knowledge at any time, any place and by anyone, but in helping the right students to acquire the right skills and knowledge at the right time in order to function as active, self-reflective and collaborative participants in the information-based society (Munirah et al., 2012). Therefore, a careful assessment of whether these systems are effective and efficient in the Kenyan learning environment was required. According to Dougiamas and Retalis (2012), existing e-learning systems are widely built upon client-server architecture, peer-to-peer architecture and, recently, web services architectures. These systems have major drawbacks because of their limitations in scalability, availability, distribution of computing power and storage, as well as sharing of information between users. A successful e-learning system is one that addresses all issues for all types of users; it should be scalable, available, interoperable, extensible and adaptable, and, indeed, it should be based on novel technologies (Kashfi & Razzazi, 2006). Higher education institutions need to establish an e-learning infrastructure that requires the development of a virtual learning environment helping learners to gain access to educational materials anytime and anywhere (Munirah et al., 2012), in a format they prefer and using a device of their choice. The study sought to develop an integrated adaptive service-oriented reference architecture that would improve the functionality and user satisfaction of web-based e-learning systems.
1.2 Statement of the Problem
Use of web-based e-learning systems for course delivery is increasing rapidly, but the functionality of this learning experience has not been fully determined (Swierczek et al., 2012) because of the diversity of web-based e-learning system users (Toprak, 2010). Most e-learning platforms do not consider diverse learner characteristics and even lack dynamic, real-time mechanisms that promote effective learning
(Hsu, 2012). Therefore, careful assessment of the critical factors that lead to web-based e-learning system user satisfaction and effective learning outcomes is required (Ramayah et al., 2012). As Stepanyan et al. (2013) assert, e-learning systems need to adapt to meet the needs and expectations of diverse learners and other stakeholders, but as Mahnane, Laskri and Trigano (2013) found, the current systems do not. As suggested by Swierczek et al. (2012), flexible adaptive design of e-learning would help increase its effectiveness. Users of web-based e-learning systems want to access the system using various computing devices with different capabilities and varied computing power, such as mobile phones, tablets and computers. These users are mobile and want to access the system from anywhere at any time (Munirah et al., 2012), in a format they prefer and using a device of their choice. However, the current client-server architecture of these systems does not allow them to be flexible, open, distributed and ubiquitous enough to suit diverse learners (Swierczek et al., 2012). As Sharmin and Sohel (2014) assert, there is a need to design, develop and evaluate an enhanced, 'seamless' technical e-learning environment. This study focused on developing an architecture that sought to improve the functionality of web-based e-learning systems by making them more flexible, distributed and open.
1.3 General objective
The general objective of the study was to develop a service-oriented reference architecture to improve the functionality of web-based e-learning systems.
1.4 Objectives of the Study
The specific objectives of the study were:
i. To analyse the existing web-based e-learning system architectures
ii. To evaluate the functionality of web-based e-learning systems in Kenyan universities
iii. To find out the factors that determine web-based e-learning systems' user satisfaction in Kenyan universities
iv. To develop an integrated adaptive service-oriented reference architecture to improve the functionality of web-based e-learning systems
1.5 Research questions
i. What are the weaknesses of existing web-based e-learning system architectures?
ii. What elements affect the functionality of web-based e-learning systems in Kenyan universities?
iii. What are the factors that determine user satisfaction with web-based e-learning systems in Kenyan universities?
iv. How will the developed architecture improve the functionality of web-based e-learning systems in Kenyan universities?
1.6 Significance of the study
The findings of this study provided insight into, and possible solutions to, the functionality issues affecting web-based e-learning systems as they are implemented in institutions of higher learning. The architecture that was developed would help in implementing a web-based e-learning system in a manner that improves its functionality and hence user satisfaction.
The results of the study contributed original knowledge on the practices of dealing with functionality issues affecting web-based e-learning systems and their implementation. The researcher also benefited significantly by understanding the functionality and user satisfaction of web-based e-learning systems in institutions of higher learning in Kenya.
1.7 Limitation of the study
At Kenyatta University, only a few courses had been uploaded into the learning management system. Kiget (2012) also found that only computer science related courses are uploaded into the e-learning system. This limited the study to the courses uploaded into the system. The number of lecturers and students using the system was also small, and most were from the Computer Science Department. This limited the sample that could be selected for the study.
Another limitation is that the study only focused on the e-learning system implemented at the case university, namely MOODLE.
1.8 Assumptions of the study
i. The researcher assumed that all respondents had used one or more of the web-based e-learning systems on the market.
ii. The researcher assumed that the target population honestly shared their experiences and expectations of e-learning systems.
iii. The researcher also assumed that the results obtained in data analysis are representative of the target population.
1.9 Scope of the study
The study was conducted at Kenyatta University. A sample of students, lecturers and course/content developers was involved in the study. Kenyatta University was chosen because it was at an advanced stage of implementing the MOODLE e-learning system, and the study therefore focused on the MOODLE web-based e-learning system. The MOODLE modules studied were SCORM, wikis, quizzes, assignments, workshops and lessons. These modules were chosen because they are the most basic and common modules implemented at the start of an e-learning system deployment.
1.10 Conceptual framework
The conceptual framework in Figure 1.1 shows the relationship between functionality features and the functionality of an e-learning system in its environment of implementation. System features and tools determine the system's functionality attributes. The functionality attributes that affect an e-learning system's functionality and user satisfaction are suitability, accuracy, interoperability, compliance with standards and security (ISO/IEC 9126, 2001).
Suitability is the ability of the system to perform the tasks required of it and meet user expectations; a system is suitable if it meets the needs and expectations of its users. A system is accurate if its results match the expected results, while interoperability is the capability of the web-based e-learning system to interact with other systems. A functional web-based e-learning system needs to comply with standards, laws and regulations, and should be able to prevent unauthorized access and uphold privacy and confidentiality. Other functionality attributes include the system's perceived usefulness and perceived ease of use.
[Figure: System Features and Tools (independent variable) affects E-Learning System Functionality (dependent variable); the E-Learning Environment (moderating variable) moderates this relationship]
Figure 1.1: Conceptual framework
Source: Derived from research (2013)
The moderating variable is the e-learning environment, which consists of the e-learning system itself, organizational ICT infrastructure, organizational ICT policies, legal and cultural factors, system architecture, ethical considerations and demographic issues in the society using the e-learning system.
CHAPTER TWO
LITERATURE REVIEW
2.1 Introduction
Electronic learning has become common in the education sector in recent years. Several researchers have studied various aspects of this learning experience, but most of them focused on the outcomes of using e-learning systems. However, a functionality assessment of any e-learning strategy needs to be done in order to determine whether students and lecturers get exactly what they expect from the system. In this chapter, the researcher presents an account of previous studies on the functionality, user satisfaction and architecture of web-based e-learning systems.
2.2 E-learning and mobile learning
According to Clark and Mayer (2008), e-learning is any instruction delivered on a computer that includes content relevant to the learning objective, uses instructional methods such as examples or practice exercises to help learning, uses a variety of media elements to deliver the content and methods, and builds new knowledge and skills linked to improved organizational performance. According to Allen (2005), e-learning is supported by electronic hardware and software, either online (synchronous) or offline (asynchronous). Mobile learning (m-learning) is an approach to e-learning that utilises the technology of mobile devices such as mobile phones, personal digital assistants and digital audio players. M-learning has provided very innovative tools and applications to enhance the learning process. However, current m-learning applications must be integrated with pre-existing e-learning applications and platforms to enhance the learning process, because e-learning platforms also have an important role in that process (Marie & Miguel, 2009).
According to Parker (2003), e-learning can be classified as technology-delivered e-learning or technology-enhanced e-learning. Technology-delivered e-learning involves self-paced learning, where learners can get learning material at any time and in any location. Technology-enhanced e-learning is whereby learners meet face-to-face with instructors, who use e-learning as a supplement to traditional learning methods.
2.3 Software architecture
Software architecture is the fundamental organization of a system, embodied in its components, their relationships to each other and to the environment, and the principles guiding its design and evolution (IEEE, 2000). An architecture is the structure or organization of system components (modules) and the manner in which these components interact. Software architecture comprises software elements, the externally visible properties of those elements, and the relationships among them (Bass, Clements, & Kazman, 2003). A system is a collection of components organized to accomplish a specific function or set of functions (IEEE, 2000). A component is a modular part of a system that encapsulates its contents and whose manifestation is replaceable within its environment (Object Management Group Inc., 2003). Software architecture is represented using a diagram that shows the structural aspects of a system (Eeles, 2006). These aspects may be architectural layers, components or distribution nodes. A structural element can be a subsystem, a process, a library, a database, a computational node, a legacy system or an off-the-shelf product. Software architecture also defines the composition of structural elements, their relationships (and any connectors needed to support these relationships), their interfaces, their partitioning, the interactions of these elements and their behaviour.
Software architecture focuses on significant elements: those that have a long and lasting effect, such as the major structural elements, those associated with essential behaviour, and those that address significant qualities such as reliability and scalability. Generally, it is not concerned with the fine-grained details of these elements (Eeles, 2006).
2.4 Existing e-learning system architectures
The majority of e-learning systems are implemented with client-server or centralized-server architectures (Mohammed & Hussein, 2010), peer-to-peer architectures and, recently, web services architectures (Dougiamas & Retalis, 2012). The client-server and centralized-server approaches are metaphors of the student-teacher and repository-centric models, which reflect real-world learning scenarios in which teachers act as the content producers while students act as the content consumers (Yang, 2006). In web-based e-learning systems, the client is the web browser, which sends requests to a server and presents results from the server. These systems have major drawbacks because of their limitations in scalability, availability, distribution of computing power and storage, as well as sharing of information between users. A successful e-learning system is one that addresses all issues for all types of users; it should be scalable, available, interoperable, extensible and adaptable, and, indeed, it should be based on novel technologies (Kashfi & Razzazi, 2006). As noted by Resmer (1998), one of the main objectives of e-learning systems is to make information accessible to all types of users. In the client-server architecture, the client requests services from the server and the server responds with a predetermined response for the request made. The content is displayed to each user in the same way and format regardless of the user's experience, preferences or the capabilities of the client hardware. This makes it hard to meet varied user needs and satisfaction. When a course has only a manageable number of e-learning users, a simple client-server architecture is sufficient. As the number of users grows, however, the course might require a large-scale e-learning system to handle a potentially large number of concurrent, geographically distributed users and support a large database of e-learning materials (Palanivel & Kuppuswami, 2011).
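The limitation described above can be illustrated with a minimal sketch. The handler and course content below are hypothetical; the point is only that, in a plain client-server design, the learner's profile never influences the response:

```python
# Minimal, illustrative sketch: a classic client-server handler returns the
# same predetermined content to every client, regardless of the learner's
# preferences or device capabilities. The resource paths are hypothetical.

def handle_request(resource, client_profile):
    """Serve a course resource; note that client_profile is ignored entirely."""
    course_pages = {
        "/course/intro": "<html>Introduction to Programming</html>",
    }
    # The response is fixed per resource: every learner, on every device,
    # receives the identical representation.
    return course_pages.get(resource, "<html>404 Not Found</html>")

# Two very different clients receive exactly the same content:
desktop = handle_request("/course/intro", {"device": "desktop", "style": "visual"})
phone = handle_request("/course/intro", {"device": "phone", "style": "auditory"})
assert desktop == phone
```

This is the behaviour the adaptive and service-oriented approaches discussed later are meant to overcome.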
This problem can be tackled using a cloud-based e-learning approach (Masud & Xiaodi, 2012). According to Tuncay (2010), cloud computing is one of the new technology trends likely to have a significant impact on the teaching and learning environment. Methods used in e-learning have the ability to be more interactive, provide a more convenient way for instructors and learners to communicate, and provide more
suitable courseware for the learners (Palanivel & Kuppuswami, 2011). Despite such capabilities, many students, particularly among those learning through technology, drop out of their courses. One reason is the differing characteristics of students who come from different cultures and societies. The solution would be to utilize Personalised E-learning Systems (PES), where students are able to customize their learning environment based on pedagogical and personal choices (Palanivel & Kuppuswami, 2011). Figure 2.1 shows the service-oriented reference architecture for personalised e-learning systems (SORAPES) developed by Palanivel and Kuppuswami (2011). SORAPES is a layered architecture consisting of a presentation layer, service layer, application layer and database layer. The presentation layer gives a single point of entry for the users and handles the interaction between the users and the different components in the business logic layer; access to the components in the business layer is based on user personalization. The business layer is responsible for all the work an e-learning system is supposed to do, providing functionality such as mail, authentication, etc. The services layer utilizes the capabilities of the application server and is composed of stateless functions that expose high-level business functionality. The database layer is responsible for the communication with the databases of the different components in the business layer. The four layers of SORAPES are discussed in Palanivel and Kuppuswami (2011). SORAPES has a number of limitations: not all services are identified in the architecture, and crucial services such as student modelling and profiling are missing.
This study sought to identify and model more services into the architecture and to modify the entire architecture into a hybrid cloud-computing model consisting of private and public clouds, with the integration of a mobile learning environment into the architecture. The private cloud would host services that offer functionalities unique to the institution, while common services can be provided from the public cloud. SORAPES is also not flexible, particularly in content identification for a learner (Palanivel & Kuppuswami, 2011).
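The private/public split described here can be sketched as a simple service-routing rule. The service names and the sensitivity classification below are hypothetical illustrations, not part of the final architecture:

```python
# Hypothetical sketch of a hybrid-cloud split: services with strict security
# or legal requirements stay in the private cloud, while common services are
# consumed from the public cloud. All service names are illustrative only.

PRIVATE_SERVICES = {"student-profiling", "assessment", "financial-records"}
PUBLIC_SERVICES = {"email", "video-streaming", "document-storage"}

def resolve_cloud(service_name):
    """Decide which cloud should serve a given service."""
    if service_name in PRIVATE_SERVICES:
        return "private-cloud"
    if service_name in PUBLIC_SERVICES:
        return "public-cloud"
    # Default unknown (potentially sensitive) services to the private cloud.
    return "private-cloud"

print(resolve_cloud("student-profiling"))  # private-cloud
print(resolve_cloud("email"))              # public-cloud
```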
[Figure 2.1 shows the SORAPES layers: a Presentation Layer, through which learners, teachers and administrators access the system via an e-learning portal/API; a Service Layer of REST and SOAP services with a UDDI service registry holding the service descriptions (useracct.wsdl, register.wsdl, search.wsdl, integration.wsdl, security.wsdl, report.wsdl, notify.wsdl, repository.wsdl, ontology.wsdl, version.wsdl); an Application Layer with authentication, registration, personalization, notification, authorization, course, communication, versioning, confidentiality, content, search, semantic, access control, interaction, search/query and ontology components; and a Database Layer holding web pages, metadata, ontology, rules and cache]
Figure 2.1: SORAPES
Source: Palanivel and Kuppuswami (2011)
2.5 Proposed e-learning system architecture
This study proposed an integrated adaptive service-oriented reference architecture (IASORA) for e-learning systems. IASORA utilizes cloud computing capabilities
and the integration of a mobile learning environment with the adaptive e-learning architecture. The architecture will integrate various web-based e-learning system architectures and introduce new capabilities using service-oriented technology, adaptive learning technology and web services, with the aim of improving system functionality and user satisfaction in Kenyan public universities. The main architectural issues to be considered while integrating mobile learning and online learning are: 1) context discovery (the system must automatically check the mobile device features and decide which services may be provided), 2) adaptation of contents, and 3) synchronization between the mobile device and the LMS (Retalis & Dougiamas, 2012). In order to support a richer set of educational functions and increase their effectiveness, e-learning systems need to interoperate, collaborate and exchange content or re-use functionality (Aroyo & Dicheva, 2004).
2.6 Cloud computing and e-learning
Cloud computing involves hosting ICT infrastructure, software applications and other computing services on cloud servers that are accessed via the internet (Mtebe, 2013), enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g. networks, servers, storage, applications and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction (Mell & Grance, 2011). Most educational institutions do not have the ability to maintain the resources and infrastructure required to run top e-learning systems (Lawanya & Subha, 2013), but this could be solved by running the e-learning system in the cloud.
Cloud computing will benefit learning institutions not only in terms of reduced cost but also through increased efficiency, reliability, portability, flexibility, accessibility and security, and it provides an effective infrastructure and deployment model for dynamic demands (Alshwaier, Youssef, & Emam, 2012). Table 2.1 shows the benefits of using cloud computing in e-learning.
Table 2.1: Cloud computing benefits in e-learning
General user-level benefits:
i. Access to applications from anywhere
ii. Support for teaching and learning
iii. Software free or pay-per-use
iv. 24-hour access to infrastructure and content
v. Opening to the business environment and advanced research
vi. Protection of the environment by using green technologies
vii. Increased openness of students to new technologies
viii. Increasing functional capabilities
ix. Offline usage with further synchronization opportunities
x. Increasing availability and integrity of data and applications for administrators, teachers and others
xi. Increasing mobility for students, faculty and staff
xii. Reducing local application and system resource footprint
xiii. Increasing application and computing performance
xiv. Increasing server and data storage capacity
xv. Real-time availability of instructors' information for students
General institutional benefits:
i. Cost reduction
ii. Centralized management of data and applications
iii. Standardization of applications and processes
iv. Increased flexibility for resource allocation
v. Higher utilization of resources
vi. Rapid provisioning of software, resources and management of data
vii. Reduced burden of software version control
viii. Increase in available funds for innovation
Source: Adapted from Murtaza (2013) and Lawanya and Subha (2013)
The architecture will use the Software as a Service (SaaS) and Platform as a Service (PaaS) models of cloud computing. SaaS means the software or application runs on the provider's servers (the cloud) and users interact with it via the internet (Mokhtar, Ali, Al-Sharafi, & Aborujilah, 2013). Google Apps such as Gmail and YouTube are typical examples of the SaaS model (Babar & Chauhan, 2011). For example, institutions can use Gmail for students and staff instead of a university email system. Many SaaS applications are available at little or no cost (Khmelevsky & Voytenko, 2010). PaaS enables users to access development platforms and tools through APIs which support a specific set of programming languages (Babar & Chauhan, 2011). PaaS basically
aims to help developers who want to create, test and deploy software applications on the provider's servers via the internet without installing them locally (Mokhtar et al., 2013). The architecture sought to integrate both SaaS and PaaS in a hybrid cloud-computing model made up of a mix of public and private cloud models. Normally, organizations that adopt a private deployment model aim to expand their services by outsourcing services with fewer security and legal requirements to public cloud providers, creating a hybrid cloud (Mtebe, 2013). For example, an organization might use the public cloud for basic business applications such as email and its private cloud for storing sensitive data such as financial data (Bansal, Singh, & Kumar, 2012). Cloud computing makes an e-learning system accessible through different resources such as mobile, laptop and desktop computers (Lawanya & Subha, 2013). The software applications will include services provided by the web-based e-learning system and other services provided either free or at a cost by public cloud-service providers. These services will be held in a private or public cloud.
2.7 Adaptive e-learning
E-learning encompasses a wide diversity of practices in a dynamic, rapidly changing field. It must therefore be defined to encompass all learning experiences involving the acquisition or transfer of knowledge (Kumaran & Sankar, 2013). A non-personalized e-learning system typically presents the same content to all learners regardless of the learner's profile, personal preferences, interests and learning attitude. Learners are very diverse, and e-learning systems need to adapt to fit the needs, preferences and learning styles of the learners (Begam & Ganapathy, 2013). Adaptive learning aims to individualise the learning process by tailoring content and teaching strategies to the learner's specific needs, preferences, knowledge or learning goals. There are three stages of adaptive e-learning, as shown in Figure 2.2.
[Figure: Learn the learner → Understand the learner → Deliver the course]
Figure 2.2: Stages of adaptive e-learning
Source: Adapted from Kumaran and Sankar (2013)
The first stage in the adaptive learning process is to get the learner's preferences, which include the preferred learning styles and the tools the learner is most likely to use to access the course. The system then needs to understand the learner before delivering the course as per the learner's preferences. E-learning systems need to automatically detect students' learning styles (Begam & Ganapathy, 2013) in order to deliver content in a manner that the learner will easily learn, understand and remember. A good e-learning mechanism must be based on learner characteristics. However, most e-learning platforms do not consider learner characteristics and even lack dynamic, real-time mechanisms that promote effective learning (Hsu, 2012). Hence, most e-learning platforms are not able to achieve adaptive learning effectively. When web-based courses are used by a more diversified student population, they can reach efficiency limits, as these students may have very different learning aims, backgrounds, knowledge levels, learning styles, thinking styles and competencies (Mahnane, Laskri, & Trigano, 2013). A web-based course intended for a certain group of students may not be suitable for other students. Therefore, a flexible, adaptive web-based course is required in order to meet the needs and preferences of diverse students. Adaptive e-learning helps educators evaluate the learning process, thereby making the web application more effective for learners by providing learner-centric content (Subrat & Devshri, 2011); learners can be automatically guided to activities, intelligently recommended resources and content, and given suggestions on areas in which to improve their performance based on online assignment results (Mahajan, Sodhi, & Mahajan, 2012).
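The three-stage loop above can be sketched in a hedged, minimal form. The profile fields, content variants and decision rules below are hypothetical illustrations rather than the mechanism of any particular LMS:

```python
# Hedged sketch of the three adaptive stages: learn the learner,
# understand the learner, deliver the course. All fields are illustrative.

def learn_the_learner(responses):
    """Stage 1: capture raw preferences, e.g. from a questionnaire."""
    return {"style": responses.get("preferred_style", "text"),
            "device": responses.get("device", "desktop")}

def understand_learner(profile):
    """Stage 2: map raw preferences to a delivery decision."""
    fmt = "video" if profile["style"] == "visual" else "text"
    layout = "mobile" if profile["device"] == "phone" else "full"
    return {"format": fmt, "layout": layout}

def deliver_course(lesson, decision):
    """Stage 3: pick the content variant matching the decision."""
    variants = {
        ("video", "mobile"): f"{lesson}: low-bandwidth video",
        ("video", "full"): f"{lesson}: full video lecture",
        ("text", "mobile"): f"{lesson}: condensed notes",
        ("text", "full"): f"{lesson}: full lecture notes",
    }
    return variants[(decision["format"], decision["layout"])]

profile = learn_the_learner({"preferred_style": "visual", "device": "phone"})
decision = understand_learner(profile)
print(deliver_course("Lesson 1", decision))  # Lesson 1: low-bandwidth video
```

Unlike the fixed client-server response, the delivered variant here depends on the learner's profile at every stage.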
2.8 Service-oriented architecture
Service-oriented architecture (SOA) is a basic architecture that represents the technically functional interfaces of software building blocks as reusable, distributed, loosely coupled and standardized accessible services (Vogel et al., 2011). SOA is an architectural paradigm for designing and building distributed systems on the basis of loosely coupled interacting services (Qinghua & Dong, 2008). Since SOA is concerned with the architecture of a distributed system, it is neither tied to nor
targeted towards a specific problem domain. The central concept in a service-oriented architecture is the service, which defines a mechanism that enables access to capabilities through a predefined interface. The basic structure of a SOA is depicted in Figure 2.3. The service provider is the platform that hosts access to the service; the service consumer is the application that is looking for and invoking or initiating an interaction with a service. The service consumer role can be played by a browser driven by a person or by a program without a user interface, for example another Web service (Newcomer, 2002).
[Figure: the Service Provider publishes its service description (WSDL, UDDI, XML) to the Service Registry; the Service Consumer finds and retrieves the description from the registry, then binds to and invokes the Service Provider over protocols such as SOAP, HTTP and SMTP]
Figure 2.3: Service-Oriented Architecture (SOA)
Source: Adapted from Kreger (2001), Erl (2005), Vogel et al. (2011) and IBM (2012)
The service registry is a searchable registry of service descriptions where service providers publish their service descriptions. Service consumers find services and obtain binding information (in the service descriptions) either during development, for static binding, or during execution, for dynamic binding. For statically bound service consumers, the service registry is an optional role in the architecture, because a service provider can send the description directly to service requestors (Kreger, 2001). To be accessible, a service description needs to be published so that the service consumer can find it. In the find operation, the service consumer retrieves a service description directly or queries the service registry for the type of service required, and
binding involves initiating an interaction with the service at runtime, using the binding details in the service description to locate, contact and invoke the service (Kreger, 2001). Services are discovered dynamically when needed, rather than being hard-coded into service consumers. Dynamic discovery of services is realized through the publish, find, bind pattern. This mechanism makes it easy to dynamically add, remove, replace or relocate services as needed without further modification of service consumers; adding extra services for load balancing or replacing faulty services are scenarios where this can be of importance. The relationship between Web services and SOA is depicted in Palanivel and Kuppuswami (2010). Using SOA, one can build durable e-learning content regardless of changes or evolutions in technology (Jabr & Al-Omari, 2010). WSDL and XML are used for defining Web services, and UDDI is used to register a Web service for prospective users (Connolly & Begg, 2005). SOAP, HTTP and SMTP are protocols used for communication over the Internet.
2.9 Web services
A Web service is a software system designed to support interoperable machine-to-machine interaction over a network (Palanivel & Kuppuswami, 2011). Microsoft defines Web services as small reusable applications written in XML, which allow data to be communicated across the Internet or an intranet between otherwise unconnected sources that are enabled to host or act on them (Connolly & Begg, 2005). A Web service exists to be invoked by, or to interact with, a service consumer. It can also function as a consumer, using other Web services in its implementation (Kreger, 2001). Web services expose only their interfaces to the public. Such an interface can be completely described using a Web Service Description Language (WSDL) document that characterizes the Web service interface in terms of the operations the Web service provides, the messages that are exchanged to do so, and the data types used to construct those messages.
An important characteristic of Web Services is that implementation details of the systems are hidden behind the interface.
The service description contains the details of the interface and implementation of the service. This includes its data types, operations, binding information and network location. It could also include categorization and other metadata to facilitate discovery and utilization by service consumers (Newcomer, 2002). The service description might be published to a service consumer or to a service registry.
The consumer is aware that certain functionality is provided, but the internals of how this is done are abstracted behind the Web service interface. Consequently, the actual functionality could be implemented in any programming language and run on any platform.
Simple Object Access Protocol (SOAP) (W3C, 2007) is an application-level protocol based on XML, used for data exchange and remote procedure calls in distributed applications, usually for accessing Web services. Due to its XML-based design, SOAP is platform and programming language independent. SOAP messages are transmitted embedded into, or on top of, other application-level protocols such as HTTP and SMTP.
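As a hedged illustration of the XML structure involved, the sketch below builds a minimal SOAP 1.1 envelope with the Python standard library. The getCourse operation and its namespace are hypothetical examples, not a real service:

```python
# Illustrative sketch: constructing a minimal SOAP 1.1 envelope.
# The SOAP envelope namespace is standard; the service namespace and
# the getCourse operation are hypothetical.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.org/elearning"  # hypothetical service namespace

envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
op = ET.SubElement(body, f"{{{SVC_NS}}}getCourse")
ET.SubElement(op, f"{{{SVC_NS}}}courseId").text = "CS101"

xml_bytes = ET.tostring(envelope)
# Such a message would normally be POSTed over HTTP (or embedded in
# another transport such as SMTP) to the service endpoint.
print(xml_bytes.decode())
```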
The Web Service Description Language (WSDL) (W3C, 2001) makes it possible to completely describe a Web service interface through an XML document that conforms to an XML Schema as defined by the WSDL specification of the W3C. WSDL provides a Web service consumer application with machine-processable information on how to interact with a given Web service. Since the Web service is fully described by the WSDL document, it is possible to generate client code for interacting with a given Web service from the definitions in the WSDL document.
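The kind of machine-processable information a code generator extracts can be sketched as follows. The WSDL document below is a deliberately simplified, hypothetical fragment (a real WSDL also declares messages, bindings and services):

```python
# Hedged sketch: extracting operation names from a simplified, hypothetical
# WSDL fragment; this is the kind of information used to generate client stubs.
import xml.etree.ElementTree as ET

WSDL_NS = "http://schemas.xmlsoap.org/wsdl/"  # standard WSDL 1.1 namespace

wsdl_doc = """<definitions xmlns="http://schemas.xmlsoap.org/wsdl/"
                           name="CourseService">
  <portType name="CoursePort">
    <operation name="getCourse"/>
    <operation name="listCourses"/>
  </portType>
</definitions>"""

root = ET.fromstring(wsdl_doc)
operations = [op.get("name") for op in root.iter(f"{{{WSDL_NS}}}operation")]
print(operations)  # ['getCourse', 'listCourses']
```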
The Universal Description Discovery & Integration (UDDI) (OASIS, 2002) specification defines a Web Service registry that allows possible Web Service consumers to dynamically discover Web Services that provide a certain service. While WSDL describes the Web Service interface, UDDI allows the discovery of the Web Service interface by clients. The UDDI registry is actually a Web Service itself and makes use of WSDL to describe its interface. The main purpose of the UDDI
registry is to allow client applications to dynamically discover Web Services that provide a required service. The use of UDDI allows the client applications to discover a replacement for failed Web Services.
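The failover idea — discovering a replacement for a failed Web Service — can be sketched as a client that retries across registered endpoints. All names here are illustrative; a real client would re-query the UDDI registry over the network rather than consult a local dictionary.

```python
class ServiceUnavailable(Exception):
    """Raised when a service endpoint cannot be reached."""


def flaky_service(x):
    # Stands in for a Web Service whose endpoint is currently down.
    raise ServiceUnavailable("endpoint down")


def backup_service(x):
    # A second provider offering the same service interface.
    return x * 2


# A local stand-in for a registry query returning candidate endpoints.
registry = {"doubler": [flaky_service, backup_service]}


def invoke(service_name, arg):
    # Try each discovered endpoint in turn; a failed endpoint is skipped
    # and a replacement is used transparently.
    for endpoint in registry[service_name]:
        try:
            return endpoint(arg)
        except ServiceUnavailable:
            continue
    raise ServiceUnavailable(f"no working endpoint for {service_name}")


print(invoke("doubler", 21))  # 42
```

The consumer never learns which provider served the request; the registry lookup absorbs the failure.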
The purpose of Web Service Notifications (WSN) is to allow Web Services to notify other interested entities of events that have occurred inside the Web Service (Vinoski, 2004). In the basic form of the publish-subscribe interaction pattern, a Web Service publishes a topic of events to which other interested entities may subscribe. The occurrence of an event inside the publishing Web Service then triggers the notification of subscribed entities. In addition to this basic form, Web Service Notifications may also involve an intermediary Web Service, called a notification broker, that may introduce additional features and enhanced scalability to the Web Service Notification architecture.

2.10 Reference architecture
This is an architecture that models the abstract architectural elements in the domain of interest independent of the technologies, protocols, and products that are used to implement a specific solution for the domain (OASIS Open, 2012). Reference architectures combine general architecture knowledge and general experience with specific requirements into a coherent architectural solution for a specific problem domain (Vogel et al., 2011). A reference architecture provides a better return on technology investment, enables faster deployment of technology, and provides a modular and flexible technology base. Documenting a software architecture means to describe or specify all the architectural elements, diagrams, models, decisions, rationale and anything else that concerns the architecture (Habraken, 2008). A Service-Oriented Reference Architecture helps organizations adopt service-oriented computing by mapping this new architectural approach to business processes, technology initiatives and overall business and information technology transformation (Palanivel & Kuppuswami, 2011). The reference architecture simplifies and accelerates service-oriented architecture solution development for an organization.
2.11 Functionality of e-learning systems
Designing effective e-learning systems across cultures is not an easy task. As Allen (2007) asserts, designing successful e-learning is part art and part science; it involves the appropriate utilization of learning and training theory and a solid understanding of electronic design tools and equipment, coupled with an appropriate understanding of the knowledge and/or skills to be taught. It also requires a blend of colour, style, sound and video used in a manner that educates while entertaining the student without distracting from the learning experience (Steen, 2008). E-learning course delivery is increasing rapidly, but the level of effectiveness of this learning experience has not been fully determined, particularly in multi-cultural programs (Swierczek, Bechter, & Chankiew, 2012). E-learning should be open, flexible and distributed, to suit diverse learners.
E-learning fails because of poor content and an ineffective interactive e-learning experience (Engelbrecht, 2003). When beginning to create e-learning content, the students' profiles and courses have to be stored in the system database and presented using a web application to facilitate the functionality of building an e-learning management system (Jabr & Al-Omari, 2010). Steen (2008) claims that another complication with the e-learning design process is that there is no one-size-fits-all approach to the design of e-learning systems; each organization and each course design has its own unique constraints, challenges and objectives. A good e-learning experience means the learner can master a subject, critically analyse and engage in collaborative inquiry, and acquire holistic learning (Engelbrecht, 2003).
As Clark and Mayer (2008) noted, there is a general process whereby the designer balances the training goals, learner differences, and the training environment. It is vital to determine the factors that affect students' perception of the quality of education while using e-learning systems. The two stakeholders, academic institutions and tutors, must address these issues to satisfy students before deploying any virtual learning environment.
E-learning takes place best by customizing content for the needs of the learners, focusing on learning outcomes and logically delivering content to achieve those outcomes (Engelbrecht, 2003). Learner interaction and becoming part of a community of other learners is important for effective understanding. Characteristics that relate to learning effectiveness are integration, complexity, stability and uniqueness (Huang, 2004). It is important to design e-learning with multiple viewpoints and to be consistent with the cultural values of the learner (Swierczek, Bechter, & Chankiew, 2012). Properly designed and implemented e-learning can effectively and efficiently meet an organization's factual, theoretical, logical, and procedural training needs (Steen, 2008).
Furthermore, Allen (2007) stresses that, as noted in the widely accepted theory of educational psychologist William Glasser, we learn: 10% of what we read; 20% of what we hear; 30% of what we see; 50% of what we see and hear; 70% of what we discuss with others; 80% of what we experience; 95% of what we teach someone else. Based on this scale, e-learning that includes student interactivity and personal involvement in the learning process has the potential to successfully deliver the highest student learning possible (Steen, 2008).
Lee (2011) identified learner satisfaction, a positive learning climate, self-reported learning outcomes, perceived skill development, flexibility of learning opportunities and enhanced interaction as indicators of effective e-learning. To be effective, e-learning design needs to consider four features and relate them to the learner's cultural context. These features are synchronicity (asynchronous or synchronous); location (instructor and participants at the same location or distributed); independence (individual or collaborative); and mode (electronic or blended) (Swierczek, Bechter, & Chankiew, 2012).
For an e-learning system to be effective, it must meet certain criteria. As noted by Steen (2008), effective e-learning generally has the following characteristics: success in reaching learning objectives; easy accessibility; a consistent and accurate message; ease of use; entertainment value; memorability; relevance; and reduced training costs. To succeed, the designer must properly balance the various factors involved in order to create an effective e-learning experience.
There are a number of factors associated with the effectiveness of e-learning. Table 2.2 shows a number of these factors.

Table 2.2: Key success factors of e-learning

Learning Platform: Interactive; Flexible; Modular; Context; Ease of access and navigation; Reliable; Interface design; Usability; Multimedia; Relevance
Process: Collaboration; Critical thinking; Problem solving; Continuous learning; Cooperation; Active learning; Feedback
Learner Factors: Self-management; Learning style; Time; Interaction; Motivation; Commitment; Engagement
Facilitator Factors: Responsive; Support; Teaching style; Interaction; Constructive feedback; Informative; Pedagogical model

Sources: Adapted from Selim (2007), Fressen (2007), Khan (2005), Aggrawal & Makkonen (2009), McPherson & Dunes (2008)

E-learning is too often focused on the technological and system requirements rather than the learner's needs (Halloran, 2008). Research has shown that learner attitudes and instructor quality are the most important factors for learner satisfaction (Swierczek, Bechter, & Chankiew, 2012). The logical structure of the e-learning program is important to the learner.
2.12 Educators' perceptions of functionality
This section presents various views of educators on the functionality attributes of e-learning systems. According to the ISO/IEC 9126 model (ACM, 2010), educators have various perceptions of the functionality attributes of e-learning systems, as highlighted in this section.

a. Suitability
Suitability of e-learning systems can be seen as educators' perceptions of the extent to which the available e-learning system provides an appropriate set of functions for required pedagogic tasks and user objectives such as: learning content creation with templates; content delivery; management of student records; tracking students' progress; communication and collaboration; organizing students into groups for projects; managing group discussion on topics; conducting assessments; maintaining records of assessment; maintaining records of teaching materials and aids; and assigning tutors to courses.

b. Accurateness
This is the educators' perception of the extent to which the available e-learning system provides expected results or effects for specified tasks and user objectives such as: making announcements; creating and updating course information; managing group discussion; uploading course material; e-mail; file exchange; calendar; drawing tools; chat room discussions; updating grade books; conducting online testing; and producing course statistics.

c. Interoperability and data compliance
This is the educators' perception of the capability of the e-learning system to: access content from, and provide content to, digital libraries and other e-learning systems; export data (i.e. making data like marks available to other systems); and import data (e.g. class lists from SMS and curricula materials developed by educational and academic publishers).
d. Compliance
This is the educators' perception of the capability of the e-learning system to adhere to standards: it defines a common format for information about learners, including preferences, details, objectives, competencies, activities, interests, etc., which can be freely exchanged among systems (IMS Learner Information Packaging specification); it describes how to sequence activities within a course (IMS Simple Sequencing specification); it models different aspects of a unit of learning of a course, such as the different roles, activities or resources, the synchronization of different user actions, and activity or resource sequencing depending on conditions (IMS Learning Design specification); it describes how digital resources can be organized into logical learning units called content packages, e.g. how smaller units of learning (such as lessons and activities) should be arranged (IMS Content Packaging specification); it defines generic ways of specifying tests, assessments and questions that can be realized in many different systems (IMS Question and Test Interoperability specification); it complies with the information security standard ISO 27001; it complies with WAI WCAG 1.0 AAA guidelines to facilitate content sharing and reuse; it enables sharable, durable, and reusable web-based learning content (Sharable Content Object Reference Model (SCORM)); it complies with standards for underlying technologies such as HTML and XML (W3C standards); it complies with standards for accessibility of Web content (W3C); and it complies with instructional standards for sharing instructional materials with other online learning systems.

e. Security
This is the educators' perception of the security mechanisms of the e-learning system to maintain the confidentiality of information about learners, especially marks: identification and authentication of staff and students using usernames and passwords; SSL for login procedures and encryption of critical data; HTTPS for web pages displaying or transmitting sensitive data; encryption for encoding data as it travels over the network; password protection of all courses, events and resources; a secure set of user privileges (role-based access control), which determine the permission levels (such as creation and updating of learning materials by teachers) that users need to control, manage, and update content; private messaging or lecturer-to-student
messaging; screening of student posts to prevent distribution of undesirable material; and security mechanisms for chat rooms.
The e-learning system attributes highlighted were analyzed in addition to other constructs of functionality in order to establish the functionality of web-based e-learning systems in Kenyan universities. By integrating the Technology Acceptance Model (TAM), the DeLone and McLean Information System Success (ISS) model, and the Expectation Confirmation Model (ECM), the study established factors that influence the functionality of these systems and user satisfaction.

2.13 Components of effective e-learning
As noted by Brown and Voltz (2005), instructors need to use a variety of techniques in order to meet the various learning styles of their participants (as cited in Steen, 2008). To meet this need, Brown and Voltz propose using content, feedback and experience activity components for effective training in the training design. Companies and industries have developed an almost countless number of unique e-learning tools, techniques and technologies to help in the design of e-learning, so one of the major problems an e-learning designer faces is selecting those that best meet the needs and constraints of particular users in various domains (Steen, 2008). As Horton and Horton (2003) point out, people, not technology, create and use e-learning. There are three specific categories of people involved in e-learning design and utilization: producers, hosts, and learners. The producers consist of the authors, designers, illustrators, programmers and other creative individuals involved in the design and creation of the e-learning. The hosts consist of the organizations and equipment utilized to present and provide the e-learning to the learners. The learners are the users (students, readers, and/or workers) of the e-learning system. Each of these categories of people has specific technology requirements.
As Horton and Horton (2003) recommend, the learner's technology has to be the driving factor in the determination of which technology and technique to use in the design and hosting activities.
Anaraki (2004) points out that the requirements and products used in the various stages involved (designing, offering and accessing e-learning) must also be kept in mind. This is accomplished by systematically identifying and properly weighing user constraints and requirements (e.g. the time required for the typical user to learn how to use the product against the time available for the entire development process; budgetary limits versus product costs) and evaluating the various products against the resulting criteria (Steen, 2008). As Allen (2006) points out, various concepts and techniques are required to successfully design e-learning applications. In particular, present-day instructional design scenarios and the decisions involved at various points in the design process provide insight into appropriate design concepts and techniques. Institutions are in need of effective training systems, and training system designers have to design effective e-learning systems to meet these needs. Effective e-learning involves the skilful use of learning and training theories and also requires an understanding of the knowledge and skills to be taught (Steen, 2008). Steen (2008) further argues that the design process must also take into consideration the constraints involved in all phases of the training design and implementation, and the diversity of technological equipment, tools and techniques involved. The designer therefore needs to identify and balance the various elements involved in the design in order to come up with an effective e-learning system.

2.14 Need to evaluate e-learning systems
Evaluation is a course of action for determining the value and effectiveness of a learning system, with benefits such as error correction, establishing the users' point of view and reducing unsupportable design issues in a system (Allen, 2003).
According to Shepherd (2005), there are four reasons for evaluating e-learning systems: validating training as a business tool, justifying the costs incurred in training, helping to improve the design of the system, and helping to select training methods (as cited in Kiget, 2012). In this study, evaluation was done to help enhance the design of training and the e-learning system in order to improve its functionality and user satisfaction.
2.15 Software quality model
This evaluation was based on the ISO 9126 software quality model. The original model defined six product characteristics as shown in Figure 2.4.
Figure 2.4: ISO 9126 Software quality model (Source: ISO 9126 Software Quality Model, 1991). The model groups software quality into six characteristics, each with a guiding question: Functionality (are the required functions available?); Reliability (how reliable is the software?); Usability (is the software easy to use?); Efficiency (how efficient is the software?); Maintainability (how easy is it to modify the system?); and Portability (how easy is it to transfer to another environment?).
The above software characteristics are further divided into sub-characteristics. The sub-characteristics of the functionality attribute are shown in Table 2.3.
Table 2.3: Functionality attributes

Attribute         Explanation
Suitability       The ability of the software to perform the tasks required
Accurateness      The results from the software match the expected results
Interoperability  The e-learning system can interact with other LMSs
Compliance        The system is compliant with standards, laws and regulations
Security          The system prevents unauthorized access
Source: ISO 9126 Software Quality Model (1991)

These characteristics and sub-characteristics represent a detailed model for evaluating any software system. Indeed, Abran, Khelifi, Suryn & Seffah (2003) claimed that, even though it is not exhaustive, the ISO 9126 model constitutes the most extensive software quality model developed, and is simpler than the IEEE P1484.1 LTSA model, SCORM or IMS. Unlike these other frameworks, ISO 9126 covers a wide spectrum of system features, including both technical requirements and human interaction with the system. This study was based on the ISO 9126 model.

2.16 E-learning systems user satisfaction
User satisfaction is often regarded as an individual's feeling of pleasure or disappointment resulting from comparing a product's performance (or outcome) with his or her expectations (Chiu, Hsu, Sun, Lin, & Sun, 2005). Previous research has suggested strongly that satisfaction has a positive impact on future intentions to repurchase (Oliver, 1980). A study on usage of online banking services established the significance of satisfaction as a predictor of IS continuance (Bhattacherjee, 2001), while Van Riel et al. (2001) found that satisfaction has a strong impact on intention to continue using a portal site, and users' continuance intention is determined by satisfaction (Roca, Chiu, & Martinez, 2006). Literature on technology usage has established that users' continued usage intention of information and communication technologies is determined by usage satisfaction, which in turn is jointly influenced by perceived usability, perceived quality and usability disconfirmation (Roca, Chiu, & Martinez, 2006). Hsu, Chen, and Chiu (2003) and Khalifa and Liu (2002) noted that information quality, system quality and
service quality are conceptualized as three different constructs for operationalizing perceived performance. DeLone and McLean (1992) noted that system quality and information quality are directly related to user satisfaction and system use. Their findings were supported by Bharatia and Chaudhury (2004) and McGill, Hobbs, and Globa (2003) who found that information quality and system quality as separate constructs, are related to satisfaction.
Negash, Ryan, and Igbaria (2003) concluded that information and system quality impact satisfaction while service quality has no significant relationship with satisfaction; in contrast, service quality was found to be a significant predictor of satisfaction by Lai (2004), while Rai, Lang, and Welker (2002) established that information quality influences satisfaction. A recent study by Ozkan and Koseler (2009) used the hexagonal e-learning assessment model (HELAM), which consists of six dimensions, namely (1) system quality, (2) service quality, (3) content quality, (4) learner perspective, (5) instructor attitudes, and (6) supportive issues, to evaluate a web-based learning system. Results showed that each of the six dimensions of the model had a significant effect on the learners' perceived satisfaction. A study by Sharkey, Scott, and Acton (2010) using DeLone and McLean's Information System Success Model (DeLone & McLean, 2003) in an e-commerce environment found that information quality and system quality are significantly related to user satisfaction, intention to use and intention to transact. In another study by Ramayah, Ahmad and Lo (2010) in an e-learning environment in Malaysia, the impact of information quality on intention to use was found to be fully mediated by user satisfaction. User satisfaction is directly influenced by service quality and perceived usefulness, whilst perceived usefulness is directly influenced by trust and information quality (Ramayah, Ahmad, & Hong, 2012). E-learning system developers and implementers need to ensure the availability of quality, relevant and complete information to meet the needs of students and ensure user satisfaction, without side-lining the importance of a reliable and accessible system. Completeness of the information provided by the e-learning system seems to bring a greater sense of satisfaction among users (Ramayah, Ahmad, & Hong, 2012).
This study integrated the Technology Acceptance Model (TAM) (Davis, 1989), the updated DeLone and McLean ISS model (DeLone & McLean, 2003) and the ECT model (Bhattacherjee, 2001) to investigate perceived web-based e-learning user satisfaction in Kenyan universities and hence establish the functionality of these systems.

2.17 Technology evaluation models
The Technology Acceptance Model (TAM), developed by Davis et al. (1989), is intended to measure, predict, and explain user acceptance of information technology. TAM theorizes that perceived usefulness and perceived ease of use determine users' behavioural intention and actual usage. TAM specifies the causal relationships between system design features, perceived usefulness, perceived ease of use, attitude toward using, and actual usage behaviour. The Technology Acceptance Model provides an informative representation of the mechanisms by which design choices influence user acceptance, and is therefore helpful in applied contexts for forecasting and evaluating user acceptance of information technology. TAM is shown in Figure 2.5.

Figure 2.5: Technology acceptance model (Source: Davis, 1989). External variables influence perceived usefulness (PU) and perceived ease of use (PEOU), which shape attitude towards using (A); attitude drives behavioural intention (BI), which leads to actual use.

In the Technology Acceptance Model there are two determinants: perceived ease of use and perceived usefulness. Perceived usefulness is the degree to which an individual believes that using a particular information system or information technology would enhance his or her job or life performance. Perceived ease of use is the degree to which a person believes that using a particular information system or information technology would be free of effort (Davis, 1989).
Perceived ease of use and perceived usefulness positively affect attitudes toward an information system and, further, positively affect individuals' intentions to use and acceptance of the information system. In addition, perceived ease of use positively affects perceived usefulness, and both perceived ease of use and perceived usefulness are influenced by external variables. DeLone and McLean's Information System Success Model (1992) covers six dimensions of system success: system quality, information quality, use, user satisfaction, individual impact, and organizational impact. System quality and information quality singularly and jointly affect both use and user satisfaction. Intention to use (use) has positive and negative effects on the degree of user satisfaction and vice versa. Both use and user satisfaction influence the individual, which eventually impacts the organization. An updated ISS model (DeLone & McLean, 2003) added a service quality measure as a new dimension of the IS success model and grouped both the individual and organizational impact measures into a single impact or benefit category called net benefit. Figure 2.6 shows the DeLone and McLean ISS model.

Figure 2.6: DeLone and McLean's ISS Model (Source: DeLone & McLean, 2003). Information quality, system quality and service quality influence intention to use/use and user satisfaction, which together yield net benefits.

The Expectation-Confirmation Theory (ECT) (Bhattacherjee, 2001) asserts that consumers' intention to repurchase a product or service is significantly influenced by their prior experience with that product or service (Anderson & Sullivan, 1993). Lower expectation and/or higher perceived performance may lead to a greater confirmation, which results in positive influences on customer satisfaction and continuance intention. Reversing the relationship would cause disconfirmation,
dissatisfaction, and discontinuance intention (Ramayah, Ahmad, & Hong, 2012). Bhattacherjee (2001) suggests that the information system users’ continuance decision is similar to consumers’ repurchase decision because both decisions (1) follow an initial (acceptance or purchase) decision, (2) are influenced by the initial use (of IS or product) experience, and (3) can potentially lead to an ex-post reversal of the initial decision.
The theoretical framework for this study was based on Davis' (1989) TAM, the updated DeLone and McLean ISS model (DeLone & McLean, 2003) and the ECT model (Bhattacherjee, 2001). Based on an integrative review of the literature, six factors believed to most significantly influence user satisfaction in using e-learning were chosen for inclusion in this study. These factors, as proposed by DeLone and McLean (1992), are (1) system quality, (2) information quality, (3) use, (4) user satisfaction, (5) individual impacts, and (6) organizational impacts. These factors are categorized into three distinct dimensions: (1) trainee (computer self-efficacy and motivation to learn), (2) course (ease of use and contents of training), and (3) organization (management support and organizational support). The effect of user satisfaction on e-learning functionality, in terms of continuance intention and net benefits to the individual, was then investigated.

2.18 Evaluation of e-learning systems user satisfaction
Most institutions of higher learning are implementing e-learning systems hoping to reap benefits from them. However, there are issues and challenges with the implementation of e-learning that need to be avoided or resolved in order to enhance user satisfaction and e-learning effectiveness (Ramayah, Ahmad, & Hong, 2012). There are seven issues and challenges in implementing e-learning systems in developing countries: lack of awareness amongst the population, low adoption rates, bandwidth and connectivity limitations, computer illiteracy, lack of quality e-learning content, difficulty in engaging learners online, and language barriers. According to Pituch and Lee (2006), there are three major constructs that determine the functionality of an e-learning system: system characteristics, user characteristics, and usefulness and outcome constructs. This study analysed these
three constructs in detail to determine the functionality of web-based e-learning systems in Kenyan universities. The main feature of an e-learning system that contributes to its effectiveness is functionality (Pituch & Lee, 2006). Effective e-learning systems must provide for interactivity. Key to the learning process are the interactions among students themselves, the interactions between faculty and students, and the collaboration in learning that results from these interactions. Self-efficacy and Internet experience are some of the user characteristics that determine the functionality of an e-learning system. Despite all the perceived benefits of e-learning, research indicates that a high proportion of students who commence e-learning courses do not finish them (Dutton, Dutton, & Perry, 2002) because many are dissatisfied with the e-learning experience (Ramayah, Ahmad, & Hong, 2012). According to Loh (2007), system quality, information quality, and perceived usefulness positively influence e-learning effectiveness. This is further emphasized by Ramayah et al. (2012): information quality and system quality are important factors leading to increased usage and user satisfaction of e-learning. This study focused on finding the factors that determine and influence effective e-learning in Kenyan universities by analysing the functionality of the web-based e-learning system.

2.19 Summary
In this chapter, an assessment of previous literature on e-learning and its functionality was done. The chapter focused on e-learning technologies, web-based e-learning systems user satisfaction, e-learning functionality and educators' perception of e-learning system functionality. The chapter also investigated the components of effective e-learning and looked at the various models used for evaluating e-learning system user satisfaction in order to determine its functionality. Finally, the current architecture of web-based e-learning systems was surveyed.
The next chapter presents the methodology that was used in executing the study.
CHAPTER THREE
RESEARCH METHODOLOGY

3.1 Introduction
The purpose of this chapter is to explain the methodology that was used to achieve the objectives of the study. The main activities carried out in the process of this study were: a preliminary study, a pilot study, identification of the representative sample population, data collection, and data analysis and representation.

3.2 Research design
According to Kothari (2004), research design is the conceptual structure within which research is conducted; it constitutes the blueprint for the collection, measurement and analysis of data. Research design links the data to be collected and the conclusions to be drawn to the initial questions of the study; it provides a conceptual framework and an action plan for getting from questions to a set of conclusions (Yin, 2003).
This study used a case study approach. A case study is an empirical inquiry that investigates a contemporary phenomenon within its real-life context, especially when the boundaries between the phenomenon and context are not clearly evident, and in which multiple sources of evidence are used (Yin, 2003). A case study involves a careful and complete observation of a social unit that places more emphasis on the full analysis of a limited number of events or conditions and their interrelations (Kothari, 2004). He further asserts that a case study is essentially an intensive investigation of the particular unit under consideration. According to Klein and Myers (1999), a case study provides deep understanding of the phenomena under study (as cited in Kiget, 2012, p. 35).

3.3 Location of study
The study was carried out at Kenyatta University, a public university located in Nairobi County, approximately 24 kilometres from Nairobi city. The university is implementing the MOODLE web-based e-learning and content management system.
3.4 Target population
The research targeted students and lecturers using the web-based e-learning system in the Department of Computer Science at Kenyatta University. This is because the majority of students and lecturers using the MOODLE web-based e-learning system at Kenyatta University come from this department (Kiget, 2012). Since the required sample of lecturers was larger than the number of lecturers using the web-based e-learning system in the target department, some lecturers from other departments using the system were involved in the study.

3.5 Sampling techniques
Sampling is the process of selecting a number of individuals from a population such that the selected group contains elements representative of the characteristics found in the entire target population (Kombo & Orodho, 2003). Mugenda and Mugenda (1999) assert that a sample is a small group obtained from the accessible population as a representative of the whole population. A sample design is a definite plan for obtaining a sample from a given population; it refers to the technique or procedure the researcher adopts in selecting items for the sample (Kothari, 2004).

The study sample was derived using both purposive sampling and simple random sampling techniques, where each respondent had an equal probability of being chosen from their respective category. Purposive sampling is a non-probability sampling procedure which does not afford any basis for estimating the probability that each item in the population has of being included in the sample (Kothari, 2004). It is a sampling technique that allows a researcher to select and use cases that have the required information with respect to the objective of the study (Mugenda & Mugenda, 1999). Purposive sampling was used to select content developers and publishers. The criterion used in purposive sampling was based on experience in the usage and non-usage of e-learning systems.
Simple random sampling is a probabilistic sampling technique where each and every item in the population has an equal chance of inclusion in the sample and each of the possible samples, in the case of a finite universe, has the same probability of being selected (Kothari, 2004). According to Mugenda and Mugenda (1999), this technique gives every sample in a given accessible population an equal chance of being selected. This technique was used to select the sample populations of students and lecturers used in the study; students and lecturers were the main respondents. Random sampling provides an efficient system of capturing, in a small group, the variations or heterogeneity that exist in the target population (Mugenda & Mugenda, 1999). According to Kothari (2004), random sampling is the best technique for selecting a representative sample because it enables the sample to have the same composition and
characteristics as the target population.

3.6 Sample population
The sample population for this study was obtained using the Cochran (1963) formula n = z^2pq / d^2, where n is the desired sample size, z is the standard normal deviate, p is the proportion of the target population estimated to be using the web-based e-learning system, q = 1 - p, and d is the degree of accuracy allowed.

This study used a 93% (0.93) confidence level for the students' sample, which corresponds to a standard normal deviate (z) of 1.81; p is unknown and hence set at the worst-case value of 50% (0.5), and d is 7% (0.07). For the students' sample:

n = (1.81^2 × 0.5 × 0.5) / 0.07^2 = 0.819025 / 0.0049 ≈ 167

For the lecturers' and developers' sample, this study used a 90% (0.90) confidence level, which corresponds to a standard normal deviate (z) of 1.64; p is again unknown and hence set at the worst-case value of 50% (0.5), and d is 10% (0.10). For this sample:

n = (1.64^2 × 0.5 × 0.5) / 0.10^2 = 0.6724 / 0.01 ≈ 67
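The two sample-size calculations above can be sketched in Python. The z, p and d values are those stated in the text; rounding the result down to a whole respondent is my assumption, chosen only because it matches the reported figures:

```python
import math

def cochran_sample_size(z: float, p: float, d: float) -> int:
    """Cochran (1963) sample size for a large or unknown population:
    n = z^2 * p * q / d^2, where q = 1 - p."""
    q = 1.0 - p
    n = (z ** 2) * p * q / (d ** 2)
    return math.floor(n)  # assumption: round down to a whole respondent

# Students: 93% confidence (z = 1.81), p = 0.5 worst case, d = 0.07
print(cochran_sample_size(z=1.81, p=0.5, d=0.07))  # 167

# Lecturers/developers: 90% confidence (z = 1.64), p = 0.5, d = 0.10
print(cochran_sample_size(z=1.64, p=0.5, d=0.10))  # 67
```

Setting p at 0.5 maximises p·q and hence the computed sample size, which is why it is the worst-case choice when the true proportion is unknown.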
The exact proportion of users of the web-based e-learning system in the target population was not known; hence p was set at the worst-case value of 50% (0.5), as proposed by Fisher et al. (1998). Cochran's formula is appropriate when the target population is large or unknown (Mugenda & Mugenda, 2003).

Table 3.1: Sample population
Category of Respondent       Target Population   Sample   Degree of Accuracy
Students
  Postgraduate               Unknown             33
  Undergraduate              Unknown             99       7% (0.07)
  Diploma                    Unknown             35
  Subtotal                   Unknown             167
Lecturers/Developers
  Lecturers                  Unknown             55
  Developers                 Unknown             12       10% (0.10)
  Subtotal                   Unknown             67
Total                        Unknown             234      Mean accuracy (0.085)
Source: Research (2013)

3.7 Instruments of data collection
The researcher used the following instruments to collect data from the identified sample of study.

3.7.1 Content analysis
Content analysis consists of analysing the contents of documentary materials such as books, magazines, newspapers, journal articles and all other verbal materials, whether spoken or printed (Kothari, 2004). This method can also be used to analyse software and its documentation. It was used to collect and analyse information on existing web-based e-learning system architectures in order to find out their strengths and weaknesses. Literature on previous research in the study area was analysed to establish the strengths and weaknesses of the current web-based e-learning system architectures.

3.7.2 Questionnaire
A questionnaire consists of a number of questions printed or typed in a definite order on a form or set of forms (Kothari, 2004). Two sets of questionnaires were administered to the sample population. One set of the questionnaires was
administered to the students and the other to the lecturers. The questionnaires were formulated based on the objectives of the study. The study used both structured and unstructured questionnaires. Structured questionnaires are those in which there are definite, concrete and predetermined questions, presented with exactly the same wording and in the same order to all respondents (Kothari, 2004). The questionnaires included both closed and open questions. According to Kothari (2004), structured questionnaires are simple to administer and relatively inexpensive to analyse. Questionnaires are free from the bias of the interviewer, they offer respondents adequate time to give well-thought-out answers, they are handy where respondents are not easily approachable, and large samples can be used, making the results more dependable and reliable (Kothari, 2004).

3.8 Validation of the research instruments
The validity of a test is a measure of how well the test measures what it is expected to measure (Kombo, 2006). Kothari (2004) asserts that validity refers to how well a specific piece of research measures what it claims to measure. Scholarly research has to demonstrate evidence of accurateness, generalizability and replication (Toili, 2007). Mugenda and Mugenda (1999) argue that validity is the accuracy and meaningfulness of inferences, which are based on research results. This study used face and content validity to determine the validity of the research tools. Face and content validity are secured via a panel of experts who judge the survey's appearance, relevance and the representativeness of its elements (Burton & Mazerolle, 2011). Face validity is an estimate of the degree to which a measure is clearly and unambiguously tapping the construct it purports to assess. The questionnaires and interview schedules were analysed by experts before being administered.
Three experts were asked to analyse these tools and rate them on a scale of one to ten (1 - 10). The average of the scores from the experts was calculated, and the instruments were deemed to have face validity if the average score was 60% or above (Burton & Mazerolle, 2011). As stated by Burton and Mazerolle (2011), the purpose of this assessment is to establish an instrument's ease of use, clarity and readability. The results of this analysis are shown in Table 3.2.

Content validity was also examined. Content validity is a measure of the degree to which data collected using a particular instrument represents a specific domain of indicators or content of a particular concept (Mugenda & Mugenda, 1999). Content validity was assessed using a group of experts in e-learning systems, who judged what concept the instrument was trying to measure and whether the set of items or checklists accurately represented the concept under study. Three experts were asked to assess the content validity of the tools and rate them on a scale of one to ten (1 - 10). The average from the three experts was calculated, and the instruments were deemed to have content validity if the average score was 6/10 (60%) or above (Burton & Mazerolle, 2011). The results of this analysis are shown in Table 3.2.

Table 3.2: Instrument validity analysis
Expert    Face Validity (x/10)   Content Validity (x/10)   Average (x/10)   Verdict
1st       8/10                   7/10                      7/10 (0.70)      Acceptable
2nd       7/10                   6/10                      6/10 (0.60)      Acceptable
3rd       6/10                   8/10                      7/10 (0.70)      Acceptable
Average   7/10                   7/10                      7/10 (0.70)      Acceptable
Source: Derived from research instruments (2013)

The purpose of this assessment was to establish the instrument's credibility, accuracy, relevance, and breadth of knowledge regarding the domain of study (Burton & Mazerolle, 2011).

3.9 Reliability of the research instruments
Reliability is the measure of how accurate and precise an instrument or measurement procedure is (Kothari, 2004). It is the measure of the extent to which results are consistent over time and an accurate representation of the total population under study. This means that a reliable instrument is stable and will collect the same data if used in other similar studies. It is a measure of the degree to which a research instrument yields consistent results or data after repeated trials (Mugenda & Mugenda, 1999).

A pilot study was conducted to warrant the reliability of the questionnaires used in this study. A pilot study is a preliminary survey of the main study conducted by experts that helps bring to light the weaknesses, if any, of the questionnaire and of the study techniques (Kothari, 2004). A pilot study requires at least one or two participants, and they must be representative of the actual users (Nielsen, 1993). Six respondents, 3 students and 3 lecturers familiar with the e-learning system, were selected randomly from Kenyatta University for the pilot study. The objective of the pilot study was to ensure there was no ambiguity in the questions and to check the reliability of the questionnaire. The questionnaires and interview schedules were also checked by the supervisors, and Cronbach's Coefficient Alpha was used to ascertain the reliability of these tools. The reliability of the tools used in this research is depicted in Table 3.3.

Table 3.3: Cronbach's Coefficient Reliability
Cronbach's Alpha   Cronbach's Alpha Based on Standardised Items
0.704              0.712

Source: Derived from research data (2013)

According to Nunnally and Bernstein (1994), a tool is deemed reliable if the Cronbach's Coefficient value is above 0.70. This method uses a correlation of the scores obtained on one item with the scores obtained on the other items of the same instrument (Mugenda & Mugenda, 1999). According to Mugenda and Mugenda (1999), a higher coefficient implies that items correlate highly among themselves; there is consistency among the items in measuring the concept of interest.

According to Kothari (2004), a measuring instrument is reliable if it provides consistent results. Both the equivalence and stability aspects of reliability were tested on the tool during the pilot study. Stability reliability is concerned with securing consistent results on repeated measurements of the same respondent with the same instrument, while the equivalence aspect considers how much error may be introduced by different investigators or different samples of the items being studied (Kothari, 2004).

3.10 Data collection procedure
The researcher obtained an introductory letter from the School of Graduate Studies, Masinde Muliro University of Science and Technology. This letter was used to obtain a research permit from the Kenya National Commission for Science, Technology and Innovation (NACOSTI). The permit was used as an authority to conduct the study. The researcher personally travelled to Kenyatta University to distribute the questionnaires to the sample population under study. Respondents who were not available or reachable during the visit were sent questionnaires by mail. The respondents were given a period of two weeks to fill in the questionnaires, after which the researcher visited the university to collect the completed questionnaires.

3.11 Methods of data analysis
The process of analysis involves the search for things that lie behind the surface content of the data – core elements that explain what the thing is and how it works (Denscombe, 2007). This study analysed both qualitative and quantitative data. On both sets of data, both descriptive and inferential statistics were used in the analysis. Descriptive statistics is concerned with organising and summarising the data at hand to render them more comprehensible, while inferential statistics deals with the kinds of inferences that can be made when generalising from data, as from sample data to the entire (target) population (Mouton, 1996). The descriptive statistics used include measures of central tendency (mean, mode and median) and measures of variability (standard deviation and variance). These descriptive statistics were used to develop indices and measures to summarise the collected data (Kothari, 2004).
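The Cronbach's Coefficient Alpha check described in Section 3.9 can be computed with nothing beyond the Python standard library. A minimal sketch, assuming the questionnaire responses are arranged as one list of scores per item; the sanity check on perfectly consistent items is illustrative, not research data:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's coefficient alpha for k questionnaire items.

    items: list of k columns, each a list of the respondents' scores on one item.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    k = len(items)
    n = len(items[0])
    sum_item_vars = sum(pvariance(col) for col in items)
    # Each respondent's total score across all items
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / pvariance(totals))

# Sanity check: identical item columns are perfectly consistent, so alpha = 1.0
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]))  # 1.0
```

A value above 0.70, as in Table 3.3, would then be read as acceptable reliability (Nunnally & Bernstein, 1994).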
The inferential statistics used were correlation, for analysing the degree of relationship between two variables (with its significance used to establish the generality of the results), and analysis of variance (ANOVA), used to determine whether there were significant differences in satisfaction with the web-based e-learning system among students and lecturers. Generally, these inferential statistics were used to generalise the results from the sample population to the target population (Mugenda & Mugenda, 1999). They were used to determine the web-based e-learning system's functionality and user satisfaction. Table 3.4 summarises the objectives, the type of data collected for each objective and the method of analysis applied.

Table 3.4: Summary of data analysis
Objective                                                  Type of data        Statistical tests used in analysis
To analyse the existing web-based e-learning               Nominal             Qualitative analysis
system architectures
To evaluate functionality of web-based e-learning          Nominal             Descriptive and Correlation
systems in Kenyan universities
To find out factors that determine web-based               Nominal             Correlation and ANOVA
e-learning systems' user satisfaction in Kenyan
universities
To develop an integrated adaptive service-oriented         Nominal (during     Descriptive and ANOVA
reference architecture to improve the functionality        validation)
of web-based e-learning systems
Source: Researcher (2013)

3.12 Presentation of results
After the collection and analysis of data, the results of the study were presented using tables and graphical and statistical techniques. Some of the statistical techniques used are frequency distributions, cross-tabulations, pie charts and bar graphs.

3.13 Ethical considerations
The research proposal was submitted to and approved by the School of Graduate Studies (SGS) and the senate of Masinde Muliro University of Science and Technology. The researcher then acquired a research permit from the Kenya National Commission for Science, Technology and Innovation (NACOSTI) before proceeding to collect data. The questionnaires were accompanied by a signed introductory letter introducing the researcher and briefly explaining the purpose of the study. Before conducting any interview, the researcher sought the consent of the participants and explained to them the purpose of the interview and the confidentiality of the information they were to provide. The information provided was treated with utmost confidentiality and was used for academic purposes only. All materials used in the study were acknowledged, with their sources and authors listed.
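The two inferential tests listed in Table 3.4, correlation and one-way ANOVA, can be sketched from first principles with the standard library. The score lists below are hypothetical placeholders for the questionnaire data, not research results:

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two variables."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def anova_f(*groups):
    """One-way ANOVA F statistic: between-group mean square
    divided by within-group mean square."""
    scores = [s for g in groups for s in g]
    grand = mean(scores)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((s - mean(g)) ** 2 for s in g) for g in groups)
    df_between = len(groups) - 1
    df_within = len(scores) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical satisfaction scores for two respondent groups
students_scores = [4, 3, 5, 4, 2, 4]
lecturers_scores = [3, 3, 4, 2, 3, 3]
print(round(anova_f(students_scores, lecturers_scores), 3))
```

A large F (relative to the F distribution for the given degrees of freedom) would indicate a significant difference in satisfaction between the groups; in practice a statistics package would also report the p-value.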
CHAPTER FOUR
DATA ANALYSIS AND PRESENTATION

4.1 Introduction
This chapter presents the data collected and its analysis. Data is analysed both quantitatively and qualitatively. Quantitative analysis presents the distribution of data across various demographic variations or factors. Inferential statistics were used to find out how the independent variable influences the dependent variable and the significance of the results obtained. A total of 230 student questionnaires were distributed, of which 172 were fully filled and returned. A total of 70 questionnaires were returned by lecturers and content developers out of the 105 distributed. The study used samples of 167 and 67 for students and lecturers respectively, with alpha (α, the significance coefficient) values of 0.07 for students and 0.10 for lecturers.

4.2 Quantitative Analysis
This section presents the demographic distribution of the data. Data is distributed across various demographic factors such as age group, gender, year of study, mode of study and level of study. It also presents the distribution of data among other factors such as duration of Internet usage, period of web-based e-learning system use, student learning styles and the various web-based e-learning systems respondents have used.

4.2.1 Distribution of respondents by gender and age
There was a total of 105 male student respondents, corresponding to 62.9% of the sample population. Female student respondents were 62, corresponding to 37.1% of the total sampled. This distribution is depicted in Table 4.1. The findings show that most of the student respondents were male, contributing 62.9% of the total sample, with females representing the remaining 37.1%. These results agree with the findings of Kiget (2012) and Aminul, Noor, Liang and Momtaz (2011), who found that more male students than female students use web-based e-learning systems, and of Saeed, Yang and Sinnappan (2009), who also found that the majority of e-learning system users are male.
This is because males are more comfortable using ICT, owing to their technology access, skills and interest, as opposed to their female counterparts (Volman & van Eck, 2001).
Table 4.1: Student respondents' distribution by gender and age groups
Gender           Below 20   20 – 25   25 – 30   Above 30   Total   %
Female           12         23        17        10         62      37.1
Male             15         48        26        16         105     62.9
Total            27         71        43        26         167     100.0
Percentage (%)   16.2       42.5      25.7      15.6       100
Source: Research data (2013)

As shown in Table 4.1, most of the respondents were aged between 20 and 25 years (42.5% of the respondents), followed by the 25 – 30 age group with 25.7% of the total respondents. This may be because most students join the university at the age of about twenty. Students below the age of 20 and those aged above 30 years contributed 16.2% and 15.6% of the sample population respectively. Most of the respondents are middle-aged in respect of the study, which agrees with the findings of Aminul et al. (2011), who found that most e-learning users are aged below 25 years.

There was a total of 28 female lecturer respondents (41.8% of the total sample), while the male lecturer respondents were 39 (58.2% of the total lecturers' sample). Table 4.2 shows the distribution of lecturer respondents by gender and age.

Table 4.2: Lecturer respondents' distribution by gender and age groups
Gender           20 – 30   30 – 40   40 – 50   Above 50   Total   %
Female           4         12        9         3          28      41.8
Male             3         14        18        4          39      58.2
Total            7         26        27        7          67      100
Percentage (%)   10.45     38.80     40.30     10.45      100

Source: Research data (2013)
The findings in Table 4.2 show that most of the respondents were male, which is in agreement with the findings of Volman and van Eck (2001). Research studies have revealed that male teachers use more ICT in their teaching and learning processes than their female counterparts (Kay, 2006). Of the 67 lecturer respondents, 7 (10.45%) were aged between 20 and 30 years, 26 (38.80%) were from the 30 – 40 age group, 27 (40.30%) were from the 40 – 50 age group, while only 7 (10.45%) were aged above 50 years.

4.2.2 Distribution by level of study and year of study
The study sought to draw respondents from certificate, diploma, degree, masters and PhD levels of study. Respondents were randomly selected, but with a purpose of ensuring each level of study was represented. Figure 4.1 shows the distribution of respondents by level of study.

[Figure 4.1: Student respondents' distribution by level and year of study – bar chart of frequencies by year of study (First to Fourth Year) for diploma, degree and masters students]
Source: Research data (2013)

From the data presented in Figure 4.1, it can be seen that most of the respondents were degree students, with a total of 100, corresponding to 59.9% of the
student sample population; this is in agreement with the findings of Kiget (2012), who found that most e-learning users at Kenyatta University are degree students. Diploma students were represented by 34 students (20.3% of the student sample), while masters students contributed 19.8% of the sample population.

Figure 4.1 also shows that the majority of the respondents were from the first and second years of study, contributing 31.7% and 34.1% respectively. This can be attributed to the contribution made by all levels of study: there are first and second years across all levels of study, as compared to third and fourth years, which are contributed only by degree students.

Figure 4.2 shows that the majority of lecturers/content developers are masters holders, with a leading frequency of 42 (62.68% of the total respondents), with degree and diploma holders having frequencies of 19 (28.36%) and 6 (8.96%) respectively.

[Figure 4.2: Age group and level of study for lecturers/content developers – bar chart of frequencies by age group (20 – 30 to above 50) for diploma, degree and masters holders]
Source: Research data (2013)
4.2.3 Distribution by mode of study
This section presents the frequencies of respondents based on their respective modes of study. Web-based e-learning system users in the target university are spread across full-time, part-time and distance-learning modes of study. The pie chart in Figure 4.3 shows the distribution of the respondents based on their mode of study.

[Figure 4.3: Distribution by mode of study – pie chart: Full Time 98 (59%), Part Time 34 (20%), Distance Learning 35 (21%)]
Source: Research data (2013)

The findings in Figure 4.3 show that most of the respondents were full-time students, making up 58.7% of the total respondents. Part-time and distance-learning students were represented by 34 (20.4%) and 35 (20.9%) students respectively. These findings concur with those of Aminul et al. (2011), who found that more than 78.8% of the students who use e-learning systems are full-time students; normally, full-time students in universities outnumber part-time and distance-learning students.

4.2.4 Distribution by period of Internet and e-learning system usage
The study also sought to find out the period for which the respondents had used the Internet and web-based e-learning systems, in order to establish whether there is any relationship between these two factors and the functionality and user satisfaction of web-based e-learning systems. Table 4.3 shows the distribution of the respondents in terms of these factors.

The findings in Table 4.3 show that most of the student respondents had used the Internet for a period of 4 – 6 years, with a leading percentage of 29.3% (49), followed by 42
(25.1%) who had used the Internet for 2 – 4 years. A total of 40 (24.0%) respondents had used the Internet for more than 6 years, while only 36 (21.6%) of the respondents had used the Internet for a short period of below 2 years.

Table 4.3: Duration of Internet and e-learning system usage
                             Duration of using web-based e-learning systems (years)
Duration of Internet usage   0 – 1   1 – 2   2 – 3   Above 3   Total   %
Below 2                      15      4       13      4         36      21.6
2 – 4                        12      17      11      2         42      25.1
4 – 6                        10      10      11      18        49      29.3
Above 6                      0       8       11      21        40      24.0
Total                        37      39      46      45        167     100
Percentage (%)               22.2    23.4    27.5    26.9      100

Source: Research data (2013)

As seen in Table 4.3, the number of student respondents who had used the web-based e-learning system for more than 3 years was 45, corresponding to 26.9% of the respondents. A total of 46 (27.5%) respondents had used the web-based e-learning system for 2 – 3 years, 39 (23.4%) had used the system for 1 – 2 years and 37 (22.2%) had used the web-based e-learning system for a period not exceeding one year. A direct relationship between the duration of Internet usage and the duration of using the web-based e-learning system was observed.
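Cross-tabulations such as Table 4.3 can be produced directly from the raw questionnaire records. A minimal sketch, where the category pairs are hypothetical stand-ins for the real responses:

```python
from collections import Counter

# Hypothetical (Internet-usage, e-learning-usage) category pairs, one per
# respondent, mirroring the row and column labels of Table 4.3
responses = [
    ("Below 2", "0 - 1"), ("Below 2", "2 - 3"), ("2 - 4", "1 - 2"),
    ("4 - 6", "Above 3"), ("4 - 6", "Above 3"), ("Above 6", "Above 3"),
]

cells = Counter(responses)                    # cross-tab cell counts
row_totals = Counter(r for r, _ in responses) # marginal totals per row
n = len(responses)

for (row, col), count in sorted(cells.items()):
    share = 100 * count / n
    print(f"{row:>8} x {col:<8}: {count} ({share:.1f}% of sample)")
```

The same counts, divided by n, give the percentage columns reported in the tables of this chapter.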
Table 4.4 shows the distribution of lecturer/content developer respondents in respect of web-based e-learning system usage and Internet usage. The findings show that the majority of lecturers and content developers had used the web-based e-learning system for a period of 2 – 6 years, with the 2 – 4 and 4 – 6 year periods each having a frequency of 25 (37.31%). Only 5 (7.46%) had used web-based e-learning systems for a shorter period of less than 2 years, while 12 (17.91%) had used web-based e-learning systems for more than 6 years.

The majority of lecturers and content developers had used the Internet for 2 – 4 years, with a frequency of 28 (41.79%). A moderate number, 19 (28.36%), had used the Internet for 4 – 6 years, while those who had used the Internet for less than 2 years were only 7 (10.45%), and 13 (19.40%) had used the Internet for more than 6 years.

Table 4.4: Lecturers' Internet and web-based e-learning system usage
                             Duration of using web-based e-learning systems (years)
Duration of using Internet   0 – 2   2 – 4   4 – 6   Above 6   Total   %
0 – 2                        2       0       0       5         7       10.45
2 – 4                        3       12      13      0         28      41.79
4 – 6                        0       4       8       7         19      28.36
Above 6                      0       9       4       0         13      19.40
Total                        5       25      25      12        67      100
Percentage (%)               7.46    37.31   37.31   17.91     100
Source: Research data (2013)

It was also found that the majority of respondents were comfortable using the Internet, with 96% of them agreeing that they did not need any assistance to use it. Table 4.5 shows the extent to which respondents use the Internet to perform various tasks.

4.2.5 Distribution by learning styles of students
Students have varied learning styles. A learning style is the student's preferred method and process of acquiring, processing, storing and remembering knowledge. Table 4.5 shows the distribution of respondents based on their learning styles. The findings presented show that the majority of respondents prefer learning by experimenting with new ideas and examples, with a leading frequency of 72 (43.1%). The findings in Table 4.5 also show that the majority of degree students prefer this learning style. A total of 51 (30.5%) of the respondents prefer working with others in a group to get assignments done, 29 (17.4%) prefer learning by listening and working alone, while only 15 (9.0%) prefer reading and exploring models. These findings agree with the study of Richmond and Cummings (2005), who found that the majority of online e-learning system users prefer experimenting with what they learn, solving examples and working in groups, with a moderate number preferring to learn by listening.
Table 4.5: Learning styles of students
                                       Level of study
User learning style                    Diploma   Degree   Masters   Total   %
Reading and exploring models           3         9        3         15      9.0
Working with others in a group         17        15       19        51      30.5
Experimenting new ideas and examples   10        59       3         72      43.1
Listening and working alone            4         17       8         29      17.4
Total                                  34        100      33        167     100
Percentage (%)                         20.3      59.9     19.8      100
Source: Research data (2013)

Most of the lecturers and content developers were of the opinion that the web-based e-learning system in use does not support all the learning styles of the students: 40 (59.70%) felt the system does not address all the students' learning styles, while only 30 (40.30%) were of the contrary opinion.
4.2.6 Web-based e-learning systems used by respondents
There are a number of web-based e-learning systems on the market, among them MOODLE, WebCT, Sakai, ATutor, Wiki and Blackboard. The study sought to find out which web-based e-learning systems are used by students. Figure 4.4 shows the distribution of respondents by the e-learning systems they use.

The findings in Figure 4.4 show that 119 (71.2%) of the respondents use the Moodle web-based e-learning system and 28 (16.8%) use Wiki. WebCT and Blackboard were the least used, with frequencies of 15 (9.0%) and 5 (3.0%) respectively. A total of 49 lecturers/content developers had used the Moodle web-based e-learning system, while 13 had used Wiki, only 3 had used WebCT and only 2 had used Blackboard. From Figure 4.4 it can be observed that the majority of respondents had used Moodle.

[Figure 4.4: Web-based e-learning systems used – bar chart of the number of students and lecturers using Wiki, Moodle, WebCT and Blackboard]
Source: Research data (2013)
4.3 Qualitative Analysis
This section presents the qualitative data and its analysis. The functionality attributes (interoperability, response, accurateness, suitability, interactivity, security, perceived usefulness, perceived ease of use, compliance to standards and system architecture) and the factors that affect the functionality and user satisfaction of web-based e-learning systems are presented and analysed.
4.3.1 Analysis of existing e-learning system architectures
This section presents a qualitative analysis of the existing web-based e-learning system architectures. It addresses the first objective of the study, which was to analyse the existing web-based e-learning system architectures.

Previous research has pointed out various weaknesses of the current web-based e-learning system architectures. These systems have major drawbacks because of their limitations in scalability, availability, distribution of computing power and storage, as well as in sharing information between users (Dougiamas & Retalis, 2012). As shown in Table 4.6, most of the respondents are of the opinion that the web-based e-learning system is organised in a client-server architecture.

Table 4.6: Architecture of web-based e-learning system
                                            Frequency
System architecture attribute               Strongly      Disagree      Neutral      Agree         Strongly
                                            Disagree                                               Agree
System is organised in a client-server      19 (11.40%)   36 (21.60%)   11 (6.60%)   75 (44.90%)   26 (15.60%)
architecture
I can access the web-based e-learning       37 (22.20%)   67 (40.10%)   7 (4.20%)    35 (21.10%)   21 (12.60%)
system from anywhere
Course content is consistent regardless     40 (24.00%)   61 (36.50%)   11 (6.60%)   33 (19.80%)   22 (13.20%)
of the equipment I use
Content in the system changes to my         39 (23.40%)   51 (30.50%)   11 (6.60%)   42 (25.10%)   24 (13.40%)
preferences and the capability of my
device
The architecture of the web-based           20 (12.00%)   28 (16.80%)   10 (6.00%)   81 (48.50%)   28 (16.80%)
e-learning system needs to be improved
Source: Research data (2013)

The findings in Table 4.6 show a total of 101 (60.50%) of the student respondents agreeing that the system is organised in a client-server architecture, while 55 (33.00%) are of the contrary opinion and only 11 (6.60%) are undecided. This agrees with Mohammed and Hussein (2010), who found that the majority of e-learning systems are implemented either with a client-server architecture or as centralised server-based systems.
Only 56 (33.70%) of the students agreed that they can access the web-based e-learning system from anywhere, while 104 (62.30%) said that they could not access the system from anywhere and only 7 (4.20%) were undecided. This concurs with the findings of Palanivel and Kuppuswami (2011) that most e-learning systems are not geographically distributed but instead rely on a centralised server, which may not be accessible from other networks. Users prefer that the system be designed in such a way that it is accessible from anywhere, at any time.

The respondents who agreed that the content provided by the web-based e-learning system changes to their preferences and the capability of the device they use were 66 (38.50%), while 90 (53.90%) were of the opinion that the content presented does not change to the preferences of the user and the capabilities of the device used, and 11 (6.60%) were undecided. Very few respondents agreed that the course content is consistent regardless of the equipment they use to access the system: only 55 (33.00%) agreed with that assertion, 101 (60.50%) rejected it and only 11 (6.60%) were undecided. Up to 109 (65.30%) felt that the architecture of the web-based e-learning system needs to be improved, with 48 (28.80%) being comfortable with the current architecture and only 10 (6.00%) remaining undecided. The same pattern appeared in the lecturers/content developers' sample, with 39 (58.20%) of the lecturers saying that the web-based e-learning system does not support all the learning styles of students.
Most of them suggested that the system should be organized so as to cater for all types of students with varied computer skills, learning styles and preferences, and speeds of learning, with 43 (64.20%) of lecturers and 113 (67.70%) of students proposing that the system should be designed such that it adapts to user preferences and capabilities and allows users to be in control of their own learning process. The study also found that the majority of the respondents use either laptops or desktop computers to access the web-based e-learning system, with 84 (50.30%) and 79 (47.30%) of the students saying they use desktop computers and laptops respectively to access the system. Only 4 (2.40%) said they use tablets to access the system, and none of the students accessed the system with a phone. Most lecturers, 31 (46.30%) and 29 (43.30%), use desktop computers and laptops respectively to access the web-based e-learning system; only 7 (10.40%) use tablets, with none using a phone. Most of the respondents proposed that the web-based e-learning system should be designed such that it can be accessed from anywhere with any device of the user's choice. The study also found that 133 (79.50%) of the students' sample and 55 (81.20%) of the lecturers' sample said they only access the web-based e-learning system while connected to the Internet. This makes the web-based e-learning system less accessible, especially when there is no Internet connection. The web-based e-learning system could be designed such that it can be accessed within the Local Area Network (LAN) without necessarily going over the Internet; this would improve the accessibility of the system. The system should also be flexible, interoperable, extensible and adaptable, as asserted by Kashfi and Razzazi (2006), so as to meet the varied needs of users and be accessible by any device, including phones and tablets, from anywhere. This would improve ubiquitous (mobile) learning, allowing students and lecturers to access the system even while in transit, and allow students to customize their learning environment based on pedagogical and personal choices (Palanivel & Kuppuswami, 2011). These findings are in line with those of Kashfi and Razzazi (2006), who found that most web-based e-learning systems are not scalable, are less available, do not interoperate with other systems, and are mostly static (do not adapt to user needs and preferences).

4.3.2 Functionality of web-based e-learning systems
Functionality is the capability of the software to provide functions which meet the stated and implied needs of users under specified conditions of usage (ISO/IEC 9126, 2001).
Functionality attributes that affect an e-learning system's functionality and user satisfaction are suitability, accurateness, interoperability, compliance with standards, and security (ISO/IEC 9126, 2001). This section analyses the functionality-attribute data and discusses the results in order to address the second objective of the study,
which was to evaluate the functionality of web-based e-learning systems in Kenyan universities.

4.3.2.1 System interactivity
System interactivity is the capability of the system to communicate interactively with the user. Table 4.7 shows the respondents' opinions on the interactivity of the web-based e-learning system used.

Table 4.7: System interactivity factors

Interactivity factor | Students: Disagree | Students: Neutral | Students: Agree | Lecturers: Disagree | Lecturers: Neutral | Lecturers: Agree
Support student-lecturer interactive communication | 81 (48.5%) | 19 (33.8%) | 53 (31.8%) | 34 (50.8%) | 9 (13.4%) | 24 (35.5%)
Support student-student and lecturer-lecturer interactive communication | 84 (50.3%) | 12 (7.2%) | 71 (42.5%) | 34 (50.8%) | 4 (6.0%) | 29 (33.2%)
Source: Research data (2013)

The results in Table 4.7 show that 81 (48.5%) students felt that the web-based e-learning system does not support student-lecturer interactive communication, 53 (31.8%) felt that it does, while 19 (33.8%) students were undecided. A total of 84 (50.3%) felt that the system does not support student-student interactive communication, 71 (42.5%) agreed that it does, while only 12 (7.2%) remained undecided. Among the lecturers, 34 (50.8%) disagreed that the system supports student-lecturer interactive communication, 24 (35.5%) agreed and only 9 (13.4%) were undecided. The results also show that 34 (50.8%) of the lecturers disagreed with the assertion that the system supports lecturer-lecturer interactive communication, 29 (33.2%) agreed with this assertion, while only 4 (6.0%) remained neutral. With the majority of respondents disagreeing that the system supports interactive communication, it appears either that they are not aware of the interactive communication features in the system or that these features do not meet their expectations.
Pearson's correlation analysis was done to find the relationship between demographics and system interactivity. Table 4.8 shows the inferential statistics for the system interactivity factors: Pearson's correlation coefficients (r), with the respective significance (p) shown in brackets.
There was a significant relationship between systems’
student-lecturer interactivity and gender, r(165) = 0.293, p = 0.001; year of study, r(165) = 0.219, p = 0.002; duration of system use, r(165) = 0.145, p = 0.031; and learning style, r(165) = 0.328, p = 0.001. There was a significant negative relationship with level of study, r(165) = -0.144, p = 0.032, and a very minimal relationship with age and mode of study. A significant relationship was observed between the system's support of student-student interactive communication and gender, r(165) = 0.471, p = 0.001; year of study, r(165) = 0.239, p = 0.001; and learning style, r(165) = 0.276, p = 0.002.
Table 4.8: Students' opinion on system interactivity

System interactivity factors | Age | Gender | Level of study | Year of study | Duration of system use | Learning style | Mode of study
System enables student-lecturer interactive communication | 0.037 (0.320) | 0.293** (0.001) | -0.144* (0.032) | 0.219 (0.002) | 0.145 (0.031) | 0.328** (0.001) | 0.092 (0.119)
System enables student-student interactive communication | 0.108 (0.083) | 0.471** (0.001) | -0.202** (0.004) | 0.239** (0.001) | 0.119 (0.063) | 0.276** (0.002) | 0.058 (0.228)
* Correlation is significant at 0.05 (1-tailed); ** Correlation is significant at 0.01 (1-tailed)
Source: Research data (2013)

There was a moderate relationship with age, r(165) = 0.108, p = 0.083, and duration of system use, r(165) = 0.119, p = 0.063. Level of study had a significant negative relationship with the system's student-student interaction, r(165) = -0.202, p = 0.004, but a minimal non-significant relationship with mode of study, r(165) = 0.058, p = 0.228.
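The coefficients reported throughout this chapter are standard Pearson product-moment correlations, with one-tailed p-values obtained from Student's t-distribution with n - 2 degrees of freedom. As an illustrative sketch only (the data below are hypothetical, not the study's survey responses), the computation can be reproduced as:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient for paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def t_statistic(r, n):
    """t value from which a one-tailed p is read off Student's t with n - 2 df."""
    return r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)

# Hypothetical paired observations, e.g. a coded demographic vs. a Likert response
x = [1, 2, 2, 3, 4, 4, 5]
y = [2, 2, 3, 3, 5, 4, 5]
r = pearson_r(x, y)          # r is approximately 0.926 for these values
t = t_statistic(r, len(x))   # compared against Student's t with 5 df for p
```

In the tables, r(165) denotes a correlation over the student sample (df = n - 2 = 165) and r(65) over the lecturer sample; a statistical package reports the equivalent r and p directly.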
These findings show that more males than females felt that the system supports student-lecturer interactive communication; students at higher levels of learning were more comfortable with system interactivity than students at lower levels; students in a higher year of study had a more positive opinion of system interactivity; and the more one uses the system, the more one finds it interactive, while mode of study did not affect opinions on system interactivity. Table 4.9 shows Pearson's correlation coefficients (r) and the respective significance (p), in brackets, for the lecturers' opinions on system interactivity. System enables student-lecturer interactive communication had a significant relationship with gender, r(65) = 0.374, p = 0.001, and duration of system use, r(65) = 0.324, p = 0.004. There was a significant negative relationship between the system's support of student-lecturer interaction and age, r(65) = -0.272, p = 0.013, and a non-significant negative relationship with level of study, r(65) = -0.091, p = 0.232.
Table 4.9: Lecturers' opinion on system interactivity factors

System interactivity factors | Age | Gender | Level of study | Duration of system use
System enables student-lecturer interactive communication | -0.272* (0.013) | 0.374** (0.001) | -0.091 (0.232) | 0.324** (0.004)
System enables lecturer-lecturer interactive communication | -0.238* (0.026) | 0.272* (0.013) | 0.040 (0.373) | 0.071 (0.285)
System addresses all the learning styles of students | 0.029 (0.407) | 0.473** (0.001) | -0.141 (0.128) | -0.446** (0.001)
* Correlation is significant at 0.05 (1-tailed); ** Correlation is significant at 0.01 (1-tailed)
Source: Research data (2013)

System enables lecturer-lecturer interactive communication had a significant relationship with gender, r(65) = 0.272, p = 0.013, and a significant negative relationship with age, r(65) = -0.238, p = 0.026, while very minimal non-significant relationships were observed with level of study, r(65) = 0.040, p = 0.373, and duration of system use, r(65) = 0.071, p = 0.285. System addresses all the learning styles of
students had a significant relationship with gender, r(65) = 0.473, p = 0.001, and a significant negative relationship with duration of system use, r(65) = -0.446, p = 0.001. The findings show that older lecturers felt that the system is not interactive, and male lecturers found the system more interactive than their female counterparts did. Nevertheless, older lecturers felt that the system supports all learning styles of students, while lecturers with higher levels of study thought otherwise. The more lecturers use the system, the more they find that it does not support all learning styles of students.

4.3.2.2 System response
System response is the speed at which the system responds to user requests. The results of the respondents' opinions on the response of the system are shown in Table 4.10.

Table 4.10: System response

Response factor | Students: Disagree | Students: Neutral | Students: Agree | Lecturers: Disagree | Lecturers: Neutral | Lecturers: Agree
System response while using is fast | 85 (50.9%) | 20 (12.0%) | 62 (37.1%) | 31 (46.3%) | 13 (19.4%) | 23 (34.3%)
System response is consistent | 89 (53.3%) | 20 (12.0%) | 58 (34.7%) | 35 (52.2%) | 6 (9.0%) | 26 (38.8%)
Source: Research data (2013)

Results presented in Table 4.10 show that 85 (50.9%) of the students felt that the response of the system is not fast, 62 (37.1%) agreed that the response is fast, while 20 (12.0%) were undecided. Among the lecturers, 31 (46.3%) rated the system response as slow, 23 (34.3%) said the response is fast, while 13 (19.4%) were neutral. The findings show that 89 (53.3%) students rejected the assertion that the system response is consistent while in use, 58 (34.7%) agreed with this assertion, while 20 (12.0%) were undecided about consistency of the system response. In the lecturers' sample, 35 (52.2%) said the system response is not consistent, 26 (38.8%) agreed that the response is consistent, with only 6 (9.0%) being undecided. These findings suggest that most respondents feel that the web-based e-learning system is slow and inconsistent; this could be attributed to poor ICT infrastructure and low network access speeds, as Kiilu (2012) found that most learning institutions in Kenya have poorly designed ICT infrastructure and slow, inconsistent network/Internet access. Pearson's correlation analysis was done to find the relationship between demographics and system response. Table 4.11 shows the inferential statistics for the system response factors: Pearson's correlation coefficients (r), with the respective significance (p) shown in brackets. Findings in Table 4.11 show that there was a significant negative relationship between the speed of the system and level of study, r(165) = -0.181, p = 0.010, while the rest of the demographics had non-significant relationships with students' opinions on the speed of the system.
Table 4.11: Students' opinion on system response factors

System response factors | Age | Gender | Level of study | Year of study | Duration of system use | Learning style | Mode of study
System response while using is fast | -0.038 (0.311) | -0.002 (0.489) | -0.181** (0.010) | 0.012 (0.440) | -0.070 (0.184) | -0.080 (0.154) | 0.024 (0.381)
System response is consistent | 0.024 (0.379) | -0.122 (0.059) | -0.140* (0.036) | -0.032 (0.343) | -0.052 (0.253) | -0.105 (0.089) | 0.012 (0.438)
* Correlation is significant at 0.05 (1-tailed); ** Correlation is significant at 0.01 (1-tailed)
Source: Research data (2013)

System response is consistent had moderate negative relationships with gender and level of study, r(165) = -0.122, p = 0.059 and r(165) = -0.140, p = 0.036 respectively, while the rest of the demographics had minimal non-significant relationships with system response consistency.
These findings show that students at lower levels of study felt that the system is fast in use, with more females than males feeling that the system response is consistent. Students at higher levels of study, and students who prefer working in groups and experimenting with new ideas, felt that the system response is not consistent in use. Table 4.12 shows Pearson's correlation coefficients (r) and the respective significance (p), in brackets, for the lecturers' opinions on system response. Findings in Table 4.12 show that system response while using is fast had significant negative relationships with age, r(65) = -0.217, p = 0.039, and gender, r(65) = -0.214, p = 0.041, while it had minimal non-significant relationships with level of study and duration of system use. System response is consistent had significant negative relationships with age, r(65) = -0.325, p = 0.004, and level of study, r(65) = -0.248, p = 0.022, but had moderate negative non-significant relationships with gender, r(65) = -0.136, p = 0.136, and duration of system use, r(65) = -0.130, p = 0.147.
Table 4.12: Lecturers' opinion on system response factors

System response factors | Age | Gender | Level of study | Duration of system use
System response while using is fast | -0.217* (0.039) | -0.214* (0.041) | -0.100 (0.211) | 0.086 (0.244)
System response is consistent | -0.325** (0.004) | -0.136 (0.136) | -0.248* (0.022) | -0.130 (0.147)
* Correlation is significant at 0.05 (1-tailed); ** Correlation is significant at 0.01 (1-tailed)
Source: Research data (2013)

The findings show that lecturers' opinion of system response became less favourable with age, with more males feeling that the system response is not fast. Lecturers with higher levels of learning felt that the system response is not consistent.
4.3.2.3 System accurateness
Accurateness is the extent to which the available web-based e-learning system provides expected results or effects for specified tasks and user objectives. Figure 4.5 shows the respondents' opinions on the accurateness of the web-based e-learning system used. The findings presented in Figure 4.5 show that 117 (70.0%) of the student respondents disagreed with the assertion that the system provides accurate and effective drawing tools, with only 31 (18.6%) agreeing with this assertion and 19 (11.4%) remaining undecided. Respondents seemed to be comfortable with the testing mechanisms in the system, with 110 (65.8%) agreeing that the system provides accurate and effective mechanisms for online testing, and 38 (22.8%) and 19 (11.4%) disagreeing and being undecided respectively.

Figure 4.5: System accurateness (bar chart of student response frequencies, Strongly Disagree to Strongly Agree, for three statements: system provides mechanisms for announcements; system provides accurate and effective drawing tools; system provides accurate and effective online testing)
Source: Research data (2013)

The majority of respondents were contented with the announcement mechanisms in the system, with 125 (80.8%) agreeing that the system provides effective mechanisms for making announcements, and only 17 (10.2%) and 15 (9.0%) disagreeing and being undecided respectively. Most of the lecturers agreed that the system does not offer appropriate tools for creating and updating course content, with 48 (71.6%) of them agreeing and only 13 (19.4%) and 6 (9.0%) disagreeing and being undecided respectively. Pearson's correlation analysis was done to find the relationship between demographics and system accurateness. Table 4.13 shows the inferential statistics for the system accurateness factors: Pearson's correlation coefficients (r), with the respective significance (p) shown in brackets.
Table 4.13: Students' opinion on system accurateness factors

System accurateness factors | Age | Gender | Level of study | Year of study | Duration of system use | Learning style | Mode of study
System provides mechanisms for making announcements | 0.017 (0.411) | -0.004 (0.478) | -0.107 (0.084) | 0.104 (0.091) | -0.015 (0.425) | 0.037 (0.317) | -0.033 (0.334)
System provides effective drawing tools | -0.118 (0.064) | -0.107 (0.084) | 0.075 (0.169) | -0.020 (0.400) | -0.005 (0.472) | 0.008 (0.457) | 0.057 (0.232)
System provides tools for effective online testing | 0.088 (0.128) | -0.165* (0.017) | 0.207** (0.004) | 0.142* (0.033) | 0.059 (0.223) | -0.027 (0.364) | 0.101 (0.098)
* Correlation is significant at 0.05 (1-tailed); ** Correlation is significant at 0.01 (1-tailed)
Source: Research data (2013)

Findings in Table 4.13 show that system provides mechanisms for making announcements had a moderate negative relationship with level of study, r(165) = -0.107, p = 0.084, and a moderate positive relationship with year of study, r(165) = 0.104, p = 0.091, but very minimal non-significant relationships with the rest of the demographic factors. System provides effective drawing tools had moderate negative relationships with age, r(165) = -0.118, p = 0.064, and gender, r(165) = -0.107, p = 0.084, but very minimal non-significant relationships with the rest of the demographics. The findings show that as students move up the years of study, the announcement mechanisms in the system become more favourable, but diploma students seem to be more contented with the system's announcement mechanisms than degree and masters students. Older and female students do not seem pleased with the system's drawing tools, while more males than females felt that the system provides effective online testing tools. Satisfaction with the system's testing tools seems to improve with level of study and year of study, but students who prefer learning in groups, and those who prefer experimenting with new ideas, seem less comfortable with the system's testing tools than those who prefer reading and working individually. Table 4.14 shows Pearson's correlation coefficients (r) and the respective significance (p), in brackets, for the lecturers' opinions on system accurateness.
Table 4.14: Lecturers' opinion on system accurateness factors

System accurateness factors | Age | Gender | Level of study | Duration of system use
System provides effective drawing tools | 0.134 (0.139) | 0.134 (0.141) | -0.245* (0.023) | -0.335** (0.003)
System provides tools for effective online testing | -0.101 (0.207) | 0.262* (0.016) | -0.249* (0.021) | 0.245* (0.023)
System provides accurate course statistics | 0.102 (0.207) | 0.484** (0.001) | -0.304** (0.006) | 0.292** (0.008)
* Correlation is significant at 0.05 (1-tailed); ** Correlation is significant at 0.01 (1-tailed)
Source: Research data (2013)

Findings in Table 4.14 show that system provides effective drawing tools had significant negative relationships with duration of system use, r(65) = -0.335, p = 0.003, and level of study, r(65) = -0.245, p = 0.023, but moderate non-significant relationships with age and gender. System provides tools for effective online testing had significant relationships with gender, r(65) = 0.262, p = 0.016, and duration of system use, r(65) = 0.245, p = 0.023, a significant negative relationship with level of study, r(65) = -0.249, p = 0.021, and a moderate negative non-significant relationship with age, r(65) = -0.101, p = 0.207. System provides accurate course statistics had significant relationships with gender, r(65) = 0.484, p = 0.001, and duration of system use, r(65) = 0.292, p = 0.008, a significant negative relationship with level of study, r(65) = -0.304, p = 0.006, and a moderate non-significant relationship with age, r(65) = 0.102, p = 0.207. The findings show that as level of study rises and as lecturers use the system for longer periods, they feel that the system does not provide effective drawing tools. More males than females feel that the system provides tools for effective online testing, and the same pattern is seen with duration of system use. This may be attributed to the fact that the more one uses a system, the more one discovers its services and functionality.

4.3.2.4 System suitability
System suitability is the extent to which the web-based e-learning system provides an appropriate set of functions for the required pedagogic tasks and user objectives. Figure 4.6 shows the respondents' opinions on the suitability of the web-based e-learning system used.
Figure 4.6 shows that 42 (62.7%) of the lecturers disagreed that the system provides appropriate tools for publishing all types of content, with 11 (16.4%) strongly disagreeing and 31 (46.3%) disagreeing. Only 18 (26.9%) lecturers agreed that the system offers appropriate tools for publishing all types of content, with 7 (10.4%) being undecided. The findings also show that 38 (56.7%) of the lecturers disagreed that the system provides appropriate means for the management of student records, 25 (37.3%) agreed with the assertion, and only 4 (6.0%) were undecided.
It was also found that 44 (65.7%) of the lecturers felt that the system does not provide appropriate means for lecturer collaboration, 19 (28.4%) felt the system supports lecturer collaboration, and 4 (6.0%) were undecided. The findings show that 41 (61.2%) of the lecturers disagreed with the assertion that the system provides appropriate means for organizing students into groups for projects or assignments, 23 (34.4%) agreed, and only 3 (4.5%) were undecided. The results also show that 47 (70.1%) of the lecturers disagreed with the assertion that the system provides means for assigning lecturers to courses, 17 (25.4%) agreed with this assertion, while 3 (4.5%) were undecided.

(Bar chart: frequency of lecturers' responses, Disagree/Neutral/Agree, for five suitability statements: the system provides appropriate tools for publishing all types of content; appropriate means for management of student records; appropriate means for collaboration among lecturers; appropriate means for organizing students into groups for projects and discussions; and appropriate means for assigning lecturers to courses.)
Figure 4.6: Lecturers' responses on system suitability
Source: Research data (2013)

Pearson's correlation analysis was done to find the relationship between demographics and system suitability. Table 4.15 shows the inferential statistics for the system suitability factors: Pearson's correlation coefficients (r), with the respective significance (p) shown in brackets. Findings in Table 4.15 show that system provides appropriate means for course content delivery had a moderate relationship with age, r(165) = 0.108, p = 0.083, and a significant relationship with mode of study, r(165) = 0.191, p = 0.007. This attribute also had a moderate negative relationship with learning style, r(165) = -0.104, p = 0.090, but showed minimal non-significant relationships with the other demographic factors. System provides appropriate means for organizing students into groups had a significant relationship with mode of study, r(165) = 0.162, p = 0.018, and significant negative relationships with gender, r(165) = -0.154, p = 0.023, and learning style, r(165) = -0.205, p = 0.004, while the other demographics showed minimal non-significant relationships.
Table 4.15: Students' opinion on system suitability factors

System suitability factors | Age | Gender | Level of study | Year of study | Duration of system use | Learning style | Mode of study
System provides appropriate means for course content delivery | 0.108 (0.083) | -0.054 (0.245) | 0.069 (0.186) | 0.074 (0.170) | 0.017 (0.413) | -0.104 (0.090) | 0.191** (0.007)
System provides appropriate means for organizing students into groups | 0.091 (0.120) | -0.154* (0.023) | 0.040 (0.302) | 0.087 (0.131) | 0.086 (0.136) | -0.205** (0.004) | 0.162* (0.018)
System presents different types of course material in organized and readable format | -0.039 (0.309) | -0.345** (0.001) | 0.127 (0.051) | -0.038 (0.312) | 0.033 (0.337) | -0.192** (0.006) | 0.151* (0.025)
* Correlation is significant at 0.05 (1-tailed); ** Correlation is significant at 0.01 (1-tailed)
Source: Research data (2013)

As shown in Table 4.15, system presents different types of course material in organized and readable format had significant negative relationships with gender, r(165) = -0.345, p = 0.001, and learning style, r(165) = -0.192, p = 0.006. A significant positive relationship was observed with mode of study, r(165) = 0.151, p = 0.025, and a moderate positive relationship with level of study, r(165) = 0.127, p = 0.051, while age, year of study and duration of system use recorded minimal non-significant relationships.
The findings show that older students are more likely to be satisfied with the means of content delivery, with distance-learning students more contented with the system's content delivery mechanisms than full-time and part-time students. Students who prefer learning in groups, and male students, feel that the system does not provide means for organizing students into groups, compared with females and students who prefer working alone. Female students, students at higher levels of study, and distance-learning students are more likely than their male, lower-level and full-time counterparts to agree that the system presents different types of course material in an organized and readable format. Table 4.16 shows Pearson's correlation coefficients (r) and the respective significance (p), in brackets, for the lecturers' opinions on system suitability.
Table 4.16: Lecturers' opinion on system suitability factors

System suitability factors | Age | Gender | Level of study | Duration of system use
System provides tools for publishing all types of course content | -0.080 (0.260) | 0.245* (0.023) | 0.069 (0.290) | 0.327** (0.003)
System provides appropriate means for managing student records | -0.216* (0.040) | 0.366** (0.001) | 0.036 (0.386) | 0.359** (0.001)
System provides appropriate means for collaboration among lecturers | -0.170 (0.084) | -0.467** (0.001) | -0.249* (0.021) | -0.376** (0.001)
System provides appropriate means for organizing students into groups | -0.356** (0.002) | -0.417** (0.001) | -0.247* (0.022) | -0.247* (0.022)
System provides appropriate means for assigning lecturers to courses | -0.323** (0.004) | 0.012 (0.460) | -0.051 (0.341) | -0.151 (0.100)
* Correlation is significant at 0.05 (1-tailed); ** Correlation is significant at 0.01 (1-tailed)
Source: Research data (2013)

System provides tools for publishing all types of course content had significant relationships with gender, r(65) = 0.245, p = 0.023, and duration of system use, r(65) = 0.327, p = 0.003, while age and level of study recorded minimal non-significant relationships. System provides appropriate means for managing student records had significant relationships with duration of system use, r(65) = 0.359, p = 0.001, and gender, r(65) = 0.366, p = 0.001, and a significant negative relationship with age, r(65) = -0.216, p = 0.040, while level of study recorded a very minimal non-significant relationship. System provides appropriate means for organizing students into groups had significant negative relationships with age, r(65) = -0.356, p = 0.002, gender, r(65) = -0.417, p = 0.001, duration of system use, r(65) = -0.247, p = 0.022, and level of study, r(65) = -0.247, p = 0.022. System provides appropriate means for assigning lecturers to courses had a significant negative relationship with age, r(65) = -0.323, p = 0.004, a moderate negative relationship with duration of system use, r(65) = -0.151, p = 0.100, and minimal non-significant relationships with gender and level of study. The findings show that male lecturers feel, more than female lecturers, that the system provides tools for publishing all types of content, and that the longer one uses the system the more likely one is to agree with this assertion. Males, and those who have used the system longer, agree more than females and shorter-term users that the system provides appropriate means for managing student records. Lecturers disagree with the assertions that the system provides appropriate means for collaboration among lecturers and for organizing students into groups across all the demographics presented in Table 4.16. As age and duration of system use increase, lecturers increasingly disagree that the system provides appropriate means for assigning lecturers to courses.
4.3.2.5 System interoperability
Interoperability is the capability of the available web-based e-learning system to seamlessly interconnect with other e-learning systems and with other management systems, such as student management, finance management and examination management systems, and to exchange or share information effectively. Figure 4.7 shows the students' responses to the interoperability factors of the web-based e-learning system.
Findings in Figure 4.7 show that 117 (70.1%) of the students said that they cannot access and provide content to digital libraries and other e-learning systems, 4 (2.4%) were undecided, and 46 (27.6%) agreed that they can.

(Bar chart: frequency of students' responses, Strongly Disagree to Strongly Agree, for three interoperability statements: I can access content from, and provide content to, digital libraries and other e-learning systems; I can export data from the web-based e-learning system to other systems, e.g. to Excel; I can import data from other systems to the web-based e-learning system, e.g. from Excel.)
Figure 4.7: System interoperability
Source: Research data (2013)

Results show that 85 (50.9%) of the students could not import data from other systems to the web-based e-learning system, 75 (44.9%) agreed that they can, while only 7 (4.2%) were undecided. On exporting data from the web-based e-learning system to other systems such as Microsoft Excel, 127 (76.0%) disagreed, 29 (17.4%) agreed and 11 (6.6%) were undecided. In the lecturers' sample, 36 (53.7%) disagreed that they can access and provide data to digital libraries, while the remaining 31 (46.3%) agreed. The same results were found for the capability to import data from other systems to the web-based e-learning system. Results showed that 36 (53.7%) lecturers could not export information from the web-based e-learning system to other systems, while 28 (41.8%) could and 3 (4.5%) were undecided.
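Import/export interoperability of the kind respondents were asked about is commonly provided through a neutral exchange format such as CSV, which Excel and most management systems can read. A minimal sketch only (the field names and records here are hypothetical, not taken from the system studied):

```python
import csv
import io

def export_grades_csv(records, stream):
    """Write grade records as CSV so that Excel or another management
    system can import them -- a simple interoperability bridge."""
    writer = csv.DictWriter(stream, fieldnames=["student_id", "course", "score"])
    writer.writeheader()
    writer.writerows(records)

# Hypothetical records; a real system would pull these from its database.
records = [
    {"student_id": "SIT001", "course": "E-Learning 101", "score": 78},
    {"student_id": "SIT002", "course": "E-Learning 101", "score": 64},
]
buf = io.StringIO()
export_grades_csv(records, buf)
```

The same format works in the opposite direction (import), which is why CSV support in both directions is a common baseline for the interoperability the respondents found lacking.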
Pearson's correlation analysis was done to find the relationship between demographics and system interoperability. Table 4.17 shows the inferential statistics for the system interoperability factors: Pearson's correlation coefficients (r), with the respective significance (p) shown in brackets.
Table 4.17: Students' opinion on system interoperability factors

System interoperability factors | Age | Gender | Level of study | Year of study | Duration of system use | Learning style | Mode of study
I can access content from and provide content to digital libraries and other systems | 0.125 (0.054) | -0.265** (0.001) | -0.018 (0.410) | -0.067 (0.196) | -0.046 (0.277) | -0.181** (0.010) | 0.147* (0.029)
I can export data to other e-learning systems | -0.066 (0.199) | -0.248** (0.001) | 0.062 (0.212) | -0.045 (0.283) | -0.004 (0.479) | -0.155* (0.023) | -0.180** (0.010)
I can import data from other e-learning systems | -0.112 (0.075) | -0.277** (0.001) | -0.021 (0.393) | -0.061 (0.216) | -0.042 (0.296) | 0.010 (0.449) | -0.259** (0.001)
* Correlation is significant at 0.05 (1-tailed); ** Correlation is significant at 0.01 (1-tailed)
Source: Research data (2013)

As shown in Table 4.17, I can access content from and provide content to digital libraries and other systems had a significant relationship with mode of study, r(165) = 0.147, p = 0.029, and a moderate relationship with age, r(165) = 0.125, p = 0.054, but significant negative relationships with gender, r(165) = -0.265, p = 0.001, and learning style, r(165) = -0.181, p = 0.010, while level of study, year of study and duration of system use had minimal non-significant relationships. I can export data to other e-learning systems had significant negative relationships with gender, r(165) = -0.248, p = 0.001, learning style, r(165) = -0.155, p = 0.023, and mode of study, r(165) = -0.180, p = 0.010, while it had minimal non-significant relationships with the other demographics. I can import data from other systems had a moderate negative relationship with age, r(165) = -0.112, p = 0.075, and significant negative relationships with gender, r(165) = -0.277, p = 0.001, and mode of study, r(165) = -0.259, p = 0.001, while it had minimal non-significant relationships with level of study, year of study, duration of system use and learning style. The findings show that male and older students are likely to disagree that they can access content from and provide content to digital libraries and other systems, import data from other systems into the e-learning system, and export data from the e-learning system to other systems. Distance-learning students were not able to import and export data from other systems. Table 4.18 shows Pearson's correlation coefficients (r) and the respective significance (p), in brackets, for the lecturers' opinions on system interoperability. I can access content from, and provide content to, digital libraries and other e-learning systems had a moderate relationship with age, r(65) = 0.170, p = 0.085, a significant relationship with gender, r(65) = 0.216, p = 0.040, and a moderate negative relationship with level of study, r(65) = -0.182, p = 0.070, with a minimal non-significant relationship with duration of system use.
Table 4.18: Lecturers' opinion on system interoperability factors

System interoperability factors | Age | Gender | Level of study | Duration of system use
I can access content from, and provide content to digital libraries and other e-learning systems | 0.170 (0.085) | 0.216* (0.040) | -0.182 (0.070) | 0.084 (0.242)
I can export data from e-learning system to other e-learning systems | 0.232* (0.030) | 0.222* (0.036) | -0.025 (0.420) | 0.226* (0.033)
I can import data from other systems to the e-learning system | 0.046 (0.356) | 0.191 (0.061) | -0.276* (0.012) | 0.104 (0.201)
Correlation is significant at 0.05 (1-tailed) * Correlation is significant at 0.01 (1-tailed) ** Source: Research data (2013)
I can export data from e-learning system to other e-learning systems had a significant relationship with age, r(65) = 0.232, p = 0.030; gender, r(65) = 0.222, p = 0.036 and duration of system use, r(65) = 0.226, p = 0.033, while it had a very minimal non-significant relationship with level of study. I can import data from other systems to the e-learning system had a significant relationship with gender, r(65) = 0.191, p = 0.061 and a significant negative relationship with level of study, r(65) = -0.276, p = 0.012, while age and duration of system use had minimal, non-significant relationships. The correlations in Table 4.18 show that an increase in age among lecturers is likely to predict satisfaction with the system's capability to exchange information with other systems, with male lecturers feeling that they can comfortably export and import information from the system. Findings show that the more educated a lecturer is, the less he or she can import and export information from the system.
4.3.2.6 System security
This is the availability of security mechanisms in the web-based e-learning system for maintaining the confidentiality of information about learners and lecturers. Table 4.19 shows the responses for security mechanisms in the web-based e-learning system.
Findings in Table 4.19 show that 152 (91.0%) of the students agreed that the system provides password protection to all courses and events, 11 (6.6%) disagreed and only 4 (2.4%) were undecided. A total of 134 (80.2%) students disagreed that the system provides a mechanism for screening students' posts to prevent distribution of undesired material, while 22 (12.8%) agreed and 11 (6.6%) were undecided. From the lecturers' responses, 52 (77.6%) agreed that the system provides password protection for courses and events, 12 (17.9%) disagreed and 3 (4.5%) were undecided. A total of 32 (47.8%) lecturers disagreed that the system encrypts content as it travels over the network, 29 (43.3%) agreed that content is encrypted as it travels over the network while 6 (9.0%) were undecided. Results show that 40 (59.7%) lecturers agreed that the system provides a secure set of privileges, 24 (35.8%) disagreed and only 3 (4.5%) were undecided. Findings show that 33 (49.3%) lecturers disagreed with the assertion that the system provides a mechanism for screening of student posts to prevent distribution of undesirable material, 28 (41.8%) agreed while 6 (9.0%) were undecided.
Table 4.19: System security mechanisms

Security factor | Students Disagree | Students Neutral | Students Agree | Lecturers Disagree | Lecturers Neutral | Lecturers Agree
System provides password protection of all resources | 11 (6.6%) | 4 (2.4%) | 152 (91.0%) | 12 (17.9%) | 3 (4.5%) | 52 (77.6%)
System uses encryption for encoding data as it travels over the network | _ | _ | _ | 32 (47.8%) | 6 (9.0%) | 29 (43.3%)
System provides a secure set of user privileges | _ | _ | _ | 24 (35.8%) | 3 (4.5%) | 40 (59.7%)
System provides mechanism for screening of student posts to prevent distribution of undesirable material | 134 (80.2%) | 11 (6.6%) | 22 (12.8%) | 33 (49.3%) | 6 (9.0%) | 28 (41.8%)
Source: Research data (2013)
Pearson's correlation analysis was done to find the relationship between demographics and system security. Table 4.20 shows the inferential statistics for system security factors. Pearson's correlation coefficients (r) are shown with the respective significance (p) in brackets. As shown in Table 4.20, system provides password protection of all resources had a significant negative relationship with age, r(165) = -0.126, p = 0.052; gender, r(165) = -0.211, p = 0.003 and learning style, r(165) = -0.129, p = 0.048, and a significant relationship with level of study, r(165) = 0.124, p = 0.055, while it had minimal non-significant relationships with duration of system use, mode of study and year of study. System provides mechanism for screening of students' posts had a significant relationship with age, r(165) = 0.130, p = 0.047 and learning style, r(165) = 0.161, p = 0.019, while very minimal non-significant relationships were observed with the other demographics.
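The r(165) notation above refers to a correlation with 165 degrees of freedom (n - 2 with 167 student respondents). The standard significance test behind the reported p values converts r to a t statistic; a minimal sketch, using one coefficient from Table 4.20 as the example:

```python
from math import sqrt

def t_from_r(r, df):
    """t statistic for testing H0: rho = 0, where df = n - 2."""
    return r * sqrt(df / (1.0 - r * r))

# Coefficient reported in Table 4.20: r(165) = -0.211 for gender
# against "system provides password protection of all resources".
t = t_from_r(-0.211, 165)
print(round(t, 2))
```

The resulting t is then compared with the Student's t distribution on df degrees of freedom (1-tailed here, matching the tables); that final lookup is what the statistics package performs when producing the p values in brackets.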
Table 4.20: Students' opinion on system security factors

System security factors | Age | Gender | Level of study | Year of study | Duration of system use | Learning style | Mode of study
System provides password protection of all resources | -0.126 (0.052) | -0.211** (0.003) | 0.124 (0.055) | -0.047 (0.273) | 0.088 (0.129) | -0.129* (0.048) | 0.093 (0.116)
System provides mechanism for screening of students' posts | 0.130* (0.047) | 0.073 (0.175) | -0.060 (0.221) | 0.097 (0.107) | -0.066 (0.197) | 0.161* (0.019) | -0.063 (0.208)
Correlation is significant at 0.05 (1-tailed) * Correlation is significant at 0.01 (1-tailed) ** Source: Research data (2013)
Findings show that students in higher levels of learning are likely to agree that the system provides password protection for all resources, while females and students in lower age groups are likely to disagree. Table 4.21 shows Pearson's correlation coefficients (r) and respective significance (p) in brackets for lecturers' opinion on system security factors. Findings in Table 4.21 show that system provides password protection for all resources had a significant negative relationship with age, r(65) = -0.190, p = 0.062; gender, r(65) = -0.189, p = 0.063 and duration of system use, r(65) = -0.303, p = 0.006, while it had a minimal non-significant relationship with level of study. System uses encryption to encode data as it travels over the network had a significant relationship with level of study, r(65) = 0.566, p = 0.001, with moderate non-significant relationships with age, gender and duration of system use. System provides a secure set of user privileges had a significant negative relationship with age, r(65) = -0.225, p = 0.034 and a significant relationship with level of study, r(65) = 0.357, p = 0.002, with minimal non-significant relationships with gender and duration of system use. System provides mechanism for screening students' posts had a significant negative relationship with age, r(65) = -0.251, p = 0.020; gender, r(65) = -0.232, p = 0.029; level of study, r(65) = -0.321, p = 0.004 and duration of system use, r(65) = -0.180, p = 0.072.
Table 4.21: Lecturers' opinion on system security factors

System security factors | Age | Gender | Level of study | Duration of system use
System provides password protection for all resources | -0.190 (0.062) | -0.189 (0.063) | -0.029 (0.408) | -0.303** (0.006)
System uses encryption to encode data as it travels over the network | 0.104 (0.200) | 0.149 (0.114) | 0.566** (0.001) | 0.125 (0.157)
System provides a secure set of user privileges | -0.225* (0.034) | -0.061 (0.313) | 0.357** (0.002) | -0.045 (0.358)
System provides mechanism for screening students' posts | -0.251* (0.020) | -0.232* (0.029) | -0.321** (0.004) | -0.180 (0.072)
Correlation is significant at 0.05 (1-tailed) * Correlation is significant at 0.01 (1-tailed) ** Source: Research data (2013)
Male lecturers and lecturers in lower age brackets are likely to agree that the system provides password protection to all resources. However, the more a lecturer uses a web-based e-learning system, the more likely he is to disagree with this assertion. Male lecturers feel that the system provides a secure set of user privileges more than their female counterparts do. Age, gender, level of study and duration of system use predict that users will disagree that the system provides mechanisms for screening of students' posts.
4.3.2.7 System perceived usefulness
This is the degree to which the web-based e-learning system is useful for the intended functions and objectives. Table 4.22 shows the responses of students on the usefulness of the web-based e-learning system they use.
Findings in Table 4.22 show that 110 (65.9%) disagreed that using the system will help them accomplish learning tasks more quickly, 47 (28.2%) agreed while 10 (6.0%) were undecided. Results show that 101 (60.5%) disagreed that using the system will improve their learning performance, 62 (37.1%) agreed and only 4 (2.4%) were neutral. A total of 90 (53.9%) respondents disagreed that using the system makes it easier to learn course content, 62 (37.1%) agreed and 10 (6.0%) were undecided. On whether the system allows learner control over his/her learning process, 94 (56.3%) disagreed, 64 (38.3%) agreed and 9 (5.4%) were undecided. The findings were discouraging on the general usefulness of the web-based e-learning system, with 95 (56.9%) feeling that the system is not very useful in their learning, 62 (37.1%) feeling the system is useful and 10 (6.0%) remaining neutral.

Table 4.22: System perceived usefulness

Perceived usefulness statement | Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree
System helps me accomplish learning tasks more quickly | 41 (24.6%) | 69 (41.3%) | 10 (6.0%) | 26 (15.6%) | 21 (12.6%)
Using the system will improve my learning performance | 40 (24.0%) | 61 (36.5%) | 4 (2.4%) | 43 (25.7%) | 19 (11.4%)
Using the system makes it easier to learn course content | 35 (21.0%) | 55 (32.9%) | 10 (6.0%) | 50 (29.9%) | 12 (7.2%)
System allows learner control over his/her learning process | 34 (20.4%) | 60 (35.9%) | 9 (5.4%) | 48 (28.7%) | 16 (9.6%)
I find the e-learning system useful in my learning | 40 (24.0%) | 55 (32.9%) | 10 (6.0%) | 50 (29.9%) | 12 (7.2%)
Source: Research data (2013)
Figure 4.8 shows the lecturers' feelings on the usefulness of the web-based e-learning system. Findings show that 38 (56.7%) disagreed that using the system makes it easier to teach course content, 23 (34.3%) agreed that the system makes it easier to teach course content while 6 (9.0%) remained undecided. A total of 29 (43.3%) lecturers agreed that using the system enhances their effectiveness in teaching while 38 (56.7%) disagreed, and the same findings were observed on the web-based e-learning system being useful in the teaching process.
[Figure 4.8 is a bar chart of lecturers' responses, from Strongly Disagree to Strongly Agree, to three statements: "Using the system makes it easier to teach course content", "Using the system enhances my effectiveness in teaching" and "I find the web-based e-learning system useful in my teaching".]
Figure 4.8: Lecturers' opinion on usefulness Source: Research data (2013)
Pearson's correlation analysis was done to find the relationship between demographics and system perceived usefulness. Table 4.23 shows the inferential statistics for system perceived usefulness factors. Pearson's correlation coefficients (r) are shown with the respective significance (p) in brackets. Findings in Table 4.23 show that system will allow me to accomplish learning task more quickly had a significant relationship with age, r(165) = 0.275, p = 0.047; level of study, r(165) = 0.111, p = 0.095 and duration of system use, r(165) = 0.496, p = 0.001, while it had a significant negative relationship with gender, r(165) = -0.284, p = 0.044; year of study, r(165) = -0.329, p = 0.034; learning style, r(165) = -0.118, p = 0.092 and mode of study, r(165) = -0.123, p = 0.057. Using the system improves my learning performance had a significant relationship with age, r(165) = 0.419, p = 0.016; gender, r(165) = 0.377, p = 0.024 and level of study, r(165) = 0.375, p = 0.025, while it had a significant negative relationship with year of study, r(165) = -0.391, p = 0.022; duration of system use, r(165) = -0.336, p = 0.033; learning style, r(165) = -0.206, p = 0.064 and mode of study, r(165) = -0.174, p = 0.073.
Table 4.23: Students' opinion on system perceived usefulness factors

System perceived usefulness factors | Age | Gender | Level of study | Year of study | Duration of system use | Learning style | Mode of study
System will allow me to accomplish learning task more quickly | 0.275 (0.047) | -0.284 (0.044) | 0.111 (0.095) | -0.329 (0.034) | 0.496 (0.001) | -0.118 (0.092) | -0.123 (0.057)
Using the system improves my learning performance | 0.419 (0.016) | 0.377 (0.024) | 0.375 (0.025) | -0.391 (0.022) | -0.336 (0.033) | -0.206 (0.064) | -0.174 (0.073)
System makes it easier to learn course content | 0.449 (0.010) | 0.116 (0.068) | 0.334 (0.033) | 0.113 (0.094) | -0.478 (0.004) | -0.498 (0.001) | -0.142* (0.033)
System allows learner control over his/her learning process | 0.343 (0.032) | 0.352 (0.030) | 0.402 (0.019) | 0.464 (0.007) | -0.223 (0.059) | -0.320 (0.036) | -0.118 (0.065)
Correlation is significant at 0.05 (1-tailed) * Correlation is significant at 0.01 (1-tailed) ** Source: Research data (2013)
System makes it easier to learn course content had a significant relationship with age, r(165) = 0.449, p = 0.010; gender, r(165) = 0.116, p = 0.068; level of study, r(165) = 0.334, p = 0.033 and year of study, r(165) = 0.113, p = 0.094, while it had a significant negative relationship with duration of system use, r(165) = -0.478, p = 0.004; learning style, r(165) = -0.498, p = 0.001 and mode of study, r(165) = -0.142, p = 0.033. System allows learner control over his/her learning process had a significant relationship with age, r(165) = 0.343, p = 0.032; gender, r(165) = 0.352, p = 0.030; level of study, r(165) = 0.402, p = 0.019 and year of study, r(165) = 0.464, p = 0.007, while it had a significant negative relationship with duration of system use, r(165) = -0.223, p = 0.059; learning style, r(165) = -0.320, p = 0.036 and mode of study, r(165) = -0.118, p = 0.065. Findings show that age is likely to predict that students will agree that the system will allow them to accomplish learning tasks more quickly. Students who prefer learning in a group and distance learning students are more likely to disagree that the e-learning system is useful in their learning process, while students in higher levels of learning are likely to agree that the system is useful. Table 4.24 shows Pearson's correlation coefficients (r) and respective significance (p) in brackets for lecturers' opinion on system perceived usefulness. Using the system will make it easier to teach course content had a significant negative relationship with age, r(65) = -0.243, p = 0.086 and a significant relationship with gender, r(65) = 0.256, p = 0.082; level of study, r(65) = 0.255, p = 0.034 and duration of system use, r(65) = 0.305, p = 0.006.
Table 4.24: Lecturers' opinion on system perceived usefulness factors

System perceived usefulness factors | Age | Gender | Level of study | Duration of system use
Using the system will make it easier to teach course content | -0.243 (0.086) | 0.256 (0.082) | 0.255* (0.034) | 0.305** (0.006)
Using the system will enhance my effectiveness in teaching | -0.238 (0.088) | -0.045 (0.360) | 0.212* (0.042) | 0.219* (0.038)
I find the e-learning system useful in my teaching | -0.270* (0.013) | -0.443 (0.018) | 0.180 (0.073) | 0.224* (0.034)
Correlation is significant at 0.05 (1-tailed) * Correlation is significant at 0.01 (1-tailed) ** Source: Research data (2013)
Using the system will enhance my teaching effectiveness had a significant negative relationship with age, r(65) = -0.238, p = 0.088, but had a significant relationship with level of study, r(65) = 0.212, p = 0.042 and duration of system use, r(65) = 0.219, p = 0.038, while it had a minimal non-significant relationship with gender. These findings concur with Mbarek and Zaddem (2013), who found that perceived usefulness positively influences e-learning effectiveness. E-learning system is useful in teaching had a significant relationship with level of study, r(65) = 0.180, p = 0.073 and duration of system use, r(65) = 0.224, p = 0.034, while it had a significant negative relationship with age, r(65) = -0.270, p = 0.013 and gender, r(65) = -0.443, p = 0.018. Older lecturers are not likely to agree that the system is useful in their teaching process, while lecturers with higher levels of study are more likely to agree with this assertion. Duration of system use also positively influences the feeling that the system is useful, with users who have used the system for a long period more likely to agree that the system is useful.
4.3.2.8 System perceived ease of use
Perceived ease of use is the degree to which the web-based e-learning system is easy to use to perform various pedagogic tasks and achieve course objectives. Table 4.25 shows the students' responses on the perceived ease of use of the available web-based e-learning system.
Findings in Table 4.25 show that the majority of the students are not confident of using the system if they have never used it before, with 90 (53.9%) disagreeing, 69 (41.4%) agreeing and 8 (4.8%) undecided. There was an equal number of students who are and are not confident of using the web-based e-learning system without being shown how, with 81 (48.5%) saying they need to be shown how to use the system and the same number saying they do not. A total of 90 (53.9%) disagreed that they are confident of using the system as long as they have a lot of time to complete the task at hand, 70 (41.9%) said they need a lot of time to complete a task with the system and 7 (4.2%) were undecided.
Table 4.25: Perceived ease of use

Perceived ease of use statement | Strongly Disagree | Disagree | Neutral | Agree | Strongly Agree
I am confident of using system even if there is no one around to show me | 31 (18.6%) | 50 (29.9%) | 5 (3.0%) | 54 (32.3%) | 27 (16.2%)
I am confident of using system even if I have never used such a system before | 30 (18.0%) | 60 (35.9%) | 8 (4.8%) | 39 (23.4%) | 30 (18.0%)
I am confident of using the system as long as I have a lot of time to complete the task to be done | 46 (27.5%) | 44 (26.3%) | 7 (4.2%) | 51 (30.5%) | 19 (11.4%)
Learning to operate the web-based e-learning system is easy for me | 33 (19.8%) | 58 (34.7%) | 11 (6.6%) | 42 (25.1%) | 23 (13.8%)
Source: Research data (2013)
On learning how to operate the web-based e-learning system, 91 (54.5%) disagreed that it is easy to learn how to operate the system, 65 (38.9%) agreed that it is easy to learn how to operate the system while 11 (6.6%) were undecided. These findings coincide with the findings of Kiget (2012), who found that 57.5% of students found it hard to learn how to use the web-based e-learning system. Similar results were observed in the lecturers' sample, with 33 (49.3%) disagreeing with the assertion that they can use the web-based e-learning system even if there is no one to show them and 34 (50.7%) agreeing that they can use the system without being shown how. A total of 36 (53.7%) disagreed that they can use the system even if they have never used it before, while 31 (46.3%) agreed with that assertion. Results showed that 37 (55.2%) lecturers need a lot of time with the system to accomplish a task, while 30 (44.8%) did not need a lot of time to complete a task with the system. On the ease of learning how to operate the e-learning system, 36 (53.7%) disagreed that it is easy to learn how to operate the system while 31 (46.3%) agreed that it is easy to learn how to operate the system.
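The "agree" and "disagree" totals quoted in the text pool the two ends of the five-point scale shown in the tables (Strongly Disagree + Disagree, and Agree + Strongly Agree). A minimal sketch of that aggregation, using one row of Table 4.25 as the example:

```python
def collapse_likert(counts):
    """counts: (SD, D, N, A, SA) frequencies -> (disagree, neutral, agree)."""
    sd, d, n, a, sa = counts
    return sd + d, n, a + sa

def as_percent(freq, total):
    """Percentage of total, rounded to one decimal as in the tables."""
    return round(100.0 * freq / total, 1)

# Row from Table 4.25: "I am confident of using system even if there is
# no one around to show me" (167 student respondents in total).
row = (31, 50, 5, 54, 27)
disagree, neutral, agree = collapse_likert(row)
print(disagree, agree, as_percent(disagree, sum(row)))
```

This reproduces the equal split noted above for that statement: 81 disagreeing and 81 agreeing, i.e. 48.5% each.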
Pearson's correlation analysis was done to find the relationship between demographics and system perceived ease of use. Table 4.26 shows the inferential statistics for system perceived ease of use factors. Pearson's correlation coefficients (r) are shown with the respective significance (p) in brackets. I am confident of using the system even if there is no one to show me how had a significant relationship with age, r(165) = 0.152, p = 0.025; duration of system use, r(165) = 0.197, p = 0.066 and year of study, r(165) = 0.104, p = 0.091, and a significant negative relationship with mode of study, r(165) = -0.118, p = 0.065, while it had minimal non-significant relationships with gender, level of study and learning style.
Table 4.26: Students' opinion on system perceived ease of use factors

System perceived ease of use factors | Age | Gender | Level of study | Year of study | Duration of system use | Learning style | Mode of study
I am confident of using the system even if there is no one around to show me how | 0.152* (0.025) | 0.048 (0.268) | 0.020 (0.400) | 0.104 (0.091) | 0.197 (0.066) | 0.014 (0.413) | -0.118 (0.065)
I am confident of using the system even if I have never used it before | 0.108 (0.071) | 0.054 (0.243) | 0.072 (0.176) | 0.027 (0.363) | -0.021 (0.393) | -0.155* (0.023) | -0.010 (0.449)
I am confident of using the system as long as I have a lot of time to complete task to be done | 0.045 (0.280) | 0.152* (0.025) | -0.035 (0.327) | 0.182** (0.009) | 0.200** (0.005) | 0.173* (0.013) | -0.009 (0.454)
Learning to operate the system is easy for me | 0.050 (0.262) | 0.006 (0.471) | 0.171 (0.074) | 0.102 (0.056) | 0.114 (0.067) | -0.183** (0.009) | 0.002 (0.488)
Correlation is significant at 0.05 (1-tailed) * Correlation is significant at 0.01 (1-tailed) ** Source: Research data (2013)
As shown in Table 4.26, I am confident of using the system even if I have never used it before had a significant relationship with age, r(165) = 0.108, p = 0.071 and a significant negative relationship with learning style, r(165) = -0.155, p = 0.023, while very minimal non-significant relationships were observed with gender, level of study, year of study, duration of system use and mode of study. I am confident of using the system as long as I have a lot of time to complete the task to be done had a significant relationship with gender, r(165) = 0.152, p = 0.025; year of study, r(165) = 0.182, p = 0.009; duration of system use, r(165) = 0.200, p = 0.005 and learning style, r(165) = 0.173, p = 0.013, while it had very minimal non-significant relationships with age, level of study and mode of study. As shown in Table 4.26, learning to operate the system is easy had a significant relationship with level of study, r(165) = 0.171, p = 0.074; year of study, r(165) = 0.102, p = 0.056 and duration of system use, r(165) = 0.114, p = 0.067, while it had a significant negative relationship with learning style, r(165) = -0.183, p = 0.009 and very minimal non-significant relationships with age, gender and mode of study. Table 4.27 shows Pearson's correlation coefficients (r) and respective significance (p) in brackets for lecturers' opinion on system perceived ease of use. Findings show that older students are more likely to feel that the system is easy to use. Duration of system use also has a positive influence on the perceived usability of the system: the more one uses the system, the more acquainted one becomes with it and the easier it becomes to use. Students who prefer working in groups were more likely to have issues using the system than those who prefer working individually, while mode of study had very little influence on the system's usability.
As shown in Table 4.27, I am confident of using the system even if there is no one to show me had a significant relationship with level of study, r(65) = 0.229, p = 0.031 and a significant negative relationship with age, r(65) = -0.336, p = 0.003; gender, r(65) = -0.159, p = 0.099 and duration of system use, r(65) = -0.444, p = 0.001. I am confident of using the system even if I have never used such a system before had a significant negative relationship with duration of system use, r(65) = -0.302, p = 0.006, while very minimal non-significant relationships were observed with age, gender and level of study.
Table 4.27: Lecturers' opinion on system perceived ease of use factors

Perceived ease of use factors | Age | Gender | Level of study | Duration of system use
I am confident of using the system even if there is no one to show me | -0.336** (0.003) | -0.159 (0.099) | 0.229* (0.031) | -0.444** (0.001)
I am confident of using the system even if I have never used such a system before | -0.095 (0.221) | -0.096 (0.219) | 0.033 (0.397) | -0.302** (0.006)
I am confident of using the system as long as I have a lot of time to complete the task | 0.253* (0.019) | -0.184 (0.068) | -0.321** (0.004) | -0.211* (0.043)
Learning to operate the e-learning system is easy to me | -0.251* (0.020) | 0.136 (0.136) | 0.065 (0.302) | 0.218* (0.038)
Correlation is significant at 0.05 (1-tailed) * Correlation is significant at 0.01 (1-tailed) ** Source: Research data (2013)
I am confident of using the system as long as I have a lot of time to complete a task had a significant relationship with age, r(65) = 0.253, p = 0.019, while it had a significant negative relationship with gender, r(65) = -0.184, p = 0.068; level of study, r(65) = -0.321, p = 0.004 and duration of system use, r(65) = -0.211, p = 0.043. Learning to operate the system had a significant relationship with duration of system use, r(65) = 0.218, p = 0.038 and a significant negative relationship with age, r(65) = -0.251, p = 0.020. The findings concur with Saadé and Bahli (2005), who found a positive and significant relationship between perceived ease of use and perceived usefulness. Findings show that older lecturers are likely to have issues with system usability, and that the more a lecturer uses the system the more he feels that the system is not user friendly, while gender and level of study had very little influence on the web-based e-learning system's perceived ease of use.
4.3.3 Web-based e-learning system user satisfaction
User satisfaction is often regarded as an individual's feelings of pleasure or disappointment resulting from comparing a product's performance (or outcome) with his or her expectations (Chiu et al., 2005). This section looks at factors that affect users' satisfaction with the various functionality attributes discussed in Section 4.3.2, in order to achieve objective three, which was to find out factors that determine web-based e-learning systems' user satisfaction in Kenyan universities.
4.3.3.1 Web-based e-learning system user satisfaction factors
Respondents were asked to rate their level of satisfaction with the system. Figure 4.9 shows the responses obtained on the level of satisfaction of the users. Findings show that 119 (71.3%) were not satisfied with web-based e-learning system interactivity, 15 (8.9%) were neutral and only 33 (19.8%) agreed that they are satisfied with system interactivity. A total of 97 (58.1%) disagreed that they are satisfied with e-learning system response, 54 (32.3%) agreed that they were satisfied and 16 (9.6%) were undecided. Students were asked to rate their satisfaction with the features and tools incorporated in the e-learning system: 129 (77.3%) were not satisfied, 23 (13.8%) were satisfied and only 15 (8.9%) were neutral. As shown in Figure 4.9, 116 (69.5%) of the students were not satisfied with web-based e-learning system interoperability, 43 (25.8%) claimed to be satisfied and only 8 (4.8%) were undecided. On system security satisfaction, 127 (76.1%) disagreed that they were satisfied, 29 (17.4%) claimed to be satisfied with system security while 11 (6.6%) remained undecided.
Students were asked to rate how easy it is to use the e-learning system: 91 (54.5%) were not satisfied with the e-learning system's ease of use, 67 (40.1%) agreed that they were satisfied while 9 (5.4%) remained undecided.
[Figure 4.9 is a bar chart of the Disagree/Neutral/Agree frequencies for six student satisfaction statements: system interactivity, system response, features in the system, system inter-operability, system security and the system's ease of use; the frequencies are those quoted in the text above.]
Figure 4.9: Users' level of satisfaction Source: Research data (2013)
4.3.3.2 Supplementary factors affecting e-learning system user satisfaction
Respondents were requested to state any factors they think hinder them from achieving the best from the web-based e-learning system they use. This section presents factors that users pointed out as obstacles to the system meeting their expectations. Figure 4.10 shows the results from student respondents on the functionality and satisfaction factors.
[Data shown in Figure 4.10 (response frequencies):]

Functionality and satisfaction factor | Yes | No
University policy and culture | 115 | 52
Type of content published on the e-learning system | 99 | 68
Inaccuracy of the content provided by e-learning system | 84 | 83
Poor organization and sequencing of course content | 104 | 63
The e-learning system is hard to use | 109 | 58
Low network access speed | 115 | 52
Availability of the e-learning system | 102 | 65
Poor ICT infrastructure at the University | 103 | 64
Lack of required tools in the e-learning system | 100 | 67
Inappropriate content provided by e-learning system | 92 | 75
Lack of skills to use the e-learning system | 109 | 58
Figure 4.10: More factors affecting user satisfaction Source: Research data (2013)
Findings in Figure 4.10 show that 109 (65.3%) of the students feel lack of skills to use the system hinders them from getting the best from the system, while 58 (34.7%) felt that lack of skills does not affect them. Lack of required tools in the web-based e-learning system could be another factor affecting functionality and satisfaction, with 100 (59.9%) of the students citing it. Inappropriate content provided by the web-based e-learning system was cited by 92 (55.1%) of the students, poor ICT infrastructure at the University was cited by 103 (61.7%) students and 102 (61.1%) said availability of the web-based e-learning system affects system functionality and their satisfaction. Low network access speed was also cited as a factor affecting functionality of the web-based e-learning system and user satisfaction by 115 (68.9%) students, ease of use of the e-learning system was cited by 109 (65.3%) and poor organization and sequencing of course content was pointed out as a factor affecting functionality and user satisfaction by 104 (62.3%) students. Inaccuracy of the content provided by the web-based e-learning system was cited by 84 (50.3%) students, 99 (59.3%) cited the type of content published in the system as a factor affecting functionality and user satisfaction and 115 (68.9%) cited university policy and culture.
4.3.3.3 Factors that determine user satisfaction inferential statistics
This section presents factors that determine web-based e-learning system user satisfaction, addressing the second objective of the study, which was to find out factors that determine web-based e-learning systems' user satisfaction in Kenyan universities. One-way analysis of variance (ANOVA) was carried out on various demographic and other factors to find out their level of significance in influencing web-based e-learning system user satisfaction. Correlation was carried out to find the relationship between these factors and web-based e-learning system user satisfaction.
Table 4.28 shows the results of 1-tailed correlation significance between demographic factors and web-based e-learning system satisfaction; the corresponding Pearson correlation values are enclosed in brackets.
Table 4.28: Pearson's correlation of demographics and system satisfaction
(each cell shows the 1-tailed significance, with the Pearson correlation coefficient in brackets)

Satisfaction statement                     | Gender           | Age            | Level of study   | Mode of study   | Year of study
Am satisfied with system interactivity     | 0.365 (0.365)**  | 0.127 (0.089)  | 0.018 (-0.163)*  | 0.260 (-0.050)* | 0.011 (0.177)*
Am satisfied with system response          | 0.490 (0.002)    | 0.419 (-0.016) | 0.003 (-0.209)** | 0.345 (0.098)   | 0.105 (-0.031)
Am satisfied with content in the system    | 0.055 (0.124)    | 0.236 (0.056)  | 0.243 (0.054)    | 0.055 (0.150)*  | 0.026 (0.124)
Am satisfied with features in the system   | 0.000 (-0.257)** | 0.114 (0.094)  | 0.458 (0.008)    | 0.009 (0.183)** | 0.476 (-0.005)
Am satisfied with system interoperability  | 0.000 (-0.440)*  | 0.342 (-0.032) | 0.026 (0.151)**  | 0.000 (0.382)*  | 0.046 (-0.131)**
Am satisfied with system security          | 0.266 (-0.049)   | 0.039 (0.137)  | 0.493 (0.001)    | 0.231 (0.057)   | 0.453 (0.001)
E-learning system is useful in my learning | 0.388 (0.022)    | 0.291 (0.043)  | 0.052 (0.126)    | 0.030 (-0.146)* | 0.435 (0.013)
Am satisfied with systems' ease of use     | 0.363 (0.027)    | 0.064 (0.118)  | 0.277 (0.046)    | 0.451 (0.010)   | 0.050 (0.128)
Correlation is significant at the 0.05 level (1-tailed).* Correlation is significant at the 0.01 level (1-tailed).**
Source: Research data (2013)
Pearson's correlation coefficients in Table 4.28 show that the gender of the respondent has a moderate relationship with system interactivity satisfaction at the 0.01 (1-tailed) significance level, with a Pearson's correlation coefficient of 0.365. Age has a slight direct relationship with system interactivity satisfaction, with a correlation coefficient of 0.089; level of study has a moderate negative correlation with system interactivity satisfaction, with a correlation coefficient of -0.163 at the 0.05 (1-tailed) significance level. Mode of study has a minimal negative correlation with interactivity satisfaction, with a correlation coefficient of -0.050 at the 0.05 (1-tailed) significance level, while year of study has a slight positive correlation with system interactivity satisfaction, with a correlation coefficient of 0.177 at the 0.05 (1-tailed) significance level.
Findings show that gender has a very minimal relationship with system response satisfaction, with a Pearson's correlation coefficient of 0.002; age has a slight negative correlation with system response satisfaction (-0.016); and level of study has a moderate negative correlation, significant at the 0.01 (1-tailed) level, with a coefficient of -0.209. Mode of study has a slight positive relationship with system response satisfaction (0.098), and year of study has a slight negative relationship with system response satisfaction (-0.031). Mode of study had a moderate relationship with satisfaction with content in the web-based e-learning system, with a coefficient of 0.150 at the 0.05 (1-tailed) significance level; year of study and gender had a positive relationship with satisfaction with content, both with a coefficient of 0.124, and level of study and age had minimal relationships with content satisfaction, with coefficients of 0.054 and 0.056 respectively. Year of study registered a very minimal negative relationship with satisfaction with features in the system (-0.005). Mode of study and satisfaction with features in the system had a moderate positive relationship at the 0.01 (1-tailed) significance level, with a coefficient of 0.183; satisfaction with features in the system had a slight positive relationship with age (0.094), while gender had a moderate negative relationship with satisfaction with features in the system (-0.257) at the 0.01 significance level.
Gender had a significant negative relationship with system interoperability satisfaction, with a Pearson's correlation coefficient of -0.440; mode of study also had a significant positive relationship with system interoperability satisfaction, with a coefficient of 0.382, both at the 0.05 (1-tailed) significance level. Year of study had a moderate negative relationship with system interoperability satisfaction (-0.131) and level of study had a moderate positive relationship with system interoperability satisfaction (0.151), both at the 0.01 (1-tailed) significance level. Results
also show that age and system interoperability satisfaction were slightly negatively related, with a Pearson's correlation coefficient of -0.032. System security satisfaction was not strongly related to demographics, with coefficients of -0.049 for gender, 0.001 for both level of study and year of study, and 0.057 for mode of study; age had a slight positive correlation with system security satisfaction (0.137) at the 0.01 (1-tailed) significance level. There was a slight positive relationship between both age and gender and respondents' feeling that the web-based e-learning system is useful in their learning, with coefficients of 0.043 for age and 0.022 for gender. The findings show a moderate positive relationship between level of study and respondents' feeling that the system is useful in their learning (0.126), which means that masters students felt the system is most useful to them, followed by degree and then diploma students. Mode of study had a moderate negative relationship with respondents' feeling that the system is useful in their learning (-0.146), meaning that full time students felt the web-based e-learning system is more useful in their learning than part time and distance learning students. Year of study had a minimal positive relationship with respondents' feeling that the system is useful in their learning (0.013). Findings in Table 4.28 also show a moderate positive relationship between age and satisfaction with the system's ease of use (0.118).
Year of study had a moderate positive relationship with satisfaction with the system's ease of use (0.128), with level of study and gender having slight relationships with ease of use satisfaction (0.046 and 0.027 respectively). Mode of study had a minimal relationship with ease of use satisfaction (0.010). Table 4.29 shows the Pearson's correlation between other factors and web-based e-learning system user satisfaction.
Table 4.29: More factors affecting user satisfaction
(each cell shows the 1-tailed significance, with the Pearson correlation coefficient in brackets)
Column key: Period of use = period of using the available e-learning system; Learning style = user preferred learning style; Lack of tools = lack of required tools in the web-based e-learning system; Inappropriate content = inappropriate content provided by the e-learning system; Hard to use = the available web-based e-learning system is hard to use.

Satisfaction statement                     | Period of use  | Learning style   | Lack of tools    | Inappropriate content | Hard to use
Am satisfied with system interactivity     | 0.159 (0.040)  | 0.001 (0.244)**  | 0.013 (-0.172)   | 0.017 (-0.164)*       | 0.113 (-0.094)
Am satisfied with system response          | 0.118 (-0.092) | 0.111 (-0.095)   | 0.474 (0.005)    | 0.171 (-0.074)        | 0.091 (-0.104)
Accuracy of content in the system          | 0.207 (0.064)  | 0.115 (0.093)    | 0.077 (-0.111)   | 0.002 (-0.226)**      | 0.229 (-0.058)
Am satisfied with features in the system   | 0.413 (0.017)  | 0.007 (-0.192)** | 0.001 (-0.257)** | 0.010 (-0.179)*       | 0.286 (-0.044)
Satisfied with system interoperability     | 0.302 (-0.040) | 0.037 (-0.139)*  | 0.002 (0.279)**  | 0.132 (-0.087)        | 0.286 (-0.044)
Am satisfied with system security          | 0.111 (-0.095) | 0.482 (-0.004)   | 0.346 (-0.031)   | 0.009 (-0.183)**      | 0.459 (-0.008)
E-learning system is useful in my learning | 0.461 (-0.008) | 0.294 (-0.142)   | 0.032 (-0.143)*  | 0.157 (-0.113)        | 0.004 (-0.202)**
Am satisfied with systems' ease of use     | 0.020 (0.159)  | 0.057 (-0.123)   | 0.399 (-0.020)   | 0.020 (-0.160)        | 0.067 (-0.198)
Correlation is significant at the 0.05 level (1-tailed).* Correlation is significant at the 0.01 level (1-tailed).**
Source: Research data (2013)
There is a significant positive relationship between learning style and satisfaction with system interactivity, with a Pearson's correlation coefficient of 0.244. Inappropriate content provided by the web-based e-learning system had a moderately significant negative relationship with system interactivity satisfaction, with a coefficient of -0.164, and lack of required tools in the web-based e-learning system was moderately negatively related with system interactivity
satisfaction, with a coefficient of -0.172. Period of using the e-learning system and ease of use of the e-learning system had slight relationships with system interactivity satisfaction, with coefficients of 0.040 and -0.094 respectively. There was a negative relationship between ease of use of the e-learning system and system response satisfaction, with a Pearson's correlation coefficient of -0.104. Period of using the e-learning system, user learning style and inappropriate content published in the e-learning system had slight negative relationships with system response satisfaction, with coefficients of -0.092, -0.095 and -0.074 respectively. Lack of required tools in the web-based e-learning system had a very minimal relationship with system response satisfaction (0.005). As shown in Table 4.29, inappropriate content published in the e-learning system has a significant negative relationship with satisfaction with content published in the system, with a coefficient of -0.226. Lack of required tools in the web-based e-learning system had a coefficient of -0.111 with content satisfaction, user learning style had a coefficient of 0.093 with content satisfaction, and period of using the e-learning system had a slight positive relationship with content satisfaction (0.064). Hardship in system use had a slight negative relationship with content satisfaction (-0.058). Lack of required tools in the web-based e-learning system had a significant negative relationship with satisfaction with features in the system (-0.257), and user learning style also had a relatively significant negative relationship with satisfaction with features in the system (-0.192).
Inappropriate content published in the web-based e-learning system had a moderately negative relationship with satisfaction with features in the system, with a Pearson's correlation coefficient of -0.179. Ease of use had a slight relationship with satisfaction with features in the system (-0.044), and period of web-based e-learning system use was related to features satisfaction with a coefficient of 0.017.
There were very minimal negative relationships of period of web-based e-learning system use and of hardship in using the system with system interoperability satisfaction, with coefficients of -0.040 and -0.044 respectively. Lack of required tools in the e-learning system had a significant relationship with system interoperability satisfaction, with a Pearson's correlation coefficient of 0.279, and user learning style had a moderate negative relationship (-0.139). Inappropriate content published in the system had a slight negative relationship with system interoperability satisfaction (-0.087). Hardship in using the available web-based e-learning system had a significant negative relationship with the feeling that the e-learning system is useful in one's learning (-0.202). Lack of required tools in the web-based e-learning system had a moderate negative relationship with system usefulness (-0.143), while period of using the web-based e-learning system had nearly no relationship with system usefulness. User learning style had a negative relationship with system usefulness (-0.142), and inappropriate content published in the e-learning system also had a moderate negative relationship with system usefulness (-0.113). Period of using the web-based e-learning system had a significant positive relationship with perceived ease of use of the system (0.159), user learning style had a moderate negative relationship with the system's ease of use (-0.123), while lack of required tools had a very minimal negative relationship with ease of use satisfaction (-0.020).
Hardship in using the web-based e-learning system had a significant negative relationship with satisfaction with the system's ease of use, with a Pearson's correlation coefficient of -0.198.
4.3.3.4 Respondents' suggestions for improving e-learning systems
Respondents were asked to suggest how they wished the web-based e-learning system to be improved, and 86 (51.5%) shared their suggestions. This section presents these suggestions. Among the suggestions for improving functionality and user satisfaction were the following: the system should be organized such that it can be accessed using various devices such as phones and tablets; students should be able to ask lecturers questions and receive timely feedback; the university should improve network access speeds and ICT infrastructure, as also suggested by Sharmin and Sohel (2014); the system should be designed to meet the needs of diverse learners; learners should be allowed to choose which material or topics to access; content should include simulations and animations to help in understanding concepts; and the system should be easy to use without training. They further suggested that the university provide some basic training on how to use the system; that published content include video and audio tutorials to suit blind and deaf learners; improved lecturer-student interactive communication; an increase in the number of computers in computer rooms; proper organization and sequencing of course content; that the system be readily available around the clock; that the system allow learners to choose the language they want to use in learning, e.g. English or French; that the content provided be the same and consistent regardless of the equipment used to access it; and that the system be simple and easily accessible.
4.3.3.5 Analysis of variance for differences in satisfaction
This section presents analysis of variance to find out differences in web-based e-learning system satisfaction. A one-way ANOVA was used to test for satisfaction differences among students and lecturers.
Differences were tested between age groups, gender, level of study, year of study, mode of study, period of system use and learning style. Table 4.30 shows the F values and significance levels for the ANOVA performed on the students' data. Note that findings are considered significant at the 0.07 level for the students'
sample and 0.10 for the lecturers' sample. See Appendix 5 for the descriptive statistics tables generated by SPSS from the research data. Users aged between 20 and 25 years were more satisfied with system interoperability, M = 2.55, and interoperability satisfaction differed significantly between age groups, F = 2.212, p = 0.069. Satisfaction with system interactivity differed significantly between age groups, F = 3.834, p = 0.011, with users in the 20-25 age group being more satisfied than others. The feeling that the system is useful in the learning process differed significantly, F = 2.869, p = 0.047, with users below the age of 20 being more satisfied than others. Users aged above 30 years were more satisfied with system usability, which differed significantly across age groups, F = 2.836, p = 0.066. System interactivity satisfaction differed significantly across gender, F = 25.440, p = 0.001, with male users being more satisfied, M = 2.55, than their female counterparts. Satisfaction with system features also differed significantly, F = 11.654, p = 0.001, with female users being more satisfied, M = 2.34, than male users. System interoperability satisfaction also differed significantly, F = 39.596, p = 0.001, with female users being more satisfied, M = 3.02, than their male counterparts. Satisfaction with system interactivity differed significantly between levels of study, F = 6.937, p = 0.001, with degree students being more satisfied than masters and diploma students. Satisfaction with system response also differed significantly between levels of study, F = 5.448, p = 0.005, with diploma students being more satisfied than degree and masters students. Masters students were more satisfied with system interoperability, M = 3.00, and there was a significant difference in satisfaction, F = 8.023, p = 0.001. The feeling that the system is useful in learning differed significantly between levels of study, F = 5.487, p = 0.005, with masters students being more satisfied than diploma and degree students.
Satisfaction with content published in the system differed significantly between years of study, F = 3.463, p = 0.018, with third years being more satisfied, M = 4.00, than other years of study. Satisfaction with system interactivity also differed significantly between years of study, F = 8.459, p = 0.001, with fourth years being more satisfied with interactivity than first, second and third years.
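The one-way ANOVA used throughout this section can be illustrated by computing the F statistic directly as the ratio of the between-group mean square to the within-group mean square. This is a sketch on hypothetical toy scores for illustration; the study's analysis was run in SPSS on the research data.

```python
def one_way_anova_f(*groups):
    """One-way ANOVA F statistic: between-group over within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # between-groups sum of squares (df = k - 1)
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # within-groups sum of squares (df = n - k)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical toy data: satisfaction scores for three modes of study
full_time = [2, 3, 2, 3]
part_time = [3, 4, 3, 4]
distance = [4, 5, 4, 5]
print(round(one_way_anova_f(full_time, part_time, distance), 3))  # → 12.0
```

A large F value means the variation between group means is large relative to the variation within the groups, which is what the significance levels in Tables 4.30 and 4.31 summarise.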
Table 4.30: ANOVA for students' satisfaction
(each cell shows the F value with the significance level in brackets)

Satisfaction statement                                | Age           | Gender         | Level of study | Year of study | Mode of study  | Period of use  | Learning style
Am satisfied with system interactivity                | 3.834 (0.011) | 25.440 (0.001) | 6.937 (0.001)  | 8.459 (0.001) | 1.364 (0.259)  | 12.555 (0.001) | 9.581 (0.001)
Am satisfied with system response                     | 0.847 (0.870) | 0.001 (0.979)  | 5.448 (0.005)  | 1.054 (0.370) | 0.930 (0.397)  | 0.488 (0.691)  | 2.461 (0.065)
Content in the system is accurate to course objectives| 1.019 (0.386) | 2.572 (0.111)  | 0.485 (0.616)  | 3.463 (0.018) | 3.038 (0.051)  | 2.448 (0.062)  | 2.452 (0.061)
Am satisfied with features in the system              | 1.591 (0.153) | 11.654 (0.001) | 0.292 (0.747)  | 0.253 (0.859) | 3.116 (0.047)  | 0.137 (0.938)  | 5.326 (0.002)
Am satisfied with system interoperability             | 2.212 (0.069) | 39.596 (0.001) | 8.023 (0.001)  | 1.086 (0.357) | 16.494 (0.001) | 1.256 (0.291)  | 7.277 (0.001)
Am satisfied with system security                     | 1.863 (0.138) | 0.393 (0.532)  | 0.217 (0.805)  | 0.640 (0.590) | 0.283 (0.754)  | 0.499 (0.683)  | 0.372 (0.774)
System is useful in my learning process               | 2.869 (0.047) | 0.081 (0.777)  | 5.487 (0.005)  | 0.153 (0.927) | 3.329 (0.038)  | 2.742 (0.068)  | 2.944 (0.048)
Am satisfied with system usability                    | 2.836 (0.066) | 0.123 (0.726)  | 2.786 (0.067)  | 1.310 (0.273) | 2.208 (0.052)  | 2.801 (0.069)  | 2.451 (0.061)
Source: Research data (2013)
There was a significant difference in satisfaction with content published in the system between the three modes of study, F = 3.038, p = 0.051, with distance learning students being more satisfied, M = 4.03, than part time and full time students. Satisfaction with features in the system also differed significantly between modes of study, F = 3.116, p = 0.047, with distance learning students being more satisfied than
others. System interoperability satisfaction also differed significantly between the three modes of study, F = 16.494, p = 0.001, with distance learning students being more satisfied than others. The feeling that the system is useful in learning also differed significantly between modes of study, F = 3.329, p = 0.038, with part time students being more satisfied than others. Distance learning students found the system easier to use than other students, F = 2.208, p = 0.052. System interactivity satisfaction differed significantly with period of system use, F = 12.555, p = 0.001, with users who had used the system for more than three years being more satisfied than others. Satisfaction with content in the system also differed significantly between periods of use, F = 2.448, p = 0.062, with those who had used the system for more than three years being more satisfied than others. The feeling that the system is useful differed significantly between periods of use, F = 2.742, p = 0.068, and a significant difference was also observed in system usability, F = 2.801, p = 0.069. There was a significant difference in system interactivity satisfaction between the four learning styles, F = 9.581, p = 0.001. Satisfaction with content published in the system also differed significantly between learning styles, F = 2.452, p = 0.061, with students who prefer working alone being more satisfied than others. Satisfaction with system features differed significantly, F = 5.326, p = 0.002. System interoperability satisfaction differed significantly between learning styles, F = 7.277, p = 0.001; the feeling that the system is useful differed significantly between learning styles, F = 2.944, p = 0.048, with students who prefer reading and exploring models being most satisfied and those who prefer working in groups being least satisfied. System usability differed significantly between learning styles, F = 2.451, p = 0.061, with users who prefer reading and exploring models being more satisfied with usability than others.
Table 4.31 shows the ANOVA results for the lecturers' sample. Satisfaction with system interactivity differed significantly between age groups, F = 2.122, p = 0.098, with users in the 30-40 age group being more satisfied than others. Satisfaction with system response also differed significantly between age groups, F = 7.628, p = 0.001, with users in the 20-30 age group being more satisfied than others, M = 4.00. Users in the 30-40 age group were more satisfied with system usefulness,
M = 3.69, F = 12.858, p = 0.001; and system usability, M = 2.92, F = 2.418, p = 0.075.
Table 4.31: ANOVA for lecturers
(each cell shows the F value with the significance level in brackets)

Satisfaction statement                                | Age            | Gender        | Level of study | Period of use
Am satisfied with system interactivity                | 2.122 (0.098)  | 3.205 (0.078) | 0.191 (0.827)  | 4.566 (0.006)
Am satisfied with system response                     | 7.628 (0.001)  | 0.096 (0.757) | 7.271 (0.001)  | 13.849 (0.001)
Am satisfied with accuracy of the system              | 1.125 (0.346)  | 4.230 (0.044) | 2.048 (0.137)  | 3.976 (0.012)
Am satisfied with system suitability to its functions | 2.060 (0.115)  | 4.448 (0.039) | 0.546 (0.582)  | 0.889 (0.452)
Am satisfied with system interoperability             | 0.328 (0.805)  | 9.593 (0.003) | 2.816 (0.067)  | 15.617 (0.001)
Am satisfied with system security                     | 1.337 (0.270)  | 0.945 (0.335) | 8.950 (0.001)  | 2.303 (0.086)
System is useful in my teaching process               | 12.858 (0.001) | 0.021 (0.885) | 2.901 (0.062)  | 2.071 (0.099)
Am satisfied with system usability                    | 2.418 (0.075)  | 2.840 (0.097) | 4.179 (0.020)  | 3.112 (0.032)
Source: Research data (2013)
Satisfaction with system interactivity differed significantly between genders, F = 3.205, p = 0.078, with male users being more satisfied than female users, M = 3.03. Satisfaction with system accuracy also differed significantly between gender groups, F = 4.230, p = 0.044, with male lecturers being more satisfied, M = 3.28. Male lecturers were also more satisfied with system suitability, M = 2.79, F = 4.448, p = 0.039. Satisfaction with system interoperability also differed significantly between genders, F = 9.593, p = 0.003, with male lecturers being more satisfied than female lecturers. Male lecturers were also more satisfied with system usability, M = 2.87, which differed significantly between gender groups, F = 2.840, p = 0.097.
Satisfaction with system response differed significantly between the three levels of study, F = 7.271, p = 0.001, with diploma holders being more satisfied than others. System interoperability satisfaction differed significantly between levels of study, F = 2.816, p = 0.067, with diploma holders again being more satisfied than degree and masters holders. System security satisfaction also differed significantly between levels of study, F = 8.950, p = 0.001, with masters holders being more satisfied than diploma and degree holders. System usefulness in teaching differed significantly, F = 2.901, p = 0.062, with masters holders finding the system more useful than diploma and degree holders. Satisfaction with system usability also differed significantly between levels of study, F = 4.179, p = 0.020, with degree holders being more satisfied than masters and diploma holders. System interactivity satisfaction differed significantly between periods of system use, F = 4.566, p = 0.006, with lecturers who had used the system for less than two years being more satisfied than others. Satisfaction with system response also differed significantly between periods of use, F = 13.849, p = 0.001, with lecturers who had used the system for more than six years being more satisfied. Satisfaction with accuracy of the system differed significantly between periods of use, F = 3.976, p = 0.012, with lecturers who had used the system for 4-6 years being more satisfied. System interoperability satisfaction differed significantly between periods of use, F = 15.617, p = 0.001, with lecturers who had used the system for less than two years being more satisfied. System security satisfaction also differed significantly between periods of use, F = 2.303, p = 0.086, with lecturers who had used the system for 4-6 years being more satisfied.
Lecturers who had used the system for 4-6 years were more satisfied with system usefulness, which differed significantly between periods of use, F = 2.071, p = 0.099, and those who had used the system for more than six years were more satisfied with system usability than others. Usability differed significantly between periods of use, F = 3.112, p = 0.032.
CHAPTER FIVE
DEVELOPMENT OF PROPOSED ARCHITECTURE
5.1 Introduction
This chapter addresses the fourth objective of the study by developing an integrated adaptive service-oriented reference architecture (IASORA) that seeks to improve the functionality and user satisfaction of web-based e-learning systems. A service-oriented reference architecture describes the essence of a software architecture and its most significant and relevant aspects (Palanivel & Kuppuswami, 2011). IASORA was developed based on the findings of the study discussed in sections 4.2 and 4.3. Figure 5.1 shows the IASORA developed.
5.2 Current web-based e-learning system architectures
The majority of e-learning systems are implemented with client-server or centralized server-based architectures (Mohammed & Hussein, 2010), peer-to-peer architectures and, more recently, web services architectures (Dougiamas & Retalis, 2012). The client-server and centralized server approaches are metaphors of the student-teacher and repository-centric models, reflecting real-world learning scenarios in which teachers act as content producers while students act as content consumers (Yang, 2006). In web-based e-learning systems, the client is the web browser, which sends requests to a server and presents the results returned by the server.
5.3 Limitations of the current e-learning systems architectures
The current web-based e-learning systems' architectures have a number of inherent limitations, as discussed in section 2.4. These systems have major limitations in scalability, availability, distribution of computing power and storage, and sharing of information among users; they cannot handle a potentially large number of concurrent, geographically distributed users, and they are not flexible in content identification for learners (Palanivel & Kuppuswami, 2011). From the results in section 4.3.1, it was found that the web-based e-learning system could not be accessed from anywhere and that the content provided by the system does
not change to suit users' preferences or the computing power of the device they are using. It was also found that the system does not take into consideration the different learning styles of users and presents the same learning materials in the same format to varied students with different learning styles. Users of web-based e-learning systems are mobile and want a ubiquitous system that can be accessed from anywhere at any time (Munirah et al., 2012), but the current system architectures are not flexible, open, distributed and ubiquitous enough to suit diverse learners (Swierczek et al., 2012). As the results in section 4.3.1 show, 65.30% of respondents felt that the architecture of the web-based e-learning system needs to be improved.
5.4 Integrated adaptive service-oriented reference architecture
This study proposes an integrated adaptive service-oriented reference architecture (IASORA) for e-learning systems. IASORA utilizes cloud computing capabilities and integrates a mobile learning environment with the adaptive e-learning architecture. The architecture integrates various web-based e-learning system architectures and introduces new capabilities using service-oriented technology, adaptive learning technology and web services, with the aim of improving system functionality and user satisfaction in Kenyan public universities. The architecture was developed considering the various functionality and effectiveness issues and the factors affecting user satisfaction found by the study. The architecture is organized in four layers: the presentation layer, which basically consists of web browsers that act as clients and the user interface; the services layer, which consists of web services in both private and public clouds under SaaS or PaaS cloud computing models; the application layer, which consists of common business processes to be performed by a web-based e-learning system; and the storage layer, which consists of databases, file directories and cache servers.
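The four-layer organisation can be sketched as a request passing from the presentation layer down to storage and back. All class, method and course names below are hypothetical stand-ins for exposition only; they are not part of the IASORA specification.

```python
# Illustrative sketch of a request flowing through IASORA's four layers.
# All class and method names here are hypothetical, for exposition only.

class StorageLayer:
    """Databases, file directories and cache servers."""
    def __init__(self):
        self.content = {"CSC101": "Introduction to Programming notes"}
    def fetch(self, course_id):
        return self.content.get(course_id, "not found")

class ApplicationLayer:
    """Common business processes (course access, search, interaction)."""
    def __init__(self, storage):
        self.storage = storage
    def get_course_content(self, course_id):
        return self.storage.fetch(course_id)

class ServiceLayer:
    """Web services exposing application functionality through contracts."""
    def __init__(self, application):
        self.application = application
    def invoke(self, operation, **params):
        if operation == "content":
            return self.application.get_course_content(params["course_id"])
        raise ValueError("unknown operation")

class PresentationLayer:
    """Browser-facing tier: renders service results for the user."""
    def __init__(self, services):
        self.services = services
    def render(self, course_id):
        body = self.services.invoke("content", course_id=course_id)
        return f"<html><body>{body}</body></html>"

ui = PresentationLayer(ServiceLayer(ApplicationLayer(StorageLayer())))
print(ui.render("CSC101"))
```

The point of the layering is that each tier talks only to the one below it, so a layer (for example, the storage tier) can be replaced without changing the presentation code.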
[Figure 5.1, originally a diagram, shows IASORA's four layers: a presentation layer (computer and mobile browsers communicating over HTTP, TCP, FTP and SMTP); a services layer (SOAP services listed in a private cloud UDDI service registry — useracct.wsdl, register.wsdl, search.wsdl, integration.wsdl, security.wsdl, report.wsdl, notify.wsdl, repository.wsdl, ontology.wsdl, version.wsdl, connect.wsdl, platform.wsdl, track.wsdl — and a public cloud UDDI service registry — mail.wsdl, announce.wsdl, lang.wsdl, device.wsdl, discuss.wsdl, optimize.wsdl, chat.wsdl, model.wsdl, group.wsdl, screen.wsdl, apps.wsdl, publish.wsdl, personalize.wsdl); an application layer (authentication, authorization, confidentiality, access control, registration, course, content, interaction, personalization, communication, search/query, notification, versioning, semantic, ontology, device detection, language, inter-connect, screening/tracking, learner modelling, mobile learning, publishing tools and grouping); and a storage layer (database, replication, mobile database, cache and file servers).]
Figure 5.1: Integrated Adaptive Service-Oriented Reference Architecture (IASORA)
Source: Developed from research (2013)
5.4.1 Services identified
This section gives an overview of the services identified and added to the SORAPES architecture. The remaining services not mentioned in this section were already in SORAPES and are explained in Palanivel and Kuppuswami (2011). These services are found either in an institution's private cloud or in a public cloud. The connect service (connect.wsdl) facilitates the connection between the web-based e-learning system and other systems within the institution, such as the student management system, finance system, library management system and other e-learning systems, to enable seamless interoperability between these systems. This enables users to export content to other systems, easily import information into the system, and provide or access content in other e-learning systems or digital libraries. The track service (track.wsdl) tracks the learning history of each user to enable the system to intelligently discover students' learning habits and suggest learning material for students. The platform service (platform.wsdl) helps users to connect to and use other software, such as compilers, to test or execute programs without having to install the software on their machines (Mokhtar et al., 2013). PaaS enables users to access development platforms and tools through APIs which support a specific set of programming languages (Babar & Chauhan, 2011). The mail service (mail.wsdl) can be found in a public cloud and is used for sending and managing e-mail communication; the institution can use services like Gmail for student and staff mail rather than the university e-mail system (Mtebe, 2013). The announce service (announce.wsdl) allows users to make announcements using broadcast protocols or even SMS to users' mobile phones. The language service (lang.wsdl) is found in a public cloud and enables users to choose their preferred language of learning, into which content will be translated.
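How a consumer would locate one of these services in a private- or public-cloud UDDI registry can be sketched as a simple lookup, preferring the institution's private cloud. The registry contents and endpoint URLs below are hypothetical examples, not actual IASORA endpoints.

```python
# Illustrative sketch of locating a service in a UDDI-style registry.
# The registry entries and endpoint URLs are hypothetical examples.

PRIVATE_REGISTRY = {
    "connect.wsdl": "https://elearning.example.ac.ke/services/connect?wsdl",
    "track.wsdl": "https://elearning.example.ac.ke/services/track?wsdl",
    "platform.wsdl": "https://elearning.example.ac.ke/services/platform?wsdl",
}
PUBLIC_REGISTRY = {
    "mail.wsdl": "https://cloud.example.com/services/mail?wsdl",
    "lang.wsdl": "https://cloud.example.com/services/lang?wsdl",
    "device.wsdl": "https://cloud.example.com/services/device?wsdl",
}

def locate_service(name):
    """Look up a service contract, checking the private cloud first."""
    for registry in (PRIVATE_REGISTRY, PUBLIC_REGISTRY):
        if name in registry:
            return registry[name]
    raise KeyError(f"service {name!r} is not registered")

print(locate_service("track.wsdl"))
print(locate_service("mail.wsdl"))
```

In a full implementation the returned WSDL URL would be handed to a SOAP client library, which generates the client-side stubs from the contract.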
The device service (device.wsdl) detects the device a user is using to access the web-based e-learning system and triggers the optimize service (optimize.wsdl), which optimizes the content to fit the user's device. The group service (group.wsdl) organizes students into groups for discussions or projects and helps manage those groups, while the discuss service (discuss.wsdl) manages discussions within groups. The chat service (chat.wsdl) enables interactive student-student and lecturer-student communication, and the screen service (screen.wsdl) screens students' posts to prevent the distribution of unnecessary material that could clog the network bandwidth, slow the network and consequently degrade the response of the web-based e-learning system. The model service (model.wsdl) captures, maintains and tracks students' preferences, including learning style; this information is handy for content and assignment customization and optimization. The publish service (publish.wsdl) provides the tools required for publishing all types of content, from text to audio and video, while the personalize service (personalize.wsdl) enables students to choose how they want to learn a particular concept, either by re-arranging topics or by choosing what they do and do not want to learn in a particular subject.

5.4.2 Presentation layer
The presentation layer consists of basic HTML and XML used for presenting results to the varied users. It may offer multiple user interfaces, each displaying all or part of the application data, and it is designed so that users can customize it to their preferences. The primary function of this layer is to manage the presentation tier of the application and act as the user interface. The layer also contains some basic applications that initiate interaction with the system, and it forms part of the service consumer component of the service-oriented architecture.

5.4.3 Services layer
This layer contains the contracts (service descriptions) that bind service provider and consumer. Services are responsible for exposing a specific set of business functionality through a well-defined contract, where the contract describes a set of concrete operations and messages supported by the service (Erl, 2005).
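The contract/implementation separation described above can be sketched as follows. The announce-service operation shown is hypothetical, and a real contract would be expressed in WSDL rather than as a Python abstract class; the sketch only illustrates that consumers depend on the contract, never on the implementation.

```python
# Sketch of the contract/implementation split (after Erl, 2005).
# Operation names are illustrative assumptions, not part of IASORA.
from abc import ABC, abstractmethod


class AnnounceContract(ABC):
    """Service contract: the operations a consumer may rely on."""

    @abstractmethod
    def broadcast(self, message: str) -> int:
        """Send an announcement; return the number of recipients reached."""


class AnnounceService(AnnounceContract):
    """Service implementation: business logic hidden behind the contract."""

    def __init__(self, recipients):
        self._recipients = list(recipients)

    def broadcast(self, message: str) -> int:
        # A real implementation would push the message over a broadcast
        # protocol or an SMS gateway; here we only count the recipients.
        return len(self._recipients)


svc: AnnounceContract = AnnounceService(["a@example.ac.ke", "b@example.ac.ke"])
print(svc.broadcast("Exam timetable released"))  # -> 2
```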
Exposed services reside in this layer; they can be discovered and invoked, or possibly choreographed to create a composite service (Palanivel & Kuppuswami, 2011). Services completely encapsulate the coordination of calls to business components in response to the operations they expose to clients. The service producer publishes the service to the service registry, which the service consumer leverages for runtime binding. The service registry also acts as the system of record for predefined business policies, which can be enforced at runtime. A service consists of an implementation that provides business logic and data, a service contract that specifies the functionality, usage and constraints for a client of the service, and a service interface that physically exposes the functionality (Erl, 2005).

The services layer consists of the web services used by the web-based e-learning system to perform its various functions. It is organized in a hybrid cloud computing model comprising a private cloud and a public cloud. In a public cloud, the cloud infrastructure is made available to several users or clients and is owned by a cloud service provider (Mokhtar et al., 2013), while in a private cloud the infrastructure is operated solely for a particular organization, with services made available to internal users (Bansal et al., 2012). Normally, the private cloud is managed by an IT department within the organization or commissioned to a service provider or a third party, but all services remain dedicated to that organization (Mtebe, 2013).

Incorporating cloud computing in e-learning, as proposed in IASORA, will help improve the functionality and user satisfaction of web-based e-learning systems, because being on the cloud solves some of the problems encountered by independently hosted systems. A cloud-based e-learning system provides learning platforms through which students enjoy all the benefits of the cloud, encompassing active, collaborative and discovery learning in an efficient, flexible, convenient, scalable and ubiquitous manner (Lawanya & Subha, 2013).
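A minimal sketch of this hybrid placement is given below. The endpoint URLs and the particular split of services between the two clouds are illustrative assumptions; the architecture itself leaves the placement policy to the institution.

```python
# Hypothetical placement policy for the hybrid services layer: services
# unique to the institution resolve to the private cloud, commodity
# services resolve to a public-cloud provider. Names and URLs are
# illustrative only.
PRIVATE_CLOUD = {"connect", "track", "model", "screen"}
PUBLIC_CLOUD = {"mail", "lang", "platform"}


def resolve(service: str) -> str:
    """Return the endpoint at which a named service contract is published."""
    if service in PRIVATE_CLOUD:
        return f"https://elearning.example.ac.ke/services/{service}.wsdl"
    if service in PUBLIC_CLOUD:
        return f"https://provider.example.com/{service}.wsdl"
    raise KeyError(f"unregistered service: {service}")


print(resolve("mail"))   # public-cloud endpoint
print(resolve("track"))  # private-cloud endpoint
```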
Mtebe (2013) found that using cloud computing in e-learning improves system availability, because cloud service providers' servers are normally up 24/7, whereas institutional servers occasionally go off due to power interruptions. Many developing countries lack high-performance computing facilities for research and teaching, especially for simulations, analysis of computational models and similar research (Mtebe, 2013). Students studying science courses such as computer science lack the tools and environments to develop, test and compile their programs and models; however, the adoption of cloud services has the potential to alleviate this problem (Mtebe, 2013). For example, computer science students can use Google App Engine, Amazon Hadoop or a similar PaaS to develop, test and run their programs (Truong, Pham, Thoai, & Dustdar, 2012). The University of Pretoria, for instance, uses the Virtual Computing Lab (VCL) from IBM to enable students to access and use next-generation medical research facilities to test the development of drugs expected to cure serious illnesses unique to Africa (Mtebe, 2013).

Cloud computing will improve the accessibility of web-based e-learning systems and make it easy to integrate mobile learning into them. Cloud computing is flexible and easily adaptable: it enables students to connect to university educational services from anywhere using their mobile devices (Youssef, 2012), gives students and staff access to highly available, up-to-date software and hardware, and improves collaboration among them (Singh & Hemalatha, 2012), while also increasing functional capability, offline usage with subsequent synchronization, application and computing performance, and mobility for staff and students (Murtaza, 2013).

5.4.4 Application layer
This layer consists of the functional modules through which the web-based e-learning system meets the needs and requirements of users. Modules that are new to the web-based e-learning system are shown in the application layer of Figure 5.1 with bold borders; the rest are found in SORAPES and are discussed in Palanivel and Kuppuswami (2011). The device detect module automatically detects, at runtime, the device a user is using to access the web-based e-learning system. It uses browser agents to detect the screen size of the device and thereby determine whether it is a computer, tablet or mobile phone.
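The decision rule of the device detect module might be sketched as follows. The keyword checks and the width thresholds are illustrative assumptions, not values prescribed by the architecture; a real deployment would inspect the User-Agent header reported by the browser agent.

```python
# Illustrative sketch of the device detect module's decision rule.
# Keyword lists and the phone/tablet width thresholds are assumptions.
def detect_device(user_agent: str, screen_width: int) -> str:
    """Classify the client as a phone, tablet or computer."""
    ua = user_agent.lower()
    if "mobile" in ua or screen_width < 480:
        return "phone"
    if "tablet" in ua or "ipad" in ua or screen_width < 900:
        return "tablet"
    return "computer"


print(detect_device("Mozilla/5.0 (Linux; Android) Mobile Safari", 390))  # phone
print(detect_device("Mozilla/5.0 (Windows NT 10.0)", 1920))              # computer
```

The returned class is what would trigger the optimize service to reshape the content for the detected device.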
The mobile learning module contains tools to optimize and customize content for users who access the system from a mobile phone, or to redirect such users to a mobile version of the content being accessed. The learner model module captures the learner model, including each student's learning style and preferences; it also tracks the learning process for each user and provides information for designing assessments and suggesting improvements for the learner. The grouping module organizes students into groups for discussions or projects and helps manage these groups; using it, students can hold discussions or carry out projects virtually, which improves student-student collaboration. The publishing tools module provides users with the relevant tools for creating and publishing all types of content; these tools may be incorporated into the system or accessed from a public cloud as a PaaS. Most web-based e-learning systems support the development of textual content, but tools for developing content such as SCORM packages, video and audio are not incorporated. The language module customizes content to the language a user prefers by translating textual content into the user's target language. The inter-connect module helps administrators link the web-based e-learning system to other systems within or outside the institution to improve system interoperability. The system can be linked to other e-learning systems, public cloud services, the student management system, library system, finance system or examination management system for the efficient, effective and smooth realization of institutional policies. For instance, if the finance system could inter-operate and share information, a distance-learning student could be barred from accessing content on the web-based e-learning system until a required proportion of fees had been paid, and information from the student management system could be used to prohibit access for students who have not registered. The screening module screens and tracks all student posts in the system to prevent the distribution of unnecessary material.

5.4.5 Storage layer
The storage layer consists of the data access and data storage modules of the web-based e-learning system.
Basically, this layer is made up of databases, file repositories and cache servers. The database server maintains student records, such as learning styles and login credentials, among other details, and synchronizes with the mobile database. A mobile database is a database that is portable and physically separate from the corporate database server but capable of communicating with that server from remote sites (Connolly & Begg, 2005). The mobile database collaborates with the mobile learning module to enable students and staff to access the system from anywhere. The file directory stores the documents used in the web-based e-learning system, such as SCORM files and attachments. The cache server stores the information learners have accessed most recently and serves it if a student accesses it again later; this minimizes interaction with the main servers, which can substantially improve system response and turnaround time. The replication server is responsible for database distribution and for synchronizing the offline servers with the live server. Database distribution and replication improve the availability, reliability and performance of a system (Connolly & Begg, 2005).

5.5 Evaluation and validation of the developed architecture
This section presents the evaluation, validation and verification of the integrated adaptive service-oriented reference architecture developed in section 4.3.4. A software architecture review is an activity that evaluates an architecture against its quality goals: architecture evaluation aims to assess and validate the software architecture using systematic methods or procedures, with the objective of ensuring that the architecture in question satisfies one or more quality goals. Validation is a set of activities that seek to ensure that the right product has been built (Pressman, 2005). Validation in this study was done to check whether the developed architecture will improve the functionality and user satisfaction of web-based e-learning systems. Two methods were used to validate IASORA.
The first method was expert analysis: twenty experts were asked to analyze the architecture and give their opinion on the extent to which they felt it would improve the functionality of, and satisfaction with, web-based e-learning systems. Questionnaires and the developed architecture were distributed to these experts, and their opinions were analyzed and discussed. The second method was the building of a prototype that runs on the architecture; the prototype was also analyzed by the experts.
5.5.1 Demographic distribution of evaluators
A total of 20 questionnaires were returned from the 25 distributed to content developers, lecturers and ICT officers. Figure 5.2 shows the distribution of the respondents by age and gender.
[Bar chart: frequency of respondents by age group (20-30, 30-40, 40-50), male and female]
Figure 5.2: Distribution of respondents by age and gender
Source: Research data (2014)

As shown in Figure 5.2, of the 9 (45.0%) respondents aged between 20 and 30 years, 4 (20.0%) were male and 5 (25.0%) were female. Those aged between 30 and 40 years numbered 8 (40.0%), of whom 2 (10.0%) were female and 6 (30.0%) were male. There were only 3 (15.0%) respondents, all male, aged between 40 and 50 years. Figure 5.3 shows the occupation and academic qualification of the respondents. There were 3 (15.0%) ICT officers, of whom 2 (10.0%) held diplomas and 1 (5.0%) held a degree. Twelve (60.0%) lecturers were involved in the architecture validation, 9 (45.0%) holding degrees and 3 (15.0%) holding PhDs.
[Bar chart: frequency of respondents by occupation (ICT Officer, Lecturer, Content Developer) and qualification (Diploma, Degree, Masters, PhD)]
Figure 5.3: Academic qualification and profession of respondents
Source: Research data (2014)

Five (25.0%) content developers took part in the validation, 3 (15.0%) holding degrees and 2 (10.0%) holding diplomas.

5.5.2 Response on validation constructs
Validators were asked to state their level of agreement with various ways in which the architecture will improve the functionality of web-based e-learning systems. Table 5.1 shows the responses obtained from the validation of the architecture. Fifteen (75.0%) agreed that the architecture will improve web-based e-learning system interactivity, 4 (20.0%) disagreed with that assertion and 1 (5.0%) remained undecided. On whether the architecture will improve system response, 13 (65.0%) agreed, 2 (10.0%) disagreed and 5 (25.0%) were undecided. Asked whether the incorporation of web services in the architecture will improve the effectiveness of web-based e-learning systems, 17 (85.0%) agreed and 3 (15.0%) were neutral. Fifteen (75.0%) agreed that incorporating web services will improve system suitability, 3 (15.0%) disagreed and 2 (10.0%) were undecided. That the architecture will improve system inter-operability received a nod from 13 (65.0%) experts, while 7 (35.0%) thought otherwise.
Table 5.1: Validators' opinion on improved functionality
(SD = Strongly Disagree, D = Disagree, N = Neutral, A = Agree, SA = Strongly Agree)

Architecture will improve system interactivity: SD 1 (5.0%), D 3 (15.0%), N 1 (5.0%), A 13 (65.0%), SA 2 (10.0%)
Architecture will improve system response: SD 0 (0.0%), D 2 (10.0%), N 5 (25.0%), A 5 (25.0%), SA 8 (40.0%)
Web services will improve system effectiveness: SD 0 (0.0%), D 0 (0.0%), N 3 (15.0%), A 8 (40.0%), SA 9 (45.0%)
Web services will improve system suitability: SD 1 (5.0%), D 2 (10.0%), N 2 (10.0%), A 13 (65.0%), SA 2 (10.0%)
Architecture will improve system inter-operability: SD 2 (10.0%), D 5 (25.0%), N 0 (0.0%), A 9 (45.0%), SA 4 (20.0%)
Architecture will improve security of the system: SD 0 (0.0%), D 2 (10.0%), N 6 (30.0%), A 4 (20.0%), SA 8 (40.0%)
Architecture will improve system usefulness in learning: SD 3 (15.0%), D 3 (15.0%), N 2 (10.0%), A 9 (45.0%), SA 3 (15.0%)
Architecture will make the system more ubiquitous: SD 1 (5.0%), D 5 (25.0%), N 3 (15.0%), A 8 (40.0%), SA 3 (15.0%)
Architecture will enable learners to access content using any device: SD 1 (5.0%), D 3 (15.0%), N 4 (20.0%), A 9 (45.0%), SA 3 (15.0%)
Architecture will make the system address various learning styles: SD 2 (10.0%), D 4 (20.0%), N 3 (15.0%), A 8 (40.0%), SA 3 (15.0%)
Source: Research data (2013)

Results show that 12 (60.0%) agreed that the developed architecture will improve system usefulness in learning, 6 (30.0%) disagreed and 2 (10.0%) were undecided.
Eleven (55.0%) agreed that the architecture will make the web-based e-learning system more ubiquitous, 6 (30.0%) disagreed and 3 (15.0%) were undecided. That the architecture will enable learners to access content using any device received affirmation from 12 (60.0%), while 4 (20.0%) disagreed and 4 (20.0%) were undecided. Eleven (55.0%) agreed that the architecture will meet the needs of students with varied learning styles, 6 (30.0%) disagreed and 3 (15.0%) were undecided.

5.5.3 Inferential analysis of validation data
Table 5.2 shows Pearson's correlation coefficients (r), with the respective significance values (p) in brackets, for experts' opinions on how the architecture will improve web-based e-learning system functionality and satisfaction. The results are significant at the 0.10 significance level. As shown in Table 5.2, the view that the architecture will improve system response had a significant relationship with age, r(20) = 0.458, p = 0.042, and a significant negative relationship with gender, r(20) = -0.496, p = 0.034. The view that incorporating web services into the web-based e-learning system architecture will improve system effectiveness had a significant positive relationship with gender, r(20) = 0.426, p = 0.061, and with period of system use, r(20) = 0.802, p = 0.001. That web services will improve system suitability had a significant positive relationship with occupation, r(20) = 0.557, p = 0.011, and a significant negative relationship with period of system use, r(20) = -0.537, p = 0.015. Results show that the view that the architecture will improve system inter-operability had a significant positive relationship with age, r(20) = 0.658, p = 0.002, and a significant negative relationship with gender, r(20) = -0.381, p = 0.097. That the architecture will improve security of the system had a significant negative relationship with age, r(20) = -0.778, p = 0.001, and a significant positive relationship with gender, r(20) = 0.572, p = 0.008.
The view that the architecture will improve the web-based e-learning system's usefulness in learning had a significant negative relationship with gender, r(20) = -0.569, p = 0.009, and a significant positive relationship with occupation, r(20) = 0.453, p = 0.045. That the architecture will make the system more ubiquitous had a significant relationship with occupation, r(20) = 0.380, p = 0.097, and a significant negative relationship with gender, r(20) = -0.587, p = 0.007, and that the architecture will make the system address various learning styles had a significant relationship with occupation, r(20) = 0.417, p = 0.067.
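The coefficients reported above are Pearson's product-moment correlations. A minimal pure-Python implementation, applied here to illustrative data rather than the study's raw responses, is:

```python
# Pearson's r from first principles. The sample data below is invented
# purely to illustrate the computation; it is not the study's data.
from math import sqrt


def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)


# Example: agreement scores (1 = strongly disagree .. 5 = strongly agree)
# against an age group coded numerically, for five hypothetical validators.
age_group = [1, 1, 2, 2, 3]
agreement = [2, 3, 3, 4, 5]
print(round(pearson_r(age_group, agreement), 3))  # -> 0.891
```

The significance values (p) reported alongside r in Table 5.2 would additionally require a t-test on r with n - 2 degrees of freedom.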
Table 5.2: Pearson's correlation for validation factors
(cells show r with p in brackets; columns: Age, Gender, Occupation, Level of study, Period of system use)

Architecture will improve system interactivity: Age -0.027 (0.909); Gender -0.123 (0.604); Occupation 0.220 (0.352); Level of study 0.000 (1.000); Period of use -0.355 (0.125)
Architecture will improve system response: Age 0.458* (0.042); Gender -0.496* (0.034); Occupation -0.227 (0.336); Level of study -0.191 (0.419); Period of use -0.044 (0.853)
Web services will improve system effectiveness: Age 0.176 (0.457); Gender 0.426 (0.061); Occupation -0.179 (0.449); Level of study 0.211 (0.372); Period of use 0.802** (0.001)
Web services will improve system suitability: Age 0.065 (0.784); Gender -0.169 (0.477); Occupation 0.557* (0.011); Level of study 0.266 (0.257); Period of use -0.537* (0.015)
Architecture will improve system inter-operability: Age 0.658** (0.002); Gender -0.381 (0.097); Occupation 0.194 (0.412); Level of study -0.320 (0.169); Period of use -0.183 (0.440)
Architecture will improve security of the system: Age -0.778** (0.001); Gender 0.572** (0.008); Occupation -0.138 (0.562); Level of study 0.144 (0.544); Period of use 0.106 (0.657)
Architecture will improve system usefulness in learning: Age 0.203 (0.390); Gender -0.569** (0.009); Occupation 0.453* (0.045); Level of study 0.115 (0.628); Period of use 0.330 (0.115)
Architecture will make the system more ubiquitous: Age 0.371 (0.108); Gender -0.587** (0.007); Occupation 0.380 (0.097); Level of study -0.118 (0.621); Period of use -0.161 (0.497)
Architecture will enable learners to access content using any device: Age 0.196 (0.408); Gender 0.147 (0.537); Occupation 0.224 (0.342); Level of study 0.253 (0.282); Period of use 0.141 (0.554)
Architecture will make the system address various learning styles: Age 0.160 (0.502); Gender 0.162 (0.495); Occupation 0.417 (0.067); Level of study 0.172 (0.469); Period of use -0.106 (0.655)

* Correlation is significant at the 0.05 level (2-tailed)
** Correlation is significant at the 0.01 level (2-tailed)
Source: Research data (2013)
5.5.4 Conclusion
This section presents the conclusions drawn from the validation of the integrated adaptive service-oriented reference architecture. From the Pearson's correlations presented in section 5.5.3, the following conclusions can be made: the architecture will improve system response, and incorporating web services into the architecture will improve system effectiveness and system suitability. The architecture will improve system inter-operability, system security and system usefulness. It can also be concluded that the architecture will make the web-based e-learning system more ubiquitous and enable it to address students' various learning styles, and consequently improve system functionality and satisfaction.
CHAPTER SIX
SUMMARY, CONCLUSION AND RECOMMENDATIONS

6.1 Introduction
This chapter presents the summary of the findings of the study, the conclusions of the study, and recommendations for areas that need further research.

6.2 Summary of chapters
This section presents a summary of the chapters in this thesis. Chapter one laid down the foundation of the study by examining the background to the study and identifying the problem area; it presented the study objectives and the conceptual framework. Chapter two presented a critical analysis of previous literature in the study area, and chapter three presented the methodology followed to achieve the study objectives. Chapter four presented the data obtained in the study and its analysis, and the architecture was developed and validated in chapter five. This chapter presents the summary of findings, the conclusions drawn from them, and recommendations for further research.

6.3 Summary of findings
The purpose of this study was to develop a service-oriented reference architecture that will improve the functionality of web-based e-learning systems. The summary of the findings is presented in line with the objectives of the study.

6.3.1 Analysis of existing e-learning system architectures
The existing web-based e-learning system architectures were analysed and the findings presented in section 4.3.1. The findings showed that existing web-based e-learning systems are built on client-server or peer-to-peer architectures. These architectures have inherent limitations that affect system functionality and user satisfaction: they are limited in scalability, availability, distribution of computing power and storage, and the sharing of information between users. The system could not be accessed from anywhere with any device, it is accessible only while connected to the Internet, and the content in the system could not adapt to the preferences of the user. The study found that most lecturers felt the system does not support all student learning styles, with most of them suggesting that the system should be organized so as to cater for students with varied computer skills, learning styles and preferences, and speeds of learning; adapt to user preferences and capabilities; and allow users to be in control of their own learning process. The majority of respondents used desktop computers or laptops to access the system, very few used tablets and none used phones. The majority of respondents felt that the architecture of the web-based e-learning system needs to be improved.

6.3.2 Functionality of web-based e-learning systems
Findings show that the web-based e-learning system is partially non-functional, as most responses suggested that the system does not meet users' needs and expectations. Most respondents were not content with system interactivity, response, accuracy, suitability and inter-operability. The majority of students and lecturers were comfortable with the security features of the web-based e-learning system. However, most lecturers and students felt that the system is not very useful in their teaching and learning processes respectively, and that it is not easy to use. System suitability and accuracy were adversely affected by the lack of appropriate content delivery tools in the system, the lack of mechanisms for organizing students into groups for discussions or projects, and poor content presentation. System response was affected by poor ICT infrastructure in the institution and low network access speeds, while most respondents said that they cannot import or export content to and from the system.
Most users said that they would not be confident using the system if they had never used it before or if there were no one to show them, and they agreed that they are comfortable using the system so long as they have ample time to complete the task at hand.

6.3.3 Factors affecting e-learning system user satisfaction
The study sought to establish the factors that affect web-based e-learning system user satisfaction. The findings presented in section 4.3.3 show that a number of factors affect user satisfaction, among them the lack of required tools in the system; slow and inconsistent system response; system availability; perceived ease of use; inappropriate, inaccurate, poorly sequenced and poorly organized content published in the system; poor ICT infrastructure in the learning institution; and the type of content published in the web-based e-learning system. Very few respondents were satisfied with system interactivity, inter-operability, response, accuracy, suitability, perceived ease of use and system effectiveness. The study also found that web-based e-learning system satisfaction varies across demographic and other factors, with male respondents being more satisfied than their female counterparts (see section 4.3.3.5 and appendix 5).

6.3.4 Integrated adaptive service-oriented reference architecture
The study developed an integrated adaptive service-oriented reference architecture (IASORA) that seeks to improve the functionality of web-based e-learning systems and consequently improve user satisfaction. The architecture improves system availability by distributing the system across the network, with mobile learning and a mobile database incorporated to enable ubiquitous access to the system from a device of choice. The architecture uses web services to model the system's various functions, which improves system inter-operability. The service layer of the architecture consists of both a private cloud and a public cloud: users can access software applications and other services not found in the institution over the public cloud, while services unique to the institution are kept in the private cloud, and this improves system suitability. The architecture introduces modules that model students according to learning style and a content-optimizer module that adapts content to user preferences.

6.4 Recommendations
The study recommends implementation of the integrated adaptive service-oriented reference architecture, which will help improve the functionality and user satisfaction of web-based e-learning systems.
The study also recommends: improvement of ICT infrastructure in learning institutions so as to increase network access speeds; configuring the web-based e-learning system so that it can be accessed over the institution's LAN without going through the Internet; user training in order to improve system usability; and publishing appropriate, accurate and well-organized content in the system.

6.5 Suggestions for further research
This study focused on the MOODLE web-based e-learning system. It suggests that further research is required to assess the functionality and user satisfaction of other web-based e-learning systems such as ATutor, WebCT, Blackboard and Sakai. A comparative study is also suggested to establish differences in functionality and user satisfaction across various web-based e-learning systems. The study further suggests research on adaptive learning and on how web-based e-learning, mobile learning, collaborative learning and adaptive learning can be incorporated into a single effective e-learning system.
REFERENCES Abdalla, I. (2007). Evaluating Effectiveness of E-Blackboard System Using TAM Framework: A structural analysis approach. AACE Journal, 15(3), 279-289. Abran, A., Khelifi, A., Suryn, W., & Seffah, A. (2003). Usability Meanings and Interpretations in ISO Standards. Software Quality Journal, 11(4), 325-338. ACM. (2010). ISO/IEC 9126 Software Product Evaluation - Quality Characteristics and Guidelines for the User. Aggrawal, A., & Makkonen, p. (2009). Critical Success Factors for Successful Globalised E-learning. International Journal of Innovation and learning, 6(1), 92110. Allen, M. W. (2003). Guide to e-Learning: Building Interactive, Fun and Effective Learning Program for any Company. New Jersey: John Wiley and Sons. Allen, M. W. (2005). Guide to e-Learning: Building Interactive, Fun and Effective Learning Program for any Company. New Jersey: John Wiley and Sons. Allen, M. W. (2007). Designing Successful ELearning. San Francisco: Pfeiffer. Alshwaier, A., Youssef, A., & Emam, A. (2012). A New Trend for E-learning in KSA Using Educational Clouds. Advanced Computing: An International Journal, 3(1), 81-97. American Psychological Association. (2010). Publication manual of the American Psychological Association (6th ed.). Washington DC: American Psychological Association. Aminul, I., Noor, A. R., Liang, T. C., & Momtaz, H. (2011, February). Effect of Demographic Factors on E-Learning Effectiveness in A Higher Learning Institution in Malaysia. International Education Studies, 4(1), 112 - 121. Anaraki, F. (2004). Developing an Effective and Efficient eLearning Platform. International Journal of the Computer, the Internet and Management, 12(2), 5763. Anderson, E. W., & Sullivan, M. W. (1993). The Antecedents and Consequences of Customer Satisfaction for Firms. Marketing Sciences, 12(2), 125-143. Aroyo, L., & Dicheva, D. (2004). The New Challenges for E-learning: The Educational Semantic Web. Educational Technology & Society, 7(4), 59-69. Athanasios, D., & K., I. 
D. (2006). Personalized e-Learning Implementation - The GIS Case. International Journal of Computers, Communications & Control, 1(1), 59-67. 122
Babar, M. A., & Chauhan, M. A. (2011). A tale of migration to cloud computing for sharing experiences and observations. Proceedings of the 2nd International Workshop on Software Engineering for Cloud Computing (p. 50). New York, USA: ACM Press.
Bailey, J. E., & Pearson, S. W. (1983). Development of a tool for measuring and analyzing computer user satisfaction. Management Science, 29(5), 530-545.
Bansal, S., Singh, S., & Kumar, A. (2012). Use of Cloud Computing in Academic Institutions. International Journal of Computing, Science and Technology, 3(1), 427-429.
Bass, L., Clements, P., & Kazman, R. (2003). Software Architecture in Practice (2nd ed.). New York: Addison Wesley.
Bates, A. W., & Poole, G. (2003). Introductory remarks on knowledge, learning and teaching. In Effective Teaching with Technology in Higher Education. San Francisco: Jossey-Bass.
Begam, F. M., & Ganapathy, G. (2013, February). Adaptive Learning Management System Using Semantic Web Technologies. International Journal on Soft Computing, 4(1), 1-8.
Bharatia, P., & Chaudhury, A. (2004). An empirical investigation of decision-making satisfaction in web-based decision support systems. Decision Support Systems, 37(2), 187-197.
Bhattacherjee, A. (2001). Understanding information systems continuance: An expectation-confirmation model. MIS Quarterly, 25(3), 351-370.
Bieberstein, N., Bose, S., Fiammante, M., Jones, K., & Shah, R. (2006). Service-Oriented Architecture Compass: Business Value, Planning, and Enterprise Roadmap. Upper Saddle River: Pearson Education.
Bosworth, A. (2001). Developing Web services. In Proceedings of the 17th International Conference on Data Engineering (pp. 477-480).
Buabeng-Andoh, C. (2012). Factors influencing teachers' adoption and integration of information and communication technology into teaching: A review of the literature. International Journal of Education and Development using Information and Communication Technology, 8(1), 136-155.
Burton, L. J., & Mazerolle, S. M. (2011). Survey instrument validity part I: Principles of survey instrument development and validation in athletic training education research. Athletic Training Education Journal, 6(1), 27-35.
Cadle, J., & Yeates, D. (2004). Project Management for Information Systems (4th ed.). New York: Prentice Hall.
Carlin, S., & Curran, K. (2012, June). Cloud Computing Technologies. International Journal of Cloud Computing and Services Science, 1(2), 59-65.
Chiu, C. M., Hsu, M. H., Sun, S. Y., Lin, T. C., & Sun, P. C. (2005). Usability, quality, value and e-learning continuance decisions. Computers & Education, 45, 399-416.
Clark, R., & Mayer, R. (2008). e-Learning and the Science of Instruction: Proven Guidelines for Consumers and Designers of Multimedia Learning. San Francisco: Pfeiffer.
Cochran, W. G. (1963). Sampling Techniques (2nd ed.). New York: John Wiley and Sons, Inc.
Collis, B., & Strijker, A. (2002). New Pedagogies and Re-usable Learning Objects: Toward a Different Role for an LMS. ED-MEDIA 2002 World Conference on Educational Multimedia, Hypermedia & Telecommunications. Association for the Advancement of Computing in Education (AACE).
Connolly, T., & Begg, C. (2005). Database Systems: A Practical Approach to Design, Implementation and Management. Edinburgh Gate, England: Pearson Education Limited.
Cooze, M., & Barbour, M. (2007). Learning Styles: A Focus upon E-Learning Practices and their Implications for Successful Instructional Design. Journal of Applied Educational Technology, 4(1), 7-20.
Courts, B., & Tucker, J. (2012). Using Technology to Create a Dynamic Classroom Experience. Journal of College Teaching & Learning, 2, 121-128.
Creative Research Systems. (2013, July 17). Sample Size Calculator. Retrieved from Creative Research Systems: http://www.surveysystem.com/sscalc.htm
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319-340.
DeLone, W. H., & McLean, E. R. (1992). Information systems success: The quest for the dependent variable. Information Systems Research, 3(1), 60-95.
DeLone, W. H., & McLean, E. R. (2003). The DeLone and McLean model of information systems success: A ten-year update. Journal of Management Information Systems, 19(4), 9-30.
DeLone, W. H., & McLean, E. R. (2004). Measuring e-commerce success: Applying the DeLone & McLean Information Systems Success Model. International Journal of Electronic Commerce, 9(1), 31-47.
Denscombe, M. (2007). The Good Research Guide: For Small Scale Social Research Projects (3rd ed.). New York, U.S.A: McGraw-Hill.
Digolo, B. A., Andang'o, E. A., & Katuli, J. (2011). E-Learning as a Strategy for Enhancing Access to Music Education. International Journal of Business and Social Science, 2(11), 1135-1139.
Donello, J. (2002). Theory & practice: Learning content management systems. Retrieved January 9, 2013, from http://www.elearningmag.com
Dutton, J., Dutton, M., & Perry, J. (2002). How do online students differ from lecture students? Journal of Asynchronous Learning Networks, 6(1), 1-20.
Eeles, P. (2006, February 15). What is a software architecture? Retrieved June 28, 2013, from IBM: http://www.ibm.com/developerworks/rational/library/feb06/eeles/
Engelbrecht, E. (2003). A Look at E-learning Models: Investigating their Value for Developing an E-learning Strategy. Progression, 25(2), 35-47.
Erl, T. (2005). Service-Oriented Architecture: Concepts, Technology and Design. New York: Prentice Hall.
Falla, J. (2001). Coping with the content challenge: Definitions. Retrieved January 10, 2013, from Content Management: http://www.advisor.com/Articles.nsf/aid/FALLJ145
Fernando, A. (2005). An instructional model for web-based e-learning education with a blended learning process approach. British Educational Communications and Technology, 36(2), 217-235.
Figueira, E. (2003). Evaluating the effectiveness of e-Learning strategies for SMEs. Retrieved March 4, 2013, from http://www.theknownet.com/ict_smes_seminars/papers/Figueira.html
Fisher, S., Lant, J., Stogeckel, E., & Townsend, J. (1998). Handbook for Family Planning Operation Design (2nd ed.). Washington DC: Oxford University Press.
Fressen, J. (2007). A Taxonomy to Promote Quality Web-Supported Learning. International Journal on E-learning, 6(3), 351-362.
Fullerton, G., & Taylor, S. (2002). Mediating, interactive, and non-linear effects in service quality and satisfaction with services research. Canadian Journal of Administrative Sciences, 19(2), 124-136.
Garland, D., & Martin, B. N. (2005). Do gender and learning style play a role in how online courses should be designed? Journal of Online Interactive Learning, 4(2), 67-81.
Garrison, D., & Anderson, T. (2003). E-learning in the 21st Century. London: RoutledgeFalmer.
Habraken, J. (2008, January). Reference Architecture for e-Learning Solutions. Master's Thesis, Computer Science, Open University.
Hall, B. (2002). Six steps to developing a successful e-learning initiative: Excerpts from the e-learning guidebook. In R. Allison, The ASTD E-Learning Handbook. New York: McGraw-Hill.
Hall, B. (2005, March 5). E-learning: Building competitive advantage through people and technology. Retrieved March 12, 2013, from a special section on e-learning by Forbes Magazine: http://www.forbes.com/specialsections/elearning/
Halloran, P. (2008). The Practice of Personal Learning. Proceedings of the 30th International Conference on Information Technology Interfaces. Cavtat, Croatia.
Hameed, H., Atta, B., & Andrea, J. C. (2008). Effective E-Learning Integration with Traditional Learning in a Blended Learning Environment. European and Mediterranean Conference on Information Systems 2008 (EMCIS2008). Dubai: Al Bustan Rotana Hotel.
Hart, C. (1998). Doing a Literature Review: Releasing the Social Science Research Imagination. London: Sage.
Hase, S., & Kenyon, C. (2007). Heutagogy: A child of complexity theory. Complicity: An International Journal of Complexity and Education, 4(1), 111-118.
Horton, W., & Horton, K. (2003). E-Learning Tools and Technologies: A Consumer's Guide for Trainers, Teachers, Educators and Instructional Designers. Indianapolis: Wiley Publishing Inc.
Hsu, M. H., Chen, Y. L., & Chiu, C. M. (2003). Examining the WWW continuance: An extended expectation-confirmation model. Communications of the ICISA, 5(2), 12-25.
Hsu, P. (2012, October). Learner Characteristic Based Learning Effort Curve Mode: The Core Mechanism for Developing Personalized Adaptive E-learning Platform. Turkish Online Journal of Educational Technology, 11(4).
Huang, L. C. (2004). Influential Factors Impacting on the Implementation of E-learning: A Perspective from Technological Education. World Transactions on Engineering and Technology Education, 3(1), 47-52.
IBM. (2012). Developing Web Services Applications. IBM Corporation.
IEEE. (2000). IEEE Recommended Practice for Architectural Description of Software-Intensive Systems: IEEE Std 1471-2000. IEEE Computer Society.
Iowa State University. (2013). e-Learner. Retrieved May 8, 2013, from Iowa State University: http://www.dso.iastate.edu/asc/academic/elearner/benefits.html
Isaacs, S., & Hollow, D. (Eds.). (2012). The eLearning Africa 2012 Report. Germany: ICWE.
ISO. (1991). ISO 9126 Quality Model.
Jabr, M. A., & Al-Omari, H. K. (2010). e-Learning Management System Using Service Oriented Architecture. Journal of Computer Science, 6(3), 285-295.
Jones, C. (2001). Rules of the game. Online Learning Magazine, 5(6).
Kadirire, J. (2009). Mobile Learning DeMystified. In R. Guy (Ed.), The Evolution of Mobile Teaching and Learning. California, USA: Informing Science Press.
Kar-tin, L. (2005). E-Learning: The Quest for Effectiveness. Malaysian Online Journal of Instructional Technology, 2(2), 61-71.
Kashfi, H., & Razzazi, M. R. (2006). A distributed service-oriented e-learning environment based on grid technology. 18th National Computer Conference: Saudi Computer Society.
Kay, R. (2006). Addressing gender differences in computer ability, attitudes and use: The laptop effect. Journal of Educational Computing Research, 34(2), 187-211.
Keskin, N. O., & Metcalf, D. (2011). The Current Perspectives, Theories and Practices of Mobile Learning. The Turkish Online Journal of Educational Technology, 10(2), 202-208.
Khalifa, M., & Liu, V. (2002). Satisfaction with Internet-based services. Paper presented at the 35th Hawaii International Conference on System Sciences, Big Island, Hawaii.
Khan, B. (2005). E-learning Quick Checklist. Hershey: Information Science Publishing.
Khan, B. (2005). Learning Features in an Open, Flexible, and Distributed Environment. Association for the Advancement of Computing in Education Journal, 13(2), 137-153.
Khmelevsky, Y., & Voytenko, V. (2010). Cloud Computing Infrastructure Prototype for University Education and Research. Proceedings of the 15th Western Canadian Conference on Computing Education.
Kiget, K. N. (2012). Usability of E-learning Systems in Kenyan Universities: A Case of Kenyatta University. Master of Science Thesis, Masinde Muliro University of Science and Technology, 5.
Kiilu, R. (2012). An E-Learning Approach to Secondary School Education: E-Readiness Implications in Kenya. Journal of Education and Practice, 3(16), 142-148.
Knowles, M. (1975). Self-Directed Learning: A Guide for Learners and Teachers. United States of America: Cambridge Adult Education.
Kombo, D. K. (2006). Proposal and Thesis Writing. Nairobi: Paulines.
Kombo, D. K., & Orodho. (2003). Research Methods. Nairobi: Kenyatta University Institute of Open Learning.
Kothari, C. R. (2004). Research Methodology: Methods and Techniques. New Delhi, India: New Age International Ltd.
Kreger, H. (2001). Web Services Conceptual Architecture (WSCA 1.0). IBM Software Group.
Kruchten, P. (2003). The Rational Unified Process: An Introduction (3rd ed.). New York: Addison-Wesley Professional.
Kumaran, S. V., & Sankar, A. (2013). Recommendation System for Adaptive E-learning Using Semantic Net. International Journal of Computer Applications, 63(7), 19-24.
Lai, T. L. (2004). Service quality and perceived value's impact on satisfaction, intention and usage of Short Message Service (SMS). Information Systems Frontiers, 6(4), 353-368.
Lawanya, S. M., & Subha, S. (2013, July). An implementation of e-learning system in a private cloud. International Journal of Engineering and Technology, 5(3), 3036-3040.
Lee, S. (2011). Examining the Relationship Among Student Perception of Support, Course Satisfaction and Learning Outcomes in Online Learning. Internet and Higher Education, 14, 158-163.
Liu, Y., Lavelle, E., & Andris, J. (2002). Experimental Effects of Online Instruction on Locus of Control. United States Distance Learning Association Journal, 16(6).
Loh, B. H. (2007). The Antecedents and Outcomes of E-Learning Effectiveness in the Manufacturing Industry. Master's thesis, School of Management, Universiti Sains Malaysia, Penang.
Macleod, H. (2005). What role can educational multimedia play in narrowing the digital divide? International Journal of Education and Development using ICT, 1(4), 42-53.
Mahajan, R., Sodhi, J. S., & Mahajan, V. (2012, August). Mining User Access Patterns Efficiently for Adaptive E-learning Environment. International Journal of e-Education, e-Business, e-Management and e-Learning, 2(4), 277-279.
Mahnane, L., Laskri, T. M., & Trigano, P. (2013, May). A Model of Adaptive E-learning Hypermedia System based on Thinking and Learning Styles. International Journal of Multimedia and Ubiquitous Engineering, 8(3), 339-350.
Marie, J., & Miguel, A. (2009). SOA initiatives for eLearning: A Moodle case. International Conference on Advanced Information Networking and Applications Workshops (pp. 1-6). Barcelona.
Masud, A. H., & Xiaodi, H. (2012). An E-learning System Architecture based on Cloud Computing. World Academy of Science, Engineering and Technology, 62, 74-78.
Mbarek, R., & Zaddem, F. (2013, April). The examination of factors affecting e-learning effectiveness. International Journal of Innovation and Applied Studies, 2(4), 423-435.
McGill, T., Hobbs, V., & Globa, J. (2003). User-developed applications and information systems success: A test of DeLone and McLean's model. Information Resources Management Journal, 16(1), 24-45.
McGovern, J. (2004). A Practical Guide to Enterprise Architecture. New York: Prentice Hall.
McPherson, M., & Nunes, J. (2008). Critical Issues for E-learning Delivery: What may seem obvious is not always put into practice. Journal of Computer Assisted Learning, 24(5), 433-445.
Mell, P., & Grance, T. (2011). The NIST Definition of Cloud Computing (Draft). New York: U.S. Department of Commerce.
Miles, M. B., & Huberman, M. A. (1994). Qualitative Data Analysis: An Expanded Sourcebook (2nd ed.). Thousand Oaks, CA: Sage.
Mohammed, A., & Hussein, K. A. (2010). Design and Implementation of E-Learning Management System Using Service Oriented Architecture. World Academy of Science, Engineering and Technology.
Mokhtar, S. A., Ali, S. H., Al-Sharafi, A., & Aborujilah, A. (2013). Cloud computing in academic institutions. Proceedings of the 7th International Conference on Ubiquitous Information Management and Communication - ICUIMC (pp. 1-7). New York: ACM Press.
Moore, M. G., & Kearsley, G. (2012). Distance Education: A Systems View of Online Learning (3rd ed.). Belmont, CA: Wadsworth.
Mouton, J. (1996). Understanding Social Research. Pretoria: Van Schaik.
Mtebe, S. J. (2013, August). Exploring the Potential of Clouds to Facilitate the Adoption of Blended Learning in Tanzania. International Journal of Education and Research, 1-16.
Mugenda, O. M., & Mugenda, A. G. (1999). Research Methods: Quantitative & Qualitative Approaches. Nairobi, Kenya: Acts Press.
Mugenda, O. M., & Mugenda, G. A. (2003). Research Methods: Quantitative and Qualitative Approaches. Nairobi, Kenya: African Center for Technology Studies (ACTS) Press.
Munirah, R., Issham, I., Azidah, A. Z., & Hanysah, B. (2012). The Effectiveness of Learning Materials and Activities in an e-Learning Portal. Malaysian Journal of Distance Education, 14(1), 17-24.
Murtaza, R. S. (2013, May). A Unified Cloud Computing Model towards Developing 'E-learning as a Service' based Education System. International Journal of Computer Applications, 72(7), 38-51.
Negash, S., Ryan, T., & Igbaria, M. (2003). Quality and effectiveness in Web-based customer support systems. Information & Management, 40(8), 757-768.
Newcomer, E. (2002). Understanding Web Services: XML, WSDL, SOAP, and UDDI. London: Addison-Wesley.
Nichani, M. (2001). LCMS = LMS + CMS [RLOs] - How does this affect the learner? The instructional designer? Retrieved January 11, 2013, from http://www.elearningpost.com/elthemes/lcms.asp
Nielsen, J. (1993). Usability Engineering. London, UK: Academic Press.
Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric Theory (3rd ed.). New York, U.S.A: McGraw-Hill.
OASIS. (2002). UDDI Version 3.0.2. Retrieved October 12, 2013, from UDDI Spec Technical Committee Draft: http://oasis-open.org/pubs/uddiv3.html
OASIS Open. (2012). Reference Architecture Foundation for Service Oriented Architecture. Retrieved August 12, 2013, from OASIS Open: http://docs.oasis-open.org/soa-rm/soa-ra/v1.0/soa-ra.html
Object Management Group Inc. (2003). UML 2.0 Infrastructure Specification: Document number 03-09-15.
Oliver, R. L. (1980). A cognitive model for the antecedents and consequences of satisfaction. Journal of Marketing Research, 17(4), 460-469.
Oliver, R. L. (1993). A conceptual model of service quality and service satisfaction: Compatible goals, different concepts. In T. A. Swartz, D. E. Bowen, & S. W. Brown (Eds.), Advances in Services Marketing and Management: Research and Practice. Greenwich: JAI Press.
Omondi, O. O. (2009). Comparative Study of the E-learning Platforms Used in Kenyan Universities: Case Study of Jomo Kenyatta University of Agriculture and Technology and United States International University. Masters in Information Technology Thesis.
Omwenga, E. I., Waema, T., & Wagenda, P. (2004). A model for introducing and implementing e-learning for delivery of educational content within the African context. African Journal of Science & Technology and Engineering Series, 5(1), 1-10.
Ozkan, S., & Koseler, R. (2009). Multi-dimensional students' evaluation of e-learning systems in the higher education context: An empirical investigation. Computers & Education, 53(4), 1285-1296.
Padayachee, I., Kotze, P., & A., M. (2009). ISO 9126 external systems quality characteristics, sub-characteristics and domain-specific criteria for evaluating e-Learning systems.
Palanivel, K., & Kuppuswami, S. (2010). Service Versioning Model for Personalized E-Learning System. International Journal of Engineering Sciences & Technology, 2(10), 5583-5593.
Palanivel, K., & Kuppuswami, S. (2011). Service-Oriented Reference Architecture for Personalized E-learning Systems (SORAPES). International Journal of Computer Applications, 24(5), 35-44.
Parker, M. (2003). Technology-enhanced e-learning: Perceptions of first year information systems students at the Cape Technikon. Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on Enablement through Technology.
Pituch, K. P., & Lee, Y. (2006). The influence of system characteristics on e-learning use. Computers & Education, 42(2), 222-244.
Pressman, R. S. (2005). Software Engineering: A Practitioner's Approach. New York: McGraw-Hill.
Qinghua, Z., & Dong, B. (2008). A Service-oriented Approach to Integration of E-learning Information and Resource Management Systems. IEEE.
Rai, A., Lang, S. S., & Welker, R. B. (2002). Assessing the validity of IS success models: An empirical test and theoretical analysis. Information Systems Research, 13(1), 50-69.
Ramayah, T., & Lee, C. W. (2012). System characteristics, satisfaction and e-learning usage: A Structural Equation Model (SEM). The Turkish Online Journal of Educational Technology.
Ramayah, T., Ahmad, N. H., & Hong, T. S. (2012). An Assessment of E-training Effectiveness in Multinational Companies in Malaysia. Educational Technology & Society, 15(2), 125-137.
Ramayah, T., Ahmad, N. H., & Lo, M. C. (2010). The role of quality factors in intention to continue using an e-learning system in Malaysia. Procedia - Social and Behavioral Sciences, 2(2), 5422-5426.
Ramayah, T., Lee, J. W., & Osman, M. (2010). The role of quality in e-learning satisfaction and usage among university students in Malaysia. Paper presented at the International Conference on Business and Management Education, Bangkok, Thailand.
Resmer, M. (1998, September). Internet Architectures for Learning. IEEE Computer, 31(9), 105-106.
Retalis, S., & Dougiamas, M. (2012). 1st Moodle Research Conference. Heraklion, Crete, Greece.
Richmond, A. S., & Cummings, R. (2005). Implementing Kolb's learning styles into online distance education. International Journal of Technology in Teaching and Learning, 1(1), 45-54.
Robbins, S. R. (2002). The evolution of the learning content management system. Retrieved January 9, 2013, from http://www.learningcircuits.org/2002/apr2002/robbins.html
Robbins, S., & Stylianou, A. (2003). Global corporate web sites: An empirical investigation of content and design. Information & Management, 40(3), 205-212.
Roca, J. C., Chiu, C. M., & Martinez, F. J. (2006). Understanding e-learning continuance intention: An extension of the Technology Acceptance Model. Human-Computer Studies, 64, 683-696.
Rosen, M., Lublinsky, B., Smith, K. T., & Balcer, M. J. (2008). Applied SOA: Service-Oriented Architecture and Design Strategies. Indianapolis: Wiley Publishing Inc.
Rosenberg, M. J. (2001). e-Learning: Strategies for Delivering Knowledge in the Digital Age. New York: McGraw-Hill.
Saadé, R., & Bahli, B. (2005). The impact of cognitive absorption on perceived usefulness and perceived ease of use in on-line learning: An extension of the technology acceptance model. Information & Management, 42, 317-327.
Saeed, N., Yang, Y., & Sinnappan, S. (2009). Emerging Web Technologies in Higher Education: A Case of Incorporating Blogs, Podcasts and Social Bookmarks in a Web Programming Course based on Students' Learning Styles and Technology Preferences. Educational Technology & Society, 12(4), 98-109.
Seddon, P. B. (1997). A respecification and extension of the DeLone and McLean model of IS success. Information Systems Research, 8(3), 240-253.
Selim, H. (2007). Critical Success Factors for E-learning Acceptance. Computers & Education, 49(2), 396-413.
Sharkey, U., Scott, M., & Acton, T. (2010). The influence of quality on e-commerce success: An empirical application of the DeLone and McLean IS Success Model. International Journal of E-Business Research, 6(1), 68-84.
Sharmin, F., & Sohel, A. (2014, January). Potentials of e-learning in Bangladesh: An Analysis. Banglavision, 13(1), 68-76.
Shaw, M., & Garlan, D. (1996). Software Architecture: Perspectives on an Emerging Discipline. New York: Prentice Hall.
Sherry, L. (1996). Issues in distance learning. International Journal of Educational Telecommunications, 1(4), 337-365.
Singh, A. N., & Hemalatha, M. (2012, February). Cloud Computing for Academic Environment. International Journal of Information and Communication Technology Research, 2(2), 97-101.
Smith, B. A., & Merchant, E. J. (2001). Designing an attractive web site: Variables of importance. Proceedings of the 32nd Annual Conference of the Decision Sciences Institute, San Francisco, CA.
Sommerville, I. (2011). Software Engineering (9th ed.). New York: Pearson Education, Inc.
Steen, G. (2008). The paradox of metaphor: Why we need a three-dimensional model of metaphor. Metaphor and Symbol, 23(4), 213-241.
Steen, H. L. (2008, December). Effective eLearning Design. MERLOT Journal of Online Learning and Teaching, 4(4), 526-532.
Stepanyan, K., Littlejohn, A., & Margaryan, A. (2013). Sustainable e-Learning: Toward a Coherent Body of Knowledge. Educational Technology & Society, 16(2), 91-102.
Subrat, R., & Devshri, R. (2011, April). Adaptive E-learning System: A Review. International Journal of Computer Trends and Technology, 1-4.
Swierczek, W. F., Bechter, C., & Chankiew, J. (2012). Attributes of e-learning effectiveness in a multi-cultural context: An exploration. European, Mediterranean & Middle Eastern Conference on Information Systems.
Temur, N., Yildirim, S., Kokaman, A., & Goktas, Y. (2004). What Makes a Good LMS: An Analytical Approach to Assessment of LMSs. Middle East Technical University, Department of Computer Education & Instructional Technology, Ankara.
Toili, W. W. (2007). Secondary school students' participation in environmental action: Coercion or dynamism? Eurasia Journal of Mathematics, Science and Technology Education, 3(1), 51-69.
Toprak, E. (2010). Ethics in e-learning. The Turkish Online Journal of Educational Technology, 9(2), 78-86.
Trice, A. W., & Treacy, M. E. (1988). Utilization as a dependent variable in MIS research. Data Base, 19(4), 33-41.
Truong, H., Pham, T., Thoai, N., & Dustdar, S. (2012). Cloud Computing for Education and Research in Developing Countries. Cloud Computing for Education and Research, 78-94.
Tuncay, E. (2010). Effective use of cloud computing in educational institutions. Procedia Social and Behavioral Sciences, 2, 938-942.
Vagias, W. M. (2006). Likert-type scale response anchors. Clemson International Institute for Tourism & Research Development, Department of Parks, Recreation and Tourism Management, Clemson University.
Vinoski, S. (2004). Web Services Notifications. IEEE Internet Computing, 86-90.
Vogel, O., Arnold, I., Chughtai, A., & Kehrer, T. (2011). Software Architecture: A Comprehensive Framework and Guide for Practitioners. Berlin Heidelberg: Springer-Verlag.
Volman, M., & Van Eck, E. (2001). Gender equity and information technology in education: The second decade. Review of Educational Research, 71(4), 613-634.
W3C. (2001, March 15). Web Services Description Language (WSDL) 1.1. Retrieved October 13, 2013, from W3C: http://www.w3.org/TR/2001/NOTE-wsdl-20010315
W3C. (2007, April 27). SOAP Version 1.2 Part 1: Messaging Framework (Second Edition). Retrieved October 13, 2013, from W3C Recommendation: http://www.w3.org/TR/soap12-part1/
Watson, W. R., & Watson, S. L. (2007). What are Learning Management Systems, What are They Not, and What Should They Become? TechTrends, 51(2), 28-34.
Yan, Z., Hao, H., Hobbs, L. J., & Wen, N. (2003). The Psychology of e-learning: A field of study. Educational Computing Research, 29(3), 285-296.
Yang, S. J. (2006). Context Aware Ubiquitous Learning Environments for Peer-to-Peer Collaborative Learning. Educational Technology & Society, 9(1), 188-201.
Yildirim, M., Barut, B., & Kilic, K. (2003). Teaching Supply Chain Logistics: A Global Learning Framework. Global Learning and Internet Conference, Wichita State University.
Yin, R. K. (2003). Case Study Research: Design and Methods (3rd ed.). Thousand Oaks, CA: Sage.
Youssef, A. E. (2012, July). Exploring Cloud Computing Services and Applications. Journal of Emerging Trends in Computing and Information Sciences, 3(6), 838-847.
APPENDICES
RESEARCH INSTRUMENTATION AND ADMINISTRATION

Appendix 1: Letter of introduction

Dear respondent,

RE: MASTER OF SCIENCE IN INFORMATION TECHNOLOGY RESEARCH QUESTIONNAIRE

I am a student pursuing a Master of Science degree course in Information Technology at Masinde Muliro University of Science and Technology. My research topic is "Service-Oriented Reference Architecture to Improve Functionality of Web-Based E-Learning Systems". The purpose of this letter is to kindly request you to fill in the attached questionnaire to the best of your knowledge to help me complete this academic endeavour. The information you provide will be treated with utmost confidentiality and shall be used for academic purposes only. I will collect the completed questionnaire from your departmental office, or it can be sent online to [email protected]

Your assistance is highly appreciated.

Yours sincerely,
………………………..
Raphael Angulu - SIT/G/23/11
Department of Computer Science
Masinde Muliro University of Science and Technology
Appendix 2: Questionnaire for students

SECTION A: DEMOGRAPHIC INFORMATION
A1. a) Name………………………………………………………… (Optional)
    b) Department……………………………………………………………….
    c) Institution…………………………………………………………………
A2. Gender: Male [ ] Female [ ]
A3. Level of study: Certificate [ ] Diploma [ ] Degree [ ] Masters [ ] PhD [ ]
A4. Current year of study: 1st [ ] 2nd [ ] 3rd [ ] 4th [ ] 5th [ ]
A5. Mode of Study: Full Time [ ] Part Time [ ] Distance Learning [ ]
A6. Age in years: Below 20 [ ] 20 – 25 [ ] 25 – 30 [ ] Above 30 [ ]

SECTION B: E-LEARNING SYSTEM AND INTERNET EFFICACY
B1. Have you used the Internet on your own? Yes [ ] No [ ]
B2. For how long have you used the Internet? Below 2 Years [ ] 2 – 4 Years [ ] 4 – 6 Years [ ] Above 6 Years [ ]
B3. Select the web-based e-learning systems you have used or are currently using: [ ] Wiki [ ] Moodle [ ] WebCT [ ] Blackboard [ ] Sakai
B4. For how long have you been using a web-based e-learning system? 0 – 1 Years [ ] 1 – 2 Years [ ] 2 – 3 Years [ ] More than 3 years [ ]
B5. Select the statement(s) that best describe how you prefer learning:
[ ] I prefer reading, exploring analytical models, and having time to think things through
[ ] I prefer working with others to get assignments done, to set goals, to do field work, and to test out different approaches to completing a project
[ ] I prefer experimenting with new ideas, simulations, laboratory assignments and practical applications
[ ] I prefer to work in groups, listening with an open mind to different points of view and receiving personalized feedback

SECTION C: SYSTEM INTERACTIVITY
System interactivity is the capability of the system to communicate with the user. Using the following scale, rate the extent to which you agree with the statements below.
Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree
C1. System enables interactive communication between students and lecturers. SD [ ] D [ ] N [ ] A [ ] SA [ ]
C2. System enables interactive communication among students. SD [ ] D [ ] N [ ] A [ ] SA [ ]
C3. Generally I am satisfied with system interactivity. SD [ ] D [ ] N [ ] A [ ] SA [ ]
SECTION D: SYSTEM RESPONSE
System response is the speed at which the system responds to user requests. Using the following scale, rate the extent to which you agree with the statements below.
Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree
D1. System response while in use is fast. SD [ ] D [ ] N [ ] A [ ] SA [ ]
D2. System response is consistent. SD [ ] D [ ] N [ ] A [ ] SA [ ]
D3. Generally I am satisfied with the response of the web-based e-learning system. SD [ ] D [ ] N [ ] A [ ] SA [ ]

SECTION E: SYSTEM ACCURATENESS
Accurateness is the extent to which the available web-based e-learning system provides expected results or effects for specified tasks and user objectives. Using the following scale, rate the extent to which you agree with the statements below.
Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree
E1. System provides mechanisms for making announcements. SD [ ] D [ ] N [ ] A [ ] SA [ ]
E2. System provides accurate and effective drawing tools for producing accurate diagrams. SD [ ] D [ ] N [ ] A [ ] SA [ ]
E3. System provides tools for accurate and effective online testing. SD [ ] D [ ] N [ ] A [ ] SA [ ]
E4. I find content provided by the web-based e-learning system to be accurate to the course objectives. SD [ ] D [ ] N [ ] A [ ] SA [ ]

SECTION F: SYSTEM SUITABILITY
Suitability is the extent to which the available web-based e-learning system provides an appropriate set of functions for required pedagogic tasks and user objectives. Using the following scale, rate the extent to which you agree with the statements below.
Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree
F1. System provides appropriate means for course content delivery. SD [ ] D [ ] N [ ] A [ ] SA [ ]
F2. System provides appropriate means for organizing students into groups for projects. SD [ ] D [ ] N [ ] A [ ] SA [ ]
F3. System presents different types of course material in a well-organized and readable format. SD [ ] D [ ] N [ ] A [ ] SA [ ]
F4. Generally I am satisfied with features in the system. SD [ ] D [ ] N [ ] A [ ] SA [ ]

SECTION G: SYSTEM INTER-OPERABILITY
Inter-operability is the extent to which the available web-based e-learning system seamlessly interconnects to other e-learning systems and other systems. Using the following scale, rate the extent to which you agree with the statements below.
Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree
G1. I can access content from, and provide content to, digital libraries and other e-learning systems. SD [ ] D [ ] N [ ] A [ ] SA [ ]
G2. I can export data from the web-based e-learning system to other systems. SD [ ] D [ ] N [ ] A [ ] SA [ ]
G3. I can import data from other systems into the web-based e-learning system. SD [ ] D [ ] N [ ] A [ ] SA [ ]
G4. Generally I am satisfied with system inter-operability. SD [ ] D [ ] N [ ] A [ ] SA [ ]
SECTION H: SYSTEM SECURITY This concerns availability of security mechanisms in the web-based e-learning system for maintaining the confidentiality and privacy of information about learners. Using the following scale, rate the extent to which you agree to the statements below. Key: SD-Strongly Disagree, D-Disagree, N-Neutral Agree, A-Agree, SA-Strongly Agree No RESPONSE STATEMENT SD D N A SA System provides password protection of all courses, H1 events and resources System provides mechanism for screening of student H2 posts to prevent distribution of undesirable material Generally am satisfied with system security H3 SECTION I: PERCEIVED USEFULNESS This is the degree to which the web-based e-learning system is useful for the pedagogic functions and objectives. Using the following scale, rate the extent to which you agree to the statements below. Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree No RESPONSE STATEMENT SD D N A SA I1 System will allow me to accomplish learning task more quickly Using the web-based e-learning system will improve I2 my learning performance Using web-based e-learning system will make it I3 easier to learn course content Generally, I find the web-based e-learning system I4 useful in my learning System allows learner control over his/her learning I5 process SECTION J: PERCEIVED EASE OF USE This is the degree to which the web-based e-learning system is easy to use to perform various pedagogic tasks and achieve course objectives. Using the following scale, rate the extent to which you agree to the statements below. Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree 139
No  STATEMENT (RESPONSE: SD D N A SA)
J1  I am confident of using the system even if there is no one around to show me how to do it
J2  I am confident of using the system even if I have never used such a system before
J3  I am confident of using the system as long as I have a lot of time to complete the task to be done
J4  Learning to operate the web-based e-learning system is easy for me
J5  Generally, I find the web-based e-learning system easy to use
SECTION K: ARCHITECTURE OF E-LEARNING SYSTEM
Architecture is the structure and organization of a web-based e-learning system. Using the following scale, rate the extent to which you agree with the statements below.
Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree
No  STATEMENT (RESPONSE: SD D N A SA)
K1  The system is typically organized in a question-answer structure
K2  I can access the web-based e-learning system from anywhere at any time
K3  Course content is consistent regardless of the equipment I use
K4  Do you think the structure and organization of the web-based e-learning system need to be improved?
K5  The content provided by the system changes to my preference and the capability of the device I use
K6. Which equipment do you use to access course content on the web-based e-learning system? Desktop Computer [ ] Laptop [ ] Phone [ ] Tablet [ ]
K7. How would you wish the web-based e-learning system to be organized and structured? ……………………………………………………………………………………
……………………………………………………………………………………

SECTION L: FUNCTIONALITY FACTORS
What factors, if any, hinder you from achieving the best from the web-based e-learning system? Tick the space provided after each statement as appropriate.
L1. Lack of skills to use the available web-based e-learning system [ ]
L2. Inappropriate content provided by the web-based e-learning system [ ]
L3. Lack of required tools in the available web-based e-learning system [ ]
L4. Poor ICT infrastructure at the university [ ]
L5. Availability of the web-based e-learning system [ ]
L6. Network access speeds [ ]
L7. The available web-based e-learning system is hard to use [ ]
L8. Poor organization and sequencing of course content [ ]
L9. Accuracy of the content provided by the web-based e-learning system [ ]
L10. Type of content published on the web-based e-learning system [ ]
L11. University policies and culture [ ]
L12. Kindly specify any other factor that determines functionality of the web-based e-learning system ………………………………………………………………………
…………………………………………………………………………………………
Thank you for taking time to fill in this questionnaire.
Appendix 3: Questionnaire for lecturers and content developers

SECTION A: DEMOGRAPHIC INFORMATION
a) Name………………………………………………………… (Optional)
b) Department……………………………………………………………….
c) Institution…………………………………………………………………
A1. Gender: Male [ ] Female [ ]
A2. Level of study: Certificate [ ] Diploma [ ] Degree [ ] Masters [ ] PhD [ ]
A3. Age in years: 20 – 30 [ ] 30 – 40 [ ] 40 – 50 [ ] Above 50 [ ]

SECTION B: E-LEARNING SYSTEM AND INTERNET USE
B1. Duration of using the Internet: 0 – 2 Years [ ] 2 – 4 Years [ ] 4 – 6 Years [ ] Above 6 Years [ ]
B2. Select the web-based e-learning systems you have used or are currently using: Wiki [ ] Moodle [ ] WebCT [ ] Blackboard [ ] Sakai [ ]
B3. For how long have you been using a web-based e-learning system? 0 – 2 Years [ ] 2 – 4 Years [ ] 4 – 6 Years [ ] More than 6 years [ ]
B4. Does the system address all the learning styles of students? Yes [ ] No [ ]

SECTION C: SYSTEM INTERACTIVITY
System interactivity is the capability of the system to communicate with the user. Using the following scale, rate the extent to which you agree with the statements below.
Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree
No  STATEMENT (RESPONSE: SD D N A SA)
C1  The system enables interactive communication between students and lecturers
C2  The web-based e-learning system enables interactive communication among lecturers
C3  Generally, I am satisfied with system interactivity

SECTION D: SYSTEM RESPONSE
System response is the speed at which the system responds to user requests. Using the following scale, rate the extent to which you agree with the statements below.
Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree
No  STATEMENT (RESPONSE: SD D N A SA)
D1  The web-based e-learning system's response while in use is fast
D2  In general, the response time of the web-based e-learning system is consistent
D3  In general, I am satisfied with the response of the web-based e-learning system

SECTION E: SYSTEM ACCURATENESS
Accurateness is the extent to which the available web-based e-learning system provides expected results or effects for specified tasks and user objectives. Using the following scale, rate the extent to which you agree with the statements below.
Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree
No  STATEMENT (RESPONSE: SD D N A SA)
E1  The system provides accurate and effective drawing tools for producing accurate diagrams
E2  The system provides tools for accurate and effective online testing
E3  The system provides accurate course statistics
E4  Generally, the system provides accurate results

SECTION F: SYSTEM SUITABILITY
Suitability is the extent to which the web-based e-learning system provides an appropriate set of functions for required pedagogic tasks and user objectives. Using the following scale, rate the extent to which you agree with the statements below.
Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree
No  STATEMENT (RESPONSE: SD D N A SA)
F1  The web-based e-learning system provides appropriate tools for publishing all types of content
F2  The system provides appropriate means for management of student records
F3  The system provides appropriate means for collaboration among lecturers
F4  The system provides appropriate means for organizing students into groups for projects and discussions
F5  The e-learning system provides appropriate means for assigning lecturers to courses
F6  Generally, the system is suitable for its functions

SECTION G: SYSTEM INTER-OPERABILITY
Inter-operability is the extent to which the available web-based e-learning system seamlessly interconnects with other e-learning systems and other systems. Using the following scale, rate the extent to which you agree with the statements below.
Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree
No  STATEMENT (RESPONSE: SD D N A SA)
G1  I can access content from, and provide content to, digital libraries and other e-learning systems
G2  I can export data from the web-based e-learning system to other systems, e.g. from the system to Excel
G3  I can import data from other systems into the web-based e-learning system, e.g. from Excel to the system
G4  Generally, I am satisfied with system inter-operability

SECTION H: SYSTEM SECURITY
This is the availability of security mechanisms in the web-based e-learning system for maintaining the confidentiality of information about learners. Using the following scale, rate the extent to which you agree with the statements below.
Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree
No  STATEMENT (RESPONSE: SD D N A SA)
H1  System provides password protection of all courses, events and resources
H2  Web-based e-learning system uses encryption for encoding data as it travels over the network
H3  Web-based e-learning system provides a secure set of user privileges
H4  System provides mechanism for screening of student posts to prevent distribution of undesirable material
H5  Generally, I am satisfied with the security of the system

SECTION I: PERCEIVED USEFULNESS
This is the degree to which the web-based e-learning system is useful for the intended functions and objectives. Using the following scale, rate the extent to which you agree with the statements below.
Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree
No  STATEMENT (RESPONSE: SD D N A SA)
I1  Using the web-based e-learning system will make it easier to teach course content
I2  Using the web-based e-learning system will enhance my effectiveness in teaching
I3  Generally, I find the web-based e-learning system useful in my teaching

SECTION J: PERCEIVED EASE OF USE
This is the degree to which the web-based e-learning system is easy to use to perform various pedagogic tasks and achieve course objectives. Using the following scale, rate the extent to which you agree with the statements below.
Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree
No  STATEMENT (RESPONSE: SD D N A SA)
J1  I am confident of using the system even if there is no one around to show me how to do it
J2  I am confident of using the system even if I have never used such a system before
J3  I am confident of using the system as long as I have a lot of time to complete the task to be done
J4  Learning to operate the web-based e-learning system is easy for me
J5  Generally, I find the web-based e-learning system easy to use

SECTION L: ARCHITECTURE OF E-LEARNING SYSTEM
Architecture is the structure and organization of a web-based e-learning system. Using the following scale, rate the extent to which you agree with the statements below.
Key: SD-Strongly Disagree, D-Disagree, N-Neutral, A-Agree, SA-Strongly Agree
No  STATEMENT (RESPONSE: SD D N A SA)
L1  The system is typically organized in a question-answer structure
L2  I can access the web-based e-learning system from anywhere
L3  Course content is consistent regardless of the equipment I use to access the system
L4  Do you think the structure and organization of the web-based e-learning system need to be improved?
L5  The content provided by the system changes to my preference and the capability of the device I use
L6  I can access the system even when I am not connected to the Internet
L7. Which equipment do you use to access course content on the web-based e-learning system? Desktop Computer [ ] Laptop [ ] Phone [ ] Tablet [ ]
L8. How would you wish the web-based e-learning system to be organized and structured? …………………………………………………………………………………………
……………………………………………………………………………………………

SECTION M: FUNCTIONALITY FACTORS
What factors, if any, hinder you from achieving the best from the web-based e-learning system? Tick the space provided after each statement as appropriate.
M1. Lack of skills to use the available web-based e-learning system [ ]
M2. Lack of required tools in the available web-based e-learning system [ ]
M3. Poor ICT infrastructure at the university [ ]
M4. Less availability of the web-based e-learning system [ ]
M5. Low or inconsistent network access speeds [ ]
M6. The available web-based e-learning system is hard to use [ ]
M7. Poor organization and structure of the system [ ]
M8. University policies and culture [ ]
M9. Kindly specify any other factor that determines functionality of the web-based e-learning system ……………………………………………………………………………
……………………………………………………………………………………………
Thank you for taking time to fill in this questionnaire.
Appendix 4: Schedule

The work was scheduled over the months May 2013 to April 2014 and comprised the following activities: proposal writing, literature review, proposal defense, proposal approval, data collection, data coding, data analysis, first draft thesis, final draft thesis, thesis submission and thesis defense.
Appendix 5: Descriptive statistics for students and lecturers

Part A: Descriptive statistics for students

Descriptive statistics for mode of learning and satisfaction
(SD = Std. Deviation; SE = Std. Error; 95% CI = 95% confidence interval for the mean)

Group                  N    Mean  SD     SE    95% CI        Min  Max
Am satisfied with web-based e-learning system inter-operability
  Full Time            98   1.91  1.075  .109  [1.69, 2.12]  1    5
  Part Time            34   2.88  1.225  .210  [2.45, 3.31]  1    5
  Distance Learning    35   2.94  1.110  .188  [2.56, 3.32]  2    5
  Total               167   2.32  1.214  .094  [2.14, 2.51]  1    5
Am satisfied with system interactivity
  Full Time            98   2.30  1.302  .132  [2.03, 2.56]  1    5
  Part Time            34   1.88  1.225  .210  [1.45, 2.31]  1    5
  Distance Learning    35   2.23  1.190  .201  [1.82, 2.64]  1    5
  Total               167   2.20  1.267  .098  [2.00, 2.39]  1    5
Am satisfied with system response
  Full Time            98   2.55  1.150  .116  [2.32, 2.78]  1    5
  Part Time            34   2.82  1.487  .255  [2.30, 3.34]  1    5
  Distance Learning    35   2.83  1.424  .241  [2.34, 3.32]  1    5
  Total               167   2.66  1.283  .099  [2.47, 2.86]  1    5
Content provided by the system is accurate to course objectives
  Full Time            98   3.45  1.253  .127  [3.20, 3.70]  1    5
  Part Time            34   3.35  1.574  .270  [2.80, 3.90]  1    5
  Distance Learning    35   4.03  1.150  .194  [3.63, 4.42]  1    5
  Total               167   3.55  1.320  .102  [3.35, 3.75]  1    5
Am satisfied with features in the system
  Full Time            98   1.83  1.094  .110  [1.61, 2.05]  1    5
  Part Time            34   1.94  1.153  .198  [1.54, 2.34]  1    5
  Distance Learning    35   2.37  1.114  .188  [1.99, 2.75]  1    5
  Total               167   1.96  1.124  .087  [1.79, 2.14]  1    5
Am satisfied with system security
  Full Time            98   2.10  1.030  .104  [1.90, 2.31]  1    4
  Part Time            34   2.15  1.019  .175  [1.79, 2.50]  1    5
  Distance Learning    35   2.26  1.120  .189  [1.87, 2.64]  1    5
  Total               167   2.14  1.043  .081  [1.98, 2.30]  1    5
Am satisfied with ease of use of the web-based e-learning system
  Full Time            98   2.76  1.363  .138  [2.48, 3.03]  1    5
  Part Time            34   2.62  1.349  .231  [2.15, 3.09]  1    5
  Distance Learning    35   2.83  1.505  .254  [2.31, 3.35]  1    5
  Total               167   2.74  1.384  .107  [2.53, 2.95]  1    5
I find the web-based e-learning system useful in my learning
  Full Time            98   2.72  1.291  .130  [2.47, 2.98]  1    5
  Part Time            34   2.88  1.452  .249  [2.38, 3.39]  1    5
  Distance Learning    35   2.14  1.192  .201  [1.73, 2.55]  1    4
  Total               167   2.63  1.323  .102  [2.43, 2.84]  1    5
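The standard error and confidence interval columns in these tables follow directly from each group's N, mean and standard deviation: SE = SD/√N, and the 95% interval is approximately mean ± 1.96 × SE. As an illustrative check (not part of the original analysis, which used SPSS's t-based intervals), a minimal Python sketch reproduces the Full Time row of the inter-operability item above:

```python
import math
from statistics import NormalDist

def ci_95(n, mean, sd):
    """Standard error and approximate 95% confidence interval for a group mean."""
    se = sd / math.sqrt(n)            # SE = SD / sqrt(N)
    z = NormalDist().inv_cdf(0.975)   # normal critical value, about 1.96
    return se, mean - z * se, mean + z * se

# Full Time row of the inter-operability item: N = 98, mean = 1.91, SD = 1.075
se, lower, upper = ci_95(98, 1.91, 1.075)
print(f"{se:.3f}  [{lower:.2f}, {upper:.2f}]")  # 0.109  [1.70, 2.12]
```

The tabulated bounds [1.69, 2.12] differ from this by at most 0.01, the combined effect of rounding the inputs and of SPSS using the t distribution rather than the normal approximation.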
Descriptive statistics for gender and satisfaction

Group      N    Mean  SD     SE    95% CI        Min  Max
Am satisfied with web-based e-learning system inter-operability
  Female   62   3.02  1.180  .150  [2.72, 3.32]  1    5
  Male    105   1.91  1.039  .101  [1.71, 2.12]  1    5
  Total   167   2.32  1.214  .094  [2.14, 2.51]  1    5
Am satisfied with system interactivity
  Female   62   1.60   .931  .118  [1.36, 1.83]  1    5
  Male    105   2.55  1.308  .128  [2.30, 2.81]  1    5
  Total   167   2.20  1.267  .098  [2.00, 2.39]  1    5
Am satisfied with system response
  Female   62   2.66  1.305  .166  [2.33, 2.99]  1    5
  Male    105   2.67  1.276  .125  [2.42, 2.91]  1    5
  Total   167   2.66  1.283  .099  [2.47, 2.86]  1    5
Content provided by the system is accurate to course objectives
  Female   62   3.34  1.200  .152  [3.03, 3.64]  1    5
  Male    105   3.68  1.376  .134  [3.41, 3.94]  1    5
  Total   167   3.55  1.320  .102  [3.35, 3.75]  1    5
Am satisfied with features in the system
  Female   62   2.34  1.254  .159  [2.02, 2.66]  1    5
  Male    105   1.74   .981  .096  [1.55, 1.93]  1    5
  Total   167   1.96  1.124  .087  [1.79, 2.14]  1    5
Am satisfied with system security
  Female   62   2.21  1.058  .134  [1.94, 2.48]  1    5
  Male    105   2.10  1.037  .101  [1.90, 2.31]  1    5
  Total   167   2.14  1.043  .081  [1.98, 2.30]  1    5
Am satisfied with ease of use of the web-based e-learning system
  Female   62   2.69  1.386  .176  [2.34, 3.05]  1    5
  Male    105   2.77  1.389  .136  [2.50, 3.04]  1    5
  Total   167   2.74  1.384  .107  [2.53, 2.95]  1    5
I find the web-based e-learning system useful in my learning
  Female   62   2.60  1.311  .167  [2.26, 2.93]  1    5
  Male    105   2.66  1.336  .130  [2.40, 2.92]  1    5
  Total   167   2.63  1.323  .102  [2.43, 2.84]  1    5
Descriptive statistics for age and satisfaction

Group        N    Mean  SD     SE    95% CI        Min  Max
Am satisfied with web-based e-learning system inter-operability
  Below 20   27   2.19  1.302  .251  [1.67, 2.70]  1    4
  20 - 25    71   2.55  1.156  .137  [2.28, 2.82]  1    5
  25 - 30    43   1.98  1.205  .184  [1.61, 2.35]  1    5
  Above 30   26   2.42  1.206  .236  [1.94, 2.91]  1    5
  Total     167   2.32  1.214  .094  [2.14, 2.51]  1    5
Am satisfied with system interactivity
  Below 20   27   2.00  1.209  .233  [1.52, 2.48]  1    5
  20 - 25    71   2.31  1.226  .145  [2.02, 2.60]  1    5
  25 - 30    43   1.79  1.166  .178  [1.43, 2.15]  1    5
  Above 30   26   2.77  1.394  .273  [2.21, 3.33]  1    5
  Total     167   2.20  1.267  .098  [2.00, 2.39]  1    5
Am satisfied with system response
  Below 20   27   2.44  1.155  .222  [1.99, 2.90]  1    5
  20 - 25    71   2.79  1.341  .159  [2.47, 3.11]  1    5
  25 - 30    43   2.74  1.197  .183  [2.38, 3.11]  1    5
  Above 30   26   2.42  1.391  .273  [1.86, 2.98]  1    5
  Total     167   2.66  1.283  .099  [2.47, 2.86]  1    5
Content provided by the system is accurate to course objectives
  Below 20   27   3.22  1.121  .216  [2.78, 3.67]  1    5
  20 - 25    71   3.69  1.430  .170  [3.35, 4.03]  1    5
  25 - 30    43   3.44  1.221  .186  [3.07, 3.82]  1    5
  Above 30   26   3.69  1.350  .265  [3.15, 4.24]  1    5
  Total     167   3.55  1.320  .102  [3.35, 3.75]  1    5
Am satisfied with features in the system
  Below 20   27   1.59  1.047  .202  [1.18, 2.01]  1    4
  20 - 25    71   2.08  1.011  .120  [1.85, 2.32]  1    5
  25 - 30    43   1.88  1.219  .186  [1.51, 2.26]  1    5
  Above 30   26   2.15  1.287  .252  [1.63, 2.67]  1    5
  Total     167   1.96  1.124  .087  [1.79, 2.14]  1    5
Am satisfied with system security
  Below 20   27   2.00  1.038  .200  [1.59, 2.41]  1    4
  20 - 25    71   1.99   .902  .107  [1.77, 2.20]  1    5
  25 - 30    43   2.42  1.159  .177  [2.06, 2.78]  1    5
  Above 30   26   2.27  1.151  .226  [1.80, 2.73]  1    4
  Total     167   2.14  1.043  .081  [1.98, 2.30]  1    5
Am satisfied with ease of use of the web-based e-learning system
  Below 20   27   2.44  1.340  .258  [1.91, 2.97]  1    5
  20 - 25    71   2.72  1.406  .167  [2.39, 3.05]  1    5
  25 - 30    43   2.79  1.440  .220  [2.35, 3.23]  1    5
  Above 30   26   3.04  1.280  .251  [2.52, 3.56]  1    5
  Total     167   2.74  1.384  .107  [2.53, 2.95]  1    5
I find the web-based e-learning system useful in my learning
  Below 20   27   2.85  1.262  .243  [2.35, 3.35]  1    4
  20 - 25    71   2.37  1.256  .149  [2.07, 2.66]  1    5
  25 - 30    43   2.91  1.477  .225  [2.45, 3.36]  1    5
  Above 30   26   2.69  1.225  .240  [2.20, 3.19]  1    5
  Total     167   2.63  1.323  .102  [2.43, 2.84]  1    5
Descriptive statistics for level of study and satisfaction

Group       N    Mean  SD     SE    95% CI        Min  Max
Am satisfied with web-based e-learning system inter-operability
  Diploma   34   2.41  1.158  .199  [2.01, 2.82]  1    5
  Degree   100   2.07  1.112  .111  [1.85, 2.29]  1    5
  Masters   33   3.00  1.323  .230  [2.53, 3.47]  1    5
  Total    167   2.32  1.214  .094  [2.14, 2.51]  1    5
Am satisfied with system interactivity
  Diploma   34   2.18  1.193  .205  [1.76, 2.59]  1    5
  Degree   100   2.43  1.320  .132  [2.17, 2.69]  1    5
  Masters   33   1.52   .906  .158  [1.19, 1.84]  1    5
  Total    167   2.20  1.267  .098  [2.00, 2.39]  1    5
Am satisfied with system response
  Diploma   34   3.29  1.219  .209  [2.87, 3.72]  1    5
  Degree   100   2.52  1.275  .128  [2.27, 2.77]  1    5
  Masters   33   2.45  1.201  .209  [2.03, 2.88]  1    5
  Total    167   2.66  1.283  .099  [2.47, 2.86]  1    5
Content provided by the system is accurate to course objectives
  Diploma   34   3.35  1.323  .227  [2.89, 3.81]  1    5
  Degree   100   3.61  1.399  .140  [3.33, 3.89]  1    5
  Masters   33   3.58  1.062  .185  [3.20, 3.95]  1    5
  Total    167   3.55  1.320  .102  [3.35, 3.75]  1    5
Am satisfied with features in the system
  Diploma   34   2.03  1.243  .213  [1.60, 2.46]  1    5
  Degree   100   1.91  1.093  .109  [1.69, 2.13]  1    5
  Masters   33   2.06  1.116  .194  [1.66, 2.46]  1    5
  Total    167   1.96  1.124  .087  [1.79, 2.14]  1    5
Am satisfied with system security
  Diploma   34   2.21  1.008  .173  [1.85, 2.56]  1    4
  Degree   100   2.10  1.078  .108  [1.89, 2.31]  1    5
  Masters   33   2.21   .992  .173  [1.86, 2.56]  1    5
  Total    167   2.14  1.043  .081  [1.98, 2.30]  1    5
Am satisfied with ease of use of the web-based e-learning system
  Diploma   34   2.50  1.308  .224  [2.04, 2.96]  1    5
  Degree   100   2.84  1.419  .142  [2.56, 3.12]  1    5
  Masters   33   2.70  1.357  .236  [2.22, 3.18]  1    5
  Total    167   2.74  1.384  .107  [2.53, 2.95]  1    5
I find the web-based e-learning system useful in my learning
  Diploma   34   2.44  1.330  .228  [1.98, 2.91]  1    5
  Degree   100   2.59  1.303  .130  [2.33, 2.85]  1    5
  Masters   33   2.97  1.357  .236  [2.49, 3.45]  1    5
  Total    167   2.63  1.323  .102  [2.43, 2.84]  1    5
Descriptive statistics for year of study and satisfaction

Group      N    Mean  SD     SE    95% CI        Min  Max
Am satisfied with web-based e-learning system inter-operability
  First    53   2.45  1.294  .178  [2.10, 2.81]  1    5
  Second   57   2.40  1.237  .164  [2.08, 2.73]  1    5
  Third    31   2.26   .965  .173  [1.90, 2.61]  1    4
  Fourth   26   1.96  1.248  .245  [1.46, 2.47]  1    5
  Total   167   2.32  1.214  .094  [2.14, 2.51]  1    5
Am satisfied with system interactivity
  First    53   2.32  1.312  .180  [1.96, 2.68]  1    5
  Second   57   1.61   .940  .125  [1.36, 1.86]  1    5
  Third    31   2.45  1.312  .236  [1.97, 2.93]  1    5
  Fourth   26   2.92  1.262  .248  [2.41, 3.43]  1    5
  Total   167   2.20  1.267  .098  [2.00, 2.39]  1    5
Am satisfied with system response
  First    53   2.66  1.315  .181  [2.30, 3.02]  1    5
  Second   57   2.82  1.255  .166  [2.49, 3.16]  1    5
  Third    31   2.32  1.222  .219  [1.87, 2.77]  1    5
  Fourth   26   2.73  1.343  .263  [2.19, 3.27]  1    5
  Total   167   2.66  1.283  .099  [2.47, 2.86]  1    5
Content provided by the system is accurate to course objectives
  First    53   3.57  1.323  .182  [3.20, 3.93]  1    5
  Second   57   3.16  1.236  .164  [2.83, 3.49]  1    5
  Third    31   4.00  1.317  .236  [3.52, 4.48]  1    5
  Fourth   26   3.85  1.317  .258  [3.31, 4.38]  1    5
  Total   167   3.55  1.320  .102  [3.35, 3.75]  1    5
Am satisfied with features in the system
  First    53   1.91  1.043  .143  [1.62, 2.19]  1    5
  Second   57   2.04  1.210  .160  [1.71, 2.36]  1    5
  Third    31   2.03  1.016  .182  [1.66, 2.40]  1    5
  Fourth   26   1.85  1.255  .246  [1.34, 2.35]  1    5
  Total   167   1.96  1.124  .087  [1.79, 2.14]  1    5
Am satisfied with system security
  First    53   2.15   .928  .127  [1.90, 2.41]  1    5
  Second   57   2.18  1.120  .148  [1.88, 2.47]  1    5
  Third    31   1.94  1.031  .185  [1.56, 2.31]  1    4
  Fourth   26   2.31  1.123  .220  [1.85, 2.76]  1    4
  Total   167   2.14  1.043  .081  [1.98, 2.30]  1    5
Am satisfied with ease of use of the web-based e-learning system
  First    53   2.60  1.306  .179  [2.24, 2.96]  1    5
  Second   57   2.58  1.349  .179  [2.22, 2.94]  1    5
  Third    31   3.06  1.569  .282  [2.49, 3.64]  1    5
  Fourth   26   3.00  1.356  .266  [2.45, 3.55]  1    5
  Total   167   2.74  1.384  .107  [2.53, 2.95]  1    5
I find the web-based e-learning system useful in my learning
  First    53   2.68  1.312  .180  [2.32, 3.04]  1    5
  Second   57   2.54  1.297  .172  [2.20, 2.89]  1    5
  Third    31   2.65  1.355  .243  [2.15, 3.14]  1    5
  Fourth   26   2.73  1.430  .280  [2.15, 3.31]  1    5
  Total   167   2.63  1.323  .102  [2.43, 2.84]  1    5
Descriptive statistics for period of system use (years) and satisfaction

Group       N    Mean  SD     SE    95% CI        Min  Max
Am satisfied with web-based e-learning system inter-operability
  0 - 1     37   2.19  1.175  .193  [1.80, 2.58]  1    5
  1 - 2     39   2.56  1.209  .194  [2.17, 2.96]  1    5
  2 - 3     46   2.43  1.344  .198  [2.04, 2.83]  1    5
  Above 3   45   2.11  1.092  .163  [1.78, 2.44]  1    5
  Total    167   2.32  1.214  .094  [2.14, 2.51]  1    5
Am satisfied with system interactivity
  0 - 1     37   2.49  1.387  .228  [2.02, 2.95]  1    5
  1 - 2     39   1.69   .766  .123  [1.44, 1.94]  1    5
  2 - 3     46   1.67  1.034  .152  [1.37, 1.98]  1    5
  Above 3   45   2.93  1.321  .197  [2.54, 3.33]  1    5
  Total    167   2.20  1.267  .098  [2.00, 2.39]  1    5
Am satisfied with system response
  0 - 1     37   2.81  1.175  .193  [2.42, 3.20]  1    5
  1 - 2     39   2.74  1.371  .220  [2.30, 3.19]  1    5
  2 - 3     46   2.65  1.233  .182  [2.29, 3.02]  1    5
  Above 3   45   2.49  1.359  .203  [2.08, 2.90]  1    5
  Total    167   2.66  1.283  .099  [2.47, 2.86]  1    5
Content provided by the system is accurate to course objectives
  0 - 1     37   3.41  1.235  .203  [2.99, 3.82]  1    5
  1 - 2     39   3.74  1.251  .200  [3.34, 4.15]  1    5
  2 - 3     46   3.24  1.303  .192  [2.85, 3.63]  1    5
  Above 3   45   3.82  1.419  .212  [3.40, 4.25]  1    5
  Total    167   3.55  1.320  .102  [3.35, 3.75]  1    5
Am satisfied with features in the system
  0 - 1     37   1.89  1.125  .185  [1.52, 2.27]  1    5
  1 - 2     39   1.97   .986  .158  [1.65, 2.29]  1    4
  2 - 3     46   2.04  1.316  .194  [1.65, 2.43]  1    5
  Above 3   45   1.93  1.053  .157  [1.62, 2.25]  1    5
  Total    167   1.96  1.124  .087  [1.79, 2.14]  1    5
Am satisfied with system security
  0 - 1     37   2.30  1.024  .168  [1.96, 2.64]  1    4
  1 - 2     39   2.18  1.097  .176  [1.82, 2.54]  1    5
  2 - 3     46   2.11   .971  .143  [1.82, 2.40]  1    5
  Above 3   45   2.02  1.097  .164  [1.69, 2.35]  1    4
  Total    167   2.14  1.043  .081  [1.98, 2.30]  1    5
Am satisfied with ease of use of the web-based e-learning system
  0 - 1     37   2.49  1.325  .218  [2.04, 2.93]  1    5
  1 - 2     39   2.62  1.480  .237  [2.14, 3.10]  1    5
  2 - 3     46   2.70  1.364  .201  [2.29, 3.10]  1    5
  Above 3   45   3.11  1.335  .199  [2.71, 3.51]  1    5
  Total    167   2.74  1.384  .107  [2.53, 2.95]  1    5
I find the web-based e-learning system useful in my learning
  0 - 1     37   2.59  1.279  .210  [2.17, 3.02]  1    5
  1 - 2     39   2.59  1.251  .200  [2.18, 3.00]  1    5
  2 - 3     46   2.85  1.382  .204  [2.44, 3.26]  1    5
  Above 3   45   2.49  1.375  .205  [2.08, 2.90]  1    5
  Total    167   2.63  1.323  .102  [2.43, 2.84]  1    5
Descriptive statistics for learning style and satisfaction

Group                                    N    Mean  SD     SE    95% CI        Min  Max
Am satisfied with web-based e-learning system inter-operability
  Reading and exploring models           15   2.33  1.113  .287  [1.72, 2.95]  1    4
  Working with others in a group         51   2.86  1.281  .179  [2.50, 3.22]  1    5
  Experimenting new ideas and examples   72   1.89  1.082  .128  [1.63, 2.14]  1    5
  Listening and working alone            29   2.45  1.088  .202  [2.03, 2.86]  1    5
  Total                                 167   2.32  1.214  .094  [2.14, 2.51]  1    5
Am satisfied with system interactivity
  Reading and exploring models           15   2.27  1.580  .408  [1.39, 3.14]  1    5
  Working with others in a group         51   1.47   .703  .098  [1.27, 1.67]  1    5
  Experimenting new ideas and examples   72   2.57  1.298  .153  [2.26, 2.87]  1    5
  Listening and working alone            29   2.52  1.299  .241  [2.02, 3.01]  1    5
  Total                                 167   2.20  1.267  .098  [2.00, 2.39]  1    5
Am satisfied with system response
  Reading and exploring models           15   3.47  1.302  .336  [2.75, 4.19]  2    5
  Working with others in a group         51   2.59  1.169  .164  [2.26, 2.92]  1    5
  Experimenting new ideas and examples   72   2.51  1.245  .147  [2.22, 2.81]  1    5
  Listening and working alone            29   2.76  1.455  .270  [2.21, 3.31]  1    5
  Total                                 167   2.66  1.283  .099  [2.47, 2.86]  1    5
Content provided by the system is accurate to course objectives
  Reading and exploring models           15   3.27  1.223  .316  [2.59, 3.94]  1    5
  Working with others in a group         51   3.55  1.137  .159  [3.23, 3.87]  1    5
  Experimenting new ideas and examples   72   3.49  1.463  .172  [3.14, 3.83]  1    5
  Listening and working alone            29   3.86  1.302  .242  [3.37, 4.36]  1    5
  Total                                 167   3.55  1.320  .102  [3.35, 3.75]  1    5
Am satisfied with features in the system
  Reading and exploring models           15   2.33  1.447  .374  [1.53, 3.13]  1    5
  Working with others in a group         51   2.35  1.163  .163  [2.03, 2.68]  1    5
  Experimenting new ideas and examples   72   1.61   .928  .109  [1.39, 1.83]  1    5
  Listening and working alone            29   1.97  1.085  .201  [1.55, 2.38]  1    5
  Total                                 167   1.96  1.124  .087  [1.79, 2.14]  1    5
Am satisfied with system security
  Reading and exploring models           15   2.00   .926  .239  [1.49, 2.51]  1    4
  Working with others in a group         51   2.25   .997  .140  [1.97, 2.54]  1    4
  Experimenting new ideas and examples   72   2.08  1.097  .129  [1.83, 2.34]  1    5
  Listening and working alone            29   2.17  1.071  .199  [1.76, 2.58]  1    5
  Total                                 167   2.14  1.043  .081  [1.98, 2.30]  1    5
Am satisfied with ease of use of the web-based e-learning system
  Reading and exploring models           15   3.40  1.183  .306  [2.74, 4.06]  1    5
  Working with others in a group         51   2.76  1.436  .201  [2.36, 3.17]  1    5
  Experimenting new ideas and examples   72   2.64  1.397  .165  [2.31, 2.97]  1    5
  Listening and working alone            29   2.62  1.321  .245  [2.12, 3.12]  1    5
  Total                                 167   2.74  1.384  .107  [2.53, 2.95]  1    5
I find the web-based e-learning system useful in my learning
  Reading and exploring models           15   3.07  1.280  .330  [2.36, 3.78]  1    5
  Working with others in a group         51   2.47  1.362  .191  [2.09, 2.85]  1    5
  Experimenting new ideas and examples   72   2.71  1.294  .152  [2.40, 3.01]  1    5
  Listening and working alone            29   2.52  1.353  .251  [2.00, 3.03]  1    5
  Total                                 167   2.63  1.323  .102  [2.43, 2.84]  1    5
Part B: Descriptive statistics for lecturers Descriptive statistics for age and satisfaction
20 - 30 Generally am satisfied 30 - 40 with system 40 - 50 interactivity Above 50 Total 20 - 30 Am satisfied with the 30 - 40 response of web-based 40 - 50 e-learning system Above 50 Total 20 - 30 Am satisfied with 30 - 40 accurateness of the 40 - 50 system Above 50 Total 20 - 30 am satisfied with 30 - 40 suitability of the 40 - 50 system Above 50 Total 20 - 30 Am satisfied with inter30 - 40 operability of web40 - 50 based e-learning Above 50 system Total 20 - 30 Am satisfied with 30 - 40 security of web-based 40 - 50 e-learning system Above 50 Total 20 - 30 The web-based e30 - 40 learning system is 40 - 50 useful in teaching Above 50 Total 20 - 30 Am satisfied with the 30 - 40 usability of the web40 - 50 based e-learning Above 50 system Total
N
Mean Std. Dev.
7 26 27 7 67 7 26 27 7 67 7 26 27 7 67 7 26 27 7 67 7 26 27 7 67 7 26 27 7 67 7 26 27 7 67 7 26 27 7 67
2.71 3.19 2.30 3.14 2.78 4.00 2.27 2.07 3.00 2.45 3.43 2.73 3.26 2.71 3.01 2.00 2.23 2.85 3.00 2.54 2.86 2.65 2.93 3.14 2.84 2.00 2.92 3.04 3.14 2.90 2.00 3.69 2.00 2.29 2.69 1.57 2.92 2.74 2.43 2.66
1.604 1.674 .869 1.069 1.369 .000 1.251 .781 1.291 1.158 .535 1.116 1.457 1.604 1.285 .000 1.070 1.379 1.291 1.210 1.069 1.573 1.174 1.069 1.310 .000 1.573 1.160 1.069 1.293 .000 1.011 1.177 1.254 1.317 .535 1.495 1.023 1.134 1.250
154
Std. 95% Confidence Error Interval for Mean Lower Upper Bound Bound .606 1.23 4.20 .328 2.52 3.87 .167 1.95 2.64 .404 2.15 4.13 .167 2.44 3.11 .000 4.00 4.00 .245 1.76 2.77 .150 1.77 2.38 .488 1.81 4.19 .142 2.17 2.73 .202 2.93 3.92 .219 2.28 3.18 .280 2.68 3.84 .606 1.23 4.20 .157 2.70 3.33 .000 2.00 2.00 .210 1.80 2.66 .265 2.31 3.40 .488 1.81 4.19 .148 2.24 2.83 .404 1.87 3.85 .309 2.02 3.29 .226 2.46 3.39 .404 2.15 4.13 .160 2.52 3.16 .000 2.00 2.00 .308 2.29 3.56 .223 2.58 3.50 .404 2.15 4.13 .158 2.58 3.21 .000 2.00 2.00 .198 3.28 4.10 .226 1.53 2.47 .474 1.13 3.45 .161 2.37 3.01 .202 1.08 2.07 .293 2.32 3.53 .197 2.34 3.15 .429 1.38 3.48 .153 2.35 2.96
Min. Max.
1 1 1 2 1 4 1 1 1 1 3 1 1 1 1 2 1 1 1 1 2 1 1 2 1 2 1 1 2 1 2 1 1 1 1 1 1 1 1 1
4 5 4 4 5 4 5 4 4 5 4 4 5 4 5 2 4 5 4 5 4 5 4 4 5 2 5 4 4 5 2 5 4 4 5 2 5 4 4 5
Descriptive statistics for duration of system use (years) and satisfaction

Group       N    Mean  SD     SE    95% CI        Min  Max
Generally am satisfied with system interactivity
  0 - 2      5   4.00   .000  .000  [4.00, 4.00]  4    4
  2 - 4     25   2.20  1.000  .200  [1.79, 2.61]  1    4
  4 - 6     25   2.76  1.451  .290  [2.16, 3.36]  1    5
  Above 6   12   3.50  1.567  .452  [2.50, 4.50]  1    5
  Total     67   2.78  1.369  .167  [2.44, 3.11]  1    5
Am satisfied with the response of web-based e-learning system
  0 - 2      5   3.20  1.095  .490  [1.84, 4.56]  2    4
  2 - 4     25   2.36   .995  .199  [1.95, 2.77]  1    5
  4 - 6     25   1.76   .970  .194  [1.36, 2.16]  1    4
  Above 6   12   3.75   .452  .131  [3.46, 4.04]  3    4
  Total     67   2.45  1.158  .142  [2.17, 2.73]  1    5
Am satisfied with accurateness of the system
  0 - 2      5   3.20  1.095  .490  [1.84, 4.56]  2    4
  2 - 4     25   2.36  1.254  .251  [1.84, 2.88]  1    5
  4 - 6     25   3.44  1.121  .224  [2.98, 3.90]  1    4
  Above 6   12   3.42  1.311  .379  [2.58, 4.25]  2    5
  Total     67   3.01  1.285  .157  [2.70, 3.33]  1    5
Am satisfied with suitability of the system
  0 - 2      5   3.20  1.095  .490  [1.84, 4.56]  2    4
  2 - 4     25   2.52  1.475  .295  [1.91, 3.13]  1    5
  4 - 6     25   2.32   .627  .125  [2.06, 2.58]  2    4
  Above 6   12   2.75  1.545  .446  [1.77, 3.73]  1    4
  Total     67   2.54  1.210  .148  [2.24, 2.83]  1    5
Am satisfied with inter-operability of web-based e-learning system
  0 - 2      5   4.00   .000  .000  [4.00, 4.00]  4    4
  2 - 4     25   1.80   .957  .191  [1.40, 2.20]  1    4
  4 - 6     25   3.60  1.190  .238  [3.11, 4.09]  2    5
  Above 6   12   2.92   .900  .260  [2.34, 3.49]  2    4
  Total     67   2.84  1.310  .160  [2.52, 3.16]  1    5
Am satisfied with security of web-based e-learning system
  0 - 2      5   2.80  1.095  .490  [1.44, 4.16]  2    4
  2 - 4     25   2.76   .970  .194  [2.36, 3.16]  1    4
  4 - 6     25   3.36  1.469  .294  [2.75, 3.97]  1    5
  Above 6   12   2.25  1.357  .392  [1.39, 3.11]  1    4
  Total     67   2.90  1.293  .158  [2.58, 3.21]  1    5
The web-based e-learning system is useful in teaching
  0 - 2      5   2.20  1.643  .735  [ .16, 4.24]  1    4
  2 - 4     25   2.28  1.061  .212  [1.84, 2.72]  1    4
  4 - 6     25   3.12  1.236  .247  [2.61, 3.63]  1    4
  Above 6   12   2.83  1.642  .474  [1.79, 3.88]  1    5
  Total     67   2.69  1.317  .161  [2.37, 3.01]  1    5
Am satisfied with the usability of the web-based e-learning system
  0 - 2      5   1.60   .548  .245  [ .92, 2.28]  1    2
  2 - 4     25   2.48  1.327  .265  [1.93, 3.03]  1    4
  4 - 6     25   2.68  1.069  .214  [2.24, 3.12]  1    4
  Above 6   12   3.42  1.311  .379  [2.58, 4.25]  2    5
  Total     67   2.66  1.250  .153  [2.35, 2.96]  1    5
Descriptive statistics for gender and satisfaction

Group      N    Mean  SD     SE    95% CI        Min  Max
Generally am satisfied with system interactivity
  Female   28   2.43  1.168  .221  [1.98, 2.88]  1    4
  Male     39   3.03  1.460  .234  [2.55, 3.50]  1    5
  Total    67   2.78  1.369  .167  [2.44, 3.11]  1    5
Am satisfied with the response of web-based e-learning system
  Female   28   2.50  1.106  .209  [2.07, 2.93]  1    4
  Male     39   2.41  1.208  .193  [2.02, 2.80]  1    5
  Total    67   2.45  1.158  .142  [2.17, 2.73]  1    5
Am satisfied with accurateness of the system
  Female   28   2.64   .951  .180  [2.27, 3.01]  1    4
  Male     39   3.28  1.432  .229  [2.82, 3.75]  1    5
  Total    67   3.01  1.285  .157  [2.70, 3.33]  1    5
Am satisfied with suitability of the system
  Female   28   2.18  1.090  .206  [1.76, 2.60]  1    4
  Male     39   2.79  1.239  .198  [2.39, 3.20]  1    5
  Total    67   2.54  1.210  .148  [2.24, 2.83]  1    5
Am satisfied with inter-operability of web-based e-learning system
  Female   28   2.29  1.084  .205  [1.87, 2.71]  1    4
  Male     39   3.23  1.327  .213  [2.80, 3.66]  1    5
  Total    67   2.84  1.310  .160  [2.52, 3.16]  1    5
Am satisfied with security of web-based e-learning system
  Female   28   2.71  1.084  .205  [2.29, 3.13]  1    4
  Male     39   3.03  1.423  .228  [2.56, 3.49]  1    5
  Total    67   2.90  1.293  .158  [2.58, 3.21]  1    5
The web-based e-learning system is useful in teaching
  Female   28   2.71  1.357  .256  [2.19, 3.24]  1    4
  Male     39   2.67  1.305  .209  [2.24, 3.09]  1    5
  Total    67   2.69  1.317  .161  [2.37, 3.01]  1    5
Am satisfied with the usability of the web-based e-learning system
  Female   28   2.36  1.162  .220  [1.91, 2.81]  1    4
  Male     39   2.87  1.281  .205  [2.46, 3.29]  1    5
  Total    67   2.66  1.250  .153  [2.35, 2.96]  1    5
Descriptive statistics for level of study and satisfaction

Item / Group    N   Mean  Std. Dev.  Std. Error  95% CI Lower  95% CI Upper  Min.  Max.

Generally am satisfied with system interactivity
  Diploma       6   2.50  1.643      .671        .78           4.22          1     4
  Degree        19  2.89  1.243      .285        2.30          3.49          2     5
  Masters       42  2.76  1.411      .218        2.32          3.20          1     5
  Total         67  2.78  1.369      .167        2.44          3.11          1     5
Am satisfied with the response of the web-based e-learning system
  Diploma       6   4.00  .000       .000        4.00          4.00          4     4
  Degree        19  2.16  .898       .206        1.72          2.59          1     4
  Masters       42  2.36  1.186      .183        1.99          2.73          1     5
  Total         67  2.45  1.158      .142        2.17          2.73          1     5
Am satisfied with accurateness of the system
  Diploma       6   4.00  .000       .000        4.00          4.00          4     4
  Degree        19  2.84  1.302      .299        2.21          3.47          1     4
  Masters       42  2.95  1.324      .204        2.54          3.37          1     5
  Total         67  3.01  1.285      .157        2.70          3.33          1     5
Am satisfied with suitability of the system
  Diploma       6   3.00  1.095      .447        1.85          4.15          2     4
  Degree        19  2.58  1.465      .336        1.87          3.29          1     5
  Masters       42  2.45  1.109      .171        2.11          2.80          1     4
  Total         67  2.54  1.210      .148        2.24          2.83          1     5
Am satisfied with inter-operability of the web-based e-learning system
  Diploma       6   4.00  .000       .000        4.00          4.00          4     4
  Degree        19  2.63  1.257      .288        2.03          3.24          1     4
  Masters       42  2.76  1.358      .210        2.34          3.19          1     5
  Total         67  2.84  1.310      .160        2.52          3.16          1     5
Am satisfied with security of the web-based e-learning system
  Diploma       6   2.00  .000       .000        2.00          2.00          2     2
  Degree        19  2.16  1.259      .289        1.55          2.76          1     4
  Masters       42  3.36  1.186      .183        2.99          3.73          1     5
  Total         67  2.90  1.293      .158        2.58          3.21          1     5
The web-based e-learning system is useful in teaching
  Diploma       6   1.50  .548       .224        .93           2.07          1     2
  Degree        19  2.89  1.487      .341        2.18          3.61          1     5
  Masters       42  2.76  1.246      .192        2.37          3.15          1     4
  Total         67  2.69  1.317      .161        2.37          3.01          1     5
Am satisfied with the usability of the web-based e-learning system
  Diploma       6   1.50  .548       .224        .93           2.07          1     2
  Degree        19  3.11  1.150      .264        2.55          3.66          2     5
  Masters       42  2.62  1.268      .196        2.22          3.01          1     4
  Total         67  2.66  1.250      .153        2.35          2.96          1     5

157
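The 95% confidence interval columns in the descriptive-statistics tables above follow the standard t-based formula for a mean estimated from a sample: mean ± t(0.975, N−1) × SD/√N. The following is a minimal Python sketch, not part of the original analysis, that reproduces one reported row (Female responses to the system-interactivity item: N = 28, mean 2.43, SD 1.168); the critical value 2.052 for 27 degrees of freedom is taken from standard t tables.

```python
import math

def ci_95(mean, sd, n, t_crit):
    """95% confidence interval for a mean from summary statistics:
    mean +/- t_crit * (sd / sqrt(n))."""
    se = sd / math.sqrt(n)   # standard error of the mean
    half = t_crit * se       # half-width of the interval
    return se, mean - half, mean + half

# Female responses to "Generally am satisfied with system interactivity"
# (N = 28, mean 2.43, SD 1.168; t(0.975, df = 27) ~ 2.052 from t tables)
se, lo, hi = ci_95(2.43, 1.168, 28, 2.052)
print(round(se, 3), round(lo, 2), round(hi, 2))
# -> 0.221 1.98 2.88, matching the reported .221, 1.98 and 2.88
```

The same formula, with the appropriate t critical value for each group's degrees of freedom, reproduces the remaining interval columns in both tables.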
Appendix 6: Budget

PARTICULARS                                     ESTIMATED COST (Ksh.)
Stationery                                      20,000
Photocopying and Binding                        30,000
Data collection                                 40,000
Data analysis                                   12,000
Travelling expenses                             15,000
Miscellaneous expenses                          25,100
Internet services                               8,700
Conferences, Publications, Books and Journals   100,000
Total Expenditure                               250,800.00
158
Appendix 7: Research Proposal Approval
159
Appendix 8: Research Authorization Letter
160
Appendix 9: Research Permit
161