Pattern-Based Usability Evaluation of E-Learning Systems

Dmitry Zub, Erki Eessaar
Department of Informatics, Tallinn University of Technology, Raja 15, 12618 Tallinn, ESTONIA
[email protected], [email protected]

Abstract - Each pattern describes a widely used and generally accepted solution to a recurring problem. In this paper, we propose a novel pattern-based usability evaluation method. It uses the Analytic Hierarchy Process (AHP) in order to obtain numerical results from evaluations of different systems and to make the evaluation as objective as possible. We also present the results of a case study about the usability of several e-learning systems.

I. INTRODUCTION

This work has two main themes: e-learning and usability. E-learning is one kind of distance learning. Distance learning can be described as "education that takes place independent of location, in contrast to education delivered solely in the classroom, and that may be independent of time as well" [1]. This is achieved by delivering the learning material via different means of instruction: at first by mail correspondence, later by radio, TV, and CD-ROM discs. The next stage of this evolution is web-based delivery of learning materials. E-learning itself can be defined as "a wide set of applications and processes, such as web-based learning, computer based learning, virtual classrooms and digital collaboration" [2]. The material is delivered through the Internet, audio- and videotapes, and other means.

The systems in our area of interest are web-based learning management systems. Some of their main functions are creating courses, registering students, and tracking the progress of students and assignments. Apart from the functionality of e-learning systems, however, there is another significant issue: a system should be easy enough to use that potential students do not spend too much time finding out how to use its different functions.

The first goal of this paper is to present a novel usability evaluation method. It was initially developed in an attempt to evaluate the usability of some well-known international e-learning systems in comparison with the e-learning system Maurus that was created at Tallinn University of Technology (TUT). The second goal is to present the results of this evaluation.

In this study, we use the action research methodology [3], which requires active participation of the investigators in the practice. The process of action research consists of a potentially unlimited number of iterations of building a theory and practicing it. Theory (in this case the usability evaluation method) is a structure emerging from practice. It should be continuously controlled and improved by the practice.

In this paper, we propose a usability evaluation method (theory) that has emerged from the usability evaluation of e-learning systems (practice).

The organization of this paper is as follows. In section II we give an overview of some usability terms. The following section shortly describes the Analytic Hierarchy Process (AHP), which allows us to analyze systems and compare them with each other. Patterns are the criteria used in the proposed usability evaluation method; therefore, section IV gives an overview of the main principles of patterns. The next section gives an overview of some existing usability evaluation methods. Next, we present the results of a usability evaluation of some e-learning systems. Firstly, we explain why patterns were selected as criteria for usability evaluation. Secondly, the patterns used in our analysis are described. Thirdly, the results of the usability evaluation performed in our study are presented. In section VII the general method for usability evaluation is described. Next, we describe further development of the method. Finally, we draw some conclusions.

II. USABILITY

Researchers have proposed many overlapping definitions of usability [4]. For instance, ISO 9126 [5] defines software usability as "the capability of the software product to be understood, learned, used and attractive to the user, when used under specified conditions". This definition refers to different aspects that must be considered in the context of usability. Folmer and Bosch [4] give an overview of different proposals of these aspects. Some of the most frequently recurring are the following:

Ease of learning - How fast can a user who has not seen a user interface before learn it well enough to accomplish basic tasks?
Efficiency of use - Once an experienced user has learned to use a system, how fast can he/she accomplish tasks?
Memorability - If a user has used a system before, can he/she remember enough to use it effectively after some time has passed, or does the user have to start learning from scratch?
Error frequency and severity - How often do users make errors while using a system, how serious are these errors, and how do users recover from them?
Subjective satisfaction - How much does the user like a system?

In this paper, we propose that for analyzing the usability of systems (including e-learning systems) we can use the Analytic Hierarchy Process and patterns. Therefore, in the next sections we shortly introduce these concepts.

III. ANALYTIC HIERARCHY PROCESS

The Analytic Hierarchy Process (AHP) [6] allows us to make decisions by modeling a complex problem as a hierarchical structure. The levels of this model, beginning from the top, are:

Goal - The overall objective. The goal is connected to some kind of choice, for example, "Find the best e-learning system in terms of usability".
Objectives - What are we trying to achieve while choosing between e-learning systems? What are the main criteria that we should take into account while making our choice? For example, the usability aspects could (in theory at least) be used while evaluating e-learning systems.
Alternatives - The objects between which the choice will be made. In our case these are the e-learning systems that we analyze.

The next step after building the model is to compare the elements. Firstly, we should compare the objectives in order to find their importance for achieving our goal. After that we should compare how well the alternatives correspond to each objective. AHP uses pairwise comparisons in order to make the evaluation process easier for the human expert who must perform it. Elements of the same level (goals, objectives, or alternatives) are compared with each other, with respect to one element of the higher level, in a single matrix. The scale for comparing the elements is presented in Table I.

TABLE I
COMPARISON SCALE OF THE AHP

Numerical value    Verbal scale of importance
1                  Equal
3                  Moderate
5                  Strong
7                  Very strong
9                  Extreme
2, 4, 6, 8         Used to compromise between two judgments

After all the comparisons are made, the geometric mean of the comparison grades is calculated for each compared element, and the weight of each element is calculated so that the sum of the weights is equal to 1. After combining all the weights, the final evaluation values for the alternatives are found.

IV. PATTERNS

The idea of patterns was first used in the context of civil architecture [7]. Later, patterns became popular in other areas like object-oriented software design [8], specification of databases [9], design of user interfaces [10, 11], and pedagogy [12, 13]. Every pattern describes a solution to some kind of recurring problem. The use of patterns helps us to avoid reinventing solutions over and over again; instead, it allows us to concentrate on accumulating new knowledge.

There are different formats that can be used to specify a pattern. Generally, however, the name of a pattern gives some kind of hint on how to solve the problem. The body of a pattern consists of at least three main parts. Firstly, the context (background or situation) where the pattern can be used is described. Then the problem that appears in that context is described. The final part is a solution to the problem, together with a discussion on how to implement the solution.

V. METHODS OF USABILITY EVALUATION

In this section, we shortly describe the best known of the many existing usability evaluation methods. These methods can be divided into several categories, the best known of which are usability inspection [14, 15] and usability testing [15]. Usability inspection is the process of evaluating a system by experts in design, usability, and other relevant fields, but without the participation of the potential end users of the system. In usability testing, users are given some specific tasks to accomplish and their actions are recorded in some way. In addition, users can give feedback on how easy or hard it was to accomplish the tasks.

Some usability inspection methods specify actions that are similar to the actions of the proposed approach. These methods are shortly described below [15, 16, 17]:

Heuristic evaluation - Usability experts judge whether the elements, features, or functions of a system follow certain usability principles that are called heuristics.
Cognitive walkthrough - A technique where task scenarios are first created and then played through by usability experts. The experts must pay attention to the possible problems that a potential user could have while accomplishing the task.
Formal usability inspection - A procedure with strictly defined roles and actions. It combines the first two above-mentioned methods.
Feature inspection - Used for checking a set of features that are used for accomplishing some usual task (e.g., to submit a solution to an exercise).
Consistency inspection - Designers from other related projects (e.g., some products in a set of software) check whether a system acts the same way as their projects.
Standards inspection - Used for checking a system against the established standards (rules) that are accepted in the area.

VI. USABILITY EVALUATION OF E-LEARNING SYSTEMS

In this section, we describe the results of the usability evaluation of a set of e-learning systems. The general idea of the approach is that we compare the system under evaluation with other similar systems in terms of patterns, by using AHP. We use usability and e-learning patterns as criteria in performing AHP. We present a generalized specification of the approach in section VII.

A. The Choice of Criteria of AHP

As mentioned in section III, AHP allows us to make a choice between some alternatives. The result depends on how well the alternatives correspond to the selected criteria. Therefore, we have to pick criteria in order to use AHP. Two main options were considered while choosing the criteria. One option was to use usability factors (ease of learning, efficiency of use, etc. - see section II). The other was to use usability patterns. The disadvantage of usability factors is that they are very abstract. For instance, it is quite hard to evaluate whether one system is easier to learn than another. The patterns, on the contrary, are more specific, and it is easier to make an evaluation based on them. Another advantage is that the patterns are specific enough to be used as guidelines that help designers fix usability problems. Therefore, it was decided to use web usability patterns [11] for judging usability with the help of AHP. In addition, we decided to use some domain-specific patterns about e-learning [13].

The idea of using patterns in the context of usability evaluation is not completely new. Schmettow [14] describes an approach that uses usability patterns as heuristics in the heuristic evaluation method. The approach proposed in this paper additionally uses domain-specific patterns (e-learning patterns). Folmer et al. [18] propose to use patterns in order to create a software architecture that supports usability. However, we do not know of a method or a system that uses patterns as the criteria for usability evaluation and combines them with AHP.

B. The Selected Patterns

Below is the list of the selected patterns together with a short explanation of each. We also give the initialisms of the patterns that are used in Table II. In this research, we did not use any formal method to select this set of patterns from the set of all possible patterns. We made the selection based on our experience; in our view, these patterns represent the main principles of usability.

The web usability patterns were:

Two Years Old Browser (2YB) - A user who has somewhat outdated software or hardware should still be able to use the system.
Site-map (SM) - There should be a possibility to see the site structure in order to find the necessary information as quickly as possible.
Feedback (F) - A user should get feedback on whether his/her actions were successful or not. If an action failed, then the system should specify what went wrong.
Sense of location (SL) - A user should see in what part of a website or a system he/she is. Also, when some complex process is executed, it is good to know what steps are already done and what steps are yet to be completed.

Follow standards (FS) - The user has his/her own experience of working with similar web sites, and therefore well-known conventions and rules must be followed. That makes the system easier to learn.

We decided to use the following e-learning patterns:

Interactive elements (IE) - A set of learning activities should be implemented in the e-learning system. They should enable active participation and interaction between students and teachers.
Project based e-learning (PBL) - The e-learning system should have tools that make project-based learning more effective; examples are a project proposal, a market, or a diary. This pattern was selected because project-based learning is quite popular (it is also actively used in our university) and has been shown to produce good results [19].
Collect feedback (CF) - A system should allow teachers to collect feedback about the course or the use of the software. This helps a teacher to improve the course; feedback about using the system is also useful for usability evaluation.
Team workspace (TW) - Projects are usually done in groups. Therefore, each team should have its own space for storing their joint work.
Computer-mediated communication (CMC) - In e-learning, one of the most important ways of getting information is to communicate with the teacher and fellow students. The system should provide tools for that.

C. The Alternatives

One of the practical goals of the project was to assess the usability of one of the e-learning systems of Tallinn University of Technology, called Maurus. It is one of several systems that have been created by university staff in order to achieve a flexible and easily manageable e-learning system. It is used in courses about database and information-system design. It is a learning management system that allows lecturers to create a course, distribute learning materials and references to interesting web pages, post messages to a message board, publish exercises and collect answers to them, and organize the registration of students for exams.

Along with this system, the following international e-learning systems were evaluated in the analysis: Claroline, FLE3, and OLAT. This set of three systems was chosen for the following reasons. Firstly, all of them are open-source systems, which has some advantages: university staff have the possibility to extend and improve open-source systems if necessary, and, compared to commercial systems, it is much easier to get access to them and start testing. The second requirement was that the systems should have at least the same features as Maurus.

The Claroline project was initiated in 2000 at the Catholic University of Louvain (Belgium) by Thomas De Praetere [20]. It has course management capabilities such as hosting courses, providing documents and links, and participation in forums. One special feature of the system is the possibility to create a wiki for a course. A wiki

[21] is a group of web pages that users can create and edit collaboratively.

FLE3 (Future Learning Environment) is developed by the Learning Environments for Progressive Inquiry Research Group, UIAH Media Lab, University of Art and Design Helsinki [22]. This system is not a typical course management system; it is a web-based learning environment created for computer-supported collaborative learning [19]. It contains special learning tools like WebTops, Knowledge building, and Jamming. For example, Jamming provides a shared space for the collaborative creation of digital content (pictures, text, audio, and video).

OLAT (Online Learning And Training) is a web-based open-source Learning Management System (LMS) / Learning Content Management System (LCMS). The initial development started at the University of Zurich, Switzerland [23]. It is a course management system with some interesting differences. For example, a student does not register for a course but enrolls in a learning group; after that, courses are assigned to particular learning groups.

D. The Results

The analysis was performed by one usability expert.

Step 1: We compared the two types of patterns (web usability and e-learning) and found the weights of the pattern types.
Step 2: Within each pattern type, we compared the patterns of that type with each other and found the initial weights of the patterns.
Step 3: We compared the 4 e-learning systems in terms of every selected pattern and found the initial weights of the systems. The results are in the rows "Maurus", "Claroline", "FLE", and "OLAT" in Table II.
Step 4: We multiplied the weight of each pattern (from step 2) by the weight of its pattern type (from step 1). The results are in the row "Pattern Weight" in Table II.
Step 5: The final weight of each system (alternative) was calculated by using (1).

p1*s1 + p2*s2 + ... + pi*si + ... + pn*sn    (1)

Here n denotes the number of patterns (criteria), pi denotes the final weight of the i-th pattern (from step 4), and si denotes the weight that the system received in the comparison matrix related to the i-th pattern (from step 3). The final weight of each system is presented in the column "Sum" in Table II.
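To make the computation behind steps 1-5 concrete, here is a minimal Python sketch (our illustration, not part of the original study): it derives priority weights from a pairwise comparison matrix by the geometric-mean method described in section III and then applies formula (1). The judgment matrix and the system scores below are hypothetical examples, not the values used in the study.

import numpy as np

def ahp_weights(matrix):
    # Geometric mean of each row, normalized so that the weights sum to 1,
    # as described in section III.
    m = np.asarray(matrix, dtype=float)
    gm = m.prod(axis=1) ** (1.0 / m.shape[0])
    return gm / gm.sum()

# Hypothetical pairwise judgments of three criteria on the 1-9 scale of Table I:
# criterion A is moderately more important than B (3) and strongly more than C (5).
p = ahp_weights([[1.0, 3.0, 5.0],
                 [1/3, 1.0, 2.0],
                 [1/5, 1/2, 1.0]])

# Hypothetical per-criterion weights of three systems (one AHP comparison per criterion):
# row i holds the weights s_i of the systems under criterion i.
s = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])

# Formula (1): final weight of each system = sum over i of p_i * s_i.
final = p @ s
print(p.round(3), final.round(3))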

The usability expert decided that the level of importance of the e-learning patterns is moderately higher (see Table I) than that of the usability patterns. According to the analysis, FLE is the best system of the selected set in terms of conformance to the selected patterns; therefore, it has the highest score. The next two systems have a virtually even score. Both have many interesting and useful features; however, Claroline has a slightly higher score since it is more traditional in its design and, in our opinion, easier to learn. The Maurus system has the lowest score because it is not as feature-rich and has some usability problems.

The practical result of this study was a set of recommendations for improving the usability of the Maurus system. The results also show that in-house development of an e-learning platform, instead of using an open-source system, has not yet led us to the best possible results.

VII. GENERAL METHOD OF ANALYSIS

In this section, we propose a novel method for performing the usability evaluation of a system. It was created as a generalization of the usability evaluation study presented in the previous section, and it allows usability experts to repeat this process in order to evaluate other systems. The process consists of 8 steps:
1. Selection of systems (alternatives) and patterns (criteria) for the evaluation.
2. Analysis of the patterns that are selected for the evaluation.
3. Writing a specification of tasks, and evaluation based on these tasks.
4. Analysis of the results of the tests.
5. Calculation of the final weights for every system.
6. Identification and solving of the main problems that were discovered during the testing.
7. Retest of the system.
8. Analysis of the patterns.
Next, every step of this method is described and discussed in more detail.

Step 1: Selection of systems and patterns. If the system that is analyzed is still under development, then it is useful to study similar systems. If the system already exists, then it is still important to find out what improvements we can make to it. In addition, it is valuable to know the opinion of the future users of the system (teachers and students in the case of e-learning systems).

TABLE II
WEIGHTS OF PATTERNS AND E-LEARNING SYSTEMS

                 2YB     SM      F       SL      FS      IE      PBL     CF      TW      CMC     Sum
Pattern Weight   0,013   0,028   0,098   0,037   0,074   0,328   0,153   0,046   0,099   0,123
Maurus           0,391   0,060   0,067   0,066   0,053   0,065   0,078   0,080   0,053   0,084   0,072
Claroline        0,067   0,413   0,391   0,147   0,488   0,170   0,201   0,357   0,218   0,233   0,246
FLE              0,391   0,380   0,391   0,500   0,241   0,595   0,520   0,157   0,488   0,138   0,438
OLAT             0,151   0,147   0,151   0,288   0,218   0,170   0,201   0,406   0,241   0,545   0,244

Fig. 1. A domain model of the proposed method. [The figure is a UML class diagram relating System (name, version), E-learning system, Alternative, Evaluation, Usability evaluation, Task, Party, Participant role, Criterion, Possible criterion, Pattern (with Usability-pattern and E-learning pattern subtypes), Quality characteristic, Recommendation, and pattern classification schemes (Pattern classification scheme, Classifier in scheme); its relationships are described in the text below.]

A way of getting the opinions of potential users is to create a list of the patterns that could be used in the analysis and discuss it with them in order to find out which patterns they consider the most important.

Step 2: Analysis of the criteria is a usual step in AHP. In our approach the criteria for the evaluation are patterns. Since we have two different types of patterns (web usability and e-learning), we have to do the criteria comparisons in two steps. Firstly, we determine the relative importance of the different types of patterns in the context of the study by using AHP. After that, we compare the patterns of the same type with each other. The patterns can come from different fields (like usability and e-learning in our case); therefore, different experts should discuss the importance of the different types of patterns.

Step 3: In order to evaluate all the important features of a system, it is useful to write down the evaluation tasks. If all the tasks are specified, then the evaluation takes less time than when the evaluator has to look through all the functions of the system. In addition, with a list of tasks we can easily repeat the evaluation: reevaluation allows us to make sure that the changes to the system have improved the situation, and we can reuse the specification when evaluating similar systems in the future. After everything is ready, we can carry out the evaluation according to the plan. A possible shape of such a task specification is sketched below.
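As an illustration, the evaluation tasks of step 3 could be recorded as simple records that link each task to the patterns it exercises. The structure and the sample tasks below are our assumptions; the method does not prescribe a particular format.

from dataclasses import dataclass

@dataclass
class EvaluationTask:
    description: str   # what the evaluator has to accomplish
    patterns: list     # initialisms of the patterns that the task exercises (see Table II)

# Hypothetical task list for an e-learning system evaluation.
tasks = [
    EvaluationTask("Submit a solution to an exercise and check the confirmation message", ["F", "FS"]),
    EvaluationTask("Find the course syllabus starting from the front page", ["SM", "SL"]),
    EvaluationTask("Create a shared workspace for a project team", ["TW", "PBL"]),
]

for task in tasks:
    print(task.description, "->", ", ".join(task.patterns))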

We would like to point out that the activities we used for the evaluation have much in common with the methods mentioned in section V. We evaluate the systems to see whether they follow specific rules (patterns), as in heuristic evaluation. The recommendation to write down the tasks corresponds to cognitive walkthrough. In our evaluation, we tried to evaluate the main features of the systems, as in feature inspection. The inclusion of the "Follow standards" pattern in the set of criteria shows that we have done some standards inspection as well.

Step 4: The next step is to analyze the results of the evaluations and to decide how well each pattern is implemented in each of the systems.

Step 5: If we compare several systems with each other, then we can calculate the final weights of the systems according to AHP.

Step 6: Find the patterns whose implementation is the worst. If we have a working system, then we can think about how to improve the situation and solve the problems.

Step 7: Run the evaluation from step 3 again in order to estimate whether the situation has improved.

Step 8: This step can be either the last step of the analysis or the first step of a new iteration. During this step we should decide which patterns were useful and which were not. If some patterns proved less useful than we thought, then we replace them with other patterns in the first step of a new iteration.

There are two possibilities for iterating in this method. One is to repeat the evaluation and the following analysis until the results are good enough (steps 3 to 7). The other is to repeat the entire process after one iteration (steps 1-8) is completed.

Fig. 1 presents a domain model of the proposed approach. Each evaluation uses one or more alternatives and one or more criteria. The alternatives are systems; one or more alternatives are the objects of study, the quality of which we want to evaluate. An e-learning system is a kind of system. The criteria are either patterns or elements of a pattern classification scheme. Usability patterns and e-learning patterns are kinds of patterns. Each evaluation consists of one or more tasks, and there can be dependencies between tasks. A usability evaluation is a kind of evaluation. Each evaluation has one or more associated parties who participate in the evaluation in some role. Based on the results of executing the tasks, it is possible to perform pairwise comparisons of criteria (possibly in terms of another criterion) and pairwise comparisons of alternatives in terms of each criterion. The use of a pattern can help us to improve one or more quality characteristics of a system. An evaluation can produce a recommendation about the use of a pattern in a system. Quality characteristics can form a hierarchy [5].

Each pattern is characterized by zero or more classifiers, and each classifier belongs to exactly one classification scheme. A pattern can be associated with more than one classifier belonging to the same classification scheme. Examples of classification schemes of patterns are Domain, Purpose, and Scale; for instance, the domain of a pattern could be telecommunication, medicine, or e-learning. In addition, each system is associated with zero or more classifiers. If a system is associated with a classifier, then the system can take advantage of the patterns that have this particular classifier. Based on this information we can find the criteria (patterns) that we can use in order to evaluate a system.
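To make the classifier-based selection of criteria concrete, the following Python sketch models a small fragment of the domain model in Fig. 1. The class names mirror the figure, while the matching function and the sample data are our illustrative assumptions.

from dataclasses import dataclass

@dataclass(frozen=True)
class Classifier:
    scheme: str   # classification scheme, e.g. "Domain"
    name: str     # classifier in that scheme, e.g. "e-learning"

@dataclass
class Pattern:
    name: str
    classifiers: frozenset

@dataclass
class System:
    name: str
    version: str
    classifiers: frozenset

def candidate_criteria(system, patterns):
    # A pattern is a possible criterion for a system if they share a classifier.
    return [p for p in patterns if p.classifiers & system.classifiers]

# Illustrative data: a "Domain" scheme with "e-learning" and "web" classifiers.
elearning = Classifier("Domain", "e-learning")
web = Classifier("Domain", "web")
patterns = [Pattern("Team workspace", frozenset({elearning})),
            Pattern("Site-map", frozenset({web})),
            Pattern("Singleton", frozenset({Classifier("Domain", "software design")}))]
maurus = System("Maurus", "1.0", frozenset({elearning, web}))
print([p.name for p in candidate_criteria(maurus, patterns)])  # the first two patterns match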

VIII. FUTURE DEVELOPMENT OF THE METHOD

This method can work as a general pattern-based analysis method for systems. The goal is to create an information system that allows us to describe the systems that we want to analyze (in terms of usability, functionality, or any other aspect) and to manage the results of the analysis. For example, if we take another system and analyze it based on some other aspect, then we should change:
• the systems that we analyze;
• the patterns that are the criteria;
• the tasks that must be completed during the analysis.
We can change the systems, use other kinds of patterns, and put together new tasks (see Fig. 1), but the method itself does not change. In this general system, it should be possible to describe different types of systems, patterns, and tasks. The creation of that system is the goal of our future studies.

IX. CONCLUSIONS

In this paper, we proposed a novel usability evaluation method that uses patterns and the Analytic Hierarchy Process. We applied this method in order to evaluate 4 e-learning environments, using 5 usability patterns and 5 e-learning patterns in the analysis. The result of the evaluation is that the Future Learning Environment is the best e-learning system among the systems that were under consideration. Based on this study, we created a generalized specification of the method: we described the steps of the method and presented a domain model.

REFERENCES
[1] L. Neal and D. Miller, The Basics of E-Learning: An Excerpt from Handbook of Human Factors in Web Design. Retrieved September 21, 2007, from http://elearnmag.org/subpage.cfm?section=tutorials&article=20-1
[2] E-learning // Learning Circuits Glossary. Retrieved September 21, 2007, from http://www.learningcircuits.org/glossary.html
[3] D. Avison, F. Lau, M. Myers, and P. A. Nielsen, "Action Research," Commun. ACM, Vol. 42, Issue 1, pp. 94-97, Jan. 1999.
[4] E. Folmer and J. Bosch, "Architecting for usability: a survey," Journal of Systems and Software, Elsevier, pp. 61-78, 2002.
[5] ISO 9126-1, 2000. Software Engineering - Product Quality - Part 1: Quality Model.
[6] T. L. Saaty, "How to Make a Decision: The Analytic Hierarchy Process," European Journal of Operational Research, Vol. 24, No. 6, 1990.
[7] C. Alexander, M. Silverstein, S. Angel, S. Ishikawa, and D. Abrams, The Oregon Experiment, Oxford University Press, New York, 1975.
[8] C. Larman, Applying UML and Patterns: An Introduction to Object-Oriented Analysis and Design and the Unified Process, 2nd edn., Upper Saddle River, USA: Prentice Hall, 2002.
[9] D. C. Hay, Data Model Patterns: A Metadata Map, Morgan Kaufmann Publishers, 2006.
[10] D. K. van Duyne, J. A. Landay, and J. I. Hong, The Design of Sites: Patterns for Creating Winning Web Sites, 2nd edn., Prentice Hall, 2006.
[11] Web Usability Patterns. Retrieved September 21, 2005, from http://www.trireme.com/WU
[12] J. Bergin, "Fourteen Pedagogical Patterns," in Proceedings of the Fifth European Conference on Pattern Languages of Programs, July 5-9, 2000, Irsee, Germany.
[13] M. Derntl, Patterns for Person-Centered e-Learning, PhD Thesis, Faculty of Computer Science, University of Vienna, 2005.
[14] M. Schmettow, "Towards a Pattern Based Usability Inspection Method for Industrial Practitioners," in Proceedings of the IFIP WG 2.7/13.4 INTERACT 2005 Workshop on Integrating Software Engineering and Usability Engineering, pp. 16-21, 2005.
[15] R. Jeffries, J. R. Miller, C. Wharton, and K. Uyeda, "User interface evaluation in the real world: a comparison of four techniques," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: Reaching Through Technology, ACM Press, New York, NY, pp. 119-124, 1991.
[16] The Usability Methods Toolbox by James Hom. Retrieved September 21, 2005, from http://jthom.best.vwh.net/usability/usahome.htm
[17] J. Nielsen, "Usability inspection methods," in Conference Companion on Human Factors in Computing Systems, pp. 377-378, May 7-11, 1995.
[18] E. Folmer, J. van Gurp, and J. Bosch, "Software Architecture Analysis of Usability," LNCS Vol. 3425, Springer-Verlag, pp. 38-58, 2005.
[19] D. Uribe, "The effect of computer-mediated collaborative learning on solving ill-defined problems," Educational Technology Research and Development, Vol. 51, No. 1, Springer Boston, pp. 5-19, 2003.
[20] Claroline.net - Open Source eLearning. Retrieved November 28, 2005, from http://www.claroline.net/
[21] Wiki // Wikipedia, the free encyclopedia. Retrieved November 28, 2005, from http://en.wikipedia.org/wiki/Main_Page
[22] Fle3 is a Learning Environment! - Fle3 CSCL Software. Retrieved November 28, 2005, from http://fle3.uiah.fi/index.html
[23] Open Source LMS OLAT | About OLAT. Retrieved November 28, 2005, from http://www.olat.org/public/index.html
