2008-2009 Course Management System (CMS) Committee Report
4/27/2009
2008-2009 COURSE MANAGEMENT SYSTEM (CMS) COMMITTEE REPORT

CMS Committee Members:
Alex Angerhofer, Ph.D., Associate Professor of Chemistry
Diane Beck, PharmD, Professor of Pharmacotherapy & Translational Research and Interim Associate Dean for Curriculum & Assessment
Darral Brown, DCE Instructional Designer
Pamela Dickrell, Ph.D., Director of UF EDGE
John Dobson, Ph.D., Senior Lecturer, Applied Physiology and Kinesiology
Joanne Foss, Ph.D., Clinical Assistant Professor of Public Health & Health Professions and Associate Dean for Student & Academic Affairs
Randy Graff, Ph.D., Health Science Center
Rae Jesano, M.S., Assistant University Librarian
Doug Johnson, Ph.D., Academic Technology
Michael Legrande, Department of Psychology, College of Liberal Arts and Sciences
Tawnya Means, M.S., Director of Instructional Support, College of Business
Kathleen Price, Associate Dean for Library and Technology, College of Law
David Quinn, Ph.D., Assistant Professor of Educational Administration and Policy
Richard Rathe, M.D., Associate Professor of Family Medicine and Associate Dean for Information Technology Office
Mark Rieger, Ph.D., CALS Curriculum Enhancement Committee
Marilyn Roberts, Ph.D., eLearning and Distance Ed Committee
Craig Tapley, Ph.D., Senior Lecturer of Finance, College of Business
EXECUTIVE SUMMARY

The current UF Course Management System (CMS) contract will end in October 2010, and support for the product will cease in October 2011. In Fall 2008, a committee was convened to seek a replacement CMS. The following criteria were used to evaluate the CMSs: 1) Core Functionality, 2) Usability, 3) Technical, 4) Integratability, 5) Migration, 6) Evolving Pedagogy, 7) Total Cost of Ownership, and 8) Longevity/Ecosystem.

The Committee's work was completed in four phases. During the first two phases, the following CMSs were evaluated using a mixed-methods approach: 1) Angel 7.3, 2) Moodle 1.9, and 3) Sakai 2.5. This methodology involved campus-wide input via faculty focus groups, a student survey, a faculty and staff survey following vendor presentations, and hands-on focus group sessions. At the end of Phase II, the Committee deliberated using these data and ranked Angel 7.3 the highest, with Sakai 2.5 a close second. Moodle 1.9 clearly ranked lowest. At this point, the Committee narrowed the options to Angel and Sakai. During these initial deliberations it was realized that both Angel and Sakai were planning new releases. Therefore, during Phase III, the Committee performed a head-to-head comparison of these new versions (Angel 7.4 vs. Sakai 3.0). Due to the short timeline, the committee gathered data comparing the two by interviewing other institutions that were beta-testing these versions and by developing comparison tables. The committee also weighed the pros and cons of an open-source versus a closed-source (proprietary) CMS.

During final deliberations (Phase IV), the Committee determined that Sakai 3.0 ranked highest based on the following criteria: 1) Technical, 2) Migration, 3) Longevity/Ecosystem, and 4) Total Cost of Ownership. Angel 7.4 and Sakai 3.0 were tied with respect to Integratability and Core Functionality. Although Angel 7.4 ranked higher in Usability and Evolving Pedagogy, the committee noted that few UF faculty expressed interest in the features considered under Evolving Pedagogy. Based on these findings, it was a unanimous committee decision to recommend Sakai 3.0 as the next CMS for UF.

Since Sakai 3.0 is an open-source system, if it is adopted UF will need to commit resources to build the infrastructure (programming staff and other IT resources) needed to support the system. In addition, a new culture must be developed that allows for UF community engagement in deciding priorities for development of new tools and versions. This will require a governance structure for Sakai 3.0 that is formal and recognized within the University. There are significant savings (approximately $300,000 per year) to be gained from outsourcing the hosting of Sakai. The committee recognizes that this is not part of its recommendation; however, this option should be considered.
Summary of Recommendations
1. Implement and fully fund, with a line item in the UF Budget, Sakai as the replacement for the current E-Learning System.
2. Remain and actively participate as a member of the Sakai community.
3. Build the infrastructure at UF that is necessary for an open source system.
4. Develop and fund a governance structure for Sakai that is formal and institutionally recognized within the University.
5. Extend our current Blackboard license through 2011 to allow for development and migration.
6. Implement the Sakai Portfolio System.
7. Fully fund a migration team to facilitate the movement from Blackboard Vista to Sakai.
For More Information
Please read the entire 2008-2009 Course Management System (CMS) Committee Report or contact the committee Co-Chairs, Diane Beck, PharmD ([email protected]), and Randy Graff, PhD ([email protected]), for additional information.
Contents

Introduction
Initial Deliberations
Methods
  Methodology Overview
  Phase I
    Initial Focus Group Sessions
    Student Survey
  Phase II
    Vendor Presentations
    Hands-on Sessions
  Phase III
    Criteria for Comparing Systems
    Selection and Comparison of Finalists
  Phase IV
    Committee Deliberations and Recommendations
Results
  Data Sources
  Phase I
    Initial Focus Group Sessions
    Student Survey
  Phase II
    Vendor Presentations
    Hands-On Sessions
  Phase III
    Comparison of Systems
  Phase IV - Committee Deliberations
    Deliberations Comparing Angel 7.4 and Sakai 3.0
    Proprietary versus Open Source Systems
    Definitions
Recommendations
  Rationale
  Need to Build Infrastructure and Culture
    Infrastructure
    Governance and Culture
    Need for Membership in the Sakai Organization
    How Membership Aligns with the UF Mission
  Recommended Timeline for Adoption
  Short-term Recommendations
Appendix I. Discussion Questions Used During the Initial Faculty and Staff Focus Group Sessions
Appendix II. Survey Administered to Students
Appendix III. Survey Administered During the Vendor Presentations
Appendix IV. Classification of the Responding Students
Appendix V. College Affiliation of Responding Students
Appendix VI. Student Perceptions about the Value of Features in the Current E-Learning System
Appendix VIII. Representative Student Comments to Open-ended Questions About How to Improve the Current E-Learning System
  Access
  Browser
  Features
  Instructor-related
  Mac-friendly
  Stability
  User-interface
Appendix IX. Vendor Presentations - Respondent Ratings about Functions, Ease of Use, and Quality of the Presenter
Appendix X. Ratings of the 9 Core Functions by Participants at the Vendor Sessions
Appendix XI. Participant Ratings on Usability
Appendix XII. Comparison Table
References
INTRODUCTION

The University of Florida's license for the E-Learning system expires in 2010. In addition, Blackboard, the vendor for the University of Florida's course management system, has announced that support for the current system will end in 2011. Blackboard also offered no options for migration of the current E-Learning system to their new version. A full review of UF's options for a Course Management System and related services was last conducted in 2003. Since that report, there have been significant enhancements in proprietary and open source software. During this time there has also been a maturation of the open source community and an evolution of Web 2.0 technologies, which have enhanced pedagogical options for learning. Recognizing all of these issues, Professor Fedro Zazueta, Director of Academic Technology, formed a multidisciplinary Course Management System Review Committee to look at a five-year horizon and charged them to:

• In the mid-term:
  1. Review the University of Florida's strategic planning documents to ensure continued alignment of the E-Learning system and services with institutional priorities.
  2. Assess the current UF E-Learning system and identify future needs.
  3. Re-evaluate the recommendation to continue using our current E-Learning system.
  4. Recommend a new Course Management System (CMS) for the UF E-Learning system and the associated set of services.

• In the short term:
  1. Identify trailing services that can be discarded.
  2. Identify services that need to be improved.
  3. Recommend new services that need to be implemented.
The Committee met weekly between October 2008 and April 2009 and accomplished these charges; this report addresses each of them.
INITIAL DELIBERATIONS

During Fall Semester 2008, the Committee focused on the mid-term charges. Specifically, they addressed the first three charges and developed the methodology for accomplishing the fourth charge, which was to recommend a new CMS for the UF E-Learning system and the associated set of services.

First, the committee reviewed UF's 2007 strategic work plan and identified two goals that specifically align with the E-Learning system and services: 1) Continue to develop strategies to expand student access to educational programs through distance education (Goal 14), and 2) Review IT needs and develop a state-of-the-art IT infrastructure to support faculty and students, with emphasis on increasing, from the standpoint of the end-user, the compatibility of IT units, while maintaining their integrity (Goal 27).

With respect to Goal 14, to successfully expand student access to educational programs through distance education, a robust CMS is needed. Such a system not only provides access to teaching resources such as the syllabus, lectures, and assessments, but also enables community and collaboration among students and the instructor. Course management systems that include or allow interfaces with Web 2.0 applications such as blogs, wikis, and social networking will promote the development of quality distance-education courses.

With respect to Goal 27, the CMS must be state-of-the-art and compatible with other IT units within UF. The Committee did an environmental scan of course management systems and academic technologies and quickly realized that the current UF E-Learning system lacked many of the features available in systems currently being marketed. In particular, committee members identified how improved interaction and collaboration tools could enhance their teaching. Specifically, many new systems allow for enhanced statistics and reporting of student performance, as well as including wikis, blogs, and podcasting.

The Committee also gathered input from faculty and staff (Charge #2) by conducting focus group sessions during Fall 2008 and Spring 2009. A student survey was also administered during the Spring.

With respect to Charge #3, we did not have the option to stay with the current E-Learning system. Specifically, the provider of the current E-Learning system, Vista (developed by WebCT but now owned by Blackboard), indicated that they were no longer going to support that product. Furthermore, they offered no migration tool for converting to the new version of Blackboard or to any other CMS. Because of this, all available CMSs were on equal footing and the Committee began an "open" review of all available options for a new CMS.
The Committee first examined public information about the various proprietary and open source course management systems and identified the following four for consideration as possible replacements for the current E-Learning system: 1) Angel Learning System 7.3, 2) Blackboard 9, 3) Moodle 1.9, and 4) Sakai 2.5. In late Fall 2008, the Committee also established the methodology to compare these course management systems.

During February 2009, the committee requested that all vendors provide UF with access to a course site that faculty could use during the hands-on sessions. Blackboard was sent multiple emails and received multiple phone calls requesting access; however, they did not respond by the time the hands-on sessions started. Without this test site from Blackboard, the Committee realized it could not collect the same data that were planned for all systems being evaluated. Therefore, the committee voted to drop Blackboard from further consideration. Because of this decision, the remainder of the report describes the comparison of Angel, Moodle, and Sakai, along with a summary of the committee's final deliberations and recommendations.

Each of these course management systems has unique attributes:
• Angel – The Angel Learning Management Suite is a proprietary system owned by Angel Learning, which is located in Indianapolis, Indiana. The company also offers the Angel e-Portfolio system, which currently must be purchased as a separate system.
• Moodle – Moodle is an open source course management system. Moodle has a large and diverse user community with members in 208 countries. Users within the community collaborate via forums and events to refine and develop new versions of Moodle.
• Sakai – The Sakai Collaboration and Learning Environment (CLE) is an open source system that supports teaching, learning, and scholarly collaboration. Sakai is developed and maintained by a community which includes over 200 institutions around the world as members. The Sakai community collaborates via forums, meetings, and other events. The Sakai CLE includes e-Portfolio tools which allow students to build a portfolio across multiple courses.
METHODS

Methodology Overview
The initial design of this project was planned by the entire CMS committee. Our goal was to seek input and data from the UF community and vendors to use in decision-making. The project used a mixed methodology to gather qualitative and quantitative data. Figure 1 describes the methods used and the timeline.

Figure 1. Methodology and Timeline for CMS Comparison & Committee Decision
• Jan 2009 – Phase I: Initial Focus Group Sessions – Identify needs of UF faculty to deliver instruction
• Feb 2009 – Phase I: Student Survey – Identify preferred tools and needs for learning
• Feb-Mar 2009 – Phase II: Vendor Presentations – Learn about features of each CMS
• Feb-Mar 2009 – Phase II: Hands-On Sessions – Gather faculty and staff perceptions about functionality and usability of each CMS
• April 2009 – Phase III: Comparison of Systems and Finalists – The finalists (Angel 7.4 and Sakai 3.0) were compared
• April 2009 – Phase IV: Committee Deliberations and Recommendations – Based on an analysis of all data, the Committee prepared the final report
During Phase I, the Committee gathered input from UF faculty and students about the attributes of a CMS that facilitate effective teaching and learning. Specifically, focus groups comprised of college-designated faculty members were conducted to identify the functions/features most important to the faculty. Then, a student survey was conducted to gather their perceptions about features important in their learning.

During Phase II, three course management systems were evaluated and compared: 1) Angel version 7.3, 2) Moodle version 1.9, and 3) Sakai version 2.5. This phase first involved a presentation and demonstration by each CMS vendor. In addition, at the end of each presentation a survey was administered to gather faculty and staff perceptions about the functionality and usability of each system. (The Committee initially desired to also invite three faculty members from outside institutions where one of the course management systems is being used to share their experiences. However, budget constraints and an aggressive timeline precluded these presentations.) This phase also involved hands-on focus group sessions to gather faculty and staff perceptions about the usability of each CMS. In order to provide users with the opportunity to try out what they saw at the presentations, the vendor presentations and the hands-on focus group sessions were conducted simultaneously.

During Phase III, the Committee compared the systems and then narrowed the options to two finalists – Angel 7.3 and Sakai 2.5. During these initial deliberations it was realized that both Angel and Sakai were planning new releases. Therefore, also during Phase III, the Committee performed a head-to-head comparison of these new versions (Angel 7.4 vs. Sakai 3.0). Due to the short timeline, the committee gathered data comparing the two by interviewing other institutions that were beta-testing these versions and by developing comparison tables. The committee also weighed the pros and cons of an open-source versus a closed-source (proprietary) CMS.

During Phase IV, the committee deliberated and then developed the recommendations and the final report.
Phase I

Initial Focus Group Sessions
Initially, focus group sessions were conducted with faculty members to assess the functions and features most important to the delivery of instruction at UF. Individuals were selected to participate by the Deans of each unit, who nominated individuals having a reputation among peers as an "expert" in teaching and also in using the E-Learning system. Once participants were identified, five focus group sessions were held. Three committee members facilitated these focus group sessions, and each session was audio-taped for later analysis. During these sessions, perceptions and experiences were tapped using a set of open-ended questions. (See Appendix I) The data from these sessions were compiled, and analysis resulted in a prioritized list of functions/features that faculty members considered important for a CMS. The nine functions/features that were identified became a component of the core criteria for evaluating each CMS during the vendor presentations, hands-on sessions, and final committee deliberations.
Student Survey
To gather student perceptions concerning the CMS features useful during their coursework and important for their learning, a survey was conducted. The survey items were constructed by the committee. In addition to demographic questions, the survey listed various functions and asked students to rate their necessity during a course. (See Appendix II) The survey was administered electronically using Qualtrics, a web-based survey application. All UF students were invited to participate in the survey by means of an email message initiated by the Registrar's office.
Phase II

Vendor Presentations
A vendor for each system was invited to present and demonstrate each system, and participants were asked to evaluate the functionality and features of the CMS presented. These sessions allowed "experts" to explain each of the systems and demonstrate the functions and features. To guide what was covered in each of these presentations, the Committee provided each vendor with the list of nine CMS functions/features that were determined to be important to faculty and staff during the initial focus group sessions. The vendors were informed that the participants would be evaluating their product based on these nine functions/features, and that it was therefore important that they focus the discussion and demonstration on them. These presentations allowed for evaluation and comparison of the following three systems: 1) Angel version 7.3, 2) Moodle version 1.9, and 3) Sakai version 2.5. Angel is a proprietary system and, therefore, the presenter was from Angel Learning Systems. Moodle and Sakai are open source software and, thus, service providers were invited to present these systems. Specifically, Moodle was presented by Remote Learner and Sakai was presented by RSmart. All UF faculty and staff were invited to these presentations via a campus-wide email and an announcement on the E-Learning system. Paper-based surveys (Appendix III) that tapped participant perceptions about the nine functions/features, along with other attributes of the CMS, were distributed at the beginning of each session and then collected at the end. Videos of the sessions were made available for people who could not attend in person.

Hands-on Sessions
The hands-on sessions provided faculty input about the usability and functionality of the following three systems: 1) Angel version 7.3, 2) Moodle version 1.9, and 3) Sakai version 2.5. To assess usability and further explore functionality, a total of 24 hands-on sessions were conducted in which faculty and staff were invited to explore one of the features of all three course management systems. Across these 24 sessions, the nine functions/features of a CMS were explored. The nine functions/features selected as topics of these hands-on sessions were those identified during the initial faculty focus group sessions described above. Faculty and staff were recruited for these sessions via an email invitation which allowed them to register for multiple sessions. Committee members facilitated all 24 hands-on sessions, and additional committee members were present and observed the reactions of the participants.

During each hands-on session, faculty and staff were provided a brief overview of how to use a given feature. Then, they were provided time to actually use and explore the feature in each of the three systems. At the end of the session, each participant was asked to complete a survey which used open-ended questions to tap their perceptions about each system. In addition, each participant was asked to rate the usability of each CMS on a scale of 1 to 10, where 1 = Not user friendly and 10 = Extremely user friendly. The open-ended items from the survey were analyzed qualitatively using a typology method, classifying items into patterns. For each CMS and each of the nine features that were evaluated, a mean usability rating was computed using Excel.
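As an illustration of this analysis step, the following is a minimal sketch of the per-system, per-feature mean computation. It is written in Python rather than Excel (the tool the committee actually used), and the input file name and column names are hypothetical.

```python
# Minimal sketch of the mean-usability computation described above.
# Assumptions: ratings were exported to a CSV file named
# "handson_usability.csv" with columns "cms", "feature", and "rating"
# (1 = Not user friendly, 10 = Extremely user friendly). The file name
# and column names are hypothetical; the report states the actual
# computation was done in Excel.
import csv
from collections import defaultdict

def mean_usability(path="handson_usability.csv"):
    # (cms, feature) -> [running sum, count]
    totals = defaultdict(lambda: [0.0, 0])
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["cms"], row["feature"])
            totals[key][0] += float(row["rating"])
            totals[key][1] += 1
    # Mean rating for each CMS/feature pair
    return {key: s / n for key, (s, n) in totals.items() if n > 0}

if __name__ == "__main__":
    for (cms, feature), mean in sorted(mean_usability().items()):
        print(f"{cms:10s} {feature:20s} {mean:4.1f}")
```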
Phase III

Criteria for Comparing Systems
The Committee established eight criteria for comparing the course management systems and forming their recommendations. These criteria are listed in Table 1.

Table 1. Criteria Used to Compare the Course Management Systems

Core Functionality: The nine functions identified during the initial focus group sessions: assignments, assessments including selective release and browser lockdown, assessment rubrics, discussions, email, file management, group management, grade book, and tracking.

Usability: User experience collected during the hands-on sessions for faculty and staff.

Technical: Code, language, and quality of coding.

Integratability: Portal authorization, Registrar's office (enrollment, adds/drops, grades), library resources, Gator mail, Respondus, Turnitin, and Elluminate.

Migration: Results from the pilot migration.

Evolving Pedagogical Tools (Used by Early Adopters): Inclusion of E-Learning tools and functions important for quality e-learning but not yet widely adopted by most faculty: e-portfolios, wikis, blogs, certificate generator, podcasting, publisher support, PDA mode, whiteboard, chat, desktop sharing, standards mapping, activity alerts for ungraded assignments and quizzes, and RSS feeder.

Total Cost of Ownership (TCO): Hardware, software, services, support, integration, and programming.

Longevity and Ecosystem: How long the company will be around; exit strategy should the company cease; software escrow; profitability; negative contingencies; reputation; how many years the company has been actively engaged in the software industry; the product's first release and current version; recent turnover of management; overall headcount and whether it is increasing or decreasing; and the number of third-party developers and integrators with plugins and add-ons, and the level of activity of each.
The committee first reviewed the criteria and finalized the weightings for each. The following discussion outlines each criterion, an explanation of why it was included, and the rationale for the weightings.

Core Functionality. Core functionality was based on how well the CMS met the nine functions/features that faculty and staff identified as important during the initial focus group sessions. Because these functions/features are the most used by our faculty at the current time and also directly impact faculty and students, this category was given the largest weighting.

Usability. The GUI and the initial experience in using the system are important for gaining initial faculty acceptance and for their ability to effectively use the system. The committee members reflected on the comments made by participants during the hands-on sessions, and also on the participants' usability ratings, as they deliberated about usability.

Technical. The code language used for programming and the quality of coding as perceived by other institutions were used to assign ratings for this category. The AT staff, as well as other peer institutions, provided extensive input to the Committee, and the committee members used this information in deliberating about technical considerations.

Integratability. Input was obtained from the AT staff, the Registrar's office, and librarians about how well each system would integrate with the current systems being used across the campus.

Migration. Migration of courses currently in the E-Learning system would make the transition from WebCT Vista to another CMS less painful. Therefore, several committee members took courses that were currently in E-Learning and provided the files to each of the vendors in order to see how these courses would be converted. These committee members provided input about how well the courses migrated into each CMS.

Evolving Pedagogical Tools (Used by Early Adopters). There are many features/functions currently available that would significantly enhance the quality of learning, especially in courses that involve distance learning. These functions/features were not noted by faculty during the initial focus group sessions; however, this was likely because they are evolving technologies not yet widely recognized and adopted. Because these functions/features have the potential to enhance learning as faculty members become aware of their availability, they were included as a criterion.

Total Cost of Ownership. Open source software requires careful consideration of the total cost of ownership, since the total cost is not captured in a license fee or maintenance agreement. A detailed explanation of how the costs of hardware, software, services, support, and integration were considered is available upon request.

Longevity and Ecosystem. Realizing the difficulties encountered when a vendor decides to no longer support a CMS, the committee included this criterion. This criterion also includes considerations about the long-term viability of the company/organization providing the CMS.

In addition, the AT staff were asked to develop a table comparing the course management systems and also a table comparing the Total Cost of Ownership (TCO).

Selection and Comparison of Finalists
The committee began comparing the three course management systems. It was then identified that both Angel and Sakai were planning to release new versions, and that these versions, rather than Angel 7.3 and Sakai 2.5, were really the options for implementation by UF in 2010-2011. Therefore, because Angel and Sakai ranked highest during the prior phase, the Committee decided it was essential that these new versions be directly compared, and did so during Phase III. First, they conducted phone conferences with universities that are using these systems. Then, the Committee developed a comparison table (Appendix XII) that evaluated these systems based on the eight criteria developed by the Committee.
Phase IV

Committee Deliberations and Recommendations
The Committee compared the two finalists (Angel 7.4 and Sakai 3.0) using the criteria outlined in Table 1 and the information in Appendix XII. During deliberations, the Committee also compared the advantages and disadvantages of proprietary and open source course management systems. The committee then developed the recommendations and the final report.
RESULTS

Data Sources
The sources of data for this report include results from the initial focus groups (n=32), the student survey (n=1544), participant ratings/feedback from the three vendor sessions (n=33), perceptions of UF faculty and staff who participated in the hands-on sessions (n=61), and comparison tables generated by the AT staff and the Committee.
Phase I

Initial Focus Group Sessions
A total of 32 individuals participated in five focus group sessions. Qualitative analysis of the observations gathered during these sessions revealed the following nine themes considered important to faculty: 1) Assignments, 2) Discussions, 3) Email, 4) File Management, 5) Grade Book, 6) Rubrics, 7) Group Management, 8) Assessment, and 9) Tracking. These nine functions/features became the focus and the evaluation criteria for the vendor presentations and also for the hands-on sessions.

Student Survey
A total of 1544 students completed the survey. Appendix IV reveals that most responses were from freshman, sophomore, junior, senior, and graduate students, with a lower response rate among professional students. The highest percentage of students were from the College of Liberal Arts and Sciences (32%). Also represented were the College of Engineering (15%), College of Business (15%), College of Agricultural and Life Sciences (12%), and College of Pharmacy (8%). (See Appendix V) The representation from other colleges ranged between 0 and 5%. Seventy-four percent of the responses were from on-campus students and 26% were from distance students. Most (99%) of the students had experience with WebCT Vista, and only 4% had experience with Moodle.

Concerning the usefulness of the current E-Learning system, 74% of respondents indicated the system was either very useful or useful. The tools most valued in the current E-Learning system were "My Grades", Announcements, the syllabus tool, assessments and quizzes, discussions, and mail. (See Appendix VI) When asked about new features that should be in the next E-Learning system, file sharing and bookmarking were the most desired. (See Appendix VII)
An open-ended question was also used to obtain input about how to improve the E-Learning system. Students most frequently cited the need for a better user interface. For example, they noted the need for better labels and navigation to make the system easier to use. Students also noted that they frequently heard faculty complain about its ease of use from an instructor's perspective, and that their learning could be enhanced if the system were more "instructor-friendly." Students indicated a more "instructor-friendly" system could improve feedback about performance and decrease the time required for assessment feedback. Students also desired better features in the email interface and a better calendar system. They also noted the system needed to be compatible with the Macintosh operating system and to run more smoothly (e.g., fewer crashes, fewer problems with Java). (See Appendix VIII)
Phase II

Vendor Presentations
The number of surveys completed across the three presentations was as follows: a) Angel (11), b) Moodle (11), and c) Sakai 2.5 (11). Appendix IX is a summary of the function totals and includes a summary of ease of use and presentation skills. Angel and Moodle received the highest scores with respect to the functions available and ease of use. Sakai scored lower than Angel and Moodle; however, it should be noted that the presentation skills of the presenter were rated lowest for Sakai. Respondents were also asked to rate each CMS based on the nine core functions identified as important by UF faculty. Appendix X contains a breakdown of the scores for each of the functions (count and average).
Hands-On Sessions
Across the 24 sessions, there were 61 participants, with between one and seven participants in each hands-on session. Qualitative analysis of the comments revealed the topics outlined in Figure 2.

Figure 2. Topics of Comments Most Frequently Cited in the Evaluations from the Hands-on Sessions.
Concerns: Thoughts about the system that gave cause for the respondent to write; these were of a negative connotation and concerned with university reactions and abilities.
Difficulty: A feature or function that was described as hard to complete.
Great Feature: A well-liked feature.
Lacking Feature: A feature that was expected, but not there.
OK: An aspect of the system that was mediocre.
Overwhelming: An aspect of the system that was visually or systemically over-stimulating or busy.
Visually Pleasant: A positive visual characteristic of a system.
Visually Unpleasant: A negative visual characteristic of a system.

Overall, from a hands-on perspective, Angel was the most desired system, followed by Sakai 2.5, and then Moodle (n=63). According to the faculty and staff who attended these sessions and completed the survey, Angel was the easiest to use and Sakai was the hardest to use. Angel was most often cited as having notably great features. Interestingly, participants also identified that Angel was missing some significant features; for example, it lacked the ability to drag and drop files into the system. Moodle was most often described as "mediocre." Visually, Angel was the most pleasing and Moodle the least pleasing. One noted problem with Moodle was that there was a lot of information on the webpage; however, the committee noted that this problem was likely due to the example course used during the hands-on sessions and that Moodle could probably be configured with a more pleasing webpage.

Appendix XI contains a summary of the usability ratings gathered during the hands-on sessions. With regard to perceived usability, Angel was considered to be the most usable overall (mean = 8.2). In fact, Angel was considered most usable in all the categories listed. Sakai was second and Moodle the third most usable. These results were consistent with the observations of the committee members who conducted the hands-on sessions.
Phase III

Comparison of Systems
The Committee used the criteria in Table 1 to compare Angel 7.3, Moodle 1.9, and Sakai 2.5. With AT staff assistance, the Committee developed a table comparing the functions/features of the course management systems and also a table comparing the Total Cost of Ownership (TCO). Based on these criteria, the committee ranked Angel 7.3 the highest; however, Sakai 2.5 was ranked a very close second. Moodle clearly ranked third based on the criteria. Although Angel 7.3 best met our criteria for core functionality, usability, and evolving pedagogy, it ranked poorly with respect to technical aspects, migration, and longevity/ecosystem. Although Sakai 2.5 did not rank as high with respect to core functionality and usability, it ranked very high with respect to technical aspects, migration, and longevity/ecosystem.

As the committee was deliberating about Angel versus Sakai, they learned that a new version of Sakai that was in beta-testing would be available for use in 2010-2011. It was realized that this new version (Sakai 3.0) had more functions and features than the Sakai 2.5 version that had been the focus of the CMS Committee evaluation. In addition, Angel Learning notified us that they would be releasing a new version (Angel 7.4) within our timeline for adoption. This finding led the committee to conduct a comparison of Angel 7.4 and Sakai 3.0. Appendix XII compares Angel 7.4, Moodle, and Sakai 3.0 to our current E-Learning system and also to the earlier versions of Angel and Sakai. As outlined in this table, it is clear that Sakai 3.0 is a much better system than the 2.5 version, while Angel 7.4 is not significantly different from the 7.3 version. Based on the findings summarized in Appendix XII, the committee determined that Sakai 3.0 ranked well with respect to not only technical aspects, migration, and longevity/ecosystem but also core functionality. Appendix XIII compares the Total Cost of Ownership (TCO) for Moodle, Angel 7.4, and Sakai 3.0.
Phase IV - Committee Deliberations
During final deliberations, the committee noted that there were no "deal breakers" among the finalists. All systems were significant improvements in some way over our current system.
Deliberations Comparing Angel 7.4 and Sakai 3.0
Appendix XII was used to directly compare Angel 7.4 and Sakai 3.0. This table also includes information obtained in earlier phases when Angel 7.3, Moodle 1.9, and Sakai 2.5 were being considered, and it compares these systems to our current E-Learning system. As outlined in this table, it is clear that Sakai 3.0 is a much better system than the 2.5 version, while Angel 7.4 is not significantly different from the 7.3 version. Based on the findings summarized in Appendix XII, the committee deemed that Angel 7.4 and Sakai 3.0 are comparable with respect to core functionality. These core functions were the functions and features that UF faculty deemed important for instructional delivery. Both were also comparable with respect to the criterion Integratability. Although Usability was ranked higher for Angel based on the results of the hands-on sessions, several committee members explored the usability of the Sakai 3.0 version and determined that its usability had improved over the 2.5 version. The committee also found that Sakai 3.0 ranked very well with respect to the following criteria: 1) Technical, 2) Migration, and 3) Longevity/Ecosystem. As noted in Figure 3, the Committee determined that the strengths of Sakai 3.0 outweighed those of Angel 7.4.

Figure 3. Comparison Summary Based on Criteria
[Figure 3 is a chart comparing Sakai 3.0 and Angel 7.4 across the evaluation criteria, including Integratability.]
The following section outlines a more detailed comparison of Angel and Sakai based on the criteria used by the Committee for decision-making.

Core Functionality. Both Angel 7.4 and Sakai 3.0 were found to be similar in meeting the nine core functions deemed important by the UF faculty who participated in the initial focus groups. Concerns were raised during the vendor presentations about the limitation of Sakai 2.5 with respect to selective release of quizzes. However, the Georgia Institute of Technology has developed a tool that will be available with Sakai 3.0 and enables selective release of quizzes. This feature is important for accommodating students who have learning disabilities and need additional testing time. None of the systems were ideal for algorithmic equations.

Usability. Based on the comparison of Angel 7.3 and Sakai 2.5, Angel was more user-friendly for faculty and staff; however, Sakai 2.5 ranked a close second during the hands-on sessions. In addition, several committee members evaluated the usability of Sakai 3.0 and concluded it was enhanced over Sakai 2.5.

Technical. Sakai 3.0 ranked highest in this category. It is an open source system, and the UF IT group is experienced with the architecture on which it is built. It has been successfully used by other institutions the size of UF. In contrast, Angel has a Microsoft Windows architecture, and the central UF IT group is not experienced in this area. Adoption of Angel would require development of the IT staff so that they could be equipped to manage the system. In addition, Angel has very little experience with institutions as large as UF. Feedback from other institutions currently using Angel revealed concerns about the quality of coding and the company's response to the institutions' concerns and needs. Although Angel has good APIs, Sakai is open source and, therefore, UF can build what is needed.

Integratability. Both Angel 7.4 and Sakai 3.0 were equal with respect to their ability to interface with UF e-mail, Respondus, Turnitin, and Elluminate.

Migration. During the pilot test migration using a course from Business, Sakai was more successful than Angel.

Evolving Pedagogy (Features and Functions Used by Early Adopter Faculty Members). Angel had a few more features that are evolving trends in the delivery of quality online learning but that have not yet been widely adopted by UF faculty. For example, Angel allows automated grading of student participation in discussion forums, generation of a certificate upon completion of a course or continuing education program, mapping of outcomes/standards, and activity alerts for ungraded quizzes. However, interest in and need for these features was not a high priority among the majority of UF faculty who participated in the focus group and hands-on sessions.

Total Cost of Ownership (TCO). If hosted locally, the TCO of Angel and Sakai were comparable. However, if the external hosting option is selected, the TCO for Sakai is significantly less.

Longevity and Ecosystem. Because Sakai is an open source system, there are no worries about company buyouts. In addition, UF can develop the tools desired by the UF community.

Other considerations. Students expressed a desire for the system to better interface with Mac computers. Both Angel and Sakai have made strides in improving operability with Mac computers; however, the Committee did not find a significant difference in how well these systems interface with Mac computers.

Proprietary versus Open Source Systems
During deliberations the committee spent significant time gaining insights about proprietary versus open source systems.

Definitions

Open-source
"Open source" is a software development model generally associated with availability of the source code, including compiled code, for that software. While access to the source code is a critical element of "open source," additional important characteristics include distribution characteristics such as free redistribution and allowance of derived works, among others.

Closed-source
Also known as proprietary software, "closed source" refers to software owned by one organization. The term is typically used only in discussions that contrast open source software with proprietary software.

Community source
"Community source" is a model developed by higher education institutions, particularly surrounding the Sakai project, that refers to a community coordination mechanism which utilizes the model of open source development.

API
An "application programming interface" (API) is a set of routines, data structures, object classes, and/or protocols provided by libraries and/or operating system services to support the development of new applications and integration with external services. APIs are the "connection" between external services and the course software, but by nature they limit how integration can be accomplished. Integration with ANGEL is restricted to using the available APIs, which are very robust. However, integration with Sakai can be accomplished both via the available APIs and, if needed, by making modifications to the Sakai source code.
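To make the API definition concrete, the following minimal sketch shows how an external service might push a grade into a CMS through a published API. The endpoint, parameters, and token are entirely hypothetical and do not represent the actual ANGEL or Sakai interfaces.

```python
# Hypothetical illustration of API-based integration; the endpoint,
# parameters, and token below are invented for this sketch and do not
# correspond to the real ANGEL or Sakai APIs.
import json
import urllib.request

def post_grade(base_url, token, course_id, student_id, score):
    """Send one grade to a (hypothetical) CMS grade-book endpoint."""
    payload = json.dumps({"student": student_id, "score": score}).encode()
    req = urllib.request.Request(
        url=f"{base_url}/courses/{course_id}/grades",
        data=payload,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # the integration can only do what the API exposes

# With a proprietary system, integration stops at calls like the one above.
# With an open source system such as Sakai, the source code itself can also
# be modified when the published API does not expose what is needed.
```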
RECOMMENDATIONS

On April 20, 2009, the Committee voted; the vote was unanimous in favor of Sakai 3.0.
1. Implement and fully fund, with a line item in the UF Budget, Sakai as the replacement for the current E-Learning System.
2. Remain and actively participate as a member of the Sakai community.
3. Build the infrastructure at UF that is necessary for an open source system.
4. Develop and fund a governance structure for Sakai that is formal and institutionally recognized within the University.
5. Extend our current Blackboard license through 2011 to allow for development and migration.
6. Implement the Sakai Portfolio System.
7. Fully fund a migration team to facilitate the movement from Blackboard Vista to Sakai.
Rationale
1. Core Functionality. Sakai provides the core functionality desired by UF faculty members and students.
2. More Than a CMS. Sakai provides a Collaboration and Learning Environment (CLE) which can be used not only for teaching and learning but also for research. In addition, this CLE includes an e-Portfolio system.
3. UF Will Own the Code. Because Sakai is open source, UF will own the source code. Therefore, UF controls its destiny.
4. Graduate Research. Graduate students in the Computer Sciences and other units of UF can accomplish master's-level projects which contribute to the evolution of the Sakai CLE.
5. Features. Features and functions will change over time – usually improving. With open source we can influence that timing and ensure improvement; with closed source systems, our options are limited. If we choose a proprietary system, we can lobby for the things we need that aren't there (e.g., better rubrics, better calculated questions, etc.), but what we get will be determined by the priorities of the entire user community, not by our needs. This will also be true with the Sakai community; but where our needs diverge from those of the community, we will have the capability of applying our resources to addressing our needs.
6. Bug fixing and problem prioritization. Again, with a vendor system, fixes will be addressed based on the issues of the larger community. For example, we have had problems in the current system with the grade book functioning slowly in our large-enrollment courses. This problem has not been resolved, we suspect, because the problem does not exist in courses with smaller enrollments and smaller grade books. There are very few institutions with large-enrollment classes of our size; therefore, resolving that problem has not been a high priority for our current vendor. This is likely to be true with any proprietary system.
7. Peer institutions. The founders of the Sakai project are Stanford, Michigan, Indiana, MIT, and Berkeley. Cambridge (UK), Georgia Tech, and Indiana are the leaders in the Sakai 3 project. Our peers in the Sakai community will be the aspirational peers of our institution.
8. Networking. Again, with Sakai we join a community of peers. The result will be an enhanced ability to build a network of alliances and a communal knowledge base of best practices. Likewise, we gain even greater opportunities to help shape the tools we use to better support our educational and research programs. And our innovations (technical, pedagogical, administrative, etc.) can then be shared with the wider community, building UF's reputation in practice and innovation.
9. Community Source. While still open source, Sakai builds on a "community source" model, supported by world-class institutions and governed by a board of directors at the non-profit Sakai Foundation. The current board, elected by the Sakai community, includes nine higher education representatives: Foothill College (CA), Indiana University, Stanford University (CA), University of California at Berkeley, University of Cambridge (UK), University of Hull (UK), University of Michigan, and University of Toronto (Canada), as well as a representative from a Sakai commercial affiliate. If we choose Sakai, we are adopting a system whose development is governed by our peers.
10. Product development. Unlike systems developed by proprietary coding teams, Sakai tools are designed by educators for educators. One of the biggest problems we have experienced in the past at UF is the divergence over time of our CMS vendor from the higher education community in which both systems started.
11. Vendor lock-in. We purchased a WebCT product, but WebCT was bought out by Blackboard, which subsequently chose to end support for our current CMS. With a vendor solution, we have no control and little influence over the direction of the company and its impact on UF. Sakai emerges as an alternative to the risks of proprietary systems, including uncertain licensing costs, unresolved product roadmaps, and continued instability due to patent lawsuits.
12. The CMS market space. Open source threatens all proprietary CMS vendors. Every institution that moves to either Moodle or Sakai is a loss for all other proprietary systems, and open source is a growing trend for higher education: "Yet even as respondents seem wary of migrating to Open Source, the survey data document show continuing gains by Open Source LMS applications…. [O]ne campus in seven (13.8 percent) that participated in the 2008 survey has identified either Moodle or Sakai as the campus standard LMS, up from 10.7 percent in 2006." (The Campus Computing Project, October 2008, p. 3)
13. Cost considerations. As we have seen from the TCO discussions, overall costs of open source versus proprietary systems tend to be roughly equivalent, with licensing costs being diverted into programming positions. However, the money we pay to a proprietary vendor will go to pay and upgrade the skills of vendor staff. That same money applied to open source support will pay and upgrade the skills of UF staff – a significant benefit to UF – as well as contribute back to the higher education open source community.
14. Enhancements. With Sakai we gain new features otherwise unavailable except at great cost. Specifically, we gain the built-in e-portfolio tool and the ability to offer group collaboration sites. Collaboration sites can be used by researchers who need to work with their colleagues around the world, faculty engaged in governance committee work, and students working with study groups or activity clubs. These new tools were not intrinsic to our evaluation, but they represent a significant value-added proposition to the institution.
Need to Build Infrastructure and Culture

Infrastructure
Adoption of an open source system like Sakai 3.0 requires a new type of commitment for UF. Specifically, successful use of an open source system requires significant infrastructure and also a new culture among providers and users. Although there are no license fees, an open-source system is not "free." The costs normally allocated for license fees and other costs associated with a proprietary system must be used to support additional staff devoted to programming and development of new tools. A commitment to implementation of Sakai 3.0 requires a commitment to developing an infrastructure that will make it successful at UF. If the University of Florida administration accepts the CMS Review Committee's recommendation, it is essential to understand that adoption of Sakai 3.0 as the centrally supported course management system will require the UF administration to make a commitment not just to software, hardware, and user support, as in the past, but also to software development. To this end, the CMS Review Committee strongly recommends that UF hire or assign at least 2-3 programmers with appropriate skills (e.g., Java) to:
1. pay full-time attention to the relevant open source code;
2. work on bug fixes as needed at UF;
3. work on developing new features and functions as needed;
4. coordinate with the broader open source community toward objectives 2) and 3) as well as on broader code development; and
5. develop and implement appropriate programming practices, quality assurance, and quality control.

Governance and Culture
The operations of this open source programming team should be tied in closely with the CMS governance structure, which also needs to be refined in light of open source adoption. Specifically, in order to effectively decide on priorities for development of new tools for the Sakai CLE, a new culture must be developed that involves governance. As part of its deliberations, the committee reviewed the 2009 UF Information Technology Action Plan. The committee is very positive about the potential of developing the infrastructure required for an open source system within the newly proposed structure for IT governance. The 2009 UF Information Technology Action Plan also includes plans for UF Community Engagement. Such an advisory structure is needed to obtain input from faculty, staff, and students for developing priorities as the need for new tools is identified by the UF community. The Committee recommends that UF develop a governance structure for Sakai that is formal and institutionally recognized within the University.
4/27/2009 University. The CMS Review Committee recognizes that the decision to adopt, implement, and support the Sakai open source software as the institutional course management and electronic portfolio system is not trivial. In fact, it represents a complete paradigm shift for UF, as the institution transitions from being a consumer of off-the-shelf software to being a co-producer of the software in a community of peers. Need for Membership in the Sakai Organization The Total Cost of Ownership figures for Sakai include a $10,000 annual fee for membership in the Sakai community (the annual cost can be reduced by purchasing multi-year membership). Membership in the Sakai community allows the University of Florida to designate an official representative who can vote on Foundation Board members. The Sakai Foundation is a member-based, non-profit 501(c)(3) corporation engaged in the collaborative design, development and distribution of open-source software for education, research and related scholarly activities. It encourages community-building between academic institutions, non-profits and commercial organizations and provides its members and others with an institutional framework within which Sakai projects can flourish. The Foundation also works to promote the wider adoption of community-source and open standards approaches to software solutions within higher education. The Sakai Foundation staff coordinate software development, quality assurance and distribution activities for the community. Staff members oversee Sakai's intellectual property and track contributor agreements. They also provide technical support for both community members and potential adopters, speak at conferences and other gatherings about Sakai and manage Sakai's own conferences and meetings. The Sakai Foundation is supported by voluntary partner contributions. Sakai Foundations member organizations elect the ten-member Board of Directors, which provide the strategic leadership for the Sakai Foundation. How Membership Aligns with the UF Mission We believe that a move to an open source CMS, supported by robust open source development, will not only provide the best CMS option for faculty and students but also represents a critical alignment with UF’s core institutional mission. That mission is 29
4/27/2009 teaching, research, and outreach; or, put another way: creating, preserving, and transmitting knowledge. Adopting a proprietary system really addressed only one dimension: teaching/transmitting. Participating in the open source community brings all three core purposes to life in the context of the CMS as we create, archive, and contribute time, code, features, and functions to the broader higher education community. Furthermore, adopting Sakai supports the institutional goal of becoming a top 10 institution as we take a place among the community of aspirational peers and enhance our institutional reputation among other leading institutions as contributors to the community while developing toward our own institutional needs and desires.
Recommended Timeline for Adoption

Because Sakai 3.0 will be in beta-testing until 2011, the Committee recommends the timeline of adoption outlined in Figure 3. It should be noted that this implementation plan will require extending our current Blackboard license through 2011.

Figure 3. Timetable of Sakai implementation
Short-term Recommendations

The Committee was also asked to make any “short-term” recommendations that could be implemented now as we prepare for adoption and implementation of a new CMS (i.e., trailing services that can be discarded, services that need to be improved, and new services that need to be implemented). Recognizing the short timeline for implementation of a new CMS, the committee determined that all efforts should focus on preparing for implementation of the new CMS. Conversion to a new CMS provides an opportunity to help faculty members realize the potential of the new functions and features that will be available in the new CMS and also to improve pedagogy. It is therefore recommended that workshops or seminars be offered to increase faculty understanding of these functions and features and of the pedagogical rationale for their use. Faculty members should also be encouraged to use the instructional design consulting available through Academic Technology. Additional staff resources may be needed to meet the needs of faculty who desire such consultation.
Appendices
APPENDIX I. DISCUSSION QUESTIONS USED DURING THE INITIAL FACULTY AND STAFF FOCUS GROUP SESSIONS.

Background Information Provided to the Focus Group Participants:

This group has been asked to meet to help inform the Course Management System Advisory Group (CMSAG); we are interested in faculty experiences using multiple course management systems. The question guiding this study is: what do faculty members report that they need to effectively use a course management system? This group is comprised of faculty from various colleges, chosen to represent their peers in their departments and colleges. During this focus group the members will share perspectives from their own experiences and from the experiences of their colleagues. These data will be used to inform the CMSAG and provide the structure for a second series of focus groups.

Questions Used to Guide Discussion:

a. What systems do you currently have experience with?
b. What do you like about these systems?
c. What do you dislike about these systems?
d. What CMS features do you use most consistently at this time?
e. How do you currently receive support for the CMS?
f. What basic features in a CMS do you feel are important?
g. What are the really basic support features in using a CMS that you feel are important?
h. In an ideal world, what would you like to see in a CMS?
i. In an ideal world, what support features would you like to see in a CMS?
APPENDIX II. SURVEY ADMINISTERED TO STUDENTS
APPENDIX III. SURVEY ADMINISTERED DURING THE VENDOR PRESENTATIONS.
APPENDIX IV. CLASSIFICATION OF THE RESPONDING STUDENTS

#   Answer                  Response      %
1   Freshman                     267     19%
2   Sophomore                    227     16%
3   Junior                       294     20%
4   Senior                       283     20%
5   Graduate Student             289     20%
6   Professional Student          83      6%
    Total                       1443    100%
APPENDIX V. COLLEGE AFFILIATION OF RESPONDING STUDENTS

#    Answer                                                Response      %
1    Agricultural and Life Sciences, College of                 155     12%
2    Business Administration, Warrington College of             198     15%
3    Design, Construction and Planning, College of               19      1%
4    Dentistry, College of                                        1      0%
5    Education, College of                                       51      4%
6    Engineering, College of                                     206     15%
7    Fine Arts, College of                                        26      2%
8    Health and Human Performance, College of                     53      4%
9    Journalism and Communications, College of                    73      5%
10   Law, Levin College of                                         4      0%
11   Liberal Arts and Sciences, College of                       424     32%
12   Medicine, College of                                         10      1%
13   Nursing, College of                                          42      3%
14   Pharmacy, College of                                        105      8%
15   Public Health and Health Professions, College of             55      4%
16   Veterinary Medicine, College of                               6      0%
17   Undeclared                                                   28      2%
APPENDIX VI. STUDENT PERCEPTIONS ABOUT THE VALUE OF FEATURES IN THE CURRENT E-LEARNING SYSTEM.

[Bar chart of response counts; rating scale: “Gotta have it,” “I like it,” “I like it but can do without it,” “I like the idea, but it doesn't work the way I would like,” “This stinks,” “I don't know what this is.”]
APPENDIX VII. STUDENT RESPONSES ABOUT NEW FEATURES THEY WOULD LIKE TO SEE ADDED

[Bar chart of response counts for File Sharing, Bookmarking, Tools for Social Networking, Wiki Creation, Create a Portfolio, and Blog Creation; rating scale: “Gotta have it,” “I like it,” “I like it but can do without it,” “This stinks,” “I don't know what this is.”]
APPENDIX VIII. REPRESENTATIVE STUDENT COMMENTS TO OPEN-ENDED QUESTIONS ABOUT HOW TO IMPROVE THE CURRENT E-LEARNING SYSTEM

Access
“Access to classes from last semester, even if you drop the class.”
“Allowing for old classes to be removed from the class roster that shows up upon login.”
“Allow a student to still have access to a course he previously took to see what an instructor is doing with it that semester and in case he needs to access previous information.”
“Don't do upgrades at the end of the week that might possibly run into deadlines for coursework. And yes, I KNOW we're not supposed to procrastinate, but many of us have jobs and family demands that don't allow us to get things done early. I have enjoyed my distance learning courses.”
“please don't remove the course material once the course is done, it is always helpful to come back and prepare using these materials while interviewing with companies”
“have a link to it on the main ufl website”
Browser
“…to make it more user friendly, make things more simple.”
“Fix the connectivity and Java problems. Update the types of browsers and formats compatible with E-Learning…”
“Must support current browsers and browser technology. Must be secure and not rely on self-signed SSL certificates!”
“Be able to download files on the first try”
Features
“A more interactive calendar that is easy to use.”
“Have the E-Learning System have capable video and voice conferencing built in.”
“Integrate WebCT mail with UFL email since people frequently don't check their WebCT mail…”
“Make a main page that all of the classes add material to. In other words, have a main calendar, grades, assignments, etc by default that all courses insert their material to.”
“I would like a calendar to combine all of my classes so I can view all my tests, due dates, etc. on one calendar.”
“Maybe having an online chat area for any students who were signed into the system at the same time in case they have questions about assignments.”
“The roster needs to have a search function in it. I would also suggest there be an incentive for students to put pictures and information about themselves on the site including ufl e-mail addresses. In this way it could serve as a ‘student directory.’”
“being able to communicate with professors live, instead of through emails”
“Email the students when new grades or announcements are posted.”
“Use a system that encourages (or forces) student/teacher interaction. The worst Elearning courses I've ever had were those that left me with the impression my "instructor" was an elaborate set of scripts and macros.”
“In addition to the tab the shows new announcements and new posts on the discussion board, it would also be nice to see a notification when a professor posts new notes to the Course Material. This would be nice because most professors do not make announcements and it would save time from scanning each course for new powerpoint slides.”
Instructor-related
“As a student I find the E-Learning system to be sufficient in what it's main priorities are. Its gives me a home base for my classes and enables me to check grades and access supplemental resources. Also from a student perspective within my 3 semesters here I have had multiple TEACHERS complain about different aspects that make things difficult on their end or things they would like to do but can't through the E-Learning system. I think the main objective of the new system should be to make sure a majority of teachers are sufficiently satisfied with the new system and what they can do with it.”
“It needs to be effectively used by the teacher. The features of the system do not matter if the teacher does not want to use them.”
“Make it required. I do a lot better in my classes when the professor uses E-Learning. It keeps me on track with my assignments.”
“The way that teachers use it. If they utilize what is there, the system works much better. If they try to use it, but only use one aspect, it hinders the success of the program.”
“They should require every online course to use at least the discussion tool or the email tool, providing some mode of communication for the class.”
“To have more professors use it - and use it effectively.”
“Provide a training course of some kind to all faculty and staff who must use it; too many of my instructors do not use it because they do not understand it.”
Mac-friendly
“An E-Learning system that is Mac OS and Safari friendly. Whenever I use the current system with my Mac on Safari, sometimes it will work and other times it will not. Not very reliable when you need to turn an assignment in on time or take an assessment.”
“PLEASE make it compatible with multiple OS's and not just Windows. I am a Mac user and have been very frustrated with lack of support for streaming media and overall compatibility with Mac OS X.”
Stability
“Help make the online assessment/quizzes tool more stable so that it doesn't crash, and if an error does occur allow the student to finish the assignment with the leftover time from the first attempt with all of the original responses saved.”
“It is often very slow and will time out after too short a time period.”
“There is too much variability with the system. One minute it will load and the next it won't. It needs to be more dependable.”
User-interface
“Better labels and menu navigation! I could never tell where a given icon would take me.”
“I think it is difficult to navigate through the different sections and folders, sometimes I feel like I wish there was a button that would take you back to the specific course's home page…”
“Fewer links to access the information”
“make it more user friendly, intuitive and easy to use”
“I would recommend keeping the E-learning system as close to the current system as possible. Most of my courses for my degree are based on this system, and a completely new system would take time to master, time that I unfortunately don't have. I have no major problems with the current system. It is user-friendly.”
“make it more user friendly and easier to use. I really like the college of education elearning system because your course website announcements grades bascially everything in all in one spot and its easy to use”
APPENDIX IX. VENDOR PRESENTATIONS - RESPONDENT RATINGS ABOUT FUNCTIONS, EASE OF USE, AND QUALITY OF THE PRESENTER.

          Functions Total    %      Easy to Use/Meet My Needs    %      Final Score     %      Presentation Skills    %
          (out of 693)              Total (out of 110)                  (out of 803)           Score (out of 55)
Angel          555          80%                82                75%        637        79%             39            78%
Moodle         547          79%                88                80%        635        79%             44            88%
Sakai          420          61%                60                55%        480        60%             37            74%
APPENDIX X. RATINGS OF THE 9 CORE FUNCTIONS BY PARTICIPANTS AT THE VENDOR SESSIONS.

                   Assignments   Discussions   Email   File Mgmt   Grade book   Rubrics   Group Mgmt   Assessment
Angel    Count          61            71         61        48           67         55          63           66
         Average       5.55          6.45       5.55      4.36         7.39        5          5.73           6
Moodle   Count          67            65         62        67           66         41          71           55
         Average       7.39          5.91       5.64      7.39          6         3.73        6.45           5
Sakai    Count          62            61         58        65           43         43          42           41
         Average       5.64          5.55       5.27      5.91         4.3         4.3         4.2          4.56
APPENDIX XI. PARTICIPANT RATINGS ON USABILITY.

          Av Score   File Mgt   Disc   Email   Grp Mgt   Rubric   Grade Book   Assessment   Assignment
Angel       8.2        8.8       7.9    9.0      8.4       7.0        8.3          8.9          7.6
Moodle      5.4        5.7       6.1    3.7      7.3       2.0        6.3          6.6          6.6
Sakai       6.1        5.8       6.4    4.7      6.2       4.0        7.3          6.6          7.5
APPENDIX XII. COMPARISON TABLE

Functions/Features Categorized Using the 8 Criteria Established by the Committee

Systems compared in Phase I: WebCT (the current E-Learning System), Angel 7.3, Moodle, and Sakai 2.5. Systems compared in Phase II: Angel 7.4 and Sakai 3.0.

Core Functionality: Grade Assignments Offline; Grade – By Groups; Grade – Allows Algorithmic/Calculated Questions; Assessments – Selective Release by Date; Assessments – Selective Release by Other Events; Assessments – Browser Lockdown; Assessment Rubrics; Discussions; Email; File Management; Gradebook – Percentage Based; Gradebook – Grade Assignments Off-line; Tracking – Detailed Tracking of Progress

Usability: Drag and Drop; Back Button Works; HTML Editor Does Not Use Java; Built-in Browser Checker; SCORM 2004 Compliant

Technical: Customizable; UF's Knowledge of Backend; Software Division Branding

Integratability: Elluminate; Turnitin; Respondus Question Database; Publisher Support; Common Cartridge Mode; Open Database Schema

Migration: rated from + (worst) to +++ (best)

Evolving Pedagogical Tools (Used by Early Adopters): Certificate Generator; Wiki; Blog; Podcast; PDA Mode; RSS; New Activity Alerts in Portal for Ungraded Assignments, Quiz, etc.; Whiteboard; Statistics and Reports; e-Portfolio

Total Cost of Ownership (TCO): rated from + (worst) to +++ (best); results available on request

Longevity and Ecosystem: rated from + (worst) to +++ (best)

Legend: +++ = Best; + = Worst; $ = Requires Additional Costs; ? = Could Not Be Determined

[Per-system markings (●) in the body of the original table could not be reliably recovered from the text extraction and are not reproduced here.]