Session T3E

DEVELOPING A CRITERIA SET FOR AN ONLINE LEARNING ENVIRONMENT

Maxine S. Cohen1 and Timothy J. Ellis2

Abstract – Although designing an online course is a difficult task, it is one that educators face almost daily. There is a lot of pressure to put one's course online and little research available on the "best" way to do it. Course management systems aid in the procedural tasks, but we still do not know the optimal design for an online course. This paper presents a study to develop a criteria set for excellence in online courses, from the student perspective. A two-fold approach was taken in this study. A group of students "brainstormed" a list of factors important in an online learning environment; a second group of students then validated the criteria. A factorial analysis was performed, analyzing the criteria by rating and rank order.

Index Terms – Distance education, evaluation, online learning.
INTRODUCTION

Distance education via online courses is becoming a standard offering at most colleges, from the undergraduate through the graduate level. Students are flocking to online classes in record numbers; they seem to like the flexibility of online learning and its freedom from the constraints of time and space. Research abounds on some of the topics related to this relatively new way to deliver education. Some research focuses on the faculty perspective, including the time needed to develop and deliver online courses. Other research looks at administrative issues such as class sizes and turf wars, including state and geographic boundaries. Still other research examines the myriad tools available. Little of this research, however, has been student centered; little work has been found that focuses on the students themselves and what they expect in an online course.

Initially, online courses were a novelty, and the fact that they worked and even existed was exciting in itself. Now, however, simply putting the class notes into PowerPoint and placing them on a Web page no longer works. In addition, the online environment allows much more flexibility than is currently being exploited. As both students and instructors become more proficient and experienced with online course delivery, we are starting to see the emergence of some standards and traditions. Certainly, we do not yet have the same experience base with online teaching as with the lecture delivery method, which has been around for hundreds of years. As we gain more experience,
research can start focusing on the learning process, pedagogy issues, student expectations, and quality issues. Little research has been done on what constitutes quality in an online course. The time seems ripe to move into the student arena and examine successful online students, along with their characteristics and expectations. In general, online students are seen as self-motivated and independent learners. Many students enrolled in online classes are non-traditional students and working adults; they often bring their workplace experiences, which are frequently quite varied, with them into the classroom. Research is also being done on ways to reduce dropout rates, which seem to be higher for online courses [1].

Research Questions

In this paper, we try to answer the following research questions:

1. What do students perceive to be the quality indicators for an online course, separate from course-content issues?
2. What are the quality indicators of the greatest importance to students enrolled in technology-focused programs, again separate from course-content issues?
To answer these questions, we designed a two-phase developmental study. Our goal was to develop and validate a criteria set of the indicators of a quality online technology course, as perceived by the student. All the students involved were enrolled in a distance education doctoral program in a school of computer and information sciences.

Assumptions and Limitations

This research must be viewed in the context of the assumptions and limitations upon which it was based. Our understanding of online learning environments is still in its infancy, and quality issues must be studied from many views; this study focused solely on the student perspective. The students participating in this study were all experienced online students enrolled in a doctoral program in a graduate school of computer and information sciences and, as such, were comfortable using online technology. Results of this study may not be applicable to either undergraduate students
1 Maxine S. Cohen, Nova Southeastern University, Graduate School of Computer and Information Sciences, 6100 Griffin Rd, Davie, FL 33314, [email protected]
2 Timothy J. Ellis, Nova Southeastern University, Graduate School of Computer and Information Sciences, 6100 Griffin Rd, Davie, FL 33314, [email protected]
0-7803-7444-4/02/$17.00 © 2002 IEEE November 6 - 9, 2002, Boston, MA 32 nd ASEE/IEEE Frontiers in Education Conference T3E-8
or less technologically sophisticated students. Finally, the focus of the study was on quality indicators of the environment, not on course content and other curriculum-related issues.
BACKGROUND

Most courses currently have a course Web page with a detailed syllabus. Online courses typically use some sort of asynchronous tool; examples include email, bulletin boards, and course assignment submission systems. Some courses may include a synchronous component with a chat facility. The tools available range from home-grown tools to off-the-shelf components to full-featured commercial course management systems, such as those offered by Blackboard and WebCT. This functionality may include email, synchronous and asynchronous communication, electronic whiteboards, Web pages, course syllabus features, calendars, and so on.

There are a host of courses and workshops offered on many campuses teaching professors how to translate their courses into an online offering. Topics may include how to manage online activities, how to sustain interactivity in online courses, and other related subjects. There are summer programs, mentoring programs, and other means (sometimes including financial incentives and course release time) to encourage professors to move their courses online. Many faculty discussions center on barriers to online learning. For some disciplines the transition to online education is easier than for others; for example, how does one teach a course with a clinical component online? Yet in online environments, simulations and experiments can abound. Videos, PowerPoint slides, audio files, and animation can be integrated and make some disciplines really come alive. Using the Internet, "virtual visits" can be taken to places near and far (e.g., famous museums or landmarks). Dangerous simulations can be run in a "safe" environment and repeated at no real additional cost. Most faculty seem to agree that online courses take more time to develop, and faculty need to keep Web pages and links up to date.
Distance education gives faculty the freedom to travel to conferences and still "teach" their courses from hotel rooms or anywhere else on the road. The same opportunity is available to working adults who cannot, or choose not to, be in a classroom at, say, 7 pm every Wednesday evening. Certainly one needs sufficient technology infrastructure to run online courses, including help desks, course designers, and support for the faculty. The physical classroom size barrier is removed, although some researchers feel 18-25 students is the ideal size for an online course. Decisions need to be made as to the best way to deliver course materials: in the traditional classroom, paper was commonly used, but now we have choices of Web delivery, CDs, and, of course, paper handouts. Teaching
technology adds additional expectations and problems: using current technology is an issue, meeting the hardware capabilities of the least-equipped students is another, and dealing with different computer set-ups is yet another.

The online student population presents many teaching challenges. Richardson [2] found online students to have superior time-management skills. Kubala [3] found online students to be daring and confrontational in expressing their ideas. Students demand quick turnaround and expect a 24/7 environment; time zone boundaries seem to disappear. Rossman [4] found that online learners want prompt, specific feedback and responses from fellow students (without humiliation); they prefer negative comments to be communicated privately, via email or a phone call. Many faculty try to foster the establishment of a learning community. Pictures and the creation of a social space are important in building a learning community [5]. Psychological isolation can be a problem in online environments. Frequent communication is another way to keep students together over the distance and break down the distance barriers, but how does one deal with the non-communicative student?
METHODOLOGY

Addressing the research questions required two processes. First, a list of online course quality-indicator candidates had to be developed. Second, that list of potential indicators of quality in online courses had to be rated and rank-ordered. Since this study focused on the student perspective, each of these processes entailed polling students actively engaged in one or more courses delivered via an online modality. A more detailed description of each process follows.

Identifying a List of Potential Quality-Indicators

The list of potential quality-indicators was developed through a threaded discussion forum-based "brainstorming" session. Participation in the session was voluntary and anonymous and open to students taking a doctoral-level course in Human Computer Interaction (HCI). All participants were majoring in Computing Technology in Education (CTE), a program that focuses on the application of computer technology as an enhancement to learning. This course was selected for the "brainstorming" process for two reasons: 1) CTE students, by virtue of their educational and experiential backgrounds, are conversant with learning theory and comfortable in discussing that topic; and 2) the focus of the HCI course lends itself well to a discussion of quality, especially in terms of computer-enhanced environments such as online courses, since that is an integral part of the HCI curriculum.

Ten students participated in the "brainstorming" discussion thread and identified a total of 35 items. The researchers examined the list of 35 potential quality-indicators and, by eliminating duplicate and off-topic entries, reduced the list to the 15 items listed in Table I.

TABLE I
CANDIDATE INDICATORS OF QUALITY IN AN ONLINE COURSE

  Connection with professor
  Connection with other students
  Learner (student)-centered
  Expectations clearly articulated
  Immediately engages the student
  Effective instructor-to-students communication
  Effective student-to-instructor communication
  Effective student-to-student communication
  Anytime, anyplace learning
  Self-paced schedule
  Simulates an in-class 'feel'
  Class size
  Feedback clear, timely, and meaningful
  Peers adequately prepared for online course
  Incorporation of leading-edge technologies

Rating and Ranking Online Course Quality-Indicators

The 15 quality-indicator candidates (Table I) identified through the "brainstorming" session were rated and ranked by a second group of students enrolled in doctoral-level Multimedia Systems classes. The 44 students participating in this phase of the study were majoring in Information Systems, Computer Information Systems, or Computer Science and possessed predominately technical educational and experiential backgrounds.

A two-part instrument was developed to rate and rank the quality-indicator candidates. The first part of the instrument consisted of a set of 16 questions. The first 15 questions asked the student to rate each of the 15 quality-indicators on the five-point scale detailed in Table II. Question 16 (Figure 1) asked the student to identify how she or he compared quality in an online course with quality in an on-campus course.

TABLE II
QUALITY-INDICATOR RATING SCALE

  [ ] Crucial to a good online course
  [ ] Important in a good online course
  [ ] Unimportant (neutral) in an online course
  [ ] Really rather not see in an online course
  [ ] Disastrous to a good online course

The second part of the instrument asked the students to list the five most important characteristics of a high-quality online course, in order of importance. Although the students were asked to select from the 15 quality-indicators listed in the instrument, they were also given the opportunity to add their own indicators if they felt one or more items of great importance had been ignored in the earlier analysis.

  An online course has:
  [ ] Exactly the same indicators of quality as an on-campus course
  [ ] The same indicators of quality as an on-campus course, plus unique indicators
  [ ] Different indicators of quality than an on-campus course

FIGURE 1
QUALITY IN AN ONLINE COURSE VS. QUALITY IN AN ON-CAMPUS COURSE

Analysis

The data produced from the questionnaire described above were analyzed quantitatively. The responses on the 15 quality-indicator rating questions were converted to numbers using a scale ranging from 2 for "Crucial to a good online course" to -2 for "Disastrous to a good online course". Means, standard deviations, and distribution patterns were determined for each of the quality-indicator candidates in order to determine a value rating for each item.

The ranking of the quality-indicators was determined by tabulating the number of times each item was listed among the top five indicators of quality in an online course. To determine an ordinal ranking, the number of first-place listings for an item was multiplied by five, the number of second-place listings by four, third-place listings by three, fourth-place listings by two, and fifth-place listings by one. The products were added for each item to determine a weighted ranking.

Finally, a factorial analysis was conducted. The items identified during the "brainstorming" process appeared to fall naturally into three rather general categories: Instructor-Student Interaction, Student-Student Interaction, and Class Organization (Table III). The means, standard deviations, and average weighted ranking (sum of the weighted rankings of each item in the category, divided by the number of items in the category) for each of the three factors were calculated to assess the meaningfulness of the categories as descriptors of quality online technology-intensive courses.
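The rating conversion and the weighted-ranking tabulation described above are simple enough to sketch in code. The functions below are an illustrative reconstruction, not the authors' actual analysis scripts; the function and variable names are ours, and only the five-point scale (Table II) and the 5-4-3-2-1 weighting come from the paper:

```python
# Illustrative sketch of the two scoring procedures described above.

# Numeric conversion of the Table II rating scale.
RATING_SCALE = {
    "Crucial to a good online course": 2,
    "Important in a good online course": 1,
    "Unimportant (neutral) in an online course": 0,
    "Really rather not see in an online course": -1,
    "Disastrous to a good online course": -2,
}

def value_rating(responses):
    """Mean numeric rating for one quality-indicator."""
    scores = [RATING_SCALE[r] for r in responses]
    return sum(scores) / len(scores)

def weighted_rank(top5_counts):
    """Weighted ranking: first-place listings count 5, second 4, ..., fifth 1.

    top5_counts is [n_first, n_second, n_third, n_fourth, n_fifth].
    """
    return sum(n * w for n, w in zip(top5_counts, (5, 4, 3, 2, 1)))

# Example: "Expectations clearly articulated" was listed first by 5 students,
# second by 9, third by 5, fourth by 5, and fifth by 3 (Table V),
# reproducing the weighted rank of 89 reported in the paper.
assert weighted_rank([5, 9, 5, 5, 3]) == 89
```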
RESULTS

Table IV presents the means and standard deviations of the 15 quality-indicators, organized by category. A review of these data indicates that the students participating in the study did distinguish among the quality-indicator candidates: the means ranged from a low of 0.3182 to a high of 1.8409.
TABLE IV
QUALITY-INDICATOR RATINGS

                                                                  Distribution*
                                                  Mean     SD    -2  -1   0   1   2
Factor 1: Instructor-Student Interaction
  Connection with professor                      1.3864  0.647    0   0   4  19  21
  Effective instructor-to-students communication 1.6591  0.561    0   0   2  11  31
  Effective student-to-instructor communication  1.5455  0.541    0   0   1  18  25
  Feedback clear, timely, and meaningful         1.8409  0.365    0   0   0   7  37
  Expectations clearly articulated               1.7500  0.482    0   0   1   9  34
Factor 2: Student-Student Interaction
  Connection with other students                 0.7442  0.685    0   1  14  23   5
  Effective student-to-student communication     0.4773  0.690    1   0  22  19   2
  Peers adequately prepared for online course    0.5909  0.778    1   0  20  18   5
  Class size                                     0.4318  0.579    0   0  27  15   2
Factor 3: Class Organization
  Immediately engages the student                1.1591  0.672    0   0   7  23  14
  Learner (student)-centered                     1.3256  0.636    0   0   4  21  18
  Anytime, anyplace learning                     1.5909  0.576    0   0   2  14  28
  Self-paced schedule                            0.5909  1.134    3   5   8  19   9
  Simulates an in-class 'feel'                   0.3182  0.873    1   5  21  13   4
  Incorporation of leading-edge technologies     1.1818  1.028    1   1   8  15  18

* -2 = Disastrous to a good online course; -1 = Really rather not see in an
  online course; 0 = Unimportant (neutral) in an online course; 1 = Important
  in a good online course; 2 = Crucial to a good online course

TABLE VI
FACTORIAL ANALYSIS OF QUALITY-INDICATOR CATEGORIES

  Category                          Mean      SD     Average Weighted Ranking
  Instructor-Student Interaction   1.6364   0.5515          63.0000
  Student-Student Interaction      0.5600   0.6974           4.2500
  Class Organization               1.0266   0.9529          25.8333
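Because Table IV reports both the response distribution and the mean for each item, the two columns can be cross-checked against each other. A small sketch of that check (the function name is ours; the counts are taken from the table, and the divisor is the number of responses actually recorded for the item, since a few items drew 43 rather than 44 responses):

```python
def mean_from_distribution(counts):
    """Mean rating given counts of responses at scores -2, -1, 0, 1, 2."""
    scores = (-2, -1, 0, 1, 2)
    return sum(n * s for n, s in zip(counts, scores)) / sum(counts)

# "Feedback clear, timely, and meaningful": no ratings below 1, then
# 7 ratings of 1 and 37 of 2, reproducing the reported mean of 1.8409.
assert round(mean_from_distribution([0, 0, 0, 7, 37]), 4) == 1.8409

# "Self-paced schedule", the most dispersed item (SD 1.134): 3, 5, 8, 19, 9.
assert round(mean_from_distribution([3, 5, 8, 19, 9]), 4) == 0.5909
```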
Interestingly, the four items included in the Student-Student Interaction factor recorded four of the six lowest means. Table V details the number of top-five rankings and the calculated weighted ranking for each of the quality-indicators, again organized by category. Once again it is interesting to note that the four items included in the Student-Student Interaction factor recorded the four lowest weighted ranks. When the factorial analysis is applied to these data, the results are striking. As illustrated in Table VI, the differences in perceived value among Instructor-Student Interaction, Student-Student Interaction, and Class Organization among these students in a technology-intensive program of study are quite noteworthy.
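The average weighted rankings reported in Table VI follow directly from the per-item weighted ranks in Table V, as the sum of a category's item ranks divided by its item count. A sketch of that computation (names are ours; the rank values are from Table V):

```python
# Per-item weighted ranks from Table V, grouped by factor.
WEIGHTED_RANKS = {
    "Instructor-Student Interaction": [42, 60, 46, 78, 89],
    "Student-Student Interaction": [6, 6, 3, 2],
    "Class Organization": [11, 18, 53, 19, 13, 41],
}

def average_weighted_ranking(ranks):
    """Sum of the weighted ranks in a category divided by the item count."""
    return sum(ranks) / len(ranks)

averages = {name: round(average_weighted_ranking(r), 4)
            for name, r in WEIGHTED_RANKS.items()}
# Reproduces the Table VI column: 63.0, 4.25, and 25.8333 respectively.
```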
TABLE V
QUALITY-INDICATOR RANKINGS

                                                  1   2   3   4   5  Weighted Rank
Factor 1: Instructor-Student Interaction
  Connection with professor                       5   3   1   0   2       42
  Effective instructor-to-students communication  6   5   3   0   1       60
  Effective student-to-instructor communication   2   4   4   3   2       46
  Feedback clear, timely, and meaningful          3   6   7   5   8       78
  Expectations clearly articulated                5   9   5   5   3       89
Factor 2: Student-Student Interaction
  Connection with other students                  1   0   0   0   1        6
  Effective student-to-student communication      0   0   0   3   0        6
  Peers adequately prepared for online course     0   0   0   1   1        3
  Class size                                      0   0   0   1   0        2
Factor 3: Class Organization
  Immediately engages the student                 1   1   0   0   2       11
  Learner (student)-centered                      2   1   1   0   1       18
  Anytime, anyplace learning                      2   3   5   6   4       53
  Self-paced schedule                             0   1   3   3   0       19
  Simulates an in-class 'feel'                    0   1   1   2   2       13
  Incorporation of leading-edge technologies      5   1   3   0   3       41

CONCLUSIONS
This study produced some interesting and somewhat unexpected results. As is the case with most studies, it likewise raised additional questions for future research. The quality-indicators seemed to break nicely into three categories, having to do with interactions (instructor-to-student, student-to-student) and organizational issues. Looking at the three factors, there are some expected findings and some surprises. Of course, some results might differ with a different student population. At this point, no analysis was carried out on gender, age, or prior experience with online courses; changes in any of those demographics might generate different results in the factor analysis.

Factor 1 – Instructor-Student Interaction. This factor had the highest mean and the smallest standard deviation in the factorial analysis (Table VI). That seems to indicate that the items in this group are of interest to students and an important part of the quality-indicators in an online course. Looking at the individual items in this group, the two with the highest weighted rank are "Feedback clear, timely, and meaningful" and "Expectations clearly articulated". These items are important to any course, online or traditional. This result lends additional credence to the finding that 26 of the participants said an online course has the same indicators of quality as an on-campus course, plus unique indicators. Our findings are also consistent with Rossman's [4] observation that online learners want prompt and specific feedback. In any course students want feedback, but in the online environment, without good feedback in a timely fashion, students can come to feel isolated. This isolation is exacerbated online because students and instructors do not have the normal "clock" to measure the passage of time through the course [6]. Setting a level of expectation for feedback helps students know what to expect and when; grading rubrics are useful for making feedback meaningful.

The other items in this group have to do with communication and connection with the professor. Again, these ranked fairly high as a group. Effective use of communication is especially important in facilitating learning [6]. Although this study did not distinguish course-related communication from general chat, it does confirm the findings of Stevenson, Sander, and Naylor [7] on the importance of social chat. Baird [8] studied the factors that seemed to support successful doctoral students; his findings included persistence and determination, good time management, intense interest, a task and detail orientation, fellowship in the department, supportive human relationships, and the faculty advisor-student relationship. Although Baird's study did not distinguish online from traditional courses, many of his characteristics are typical of online students. This finding is also consistent with that of Richardson [2], who found that online students have superior time-management skills.
The communication issues are strong ones.

Factor 2 – Student-Student Interaction. This factor had the most surprising results. We expected class size to be an issue, but by weighted rank this item came in absolute lowest of the 15 indicators. This group of students apparently felt the interaction and communication were strong enough that class size had no impact. Perhaps class size is a faculty issue, not a student issue. We were also surprised to see "Connection with other students" and "Effective student-to-student communication" rank relatively low. Our hypothesis is that this result may depend on the student group: typically, technology students are not as involved in communication with their peers as liberal arts and social science students. This result is in contrast to the research findings of others [5, 7]. Repeating the ranking with a different student population may generate different rankings; likewise, replicating this study with faculty rather than student participants could generate differing results. We also found little emphasis on "Peers adequately prepared for online course". Our explanation is that the students work rather independently (often in their own homes and/or offices) and simply do not see some of the issues seen by the faculty member, who must deal with the varied range of technological experiences. Again, online students are seen as independent learners.

Factor 3 – Class Organization. It is not surprising to see the high weighted ranking of "Anytime, anyplace learning" in this category. Most students are attracted to distance education because of the flexibility of the learning environment. This flexibility allows students to learn at a time convenient to them and affords them the opportunity to be anywhere while learning, freeing them from geographic and time constraints. Another highly ranked item in this category was "Incorporation of leading-edge technologies". Recall that this study was conducted in a school of computer and information sciences with students who have some technological background, experience, and interest; it would be counter-intuitive to expect them not to rank this item among the higher-rated quality-indicators. The other items in this grouping had middle-of-the-road rankings. There was also a large standard deviation in this third factor, so it is hard to say much more about the importance of these issues. A different population may rank this set of items differently.

Summary

Distance education is becoming a bigger part of our repertoire of teaching choices. Commercial course management systems and home-grown tools will continue to proliferate and shape the online learning environment. As bandwidth continues to increase and technology continues to become more powerful and affordable, the online environment will grow more sophisticated and standards will start to emerge. More students will gain experience with this medium and bring greater expectations to these courses. This study is one attempt to look beyond the technological improvements and new software in online learning environments.
It is an initial look at the quality indicators of an online course from the doctoral student perspective. A different type of student population would almost certainly generate different rankings; for example, social science students may place more emphasis on the learning community than engineering students. Age, gender, and experience may also affect the quality indicators. Repeating the ranking with a group of faculty may likewise generate a different weighted rank. The important contributions of this study are the following:

1. An initial identification of quality-indicators for the online environment.
2. A categorization of the quality-indicators.
3. A methodology that can be easily repeated by other researchers.
4. A beginning look at pedagogy in online learning.
Of course there is still more to learn, but online learning environments are certainly going to be a part of the future of education. Palloff and Pratt summarized this quite well when they said, "Successful online distance education is a process of taking our very best practices in the classroom and bringing them into a new arena. In this new arena, however, the practices may not look exactly the same" [9, p. 6].
REFERENCES

[1] Lake, D. "Reducing isolation for distance students: an on-line initiative". Open Learning, 14(3), 1999, 14-23.
[2] Richardson, J.T.E. "Mature students in higher education: I. A literature survey on approaches to studying". Studies in Higher Education, 19, 1994, 309-325.
[3] Kubala, T. "Addressing student needs: Teaching on the Internet". T.H.E. Journal, March 1998.
[4] Rossman, M. "Successful Online Teaching Using an Asynchronous Learner Discussion Forum". Journal of Asynchronous Learning Networks, 3(2), 1999.
[5] Bernard, R., Rubalcava, B. and St-Pierre, D. "Collaborative online distance learning: issues for future practice and research". Distance Education, 21(2), 2000, 260-277.
[6] Cohen, M.S. and Ellis, T.J. "Teaching technology in an online, distance education environment". Proceedings: Frontiers in Education Conference, Reno (pp. T1F-1 – T1F-6). Piscataway, NJ: IEEE, 2001.
[7] Stevenson, K., Sander, P. and Naylor, P. "Student perceptions of the tutor's role in distance learning". Open Learning, 11(1), 1996, 41-49.
[8] Baird, L.L. "Completing the dissertation: theory, research and practice". New Directions for Higher Education, 25(3), 1997, 99-105.
[9] Palloff, R.M. and Pratt, K. Building Learning Communities in Cyberspace: Effective Strategies for the Online Classroom. San Francisco, CA: Jossey-Bass, 1999.