E-Learn 2016 - Washington, DC, United States, November 14-16, 2016

Asking Students What They Think: Student User Experience (UX) Research Studies to Inform Online Course Design

Ronda Reid, Andrea Gregg, Vicki Williams, Amy Garbrick
Penn State University, United States
[email protected], [email protected], [email protected], [email protected]

Abstract: User experience (UX) testing denotes evaluating a service or product by testing it with representative users. Such testing is commonplace in the online world generally, but less common in online learning specifically. Testing online course design and preferences with students, the representative users of online education services, has the potential to minimize learning barriers that result from design problems or non-intuitive learning designs, support the student learning experience, and improve online learning overall. This paper explores two user-based research efforts: a user survey to measure online course design preferences, and think-aloud, task-based testing in a fully developed online course to gather data and feedback on user experience in online course design. Both studies took place with current online students attending a large R-1 Mid-Atlantic university. Results indicated the task-based nature of online learners and their desire for efficient course design. Additionally, qualitative feedback suggests that in addition to their task-oriented approach to individual courses, online learners also desire an affective connection with their university community. Lastly, the data suggest that students appreciate the opportunity to provide feedback pertaining to their online learning.

Introduction

User experience (UX) focuses on understanding users, what they value and need, and on promoting the quality of the user's interaction with a specified service or product (U.S. Department of Health & Human Services, 2016). It is a line of research that was adopted early in the field of human-computer interaction (HCI; Hassenzahl & Tractinsky, 2006). Considering human experience as an important dimension for investigation is also common in other fields such as advertising and physical product design (Parrish, Wilson, & Dunlap, 2011). While online learning would seem a logical site for extensive UX research, given that the web interface mediates nearly all of online learners' experiences with their courses, UX efforts in this context have typically lagged behind those of non-academic online counterparts (Fisher & Wright, 2010). In part this may be because of important differences between e-commerce and online learning environments: online learning must also attend to instructional approaches, learning outcomes, quality and levels of learning, and the order of content (Notess, 2001). As many educators can attest, student satisfaction is not the primary goal of an online course; student learning is. That said, there is an important distinction between challenges in learning experiences caused by confusing or non-intuitive interfaces, which should be minimized, and challenges inherent to the learning itself. Holding course development to a less rigorous examination process in terms of UX than its non-academic online counterparts risks creating online educational environments that lead to counterproductive student experiences. When learners cannot find what they need in an online environment, it is harder for them to learn. In addition, frustrated learners tend to either drop out or become an undue burden on instructors (Faculty Focus, 2009). Often when learners complain about learning materials delivered online, those complaints involve unclear buttons, confusing menus, or links that are not intuitive (Ardito et al., 2006). It is toward the end of using UX research to improve online learning experiences that the following two studies were conducted.


Research Studies

This paper gives a brief overview of two research studies conducted at a large Mid-Atlantic R-1 university. The first is a completed study investigating online learners' layout, navigation, and look-and-feel preferences for their online courses (Study 1). The second is a study, with preliminary analysis, of think-aloud, task-based observations of online learners (Study 2).

Study 1: User Preferences (Survey)

Study Design

In April 2016, 4,900 online students over the age of 18 who were taking at least one course in the Canvas LMS were sent an email asking them to complete an online user survey. Those who completed the survey, delivered through Qualtrics, were eligible for a drawing for one of five Amazon.com gift cards. The intent of the survey was to gain an understanding of student preferences in online course design, specifically pertaining to three main areas: the course landing page, syllabus design, and the course navigation structure. Students were shown four screenshots for the course landing page and three screenshots for the syllabus, and were then asked various questions regarding preference, usage, and functionality. For course navigation, students were asked to rank the navigational elements in order of importance. The survey also gathered qualitative feedback pertaining to affect toward the university as well as overall thoughts about course design. Responses were collected over a seven-week period, resulting in 743 survey views, 516 completed surveys, and an additional 50 partially completed surveys, for an overall response rate of slightly more than 10%. Of the participating students, 37.4% were graduate students and 62.6% were undergraduates. Undergraduates were typically in their 5th through 8th semesters. The maximum participant age was 68, with a mean age of 35.5 years. Both quantitative and qualitative data were collected and analyzed.

Findings

Student preference trends in each of the three main areas (landing page, syllabus, and navigation) indicated a task-based approach and a desire for efficiency in course design. For the course landing page, students indicated that due dates, a course outline, and timely instructions were the most important elements to them. Student preference also indicated that usefulness and the ability to find information quickly were more important than the attractiveness of the page. The preference for seeing due dates and finding information quickly appeared again in the results for the course syllabus. Students overwhelmingly preferred a syllabus option in which the course schedule was listed prominently on the syllabus page, even if that meant clicking to a new page for other syllabus-related information. Regarding the navigation structure, the top student preference was to have links in the following order: Course Home Page, then Assignments (including information on quizzes, graded discussions, and online submissions), then Announcements (information and updates posted by the instructor), and then Modules (used to organize course content, including all assignments, discussions, and quizzes).
A brief sampling of student comments in the open-ended text area reinforces the notion of efficiency in course design: "Getting to my coursework quickly and efficiently is the most important thing to me." "It was quicker and more efficient for me as a user." "Easiest to find information quickly and efficiently." While student preferences indicated a strong desire for efficiency in course design, comments in the open-ended text area of the survey also demonstrated a desire for an affective connection to the university. For one of the course landing page options, students were shown a picture of a statue of the university mascot. While this picture did not resonate with students in terms of usefulness or practicality, the imagery produced numerous positive comments about an affective connection with the university and the role of the mascot image, and this option also ranked highest as the most attractive landing page. While this need for connection was not as strong as the need for efficiency, it nevertheless indicated students' desire to have a sense of connectedness through their online course design. The following quotations illustrate this: "The welcome page with the [university] mascot evoked a sense of belonging to [the university]. Canvas feels very generic on its own and the [school mascot] is a symbol of [the university]." "The picture of the [mascot] gives the sense of belonging."
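The paper does not describe the tooling used to analyze the ranking question on navigational elements. As a minimal, hypothetical sketch only (not the authors' actual analysis), assuming the Qualtrics responses were exported to a CSV with one column per navigational element holding each student's rank (1 = most important), mean ranks could be tabulated as follows. The file name and column names are illustrative assumptions.

    # Hypothetical sketch: summarizing ranking responses from a survey export.
    # Assumes a CSV where each column holds a student's rank (1 = most important)
    # for one navigational element; file and column names are illustrative only.
    import pandas as pd

    ELEMENTS = ["Course Home Page", "Assignments", "Announcements", "Modules"]

    def mean_ranks(csv_path: str) -> pd.Series:
        """Return navigational elements ordered by mean rank (lower = preferred)."""
        responses = pd.read_csv(csv_path)
        # Keep only the ranking columns and ignore partially completed rows.
        ranks = responses[ELEMENTS].dropna()
        return ranks.mean().sort_values()

    if __name__ == "__main__":
        print(mean_ranks("navigation_rankings.csv"))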

Study 2: User Experience (Think-Aloud Observations)

Study Design

In November 2015, online faculty members in a college were asked to distribute a recruitment survey to undergraduate students asking for volunteers to participate in a User Experience (Think-Aloud) Research Study. The recruitment survey was active for ten days and resulted in forty-three respondents. From the respondents, a total of five online students participated in the study between December 2015 and January 2016. The selection of five participants was based on usability best practices (Nielsen, 2000). All participants were currently taking at least one fully online course in either the ANGEL LMS or the Canvas LMS; both systems were in use as the university transitioned all of its online course offerings from ANGEL to Canvas. After participating in the study, each participant was awarded a $50 gift certificate to Amazon.com. The five participants were chosen to match the college's online undergraduate demographic population and had the following breakdown: four male students and one female student; one student aged 20-23; one aged 24-29; two aged 30-39; and one aged 40-49.

Figure 1: Age breakdown of undergraduate online students in the college.

The study's aim was to evaluate the college's online course design. Each participant was placed in a fully developed online course and asked to complete eight predefined tasks representing common elements present in any online course, such as finding when an assignment is due, checking a graded assignment, locating a reading, or contacting group members about a group assignment. While participating in the study, participants had their computer mouse movements recorded, and both audio and video recordings of the participants were made. Participants were also asked to rank the ease or difficulty of each task on a paper form using a scale from one to five. The form also asked participants, through open-ended questions, to write any comments and feedback regarding each task or their general thoughts regarding the LMS. Four of the five participants recorded comments on the paper form, while one participant gave verbal responses to the questions on the form. Both quantitative and qualitative data were collected, thus supporting a mixed methods approach.

Figure 2: Picture of room where User Experience (Think-Aloud) Research Study took place.

Findings

Overall, both quantitative and qualitative data support the notion that students did not have difficulty completing the assigned tasks, despite this being, in some cases, the first time participants were interacting with the Canvas LMS. On the 1 (easy) to 5 (difficult) scale used to record task attempts and completions, the majority of the tasks received ratings below 2.0, reflecting relative ease in completing them. A brief sampling of student comments reinforces the notion of ease in completing tasks:

Student   Task   Comment
1         2      "Not too bad, easy once you get familiar."
2         7      "Very straightforward."
3         6      "Easy to locate information."
4         1      "...it was fairly easy to find."
5         8      "Easy to find."

Figure 3: Sample of student comments while completing tasks.

However, even with the overall relative ease of completing the assigned tasks, there was much to learn from witnessing and recording students interacting with the online course design. For example, students predominantly went to one area of the course site to complete all tasks. This area was used to organize course content, including all assignments, discussions, and quizzes, and it reinforced the findings on student preferences for the course landing page from the User Preferences (Survey) study. For students who did have difficulty with a particular task, watching their mouse movements and clicks, in addition to hearing their comments, yielded information on what and how students think when trying to complete typical course-related tasks. This information pointed to potential course design flaws in the terminology used and a need to provide additional information to students regarding terminology and functionality. Furthermore, because the university is in the process of transitioning from one LMS, ANGEL, to another, Canvas, many students in the study had more familiarity with ANGEL than Canvas. Surprisingly, although no study questions concerned ANGEL, students actively compared the Canvas LMS to ANGEL. In some cases, a student's knowledge of ANGEL seemed to influence their use and judgment of Canvas: they often applied terminology and functionality from ANGEL to Canvas, even when not applicable. This led to an effort to better educate students during the LMS transition on how to use the new system.
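The paper summarizes the 1 (easy) to 5 (difficult) paper-form ratings but does not describe how they were aggregated. As a minimal, hypothetical sketch, assuming the ratings were keyed in per task, per-task means could be computed as shown below. The values in the ratings dictionary are illustrative placeholders, not the study's actual data.

    # Hypothetical sketch: summarizing 1 (easy) to 5 (difficult) ratings per task.
    # The ratings dictionary below is illustrative only, not the study's data.
    from statistics import mean

    # ratings[task_number] = list of ratings given by the five participants
    ratings = {
        1: [1, 2, 1, 1, 2],
        2: [1, 1, 2, 1, 1],
        # ... tasks 3-8 would follow in the same form
    }

    for task, scores in sorted(ratings.items()):
        avg = mean(scores)
        ease = "relative ease" if avg < 2.0 else "some difficulty"
        print(f"Task {task}: mean rating {avg:.1f} ({ease})")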

Discussion

The goal of user experience research is to identify problems while collecting qualitative and quantitative data regarding participant satisfaction with the product (U.S. Department of Health & Human Services, 2016). Both the User Preferences (Survey) and the User Experience (Think-Aloud) studies afforded collecting qualitative and quantitative data that pointed to student design preferences as well as issues with course design and the transition to a new LMS. In both studies, students indicated a need to find information quickly, easily, and efficiently. Students showed a particular interest in viewing and using one area of the course that would give them an overview of all course content, assignments, and due dates. In both studies, many of the participants qualify as adult students. As Cercone (2008) noted, adult students need to be actively involved in their learning process. The findings in both studies suggest this involvement also applies to online course design: students want to be heard and involved in their educational process, even in regard to the course design they use while pursuing their education. An additional item of note was the strong desire of some students to have their opinions known, and the gratitude expressed by other participants in both the User Preferences (Survey) and the User Experience (Think-Aloud) studies for the opportunity to give their opinions. For the User Experience study, one participant drove eight hours to take part, noting that they wanted to contribute to their education and try to help other students. In the User Preferences (Survey), multiple comments submitted by students in the open-ended text box thanked the researchers for asking their opinion. A brief selection of student comments from the User Preferences (Survey) follows: "I appreciate that my feedback will be used to allow for canvas to help students be successful." "Thank you for the opportunity to contribute to the learning experience..." "Thank you for organizing this survey. I hope that the input from students is useful." "Thank you for taking the time to seek our feedback."

Conclusion

The continued popularity of online learning makes user experience evaluation increasingly important to facilitate learning and ensure an appropriate learning interface (Moore et al., 2008). Yet, although researchers state that institutions need to continually examine the experiences of their students in order to determine whether expectations regarding degree programs are being met (Deggs et al., 2010), this examination does not seem to extend to students interacting within the online learning environment. Indeed, as noted by Wu (2016), few studies have been reported that identify factors of learners' online learning experience when interacting with content. In a world where students' online professional, online personal, and online educational lives are mere clicks apart, holding online course development to a less rigorous examination process than its non-academic online counterparts risks creating disruptive and unproductive online educational environments. By omitting user experience research, which is considered best practice in other online industries, online education risks doing a grave disservice to the user, or in this case, the student. More research on user experience in online course design and learning is needed, both to inform best practices for those in online education and to add to the limited body of scientific knowledge in this area.


References

Ardito, C., Costabile, M. F., De Marsico, M., Lanzilotti, R., Levialdi, S., Roselli, T., & Rossano, V. (2006). An approach to usability evaluation of e-learning applications. Universal Access in the Information Society, 4(3), 270-283.
Cercone, K. (2008). Characteristics of adult learners with implications for online learning design. AACE Journal, 16(2), 137-159.
Deggs, D., Grover, K., & Kacirek, K. (2010). Expectations of adult graduate students in an online degree program. College Student Journal, 44(3), 690-699.
Faculty Focus (2009, March). Online course design: 13 strategies for teaching in a Web-based distance learning environment. Retrieved from http://www.facultyfocus.com/free-reports/online-course-design-13-strategies-forteaching-in-a-web-based-distance-learning-environment/
Fisher, E. A., & Wright, V. H. (2010). Improving online course design through usability testing. Journal of Online Learning and Teaching, 6(1), 228.
Hassenzahl, M., & Tractinsky, N. (2006). User experience - a research agenda. Behaviour and Information Technology, 25(2), 91-97.
Merriam, S. B. (2001). Andragogy and self-directed learning: Pillars of adult learning theory. New Directions for Adult and Continuing Education, 89(3), 3-14. Retrieved from http://umsl.edu/~wilmarthp/modla-links2011/Merriam_pillars of anrdagogy.pdf
Moore, J. L., Dickson-Deane, C., Galyen, K., Vo, N., & Charoentham, M. (2008). E-learning usability instruments: What is being evaluated? Proceedings from E-Learn.
Nielsen, J. (2000, March). Why you only need to test with 5 users. Retrieved from https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/
Notess, M. (2001). Usability, user experience, and learner experience. Retrieved from https://elearnmag.acm.org/featured.cfm?aid=566938
Parrish, P. E., Wilson, B. G., & Dunlap, J. C. (2011). Learning experience as transaction: A framework for instructional design. Educational Technology, 51(2), 15-22.
Stanley, R., & Kurtz, P. (2011, October). Usability testing: A key component in e-learning design. In E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (Vol. 2011, No. 1, pp. 418-423).
U.S. Department of Health & Human Services (2016). Usability testing. Retrieved from http://www.usability.gov/how-to-and-tools/methods/usability-testing.html
U.S. Department of Health & Human Services (2016). User experience basics. Retrieved from https://www.usability.gov/what-and-why/user-experience.html
Wu, Y. (2016). Factors impacting students' online learning experience in a learner-centred course. Journal of Computer Assisted Learning.
Zaharias, P. (2004). Usability and e-learning: The road towards integration. eLearn Magazine, 2004(6), 4.
