Accepted for publication at ECGBL 2016, Paisley, Scotland, 6-7 October. Game elements not reflected in the proceedings have been added to this version of the paper.
How can educators with limited resources gamify their classes? A design-based approach

Björn Lefers, Marcus Birkenkrahe
Berlin School of Economics and Law, Berlin, Germany
[email protected] [email protected]
Abstract

Hamari et al. (2014) reviewed 24 peer-reviewed empirical studies on gamification and concluded that gamification works, but that both the role of the context and the qualities of the users constitute confounding factors for its success. We suggest that there is a third aspect to be taken into account: the design quality of a gamified system. Using a design-based methodology and both qualitative and quantitative data, this empirical study addresses the following question: How can a gamified learning website be designed, developed and implemented? The goal is to show educators and researchers how they can build their own gamified learning website with limited resources (time, money and personnel) and to highlight confounding factors. This is important because gamified systems can only be compared properly if they are designed well and in a similar fashion. The description of the design process of the gamified learning website follows the ADDIE model, which defines five steps: analysis, design, development, implementation and evaluation. We illustrate our approach with an experiment conducted with undergraduates in two classes between 2013 and 2015. The class website was built with WordPress, an open source content management system, and some additional plugins. The quality of a gamified system does depend on the types of design elements that are used, but even more so on the behaviors that these design elements stimulate in learners and on how they are embedded within a website. Guided by theory, we show what a well-designed reward system for gamification might look like. Compared with previous classes, the gamified learning website led to more activity and better grades. These positive effects are due to the extrinsic effectiveness of the gamified system. Negative feedback from learners concerned too much content, too few popular content types, and usability issues. The process of reading this paper is also gamified.
Keywords: Gamification, game design, game-based technologies, gameful design, instructional design
1. Introduction

This paper is part of a planned series of publications on making knowledge and skills more accessible for people with a social mindset to help make the world a better place. This paper practices what it preaches, because the paper itself is gamified: while reading it you will collect badges and points (roughly one point per word read):
Figure 1: Badge: Getting Started

Deterding et al. (2011) define gamification as “the use of design elements characteristic for games in non-game contexts.” Werbach and Hunter (2012) define design elements as “specific characteristics of games that you can apply in gamification.” The main goal of games is to entertain users (cf. Deterding et al. 2011; Zichermann & Cunningham 2011; Flatla et al. 2011). Thus gamification in the educational context is seen as a way to make learning more fun. But how can teachers gamify their class successfully?
1.1 Status Quo

Hamari et al. (2014) reviewed 24 peer-reviewed empirical studies on gamification and conclude that gamification works, but that there are some confounding factors; they name two main aspects: “1) the role of the context being gamified, and 2) the qualities of the users.” We suggest that there is a third aspect to be taken into account: the design quality of a gamified system.
1.2 Purpose

The goal of this paper is to show educators and researchers how they can build their own gamified learning website with limited resources (time, money and personnel), to illustrate what constitutes good design quality, and to show why it is important to take design quality into account as a confounding factor.
1.3 Research question

How can a gamified learning website be designed, developed and implemented?
1.4 Methods

This paper uses a design-based methodology following the definition by Wang and Hannafin (2005): “[W]e define design-based research as a systematic but flexible methodology aimed to improve educational practices through iterative analysis, design, development, and implementation, based on collaboration among researchers and practitioners in real-world settings, and leading to contextually-sensitive design principles and theories.” The educational practice to which we apply this method is a finance class taught by Birgit Felden, a professor of business management at the Berlin School of Economics and Law, in 2013-2015. As an instructional designer, one of the authors of this paper, Björn Lefers, was significantly involved in the analysis, design, development and implementation of this class; the teacher and the designer made many decisions together. The other author, Marcus Birkenkrahe, supervised the thesis that led to this paper. Our investigation can be classified as Participatory Action Research (PAR) in the sense of Freire (Brydon-Miller, 2003). In the remainder of this paper we will not use the names of the persons involved, but refer to their roles: teacher, designer and authors of this paper.
1.5 Structure

The description of the design process of the gamified learning website follows the ADDIE model, which defines five steps: analysis, design, development, implementation and evaluation (Molenda, 2003). To assess the effects of the website this empirical study uses both qualitative and quantitative data. More details will be provided in the section on evaluation.
Figure 2: Badge: Intro
2. Analysis

2.1 Description

In the summer semester of 2013 the teacher delivers her class “Investition und Finanzierung” (engl. Finance and Investment) in a blended learning format, as defined by Graham (2010), by combining face-to-face instruction with a website. Classes start with a short talk by the teacher, after which students go through the content of the website individually or in teams. If questions arise they can ask the teacher. The website is built with WordPress and explains the content of the class through short learning units with texts, graphics, questions and sample solutions. Within a topic, multiple nested learning units are shown as a link that expands or collapses when a student clicks on it; sample solutions to questions can be viewed similarly. At the bottom of each topic page students can comment.
2.2 Evaluation

We evaluate the class based on data from Google Analytics, exam results, and observations and interpretations of the teacher and the designer, who involve other experts in their discussions. Only very few students attend class. The teacher infers that this is due to students learning at home. The students' grade average is similar to previous courses. Students do not access the website very often and leave only a few comments. The designer assumes that the latter is due to the multiple nested learning units; to comment, students have to scroll down a lot and point out which learning unit their comment refers to. This also makes it hard for others to identify the context of a comment and to respond. In addition, students can view sample solutions without providing their own solutions. This increases the likelihood that students only scan the content without doing the work. The teacher and the designer discuss these problems with another instructor, Stefanie Quade, who has a lot of experience running blended learning classes. She points out that the website is always the focal point, no matter whether students are in class or learning online. Because of this, the added value of the classroom sessions and of the website is not clear enough.
Figure 3: Badge: Analysis
3. Design

In this section we focus on the design of the gamified learning website, but first we want to point out how the analysis of the pilot project in the summer semester of 2013 leads to important changes for the class in the winter semester 2014-15.
3.1 Overall Concept

To distinguish between the classroom and the website, and to communicate the added value of both of these elements more clearly, the teacher and the designer adapt the idea of the flipped classroom as defined by Lowell (2013): "We define the flipped classroom as an educational technique that consists of two parts: interactive group learning activities inside the classroom, and direct computer-based individual instruction outside the classroom." Outside the classroom students learn individually with the website to prepare for a certain topic, and inside the classroom they form groups to solve a case study based upon what they have learned online. The teacher only hands out the case study inside the classroom and communicates to the students that a big part of the exam will be solving such a case study. The designer assumes that this has a big impact on the use of the website, because without studying online before class students are not prepared to solve the case study, which also means that they are not well prepared for the exam.
3.2 Information Architecture & Page Layout

To increase the activity on the website the teacher and the designer decide to gamify the website. But before doing so the designer makes some important changes to the information architecture and the page layout of the website. The nested learning units are rearranged so that the hierarchy of the website is only three layers deep: topic, module, unit. This means that students can reach a unit with two clicks, whereas before it took up to six clicks and involved a lot of scrolling. Inspired by KhanAcademy.org, a popular free learning website, the page layout on the unit level is changed as shown in Figure 4.
Figure 4: Page layout of the learning website

Students can navigate to modules and units through the navigation bar on the left. Below the learning content on the right, students can comment. The changes to the information architecture and the page layout stimulate discussions, because commenting involves less scrolling and the context of a comment is clearer. Students now also need to click at least once to proceed to the next unit, which makes it harder to just scan over the content quickly; further improvements relating to this will be described in more detail later.
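To make the three-level hierarchy concrete, the following sketch models it as a simple data structure and walks the two-click navigation path; the class names and sample titles are hypothetical and not taken from the actual WordPress setup.

```python
# Illustrative sketch only (not the actual WordPress data model): the flattened
# three-level hierarchy topic > module > unit that replaced the deeply nested units.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Unit:
    title: str

@dataclass
class Module:
    title: str
    units: List[Unit] = field(default_factory=list)

@dataclass
class Topic:
    title: str
    modules: List[Module] = field(default_factory=list)

# Hypothetical sample content for demonstration
topic = Topic("Financial analysis", [
    Module("Key figures", [Unit("Liquidity ratios"), Unit("Profitability ratios")]),
    Module("Cash flow", [Unit("Direct method"), Unit("Indirect method")]),
])

# Navigation path from the topic page: one click opens a module, a second click opens a unit.
for module in topic.modules:
    for unit in module.units:
        print(f"{topic.title} > {module.title} > {unit.title}  (2 clicks)")
```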
Figure 5: Badge: Coffee
3.3 Gamification Design

Based on Vroom (1964; quoted in Gagné & Deci 2005), Porter and Lawler (1968; quoted in Gagné & Deci 2005) introduced the model of intrinsic and extrinsic motivation. People who are motivated intrinsically enjoy the activity itself, while people who are motivated extrinsically derive satisfaction from the consequence of their activity through a reward (Gagné & Deci 2005). Self-determination theory reconciles intrinsic and extrinsic motivation (Ryan & Deci 2002) by pointing out that extrinsic motivators can help internalize a motivation, thus turning it into an intrinsic motivation, when they support basic psychological needs (Ryan & Deci 2000; 2005). There are three basic psychological needs: competence, autonomy and relatedness (Ryan & Deci 2000). To make learning more fun a gamified system would have to support these basic psychological needs, but how can it do so?

Hunicke et al. (2004) point out that a designer can evoke positive emotions in players through rules, or mechanics. The way players interact with these rules creates a game dynamic. To help assess the effects of rules they distinguish eight different positive emotions games can evoke in players:

- Sensation - Game as sense-pleasure.
- Fantasy - Game as make-believe.
- Narrative - Game as drama.
- Challenge - Game as obstacle course.
- Fellowship - Game as social framework.
- Discovery - Game as uncharted territory.
- Expression - Game as self-discovery.
- Submission - Game as pastime.

It is difficult to foresee which game dynamics certain rules create and which emotions they evoke in players, which is why designing games is an iterative process in which the designer needs to optimise the rules to find an ideal balance. For example, time pressure and opponents can create challenge, while tasks that are difficult to achieve alone can foster fellowship (Hunicke et al. 2004). In the educational context the teacher is the designer who formulates rules that create a dynamic and evoke emotions in learners. The different types of emotions above can help assess, reflect upon and optimise a class. In this paper we focus on the rules of the gamified learning website that, as part of a flipped classroom approach, prepares students to solve the case study inside the classroom.

Hunicke et al. (2004) define the rules, or mechanics, as the “various actions, behaviors and control mechanisms afforded to the player within the game context.” Within the learning website, on the most basic level, the measurable actions and behaviours are limited to:

- checking off learning units,
- solving quizzes and
- commenting.

Werbach and Hunter (2012) make the materialisation of the control mechanisms more tangible through a list of 15 important components, to which we add three from another source (Gamification Wiki, n.d.). The six steps of Werbach and Hunter's (2012) gamification design framework and the questions contained therein helped choose the components. But due to limitations regarding the technical infrastructure of the learning website, the choice not only depends on what can be done in theory, but even more so on what can be developed in practice. Both of these aspects inform each other and lead to the final choice. This is why we describe the components used and how they are embedded within the website in the following section.
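As a rough illustration of this constraint, the sketch below maps the three measurable behaviours to the feedback components they could trigger; the mapping and point values are simplified assumptions, not the actual configuration of the plugins used.

```python
# Simplified sketch of how the three measurable behaviours map to feedback
# components (points, badges, progress, leaderboard). Point values and the exact
# mapping are illustrative assumptions, not the actual plugin configuration.

FEEDBACK_RULES = {
    "check_off_unit": {"points": 25, "feedback": ["progress bar", "content unlocking", "badges"]},
    "solve_quiz":     {"points": 10, "feedback": ["badges"]},
    "write_comment":  {"points": 20, "feedback": ["badges", "leaderboard"]},
}

def apply_action(total_points: int, action: str) -> int:
    """Award points for one measurable behaviour and report the affected components."""
    rule = FEEDBACK_RULES[action]
    print(f"{action}: +{rule['points']} points, updates {', '.join(rule['feedback'])}")
    return total_points + rule["points"]

points = 0
for action in ("check_off_unit", "solve_quiz", "write_comment"):
    points = apply_action(points, action)
print(f"Leaderboard score: {points}")
```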
Figure 6: Badge: Design
4. Development

4.1 Technical Infrastructure

The website is developed with WordPress, an open source content management system, the fee-based theme Enfold, and the fee-based plugins WP Courseware, a learning management system, and WPAchievements for gamification. Badges are designed with the free Open Badge Designer plugin. Videos are produced with GoAnimate, a fee-based animation software for cartoon-style videos; interactive modules such as puzzles are developed with LearningApps, a free tool.
4.2 Elements

We distinguish between content elements that contain learning materials and feedback elements, which are the rules or components we mentioned in the previous section. We will describe the feedback elements in more detail. This video screencast shows the look and feel of both content and feedback elements and how they are embedded within the website: https://youtu.be/wVG84wF_N3g (Lefers 2015)
4.2.1 Content Elements

We distinguish between observable and interactive content elements. Observable content elements are texts, tables, graphics and videos. The designer develops cartoon learning videos with recurring characters similar to a sitcom and embeds external videos from other teachers or short news clips. Interactive content elements are LearningApps (interactive modules), quizzes (multiple choice, open-ended text and true/false) and comments (students receive an email when someone replies to their comment).
4.2.2 Feedback Elements

Cascading information theory: The content of each class is released in small snippets through short learning units.

Content unlocking: When students check off a unit, the next unit is unlocked. The day after a class, the content relevant for the next class is unlocked and students are informed via email.

Countdown: A countdown on the start page shows students how much time they have left to prepare until the next class starts.

Progression: Progress bars visualize the progression of a student within a topic. A checkbox indicates the units that have been checked off. Within a unit, the navigation bar on the left indicates finished, current and future units through colors.
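A minimal sketch of how the unlocking and progression logic might work, assuming a simple in-memory model; the real site relies on the WP Courseware and WPAchievements plugins, whose internals are not shown here.

```python
# Minimal, assumed model of content unlocking and progress tracking.
# The actual site uses WordPress plugins (WP Courseware, WPAchievements);
# this sketch only illustrates the behaviour described above.

class TopicProgress:
    def __init__(self, units):
        self.units = units                  # ordered unit titles
        self.checked_off = set()

    def is_unlocked(self, unit):
        """A unit is unlocked once all preceding units are checked off."""
        index = self.units.index(unit)
        return all(u in self.checked_off for u in self.units[:index])

    def check_off(self, unit):
        if not self.is_unlocked(unit):
            raise ValueError(f"'{unit}' is still locked")
        self.checked_off.add(unit)

    def progress(self):
        """Share of checked-off units, as shown by the progress bar."""
        return len(self.checked_off) / len(self.units)

topic = TopicProgress(["Basics", "Key figures", "Cash flow"])
topic.check_off("Basics")
print(topic.is_unlocked("Key figures"), f"{topic.progress():.0%}")  # True 33%
```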
Figure 7: Progress bars

Achievements & Badges: The development of the reward system in the form of points, achievements and badges is inspired by classes from Lee Sheldon, game designer, instructor and author of the book The Multiplayer Classroom (2011), as well as by the websites KhanAcademy.org, a popular free learning website, and Stackoverflow.com, a question and answer website for developers. Lee Sheldon and these two websites use gamification successfully. Other influences are a recorded video conference with Jeff Atwood, one of the founders of StackOverflow, and a gamification workshop at a German e-learning conference (Anon n.d.).
In theory, achievements are defined objectives and badges are the visual representation of these accomplishments (Werbach & Hunter 2012); both are closely linked, which is why from now on when we speak of badges we also mean achievements. Based on the actions they reward, badges can be grouped into five categories:

- Topic badges (for completed topics)
- Comment badges (for 1, 5, 10, 15, 25 and 50 comments)
- Module badges (for 10, 25, 50 and 75 modules)
- Special badges (for special actions, as a surprise)
- Case study badges (for completed case studies)

Examples of special badges are the exam badge students earn after having collected 8,000 points or the Xmas badge they get for Christmas. Badges are awarded right after the corresponding action has taken place. A popup window at the top right of the website informs the student about the event; it contains the badge graphic, the name of the badge and a short text.
Figure 8: Badge popup window (engl. “Financial analysis - Very good, no KPI can scare you any longer!”)

The shape and color of the five badge types are designed in a way that makes it easy to distinguish them. Bright colours and motivating texts are used to evoke positive associations in students (Kummer et al. 2016). Badges are also used for user onboarding, which “is the process of improving a person's success with a product or service” (Wikipedia contributors 2015). Users are awarded badges for their first login, their first comment, their first completed topic, etc. Through this they slowly get to know the reward system of the website and are motivated to repeat certain actions to collect more badges.
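The following sketch shows how such threshold-based badges might be awarded; the thresholds come from the list above, while the function and badge names are illustrative.

```python
# Illustrative badge-awarding logic based on the thresholds described above.
# Badge names and the awarding function are assumptions; the site itself uses
# the WPAchievements plugin.

COMMENT_THRESHOLDS = [1, 5, 10, 15, 25, 50]
MODULE_THRESHOLDS = [10, 25, 50, 75]
EXAM_BADGE_POINTS = 8000

def badges_for(comments: int, modules: int, points: int) -> list:
    badges = [f"Comment badge ({t})" for t in COMMENT_THRESHOLDS if comments >= t]
    badges += [f"Module badge ({t})" for t in MODULE_THRESHOLDS if modules >= t]
    if points >= EXAM_BADGE_POINTS:
        badges.append("Exam badge")
    return badges

# Example: a student with 16 comments, 52 completed modules and 8,200 points
print(badges_for(comments=16, modules=52, points=8200))
```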
Figure 9: Badge: Special Person

Points: The number of points collected and the last actions for which points have been awarded are always shown in the top right corner.
Figure 10: Points collected
The designer decides how many points are awarded for which actions. Through this a reward system emerges that stimulates certain behaviors in learners.

Type | Actions | Points total | Share
Learning content | Modules completed, topics completed | 7,175 | 75.9%
Comments (example: 15 comments) | Comments written, comment badges awarded | 1,300 | 13.7%
Special badges | Special actions | 631 | 6.7%
Case studies | Case study quizzes answered | 350 | 3.7%
Total | | 9,456 | 100%

Table 1: Reward system

On the learning website students are rewarded for progressing through the learning content and for discussion through comments. The teacher communicates that students need to earn more than 8,000 points to be eligible to take the exam. The reward system is intentionally set up in a way that makes this impossible without having checked off a large part of the learning content and without writing many comments.

Leaderboard: The leaderboard shows how many points students have earned and which badges they have received. This makes the learners' progress transparent and allows students to see how they are doing compared to their classmates.
Figure 11: Leaderboard listing points and achievements (icons) earned by students of the course
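To make the reward structure more tangible, here is a small sketch that tallies a student's points, checks eligibility against the 8,000-point threshold and ranks students for the leaderboard; the per-action point values are assumptions, only the threshold is taken from the course rules.

```python
# Sketch of the reward tally and leaderboard ranking. The 8,000-point threshold
# is from the course rules; per-action point values are illustrative assumptions.

EXAM_THRESHOLD = 8000

# Assumed per-action point values (the designer sets these freely)
POINTS = {"module_completed": 50, "topic_completed": 200,
          "comment_written": 50, "comment_badge": 100,
          "special_action": 150, "case_study_quiz": 50}

def total_points(actions: dict) -> int:
    """actions maps an action type to how often the student performed it."""
    return sum(POINTS[a] * n for a, n in actions.items())

def leaderboard(students: dict) -> list:
    """Rank students (name -> actions) by total points, highest first."""
    return sorted(((total_points(a), name) for name, a in students.items()), reverse=True)

students = {
    "A": {"module_completed": 120, "topic_completed": 8, "comment_written": 20, "comment_badge": 4},
    "B": {"module_completed": 60, "topic_completed": 4, "comment_written": 5, "comment_badge": 2},
}
for pts, name in leaderboard(students):
    print(name, pts, "eligible" if pts >= EXAM_THRESHOLD else "not eligible")
```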
Figure 12: Badge: Development
5. Implementation

Students use the gamified learning website to prepare for class. The teacher unlocks the topics over the course of the semester. Students reach the necessary 8,000 points to be eligible for the exam. At the end of the semester they write the exam, which is graded by the teacher. In this phase the data for the evaluation of the class is collected.
Figure 13: Badge: Implementation
6. Evaluation

6.1 Data Collection

The data traffic of the website was recorded through Google Analytics. The comments (qualitative) and the number of generated comments, points and badges (quantitative) per user are captured with WordPress. In the winter semester of 2014-15 the quality management of the Berlin School of Economics and Law evaluates the class. The survey of the students contains quantitative assessments of the teacher, the concept, the workload and the contribution of the students, as well as qualitative statements in the form of free text. The grades of the students are available for the summer semesters of 2012 and 2013, as well as for the winter semester of 2014-15. Qualitative assessments of exam performances are based on the teacher's opinion. In addition, conversations between the teacher, the designer and other experts were recorded (e.g. emails, meeting protocols). The data was analysed with the spreadsheet software Numbers (Mac) and Google Analytics.
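As an example of the kind of quantitative analysis this data allows, the following sketch computes grade-band averages like those in Table 2 from a hypothetical CSV export; the authors actually used Numbers and Google Analytics, so the file name and column names are assumptions.

```python
# Hypothetical analysis of the per-student data (points, comments, badges, grade).
# The file name and columns are assumed; this pandas sketch only shows how
# grade-band averages such as those in Table 2 could be reproduced.
import pandas as pd

df = pd.read_csv("course_data.csv")  # assumed columns: student, points, comments, case_study_badges, grade

bands = pd.cut(df["grade"], bins=[0.9, 1.9, 2.9, 3.9, 4.9, 5.0],
               labels=["1.0-1.9", "2.0-2.9", "3.0-3.9", "4", "5"])
summary = df.groupby(bands)[["points", "comments", "case_study_badges"]].mean().round(2)
print(summary)
```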
Figure 14: Badge: Spying
6.2 Evaluation 6.2.1 Overall Compared to the 2013 pilot class more students attend class. The dropout rate decreases from 50% to 34%. Out of 31 students taking the exam 4 did not pass. The average grade is 2,79. Compared to the pilot class both the quality of the answers in the exam and the average grade improved (from 3,12 to 2,79). A large majority of
9
ACCEPTED FOR PUBLICATION AT ECGBL 2016, PAISLEY, SCOTLAND 6-7 OCTOBER GAME ELEMENTS NOT REFLECTED IN PROCEEDINGS HAVE BEEN ADDED TO THIS VERSION OF THE PAPER the students rates the workload as too high. With 6,8 hours per week the workload is 2,2 hours above the average workload of the department. Grade
Grade | Number of students | Points (average) | Comments (average) | Case study badges (average)
1.0 - 1.9 | 8 | 9,906 | 27.50 | 5.38
2.0 - 2.9 | 11 | 9,665 | 23.91 | 5.00
3.0 - 3.9 | 6 | 8,673 | 13.00 | 5.33
4 | 2 | 8,439 | 10.50 | 5.50
5 | 4 | 8,858 | 20.00 | 4.50
- | 16 | 3,643 | 3.13 | 1.88
N = 47

Table 2: Correlation of grade, number of comments made, points and case study badges earned by students

6.2.2 Website

Students use the website more often, stay longer and post more comments. But students rate the website rather negatively. They criticise too much content and too few explanations, which is somewhat contradictory. The majority of the students point out the lack of haptic interaction with the website and wish for lecture notes. They explain that they are used to learning with pen and paper and point out that it is difficult to comprehend how topics relate to each other through a website.

Content elements: Within the comments, students report negative experiences far more often than positive ones (258 vs. 47; a ratio of 85% to 15%). Out of 47 positive comments, 15 name graphics (32%), 9 cartoon learning videos (19%), 8 external videos (17%), 8 LearningApps (17%), 4 tables, 2 texts and 1 quiz (all less than 10%). Out of 258 negative comments, 170 name unclear content (66%), 46 content-related mistakes (18%) and 13 technical problems (5%). All learning units contain text, and more than half of them quizzes and comments. Almost one hour of videos is spread over 11% of the learning units; LearningApps are contained in less than 1%. Popular content elements are represented less than less popular ones. On average 2.23 comments were posted per learning unit. The average number of comments per student on the website was 16.

Feedback elements: 39 of 47 students reached 8,000 points and were awarded the exam badge. The more points students collected, the better their grade in the exam. The reward system of the website is strongly marked by challenge: on average 67% of points were awarded for actions based on challenge (checking off learning content), 20% relate to fellowship (comments), 7% to narrative (videos) and 6% to sensation (special actions).
Figure 15: Badge: Evaluation
7. Discussion

In this section we reflect upon the evaluation in light of intrinsic and extrinsic motivators (Vroom 1964; Porter & Lawler 1968; quoted in Gagné & Deci 2005), self-determination theory (Ryan & Deci 2000), the eight positive emotions that can be evoked in learners, the rules, or mechanics, employed (Hunicke et al. 2004) and their materialisation as components (Werbach & Hunter 2012).

The feedback elements reward the consequence of an activity: they are very effective extrinsic motivators. The content elements offer no reward: ideally they are enjoyable in and of themselves and can be seen as intrinsic motivators. Overall, students do not enjoy the content elements, though, because there is too much content and too few popular types of content. The lack of haptic feedback on digital devices makes the interaction less sensory. The feedback elements do support the basic psychological needs, e.g. relatedness through rewarding comments or competence through progression, but they do not turn extrinsic motivators into intrinsic ones. The reward system is too focused on challenge and not balanced in terms of the eight positive emotions of Hunicke et al. (2004). We can confirm what Werbach and Hunter (2012) call “pointsification” and what Hamari et al. (2014) describe as extrinsic rewards undermining intrinsic motivators.

But while Hamari et al. name the role of the context and the qualities of the users as confounding factors, we highlight the quality of the gamification design. Hamari et al. (2014) compared different gamified systems based on the elements (motivational affordances) that were used and their psychological and behavioral outcomes. We believe that in this way they have compared apples with oranges, because the psychological and behavioral outcome does not only depend on, for example, the points and badges that are being awarded, but even more on the actions these elements reward (or punish), how they were developed and what emotions they evoke in learners.

To improve the design quality, the rules or mechanics need to be optimised iteratively to evoke another, better fitting set of emotions in learners and to better satisfy the basic psychological needs. This can be achieved by improving existing content and feedback elements or by adding new ones. For example, relatedness could be increased by allowing students to vote up high-quality comments, rewarding the author with points; autonomy could be increased by allowing learners to start with any topic or unit (and not only at the beginning). Furthermore, new components like teams or gifting could be added to see what effects they have on the game dynamics. When teachers become designers, the resulting reward system reflects the teacher's pedagogical approach and can be used to assess the learning experience of the students.
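One of these proposed improvements, rewarding the authors of high-quality comments through upvotes, could be prototyped roughly as follows; the point value per upvote and the data model are assumptions, since this feature was not part of the evaluated website.

```python
# Rough prototype of the proposed improvement: classmates can vote up a comment,
# and each upvote rewards its author with points. Point value and structure are
# assumed for illustration; this feature was not part of the evaluated website.

POINTS_PER_UPVOTE = 10

class Comment:
    def __init__(self, author: str, text: str):
        self.author, self.text, self.upvotes = author, text, 0

def upvote(comment: Comment, scores: dict) -> None:
    """Register an upvote and credit the comment's author (relatedness + points)."""
    comment.upvotes += 1
    scores[comment.author] = scores.get(comment.author, 0) + POINTS_PER_UPVOTE

scores = {}
c = Comment("Student A", "Nice derivation of the liquidity ratio!")
upvote(c, scores)
upvote(c, scores)
print(c.upvotes, scores)   # 2 {'Student A': 20}
```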
Figure 16: Badge: Discussion
8. Limitations

8.1 Disclosure

The implicit hypothesis of the designer is that gamification concepts in general can have positive effects and that their success depends on their design quality and the manner in which they have been executed. The failure of a concept does not mean that gamification does not work, but that the concept may have to be discarded or optimised.
8.2 Methodological limitations

The limitations of this investigation result from the design-based method itself, because it implies that the results are contextually sensitive. This is why their generalization, their practical application and their comparability with other research are limited. The limits of this investigation therefore lie mostly in the quality of the collected data, which is constrained in the following ways:

- Collected points do not reflect the content-related quality of a comment or how comprehensively or accurately units have been processed (there was no manual grading or checking of quiz answers). In theory, students were able to collect many points with very short quiz answers or comments. This may have been exploited to collect 8,000 points and gain the exam badge.
- A summer semester was compared to a winter semester, but a winter semester is longer than a summer semester. Even though the number of classes was the same, students in the winter semester had more time to study. This may have improved the average grade.
- The evaluation of the activity on the website compared two classes that are based on very different concepts. It is not clear whether an identical concept and another group of students would show similar results.
- No sociodemographic data such as age, gender and media literacy was collected. It is possible, for example, that criticism by students was based on insufficient media literacy.
- Students were not anonymous on the website. It is possible that their comments were affected by the knowledge that the teacher who reads them also grades their exam.
- Google Analytics recorded the data of all website visitors. This includes the teacher, the designer, as well as a small group of coworkers and interested parties that had access to the website. This may have distorted some of the data.
Figure 17: Badge: Limitations
9. Conclusions

First things first! Let's look at the leaderboard (which, due to one of the many constraints of this text-based medium, only shows yourself): you have now collected 11.5 badges and 4,667 points (which means that we are just within the limit of 5,000 words, one of the conference's rules). You may get another badge if you also read the references. Don't cheat! But since you made it this far in the game it will be easy for you to spot the limitations of this gamified system, or paper: it is mostly progress-, or discovery-based. Through the lens of self-determination theory it (hopefully) satisfied your basic psychological need for competence; if you had some say in when and how much of it you read, it may have satisfied your need for autonomy; but it will only satisfy your need for relatedness if you discuss this paper with us or others - which we hope you will, at conferences, via email or through papers!

The research question of this paper is: How can a gamified learning website be designed, developed and implemented? The paper has answered this question by describing the design process of a gamified learning website. Educators and researchers can use this process as a guideline or an inspiration for their own classes. As discussed above, Hamari et al. (2014) reviewed 24 peer-reviewed empirical studies on gamification and conclude that gamification works, but name the role of the context being gamified and the qualities of the users as confounding factors. We added a third aspect, the design quality of a gamified system. In the design section we described how changing the overall concept to a flipped classroom approach, improving the page layout and the information architecture of the learning website, as well as the gamification design were important for the success of the gamified system - these factors contribute to the design quality. In the development section we listed and described the content and feedback elements that were used, what they reward and how they are embedded within the website. We focused on the feedback elements and on the reward system in the form of points, achievements and badges; these elements are aggregated within the leaderboard, which is the main indicator for self-assessment, because it makes it tangible for learners where they stand in relation to their peers.

A broad comparative study (Hamari et al., 2014) has identified the context and the qualities of the users as factors determining the quality of a gamified system. Based on our course experiences, we infer that the design quality of the gamified system is an important additional factor. The design of a gamified system includes a number of elements that lead to interactions between teacher and students, among the students, and with the device itself. We suggest that designers of gamified systems need to pay more attention to these elements, though the basis of our experiment is too narrow to say which elements are more important than others. A common quality of all design elements that we found to be successful (in the sense of leading to a better learning experience) is the increased feedback between teacher and student (Lefers and Birkenkrahe, 2016). But gamified systems can only be compared properly if they are designed well and in a similar fashion. We have shown what a well-designed gamified system might look like and how it can be improved. This approach is aligned with the design-based approach, and more than that: our findings suggest that the best way to design gamified courses is to focus on the process rather than the product. We hope readers can use our findings to make knowledge and skills more accessible to people with a social mindset to help make the world a better place.
Figure 18: Badge: Social Impact

The next papers in our planned series on the social impact of contemporary education will be concerned with the change management aspects of creating blended learning courses.
Figure 19: Badge: Conclusions
References

Anon, n.d. Workshops | Die e-Learning Fachtagung Informatik, [online], http://www.delfi2014.de/workshops#workshop7.
Deterding, S. et al., 2011. From game design elements to gamefulness. Proceedings of the 15th International Academic MindTrek Conference on Envisioning Future Media Environments - MindTrek '11, pp. 9-11.
Flatla, D.R. et al., 2011. Calibration games: making calibration tasks enjoyable by adding motivating game elements. In Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, UIST '11. New York, NY, USA: ACM, pp. 403-412.
Gagné, M. & Deci, E.L., 2005. Self-determination theory and work motivation. Journal of Organizational Behavior, 26(4), pp. 331-362.
Gamification Wiki, n.d. Game Mechanics | Gamification.org, [online], https://badgeville.com/wiki/Game_Mechanics.
Graham, C.R., 2010. The Chronicle of Higher Education, LVII(1), pp. A1-A88.
Hamari, J., Koivisto, J. & Sarsa, H., 2014. Does gamification work? A literature review of empirical studies on gamification. In System Sciences (HICSS), 2014 47th Hawaii International Conference on, pp. 3025-3034.
Hunicke, R., LeBlanc, M. & Zubek, R., 2004. MDA: A formal approach to game design and game research.
Kummer, T.-F., Recker, J. & Mendling, J., 2016. Enhancing understandability of process models through cultural-dependent color adjustments. Decision Support Systems.
Lefers, B., 2015. Gamifizierte Lern-Webseite (Einblick), YouTube, [online], https://www.youtube.com/watch?v=wVG84wF_N3g.
Lefers, B. & Birkenkrahe, M., 2016. Was ist der didaktische Mehrwert von Gamification? Reflexion statt Regression. To be published in: Schönbohm, A. (ed.), Die Transformation von Unternehmenskultur, Controlling und professioneller Weiterbildung durch Gamification: Durch Spiel zum Ziel!, HWR Berlin.
Lowell, J. et al., 2013. The flipped classroom: a survey of the research. Proceedings of the Annual Conference of the American Society for Engineering Education, p. 6219.
Molenda, M., 2003. In search of the elusive ADDIE model. Performance Improvement, 42(5), pp. 34-36.
Porter, L.W. & Lawler, E.E., 1968. Managerial Attitudes and Performance.
Ryan, R.M. & Deci, E.L., 2000. Self-determination theory and the facilitation of intrinsic motivation. The American Psychologist, 55(1), pp. 68-78.
Ryan, R.M. & Deci, E.L., 2002. Self-determination theory: an organismic dialectical perspective. Handbook of Self-Determination Research, pp. 3-34.
Sheldon, L., 2011. The Multiplayer Classroom: Designing Coursework as a Game, Cengage Learning.
Vroom, V.H., 1964. Work and Motivation. New York: John Wiley & Sons, pp. 47-51.
Wang, F. & Hannafin, M.J., 2005. Design-based research and technology-enhanced learning environments. Educational Technology Research and Development, 53(4), pp. 5-23.
Werbach, K. & Hunter, D., 2012. For the Win: How Game Thinking Can Revolutionize Your Business, Wharton Digital Press.
Wikipedia contributors, 2015. User onboarding. Wikipedia, The Free Encyclopedia, [online], http://en.wikipedia.org/w/index.php?title=User_onboarding.
Zichermann, G. & Cunningham, C., 2011. Gamification by Design: Implementing Game Mechanics in Web and Mobile Apps, O'Reilly Media, Inc.
Figure 20: Badge: References