Making the Implicit Explicit: Game-Based Training Practices From an Instructor Perspective Anna-Sofia Alklind Taylor and Per Backlund University of Skövde, Skövde, Sweden
[email protected] [email protected] Abstract: A game-based learning environment is more than just a digital artefact. Factors such as where the game is played, how the learning experience is designed, the level of social interaction and so on, need to be considered when designing a game for learning or training. For instance, during gaming, the learners might physically leave the virtual environment to continue the gameplay in the physical environment. If the instructor wants to keep track of, for example, learning progress or game states, the gaming system needs to support these activities both in-game and outside the game, via different logging tools, e.g. video and voice recording. Military organisations have a long history of using games and simulations for training. This means that they have had the opportunity to develop and refine training practices that are both cost-effective and valuable for learning. However, these practices are largely based on instructors’ own experiences rather than scientific studies. This study aims to describe game-based training practices in order to (1) extricate good practices that may be transferred as inspirational examples for others, and (2) identify areas for improvement. Empirical material was collected using observations and interviews and then analysed and categorised. Interpretations made from the analysis were later validated through a questionnaire survey with military personnel directly or indirectly involved in simulator- or game-based training. The analysis shows that a game-based training cycle consists mainly of four phases: preparation, introductory lecture, gameplay and debriefing. Although the systems used are advanced in that they log user activity and support quick changes to the scenario during gameplay, running a training session is highly demanding for the instructors. Offline tools (e.g. pen and paper) are commonly used when there is a lack of system support in a specific situation. 
The paper concludes with a list of system support features for different aspects of game-based training. Keywords: best practices, game-based training, instructor-in-the-loop, instructional support, puckstering, serious games
1. Introduction and related work
The use of games and simulations for learning and training is steadily gaining popularity, even though many of us still hold naïve assumptions about their impact on learning. Moreover, while much research is dedicated to the effect of gaming on the learner, or to the design of serious games, little attention is paid to in-depth studies of the effect of serious gaming on instructors (Liu & Wang, 2006; Pivec, 2009; Tan, Neill & Johnston-Wilder, 2012; Watson & Fang, 2012). It is widely accepted that training effectiveness stems from the manner in which simulation is used (Bedwell & Salas, 2010), and this is no less true for serious gaming. While many researchers and practitioners nowadays concede that games will not revolutionise educational and training practices, most agree that, given the right design in the right context, game-based learning (GBL) is a valid option for increasing learning and facilitating transfer of knowledge (O'Neil, Wainess & Baker, 2005). Based on this, we have identified the following research questions:
How is training carried out in organisations that have used serious games for a longer period of time?
How can a GBL system facilitate instructors in their tasks, such as designing, running and assessing a training session? In practice, what requirements for system support can be derived from studying game-based training practices?
Watson and Fang (2012) present a framework for implementing games in an educational context. What sets this framework apart from many others within the discipline is that it has a clear focus on teachers’ roles and tasks during specific problem-based learning processes, i.e. problem, activation, exploration, reflection and facilitation. Their conclusion is that teachers play an important role in structuring the gameplay experience around specified learning goals. This entails coaching students to reinterpret and make sense of the gaming experience, both during the experience and afterwards (Watson & Fang, 2012). If we agree on the premise that gaming in itself does not lead to in-depth learning, we must turn our gaze towards the context in which games are used, i.e. the instructional design employed. Thus, we
make a distinction between games as (digital) artefacts and the manner in which they are used, including where they are used, how they are used, and the level of social interaction between learners, learners and instructors, and learners and virtual agents. Qiu and Riesbeck (2004) have not studied GBL per se, but rather focus on the wider scope of computer-based learning. Their work is interesting, however, because it represents an instructor-in-the-loop approach in which the instructor can continuously introduce new materials into the system and, as a consequence, complement computer-generated feedback. Their approach “allows authoring and instruction to happen at the same time and keeps the system from totally depending on preprogrammed content” (Qiu & Riesbeck, 2004, p 172). When designing GBL experiences, there are many factors to consider. Some of these factors are constrained by financial, temporal and situational issues. For instance, running a facility that employs technical and educational staff is more expensive and time-consuming than having learners play a game at home. Facilitators’ skill and experience level also play an important part in successful game development (Tan, Neill & Johnston-Wilder, 2012). If the facilitators are unaware of the capabilities and limitations of games, they will not use them appropriately. The development of serious games should be a collaboration between instructors (experts in the subject they are teaching) and game designers (experts in game technology and design) (Bedwell & Salas, 2010). Within the military domain, there is increasing interest in instructor-less game-based training that relies on intelligent tutoring systems (e.g. Salas & Cannon-Bowers, 2001; Stevens-Adams et al, 2010). Instructor-less training has several advantages, the most obvious being lower costs for staff, vehicles and facility maintenance (Stevens-Adams et al, 2010).
Another significant factor is flexibility; trainees can practice whenever and wherever they want. This type of learner control is hypothesised to augment learning through increased motivation and time spent on training (DeRouin, Fritzsche & Salas, 2004). Other advantages include consistency (everyone learns the same thing) and less pressure on the instructor to prepare and facilitate training activities (Qiu & Riesbeck, 2004). Building intelligent tutoring systems that can replace human instructors is far from straightforward, however. Most game-based systems still rely on human intervention before, during and after training, or students might take away the wrong message from the game (Chatham, 2009). Instructor-less training might work when learning objectives are simple and easily measured, but this is not always the case. For instance, complex goals involving creative problem solving and teamwork need a dynamic experience and context-dependent feedback delivered by an instructor (Raybourn, 2007; Wilson et al, 2009). Thus, instructors complement the pre-programmed instructional system by being able to adapt to situations not predicted during game or scenario design. As put by Qiu and Riesbeck (2004, p 171): “In order to support purely computer-based accurate feedback, the vocabulary of operations and situations in the system has to be specified in advance so that rules can be written. Once deployed, students can only do what the system has been prepared to support. It is considerably harder for instructors, as non-programmers, to modify a computer application when they want to customize it for their courses.” Another problem relates to feedback and transfer. Simulations and games are very effective for rote learning, but adaptive skills require context-sensitive feedback that balances corrective feedback with feedback that elicits reflective or meta-cognitive reasoning in the learner.
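The limitation Qiu and Riesbeck describe, that pre-programmed feedback only covers situations specified in advance, can be illustrated with a minimal rule-based feedback sketch. All rule and event names below are invented for illustration and are not taken from any system described in this paper.

```python
# Minimal sketch of pre-programmed feedback: every (action, situation)
# pair must be anticipated at authoring time, or no feedback is given.
# All rule and event names here are hypothetical.
FEEDBACK_RULES = {
    ("fire", "enemy_in_range"): "Hit confirmed. Good target selection.",
    ("fire", "no_target"): "Check your sector before firing.",
    ("report", "enemy_sighted"): "Correct contact report.",
}

def automated_feedback(action, situation):
    """Return canned feedback, or None when the rules don't cover the case."""
    return FEEDBACK_RULES.get((action, situation))

# A situation the scenario authors never anticipated falls through
# silently -- this is the gap a human instructor-in-the-loop fills.
print(automated_feedback("fire", "enemy_in_range"))  # covered by a rule
print(automated_feedback("flank", "urban_ambush"))   # None: not covered
```

The brittleness is structural: any behaviour outside the authored vocabulary yields no feedback at all, which is why the paper argues for keeping a human instructor in the loop.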
One challenge that is difficult to address by artificial means is recognising different player modes. In wargaming, instructors wish to limit the time trainees spend in a so-called ‘gamer mode’ (Frank, 2012). In gamer mode, a trainee tries to win the game regardless of the real-world applicability of the methods used, i.e. they exploit weaknesses in the game design. While gamer mode can be useful for certain purposes, it can lead to erroneous learning. Having an instructor-in-the-loop makes it possible to regulate player modes to suit the training goals. We have argued elsewhere for an approach in which instructors actively take part in gameplay (Alklind Taylor, Backlund & Niklasson, 2012). This is also known as puckstering (Colonna-Romano et al, 2009), i.e. instructors control one or several avatars and provide learning experiences by playing alongside their students. One of our conclusions was that serious game development needs to incorporate instructors, not just as subject matter experts, but also as a user group alongside students.
Military organisations have had the opportunity to develop and refine training practices that are both cost-effective and valuable for learning. There is, however, a lack of documentation of these practices, especially from an instructor’s viewpoint. Therefore, this study aims to describe game-based training practices in order to (1) extricate good practices that may be transferred as inspirational examples for others, and (2) identify areas for improvement in training procedures and system support.
2. Methods
Since the research questions posed at the beginning of this paper are of an exploratory nature, a qualitative approach was chosen. Thus, this research is based on a case study at the Swedish Land Warfare Centre (SLWC). The SLWC is a military organisation that trains cadets in land warfare. This includes training on different levels, from training of the individual soldier to the work involved in organising platoons, companies and battalions. Part of this training is conducted using simulation and game technology, ranging from bespoke simulation systems to commercial-off-the-shelf (COTS) systems. Specifically, two simulator- or game-based facilities have been involved in this study. Facility A (Figure 1) is a virtual command and situation trainer that utilises the COTS serious games Virtual Battlespace 2: Virtual Training Kit (VBS2) and Steel Beast Pro (SBP). Facility B (Figure 2) is a virtual platoon trainer for tanks and combat vehicles. Compared to facility A, it has a more traditional simulator installed. The simulator encompasses three bespoke, high-end ‘cabins’, simulating the Leopard 2 tank interior. Even though the simulator at facility B cannot be classified as a game, it was chosen due to the gaming characteristics that surround its use. For instance, instructors may include elements of competition and scoring as part of some scenarios.
Figure 1: Virtual command and situation trainer (facility A), from a trainee perspective (left) and the instructor’s screens (right)
Figure 2: Virtual platoon trainer for tanks and combat vehicles (facility B), with instructors' seats in the foreground (dark) and simulator cabins in the background (light)
Empirical data have been collected through a series of observations: first a few pilot observations to get acquainted with the facilities (using pen and paper for data collection) and then structured observations (using a video recorder for data collection) of exercises carried out in autumn 2010 and spring 2011. Before the structured observations, key words and goals were written down to help focus the recordings on details of main concern for the research problem. These ‘sensitizing concepts’ (Patton, 2002) are needed because it is not possible to observe and record everything going on in a situation. Examples of sensitizing concepts used are: gaming behaviours, and training phases, procedures and habits. Occasionally, situations would arise that were not anticipated, but were still found valuable to record. A total of 11 hours and 42 minutes of video material, filmed on three separate occasions (one at facility A and two at facility B), was used as the main data source. Analysis was done using the Transana 2 software for transcribing video clips and sound recordings. Not all recorded material was transcribed, only those parts that contained verbal information of direct interest for the aim of this study. Interesting scenes and utterances concerning aspects related to training phases were sorted into categories. Once a clip had been identified, it was grouped together with other clips with a common theme. There were usually about three instructors present during the training sessions observed: one system operator, one exercise director and occasionally one or two extra instructors serving as an extra pair of eyes and ears. A system operator is an instructor and/or technical staff member with specialised training in, and responsibility for, running the current system. The system at facility B is adapted to be run by two instructors, with one extra seat for a third with limited access to operator screens.
Please note that the number of instructors present varied during observation, because some instructors arrived later, when the particular scenario reached a new phase. Observations were complemented with informal interviews with training personnel. The aim was to get more information about training procedures and attitudes towards game-based training, and to confirm conclusions drawn from the above-mentioned observations (Lincoln & Guba, 1985). All participants in the study are male and all are professionally active military educators. After the analysis, results were validated through a questionnaire with assertions based on the interpretations made in the analysis. The respondents rated each assertion on a 7-point Likert scale. The questionnaire was completed by 14 military officers with some affiliation to simulation training at the SLWC. Two of the officers had also been participants in the above-mentioned observations.
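As a minimal illustration of how such 7-point Likert responses can be summarised per assertion, the sketch below computes median ratings. The assertions and ratings are invented for illustration; they are not the study's actual data.

```python
from statistics import median

# Hypothetical ratings (1 = strongly disagree ... 7 = strongly agree)
# from 14 respondents for two invented example assertions.
responses = {
    "Trainees should know the controls before gameplay": [6, 7, 5, 6, 7, 6, 5, 7, 6, 6, 7, 5, 6, 6],
    "Trainees should know the full scenario beforehand": [3, 4, 2, 5, 3, 4, 3, 2, 4, 3, 5, 4, 3, 3],
}

# The median is a common summary for ordinal Likert data, since
# averaging ordinal categories is statistically questionable.
for assertion, ratings in responses.items():
    print(f"{assertion}: median = {median(ratings)}")
```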
3. Results and analysis
As mentioned earlier, we specifically wanted to extract aspects of game-based training related to training phases. The phase categories that emerged were preparation, briefing, gameplay/training (‘offline’ and ‘in-game’) and debriefing. We will return to these phases in the subsequent sections, which give examples from the empirical data that contributed to our conclusions. We have aggregated the results across facilities, since they both follow the same procedure on a superordinate level. Specific details where the facilities diverge are explained in the text.
3.1 Preparation phase – creation and re-negotiation
Preparation at the SLWC involves the formulation of a training plan and the construction of a scenario. The training plan includes information about the extent, aim, objective and requirements of the training. The system operator then creates a scenario using the simulation system or game, i.e. chooses a virtual map or terrain and, to some extent, adds entities such as buildings, avatars, vehicles and so on. However, most of these entities are created moment-by-moment during gameplay, to create a dynamic environment. One difficulty when creating a scenario, as voiced by one of the system operators (facility B) interviewed, is to anticipate how the trainees will interpret it. This calls for a skilled trainer, proficient in both the technical system and the domain of the profession, who can make a scenario flexible enough to be played in different ways. Although the instructors might have some idea of what the general training objective will be, the precise details might not be formulated until the day of training. About one hour before trainees arrive,
instructors are familiarised with the basics of running the system (facility B). The instructors’ computer skills vary a great deal, from not wanting to handle the game at all (mainly facility A) to being able to take over the system operator’s role completely (mainly facility B). This affects how the roles are divided among the participating instructors.
3.2 Lesson or briefing – traditional instructor role
Once the trainees have arrived, training starts with a lesson or briefing in a classroom setting (Figure 3). What is included in that lesson seems to depend on the trainees’ previous experience with that particular facility as well as on the individual instructors involved. Unsurprisingly, if it is the trainees’ first time, more time is spent explaining the rules and regulations of the facility (including safety issues) and getting the trainees acquainted with the game or simulator. For instance, during the observation at facility B, in which the trainees were entirely new to the simulator, the system operator spent a large portion of time explaining the graphics in the simulator. While he explained how different terrain and objects in the terrain looked and worked, the other instructors ran a live demonstration, which was projected on two screens above the whiteboard.
Figure 3: Briefing at facility A (left) and facility B (right)
Apart from facility practicalities, simulation characteristics and training objectives, instructors may also include theoretical material, such as what the regulations say are the correct methods for solving a task. The trainees might also be quizzed about correct terminology and/or procedures. At facility A, one instructor was also employed to role-play as company commander, and he used a large part of the briefing to relay the objective and orders for the different platoons. Most of this was done in the role of commander, and the trainees assigned to play as platoon commanders answered him as they would have if the communication had been transmitted via radio. According to the results of the questionnaire, most instructors believe that it is important for trainees to have as much information as possible before running the simulation (see Figure 4). This goes against current research into learning and the development of expertise. According to Schwartz, Bransford and Sears (2005), this tell-and-practice method of instruction can severely impede far transfer, thus making training less effective. Instead, students should first find solutions to problems through exploration and reflection, and only then be given the “correct” solution (if there is one).
3.3 Gameplay phase – teacher player and formative feedback
After the briefing, trainees are allowed to acquaint themselves with the system before the actual gaming commences. The gameplay phase is complex; many things happen at the same time. The SLWC has dealt with this by employing several instructors who distribute responsibilities among themselves. This is especially apparent at facility A, where the different roles are more clear-cut than at facility B. At facility B, it is not possible to directly observe trainees as they run the simulation. Instead, instructors assess progress through several screens and audio output. It is therefore more important for all the instructors at facility B to learn the assessment and feedback interface. At facility A, by contrast, most of the technical issues are left to the system operator, while the rest of the instructors do their tasks ‘offline’, such as direct observation and live role-play.
Figure 4: Results from the questionnaire regarding instructors' beliefs about important pre-knowledge before initiating gameplay. Questions have been translated from Swedish
Not all training occurs in-game; trainees sometimes discuss strategies with other trainees face-to-face or perform other tasks offline (see left picture in Figure 5). This introduces a new challenge, where data from various sources (e.g. video/audio recordings and handwritten notes) must be integrated with data from the system logs. Feedback to trainees is given in a number of ways. The simplest form of feedback is that given by the game mechanics. For instance, if a trainee shoots at an enemy, visual and auditory feedback will tell whether (1) the weapon was actually fired and (2) the enemy was hit or not. Another form of in-game feedback, with a similar function, is communication between players. In that way, each trainee knows how to proceed through the simulation. Feedback more directly related to the learning goals is mainly given by the instructors. Here, one can distinguish between explicit feedback and feedback conveyed as part of the game. By explicit feedback, we mean feedback where the instructor talks directly to a trainee or a group of trainees by, for example, correcting their behaviour or asking them to explain why they are doing something. Another form of feedback is that which emerges from puckstering (see right picture in Figure 5). From the trainees’ point of view, this resembles game feedback in that it is conveyed through the game interface in the same manner as scripted feedback. The underlying (pedagogical) aim, however, is different. When acting as a puckster, the instructor controls one or several avatars within the game. By playing the opponent, instructors can create a more realistic and adaptive enemy than the AI system can, but, most importantly, they can adjust the difficulty depending on the performance of the trainees.
In other words, they can allow the ones that apply the correct methods (i.e. the methods most likely to work in reality) to win the game and ‘punish’ the ones that are careless, inattentive, and so on.
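The integration challenge mentioned above, where offline observations (handwritten notes, video/audio recordings) must be combined with system logs, essentially reduces to ordering heterogeneous events on a shared exercise timeline. The sketch below illustrates this idea; the event sources and field names are hypothetical, not taken from the systems studied.

```python
from dataclasses import dataclass

@dataclass
class TrainingEvent:
    timestamp: float   # seconds since the start of the exercise
    source: str        # e.g. "game_log", "instructor_note", "video"
    description: str

def merged_timeline(*event_streams):
    """Merge events from several sources into one chronological timeline."""
    events = [e for stream in event_streams for e in stream]
    return sorted(events, key=lambda e: e.timestamp)

# Hypothetical events from two different sources.
game_log = [TrainingEvent(120.0, "game_log", "Platoon 1 opens fire")]
notes = [TrainingEvent(95.0, "instructor_note", "Strategy meeting, no map used")]

timeline = merged_timeline(game_log, notes)
print([e.source for e in timeline])  # offline note precedes the log entry
```

The prerequisite, of course, is that offline observations are timestamped against the same clock as the system logs, which is precisely the kind of support the offline pen-and-paper tools currently lack.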
Figure 5: Activities during gameplay at facility A: offline note taking during a trainee strategy meeting (left) and puckstering (right)
Puckstering can also be used to test trainees against certain learning objectives. On one occasion during the observations at facility A, the system operator was concerned that a group of trainees did not report enemy sightings correctly (or simply were not attentive enough). By controlling an enemy avatar, he started to shoot just above the trainees’ avatars’ heads and then listened for the group’s reaction. When nothing happened, he repeated the procedure. After a while, one of the trainees was heard shouting “Retreat! Retreat!” and the group commander reported “India Bravo is fired at”. The system operator ceased firing and seemed content with the trainees’ reactions. The incident was later brought up during debriefing, to remind the trainees to report what is happening at all times. Here, we clearly see the interplay between formative feedback (given continuously during training) and summative feedback (given after training has occurred). Apart from acting as a puckster, the system operator also has to deal with technical problems that arise during training. These two roles can sometimes conflict with one another. For instance, at facility A, the system operator is seated in the same room as the trainees. His position is in one of the corners, with the screen turned away from the rest of the room. This makes it difficult for trainees to sneak a peek to see where the enemies are located. Yet, whenever there is a bug in the system or a trainee’s avatar needs reviving, one of the trainees has to go over to the system operator to report the problem. The system operator must then quickly discontinue what he is doing to help the trainee, who now has a full view of the screen. This problem does not occur at facility B, since trainees do not leave the cabins until the end of an exercise.
3.4 Debriefing phase – summative feedback
Debriefing, or after action review (AAR), occurs after training. This is when trainees reflect upon their experiences and performance during the simulation. Thus, debriefing is an essential ingredient for learning. Debriefing at the SLWC is carried out in different ways depending on the situation and the limitations of the simulation software. At facility B, the system logs everything, from every button that is pressed to the communication within each simulator. The instructors have the opportunity to use these logs during debriefing, to make their points both visually and aurally. This does not mean, however, that all logged data are used during debriefing; in the sessions we observed, only a small amount of the logs was used actively. At facility A, logged data in the form of ‘filmed’ sequences and audio are seldom used, due to limitations in VBS2. The instructors at the facility have previously tried to use the game with headsets, but could not solve the problem of the game picking up utterances from trainees seated nearby. Their solution was to not use the microphones at all, with the consequence that no sound is logged in the AAR module. As explained by one of the instructors, moving pictures without sound take more time to extract than they are worth during debriefing, so at most the instructors use screenshots as visual aids. During and directly after training, instructors take notes on what to emphasise during the debriefing. They usually only have a few minutes to prepare between training and debriefing, so the notes are usually brief and written by hand (or as a bullet list in presentation software). A ‘trick’ they
use to give themselves more time is to give the trainees a routine task to perform that does not need the instructors’ attention (facility B). Debriefing is usually conducted as a discussion between the instructors and the trainees. This can be done in a classroom setting (with all trainees present) or in smaller groups. The instructor usually starts by asking the trainees to give a summary of their own assessment and then gives summative feedback based on the notes made earlier. For shorter, iterative training sessions, trainees can also be asked to discuss their performance among themselves between sessions, with a longer, instructor-led debriefing held at the end of the day (facility B). Occasionally, one instructor sits down with one or two trainees to discuss a specific matter only relevant to them (facility A).
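The brief, timestamped notes instructors take during training could be linked directly to replayable log or video segments in an AAR tool. The sketch below shows one way such a linkage could work; the function names and the 30-second window are assumptions for illustration, not features of the systems studied.

```python
# Sketch of time-stamped instructor notes that can later be resolved
# to log or video segments in an AAR tool (all names hypothetical).
notes = []

def take_note(t_seconds, text):
    """Record a brief instructor note with a timestamp (seconds since start)."""
    notes.append({"t": t_seconds, "text": text})

def clip_window(note, margin=30.0):
    """Return a (start, end) replay window around a note for debriefing."""
    return (max(0.0, note["t"] - margin), note["t"] + margin)

take_note(742.0, "No contact report when fired upon")
start, end = clip_window(notes[0])
print(f"Replay {start:.0f}s-{end:.0f}s: {notes[0]['text']}")
```

Automatically resolving notes to log segments in this way would address the instructors' time pressure between training and debriefing, since the relevant clips would already be queued up.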
4. Summary of key points and concluding remarks
When developing new serious games, game designers must take the whole context of game-based training into consideration. Descriptions of current training practices can thus be used to infer the functionality of such a system. Table 1 shows a number of system support features for instructors (and learners) on different levels of abstraction. For instance, they show that a serious game for vocational training needs several components or subsystems, such as an authoring tool and an AAR system. At the same time, they also show low-level needs pertaining to the user interface, such as being able to perform specific tasks efficiently and with only minor pre-training.

Table 1: Examples of system support features derived from the empirical data

Training phase: Preparation
- Activity: Training plan written and sent to system operator, who creates a scenario.
  System support: Being able to tag finished scenarios with keywords for easy search among previously created scenarios; to quickly write an instructor guide to each scenario for easy reuse of scenarios; to co-author a scenario online.
- Activity: Learning objectives or difficulty level re-negotiated.
  System support: Being able to create open-ended scenarios; the authoring tool must allow for flexibility.
- Activity: Teaching inexperienced instructors how to use the game and experimenting with game mechanics.
  System support: The authoring tool should have a high degree of usability; being able to create a simple scenario within a few minutes without extensive prior knowledge of the user interface; being able to quickly place new entities in game.

Training phase: Gameplay
- Activity: Assessment of performance by direct observation (offline).
  System support: Being able to quickly (or automatically) incorporate notes in the AAR tool.
- Activity: Assessment of performance by indirect observation (online).
  System support: Being able to follow several trainees ‘at a glance’; monitoring tools should have a user interface that highlights changes in simulator/game states or important trainee performance measures.
- Activity: Explicit formative feedback to trainee(s).
  System support: Being able to convey information (through text or speech) in real time to individual trainees or groups of trainees.
- Activity: In-game formative instructor feedback (coaching by gaming).
  System support: Being able to play the game from an instructor’s perspective, i.e. to alter certain aspects of the game through gameplay to achieve specific pedagogical goals; to quickly place new entities in game and control them (cf. puckstering); to switch between administrative mode and coaching-by-gaming mode; the user interface should give clues to help the instructor navigate in the game space.
- Activity: Adding a scoring system.
  System support: Being able to add a customary scoring system based on simple game state rules.

Training phase: Debriefing
- Activity: Trainees assessing their own performance.
  System support: A debriefing tool that enables trainees to access logged data; these data need to be organised and aggregated in a way that supports the activity.
- Activity: Instructors preparing the debriefing session.
  System support: Being able to visualise the training session from different aspects; the debriefing tool should include visualisation software that aids instructors through pattern recognition and triangulation of data; the game should include built-in scenarios for routine tasks that do not need human monitoring, to give instructors more time to prepare.
- Activity: Running a debriefing session.
  System support: An easy-to-use AAR tool with time stamps for video and sound clips.
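One of the Table 1 features, a customary scoring system based on simple game state rules, can be sketched as a small rule engine: instructors declare predicates over game states together with point values, and the system accumulates a score. The rule names and state fields below are invented for illustration.

```python
# Hedged sketch of a scoring system driven by simple game state rules.
# All rule names and state fields are hypothetical.
def make_scoring(rules):
    """rules: list of (predicate over a game state dict, points) pairs."""
    def score(states):
        return sum(points
                   for state in states
                   for predicate, points in rules
                   if predicate(state))
    return score

rules = [
    (lambda s: s.get("contact_reported"), +10),  # reward correct reports
    (lambda s: s.get("friendly_fire"), -50),     # penalise carelessness
]
score = make_scoring(rules)

# Two hypothetical game states observed during an exercise.
states = [{"contact_reported": True}, {"friendly_fire": True}]
print(score(states))  # 10 - 50 = -40
```

Keeping the rules as plain data would let instructors, as non-programmers, add or adjust scoring without touching the game's code, which is exactly the customisation barrier Qiu and Riesbeck describe.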
The research objectives guiding this study were to describe how training is carried out in organisations that have used serious games for a longer period of time. In our case study, we have shown that a learning environment is more than just a game; it also encompasses the physical, social and organisational context in which serious gaming takes place. The context obviously affects how
learning is assessed and how feedback is conveyed, and this, in turn, has practical consequences for how GBL is carried out. A difference between military and non-military educational settings is the number of instructors present during exercises. While it is quite common to involve several instructors in military training, distributing responsibilities and alleviating cognitive workload, this is fairly rare in other educational contexts. This will undoubtedly affect training practices. However, even if conclusions drawn from observations of gameplay might not be applicable to all GBL situations, we can use them as useful examples for future instructional design. Most importantly, we have shown that instructors play a significant role in game-based training and that more research is needed on how games can be designed to support teaching activities. Programming games that can assess complex behaviour and react appropriately can be very difficult, and trainees run the risk of forming over-simplistic models of their tasks. As we have observed, a human assessor is (currently) more efficient at assessing complex learning situations. Therefore, developing systems that facilitate an instructor-in-the-loop approach, including features to support puckstering, is a valuable endeavour. Furthermore, we wanted to derive requirements for system support that would facilitate instructors’ tasks as they run a GBL session. For a wider audience of game developers and instructors, the list of requirements could be used as a basis for discussion on how a specific GBL system should be designed. Depending on issues such as learning objectives and staff availability, the list must then be adapted to the specific project at hand. In sum, issues such as support for cooperative work, usability, flexibility and adaptability should be key concerns for serious games designers.
Instructors are a specific user group with different needs compared to trainees. To give a concrete example: trainees playing a game need challenges related to learning goals. Instructors, on the other hand, need to be able to play the same game without those challenges. An instructor who is ‘stuck’ on a problem will not be able to coach his or her trainees in an efficient and relevant way. Therefore, challenges related to coaching should not be linked to learning goals. The system should instead support, not hinder, the task of creating a dynamic learning experience, because that is a challenge in itself.
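Decoupling coaching from learning-goal challenges could be expressed as role-based gating: the same game world is served to both roles, but the obstacle set tied to learning goals is only activated for trainees. The sketch below is our own illustration, not a system described in the study; the challenge names and table structure are hypothetical:

```python
from enum import Enum, auto

class Role(Enum):
    TRAINEE = auto()
    INSTRUCTOR = auto()

# Hypothetical challenge table: each challenge is tied to a learning goal.
CHALLENGES = {
    "navigate_under_fire": {"learning_goal": "spatial_awareness", "difficulty": 3},
    "triage_casualties": {"learning_goal": "prioritisation", "difficulty": 2},
}

def active_challenges(role: Role) -> dict:
    """Trainees face the full challenge set; instructors enter the same
    world unobstructed, so coaching is never blocked by gameplay."""
    if role is Role.INSTRUCTOR:
        return {}  # no learning-goal obstacles for the coach
    return CHALLENGES
```

With this split, an instructor can move freely through a scenario a trainee is struggling with, which is exactly the situation where coaching is needed most.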
Proceedings of the 6th European Conference on Games Based Learning, hosted by University College Cork and Waterford Institute of Technology, Ireland, 4-5 October 2012. Edited by Dr Patrick Felicia, Waterford Institute of Technology, Ireland.