A Refined Evaluation Framework for Games-based Learning

Thomas Hainey, Thomas Connolly, Liz Boyle
[email protected] [email protected] [email protected]
Abstract

Games-based Learning (GBL) has a dearth of empirical evidence supporting the approach. One of the primary reasons for this is the lack of evaluation frameworks to generate sufficient ideas to guide and focus games-based learning evaluations. This paper presents a refined evaluation framework based on two extensive literature reviews: one performed in 2008 to identify evaluation frameworks and empirical evaluation evidence, and another performed in 2009 to identify research on the learning value of computer games and methods of measuring their resultant outcomes and impacts. The paper provides a brief description of existing evaluation frameworks and then discusses three refined categories of a previously developed evaluation framework for evaluating games-based learning: learner/instructor perceptions, learner/instructor preferences and learner/instructor motivations. The paper provides a list of guidelines for evaluating games-based learning and details measurements in terms of perceptions, preferences and motivations. Finally, we conclude with a discussion of future research directions with regard to further refinement and verification of the evaluation framework.

Keywords: evaluation, empirical evidence, evaluation frameworks, perceptions, preferences, motivations

1. Introduction

This paper presents the contribution of three sections of a developed and refined evaluation framework for games-based learning, focusing on learner/instructor preferences, learner/instructor perceptions and learner/instructor motivations. The framework is a product of two extensive literature reviews, one carried out in 2008 and a second carried out in 2009. The first literature review, covering 1996 to 2008, was carried out to identify evaluation frameworks for GBL in the literature and the empirical measurements taken.
The second literature review, covering 2004 to 2009, examined research on the learning value of computer games and methods of measuring their resultant outcomes and impacts. While the literature reviews had different search terms and different time frames, the aspect of commonality that links them is that both attempted to identify empirical evidence in the literature. The 2009 literature search also reaffirmed and added to the findings of the 2008 search. From the combined results of these literature searches a refined evaluation framework has been developed. While there are seven categories in this framework, this paper focuses on three of them: learner/instructor perceptions, learner/instructor preferences and learner/instructor motivation. The learner performance and GBL environment categories have been discussed in previous studies: Connolly, Stansfield and Hainey (2009) discussed the learner performance category in detail and Connolly, Stansfield and Hainey (2008) discussed the GBL environment category in detail. This paper discusses previous work in this area and presents the results of the two literature searches. The evaluation frameworks discovered in the literature searches are then presented, followed by the three framework categories under discussion and their detailed measurements. Derived measurements will then be discussed and detailed guidelines for using the evaluation framework to conduct a GBL evaluation
will be presented. The paper will conclude with a discussion of this paper's contribution and of future research directions.

2. Previous Work

Connolly, Stansfield and Hainey (2009) reviewed the literature and formulated a new evaluation framework for GBL (Figure 1). The purpose of the framework is to identify the main potential evaluation categories for games-based learning available in the scientific literature. The categories do not necessarily have to be viewed in isolation but as a collective whole, depending on what is to be evaluated. The framework can be used developmentally, to inform design during the implementation and embedding of a games-based learning environment into curricula (formative evaluation), and it also points to examples of individual analytical measurements already present in the literature for focusing an evaluation at the end of development (summative evaluation).
Figure 1: Evaluation framework for effective games-based learning

A brief description will be provided of the four additional categories, followed by a full, detailed description of the preferences, perceptions and motivation categories, as they are the main focus of this study:

Learner Performance – Encompasses pedagogy from the perspective of the learner and evaluates aspects of learner performance. It is primarily concerned with whether there is an improvement in learner performance.

Attitudes – Learner and instructor attitudes towards various elements that may alter the effectiveness of the games-based learning intervention. Elements include: learner attitudes towards the taught subject, learner attitudes towards games, instructor attitudes towards the incorporation of games into the curricula, etc.

Collaboration – Collaboration is optional when considering games-based learning, as it is dictated by whether the game is played on an individual level, cooperative group level, competitive group level, etc. The main ways of evaluating collaboration are through log files monitoring interaction, mapping team aspects to learner comments, measuring the regularity and level of collaboration, and learner group reflection essays.

Games-Based Learning Environment – This category encompasses all aspects that could potentially be evaluated about the games-based learning environment. It is one of the most complicated categories as it can be divided into five subcategories: environment, scaffolding, usability, level of social presence and deployment.

The evaluation framework has been highly instrumental in designing an evaluation for two different types of games: a game for teaching requirements collection and analysis at tertiary education level (Hainey, Connolly and Boyle,
2009) and an Alternate Reality Game (ARG) for teaching modern foreign languages across Europe (Hainey et al., 2009).

3. Literature Searches

This section presents the literature searches that were carried out to identify previous evaluation approaches for GBL and empirical measurements taken in the literature.

3.1 Method Used to Collect Data

Two extensive literature searches were performed by reviewing various electronic databases including: ACM, ABI/INFORM Global Database, Academic Search Premier, ASSIA: Applied Social Sciences Index and Abstracts, BioMed Central, Cambridge Journals Online, Blackwell Synergy, ChildData, Index to Theses, Oxford University Press (journals), Science Direct, EBSCO (consisting of Psychology and Behavioural Science, PsycINFO, SocINDEX, Library, Information Science and Technology Abstracts, CINAHL), ERIC, IngentaConnect, Infotrac (Expanded Academic ASAP), Emerald and IEEE; articles from the Simulation & Gaming journal from 1996 onwards were also extracted and assimilated into the final results. The first literature search looked at empirical measurements that had been taken in the literature and at evaluation frameworks, and had a time frame of 1996 to 2008. The second literature search looked at research conducted on the learning value and skill enhancement of gaming and methods of measuring the resultant outcomes and impacts, and had a time frame of 2004 to 2009. The one aspect of commonality that links the two literature searches is the search for empirical evidence in the gaming and games-based learning fields. This means that the results of the 2009 literature search reaffirmed the findings of the first literature search and also identified additional literature to complement the 2008 search.
The following detailed search terms were used for the first literature search in 2008:

("computer games" OR "video games" OR "serious games" OR "simulation games" OR "games-based learning" OR "MMOG" OR "MMORPG" OR "MUD" OR "online games") AND ("education" OR "learning") AND "evaluation"

Approximately 10,000 articles were returned, but only 78 were considered appropriate to the primary research criteria, namely evaluation frameworks for GBL and evaluations of games-based learning taking some form of empirical measurement. The literature search results helped identify particular measurements existing in the literature and were instrumental in constructing the GBL evaluation framework presented. The following detailed search terms were used for the second literature search in 2009:

("computer games" OR "video games" OR "serious games" OR "simulation games" OR "games-based learning" OR "MMOG" OR "MMORPG" OR "MUD" OR "online games") AND (evaluation OR impacts OR outcomes OR effects OR learning OR education OR skills OR behaviour OR attitude OR engagement OR motivation OR affect)

7,392 articles were returned in the second literature search. 126 studies were considered relevant to our primary research criteria, and an additional 8 articles relevant to this study were incorporated into the evaluation framework categories and combined with the first literature search results. One particular study has been incorporated into the identified evaluation frameworks.
4. Evaluation Framework Results

The literature searches identified 12 evaluation frameworks from 9 separate articles. The Game Object Model version 1 and Kirkpatrick's four levels for evaluating training were each identified in 2 studies. The identified frameworks are listed in Table 1.

Table 1: Identified evaluation frameworks

Amory, Naicker, Vincent and Adams (1999): Game Object Model version 1
Amory (2006): Game Object Model version 2
de Freitas and Oliver (2006): Four Dimensional Framework
O'Neil, Wainess and Baker (2005): Kirkpatrick's four levels for evaluating training; CRESST model of learning
Schumann, Anderson, Scott and Lawton (2001): Affective Motivation Model of Learning; Kirkpatrick's four levels for evaluating training
Song and Lee (2007): Framework of Heuristic Evaluation in MMORPGs
Ssemugabi and de Villiers (2007): Framework for evaluating web-based learning
Tan, Ling and Ting (2007): The Design Framework for Edutainment Environment; Adopted Interaction Cycle for Games; The Engaging Multimedia Design Model for Children; Game Object Model version 1
Lee and LaRose (2007): Theoretical model of game consumption integrating Bandura's (1991) social cognitive theory of self-regulation and Csikszentmihalyi's (1975) theory of flow experience
5. Empirical Paper Results

The literature searches identified 77 empirical papers taking some form of measurement that can be applied to a GBL application. This study focuses on three particular categories of the framework: Learner/Instructor Perceptions, Learner/Instructor Preferences and Learner/Instructor Motivation.

5.1 Learner/Instructor Perceptions

This category mainly encompasses perceptions associated with the learners, such as their perception of time within a game or simulation, how real the game is and its correspondence with reality (for example, whether the GBL application represents a holistic view of a particular organisation or process), perception of game complexity, advice quality and level of self-reported proficiency at playing games. The category also encompasses the learners' perception of how the GBL application can assist them and whether confusion is experienced. It also encompasses differences in gender perceptions and perceptions of the male and female characters' roles in the games that learners play, for example, the relation between masculinity and gaming. The instructor would also have similar perceptions depending on their particular involvement. If the instructor was simply incorporating content into the GBL application, then their perceptions may be more important in terms of whether the application fits well into the particular context. Perceptions are extremely dependent on the learning outcomes and on what particular perceptions are considered important in the evaluation criteria. The learner/instructor perception measurements are listed in Table 2.
Table 2: Learner/Instructor Perception Measurements

Learner/Instructor Perceptions:
Overview of time line.
Confusion experienced.
How well the game would fit into its intended context (Wagner, Schmalstieg and Billinghurst, 2006).
Future use in anticipated domain (Kelleher, Pausch and Kiesler, 2007).
Advice quality (Constantino-González and Suthers, 2001; Leemkuil and de Hoog, 2005).
Whether the game requires increased or decreased realism to improve it.
Whether the game could potentially help other learners.
Whether the game could potentially help the learner.
How proficient the learner is at playing games (Kato and Beale, 2006).
The game's ability to represent a holistic view of a particular organisation or process.
How realistic uncertainty is in the game.
Complexity, fluency of the gaming, game feedback ability and the level of realism.
How the game corresponds to reality (Lainema and Nurmi, 2006).
Self-reported effectiveness (Paul, Messina and Hollis, 2006).
Thiagarajan's seven-point debriefing (Lennon, 2006).
Perceptions of game playing (Sward, Richardson, Kendrick and Maloney, 2008).
Gender differences in game playing, player perceptions of the male and female characters' roles in the games that they play, the relation between masculinity and gaming (Ogletree and Drake, 2007).
Whether the game teaches the subject (Oh and Van der Hoek, 2005; Shaw and Dermoudy, 2005).
Perception of increased skill acquisition.
Perception of the approach (Dantas, Barros and Werner, 2004).
Perceived ease of use and attitude towards playing online games.
Perceived ease of use and perceived usefulness.
Perceived usefulness and attitude towards playing online games.
Perceived usefulness and intention to play online games (Hsu and Lu, 2004).
5.2 Learner/Instructor Preferences
This category considers learner and instructor preferences during a GBL intervention. Learners like to learn in different ways and have different learning styles (Kolb, 1984); therefore, different learners will have different preferences. This category could include: learner preference for the media used to teach the material, preference for conventional teaching approaches or GBL, preference for and utilisation of particular game features, most preferred positive and negative aspects of the game, and preference for different competitive modes. The category could also include preferred computer game genre in relation to gender and preference for use in an educational context. For an instructor, this category could include when to introduce the GBL application in their particular course or whether they prefer to teach with the GBL application. The learner/instructor preferences measurements are listed in Table 3.

Table 3: Learner/Instructor Preferences Measurements

Learner/Instructor Preferences:
Rating of technical aspects.
Utilisation of key game features.
Most important positive aspects of the game. Most important negative aspects of the game (Schwabe and Goth, 2005).
Learner/instructor likes and dislikes of the conventional approach/training. Learner/instructor likes and dislikes of the virtual reality/games-based approach/training (Quinn, Keogh, McDonald and Hussey, 2003).
Learner preference for where they would most like to play the game.
Preference of media for teaching the course material (Kato and Beale, 2006).
Preferred activities within the environment and preferred activities in relation to other activities (Rosas et al., 2003).
Preference of competitive modes with regard to anonymous competition, face-to-face competition and decreased-proximity competition. Satisfaction (Yu, Chang, Luit and Chan, 2002).
Preference of learning styles: competitive, cooperative or individualistic (Ke, 2006; Zaphiris, Ang and Law, 2007; Yu, Chang, Luit and Chan, 2002).
Comparison of learner preferences between conventional training and GBL training under specific headings such as feedback, self-paced learning, etc. (Quinn, Keogh, McDonald and Hussey, 2003).
Most favoured computer game genres in relation to gender. Preference for use in an educational context (Karakus, Inal and Cagiltay, 2008).
5.3 Learner/Instructor Motivation

This category is primarily concerned with the particular motivations of the learner for using the GBL application, the learner's level of interest in participating, participation over a prolonged period of time, and determining which particular motivations are the most important (Connolly, Boyle and Hainey, 2006; Connolly, Boyle and Hainey, 2007). Are the learners participating extrinsically or intrinsically (Deci and Ryan, 1991)? What particular features of the GBL environment are the most interesting? Are the learners distracted in any way? Are the learners willing to use the GBL application more than once? The category can also be concerned with enjoyment, likeability and motivation in terms of task difficulty. When considering Kirkpatrick's four levels for evaluating the effectiveness of business simulations in particular curricula, it is important to identify the motivations that apply not only to the learner but also to the instructor. Therefore, it may be important to identify what motivates instructors to attempt to assimilate a GBL approach into their curricula. The learner/instructor motivation measurements are listed in Table 4.

Table 4: Learner/Instructor Motivation Measurements

Learner/Instructor Motivations:
Extrinsic Motivation Condition (Habgood, 2007).
Intrinsic Motivation Condition (Baker, Habgood, Ainsworth and Corbett, 2007; Habgood, 2007).
Distraction (Lim, Nonis and Hedberg, 2006).
Willingness to play. Level of interest in playing. Willingness to play over a period of time (Kato and Beale, 2006).
Motivations for playing computer games using Malone and Lepper's (1987) framework (Connolly, Boyle and Hainey, 2007; Connolly, Boyle, Stansfield and Hainey, 2006).
Gender differences in relation to motivations (Chou and Tsai, 2007).
Engagement (Klopfer, Yoon and Rivas, 2004; Waraich, 2004; Adamo-Villani and Wright, 2007; Bos and Sadat Shami, 2006).
Fun (Ebner and Holzinger, 2007; Sim, MacFarlane and Horton, 2005; Dantas, Barros and Werner, 2004), which can be measured on three levels: expectation, engagement and endurability (Adamo-Villani and Wright, 2007).
If the game leads to increased motivation to play again (Ebner and Holzinger, 2007).
Summary of features that make the game most interesting (Schwabe and Goth, 2005).
Excitement experienced at the idea of the game/environment (Klopfer, Yoon and Rivas, 2004).
Enjoyment, using an EGameFlow scale (Fu, Su and Yu, 2009).
Differential motivations between addicts and non-addicts.
Whether four factors (expectancy, relevance, tangibility and contingency) that could moderate the detrimental effect of extrinsic motivators on intrinsic motivation would function as predicted (Wan and Chiou, 2007).
Relationships between: perceived ease-of-use and flow experience of playing a game; flow experience and attitude toward playing a game; flow and intention to play a game (Hsu and Lu, 2004).
How modifying task difficulty in an instructional game impacts motivation (Orvis, Horn and Belanich, 2008).
Likeability evaluation of a computer game (Virvou and Katsionis, 2008).
5.4 Derived Measurements

While the measurements discussed in the evaluation framework so far have been taken from the relevant literature, there are a number of measurements that can be derived when changing the perspective from the learner to the instructor, which do not have associated references in the literature. Examples of these are: instructor perceptions of how well the game fits into their course, instructor perceptions of how the game saves them time, instructor perception of the customisability of the game, instructor perception of usability, and the motivations of instructors to attempt to assimilate a GBL approach into their curricula.

6. Guidelines on Using the Evaluation Framework for a GBL Evaluation

The evaluation framework presented in this paper is designed to be a starting point for researchers to focus the evaluation of a GBL application. The following guidelines may be useful in carrying out the evaluation:

1. Formulate the research questions for the study of the GBL application.
2. Based on the research questions, produce a shortlist of the main things to evaluate about the GBL application, for example, learning effectiveness, how motivated the learners are to participate and what particular aspects are interesting to evaluate.
3. Go through all of the measurements associated with each framework category (i.e. learner performance, GBL environment, learner/instructor motivation, learner/instructor perceptions, learner/instructor preferences and collaboration) and identify those measurements that are of particular interest.
4. Examine the literature associated with each relevant measurement and check how the identified empirical studies have collected evaluation data on this measurement.
5. When all of the relevant measurements have been identified, select an appropriate experimental design to properly collect these measurements; for example, a pre-test/post-test or pre-test/post-test experimental control group design.
6.
Run the evaluation using the chosen experimental design methodology to address the research questions.

At this stage, the evaluation framework is designed to be used in an implicit way; however, it will produce a number of starting ideas for a more focused and rigorous GBL evaluation.
7. Conclusions and Future Directions

This paper has presented two extensive literature searches, both to identify evaluation frameworks present in the literature that can be used by researchers and to develop a refined evaluation framework to guide GBL evaluation as an alternative option. The study has provided detailed links to the literature to guide evaluation in terms of preferences, perceptions and motivations. Guidelines for actually performing a GBL evaluation have been provided to assist researchers in planning a GBL evaluation using the framework. One potentially interesting future research direction regarding the developed evaluation framework is that the measurements in the framework could be adapted to a developmental design perspective. The evaluation framework has so far only been successfully used to evaluate two different types of games: a requirements collection and analysis game and an Alternate Reality Game (ARG). The evaluation framework could be used to evaluate different types of GBL applications in the future for further refinement and development.

References

Adamo-Villani, N. and Wright, K. (2007). SMILE: an immersive learning game for deaf and hearing children. In Proceedings of SIGGRAPH 2007 – Educators, 5-10 August 2007, San Diego. New York: ACM Publications.

Amory, A. (2006). Game object model version II: a theoretical framework for educational game development. Educational Technology Research and Development, 55(1), 51-77.

Amory, A., Naicker, K., Vincent, J. and Adams, C. (1999). The use of computer games as an educational tool: 1. Identification of appropriate game types and game elements. British Journal of Educational Technology, 30, 311-322.

Baker, S.J., Habgood, J., Ainsworth, S.E. and Corbett, A.T. (2007). Modeling the acquisition of fluent skill in educational action games. In Proceedings of User Modeling, 17-26.

Bandura, A. (1991). Social cognitive theory of self-regulation.
Organizational Behavior and Human Decision Processes, 50, 248-287.

Bos, N. and Sadat Shami, N. (2006). Adapting a face-to-face role-playing simulation for online play. Educational Technology Research and Development, 54(5), 493-521.

Chou, C. and Tsai, M-J. (2007). Gender differences in Taiwan high school students' computer game playing. Computers in Human Behavior, 23, 812-824.

Connolly, T.M., Boyle, E. and Hainey, T. (2006). Can computer games motivate next generation learners? A survey of students' reasons for playing computer games. In Proceedings of the 10th International Conference on Motivation, 28-30 September 2006, University of Koblenz-Landau, Germany.

Connolly, T.M., Boyle, E., Stansfield, M.H. and Hainey, T. (2006). Can computer games help next generation learners? A survey of students' reasons for playing computer games. In Proceedings of the 3rd International Conference of the Association of Learning and Teaching, ALT-C 2006: the next generation, 5-7 September 2006, Edinburgh, Scotland.

Connolly, T.M., Boyle, E. and Hainey, T. (2007). A survey of students' motivations for playing computer games. In Proceedings of the 1st European Conference on Games-Based Learning (University of Paisley), 25-26 October 2007, Paisley, Scotland.

Connolly, T.M., Stansfield, M.H. and Hainey, T. (2009). Towards the development of a games-based learning evaluation framework. In Games-based Learning Advancement for Multisensory Human Computer Interfaces: Techniques and Effective Practices (Eds: T.M. Connolly, M.H. Stansfield and E. Boyle). Hershey: Idea-Group Publishing. ISBN: 978-1-60566-360-9.

Constantino-González, M. de los A. and Suthers, D.D. (2001). Coaching collaboration by comparing solutions and tracking participation. In European Perspectives on Computer-Supported Collaborative Learning (Eds: P. Dillenbourg, A. Eurelings, K.
Hakkarainen), Proceedings of the 1st European Conference on Computer-Supported Collaborative Learning, Universiteit Maastricht, Maastricht, the Netherlands, 22-24 March 2001, 173-180.
Csikszentmihalyi, M. (1975). Beyond Boredom and Anxiety: Experiencing Flow in Work and Play. San Francisco: Jossey-Bass. ISBN 0-87589-261-2.

Dantas, A.R., Barros, M.O. and Werner, C. (2004). A simulation-based game for project management experiential learning. In Proceedings of the Sixteenth International Conference on Software Engineering and Knowledge Engineering (SEKE'04), Alberta, Canada, June 2004, 19-24.

Deci, E.L. and Ryan, R.M. (1991). A motivational approach to self: Integration in personality. In Nebraska Symposium on Motivation, Vol. 38: Perspectives on Motivation (Ed: R. Dienstbier), 237-288. Lincoln, NE: University of Nebraska Press.

de Freitas, S. and Oliver, M. (2006). How can exploratory learning with games and simulations within the curriculum be most effectively evaluated? Computers & Education, 46(3), 249-264.

Ebner, M. and Holzinger, A. (2007). Successful implementation of user-centered game based learning in higher education: An example from civil engineering. Computers & Education, 49, 873-890.

Fu, F-L., Su, R-C. and Yu, S-C. (2009). EGameFlow: A scale to measure learners' enjoyment of e-learning games. Computers & Education, 52(1), 101-112.

Habgood, M.P.J. (2007). The Effective Integration of Digital Games and Learning Content. Thesis submitted to the University of Nottingham. Retrieved 27th October 2008 from http://zombiedivision.co.uk/

Hainey, T., Connolly, T.M. and Boyle, L. (2009). Development and evaluation of a game to teach requirements collection and analysis in software engineering at tertiary education level. In Proceedings of the 3rd European Conference on Games-Based Learning (ECGBL), 12-13 October 2009, Graz, Austria.

Hainey, T., Connolly, T.M., Stansfield, M.H., Boyle, L., Josephson, J., O'Donovan, A., Rodriguez Ortiz, C., Tsvetkova, N., Stoimenova, B. and Tsvetanova, S. (2009).
ARGuing for multilingual motivation in Web 2.0: An evaluation of a large-scale European pilot. In Proceedings of the 3rd European Conference on Games-Based Learning (ECGBL), October 2009, Graz, Austria.

Hsu, C-L. and Lu, H-P. (2004). Why do people play on-line games? An extended TAM with social influences and flow experience. Information & Management, 41(7), 853-868.

Karakus, T., Inal, Y. and Cagiltay, K. (2008). A descriptive study of Turkish high school students' game-playing characteristics and their considerations concerning the effects of games. Computers in Human Behavior, 24(6), 2520-2529.

Kato, P.M. and Beale, I.L. (2006). Factors affecting acceptability to young cancer patients of a psychoeducational video game about cancer. Journal of Pediatric Oncology Nursing, 23(5), 269-275.

Ke, F. (2006). Classroom goal structures for educational math game application. In Proceedings of the 7th International Conference on Learning Sciences (ICLS '06). International Society of the Learning Sciences.

Kelleher, C., Pausch, R. and Kiesler, S. (2007). Storytelling Alice motivates middle school girls to learn computer programming. In Proceedings of CHI 2007: Programming By & With End-Users, April 28 - May 3, 2007, San Jose, CA, USA.

Klopfer, E., Yoon, S. and Rivas, L. (2004). Comparative analysis of Palm and wearable computers for Participatory Simulations. Journal of Computer Assisted Learning, 20, 347-359.

Kolb, D. (1984). Experiential Learning. New Jersey: Prentice-Hall Inc.

Lainema, T. and Makkonen, P. (2003). Applying constructivist approach to educational business games: Case REALGAME. Simulation & Gaming, 34(1), March 2003, 131-149.

Lee, D. and LaRose, R. (2007). A socio-cognitive model of video game usage. Journal of Broadcasting & Electronic Media.

Leemkuil, H. and de Hoog, R. (2005). Is support really necessary within educational games? In Educational Games as Intelligent Learning Environments (Eds: C. Conati and S.
Ramachandran), 21-31. Amsterdam.

Lennon, J.L. (2006). Debriefings of web-based malaria games. Simulation & Gaming, 37(3), 350-356.

Lim, C.P., Nonis, D. and Hedberg, J. (2006). Gaming in a 3D multi-user virtual environment: engaging students in Science lessons. British Journal of Educational Technology, 37(2).
Ogletree, S.M. and Drake, R. (2007). College students' video game participation and perceptions: Gender differences and implications. Sex Roles, 56(7-8), 537-542.

Oh Navarro, E. and Van der Hoek, A. (2005). Design and evaluation of an educational software process simulation environment and associated model. In Proceedings of the Eighteenth Conference on Software Engineering Education and Training, Ottawa, Canada, April 2005.

O'Neil, H.F., Wainess, R. and Baker, E.L. (2005). Classification of learning outcomes: evidence from the computer games literature. The Curriculum Journal, 16(4), December 2005.

Orvis, K.A., Horn, D.B. and Belanich, J. (2008). The roles of task difficulty and prior videogame experience on performance and motivation in instructional videogames. Computers in Human Behavior, 24(5), 2415-2433.

Paul, S.T., Messina, J.A. and Hollis, A.M. (2006). A technology classroom review tool for general psychology. Teaching of Psychology, 33(4), 276-279.

Quinn, F., Keogh, P., McDonald, A. and Hussey, D. (2003). A pilot study comparing the effectiveness of conventional training and virtual reality simulation in the skills acquisition of junior dental students. European Journal of Dental Education.

Rosas, R., Nussbaum, M., Cumsille, P., Marianov, V., Correa, M., Flores, P., Grau, V., Lagos, F., Lopez, X., Lopez, V., Rodriguez, P. and Salinas, M. (2003). Beyond Nintendo: design and assessment of educational video games for first and second grade students. Computers & Education, 40, 71-94.

Schumann, P.L., Anderson, P.H., Scott, T.W. and Lawton, L. (2001). A framework for evaluating simulations as educational tools. Developments in Business Simulations and Experiential Learning, 28.

Schwabe, G. and Goth, C. (2005). Mobile learning with motivational effects. Journal of Computer Assisted Learning, 21, 204-216.

Shaw, K. and Dermoudy, J. (2005). Engendering an empathy for software engineering.
In Proceedings of the 7th Australasian Computing Education Conference (ACE2005), Newcastle, Australia, 42, 135-144.

Sim, G., MacFarlane, S. and Horton, M. (2005). Evaluating usability, fun and learning in educational software for children. In Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2005 (Eds: P. Kommers and G. Richards), 1180-1187. Chesapeake, VA: AACE.

Song, S. and Lee, J. (2007). Key factors of heuristic evaluation for game design: Towards massively multi-player online role-playing game. International Journal of Human-Computer Studies, 65, 709-723.

Ssemugabi, S. and de Villiers, R. (2007). A comparative study of two usability evaluation methods using a web-based e-learning application. In Proceedings of the 2007 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on IT Research in Developing Countries, Fish River Sun, Sunshine Coast, South Africa, 2-3 October 2007.

Sward, K.A., Richardson, S., Kendrick, J. and Maloney, C. (2008). Use of a web-based game to teach pediatric content to medical students. Ambulatory Pediatrics, 8(6), 354-359.

Tan, P-H., Ling, S-W. and Ting, C-Y. (2007). Adaptive digital game-based learning framework. In Proceedings of the 2nd International Conference on Digital Interactive Media in Entertainment and Arts, Perth, Western Australia.

Yu, F.Y., Chang, L.J., Luit, Y.H. and Chan, T.W. (2002). Learning preferences towards computerized competitive modes. Journal of Computer Assisted Learning, 18, 341-350.

Virvou, M. and Katsionis, G. (2008). On the usability and likeability of virtual reality games for education: The case of VR-ENGAGE. Computers & Education, 50, 154-178.

Wagner, D., Schmalstieg, D. and Billinghurst, M. (2006). Handheld AR for collaborative edutainment. In Advances in Artificial Reality and Tele-Existence, Lecture Notes in Computer Science. Berlin/Heidelberg: Springer.

Wan, C-S. and Chiou, W-B. (2007).
The motivations of adolescents who are addicted to online games: a cognitive perspective. Adolescence, 42(165), 179-197.

Waraich, A. (2004). Using narrative as a motivating device to teach binary arithmetic and logic gates. In Proceedings of the 9th Annual SIGCSE Conference on Innovation and Technology in Computer Science Education, 97-101. Leeds, United Kingdom.
Zaphiris, P., Ang, C.S. and Law, D. (2007). Individualistic vs. competitive game-based e-learning. Advanced Technology for Learning, 4(4).