A Generic Framework for Evaluation Phase in Games Development Methodologies (Domain of Research: Software Engineering)

Rula Al-Azawi

Aladdin Ayesh

Ian Kenny

Khalfan A AL-Masruri

DMU University, UK; DMU University, UK; Higher College of Technology, Oman; Gulf College, Oman. Email: [email protected]

Abstract—The evaluation phase plays an important role in software development in general and in game development in particular; it is therefore an important part of designing a game development methodology. In the research presented here, the evaluation phase is investigated in detail from both the user and the expert evaluation perspectives. This paper focuses on providing general heuristics to aid the evaluation process. The result is a generic evaluation framework that works for most game genres and can therefore be included in a generic game development methodology.

KEYWORDS: Heuristics set, evaluation phase, game development methodology, game genre, game evaluation.

I. INTRODUCTION

The evaluation phase in a game development methodology is conducted by two types of evaluators: expert evaluators and real users. Expert evaluators are people who have the knowledge and experience to conduct evaluations using different expert evaluation methods. Real users, meanwhile, are the target group of users for which the game is developed; they are the respondents in evaluations that use different user evaluation methods. Furthermore, evaluation is a repeated process in which the results of the current iteration are compared with the results of the previous iteration before the game is released. This evaluation process, known as formative evaluation, is deliberately conducted to detect problems [1]. Problems are usually identified at an early stage of the game development process, and the findings can be used to improve and enhance the game in each iteration before it is ready for release. The game experience can also be evaluated once a prototype is implemented and ready for beta testing; at that point, however, correcting any problems is too expensive, or the project schedule does not allow delays for marketing reasons. As a result, there is a need for evaluation methods that can identify these problems before beta testing starts and thus provide time for corrections [2].

In the next section, we briefly describe previous work. Section 3 presents elements that should be considered when evaluating games. Sections 4 and 5 explain the concepts of game heuristics with essential criteria in both user and expert evaluation methods. Section 6 then discusses the game evaluation process. Finally, we present some conclusions and ongoing work.

II. BACKGROUND

In order to deal with the evaluation problems in detail, we need to discuss most of the different heuristic sets that have been proposed from different points of view. The first step in creating an evaluation heuristic set is to understand how existing heuristics were developed and what main criteria their authors used. Nielsen and Mack offered heuristic evaluation, which was used to evaluate the user interfaces of software products [3]. Those heuristics are useful during the development phase to derive design guidelines. Later, several authors noted that games require heuristics of their own [4][5][6]. The usability heuristics address issues concerning playability. Playability, unlike usability, does not have a standard definition; for this reason, several authors have offered definitions and heuristic sets to cover playability [2][7][5][8].

Federoff's thesis [4] presents a heuristic model that can be considered the first game-specific heuristic model, due to its structure and design method. Federoff presents 40 heuristics divided into the following sub-criteria: game interface, game mechanics and gameplay. Desurvire et al. [7] present Heuristics for Evaluating Playability (HEP) based on Federoff's sub-criteria: they use gameplay and game mechanics and add usability and game story, giving 43 heuristics in total. Korhonen and Koivisto [5] also present playability heuristics, focused on mobile games: like Desurvire, they use gameplay and usability and add mobility, giving 29 heuristics in total. Schaffer [9] presents heuristics based on his own expertise in the HCI field, divided into five categories: general, graphical user interface, game play, control mapping and level design, which contain 21 heuristics. Lastly, Pinelle introduced 10 usability heuristics designed for multiplayer games [10].

III. GAME EVALUATION HEURISTICS FRAMEWORK (GEHF)

Special attention is needed when specifying methods to evaluate games. Research on game evaluation is still ongoing, and the available heuristic sets are quite different, although they share some common issues [2]. There are multiple heuristic sets available, and it is important to choose carefully the variables to be measured, as well as the correct methods to collect them [11]. Some heuristics are proposals that have not been validated; others are targeted at a specific game genre or do not cover all evaluation needs.


Fig. 1. Game evaluation criteria

This leads us to ask an important question: "What aspects of a game can be evaluated?" It is a challenging task to define heuristics that can capture the aspects that are essential to game evaluation. It is therefore important to select a heuristic set that is high-level enough to fit different game genres without losing the power to guide evaluators during the evaluation phase, and that at the same time covers all evaluation aspects, so that the measurements are suitable for enhancing the next iteration of the game development methodology. Furthermore, we need correct methods to collect these measurements and compare them. In our work, we aim for clarity, generality and usefulness in heuristic sets that evaluate most game genres. The suggested framework for evaluating games has standard requirements to facilitate the work of game development companies: instead of using different evaluation sets for each game type, our heuristic set covers all the important aspects in high-level heuristics for games. During the evaluation task, evaluators need to be alert all the time and inspect the game for problems; the purpose of the heuristic set is to guide the evaluation phase and remind the evaluators to pay attention to the important aspects that are explained in detail below. This paper presents an initial high-level heuristics framework for games that is used iteratively in the game development methodology.

For the above reasons, our proposed evaluation set contains 100 heuristics organized into four categories. Firstly, game quality (12 heuristics) deals with three major issues: game functionality, game efficiency and game adaptability. Secondly, game playability (42 heuristics) deals with three major issues: game story, game play and game mechanics. Thirdly, game usability (25 heuristics) deals with two major issues: game interface and game control. Finally, game enjoyment (21 heuristics) completes the set, as shown in Table I. Meanwhile, the user evaluation methods cover the following instruments: questionnaire and scenario, as shown in Figure 1.
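To make the structure above concrete, the sketch below shows one possible in-memory representation of the heuristic set in Table I, including the Score (1-5) and Priority (1-3) fields that evaluators fill in during each iteration. This is an illustrative sketch only; the class and field names are our own assumptions and not an interface defined by the framework.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Heuristic:
    """One row of Table I: a heuristic plus the values an evaluator assigns."""
    description: str
    category: str      # "Quality", "Playability", "Usability" or "Enjoyment"
    sub_criteria: str  # e.g. "Game Play", "User Interface"
    score: int = 0     # 1 (worst) to 5 (best), filled in for each iteration
    priority: int = 3  # 1 (highest) to 3 (lowest), set per game genre

@dataclass
class EvaluationSet:
    """The GEHF heuristic set as used in one evaluation iteration."""
    iteration: int
    heuristics: List[Heuristic] = field(default_factory=list)

    def category_average(self, category: str) -> float:
        """Average score of the scored heuristics in one category."""
        scores = [h.score for h in self.heuristics
                  if h.category == category and h.score > 0]
        return sum(scores) / len(scores) if scores else 0.0

# Example with two heuristics taken from Table I.
gehf = EvaluationSet(iteration=1)
gehf.heuristics.append(Heuristic("The game gives hints, but not too many.",
                                 "Playability", "Game Play", score=4, priority=2))
gehf.heuristics.append(Heuristic("Menu layers are minimized, or can be minimized.",
                                 "Usability", "User Interface", score=3, priority=1))
print(gehf.category_average("Playability"))  # 4.0
```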

IV. GAME EVALUATION PROCESS

An initial step towards the game evaluation process is to explain the overall process. Figure 2 illustrates a sketch of the overall process that needs to be followed in each iteration of the game development methodology. The results found in each iteration are compared with the results from the previous iteration to ensure that all previously found problems have been solved. In each of the evaluated games we measure many criteria, and the correlation between them. We decompose those criteria into a number of more measurable factors, defined on the basis of the current games literature [12].

Fig. 2. Game evaluation process

The data from the overall correlation determine the user views, expert evaluation and comments used to develop the prototype, and they should be able to identify the specific elements that need to be enhanced. In Table I we add two columns that are useful for obtaining accurate results for statistical correlation. In column 5 we define a score for each heuristic from 1 to 5 (1 being worst, 5 being best), to be compared in every iteration with the previous iteration. Furthermore, in column 6 we add a priority from 1 to 3 (1 being the highest priority, 3 being the lowest), which is useful for indicating the importance of a heuristic depending on the game genre and the evaluators' points of view. The next section explains the required key factors in detail.

A. User Evaluation Methods

Within recent years, the term user experience has become a buzzword within the HCI community. According to [13], this is a counter-reaction to the more dominant task- and work-related usability paradigm. Still, it is not a completely new concept [14]. Nevertheless, a clear definition and grounded understanding of the term are still missing [15]. According to Law, the main problem is that user experience treats non-utilitarian aspects of interactions between humans and machines; this means that user experience mainly focuses on affect and sensation, two very subjective impressions [16][17].

Expert evaluators find it difficult to play as an ordinary player would play the game, and for that reason the evaluation session needs to involve real players as evaluators of the game. When selecting player evaluators, the players should at least be interested in games and preferably be familiar with the genre of the tested game. The user evaluators have to play the game until all of its aspects can be studied. Some games are designed to take 20-40 hours of playing time; in such cases the user evaluator can only examine common issues. In the first playing sessions most players form a first impression of the game that is very difficult to change; the player then gains experience with the game. The user evaluator's report from the evaluation session, which could be an interview, a scenario, a questionnaire or real play of the game, is coded as a positive or a negative player experience. A positive experience is defined as anything that increases the player's pleasure, immersion, or the challenge of the game. A negative experience is defined as any situation where the player is bored, frustrated, or wants to quit the game [7].
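As an illustration of this coding step, the short sketch below tallies user-evaluator session reports that have been labelled as positive or negative player experiences, in the sense of [7]. The report texts and the tallying function are invented for illustration; the paper does not prescribe a particular tool for this.

```python
from collections import Counter

# Hypothetical session reports, each already coded by the evaluator as a
# positive or negative player experience (pleasure/immersion/challenge vs.
# boredom/frustration/wanting to quit).
session_reports = [
    ("Lost track of time during the boss fight", "positive"),
    ("The tutorial level felt boring and too long", "negative"),
    ("Wanted to quit after the third identical fetch quest", "negative"),
    ("Unlocking the double jump felt rewarding", "positive"),
]

def summarize_reports(reports):
    """Count positive and negative experience codes for one iteration."""
    counts = Counter(label for _, label in reports)
    return {"positive": counts["positive"], "negative": counts["negative"]}

print(summarize_reports(session_reports))  # {'positive': 2, 'negative': 2}
```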


An advantage of user evaluators is that, even though they do not cover as many issues as expert evaluators, they identify issues in games such as boredom, challenge, pace level and the terminology used in the game.




V. EXPERT EVALUATION METHODS

The game developers' perception of the expert evaluation methods is interesting because these methods can measure different aspects that help in evaluating and testing games. The use of criteria for expert evaluation allows many issues in games to be identified and evaluated in depth [18]. The expert evaluation methods show that the criteria are a useful tool for reviewing games and identifying issues, as well as the effect of these issues on player enjoyment. Our heuristic sets are general-purpose heuristics, which means they are applicable to evaluating most game genres. They cover playability, quality, usability and enjoyment aspects. These criteria were selected because, according to the literature review, they are common to all games, cover most aspects needed in evaluating games and, finally, are easy to use. Some extra criteria, such as mobility, are specific to mobile games and do not apply in our general heuristic set.

A. Quality

One of the important measurements is the quality of the game throughout the development life cycle. The quality evaluation process starts with a careful planning phase, which includes the purpose of the evaluation, the timing of the evaluation and who should conduct the evaluation process [19]. The quality heuristics are used to assess not only the final version of a game but also its quality during the development life cycle, which enables us to prevent the majority of game failures. We have adopted the Quality Evaluation Framework (QEF) from [19]. QEF evaluates quality based on the ISO 9126 standard [Scalet et al., 2000] and is divided into three criteria. Every criterion aggregates a set of factors; a factor is a component that represents the system performance from a particular point of view (a small sketch of this aggregation follows the list below). The dimensions of our Cartesian quality space are Efficiency, Adaptability and Functionality, as shown in Table I:

• Efficiency: measures the system's ability to present different views of its content with minimum effort.



• Adaptability: measures how effectively the scenario and system contents can be extended, and how well the system presents different instructional design theories and different learning environments in a common platform.



• Functionality: reflects the characteristics of the game related to its operational aspects.
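The following sketch illustrates, in simplified form, how factors can be aggregated into the three quality criteria and then into a single quality figure. The factor names, weights and the plain weighted mean used here are assumptions made for illustration; the full QEF defined in [19] is richer than this.

```python
# Illustrative factor fulfilment values in [0, 1] for each quality criterion.
criteria = {
    "Efficiency":    {"content views with minimum effort": 0.8, "program structure": 0.7},
    "Adaptability":  {"extensible scenario and contents": 0.6, "common platform": 0.9},
    "Functionality": {"operational aspects": 0.75},
}
criterion_weights = {"Efficiency": 0.4, "Adaptability": 0.3, "Functionality": 0.3}

def criterion_value(factors):
    """Aggregate one criterion as the mean fulfilment of its factors."""
    return sum(factors.values()) / len(factors)

def overall_quality(criteria, weights):
    """Combine the criterion values into one quality score in [0, 1]."""
    return sum(weights[name] * criterion_value(factors)
               for name, factors in criteria.items())

print(round(overall_quality(criteria, criterion_weights), 3))  # 0.75
```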

B. Playability

What do we mean by playability, and which criteria affect it? This section answers these questions. A game has good playability when it is easy to use and at the same time fun and challenging. Playability in our heuristic set is a combination of game play, game story and game mechanics:

• Game play: the set of problems and challenges a user must face to win a game [7]. When evaluating game play, the evaluators should have some game design experience, understand the goal of the game and know the target players [5].

• Game story: usually includes all plot and character development [7].

• Game mechanics: the programming that provides the structure by which units interact with the environment [7]. Game mechanics are tested by Quality Assurance (QA) personnel in game companies to ensure that no broken games get shipped [4].

Game usability is related to playability; since it is an important aspect in its own right and covers game control and the game interface, we treat usability as a main criterion rather than as a part of playability.

C. Game Usability

Game usability covers the game control and the game interface through which the player interacts with the game. Good game usability ensures that the player has fun and comes back for another enjoyable session [5]. Most existing game usability heuristic sets are based on [3], whose ten usability heuristics are used to perform heuristic evaluation of software and websites. The game interface should allow the player to control the game fluently and should display all necessary information about the game status and possible actions [5].

D. Enjoyment

Player enjoyment is an important goal for games: if players do not enjoy a game, they will not play it again [18]. Sweetser's research focuses on game enjoyment [18] based on GameFlow, a model for evaluating player enjoyment in games that consists of eight elements: concentration, challenge, player skills, control, clear goals, feedback, immersion and social interaction. In Table I we omitted the duplicate heuristics that are already covered earlier in the table: control is covered in usability, clear goals and challenge are covered in playability, and feedback is also covered in usability.

E. Mobility

A mobile phone is an excellent companion for killing time or just doing something during short breaks, because it is always with the user. Taking a photo, sending a message, checking the calendar or browsing a web site are typical tasks that should be initiated without delay; therefore, the application and the phone should be in operating mode instantly [5]. Evaluating mobility aspects during play testing would have made the test sessions more complicated [17]. The mobility heuristics have been validated in several mobile game evaluations conducted by playability experts and provide challenges for the design, since the general context of the gameplay can vary more in mobile games than in traditional games. Mobile phone game requirements include not only playability and gaming aspects but mobility issues as well; for example, mobile phones can be taken into a variety of environments with changing lighting and noise levels [20]. Mobility is defined by how easily the game can be played wherever the player is. It reuses most of the usability heuristics; we have added only three extra mobility heuristics, as shown in Table I.



VI. DISCUSSION

Our goal in this paper was to develop an optimal heuristic set that can be used by game designers, acting as expert evaluators, from the early prototype until game release. Real users also participate in the evaluation, but some heuristic sets are covered only by experts, such as game mechanics and game control. Meanwhile, some evaluation sets, such as gameplay, are hard to assess and are usually evaluated at the final release of the game, after the other issues in the evaluation sets have been checked. An interesting observation is that the unstructured nature of heuristic evaluation has led to several criticisms of the technique. Our heuristic sets are clearly structured, and we add extra requirements: a score from 1 to 5, so that an accurate number can be compared with the previous prototype iteration, and a priority from 1 to 3, to give an indication of the importance of each heuristic. For example, "Use sound to provide meaningful feedback or stir a particular emotion" has a lower priority than "Make effects of the Artificial Intelligence (AI) clearly visible to the player by ensuring they are consistent with the player's reasonable expectations of the AI actor". Each heuristic should provide enough information to enable the evaluators to judge all possible problems of a game.
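To show how the score and priority columns could drive the next iteration, the sketch below orders heuristics by priority, then by current score, then by improvement over the previous iteration. The ordering rule is our own reading of how these two columns might be combined; the paper does not prescribe a formula, and the example values are hypothetical.

```python
# Hypothetical results for three heuristics in the current and previous iteration.
current = {
    "AI effects clearly visible":         {"score": 2, "priority": 1},
    "Sound provides meaningful feedback": {"score": 3, "priority": 3},
    "Menu layers are minimized":          {"score": 4, "priority": 2},
}
previous_scores = {
    "AI effects clearly visible": 2,
    "Sound provides meaningful feedback": 2,
    "Menu layers are minimized": 3,
}

def fix_order(current, previous_scores):
    """Rank heuristics to fix: highest priority, then worst score, then least improvement."""
    def key(item):
        name, values = item
        improvement = values["score"] - previous_scores.get(name, 0)
        return (values["priority"], values["score"], improvement)
    return [name for name, _ in sorted(current.items(), key=key)]

print(fix_order(current, previous_scores))
# ['AI effects clearly visible', 'Menu layers are minimized', 'Sound provides meaningful feedback']
```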


VII. CONCLUSION AND FUTURE WORKS

Several studies suggest that an evaluation phase that deals with most of the details needed in evaluating games helps designers to find important classes of problems that are not always found with user testing [21][22]. Currently, the evaluation phase in our suggested game development methodology is in progress. The data from the evaluation heuristics, together with the comments regarding the developed prototype, should identify the elements that need to be enhanced in the next methodology iteration. In order to obtain quantitative results, we added two extra columns: one to score each heuristic and another to set its priority, since some heuristics carry a different priority than others. Our heuristics use critical review of games to identify problems and to develop a set of design requirements for games. We believe that this general methodology is a new approach that can be used by researchers and designers to understand design issues for most game genres. Several researchers have developed heuristic sets; however, the work is still ongoing, and our heuristic set includes quite different issues. We also aim to achieve some clarity regarding the different aspects in heuristic sets and their usefulness in the game evaluation phase. Most of our heuristic sets can be used in the early development phases; in this case, solving problems does not incur extra cost and time, and it adds further enhancement to the game. As future work, we are planning to continue our studies to find the most optimal heuristic sets and integrate them into our initial design of a game development methodology. Unfortunately, there is not yet sufficient data from this study to make a deeper analysis of games. As future work, the proposed heuristic sets will be tested on a game under construction to compare the results in each iteration until the game is released.

REFERENCES

[1] H. Omar, R. Ibrahim, and A. Jaafar, "Methodology to evaluate interface of educational computer game," in 2011 International Conference on Pattern Analysis and Intelligent Robotics, Putrajaya, Malaysia: IEEE, June 2011, pp. 228-232. [Online]. Available: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5976931

[2] H. Korhonen, J. Paavilainen, and H. Saarenpaa, "Expert review method in game evaluations: comparison of two playability heuristic sets," in MindTrek 2009, Tampere, Finland: ACM, 2009, pp. 74-81. [Online]. Available: http://dl.acm.org/citation.cfm?id=1621856

[3] J. Nielsen and R. Mack, "Heuristic evaluation," New York: John Wiley & Sons, 1994, pp. 249-256. [Online]. Available: http://web.vtc.edu/users/cad03090/hci-r/heuristic.pdf

[4] M. Federoff, "Heuristics and usability guidelines for the creation and evaluation of fun in video games," Master of Science thesis, Indiana University, 2002.

[5] H. Korhonen and E. Koivisto, "Playability heuristics for mobile games," in MobileHCI 2006, Helsinki, Finland: ACM Press, 2006, pp. 9-16. [Online]. Available: http://portal.acm.org/citation.cfm?doid=1152215.1152218

[6] J. Paavilainen, "Critical review on video game evaluation heuristics: social games perspective," in Future Play 2010, ACM Press, 2010, pp. 56-65. [Online]. Available: http://dl.acm.org/citation.cfm?id=1920787

[7] H. Desurvire, M. Caplan, and J. Toth, "Using heuristics to evaluate the playability of games," in CHI 2004 Late Breaking Results, Vienna, Austria, 2004, pp. 1509-1512.

[8] L. Nacke, "From playability to a hierarchical game usability model," in Conference on Future Play - FuturePlay '09, Vancouver, BC, Canada: ACM Press, 2009, pp. 10-11. [Online]. Available: http://portal.acm.org/citation.cfm?doid=1639601.1639609

[9] N. Schaffer, "Heuristics for usability in games," Rensselaer Polytechnic Institute, white paper, April 2007.

[10] D. Pinelle, N. Wong, T. Stach, and C. Gutwin, "Usability heuristics for networked multiplayer games," in Proceedings of the ACM 2009 International Conference on Supporting Group Work, Sanibel Island, Florida, USA, 2009, pp. 169-178. [Online]. Available: http://dl.acm.org/citation.cfm?id=1531700

[11] G. Andrade, G. Ramalho, A. Gomes, and V. Corruble, "Dynamic game balancing: an evaluation of user satisfaction," in Proceedings of the 2nd Artificial Intelligence and Interactive Digital Entertainment Conference (AIIDE'06), AAAI Press, 2006, pp. 3-8. [Online]. Available: http://www.aaai.org/Papers/AIIDE/2006/AIIDE06-005.pdf

[12] A. Febretti and F. Garzotto, "Usability, playability, and long-term engagement in computer games," in CHI 2009 Extended Abstracts, Boston, MA, USA: ACM, 2009, pp. 4063-4068. [Online]. Available: http://dl.acm.org/citation.cfm?id=1520618

[13] M. Hassenzahl and N. Tractinsky, "User experience - a research agenda," Behaviour & Information Technology, vol. 25, no. 2, pp. 91-97, 2006.

[14] R. Bernhaupt and W. IJsselsteijn, "Evaluating user experiences in games," in CHI '08 Extended Abstracts on Human Factors in Computing Systems, Florence, Italy: ACM, 2008, pp. 3905-3908. [Online]. Available: http://dl.acm.org/citation.cfm?id=1358953

[15] E. Law, V. Roto, A. Vermeeren, J. Kort, and M. Hassenzahl, "Towards a shared definition of user experience," in CHI '08 Extended Abstracts on Human Factors in Computing Systems, Florence, Italy: ACM, 2008.

[16] C. Köffel, W. Hochleitner, J. Leitner, M. Haller, A. Geven, and M. Tscheligi, "Using heuristics to evaluate the overall user experience of video games and advanced interaction games," in Evaluating User Experience in Games, R. Bernhaupt, Ed. Springer, 2009.

[17] H. Korhonen, "Comparison of playtesting and expert review methods in mobile game evaluation," in Proceedings of the 3rd International Conference on Fun and Games, New York, USA: ACM Press, 2010, pp. 18-27. [Online]. Available: http://portal.acm.org/citation.cfm?doid=1823818.1823820

[18] P. Sweetser and P. Wyeth, "GameFlow: a model for evaluating player enjoyment in games," Computers in Entertainment (CIE), vol. 3, pp. 1-24, 2005. [Online]. Available: http://dl.acm.org/citation.cfm?id=1077246.1077253

[19] P. Escudeiro and N. Escudeiro, "Evaluation of serious games in mobile platforms with QEF (Quantitative Evaluation Framework)," in 2012 Seventh IEEE International Conference on Wireless, Mobile and Ubiquitous Technology in Education, Takamatsu, Kagawa, Japan, 2012, pp. 268-271. [Online]. Available: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6185045

[20] M. Britain and D. Bolchini, "Usability evaluation for health video games: a library of inspection heuristics."

[21] D. Pinelle, N. Wong, and T. Stach, "Heuristic evaluation for games: usability principles for video game design," in Proceedings of CHI 2008, 2008, pp. 1453-1462.

[22] R. Jeffries and J. Miller, "User interface evaluation in the real world: a comparison of four techniques," in Proceedings of CHI '91, New Orleans, LA: ACM Press, 1991, pp. 119-124.

TABLE I: Game Evaluation Sets (each heuristic is also assigned a Score from 1 to 5 and a Priority from 1 to 3)

Quality (sub-criteria: Adaptability, Efficiency, Functionality)
• The game is easily integrated with other environments.
• The game includes an evaluation system during the development process.
• The game allows for new techniques and better learning.
• The game allows for activities that keep the curiosity and the interest of the player in the content.
• The game allows the player to take decisions.
• There is no extra information.
• The game has a good program structure that allows easy access to content and activities.
• The speed of communication between the program and the user is adequate.
• The program execution is efficient and has no operational errors.
• The system has been developed with originality.
• The information is well structured and adequately distinguishes the objectives, context, results and multimedia resources.
• The game checks all the alert messages.

Playability: Game Play
• The game has varying activities and pacing during game play.
• The game provides clear goals or supports player-created goals.
• The game provides consistency between the game elements and the overarching setting and story to suspend disbelief.
• There is an interesting and absorbing tutorial that mimics game play.
• The game is fun for the player and enjoyable to replay.
• Game play is balanced, with multiple ways to win.
• The player is taught skills early that they are expected to use later, or right before the new skill is needed.
• Players discover the story as part of game play, and it holds their interest.
• The game changes strategy when the player repeatedly fails in the same way.
• The game gives rewards that immerse the player more deeply in the game by increasing their capabilities (power-ups) and expanding their ability to customize.
• There are variable levels of difficulty and an unexpected outcome.
• There are multiple goals on each level.
• Players are able to save games in different states and resume them later.
• The game gives hints, but not too many.
• The game can be played multiple times using different paths through the game.
• Challenges are positive experiences rather than negative ones.
• The player sees their progress in the game and can compare the results.
• The player is in control.
• There are no repetitive or boring tasks.
• The game supports different playing styles.
• Players are allowed to build content.


TABLE I: Game Evaluation Sets (continued)

Playability: Game Story and Game Mechanics
• There must not be any single optimal winning strategy.
• The player understands and is interested in the story line as a single consistent vision.
• The player spends time thinking about possible story outcomes.
• The player feels as though the world is going on whether their character is there or not.
• The player has a sense of control over their character and is able to use tactics and strategies.
• The player experiences fairness of outcomes.
• The player is interested in the characters because (1) they are like me; (2) they are interesting to me; (3) the characters develop as action occurs.
• Take the other person into account.
• Do not waste the player's time.
• The game should react in a consistent, challenging and exciting way to the player's actions (e.g., appropriate music with the action).
• Make the effects of the Artificial Intelligence (AI) clearly visible to the player by ensuring they are consistent with the player's reasonable expectations of the AI actor.
• A player should always be able to identify their score/status and goal in the game.
• Mechanics/controller actions have consistently mapped and learnable responses.
• Controls should be intuitive and mapped in a natural way; they should be customizable and default to industry-standard settings.
• The player should be given controls that are basic enough to learn quickly yet expandable for advanced options.
• Camera views match the action.
• The player is taught skills that will be needed later in the game.
• There are predictable and consistent responses to a user's actions.
• Responses to a user's actions are timely, allowing for successful interaction.
• Feedback should be given immediately to display user control.
• Get the player involved quickly and easily.

Usability: User Interface and Game Control
• Use sound to provide meaningful feedback or stir a particular emotion.
• Players do not need to use a manual to play the game.
• The interface should be as non-intrusive to the player as possible.
• Controls are customizable.
• Menu layers are minimized, or can be minimized.
• Screen layout is efficient and visually pleasing.
• Device UI and game UI are used for their own purposes.
• The player understands the terminology.
• Control keys are consistent and follow standard conventions.
• Provide users with information on game status.
• Provide instructions, training, and help.
• Follow the trends set by the gaming community to shorten the learning curve.
• Players should perceive a sense of control and impact on the game world.
• The game should be easy to learn and hard to master.
• Provide immediate feedback for user actions.
• The player can easily turn the game off and on, and is able to save games in different states.
• The player should experience the menu as a part of the game, and it should contain clear help.
• Upon initially turning on the game, the player has enough information to get started.


TABLE I: Game Evaluation Sets (continued)

Usability: Game Control (continued)
• There are means for error prevention and recovery.
• Game controls are convenient and flexible.
• The player cannot make irreversible errors.
• The player does not have to memorize things unnecessarily.
• Allow users to customize video and audio settings, difficulty and game speed.
• Provide predictable and reasonable behavior for computer-controlled units.
• Provide intuitive and customizable input mappings.
• Provide controls that are easy to manage and that have an appropriate level of sensitivity and responsiveness.

Enjoyment
• Games should provide a lot of stimuli from different sources.
• Games must provide stimuli that are worth attending to.
• Games should quickly grab the player's attention and maintain their focus throughout the game.
• The player should not be burdened with tasks that do not feel important.
• Games should have a high workload, while still being appropriate for the player's perceptual, cognitive and memory limits.
• Players should not be distracted from tasks that they want or need to concentrate on.
• Challenges in games must match the player's skill level.
• Games should provide new challenges at an appropriate pace.
• Learning the game should not be boring; it should be part of the fun.
• Games should include online help so the player does not need to exit the game.
• Overriding goals should be clear and presented early.
• Intermediate goals should be clear and presented at appropriate times.
• Players should receive immediate feedback on their actions.
• Players should become less aware of their surroundings.
• Players should become less self-aware and less worried about everyday life or self.
• Players should feel emotionally involved in the game.
• Players should feel viscerally involved in the game.
• Games should support competition and cooperation between players.
• Games should support social interaction between players (chat, etc.).
• Games should support social communities inside and outside the game.

Mobility
• The game and play sessions can be started quickly.
• The game accommodates its surroundings.
• Interruptions are handled reasonably.
