Med.Sci.Educ. (2014) 24:181–187 DOI 10.1007/s40670-014-0038-x
ORIGINAL RESEARCH
A Multimedia Audience Response Game Show for Medical Education

Robin K. Pettit · Lise McCoy · Marjorie Kinney · Frederic N. Schwartz
Published online: 3 April 2014
© International Association of Medical Science Educators 2014
Abstract

Games are increasingly popular in medical education. However, there is a need for games that target today's learners, including their preferences for active participation, social interaction, immediate feedback, and multimedia formats. With these preferences in mind, a commercially available game show template was used to develop a game show for review of medical microbiology. The game show was combined with an audience response system ("clickers") to enable participation of all students in a large group setting. A 19-item questionnaire was used to measure students' perceptions of the game. The questionnaire was administered after participants had played the games on three separate occasions during their first year of medical school. The response of medical students to the game shows was overwhelmingly positive. Students valued the ability of the game shows to engage them, to provide a positive learning environment, to clarify concepts, and to develop clinical thinking. The game software combined with an audience response system provides a visually rich, engaging format that could be used for a review of any basic science discipline.

Keywords: Game show · Clicker · Medical microbiology · Bravo C3 Softworks™
R. K. Pettit (✉) · L. McCoy · M. Kinney · F. N. Schwartz
School of Osteopathic Medicine in Arizona, A. T. Still University, 5850 E. Still Circle, Mesa, AZ 85206, USA
e-mail: [email protected]

Introduction

Undergraduate medical school classes are now largely made up of net generation students, or students born between 1982 and 1991 [1]. Net generation students have a high degree of technological literacy and a preference for active versus passive learning [1]. The effectiveness of active learning on a broad range of learning outcomes is widely accepted [2–4]. Students who engage interactively with each other and with the instructor learn concepts better, retain them longer, and apply them more effectively in other contexts than students who sit listening passively [2, 5]. Two core elements of active learning, student activity and engagement, are central to games.

A survey of family medicine and internal medicine residency program directors in the USA indicated that 80 % used games as an educational strategy in their residency training programs [6]. Educational games for medical students include board games [7–9], card games [10], video games [11], and game shows [12–18]. Some games in medical education are directed toward teaching new concepts [7, 8, 10, 12, 14, 16–18], while the purpose of others is review [9, 13, 15, 19]. While there is evidence that students find games more enjoyable and stimulating than standard lectures [16, 18], evidence for their utility in increasing knowledge is conflicting, perhaps in part because of the limited number of rigorous studies [20].

Several published game shows for medical education are PowerPoint-based. Jirasevijinda and Brown [14] and Shiroma et al. [18] describe PowerPoint-based Jeopardy games, and Moy et al. [15] describe a PowerPoint-based Who Wants to be a Millionaire game. Other medical school programs have used non-electronic game show formats [13, 16, 17]. In 2008, Akl et al. [12] published a description of Guide-O-Game, an interactive Jeopardy game developed by IT specialists at their university; the authors report that it was expensive to implement and time-consuming to develop. Given the popularity of game shows in medical education [12–18], a template for creating multimedia games for large group settings that is easy to build and implement is needed.
We developed a review tool using a commercially available electronic game show template, Bravo C3 Softworks™. To our knowledge, the application of Bravo C3 Softworks™ training software to educational settings has not been published. The software uses a customizable gameboard template that can be enhanced by adding video, audio, and graphics. While the software can be used to create teaching games or review games, the games described here targeted review of medical microbiology knowledge. The games were combined with an audience response system to enable participation of all students in a large group setting. The game shows were developed and then implemented in three different organ system courses during year 1 of an osteopathic medical curriculum.

With the goal of improving instruction in our program, particularly with regard to increasing the number and types of interactive sessions, we posed the following research question: what are students' perceptions of the game shows regarding the ability of the games to engage, provide a positive learning environment, clarify concepts, and develop clinical thinking?
Materials and Methods

Game Show Software

Game show software was purchased from C3 SoftWorks (Minneapolis, MN). Wireless keypads (clickers) and a receiver were purchased from Turning Technologies (Youngstown, OH). Bravo C3 Softworks™ offers several game show templates, for example, Quiz Show, Billionaire, and Spin-Off. The Spin-Off game was selected for this study.
Construction of the Game Shows

A menu for customizing various features of the game appears when the game builder interface opens (Fig. 1a, left side). Under the question tab (Fig. 1b, left side), questions and answers were copied and pasted from Microsoft Word into the game template (one can also create and edit questions within the question interface). The Spin-Off game allows a maximum of 50 questions. To improve readability in large group settings, the question preview option was selected for questions with long stems. The question builder interface also has an option for adding a review screen that displays once students answer a question; the review screen can facilitate teaching, clarification, or reinforcement. Images (jpg, gif, png) and video clips (swf, flv) were imported into the corresponding question template. Audio clips (mp3) can also be imported.

Using the teams tab, team avatars were customized prior to the game using a large selection of attributes available in the software. For the games in the current study, we used the attributes to create male and female doctor avatars (Fig. 1a, middle). If there is time prior to game play, students can select or customize their own avatars, which stimulates investment in the game at the outset. Under the settings tab, we set the game to automatically go to the next question within a category (instead of spinning again, which saves time in class) and set the time to answer each question to 40 s. Keypads were selected as the response type using the response settings tab. Sounds and graphics can also be customized, but default settings were used for these games.

Building and implementing these games is quite straightforward. The game software is very user-friendly, and if game questions are prebuilt and copied and pasted into the game template, a game can be created in 1 h to a few hours, depending on the number of questions.

Implementation of the Game Shows

The game is controlled by the instructor at the podium using a mouse. Students view the game on eight 50-in. plasma flat screens in our large group classroom. At the outset of the game, each student joins their team (male or female doctors) by simply pressing any button on their clicker. The instructor spins the wheel to select question categories, students individually respond to questions in each category with their clickers, and the instructor clicks through correct answers, team scores, and review slides. At the end of the game, a detailed report of all of the results is displayed and can be transferred to a Microsoft Excel file. More details of game play, including the entertaining and competitive features that distinguish this activity from other types of games, can be found in the "Discussion" section.

Construction of the Survey Instrument
A 19-item questionnaire was used to measure students' perceptions of the games. A literature search facilitated the design of key constructs for this survey [1, 7, 8, 10, 15, 16, 18]. Students' perceptions of the games were measured in four categories: the ability of the games to engage (four questions), provide a positive learning environment (four questions), clarify concepts (four questions), and develop clinical thinking (four questions). Three questions at the beginning of the survey addressed age range (22–25, 26–30, 31–35, 36–40, 41+), gender, and the number of times the students had participated in these games (0, 1, 2, 3). The A. T. Still University Institutional Review Board deemed the study exempt.

Data Collection

Survey data collection involved an email solicitation containing a clickable link to an online survey. A reminder email was sent 1 week later. Participation was voluntary and anonymous, and no rewards were offered for completing the survey. Students were asked to evaluate the extent to which they
agreed with the statements using a five-point Likert scale (1, strongly agree; 2, agree; 3, neutral; 4, disagree; 5, strongly disagree). A general comment section ("other comments") was included at the end of the survey. The questionnaire was administered after participants had played the games on three separate occasions in three different organ system courses (summer 2012, fall 2012, and spring 2013) during their first year of medical school. The student population was 107 first-year medical students.

Fig. 1 Screenshots of the game builder menu and customized avatars (a), question builder interface (b), a student's response feedback during a game (c), and team results feedback during a game (d)

Data Analysis

All statistical analyses were completed using IBM SPSS Statistics 21™. Responses for each item were compared to determine whether item rankings were the same across the number of games played (independent samples Kruskal-Wallis test). Differences found in items using Kruskal-Wallis tests were then compared individually using repeated Mann-Whitney U tests. Finally, responses were grouped according to the four categories: engagement, creation of a positive learning environment, clarification of concepts, and practice with clinical thinking. The Likert ratings were categorized into either positive or neutral/negative
response, combining "strongly agree" and "agree" into the positive category and "neutral," "disagree," and "strongly disagree" into the neutral/negative category. Responses in each of the four categories were compared using a chi-square test.
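For readers who wish to reproduce this analysis plan outside SPSS, the first step (the omnibus Kruskal-Wallis test followed by pairwise Mann-Whitney U tests) can be sketched in Python with SciPy. This is a minimal sketch on simulated ratings — the group sizes mirror those reported in the "Results" section, but the rating values themselves are randomly generated, not the authors' data:

```python
# Sketch of the Kruskal-Wallis / Mann-Whitney analysis described above.
# The Likert ratings here are simulated; only the group sizes (12, 17, 40
# respondents who played 1, 2, or 3 games) come from the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated five-point Likert ratings (1 = strongly agree ... 5 = strongly
# disagree) for one survey item, grouped by number of games played.
ratings_by_games = {
    1: rng.integers(1, 6, size=12),
    2: rng.integers(1, 6, size=17),
    3: rng.integers(1, 6, size=40),
}

# Omnibus test: are item rankings the same across the number of games played?
h_stat, p_kw = stats.kruskal(*ratings_by_games.values())

if p_kw < 0.05:
    # Follow up significant items with repeated pairwise Mann-Whitney U tests,
    # as in the paper's analysis plan.
    for a, b in [(1, 2), (1, 3), (2, 3)]:
        u_stat, p_mw = stats.mannwhitneyu(ratings_by_games[a], ratings_by_games[b])
        print(f"games {a} vs {b}: U={u_stat:.1f}, p={p_mw:.3f}")
else:
    print(f"No difference across groups (H={h_stat:.2f}, p={p_kw:.3f})")
```

In practice this loop would run once per survey item; repeated pairwise testing inflates the family-wise error rate, so a correction (e.g., Bonferroni) would be a reasonable addition.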
Results

Students' perceptions of the game shows were queried in four areas: engagement, creation of a positive learning environment, clarification of concepts, and practice with clinical thinking. A total of 73 students (68.2 %) in the class of 2016 completed the perception survey. Four respondents were omitted from analysis because they selected zero for the number of times they had participated in the games. One hundred and seven students were enrolled in the class of 2016, but attendance at large group sessions, where the games were offered, is optional (lecture capture technology is used in our program). Of the remaining 69 respondents, 36 were male, 32 were female, and one student did not specify gender. There were 37 respondents aged 22–25, 20 aged 26–30, 7 aged 31–35, 4 aged 36–40, and 1 student over age 41. Forty students had played three games, 17 students had played two games, and 12 students had played one game.

Figure 2 summarizes student responses to Likert-scale statements about their perceptions of the games. The majority of the respondents agreed or strongly agreed that the games offered an engaging format (Fig. 2(a)), provided a positive learning environment (Fig. 2(b)), clarified concepts (Fig. 2(c)), and developed clinical thinking (Fig. 2(d)). There were no statistically significant differences in rankings for any of the questions when grouped according to the number of games played (independent samples Kruskal-Wallis tests; p values ranged from 0.128 to 0.939). To determine whether students valued any single category more than the others, a chi-square test on the proportion of positive responses in each of the four categories (Fig. 2) was performed. Likert ratings were categorized into either positive or neutral/negative, combining strongly agree and agree into the positive category and neutral, disagree, and strongly disagree into the neutral/negative category. There was no statistically significant difference in the responses to the four sets of questions (chi-square=0.507, p=0.917).

Nine students wrote comments in response to the optional prompt "other comments." These survey comments were categorized by theme using open coding [21] (Table 1). Five of the statements were extremely positive, providing insights as to why students felt the games were valuable or fun; students appreciated the competition, unconscious
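The category-level chi-square comparison can likewise be illustrated in Python. The contingency table below is invented for demonstration (the paper's own pooled counts are not reported per cell); on the authors' data this test gave chi-square = 0.507, p = 0.917:

```python
# Illustrative sketch of the category-level chi-square comparison.
# The counts are hypothetical, not the study's actual pooled responses.
from scipy.stats import chi2_contingency

# Rows: the four perception categories. Columns: [positive, neutral/negative],
# where "positive" pools "strongly agree" and "agree" responses across the
# four items in each category.
observed = [
    [240, 36],  # engagement (hypothetical pooled counts)
    [236, 40],  # positive learning environment
    [233, 43],  # clarification of concepts
    [230, 46],  # clinical thinking
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.3f}, df = {dof}, p = {p:.3f}")  # df = 3 for a 4x2 table
```

A non-significant result, as in the study, indicates that the proportion of positive responses does not differ detectably across the four categories.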
learning, and the practice exam questions. One student wished every teacher would provide similar activities. Six statements delivered specific suggestions for improving the games. Key themes included pace, the length of the stem or case vignette, access to the game show questions outside of game play, and the timing of the review game with respect to the material being learned. Two responses mentioned speed or pace within the games: for one student the pace was too fast, and for the other the pace was acceptable (a positive comment, but grouped here for simplicity). Two responses suggested shortening the question stems or vignettes. Another two responses requested access to the questions before or after the class session, and one requested more time to digest course content prior to playing the game. Two comments were omitted because they were unclear.
Discussion

We combined the engaging qualities of a multimedia game show with the power of an audience response system to create a review tool that would appeal to first-year medical students. Net generation students are visual and kinesthetic learners; they gravitate toward activities that promote and reinforce social interaction (they prefer to learn and work in teams) and expect immediate responses [1]. Features of the described game show that address net generation student preferences
include a multimedia format, physical action (clickers), team play, and immediate feedback. Akl et al. [20] suggest that the potential benefit of educational games is best realized when three critical factors are addressed in game development: active learning, integration of fun and excitement, and feedback mechanisms. The game shows incorporate all of these features. At least in part because of the clickers, all students in the class are actively involved, as opposed to an instructor-led review in which students sit passively and respond to questions individually. Fun and excitement are generated by the entertaining and competitive features of the game. The game opens with game show music and colorful graphics, and students log on to their teams (males against females in this study) with their clickers. As individual students click their answer choices during the game, the countdown timer ticks and then buzzes when time is up. Other game-generated sounds include the spinning wheel, clapping, and booing. Team results are tabulated and displayed immediately after each question round, so the competition heats up regularly. The students contribute to the noise as well, sending vocal encouragement to teammates around the classroom after each result display. When the winning team is announced on-screen, fireworks explode (and the students roar!), and the winning team receives a candy reward.

Fig. 2 Students' perceptions of the game shows in four categories: the ability of the games to engage them (a), provide a positive learning environment (b), clarify concepts (c), and develop clinical thinking (d)

Table 1 Responses to the "other comments" prompt, categorized by theme using open coding [21]

Positive comments
- Competition: "Making it competitive also made it fun and collaborative within each group"; "Battle of the sexes was the most fun format"
- Unconscious learning: "Very fun activity where you don't even realize you're learning"
- Practice exam questions: "The in-class games have been a great way to get some much-needed practice questions on difficult information"
- Request for more games: "These are amazing!!! I wish every teacher would do this"

Suggestions for improving the activities
- Pace: "The pace on these can often be too fast for me to critically think about the material before moving on"; "I like that it is not a fastest answer response set-up so that everyone has an opportunity to answer the questions"
- Length of question stem or vignette: "I wish that vignettes were a little shorter (maybe bullet points instead of paragraphs)/font a little larger to read on the screens"; "Sometimes the questions stems are too long for the pace of the game. If they are long, the font is small and difficult to read"
- Access to game show questions: "It would be helpful to have the game accessible at a later time for review"; "It would be helpful if a list of the questions given were in a word document or sent out after the event so I could review the questions in preparation for the test"
- Game schedule: "Holding them within or near to the lecture in which the material was taught is not helpful, I need time to integrate and learn the information before being tested on it"
The game incorporates multiple feedback mechanisms, the third critical factor in game development. Feedback from formative assessment, such as this type of review game, can motivate students and redirect their learning toward areas of deficiency, particularly if done under conditions that are non-judgmental and conducive to learning [22]. After the countdown clock buzzes in the Spin-Off game, the instructor selects the correct answer, and the percentage of students responding to each answer choice is displayed (Fig. 1c). Misconceptions can be addressed immediately, with or without a review screen. As described in the "Materials and Methods" section, review screens for teaching, reinforcement, or clarification are easily built into the game. Team results are tabulated immediately and displayed graphically on the subsequent screen (Fig. 1d, right side). Team avatars appear to the left of the results graph, either clapping or groaning (Fig. 1d, left side). A detailed report of all of the results is displayed at the end of the game, which allows the instructor to quickly identify problem areas for further instruction.

In addition to providing feedback, formative assessment activities should be an opportunity for students to develop familiarity with summative instruments [22]. Approximately half of the questions in the game shows were strict recall, while the other half matched the style used in our summative assessments: board style, with clinical vignette stems and
multiple choice answers. The multimedia game show described here is an extremely useful tool for difficult, content-dense subjects like medical microbiology, where games have been shown to help counter feelings of despair related to assimilating the large volume of facts and terminology [7].

Students' responses to the game shows, from both net generation students and older students (>31 years old), were overwhelmingly positive (Fig. 2). The majority of the students agreed or strongly agreed that the games were engaging (Fig. 2(a)), provided a positive learning environment (Fig. 2(b)), clarified concepts (Fig. 2(c)), and developed clinical thinking (Fig. 2(d)). Statistical analysis indicated that students did not place more value on any one of the four categories. The positive response to the categories addressing clarification of concepts (Fig. 2(c)) and development of clinical thinking (Fig. 2(d)) indicates that these games were perceived to be a valuable educational tool. It is noteworthy that 87.7 % of the students agreed or strongly agreed with the statement "I look forward to playing more games in the classroom."

One student strongly disagreed with all statements in all categories but did not submit an optional comment. As such, the student's dissatisfaction with the games (or perhaps the instructor!) cannot be addressed. While the majority of the students did not provide optional comments (Table 1), useful information was gained. In our large group classroom, students sit at round tables and view the games on eight 50-in. plasma flat screens; smaller fonts can therefore be difficult to read. For future games in our large group classroom, question stems will be shortened as much as possible to improve visibility. Two 80-in. projector screens were recently added to the classroom, which should also help.
The comment "Holding them within or near to the lecture in which the material was taught is not helpful, I need time to integrate and learn the information before being tested on it" is understandable; however, it is extremely difficult to schedule additional time for a particular topic in our integrated clinical presentation curriculum [23–25]. A possible solution would be to schedule lunchtime review games several days after material is delivered. One limitation of this study is that survey participants represented 64.4 % of the class; caution is therefore recommended in extrapolating the results to non-participating students and to other medical student populations.
Conclusion

Net generation students have a high degree of technological literacy. They prefer active, first-person experiential learning, interactivity, and image-rich formats over text. The game show described here was developed with these preferences in mind, and it received overwhelmingly positive responses from first-year medical students. The game software combined with
an audience response system provides a visually rich, engaging format that should be applicable to any basic science discipline.

Acknowledgments This research was supported by HRSA grant no. D54HP20674.

Notes on Contributors

Robin K. Pettit, Ph.D., is a Professor of Microbiology at the School of Osteopathic Medicine, A. T. Still University, Mesa, AZ, USA.

Lise McCoy, MTESL, is an Assistant Director, Faculty Development at the School of Osteopathic Medicine, A. T. Still University, Mesa, AZ, USA.

Marjorie Kinney, M.Ed., is a Curriculum Assessment Analyst at the School of Osteopathic Medicine, A. T. Still University, Mesa, AZ, USA.

Frederic N. Schwartz, D.O., FACOFP, is the Associate Dean, Clinical Education and Services, and Professor and Chair, Family and Community Medicine at the School of Osteopathic Medicine, A. T. Still University, Mesa, AZ, USA.
References

1. Oblinger DG, Oblinger JL. Educating the net generation. 2005. www.educause.edu/educatingthenetgen. Accessed 2 Jul 2013
2. Handelsman J, Ebert-May D, Beichner R, Bruns P, Chang A, DeHaan R, Gentile J, Lauffer S, Stewart J, Tilghman SM, Wood WB (2004) Scientific teaching. Science 304:521–522
3. Michael J (2006) Where's the evidence that active learning works? Adv Physiol Educ 30:159–167
4. Prince M (2004) Does active learning work? A review of the research. J Engr Educ 93:223–231
5. Wood W (2004) Clickers: a teaching gimmick that works. Dev Cell 7:796–798
6. Akl EA, Gunukula S, Mustafa R, Wilson MC, Symons A, Moheet A, Schunemann HJ (2010) Support for and aspects of use of educational games in family medicine and internal medicine residency programs in the US: a survey. BMC Med Educ 10:1–5
7. Beylefeld AA, Struwig MC (2007) A gaming approach to learning medical microbiology: students' experiences of flow. Med Teach 29:933–940
8. Valente P, Lora PS, Landell MF, Schiefelbein CS, Girardi FM, Souza LR, Zonanto A, Scroferneker ML (2009) A game for teaching antimicrobial mechanisms of action. Med Teach 31:e383–e392
9. Zakaryan V, Bliss R, Sarvazyan N (2005) Non-trivial pursuit of physiology. Adv Physiol Educ 1:11–14
10. Da Rosa AC, Moreno Fde L, Mezzomo KM, Scroferneker ML (2006) Viral hepatitis: an alternative teaching method. Educ Health 19:14–21
11. Graafland M, Schraagen JM, Schijven MP (2012) Systematic review of serious games for medical education and surgical skills training. Br J Surg 99:1322–1330
12. Akl EA, Mustafa R, Slomka T, Alawneh A, Vedavalli A, Schunemann HJ (2008) An educational game for teaching clinical practice guidelines to internal medicine residents: development, feasibility and acceptability. BMC Med Educ 8:1–9
13. Hudson JN, Bristow DR (2006) Formative assessment can be fun as well as educational. Adv Physiol Educ 30:33–37
14. Jirasevijinda T, Brown LC (2010) Jeopardy! An innovative approach to teach psychosocial aspects of pediatrics. Patient Educ Couns 80:333–336
15. Moy JR, Rodenbaugh DW, Collins HL, DiCarlo SE (2000) Who wants to be a physician? An educational tool for reviewing pulmonary physiology. Adv Physiol Educ 24:30–37
16. O'Leary S, Diepenhorst L, Churley-Strom R, Magrane D (2005) Educational games in an obstetrics and gynecology core curriculum. Am J Obst Gynec 193:1848–1851
17. Schuh L, Burdette DE, Schultz L, Silver B (2008) Learning clinical neurophysiology: gaming is better than lectures. J Clin Neurophysiol 25:167–169
18. Shiroma PR, Massa AA, Alarcon RD (2011) Using game format to teach psychopharmacology to medical students. Med Teach 33:156–160
19. Rajasekaran SK, Senthilkumar U, Gowda V (2008) A PowerPoint game format to teach prescription writing. Med Teach 30:717–718
20. Akl EA, Pretorius RW, Sackett K, Erdley S, Bhoopathi PS, Alfarah Z, Schunemann HJ (2010) The effect of educational games on medical students' learning outcomes: a systematic review. BEME Guide No. 14. Med Teach 32:16–27
21. Glaser BG, Strauss AL (1967) The discovery of grounded theory: strategies for qualitative research. Aldine Transaction, New Brunswick
22. Rolfe I, McPherson J (1995) Formative assessment: how am I doing? Lancet 345:837–839
23. Pettit RK, Kuo YP (2013) Mapping of medical microbiology content in a clinical presentation curriculum. Med Sci Educ 23:201–211
24. Schwartz FN, Hover ML, Kinney M, McCoy L (2012) Student assessment of an innovative approach to medical education. Med Sci Educ 22:102–107
25. Schwartz FN, Hover ML, Kinney M, McCoy L (2012) Faculty assessment of an innovative approach to medical education. Med Sci Educ 22:108–116