This article is part of a symposium: System Dynamics and Simulation/Gaming
Improving Instructional Simulation With Structural Debriefing
Simulation & Gaming, 1-21. © 2015 SAGE Publications. DOI: 10.1177/1046878114567980
Oleg V. Pavlov1, Khalid Saeed1, and Lawrence W. Robinson2
Abstract

Background. Research shows that learning and task performance improve when participants in management exercises understand the structure of the system they control. However, the majority of business simulators are black-boxes. Aim. This article introduces structural debriefing, which is a debriefing activity aimed at helping students learn about causal relationships, feedbacks, accumulations, and delays within a black-box simulation. Method. A structural debriefing can be prepared and facilitated by following the Structural Debriefing Protocol. Results. A pilot study was conducted in which undergraduate students participated in a structural debriefing of The LITTLEFIELD TECHNOLOGIES, a popular simulation for teaching principles of operations management. The students were able to complete all eight steps of a structural debriefing, but required considerable time (three academic terms) to do so. Not every instructional simulation will require all the steps or such a large time commitment. Conclusion. The successful completion of the pilot study demonstrates that structural debriefing is a useful debriefing technique. However, to be effective, the scope and format of a structural debriefing activity must suit practical and pedagogical considerations.
1Worcester Polytechnic Institute, USA
2Cornell University, USA
Corresponding Author: Oleg V. Pavlov, Associate Professor, Worcester Polytechnic Institute, SSPS, WPI, 100 Institute Rd., Worcester, MA 01609, USA. Email: [email protected]
Keywords accumulations, black-box simulation, business training, causal relationships, debriefing, delays, feedbacks, LITTLEFIELD, operations management, simulation, simulators, steps, structural debriefing, system dynamics, time
Computer-based instructional simulations have gained wide acceptance since they were first introduced in the 1960s (Steenhuis, Grinder, & de Bruijn, 2011). The majority of instructional simulators are ready-made black-boxes with user-friendly interfaces that display input-output and provide only minimal information about the internal structure of the simulated system. Black-box simulators are effective and efficient teaching tools in situations when detailed knowledge of the system is immaterial for performance and learning (Alessi, 2000). For example, pilots train on flight simulators without diving into details of engine mechanics. Experiments have shown, however, that students make better management decisions when they understand the structure of the system in the exercise (e.g., Größler, Maier, & Milling, 2000; Qudrat-Ullah, 2007; Sterman, 2010). Therefore, this article reports the development of a debriefing procedure, called structural debriefing, that helps students learn about causal relationships, feedbacks, accumulations, and delays within a black-box simulation.

Structural debriefing brings together two key insights. The first one is from the field of simulation and gaming. It is the commonly accepted view that debriefing improves the simulation experience (Crookall, 2010; Lederman, 1992). The second key insight is that system structure determines system behavior, which is the fundamental principle of the system dynamics methodology (e.g., Größler, Thun, & Milling, 2008; Richardson & Pugh, 1981). System dynamics is the second most widely used simulation technique in operations research (Jahangirian, Eldabi, Naseer, Stergioulas, & Young, 2010), and it has been applied to a range of topics including production (e.g., Größler et al., 2008) and renewable resource management (e.g., Qudrat-Ullah, 2007). Several universities offer instruction in system dynamics (Davidsen, Kopainsky, Moxnes, Pedercini, & Wheat, 2014; Pavlov, Doyle, Saeed, Lyneis, & Radzicki, 2014; Wakeland, 2014).

A structural debriefing can be prepared and facilitated by following the Structural Debriefing Protocol, which is based on the standard method in system dynamics. As a pilot study, we guided a team of four undergraduate students through a structural debriefing of a popular business online game called The LITTLEFIELD TECHNOLOGIES. It simulates a small manufacturing facility that produces make-to-order electronic equipment. The game is used in operations management courses to illustrate concepts of utilization, queuing, scheduling, and inventory (Miyaoka, 2005; Snider & Balakrishnan, 2013; Wood, 2007). While this article focuses on business education, structural debriefing can be used with simulations in other fields.
In the next section, we review the use of simulations and debriefing in business education. Then, we describe studies that show the positive impact of structural knowledge on decision making. The third section explains structural debriefing, and the fourth reports on the pilot study. We follow that with a discussion on the scope and format of a structural debriefing designed to suit the practical and pedagogical considerations of a particular course. We conclude with a summary and thoughts on future directions for this research.
Simulations in Business Education and the Importance of Debriefing

Nearly all business programs that are accredited by The Association to Advance Collegiate Schools of Business (AACSB) use simulations in instruction (Wood, 2007). Business simulations are also popular in corporate training (Summers, 2004). The wide adoption of instructional simulations has been facilitated by the evolution of personal computers and the Internet as well as by the growth and acceptance of computational methods. Although they range in sophistication, quality, and cost to the users, business simulations usually portray a firm or an industry, and decisions revolve around allocating resources. For examples, the reader may visit the website of The Association for Business Simulation and Experiential Learning. Reflecting their experiential nature, simulations are frequently referred to as games or serious games (Crookall, 2010). A typical simulation is part of a course, mixed with lectures, cases, and homework assignments. Instructors build other activities around simulations, such as group exercises, team competitions for the best performance, strategy development, and post-game analysis (Anderson & Lawton, 1997). For the most part, students enjoy simulations (Moizer & Lean, 2010; Steenhuis et al., 2011).

Most of the learning in a simulation exercise comes from a debriefing that aims to construct meaning from the simulation experience (Crookall, 2010; Lederman, 1992; Markulis & Strang, 2003; Steinwachs, 1992). Common formats for a debriefing are discussions and journal writing (Petranek, Corey, & Black, 1992; Steinwachs, 1992). A debriefing may include analysis, personalization, and generalization activities (Lederman, 1992; Steinwachs, 1992). During the analysis, participants introduce themselves and recollect what happened. In the personalization phase, participants describe the effect of the simulation on them personally. A debriefing may conclude with a review of wider applications of the experience.
Structural Knowledge and the Behavior of a System

A significant body of research on learning and management demonstrates that people tend to underestimate causal complexity. We often think in terms of simple linear relationships such as "A causes B" (Perkins & Grotzer, 2005). The reality, however, is seldom that simple. Natural, social, and business systems consist of many parts that interact through complex causal webs of mutual causality, feedback loops, domino causality, and spiraling causality (Grotzer, 2012). People also make judgment mistakes because they misperceive the effects of accumulations and delays within the system (e.g., Moxnes, 2000; Sterman, 1989).

In one study, when students were given information on the number of customers entering and exiting a store, they consistently misjudged the number of people in the store at any particular moment (Sterman, 2010). In another set of experiments, participants repeatedly obtained poor results while playing The BEER DISTRIBUTION GAME, a simple simulation of industrial production and distribution with a number of embedded delays and feedbacks (Sterman, 1989). Similar cognitive deficiencies were observed in experiments that involved managing natural resources (e.g., Moxnes, 2000; Perkins & Grotzer, 2005; Qudrat-Ullah, 2007). It has been suggested that people are generally not concerned about climate change because they underestimate the delayed effects of CO2 accumulation in the environment (Sterman, 2008).

Several reasons rooted in our cognition prevent people from making optimal decisions (Grotzer, 2012). We tend to learn when we see things happen in response to an action; this learning mechanism fails to grasp the causal complexity of systems, especially when an event has several causes or when feedback is involved. Often people draw conclusions from statistical correlations rather than true causation. Also, in pursuit of efficiency, our minds ignore a great deal of potentially useful information. Certain things simply do not capture our attention.

Research shows, however, that structural knowledge improves task performance. Consider again the experiment mentioned above in which students were asked to predict the number of customers in the store. They erred because they relied on the correlation heuristic: they assumed that the trajectory of the stock should be positively correlated with the flows in and out of the stock, which is incorrect. After the stock and flow structure of the problem was explained to students, the error rate dropped by 50% (Sterman, 2010).

In experiments conducted by Qudrat-Ullah (2007), students played a resource management game, The FISH BANKS. In this game, students manage a fishery and a fishing fleet. Performance is determined by the total ending profit and the remaining fish population. Prior to playing FISH BANKS, the treatment group discussed the main feedback loops in the game's underlying model. That discussion improved the mental models that students had of the system and led to more robust heuristics and better decisions. Students in the treatment group outperformed other students.

Größler et al. (2000) measured the effect of structural information on student performance when they played the business simulator called LEARN! The simulation included production, financials, human resources, and marketing. The control group used the simulation as a black-box, whereas treatment groups received information about the feedback structure of the simulation. The structural information was communicated in a number of formats. One treatment group received a 45-minute lecture on the feedbacks in the model. The other treatment group had continuous access during the simulation to help screens with diagrams of causal relationships between variables. The results suggested that familiarity with the feedback structure led to better outcomes.
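The accumulation arithmetic behind the store task is easy to reproduce. The following minimal sketch (ours, in Python; the flow data are invented for illustration) shows why the correlation heuristic fails: the stock peaks when the inflow crosses below the outflow, not when the inflow itself peaks.

```python
# Illustrative stock-and-flow arithmetic for the "customers in a store" task.
# The flow data below are invented; the structural point is general.

entering = [10, 14, 18, 14, 10, 6, 4, 2]    # customers entering per minute
exiting  = [4, 6, 10, 14, 16, 14, 10, 8]    # customers leaving per minute

store = 20                                   # initial stock: people in the store
history = [store]
for inflow, outflow in zip(entering, exiting):
    store += inflow - outflow                # the stock accumulates the net flow
    history.append(store)

print(history)                               # [20, 26, 34, 42, 42, 36, 28, 22, 16]
print(history.index(max(history)))           # the stock peaks at t = 3...
print(entering.index(max(entering)))         # ...but the inflow peaks at t = 2
# The stock keeps rising as long as entering > exiting and peaks where the two
# flows cross; its trajectory is not simply correlated with either flow.
```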
Table 1. Eight Steps of the Structural Debriefing Protocol.

Steps                     Skill level     Time required
1. List of variables      Basic           Short
2. Reference modes        Basic           Short
3. Momentum strategies    Basic           Short
4. Dynamic hypotheses     Basic           Medium
5. Model construction     Intermediate    Long
6. Model validation       Intermediate    Long
7. Strategy testing       Intermediate    Long
8. A written report       Basic           Medium
Structural Debriefing

How can an instructor help students learn about the structure of the simulated system? We suggest that this task can be accomplished with structural debriefing, which is a special type of debriefing activity. Structural debriefing incorporates fundamental principles from two disciplines. The first principle comes from simulation and gaming, and it is the recognition that a debriefing activity positively contributes to the simulation experience (Crookall, 2010; Lederman, 1992). Second, structural debriefing adopts the view from system dynamics that system structure determines system behavior (e.g., Größler et al., 2008; Richardson & Pugh, 1981).

The Structural Debriefing Protocol (Protocol, for short) is a step-by-step description of a structural debriefing activity. The Structural Debriefing Protocol is an adaptation of the "standard method" for system dynamics modeling (Pavlov et al., 2014). The term "the standard method" was coined by Professors James Hines and James Lyneis when they developed a graduate course called System Dynamics Foundations, first taught at the Massachusetts Institute of Technology and now at Worcester Polytechnic Institute (WPI). The standard method is a process for building, validating, and using a system dynamics model for policy analysis.

The Protocol includes eight steps (Table 1). Each step requires a different level of system dynamics knowledge and training (see Pavlov et al., 2014, for the discussion of skill levels). The last column in Table 1 estimates the relative time it takes to complete each step. In the most comprehensive structural debriefing activity, students would complete the eight steps of the Protocol from start to finish. They would build a system dynamics model of a simulation, validate it, and conduct strategy testing. This approach is consistent with the system dynamics view that the best way to learn about a system is to build its computer model and then conduct computer simulations (Alessi, 2000; Größler et al., 2000). However, as discussed later in the article, the constraints of any particular course will dictate the realistic and necessary scope and format of a structural debriefing activity. For example, when time is short, the instructor may provide the structural information to students rather than ask them to discover it on their own.
The Protocol steps are explained below. All steps are iterative, and students may need to go through them multiple times before a satisfactory structural understanding is reached. Each iteration will improve their understanding of the problem and the system.

Step 1—List of variables: The first step is to identify variables in the simulation. A variable is an entity of some significance within the system. Variable values may change over time or they may remain constant. Students may or may not have data for the variables.

Step 2—Reference modes: Reference modes are graphs that show the behavior of variables over time. They are the most important outcome in formulating the problem statement because reference modes describe the behavior of the system. Reference modes and the initial discussions around them need not attempt to explain the outcomes of the original simulation. Drawing reference modes for five important variables is usually sufficient. Reference modes show a pattern rather than data, and therefore it is not necessary to show scale. Each graph will include a history portion as well as three possible trajectories for the future: "hope," "fear," and "expected" (they can also be named the "best case," the "worst case," and "expected"). The horizontal axis should show the beginning time, "now," and the ending time.

Step 3—Momentum strategies: If students followed a particular strategy in the original simulation, then it is a momentum strategy. It is a strategy that they thought would be successful. Students who participate in a structural debriefing may identify several momentum strategies. Recording momentum strategies is a good way to keep track of how one's thinking about a particular simulation evolves. The momentum strategies should be compared with the additional strategies that are generated during the debriefing activity.

Step 4—Dynamic hypotheses: A dynamic hypothesis is an explanation of the simulation behavior in terms of its structure. The idea is that interacting loops generate patterns. For example, S-shaped growth requires at least one positive loop and one balancing loop. Students may come up with several dynamic hypotheses. A dynamic hypothesis is communicated with the help of a causal loop diagram (see Figure 3 for an example); a minimal code sketch of how interacting loops generate S-shaped growth follows Step 8 below.

Step 5—Model construction: During this step, students build a system dynamics model of the black-box simulation. This step requires familiarity with the system dynamics method. The model is constructed using one of the specialized system dynamics software packages, such as iThink or Vensim. Careful building of the model may take a considerable amount of time. It is, however, the best way to learn the structure of a black-box simulation.

Step 6—Model validation: The objective of model validation is to build confidence in the system dynamics model (Barlas, 1996). Students should validate the model's structure and behavior and check the model for unit consistency. A well-constructed model must yield sensible results during extreme value testing, input response testing, and sensitivity analysis.
Step 7—Strategy testing: Once validated, the system dynamics model can be used to explain the outcomes of the original simulation. Students would begin by implementing the momentum strategies. In addition, they can experiment with new strategies that may lead to better outcomes. Performance under the new and momentum strategies should be compared.

Step 8—A written report: In the final step of a structural debriefing, students should write about their experiences as they ran the black-box simulation and describe what they learned during the structural debriefing. If students have built the model, they should describe it in the report. A written report may be supplemented by a presentation in class.
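As a minimal illustration of Steps 4 through 6, the sketch below (our own Python example rather than a dedicated system dynamics package; all parameters are illustrative) implements the two-loop structure that generates S-shaped growth and closes with a simple extreme-value test:

```python
# Minimal system dynamics model: S-shaped growth from one reinforcing loop
# (more adopters -> more adoption) and one balancing loop (approaching
# capacity slows adoption). All parameters are illustrative.

DT = 0.25            # Euler integration time step
T_END = 40.0
GROWTH_RATE = 0.3    # fractional growth per time unit (reinforcing loop)
CAPACITY = 1000.0    # carrying capacity (balancing loop)

def simulate(initial_stock: float) -> list[float]:
    stock = initial_stock
    trajectory = [stock]
    t = 0.0
    while t < T_END:
        # Net flow: the reinforcing term damped by the balancing term.
        flow = GROWTH_RATE * stock * (1.0 - stock / CAPACITY)
        stock += flow * DT
        trajectory.append(stock)
        t += DT
    return trajectory

run = simulate(initial_stock=10.0)
print(round(run[-1], 1))   # approaches the 1000-unit capacity: S-shaped growth

# Extreme-value test (Step 6): with an empty initial stock nothing can grow,
# so the trajectory must stay at zero.
assert all(x == 0.0 for x in simulate(initial_stock=0.0))
```

The same pattern (formulate the loop structure, integrate the stocks, then stress-test the model) carries over to full-scale tools such as iThink or Vensim.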
An Application: LITTLEFIELD

To test our approach, we recruited four students to participate in a structural debriefing activity for The LITTLEFIELD TECHNOLOGIES simulation (LITTLEFIELD, for short). The students were in their third and fourth years of studies at WPI. They had backgrounds in system dynamics, economics, and operations management. Academically, the pilot study was set up as a credit-bearing, graded course.

Over three academic quarters (21 weeks), the students completed all steps of the Structural Debriefing Protocol. They listed variables, drew reference modes, identified momentum strategies, and so on. Throughout the study, they submitted progress reports and met with us regularly, and we provided them with feedback. At the end, they submitted a report.

The LITTLEFIELD simulation was chosen as an application example because it is a popular and well-designed black-box simulator. One of the authors of this article regularly uses it in his operations management courses at Cornell University. Below, we provide additional details about the simulation.
The LITTLEFIELD Simulation

LITTLEFIELD is an online game sold by Responsive Learning Technologies, which licenses the simulation from Stanford University. The simulation was developed in the late 1990s by Samuel C. Wood and Sunil Kumar when both were professors at Stanford University (Wood, 2007). In 1996, Stanford University adopted LITTLEFIELD in its MBA courses. Additional universities started using the game in 1998. The simulation is part of the curriculum at nearly 200 institutions in more than 30 countries (Snider & Balakrishnan, 2013). It is available in English, Russian, and Japanese. In 2004, LITTLEFIELD received the Wickham Skinner Award for Teaching Innovation from the Production and Operations Management Society.

LITTLEFIELD simulates a small factory that produces make-to-order electronic equipment. The game is used in operations management courses to illustrate concepts of utilization, queuing, scheduling, and inventory (Miyaoka, 2005; Snider & Balakrishnan, 2013; Wood, 2007). Participants access it through regular Internet browsers via the interface shown in Figure 1.

Figure 1. A screenshot of the LITTLEFIELD interface.

The assembly process consists of four steps carried out at three stations: board stuffing, testing, and tuning. Each station consists of several machines. The last step is inspection. In the basic configuration of the game, participants can adjust the contract type, the inventory reorder point, the reorder quantity, and the number of machines at each of the three stations. The factory receives a random number of customer orders each day. The revenue is affected by the average lead time per order and the contract type chosen. Better contracts not only bring more revenue but also carry stiffer penalties for missing delivery dates. By clicking on the icons on the interface, students may observe historical data for machine utilization, inventory levels, queue lengths, and lead times.

The revenue increases the stock of virtual cash that can be spent on raw materials and new machines. Raw materials (the assembly kits) are acquired from the supplier. Machines can also be sold at any time at a deep discount. The danger of purchasing too many machines in an attempt to achieve low lead time is that the cash needed for raw materials can run out. Without raw materials, production halts. The only way out of this predicament is for the factory to borrow cash at an extremely high interest rate from a "bank" that opens halfway into the simulation. The factory does not earn any revenue for orders that miss delivery targets even though the orders still have to be fulfilled.

In a typical course, LITTLEFIELD is played by student teams. Teams run independent factories but do not directly compete with each other within the game for customers or any other resources. All teams face the same demand pattern. The benchmark is provided by a robotic Do-nothing team that makes no decisions. Each team can compare its financial performance against other teams as well as against the Do-nothing team. The winning team is the one with the most cash at the end of the game. This setup creates an exuberant atmosphere, with teams trying to outperform the other teams.
Instructors believe that playing LITTLEFIELD increases students' interest in operations management and improves their understanding of capacity and inventory management (Miyaoka, 2005; Snider & Balakrishnan, 2013). Students pay Responsive Learning Technologies for access to the simulation. While most games are played by teams over a period of a week, it is possible to play for as little as 2 hours or for as long as a month. The whole simulation covers 318 simulated days. Teams are in charge of their factories between Periods 50 and 218. The simulation runs in the "fast" mode during the first 50 days and the last 100 days; during these fast periods, teams cannot make any changes to the simulation.

Teaching with LITTLEFIELD is more effective when a debriefing session is included (Snider & Balakrishnan, 2013). Debriefing can start with a short in-class overview of the simulation prior to the game (Snider & Balakrishnan, 2013; Wood, 2007). An instructor may require students to complete a pre-assignment that allows them to become familiar with the game and think through possible strategies. After the game, the winning team may be asked to explain its strategy in a debriefing discussion (Snider & Balakrishnan, 2013; Wood, 2007).
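To make the contract trade-off concrete, consider a hypothetical sketch of a LITTLEFIELD-style payoff rule. The article states only that revenue depends on lead time and contract type and that late orders earn nothing; the linear decay between a quoted and a maximum lead time, and all of the numbers below, are our illustrative assumptions rather than the game's actual data.

```python
# Hypothetical sketch of a LITTLEFIELD-style contract payoff. The linear
# decay between the quoted and maximum lead times and all numbers below are
# illustrative assumptions, not the game's actual data.

CONTRACTS = {
    # contract: (revenue per order, quoted lead time, maximum lead time) in days
    "conservative": (750.0, 7.0, 14.0),
    "standard":     (1000.0, 1.0, 3.0),
    "aggressive":   (1250.0, 0.5, 1.0),   # high reward, stiff penalty
}

def revenue_per_order(contract: str, lead_time_days: float) -> float:
    price, quoted, maximum = CONTRACTS[contract]
    if lead_time_days <= quoted:
        return price          # delivered within the quote: full price
    if lead_time_days >= maximum:
        return 0.0            # missed the delivery target: no revenue at all
    # Revenue shrinks linearly as the order slips from quoted toward maximum.
    return price * (maximum - lead_time_days) / (maximum - quoted)

for c in CONTRACTS:
    print(c, revenue_per_order(c, lead_time_days=2.0))
# conservative 750.0, standard 500.0, aggressive 0.0 -- the "better" contract
# pays more only if the factory can actually keep lead times short.
```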
The Need for a Structural Debriefing

Our experience and the literature review suggest that students have problems with LITTLEFIELD even when they have been exposed to the relevant management topics. It is difficult for them to think about the simulation as a system rather than as isolated resources (Wood, 2007). The structural complexity of the game is further increased by internal delays. For example, it takes 4 simulated days for raw materials (kits) to arrive after they are purchased. The lead time on orders also varies in response to machine utilization rates, which, in turn, are determined by the work-in-progress at each manufacturing stage and order arrival rates. As a result, student strategies are often reactive and suboptimal. Steenhuis et al. (2011) reported that among the more than 100 teams that played the game, only between 13% and 57% of teams in each simulation session earned a profit. In other words, ". . . a substantial portion of the teams are making decisions that are worse than if they had not made decisions at all . . ." (Steenhuis et al., 2011, p. 112).

As discussed above, structural knowledge tends to improve learning and task performance. Although some structural information about the simulation is revealed to students through the LITTLEFIELD interface (Figure 1), this interface hides important variables and does not explain the causal relationships and delays in the system. A structural debriefing would reveal that information.
A Structural Debriefing for LITTLEFIELD

This section is based on the report prepared by the students during the pilot study. For the purpose of this article, we improved the graphs from their originals in the report.
Table 2. A Partial List of Variables.

Cash balance
Lead time
Revenue
Machines in Station 1
Machines in Station 2
Machines in Station 3
New machines purchased
Revenue per order
Raw materials
Queue wait time
Reorder quantity
Customer order queue
Step 1—List of variables: As the first step, students identified variables in the simulation. Table 2 shows some variables from the much longer list prepared by the students. In general, the initial list of variables does not need to be exhaustive, as the list can be updated later.

Step 2—Reference modes: Figure 2 shows reference modes for two variables that are crucial to the success of the virtual factory: cash balance and lead time. Until Time 50, the game is in an automatic, or "fast," mode and students cannot make any changes. Beyond Time 50, students take control of the simulation. The "Hope" trajectories reflect the outcomes that are desired by students. The students would like to avoid the "Fear" scenarios. If game participants make no decisions and allow the game to run on its own, it follows the "Do-nothing" trajectories.

Figure 2. Examples of reference modes for cash balance and lead time.

Step 3—Momentum strategies: The students identified a number of momentum strategies. One strategy was to make no changes to the default values for contracts, reorder points, and quantities. Other instructors report that students indeed choose this strategy, appropriately called "Do-nothing" (Snider & Balakrishnan, 2013). Because the "Do-nothing" strategy produces better results than the "Fear" scenario, it is a rational option for teams that are not confident in their ability to manage the factory. Another momentum strategy is to play it "safe" by never choosing the high-reward-high-risk contracts. In exchange for high revenue, such contracts require short lead times and carry stiff penalties for failing to deliver on time. Students can shorten lead times by investing in additional machines; however, it is easy to run out of cash, which will halt production and, eventually, increase the lead time again—all feared outcomes.

Step 4—Dynamic hypotheses: Figure 3 shows the causal structure of the game. The schematic, also known as a causal loop diagram, includes variables that were identified in Step 1. Rectangles designate stocks, which are points of accumulation and delay within the system. The polarity of links identifies positive and negative causal relationships between variables. A positive link between two variables means that if one variable changes, it causes the second variable to move in the same direction. A negative link means that the cause and effect change in opposite directions. Causal chains form negative and positive feedback loops. A positive loop consists of only positive links or an even number of negative links. Positive loops either drive growth or act as vicious cycles. Negative loops counterbalance positive loops. For example, purchasing new machines reduces lead time and improves revenue and the cash stock, which allows participants to buy more machines—a positive loop. On the other hand, purchasing machines immediately reduces the cash stock and therefore the ability to purchase even more machines—a negative loop.

Figure 3. The causal structure of the LITTLEFIELD simulation game.

Step 5—Model construction: During the pilot study, students were involved in a full structural debriefing, which included building a full working system dynamics model of the simulation. This step, which involved many iterations of the model, took the longest time. The model included eight sectors. Orders were tracked in the Incoming Customer Orders sector. Job orders were released to the Stations sector, which simulated inventories of raw materials, work-in-progress, and finished goods. When an order was complete, the number of orders in the system was decreased and the appropriate lead time was calculated in the Lead Time sector. Revenue was based on the lead time and the type of contract. Revenue and cash balance were calculated in the Financial sector. Based on the current lead time, the Contracts sector estimated the profitability of various contracts and the Policy sector picked the most profitable contract for the following production batch. The ability to purchase machines and raw materials was calculated in the Cash Constraints sector. The Machines sector included three stocks of machines—one stock for each production station at the factory. The system dynamics model was implemented in iThink software.

As a demonstration, Figure 4 displays the stock-and-flow structure of the sector called Stations. The six stocks are shown as rectangles: Order for Raw Material, Raw Material Inventory, and four stocks for the four process queues at the factory. The stock Order for Raw Material tracks raw material orders in transit. It has two flows: Raw Material Purchase Rate and Raw Material Arriving Rate. Arriving raw materials are added to the stock Raw Material Inventory. A job is released when the Customer Order Queue is not empty and the raw material inventory is sufficient. When a job is released, it is added to the stock Queue for Step 1. The job is then processed through the four steps of production until its completion.

Figure 4. The stock and flow structure of the Stations sector.
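The job release logic of the Stations sector can be compressed into a short sketch (ours, in Python, not the students' iThink model; the 4-day kit delivery delay comes from the game, while the demand and policy numbers are illustrative):

```python
# Sketch of the Stations sector logic described above: a supply-line stock for
# kits on order, a raw material inventory, and a job release that fires only
# when customer orders are waiting and enough kits are on hand. The 4-day
# delivery delay comes from the game; other numbers are illustrative.

from collections import deque

REORDER_POINT = 40      # kits: reorder when inventory plus on-order falls here
REORDER_QUANTITY = 120  # kits per purchase
DELIVERY_DELAY = 4      # days from purchase to arrival
KITS_PER_JOB = 1

on_order = deque([0] * DELIVERY_DELAY)  # pipeline: kits arriving in 1..4 days
raw_material_inventory = 60
customer_order_queue = 0
queue_for_step_1 = 0    # first of the four process queues

for day in range(1, 31):
    customer_order_queue += 12           # illustrative constant daily demand

    # Flow 1: raw material arrival (the end of the 4-day pipeline).
    raw_material_inventory += on_order.popleft()
    on_order.append(0)

    # Flow 2: the purchase decision replenishes the supply line.
    if raw_material_inventory + sum(on_order) <= REORDER_POINT:
        on_order[-1] += REORDER_QUANTITY

    # Job release: requires both a waiting order and sufficient kits.
    releasable = min(customer_order_queue,
                     raw_material_inventory // KITS_PER_JOB)
    customer_order_queue -= releasable
    raw_material_inventory -= releasable * KITS_PER_JOB
    queue_for_step_1 += releasable       # jobs then flow through the four steps

print(raw_material_inventory, customer_order_queue, queue_for_step_1)
```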
Step 6—Model validation: The purpose of model validation was to ensure that the model behaved as expected. Figures 5 and 6 show two examples of validation tests performed by the students. When the number of incoming customer orders was zero, no jobs needed processing, and therefore the lead time was consistently zero (Line 1 in Figure 5). When customer orders arrived at a rate of 100 per day, the growing order backlog extended the lead time (Line 2 in Figure 5).

Figure 5. Lead time for two values of customer orders.

Figure 6 shows the corresponding trajectories for cash. Without customer orders, cash grew due to earned interest (Line 1 in Figure 6). Line 2 in Figure 6 shows cash when the number of incoming customer orders per day was equal to 100. Cash increased quickly at first. However, very soon the order backlog caused the lead time to increase beyond the value allowed by the contract. According to the rules of the game, orders that missed delivery targets did not earn any revenue even though the orders still had to be fulfilled. As a result, without revenue from contracts, cash growth slowed down, and from that point in the simulation, cash grew only due to the earned interest.

Figure 6. Cash trajectories for two values of customer orders.

Step 7—Strategy testing: Once the model was constructed and validated, students performed strategy testing. They picked different contracts, changed reorder levels and quantities for raw materials, and experimented with machine orders. Figure 7 shows a typical set of runs for different reorder levels. All runs used Contract 3—the most profitable contract—and the reorder quantity was set at 2,500 jobs. Trajectory 1 is for the reorder level that was set at 40 kits. The reorder level was equal to 80 kits for Trajectory 2. For Trajectory 3, reordering was set at 120 kits. These runs suggested that the third strategy, with the reorder level of 120 kits, was the best of the three options (a toy version of this comparison is sketched in code after Step 8).

Figure 7. Cash trajectories for three reorder levels.

Step 8—A written report: At the end of the pilot study, students submitted a written report that documented the entire debriefing activity. The project was formally completed when the students were assigned grades.
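This kind of strategy comparison can also be replayed on a toy model. The sketch below (our illustrative stand-in for the students' full model) varies only the reorder level, as in Figure 7, and counts the days on which production stops for lack of kits:

```python
# Step 7 in miniature: vary one policy lever (the raw material reorder level)
# and compare outcomes, as in Figure 7. The reorder levels 40/80/120 are the
# ones the students tried; the toy factory below (constant demand, 4-day kit
# delay) is an illustrative stand-in for their full iThink model.

from collections import deque

def days_halted(reorder_level: int, days: int = 168, demand_per_day: int = 12,
                reorder_quantity: int = 300, delivery_delay: int = 4) -> int:
    inventory = 80
    pipeline = deque([0] * delivery_delay)
    halted = 0
    for _ in range(days):
        inventory += pipeline.popleft()          # kits arriving today
        pipeline.append(0)
        if inventory + sum(pipeline) <= reorder_level:
            pipeline[-1] += reorder_quantity     # place a kit order
        if inventory >= demand_per_day:
            inventory -= demand_per_day          # production runs today
        else:
            halted += 1                          # kits ran out: line stops
    return halted

for level in (40, 80, 120):
    print(level, days_halted(level))
# With a 4-day delivery delay and 12 kits/day of demand, a reorder level below
# 4 * 12 = 48 kits risks stockouts; the higher levels keep the line running.
```

Even this crude stand-in reproduces the qualitative lesson of Figure 7: a reorder level that does not cover demand over the delivery delay starves the line.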
Discussion

The scope and format of a structural debriefing may vary. How much structural knowledge about the simulation should be provided to students depends on instructional objectives (see Alessi, 2000, and Machuca, Castillo, Carrillo, & Zamora, 1998, for a relevant discussion). For example, if the goal is to teach students to discover knowledge, then the debriefing activity should facilitate the discovery process. Besides instructional objectives, situational constraints, such as the time and resources available for a debriefing session and the educational backgrounds of the participants, also influence the scope and format of a debriefing (Lederman, 1992).

While students in the pilot study at WPI participated in a full structural debriefing, a shorter structural debriefing is possible. A full debriefing requires a great deal of time, and students must be proficient in the system dynamics methodology. In the majority of classroom situations, a full structural debriefing is either not necessary or not feasible. Therefore, the format of a structural debriefing activity can be modified along several dimensions: the timing of the debriefing, the structural elements included in the debriefing, and whether students discover the structure themselves or learn about it from instructional materials prepared before the debriefing. For example, during our pilot study, students completed all eight steps of the Structural Debriefing Protocol over the period of three academic terms. These students had some training in system dynamics and operations management. When the system dynamics modeling skills of the participants and the time for the exercise are limited, the instructor can skip the model construction and model validation steps. Instead, students may be given access to a system dynamics model that has been prepared earlier. For a still shorter debriefing, the instructor may explain the internal structure of the simulation by reviewing with students a causal loop diagram similar to the one in Figure 3.

Age and educational backgrounds of participants may also affect the format of a structural debriefing activity. It has been demonstrated that adults and youths learn differently and therefore may require different teaching strategies (Seaman & Fellenz, 1989). Among students who play LITTLEFIELD, MBA students are typically more analytical than undergraduates in their approach to playing the simulation (Miyaoka, 2005; Snider & Balakrishnan, 2013). For that reason, a debriefing for graduate students may be more analytically intensive than one for undergraduates. For instance, MBA students may be asked to infer the causal structure of the simulation (Figure 3), while undergraduates may be presented with that information.

A debriefing can be conducted prior to, during, or after the simulation (Crookall, 2010). Building on Crookall (2010), Table 3 shows five possible arrangements.
Table 3. Several Possible Simulation and Structural Debriefing Combinations.

Design A: Simulation
Design B: Short debriefing → Simulation
Design C: Simulation → Short debriefing
Design D: Simulation → Full debriefing
Design E: Short debriefing → Simulation 1 → Short debriefing → Simulation 2 → Short debriefing

The table differentiates between short and full, or comprehensive, debriefings. A short debriefing session may include reviewing with students important variables and reference modes and discussing the dynamic hypothesis (as in Figure 3). During a full debriefing, students complete all eight steps of the Structural Debriefing Protocol.

Design A includes only a simulation exercise and no debriefing. In Design B, a short structural debriefing precedes the simulation exercise. A short debriefing can also be offered after the simulation exercise, as in Design C. Our pilot study followed the format of Design D: students played the LITTLEFIELD simulation and then participated in a full debriefing exercise. Design E implies that two simulations are played. Students start with a short debriefing during which the format of the exercise is explained and some structural information is provided. Then students run Simulation 1, which is based on the system dynamics model prepared prior to the debriefing. After a short debriefing of the results, students play Simulation 2, which in our case would have been the LITTLEFIELD simulation. Students participate in another short debriefing immediately after the second simulation. Even more designs are possible in addition to those in Table 3.

As part of the Structural Debriefing Protocol, we recommend that students write a formal report. However, for a short debriefing, preparing such a report might not be practical because of time constraints. Instead, the debriefing may be organized as a conversation. Steinwachs (1992) offered useful tips on implementing oral debriefings. For example, she recommends that a debriefing group include no more than 25 people. Larger groups should be split and convene in separate rooms with individual facilitators. She also recommends conducting debriefings in a circle. A "fishbowl" debriefing technique calls for an oral debriefing with 20 students in front of the class, while the rest of the class observes. Large group discussions may be followed by small group debriefings. We may explore in our future research the effectiveness of oral presentations in place of the formal written report.
Conclusion

This article introduced and discussed structural debriefing—a pedagogical tool that helps students learn about complex causal structures as well as accumulations and delays within "black-box" simulations. Prior research has shown that such structural knowledge improves learning and task performance in management exercises. Structural debriefing is an application of the system dynamics methodology to debriefing. It can be prepared and facilitated by following the Structural Debriefing Protocol.

To test our approach, we guided a group of students through a structural debriefing activity for a popular black-box business simulation called The LITTLEFIELD TECHNOLOGIES. The pilot study was successfully completed, and its outcomes are described in this article.

This approach recognizes that practical and pedagogical considerations may dictate variations in the scope and format of structural debriefings. For example, a debriefing may be conducted before, during, or after the simulation. Students may be presented with structural information, or they may be asked to discover the structural knowledge on their own, which takes longer. The analytical complexity of a debriefing may also be adjusted for the age and educational backgrounds of the participants. Variations in debriefing scope and format and their comparative effectiveness will be the subject of future research. Examples of appropriate research designs and measures can be found in Qudrat-Ullah (2007), Crookall (2010), and Kopainsky and Sawicka (2011).

Author Contributions

All authors contributed to this article, in content and in form. The idea and the outline for this project came during discussions between OVP and LWR, when OVP took an operations course with LWR at Cornell University. OVP, KS, and LWR worked on the manuscript. OVP and KS supervised the pilot study and contributed to the modeling.
Acknowledgments

The authors gratefully acknowledge valuable comments by Stephen M. Alessi, Birgit Kopainsky, and two anonymous referees. We thank WPI students Brooke Qiying Fan, Mengjie Liu, Murtaza Turab Thahirally, and Siqi Wang for their enthusiastic participation in the pilot study.

Declaration of Conflicting Interests

The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The authors received no financial support for the research, authorship, and/or publication of this article.
References

Alessi, S. (2000). Designing educational support in system-dynamics-based interactive learning environments. Simulation & Gaming: An International Journal, 31, 178-196.
Anderson, P. H., & Lawton, L. (1997). Demonstrating the learning effectiveness of simulations: Where we are and where we need to go. Developments in Business Simulation and Experiential Exercises, 24, 68-73.
Barlas, Y. (1996). Formal aspects of model validity and validation in system dynamics. System Dynamics Review, 12, 183-210.
Crookall, D. (2010). Serious games, debriefing, and simulation/gaming as a discipline. Simulation & Gaming: An International Journal, 41, 898-920.
Davidsen, P., Kopainsky, B., Moxnes, E., Pedercini, M., & Wheat, D. (2014). Systems education at Bergen. Systems, 2, 159-167.
Größler, A., Maier, F. H., & Milling, P. M. (2000). Enhancing learning capabilities by providing transparency in business simulators. Simulation & Gaming: An International Journal, 31, 257-278.
Größler, A., Thun, J.-H., & Milling, P. M. (2008). System dynamics as a structural theory in operations management. Production and Operations Management, 17, 373-384.
Grotzer, T. A. (2012). Learning causality in a complex world: Understandings of consequence. Lanham, MD: R&L Education.
Jahangirian, M., Eldabi, T., Naseer, A., Stergioulas, L. K., & Young, T. (2010). Simulation in manufacturing and business: A review. European Journal of Operational Research, 203, 1-13.
Kopainsky, B., & Sawicka, A. (2011). Simulator-supported descriptions of complex dynamic problems: Experimental results on task performance and system understanding. System Dynamics Review, 27, 142-172.
Lederman, L. C. (1992). Debriefing: Toward a systematic assessment of theory and practice. Simulation & Gaming: An International Journal, 23, 145-160.
Machuca, J., Castillo, J. C. R., Carrillo, M. A. D., & Zamora, M. M. G. (1998, July 20-23). Our ten years of work on transparent box business simulation. Proceedings of the 16th International Conference of the System Dynamics Society, Québec City, Canada.
Markulis, P. M., & Strang, D. R. (2003). A brief on debriefing: What it is and what it isn't. Developments in Business Simulation and Experiential Learning, 30, 177-184.
Miyaoka, J. (2005). Making operations management fun: Littlefield Technologies. INFORMS Transactions on Education, 5(2), 80-83.
Moizer, J., & Lean, J. (2010). Toward endemic deployment of educational simulation games: A review of progress and future recommendations. Simulation & Gaming: An International Journal, 41, 116-131.
Moxnes, E. (2000). Not only the tragedy of the commons: Misperceptions of feedback and policies for sustainable development. System Dynamics Review, 16, 325-348.
Pavlov, O., Doyle, J., Saeed, K., Lyneis, J., & Radzicki, M. (2014). The design of educational programs in system dynamics at Worcester Polytechnic Institute (WPI). Systems, 2(1), 54-76.
Perkins, D. N., & Grotzer, T. A. (2005). Dimensions of causal understanding: The role of complex causal models in students' understanding of science. Studies in Science Education, 41, 117-166.
Petranek, C. F., Corey, S., & Black, R. (1992). Three levels of learning in simulations: Participating, debriefing, and journal writing. Simulation & Gaming: An International Journal, 23, 174-185.
Qudrat-Ullah, H. (2007). Debriefing can reduce misperceptions of feedback: The case of renewable resource management. Simulation & Gaming: An International Journal, 38, 382-397.
Richardson, G. P., & Pugh, A. L., III. (1981). Introduction to system dynamics modeling with DYNAMO. Cambridge, MA: Productivity Press.
Seaman, D. F., & Fellenz, R. A. (1989). Effective strategies for teaching adults. Columbus, OH: Merrill.
Snider, B., & Balakrishnan, J. (2013). Lessons learned from implementing web-based simulations to teach operations management concepts. INFORMS Transactions on Education, 13(3), 152-161.
Steenhuis, H.-J., Grinder, B., & de Bruijn, E. J. (2011). Simulations, assessment and student learning. International Journal of Information and Operations Management Education, 4, 99-121.
Steinwachs, B. (1992). How to facilitate a debriefing. Simulation & Gaming: An International Journal, 23, 186-195.
Sterman, J. D. (1989). Modeling managerial behavior: Misperceptions of feedback in a dynamic decision making experiment. Management Science, 35, 321-339.
Sterman, J. D. (2008). Risk communication of climate: Mental models and mass balance. Science, 322, 532-533.
Sterman, J. D. (2010). Does formal system dynamics training improve people's understanding of accumulation? System Dynamics Review, 26, 316-334.
Summers, G. J. (2004). Today's business simulation industry. Simulation & Gaming: An International Journal, 35, 208-241.
Wakeland, W. (2014). Four decades of systems science teaching and research in the USA at Portland State University. Systems, 2(2), 77-88.
Wood, S. C. (2007). Online games to teach operations. INFORMS Transactions on Education, 8(1), 3-9.
Author Biographies

Oleg V. Pavlov (PhD, University of Southern California, 2000; MBA, Cornell University, 2011) is an associate professor of economics and system dynamics at WPI. His research interests lie at the intersection of system dynamics modeling and economics. He has published in a variety of journals, including System Dynamics Review, Journal of Economic Dynamics and Control, Journal of the Operational Research Society, Computational Economics, and Communications of the Association for Information Systems. He is past president of the Economics Chapter of the System Dynamics Society and now serves on the Executive Board of the Economics Chapter. He is the coordinator for the WPI graduate system dynamics program. Contact: [email protected].

Khalid Saeed (PhD, Massachusetts Institute of Technology, 1981) is a professor of economics and system dynamics at WPI. He was a student of Jay Forrester at MIT and has worked on developing system dynamics models of real-world systems. His insight models have been used to test policies for environmental sustenance, replicate psychology experiments, design innovative organizations, implement developmental agendas, and improve the performance of governance systems. He has been teaching the art and science of system dynamics for the past several decades and has also published on methodological issues. He is a former president of the System Dynamics Society (1995) and a recipient of the Jay Forrester Award in system dynamics. Contact: [email protected].
Lawrence W. Robinson (PhD, University of Chicago, 1986) is a professor of operations management at the Samuel Curtis Johnson Graduate School of Management at Cornell University. His research focuses on problems of operating in an uncertain environment, in particular on developing practical heuristic policies that perform well and can be easily calculated. His research interests range from inventory management to booking limits for discount-fare airline passengers to scheduling doctors' appointments. He has published in a variety of journals, including Operations Research, Management Science, IIE Transactions, and the European Journal of Operational Research. He has been a Mobil Scholar and has been recognized by BusinessWeek as one of the top teachers at Johnson. His consulting projects include advising Hungarian professors on introducing TQM into their courses and developing joint forecasting/inventory management policies for highly seasonal and perishable products. Contact: [email protected].