What are we talking about? — A taxonomy of computer simulations to support learning

Frank H. Maier and Andreas Größler

Published in System Dynamics Review 16(2), 2000, 135–148.

Abstract

This paper presents a proposal for a taxonomy of computer simulations to support learning processes. To this end, distinguishing criteria of computer simulations are identified and categorized. The work is done in order to facilitate research on the evaluation of learning tools. The taxonomy is built mainly to suit this objective but can also be of general help in establishing a common terminology, thereby stimulating the sharing of findings and discussion among researchers from different fields.

A Confusion in Naming

Computer-based simulations are important tools to support learning. However, in the literature as well as in discussions among scientists and practitioners, there is considerable confusion. Besides the unsolved question of the efficacy of these tools, this confusion is often caused--or at least increased--by problems of terminology. Many different words denote the same object, or a single word is used for different objects. The terms microworld, management flight simulator, business simulator, business game, management simulator, and learning environment sometimes all describe the same kind of simulation; but sometimes two objects both called management flight simulators are quite distinct. Some authors distinguish between business games and business simulations, whereas others do not. Examples of this imprecise and inconsistent use of terms are given in the next section. At this point it should only be noted that the confusion results from a number of factors, including academic backgrounds, marketing concerns, and the unreflective adoption of terms originally used with other intended meanings.

However, an unambiguous terminology is important for understanding the work done in any discipline. It is a necessary condition for scientific research. Considering the state of current terminology pertaining to computer simulations, a taxonomy of tools to support learning through computer simulations appears worthwhile. In particular, a taxonomy can help to connect and compare different studies (Funke, 1992). It is our opinion that some apparently contradictory results are due only to imprecise terminology. Clear terms and categorization will help in analyzing the effects of different types of simulation tools (Kluwe, 1993). Furthermore, a common terminology and taxonomy will build a bridge between the many fields involved, e.g., management and decision science, psychology, education, and computer science. In order to clarify these issues, this paper first presents and critiques commonly used terms for simulation-based learning instruments. Then, a list of possible characteristics is given and applied to some well-known simulations. Finally, a proposal for naming conventions is provided in order to establish a coherent terminology.

Critique and Explanation of Some Commonly Used Names

Analyzing some terms already in use, together with their meanings, is our starting point in building a coherent naming scheme for computer simulations. Sometimes the term management simulator is used for one- and multi-person computer simulations (e.g., Milling, 1995). However, this term is problematic because it is not the management in an institutional sense--i.e., a group of managers--or in a functional sense--i.e., a functional area of a firm--that is simulated. It is the structure of a company and its market environment, the business, which is the basis for the computer simulation. In such a computer simulation, a user may play the role of a manager in a virtual firm and a simulated business.

Quite often the term management flight simulator is used for single-user as well as for multi-user simulation games (Sterman, 1992). This apparently fortunate analogy found widespread use because it has market appeal and sounds plausible. The idea is that just as a pilot learns to fly with a flight simulator, one can learn to manage a company with the help of a management flight simulator. Nevertheless, a closer look reveals two disadvantages. First, for many people the phrase "flying a company" is unfamiliar. Due to that lack of familiarity, the phrase might be misunderstood to apply to flying or the airline industry. For example, Senge explicitly states that management flight simulators "are not limited to the airline industry" (Senge, 1994, p. 531). Secondly, learning tools in socio-economic environments do not aim to cover reality as congruently or as comprehensively as real flight simulators do. While flight simulators try to be as realistic as possible--including almost every detail--computer simulations of business and economic systems try to abstract from details. This abstraction allows a focus on important structures and behavior modes. Moreover, the term "management flight simulator" suggests that behavior is trained, but not that insights into the relation between structure and behavior are mediated.

The term microworld goes back to Papert (1980). However, it was already used at the end of the 1960s in research on artificial intelligence (AI) to describe simplified environments that allow computers to act within them in an apparently intelligent way (see Minsky and Papert 1970, quoted after Dreyfus 1992, p. 9). Again, the use of this term for computer simulations to support learning seemed promising. However, Papert's approach is almost exclusively constructivist, aimed at promoting individual representations and interpretations of external realities. Papert understands microworlds as instruments that enable learners to construct their knowledge themselves. Even more important, such open-ended microworlds rarely involve explicit learning goals.

Users are free to define what they want to learn. Considering this, the term microworld should rather be used for learner-centered, modeling-oriented software packages, which are instruments to construct and simulate models (see Morecroft, 1988, who explicitly includes the process of modeling and simulation in a microworld). These software packages usually contain some sort of formalism--a programming language--to express thoughts about specific or general systems. Laurillard (1993) explains the differences between simulations and microworlds and between microworlds and modeling tools. Nevertheless, the term microworld is quite often used in the system dynamics field as a synonym for management flight simulator (see, for example, Diehl, 1994), although these systems are quite different from Papert's open-ended, model-building, goal-less environments.

Computer simulations to support learning are also confused with decision support systems (DSS). The latter are built to increase performance in short-term decision making. Usually, these instruments provide possibilities to test the outcomes of decisions that have to be made; see Keen and Scott Morton (1978) and Maier (1995) for a description of the goals and characteristics of DSS. They do not aim at long-term changes in the users' mental models and are not constructed to support learning processes. In contrast, programs aiming at learning as their main goal are also meant to improve decision making in the long run.

Frequently, the term 'game' is used instead of or in parallel with 'simulator' or 'simulation' (for instance, Jensen et al., 1997). This distinction mostly seems to be influenced by the scientific tradition of the author. Some authors therefore combine both terms into 'simulation games' (for example, Dörner, 1992). Some do not distinguish between game and simulation (for instance, Keys and Wolfe, 1996, or Klein and Fleck, 1990); other authors do distinguish between the two terms (for example, Lane, 1995).


The terms 'learning laboratory' and 'interactive learning environment' (ILE) usually denote more than a pure computer simulation model. One or more simulation models are embedded in a learning environment which may also include case descriptions, presentations by a facilitator, and modeling tools; see Paich and Sterman (1993) for an appeal in that direction. Such computer-based learning environments can also comprise background information, source material, and working instructions integrated into a single computer application. The term 'system dynamics based interactive learning environment' (SDBILE) makes clear that a simulation model is still a central part of such learning tools (see, for instance, Davidsen and Spector, 1997). This leads to the conclusion that an interactive learning environment is a computer-based simulation with additional components, which are assumed to be necessary for its effectiveness (see Spector and Davidsen, 1997).

Distinguishing Criteria of Computer Simulations

This abundance of different names causes us to ask what these applications have in common and what makes them distinct. Our starting point was the aim to investigate the 'effectiveness of computer-based simulations to support learning'. Rather rapidly we realized that such a construct does not exist in a general way: 'effectiveness' as well as 'computer-based simulation' need further definition and explanation. This paper addresses the second issue; we decided that a taxonomy would be helpful. According to the American Heritage Dictionary, a taxonomy is a division of objects into ordered groups or categories. The categorization implies that there is a common criterion which characterizes each category and which can be used to distinguish between categories. However, a division of objects based on observed characteristics can seldom be made without bias; much depends on the reasons and goals for which a taxonomy is constructed.


Alessi (1988) constructed a taxonomy to support his endeavor to explore the relationship between the fidelity and effectiveness of instructional simulations. 'Instructional simulation' is his term for what we call computer simulations to support learning. He suggested a 4x4 matrix of such instruments, in which every cell contains various criteria for differentiating instructional simulations in detail. Of more interest at this point of our discussion, however, is the main construction scheme of his taxonomy. On one axis he distinguishes between the main components of these simulations, on the other between four types of simulations which support and provide learning in different areas. The main building blocks of instructional simulations are (after Alessi, 1988): the underlying model, presentations, (possible) user actions, and system feedback. By system feedback Alessi means human-computer interaction. He names four types of simulations: physical, process, procedural, and situational. The latter differentiation conflates two distinct characteristics of simulations: (1) the type of knowledge to be mediated, especially declarative versus procedural knowledge (Ryle, 1949), and (2) the type of real-world domain, 'tangible' versus 'intangible'. For research about computer simulations to support learning, both characteristics are important. Socio-economic systems comprise tangible and intangible components of the real-world domain, and the learning objectives are the mediation of declarative knowledge (knowing that) as well as procedural knowledge (knowing how) and structural knowledge (knowing why). Hence, the type of knowledge to be mediated and the type of real-world domain should be considered explicitly as distinguishing criteria of computer simulations.

We also do not completely follow Alessi's ideas about the basic components of simulations. Certainly, the underlying model is a major part of a computer simulation. But it seems reasonable to summarize Alessi's components 'presentations', 'user actions', and 'system feedback' into one category comprising the characteristics of the human-computer interface. Besides that, 'system feedback' has a different connotation in system dynamics than it does in instructional computing.


We also added a category covering characteristics of computer simulations which are determined neither by the model nor by the human-computer interface.

-------------------------------------
Figure 1 about here
-------------------------------------

Figure 1 depicts our basic argument that computer simulations have three key aspects: the underlying model, the human-computer interface, and various functionalities. The latter contains all features that are not provided by the underlying model or the human-computer interface. Functionality comprises, for instance, access to additional source materials, the extent to which the structure of the underlying model is explicitly shown (degree of transparency), the progress of time within the simulation (time-steps), etc. Functionality certainly includes various important determinants of effectiveness (Wolfe, 1985). Note that--as the example 'degree of transparency' shows--the three aspects are not completely distinct. Although transparency is a facet of functionality, it has to be considered in the other two aspects as well: the user interface, for example, requires additional windows to display the model structure, and a transparent representation of the simulation model influences, e.g., the naming of variables--which have to be understandable--and should lead the model builder to avoid 'technical variables'.

Figure 1 does not include the users and their personal characteristics (Funke, 1995). Although users are of high importance in evaluation studies, a categorization of tools can be made somewhat independently of user characteristics. The same holds true for 'situational factors' (e.g., the learning situation in which a simulation tool is used: embedded or stand-alone, etc.). Often, these factors are summarized under the term 'learning environment'. In a broad definition, learning environments comprise virtually everything connected with the learning process (for example, Strittmatter and Mauel, 1997).


Of course, computer simulations can be characterized by a great number of features. The following list is an attempt to systematize these features according to the three main aspects of simulations described above. In addition, we want to emphasize again that we focus on tools to support learning processes. In other words, the meta-purposes of the simulation programs we look at are to help users understand the principles of the underlying system and to train users in controlling the system (Andersen et al., 1990). Other conceivable goals and objectives of computer simulations (research about human characteristics, personnel selection, etc.) are not taken into account. Table 1 shows the three different categories.

---------------------------------
Table 1 about here
---------------------------------

It has to be noted that a strict distinction cannot always be made between the pairs of characteristics listed in Table 1. Furthermore, this list of criteria is not complete. However, the list shows that there are many possible combinations of design characteristics. Moreover, the characteristic pairs have to be seen as the extreme points of a continuum; thus, we obtain an infinite number of combinations of characteristics. Each pair influences the purpose, use, and effectiveness of computer simulations differently. Without discussing the table in detail, it can be stated that some combinations of characteristics make no sense. For instance, modeling-oriented tools are used to develop a model as a means of communication between the different problem-owners and the model-builder and hence can never be black boxes. Nevertheless, the characteristics are the basis for the taxonomy of simulation-based learning instruments discussed below.
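To make the combinatorial character of Table 1 concrete, the following minimal sketch--ours, not part of the original paper--encodes three of the criterion pairs as Python enumerations. All class names are illustrative, and reducing each continuum to two discrete values is a deliberate simplification.

```python
from enum import Enum
from itertools import product

# Illustrative encoding of three criterion pairs from Table 1.
# The paper treats each pair as the end points of a continuum;
# an Enum flattens this to two discrete values for simplicity.

class Structure(Enum):
    FEEDBACK_ORIENTED = "feedback-oriented"
    PROCESS_ORIENTED = "process-oriented (mostly without feedback)"

class Behavior(Enum):
    DETERMINISTIC = "deterministic"
    STOCHASTIC = "stochastic"

class Transparency(Enum):
    BLACK_BOX = "black-box"
    TRANSPARENT_BOX = "transparent-box"

# Three binary criteria already yield 2**3 = 8 design combinations;
# the 18 criterion pairs of Table 1 would yield 2**18 = 262,144,
# before even considering intermediate points on each continuum.
combinations = list(product(Structure, Behavior, Transparency))
print(len(combinations))  # 8
```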


The criteria can be applied (1) to analyze the characteristics of existing computer-based simulations, and (2) to determine the characteristics of newly designed simulation tools. As an example, Table 2 visualizes the differences between some popular computer simulations. The categories and criteria correspond to those of Table 1. If a criterion applies to the particular tool, the corresponding field in the table is shaded gray; if both criteria of a category are applicable, the field is marked black. For example, the People Express Management Flight Simulator is based on a model of a business-world domain, namely the airline business. However, it can be used in the airline industry as well as in other industries to explain the dynamics generated by users' decisions. It does not actively generate decisions and serves only as a clearing device for the users' decisions. Because it is based on a system dynamics model, it is a continuous, deterministic, and feedback-oriented computer simulation. The users may intervene at discrete periods of time in a typical single-person, stand-alone, black-box simulation environment.

---------------------------------------
Table 2 about here
---------------------------------------
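As a further illustration--again ours, not from the paper--the classification of the People Express MFS just walked through can be captured as a simple record; keys and values paraphrase the Table 1 labels.

```python
# Hypothetical record of the People Express MFS classification as
# described in the text; keys and values paraphrase Table 1 labels.
people_express_mfs = {
    "real-world domain": "business (airline)",
    "role of simulation model": "clearing device for users' decisions",
    "structure": "feedback-oriented",
    "behavior": "deterministic",
    "progress of time in simulation engine": "continuous",
    "chance of intervention while simulating": "discrete periods",
    "number of users": "single person",
    "degree of integration": "stand-alone simulation",
    "transparency of simulation model": "black-box",
}
```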

Proposal for a Taxonomy

Figure 2 shows our recommended taxonomy in the form of a tree diagram. The nodes represent the proposed naming conventions; the leaves give examples of some well-known simulation tools. The branches starting from a node differ in exactly one of the criteria depicted in Table 1.

----------------------------------
Figure 2 about here
----------------------------------


At the root of the tree we distinguish between modeling- and gaming-oriented instruments. The implicit criterion for this distinction is 'main area of application' from the 'functionality' column of Table 1. Modeling-oriented tools are further distinguished by the criteria 'structure' and 'progress of time in the simulation engine' (column 'underlying model'). Therefore, on the one hand, models built with Vensim, Powersim, Ithink, or DYNAMO are summarized under feedback-oriented continuous simulations. On the other hand, models created with, for example, Taylor or Simple++ are classified as discrete, process-oriented simulations. The main goals of feedback-oriented continuous simulations are learning, problem solving, and gaining insights. Their usefulness as well as their efficacy in achieving these goals are virtually undoubted within the system dynamics community (Senge, 1989) but have not been systematically investigated. In contrast, the aim of process-oriented discrete simulation environments is mainly to optimize process layouts and to visualize the behavior of the system processes under consideration. Their main real-world domain is business, especially manufacturing and logistics processes.

The second branch of the tree shows the gaming-oriented simulations. These are further distinguished by the criterion 'number of users' into single-user and multi-user applications. Single-user applications are defined as 'simulators', whereas multi-user applications are labeled 'planning games'. In a simulator, usually a single person plays with or against a computer model. In planning games, several (groups of) players compete with each other or play various roles within a larger framework. The kind of learning that occurs in using simulators or planning games differs: in planning games, group dynamics may have a strong influence on learning and decision-making processes and therefore affect learning effectiveness. For this reason, research about the effectiveness of computer simulations mostly focuses on the learning processes induced by using simulators.
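To illustrate what 'feedback-oriented, continuous, deterministic' means in practice, here is a minimal sketch--ours, with invented variable names and parameter values--of the kind of computation a system dynamics engine performs: Euler integration of one stock whose inflow depends on the stock itself, i.e., a reinforcing feedback loop.

```python
# Minimal sketch of a feedback-oriented, continuous, deterministic
# simulation: Euler integration of a single stock whose inflow depends
# on the stock itself (a reinforcing feedback loop). All names and
# numbers are illustrative, not taken from any of the tools mentioned.

stock = 100.0      # e.g., a customer base
fraction = 0.05    # net growth fraction per simulated time unit
dt = 0.25          # integration step; smaller dt approximates continuity better
horizon = 20.0     # length of the simulated period

t = 0.0
while t < horizon:
    inflow = fraction * stock   # feedback: the flow depends on the stock
    stock += inflow * dt        # state advances in small, fixed steps
    t += dt

print(f"stock after {horizon} time units: {stock:.1f}")
```

A process-oriented discrete simulation, by contrast, would advance time from event to event (e.g., from one job arrival at a machine to the next) rather than in fixed integration steps.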


Simulators and planning games can be further distinguished by the real-world domain in which the simulation is situated. Here we distinguish between business applications and others. Consequently, we name the business-related tools 'business simulators' and 'corporate planning games'. Since there is a variety of different real-world domains and our focus lies in the business area, there is no further naming convention for the other simulators and planning games. Each of the leaves of the tree gives an example of a related computer simulation. Of course, this covers only the typical usage of these products: for example, there is always the chance that a group of people works with a business simulator like LEARN! or People Express; in this case, however, the group can be seen as a single virtual user. Examples of business simulators are People Express and Boom and Bust Enterprises from the Sloan School, or LEARN! and CopyShop developed at Mannheim University. Examples of simulators from other real-world domains are the World-3 simulator included in the Vensim simulation environment, the famous computer game SimCity, and Dörner's Lohhausen simulator. Examples of corporate planning games include Lobster, Topic, and Marga. An example of a planning game from another domain is the Fish Banks game by Meadows. The use of the proposed naming conventions contained in the nodes of Figure 2 will hopefully lead to a clearer understanding of research in the area of computer simulations and enhance the precision of discussions among scientists. In particular, we suggest using the term 'business simulator' instead of 'management simulator' or 'management flight simulator'. The suggested taxonomy can be extended by the other criteria given in Table 1. For example, as Machuca and Carrillo (1996) proposed, the criterion 'transparency of the simulation model' can be used to distinguish between black-box business simulators and transparent-box business simulators.
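As a compact overview--a hypothetical rendering of ours, not an artifact of the paper--the naming conventions and example leaves of Figure 2 can be expressed as a nested dictionary:

```python
# Nested-dictionary rendering (ours) of the Figure 2 taxonomy; leaves
# list the example tools that appear in the figure and in the text.
taxonomy = {
    "modeling-oriented simulations": {
        "feedback-oriented continuous simulation models": [
            "Vensim", "Powersim", "IThink", "DYNAMO",
        ],
        "process-oriented discrete simulation models": [
            "Taylor", "Simple++",
        ],
    },
    "gaming-oriented simulations": {
        "simulators": {
            "business simulators": [
                "People Express MFS", "'Boom and Bust' MFS",
                "LEARN!", "Copy Shop", "CABS",
            ],
            "...-simulators (other domains)": [
                "Vensim's World Dynamics", "Ecopolicy",
                "SimCity", "Lohhausen",
            ],
        },
        "planning games": {
            "corporate planning games": ["Lobster", "Topic", "Marga"],
            "...-planning games (other domains)": ["Fish Banks"],
        },
    },
}
```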


Conclusion

Starting with the identification of three aspects of simulation tools to support learning, this paper presented a list of characteristics to distinguish between such tools. Based on this list of characteristics, a taxonomy was constructed. This taxonomy must be seen in the context of the research goal: 'the investigation of the effectiveness of computer-based simulations to support learning.' Evaluation studies have to exclude as many external influences on individual learning processes as possible. Only then can it be concluded that learning effects are due to the instrument under consideration and are not caused by confounding variables. Therefore, the focus of our research is on what we call simulators, in particular business simulators. By definition, a simulator does not show influences on learning caused, e.g., by group decision processes or groupthink phenomena. Also, cases in which users build models themselves are not considered in our evaluation research. Thus, the taxonomy helped us to exclude a substantial number of possible simulation tools and, therefore, to minimize influences on learning effectiveness which are not caused by the simulator. It is our expectation that the taxonomy can have the same positive effect in other evaluation projects. Furthermore, if more researchers follow this or an improved taxonomy, the comparison of evaluation studies will become easier. The list of possible characteristics of simulation tools to support learning provides not only characteristics to be examined in evaluation studies, but also a starting point to improve and extend the suggested taxonomy.

Using the taxonomy to systematize the research about learning with computer simulations shows that it mostly takes place in the branch of 'gaming-oriented simulations'; in particular, as discussed above, simulators are used as research objects. There is almost no research (1) to evaluate learning processes with modeling-oriented simulations, or (2) to compare the effectiveness of learning with modeling-oriented as opposed to gaming-oriented computer simulations. This is certainly a demanding, but interesting, research area for the system dynamics community.

Appendix

List of URLs of the simulation software mentioned:

Simulation program           URL
Vensim                       http://www.vensim.com/
Powersim                     http://www.powersim.com/
IThink                       http://www.hps-inc.com/
Taylor                       http://www.taylorii.com/
Simple++                     http://www.technomatix.com/
People Express MFS           http://web.mit.edu/jsterman/www/SDG/MFS/PE.html
LEARN!                       http://www.simcon.de/
"Boom and Bust" MFS          http://web.mit.edu/jsterman/www/SDG/MFS/BandB.html
CABS                         http://www.cabs.de/
Vensim's World Dynamics      http://www.vensim.com
Ecopolicy                    http://www.frederic-vester.de/
SimCity                      http://www.simcity.com/
Lobster                      http://www.simcon.de/
Marga                        http://www.marga.de/
Fish Banks                   http://www.unh.edu/ipssr/FishBank.html

References

Alessi, S. M. 1988. Fidelity in the Design of Instructional Simulations. Journal of Computer-Based Instruction, Vol. 15, No. 2: 40–47.

Andersen, D. F., I. J. Chung, G. P. Richardson and T. R. Stewart. 1990. Issues in Designing Interactive Games Based on System Dynamics Models. Proceedings of the 1990 International System Dynamics Conference, ed. D. F. Andersen, G. P. Richardson and J. D. Sterman. Chestnut Hill, MA: 31–45.

Davidsen, P. I. and J. M. Spector. 1997. Cognitive Complexity in System Dynamics Based Learning Environments. 15th International System Dynamics Conference: Systems Approach to Learning and Education into the 21st Century, ed. Y. Barlas, V. G. Diker and S. Polat. Istanbul: 757–760.

Diehl, E. W. 1994. Managerial Micro Worlds as Learning Support Tools. Modeling for Learning Organizations, ed. J. D. W. Morecroft and J. D. Sterman, pp. 327–337. Portland, OR: Productivity Press.

Dörner, D. 1992. Die Logik des Mißlingens. Strategisches Denken in komplexen Situationen [The Logic of Failure. Strategic Thinking in Complex Situations]. Reinbek bei Hamburg: Rowohlt.

Dreyfus, H. L. 1992. What Computers Still Can't Do. Cambridge, MA: MIT Press.

Funke, J. 1992. Wissen über dynamische Systeme: Erwerb, Repräsentation und Anwendung [Knowledge about Dynamic Systems: Acquisition, Representation, and Application]. Berlin et al.: Springer.

Funke, J. 1995. Experimental Research on Complex Problem Solving. Complex Problem Solving – The European Perspective, ed. P. A. Frensch and J. Funke, pp. 243–268. Hillsdale, NJ: Lawrence Erlbaum.

Jensen, K., I. Gallacher, S. Hussain and J. McLeod. 1997. An Interactive Telecommunications Business Game for Strategic Exploration. 15th International System Dynamics Conference: Systems Approach to Learning and Education into the 21st Century, ed. Y. Barlas, V. G. Diker and S. Polat. Istanbul: 491–494.

Keen, P. G. W. and M. S. Scott Morton. 1978. Decision Support Systems: An Organizational Perspective. Reading, MA: Addison-Wesley.

Keys, B. and J. Wolfe. 1996. The Role of Management Games and Simulations in Education and Research. Yearly Review, Journal of Management, Vol. 16, No. 2: 307–336.

Klein, R. D. and R. A. Fleck. 1990. International Business Simulation/Gaming: An Assessment and Review. Simulation and Gaming, Vol. 21, No. 2: 147–166.

Kluwe, R. H. 1993. Knowledge and Performance in Complex Problem Solving. The Cognitive Psychology of Knowledge, ed. G. Strube and K. F. Wender, pp. 401–423. Amsterdam: Elsevier.

Lane, D. C. 1995. On a Resurgence of Management Simulations and Games. Journal of the Operational Research Society, Vol. 46: 604–625.

Laurillard, D. 1993. Rethinking University Teaching: A Framework for the Effective Use of Educational Technology. London/New York: Routledge.

Machuca, J. A. D. and M. A. D. Carrillo. 1996. Transparent-Box Business Simulators versus Black-Box Business Simulators: An Initial Empirical Comparative Study. System Dynamics '96, ed. G. P. Richardson and J. D. Sterman, pp. 329–332.

Maier, F. 1995. Die Integration wissens- und modellbasierter Konzepte zur Entscheidungsunterstützung im Innovationsmanagement [Integration of Knowledge- and Model-Based Concepts for Decision Support in Innovation Management]. Berlin: Duncker & Humblot.

Milling, P. 1995. Organisationales Lernen und seine Unterstützung durch Managementsimulatoren [Organizational Learning and Its Support by Management Simulators]. Zeitschrift für Betriebswirtschaft, Ergänzungsheft 3/95: Lernende Unternehmen: 93–112.

Minsky, M. and S. Papert. 1970. Draft of a Proposal to ARPA for Research on Artificial Intelligence at M.I.T.

Morecroft, J. D. W. 1988. System Dynamics and Microworlds for Policymakers. European Journal of Operational Research, Vol. 35, No. 3: 301–320.

Paich, M. and J. D. Sterman. 1993. Boom, Bust, and Failures to Learn in Experimental Markets. Management Science, Vol. 39, No. 12: 1439–1458.

Papert, S. 1980. Mindstorms. Children, Computers, and Powerful Ideas. New York: Basic Books.

Ryle, G. 1949. The Concept of Mind. New York: Barnes & Noble.

Senge, P. M. 1989. Organizational Learning: A New Challenge for System Dynamics. Computer-Based Management of Complex Systems: Collected Papers from the 1989 International System Dynamics Conference, ed. P. M. Milling and E. O. K. Zahn, pp. 229–236. Berlin et al.: Springer.

Senge, P. M. 1994. Microworlds and Learning Laboratories. The Fifth Discipline Fieldbook, ed. P. M. Senge, A. Kleiner, C. Roberts, R. B. Ross and B. J. Smith, pp. 529–531. New York et al.: Currency & Doubleday.

Spector, J. M. and P. I. Davidsen. 1997. Constructing Effective Interactive Learning Environments Using System Dynamics Methods and Tools: Interim Report. EIST Publications and Reports No. 1, University of Bergen.

Sterman, J. D. 1992. Teaching Takes Off – Flight Simulators for Management Education. OR/MS Today, October 1992: 40–44.

Strittmatter, P. and D. Mauel. 1997. Einzelmedium, Medienverbund und Multimedia [Single Medium, Network of Media, and Multimedia]. Information und Lernen mit Multimedia, ed. L. J. Issing and P. Klimsa, pp. 46–61. Weinheim: Psychologie Verlags Union.

Wolfe, J. 1985. The Teaching Effectiveness of Games in Collegiate Business Courses. Simulation & Games, Vol. 16, No. 3: 251–288.

[Figure 1 omitted: a diagram showing the three aspects of a simulation tool--the model, the human-computer interaction, and the functionality.]

Figure 1: Three aspects of computer simulations

[Figure 2 omitted: a tree diagram. Its root, 'Computer simulations to support learning in socio-economic systems', branches into 'Modeling-oriented simulations' (feedback-oriented continuous simulation models, e.g., Vensim, Powersim, IThink; process-oriented discrete simulation models, e.g., Taylor, Simple++) and 'Gaming-oriented simulations'. The latter splits into 'Simulators' (business simulators: People Express MFS, LEARN!, 'Boom and Bust' MFS, Copy Shop, CABS; other simulators: Vensim's World Dynamics, Ecopolicy, SimCity, Lohhausen) and 'Planning Games' (corporate planning games: Lobster, Topic, Marga; other planning games: Fish Banks).]

Figure 2: Taxonomy of computer simulations to support learning processes in socio-economic systems

Underlying model
  Real-world domain: business / other
  Generality of model with regard to domain: special area of real-world domain / whole domain
  Structure: feedback-oriented / process-oriented (mostly without feedback)
  Behavior: deterministic / stochastic
  Progress of time in simulation engine: discrete / continuous
  Role of simulation model: active generation of decisions / clearing device for users' decisions
  Influence of external data: with such influences / without such influences
  Domain of variables: integers / real numbers

Human-computer interface
  Chance of intervention while simulating: discrete periods / simulation in one run
  Mode of users' input: policy-oriented / decision-oriented
  Mode of display: text / multimedia
  Mode of interaction: keyboard / mouse

Functionality
  Number of users possible: single person / multi person
  Degree of integration: stand-alone simulation / integration in computer-based environment
  Main area of application: modeling-oriented / gaming-oriented
  Use of teachers/facilitators/coaches: totally self-controlled learning / support by teacher/facilitator/coach
  Transparency of simulation model: black-box / transparent-box
  Advancing of time: clock-driven / user-driven

Table 1: Criteria for categorization of computer simulations

[Table 2 omitted: a matrix rating seven example tools--People Express MFS, Copy Shop, LEARN!, Lobster, Fish Banks, Vensim, and Simple++--against the criteria of Table 1, grouped into the categories underlying model, human-computer interface, and functionality. Legend: a gray field indicates that a single criterion of the category applies; a black field indicates that both criteria may apply; n/a indicates that the criterion is not applicable.]

Table 2: Exemplary application of the distinguishing criteria
