Technology-Enhanced, Modeling-Based Instruction (TMBI) in Science Education
Ji Shen, Jing Lei, Hsin-Yi Chang, and Bahadir Namdar
Abstract
In this chapter, we review recent research and development in technology-enhanced, modeling-based instruction (TMBI) in science education. We describe the cognitive, social, and curriculum-design aspects of science learning promoted in these environments. We emphasize the continuum from qualitative to quantitative modeling, the computational habit of mind, and the system thinking that are critical for scientific modeling. We illustrate typical forms of collaborative learning in TMBI science education settings. We highlight scaffolding strategies relevant to TMBI in science curricula.
Keywords
Model • Model-based reasoning • Computational modeling • System thinking
Introduction
Scientists develop conceptual, physical, representational, and computer models to explore the natural world. These models may represent a particular aspect of a phenomenon, delineate the interacting components of a system, and quantify the relationships among relevant variables to help explain and predict an event (Clement, 2000; Gilbert, 1993).
J. Shen, Ph.D. (*) • B. Namdar
Department of Mathematics and Science Education, College of Education, The University of Georgia, 212 Aderhold Hall, Athens, GA 30602, USA
e-mail: [email protected]; [email protected]
J. Lei, Ph.D.
Department of Instructional Design, Development and Evaluation, School of Education, Syracuse University, 336 Huntington Hall, Syracuse, NY 13244, USA
e-mail: [email protected]
H.-Y. Chang, Ph.D.
Graduate Institute of Science Education, National Kaohsiung Normal University, No. 62, Shenjhong Road, Yanchao District, Kaohsiung 824, Taiwan
e-mail: [email protected]
Scientific models may evolve over time; some are refined and others abandoned. They have become fundamental elements of scientific language. Modeling-based instruction (MBI) is an innovative approach to science teaching and learning that encourages students to use, create, share, and evaluate models to represent and explain scientific processes and phenomena. It has been studied and implemented over the last three decades and has demonstrated effectiveness in improving students' conceptual understanding, critical thinking, and inquiry skills in science (Hart, 2008; Hestenes, 1987; Khan, 2007; Lehrer & Schauble, 2006; Passmore & Stewart, 2002; Schwarz et al., 2009; Sell, Herbert, Stuessy, & Schielack, 2006; White, 1993; Windschitl, Thompson, & Braaten, 2008). Typically, an MBI approach has the following features: (1) MBI engages students as active participants in learning as they build, test, and modify their own models (Hestenes, 1987; Penner, Gilles, Lehrer, & Schauble, 1997; Schwarz et al., 2009; White, 1993), resembling what scientists do in their fields as they constantly build and test scientific models (Gilbert, Pietrocola, Zylbersztajn, & Franco, 2000; Schwartz & Lederman, 2005; Tomasi, 1988; Zhang, Liu, & Krajcik, 2006); (2) MBI employs multiple forms of representation and alternative models, including physical models, computerized visualizations, graphs, mathematical formulas, and human role-play models, that may
reach a diversity of learners with different learning styles (Ardac & Akaygun, 2004; Kozma, Chin, Russell, & Marx, 2000; Mayer, 2005; Shen & Confrey, 2007, 2010); (3) MBI fosters a peer-learning community as students build models together, communicate their models to peers, and evaluate alternative models to help themselves better understand complex science topics (Gilbert & Boulter, 1998; Papert, 1991; Lehrer & Schauble, 2006; Tobin, 1993).
The rapid development of information and communication technology not only greatly expands the variety of media available for modeling in science learning, but also dramatically transforms traditional MBI learning environments. Many technology-enhanced, modeling-based instruction (TMBI) environments have been developed for K-12 science instruction (Barab, Hay, Barnett, & Keating, 2000; Frederiksen, White, & Gutwill, 1999; Levy & Wilensky, 2008; Linn & Hsi, 2000; Perkins et al., 2006; Stratford, Krajcik, & Soloway, 1998; Wieman, Adams, & Perkins, 2008; Wu, 2010; Wu, Krajcik, & Soloway, 2001). These TMBI environments empower students to model a wide range of science phenomena, especially those that are too small to see, too abstract to represent, too complex to comprehend, or too dangerous to explore in real life. These environments also enable new forms of collaboration so that students can build models together within or across classes (Gobert & Pallant, 2004; Linn & Eylon, 2011). Furthermore, many of these environments are able to provide instant feedback and automated scaffolding. This makes the learning experience more student-centered, as students can manage their own learning pace and receive individualized instruction (Hannafin & Land, 1997).
In this chapter, we review recent developments in the technologies and pedagogies related to TMBI in science education. We use examples that have been empirically shown to be effective in helping students learn science. We organize the chapter around four themes: promoting scientific exploration, facilitating model-based thinking, enhancing collaborative modeling, and designing scaffolded TMBI. The first three themes concern the kinds of learning TMBI promotes; the fourth theme focuses on design features utilized in TMBI curricula to support students' learning in science.
Promoting Scientific Exploration
To promote students' exploratory learning in science (Bransford, Brown, & Cocking, 2000; White, 1993), many computer models developed as science instructional materials have built-in features that allow students to inquire into the phenomena they are investigating. These features afford differentiated instruction and allow student self-exploration, a key characteristic of science practice (National Research Council, 2011).
One good example is the PhET Interactive Simulations developed at the University of Colorado, Boulder (http://phet.colorado.edu/). PhET simulations started with physics topics but now cover other disciplines such as math, biology, chemistry, and earth science. PhET simulations are open-source, stand-alone programs, typically written in Java or Flash. These simulations have been translated into many languages and are used worldwide. They help students visualize and test scientific models and practice inquiry learning (e.g., Perkins et al., 2006; Wieman et al., 2008), and they can be used in different types of activities or assignments (Wieman, Adams, Loeblein, & Perkins, 2010).
Adams, Paulson, and Wieman (2009) investigated how students interact with PhET simulations when provided with different levels of guidance. They conducted over 250 think-aloud interviews with more than 100 students. During the interviews, students were asked to think out loud as they explored the computer models under four levels of guidance (no instruction, driving questions, gently guided, and strongly guided). They found that students' exploration of a simulation was highly dependent on the features of the simulation: if a simulation is too complicated, students may not be able to make sense of it; if a simulation is not fun, students may engage for only a very short period of time; only when a simulation is at a level that a student finds both intriguing and manageable does the student sustain his or her exploration. For well-designed simulations, students showed optimal engagement when provided with minimal guidance, partly because they were seeking answers to their own questions. In contrast, when provided with cookbook guidance, students lost ownership of the inquiry and gained very limited understanding of the simulation.
Podolefsky, Perkins, and Adams (2010) observed and interviewed college students as they interacted with PhET simulations under minimal explicit guidance, and documented two cases of how students worked with a particular simulation, Wave Interference. This simulation allows students to manipulate and observe the interference of waves in three contexts: water, sound, and light (Fig. 41.1). Students may choose different context tabs, different objects to show, different measurement tools to use, and different variables to manipulate. The study examined how students took advantage of the computer simulation to make progress toward developing a scientific model of wave interference. Given the flexibility of the PhET simulation, the students followed different exploration paths, similar to how scientists investigate natural phenomena. At the same time, built-in features of the simulation and real-time feedback guided students' self-exploration and made their learning more successful. When interacting with the wave simulation, the students made connections between the real world and abstract representations and among multiple representations. These students also engaged in analogical reasoning
Fig. 41.1 A snapshot of the PhET wave interference model
among different contexts, which is critical in developing modeling competency (Lehrer & Schauble, 2006).
Another good example is River City, a multiuser virtual environment (MUVE) developed at the Harvard Graduate School of Education (http://muve.gse.harvard.edu/muvees2003/) to enhance middle school students' motivation and inquiry learning in science (Nelson, Ketelhut, Clarke, Bowman, & Dede, 2005). It is a 17-h curriculum centered on inquiry as defined by the National Science Education Standards (National Research Council, 2000). In River City, students form scientific hypotheses and conduct virtual experiments to test their hypotheses about what causes an illness outbreak among the residents of a virtual town—a complex computer model of a human–nature system that involves knowledge of ecology, health, biology, chemistry, and earth science. Typically, students work in small groups of two to four and interact with each other's avatars, digital artifacts, tacit clues, and computer-based agents. A number of studies have shown that the River City curriculum increased student motivation (e.g., Ketelhut, Dede, Clarke, & Nelson, 2006), content knowledge (e.g., Ketelhut
et al., 2006), and inquiry skills (e.g., Ketelhut, Nelson, Dede, & Clarke, 2006). Ketelhut (2007) investigated whether students develop scientific inquiry behaviors while engaging in the inquiry-based project and how self-efficacy is related to those behaviors. The findings indicated that the students conducted scientific inquiry using River City and that the total number of inquiry behaviors increased over time. Moreover, students with high self-efficacy engaged in more scientific inquiry behaviors than those with low self-efficacy. Ketelhut et al. (2006) implemented River City with approximately 2,000 students in 2004 and examined whether students engaged in inquiry learning in River City and which types of implementation had greater effects on student learning. Results indicated that the students using River City not only engaged in scientific inquiry but also showed higher-quality inquiry than the control students. To assess inquiry learning in River City, Ketelhut and Dede (2006) developed an alternative assessment (Letter to the Mayor) and found that it offered a better account of students' inquiry learning than traditional tests.
Facilitating Model-Based Thinking
TMBI environments may facilitate a habit of mind of model-based thinking. Model-based thinking overlaps with other critical thinking processes but has its own unique characteristics. Here we highlight three interlinked aspects of model-based thinking.
Qualitative and Quantitative Modeling
Scholars have emphasized the importance of qualitative thinking in modeling (Bredeweg & Forbus, 2003; Forbus, 1984). This approach focuses on the conceptual aspects of science learning (Li, Law, & Lui, 2006) and stresses that qualitative understanding provides a solid ground for the development of quantitative reasoning (Bredeweg & Forbus, 2003). Hestenes, a pioneer of MBI in physics education, on the other hand, emphasizes the importance of mathematical models when speaking of modeling in physics (Hestenes, 1987). Mathematical models refer to mathematical representations of real-world situations, including symbolic, graphic, and other forms, and quantitative thinking is a critical component of mathematical formalism. Many programs have been developed to facilitate students' qualitative modeling, quantitative modeling, or both.
An exemplar qualitative TMBI tool is Model-It, developed by the Center for Highly Interactive Computing in Education (http://hi-ce.org) at the University of Michigan. Targeting middle and high school students, Model-It can be used to build and test qualitative models of scientific phenomena (e.g., Stratford et al., 1998; Zhang et al., 2006). Three modes (plan, build, and test) are built into the tool to scaffold users' qualitative thinking in modeling. In the plan mode, students create objects and define associated variables. In the build mode, students set the relationships between the variables verbally or graphically; in this process, students use only qualitative relationships (e.g., as variable A increases, variable B decreases). In the test mode, students may change the values of the variables and see how the model behaves; in this process, variables change only among a few hierarchical levels. Many studies have shown that students are able to build quality models using Model-It. Stratford et al. (1998) found that students engaged in four types of activities using this modeling program: (a) analyzing (decomposing a system under study into parts), (b) relational reasoning (exploring how parts of a system are causally linked or correlated), (c) synthesizing (ensuring that the model represents the complete phenomenon), and (d) testing and debugging (testing the model, trying different possibilities, and identifying problems with its behavior and looking
for solutions). Studying how content experts used Model-It, Zhang, Liu, and Krajcik (2006) found that experts started modeling with a clear focus expressed as an object and a factor, and then proceeded in a linear sequence of planning, building, and testing. Experts tended to spend a long time planning, thinking through all the factors and the relationships among them, before representing their mental models in the program.
Similarly, many TMBI programs have been developed to enhance students' quantitative thinking in learning science (Liu, 2006; Simpson, Hoyles, & Noss, 2006; Stern, Barnea, & Shauli, 2008). For instance, Sins, Savelsbergh, van Joolingen, and van Hout-Wolters (2009) investigated the relationship between students' epistemological understanding of models and modeling (i.e., the nature of models, purposes of models, process of modeling, and evaluation of models) and their underlying cognitive processes (deep vs. surface). In the study, 26 students worked in dyads on a computer-based modeling task in mechanics—modeling the movement of an ice skater. The students used Powersim® Constructor Lite, a free modeling tool based on a system dynamics approach (similar to STELLA, a well-known commercial system dynamics modeling tool). The environment has five model building blocks: stocks, rates, auxiliaries, constants, and connectors. Specifically, a stock represents a quantity that can increase or decrease (i.e., a variable), and a rate determines how quickly the quantity in a stock changes. Qualitatively, students may add, delete, and move around the elements; quantitatively, they can manipulate the rates and values of these elements (e.g., assign a value for the velocity of the ice skater) and add formulas. A Powersim® model with assigned quantities and rates can be run automatically, computing results by generating the corresponding differential equations; the computed results are displayed as graphs or tables. Overall, the study confirmed a positive correlation between students' epistemological understanding and their cognitive processes. It also found that most students employed surface cognitive processes. For instance, the most common surface process found in the study involved quantifying a model without referring to the underlying physics, and many students with lower epistemological understanding tended to focus only on the visual aspects of their models.
Note that there is no clear-cut boundary along the qualitative–quantitative modeling continuum, and high modeling competency requires both. Indeed, many TMBI programs nurture both processes in science learning (Komis, Ergazaki, & Zogza, 2007; White, 1993). Qualitative modeling helps students visualize the main modeling elements and see the core connections, thereby building a foundation for a more precise quantitative description; quantitative modeling engages students in manipulating variables and their connections numerically, thereby leading toward mathematical formulation.
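To make the stock-and-rate vocabulary above concrete, the sketch below shows how a qualitative relation such as "friction slows the skater down" can be quantified and integrated over time, in the spirit of system dynamics tools like STELLA or Powersim. It is a minimal Python illustration written for this chapter; the function name, parameter values, and Euler-integration scheme are our own choices, not part of any of the tools reviewed.

```python
# Minimal stock-and-flow sketch (illustrative only; not the API of any tool above).
# Stock: the skater's velocity. Rate: deceleration due to friction.
# Constants: friction coefficient mu and gravitational acceleration g.

def simulate_skater(v0=8.0, mu=0.02, g=9.8, dt=0.1, t_max=20.0):
    """Euler-integrate dv/dt = -mu * g until the skater stops or time runs out."""
    times, velocities = [0.0], [v0]
    v, t = v0, 0.0
    while t < t_max and v > 0:
        rate = -mu * g                  # quantitative form of "friction slows the skater"
        v = max(v + rate * dt, 0.0)     # update the stock from its (negative) flow
        t += dt
        times.append(t)
        velocities.append(v)
    return times, velocities

times, velocities = simulate_skater()
print(f"velocity after {times[-1]:.1f} s: {velocities[-1]:.2f} m/s")
```

In a qualitative tool such as Model-It, the same relationship would only be stated verbally (as friction increases, velocity decreases more quickly); quantifying it, as above, is what makes the model runnable and its results displayable as graphs or tables.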
Computational Habit-of-Mind
As scientists nowadays rely heavily on computers to solve complex problems, computational thinking (Papert, 1996) has become a critical skill students need to develop in mathematics and science education. Wing (2006) defined computational thinking as the ways in which computer scientists think about the world, solve problems, and design systems. She pointed out that equating computational thinking with computer programming is a narrow interpretation. Instead, defining features of computational thinking include thinking at multiple levels of abstraction, being fundamental, human-based problem solving, complementing and combining mathematical and engineering thinking, and applying computational concepts to everyday life (Wing, 2006).
In science learning, students may use a computer-modeling program to conduct computational experiments. The Molecular Workbench (MW) software (http://mw.concord.org/modeler/), developed by the Concord Consortium, is such a tool (e.g., Pallant & Tinker, 2004; Xie & Tinker, 2006) (Fig. 41.2). MW is a Java-based modeling environment that provides visual, interactive computational experiments and models for teaching and learning science (Xie, 2010). MW focuses on the molecular and atomic interactions that span a range of topics in physics, chemistry, and biology. Its computational algorithms rely on molecular dynamics and quantum dynamics simulation methods (Xie et al., 2011). Students can create their own models to simulate and experiment with molecules and predict real-world events. A pilot study showed that general and physical chemistry students were able to create novel computational experiments to explore chemistry phenomena and principles in depth, including ionic bonding, purification, and fuel cells (Xie et al., 2011).
Computational modeling can also be taught to young students. For instance, Papaevripidou, Constantinou, and Zacharia (2007) investigated how fifth graders built computer models to study marine ecology. They used an object-based modeling tool, Stagecast Creator (SC), in which students set rules to control the behaviors of characters through graphical programming tools (e.g., dragging a character to a new position) without using a syntactic programming language. Students are also able to define variables to create rules for determining or controlling an action; for instance, a student can assign a number representing the energy a character consumes per unit of movement. The study showed that after the unit, students enhanced their modeling skills through using SC. For instance, they shifted their focus from creating descriptive representations of the phenomena to creating more complex models that showed processes, interactions, and relationships among the components of the phenomenon. Also, the students who had the computational modeling experience provided more comprehensive descriptions of causal interactions in a given model, specified criteria for
the appraisal of the models, and used an iterative and continuous procedure of model revision and refinement.
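The rule-plus-variable style of a tool like Stagecast Creator can be sketched in a few lines of ordinary code. The example below is a hypothetical Python illustration written for this chapter (the character, the rule, and the numbers are our own, not taken from the study); it shows a character that pays a student-defined energy cost for each unit of movement and stops when its energy is exhausted.

```python
# Illustrative rule-based character, loosely in the spirit of Stagecast Creator.
# All names and values here are hypothetical and for exposition only.

class Fish:
    def __init__(self, energy=10.0, cost_per_move=1.0):
        self.position = 0
        self.energy = energy
        self.cost_per_move = cost_per_move   # student-defined variable

    def rule_move(self):
        """Rule: if the fish still has enough energy, move one cell and pay the cost."""
        if self.energy >= self.cost_per_move:
            self.position += 1
            self.energy -= self.cost_per_move

fish = Fish(energy=5.0, cost_per_move=1.0)
for tick in range(10):                   # advance the simulation clock ten ticks
    fish.rule_move()
print(fish.position, fish.energy)        # -> 5 0.0: movement stops once energy runs out
```

Writing the rule this way makes explicit the computational ideas, such as state, condition, and action, that young modelers manipulate graphically in SC.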
System Thinking
As scientists build models to simulate and interpret increasingly complex systems in nature and society (e.g., a particular ecosystem, a corporate management system), students need to develop system thinking to understand the complexity of science phenomena (Kauffman, 1995; Wilensky & Resnick, 1999; Zhang et al., 2006). Characteristics of system thinking include perceiving a system as consisting of many elements that interact with each other, understanding that a change in one element of a system may result in changes in other elements and even in the whole system, and recognizing that the relatively simple behaviors of individual elements may be aggregated through some mechanism (e.g., statistical methods) to explain the complex system at the collective level. Many TMBI environments can help students develop system thinking (Bravo, van Joolingen, & de Jong, 2009; Wilensky & Reisman, 2006; Wu, 2010). Note that system thinking is a more encompassing term that includes system dynamics modeling, as some of the aforementioned programs illustrate (e.g., Model-It, STELLA, Powersim®).
Another good example of TMBI promoting system thinking is NetLogo, an agent-based modeling tool that simulates complex and decentralized systems (e.g., Wilensky & Rand, 2009) (Fig. 41.3). In NetLogo, individual entities can be programmed to operate independently while following the same set of rules (e.g., to represent a flock of birds, each "agent" representing a bird independently follows the same set of rules). These rules include descriptions of how the agents interact with each other (e.g., when two "birds" come closer than a certain distance, they move away from each other to avoid collision). Thus, NetLogo is able to show not only systems perceived at different levels (e.g., micro and macro), but also how these different levels relate to each other (e.g., interactions of individual agents lead to emergent collective behavior).
Levy and Wilensky (2009a, 2009b) described a curriculum using NetLogo on gas laws and the kinetic molecular theory in chemistry—the Connected Chemistry Curriculum (chapter one, henceforth CC1). The curriculum aims to help students make connections among four levels of access to a system (submicroscopic, symbolic, experiential, and macroscopic) in order to gain knowledge in three spheres (conceptual, mathematical, and physical). In this modeling environment, students begin at the submicroscopic level by exploring the movement of a single particle and then move toward forming a system view of the chemical world. The studies found that after the CC1, students gained a deeper understanding of the particulate nature of gas laws and the kinetic
Fig. 41.2 A snapshot of the Molecular Workbench model on the electron transport chain
Fig. 41.3 A snapshot of a NetLogo model on global warming
molecular theory and a greater ability to connect the submicroscopic level with the macroscopic world. Students had greater success when the assessment was embedded in the process of modeling rather than in a post-test questionnaire. It was also found that students' epistemic understanding of the nature of models was enhanced (e.g., multiple models can be used to represent the same phenomenon; a model is not a replica of the referent).
As discussed above, students develop qualitative and quantitative modeling skills, a computational habit of mind, and system thinking while using TMBI programs. Although we used different TMBI programs to highlight different aspects, many of these programs can facilitate several of these thinking skills, as the skills are intertwined with one another.
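Before turning to collaboration, the agent-based, micro-to-macro logic described for NetLogo can be illustrated with a minimal sketch. The example below is written in Python rather than NetLogo's own language, and the rule and parameter values are invented for illustration: each agent follows only a local "move away when too close" rule, yet the group as a whole spreads out, an emergent collective outcome that no single rule states directly.

```python
import random

# Illustrative agent-based model: each agent obeys one local rule
# (step away from any neighbor closer than MIN_DIST on a line); the
# macro-level spreading of the group emerges from these interactions.
MIN_DIST, N_AGENTS, STEPS = 1.0, 20, 100

positions = [random.uniform(0.0, 2.0) for _ in range(N_AGENTS)]  # crowded start

def step(positions):
    new_positions = []
    for i, x in enumerate(positions):
        push = 0.0
        for j, y in enumerate(positions):
            if i != j and abs(x - y) < MIN_DIST:
                push += 0.05 if x >= y else -0.05   # move away from the close neighbor
        new_positions.append(x + push + random.uniform(-0.01, 0.01))
    return new_positions

for _ in range(STEPS):
    positions = step(positions)

spread = max(positions) - min(positions)
print(f"group spread after {STEPS} steps: {spread:.1f}")  # far larger than the initial 2.0
```

Seeing the individual rule (the body of step) and the collective outcome (the final spread) side by side is the kind of cross-level connection that tools such as NetLogo make visible.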
Enhancing Collaborative Learning
Collaboration is critical in MBI because scientific knowledge, as a collective enterprise, is socially constructed, and students need to engage in social interaction to develop and revise their own understanding of science phenomena (Komis et al., 2007; Penner, 2001). Students often work in a mixture of collaborative forms in a TMBI environment to share resources or strengthen modeling practices (Barab et al., 2000). For instance, Birchfield and Megowan-Romanowicz (2009) described SMALLab, a mixed-reality environment on geologic evolution for high school students. The students worked with each other face-to-face by interacting with the environment and specially developed handheld devices (e.g., a glow ball). The class was divided into groups, with each group in charge of a different role, and the modeling activity involved co-building a "layer cake" model of the earth's crust. It was found that interaction among students in the experimental group increased by 33% compared with a control group. Also, the students who received the intervention outperformed their counterparts in the control group on earth science content tests. An important finding was that more of the gains came from the open-ended explanation tests than from the multiple-choice test.
Ioannidou et al. (2010) described a modeling technology, Mr. Vetro, that they implemented with high school students on the topic of human physiology. Students collaborated in small groups and as a whole class. Each group controlled a wirelessly connected computer simulation (e.g., one group was in charge of the heart), and the data collected from each group fed into a central composite simulation (in this case, a blood-centric representation of the body). The groups needed to coordinate with each other to maintain a healthy state of the simulated human body. In this activity, students visualized human organs through computer models, manipulated physiological
variables that affect the complex system of a human body, and coordinated with each other to maintain a satisfactory outcome of the system. Based on classroom observations and teacher interviews, the research showed that the Mr. Vetro classes were more inquiry-based than the comparison classes. In terms of content learning, the Mr. Vetro classes outperformed the comparison classes; specifically, they did much better than the comparison classes on definition and open-response items. Results also showed a positive impact on students' attitudes toward biology and on personal relevance.
As information and communication technology allows long-distance collaboration, many TMBI environments have incorporated collaboration schemes that extend beyond classroom constraints. Gobert and Pallant (2004) described a science unit on plate tectonics using the WISE platform (Linn, Lee, Tinker, Husic, & Chiu, 2006). In this unit, students may see, manipulate, construct, and evaluate computer models of plate-tectonics-related phenomena (e.g., earthquakes, volcanoes). In its implementation, the unit facilitated face-to-face collaboration within pairs of students in the same class and collaboration between classes on the two coasts of the USA. The groups of students from different coasts critiqued and evaluated each other's models using the online discussion feature in WISE; thus, the collaborative experiences were built into an authentic modeling process for the students. It was found that students' understanding of the nature of models deepened after the unit, and those with a more sophisticated understanding of the nature of models had greater content learning gains. Simpson et al. (2006) used a computer programming and video gaming tool, ToonTalk, to help students learn kinematics. Students worked in small groups and dyads to construct video games, write programs, and model motion with graphs. Students also worked on a project to share, communicate, and collaborate with peers from a different country through a Web-based collaboration system where they could post statements and make comments. They found that the students improved their understanding of motion after the unit. Their learning was enhanced by the collaboration opportunity in that the students engaged in sharing models and challenging peers across sites, which led to more animated face-to-face discussion among local participants.
Collaborative TMBI also faces many challenges. First, the role of collaboration in a TMBI environment with respect to individual learning outcomes is contested: for example, students may see collaboration as an opportunity to reduce their workload (Barab et al., 2000), and each student gets less opportunity to manipulate the technology (Metcalf & Tinker, 2004). Even though collaboration is an important aspect of MBI, collaboration itself rarely enters the equation of outcome measurement. Also, focusing solely on procedure (e.g., problem solving)
may discourage group members' attention to content, thus sidetracking students from the main learning task (Krange & Ludvigsen, 2008). Research suggests that students spend more time on particular modeling processes, such as linking model elements, that require more peer support (Komis, Ergazaki, & Zogza, 2007). Many TMBI environments afford manipulation of multiple variables, which requires students to collaborate with each other to make sense of the interconnections among these variables (Komis et al., 2007; Manlove, Lazonder, & de Jong, 2009). More research is needed to examine carefully how collaboration occurs during modeling processes and how it can be facilitated by computer technology.
Designing Scaffolded TMBI
Students need cognitive and procedural support to carry out scientific inquiry in learning environments that feature interactive, dynamic computer models (Linn, 2006; Quintana, Zhang, & Krajcik, 2005). These supports, or scaffolds, may help learners focus on the key aspects of a model, distribute the cognitive load, provide relevant resources and background information, assess student learning in situ, and provide instant feedback (Adams et al., 2009; Collins, Brown, & Newman, 1990; Jonassen & Reeves, 1996; Linn, Davis, & Eylon, 2004). Scaffolds may also help students problematize the subject matter they are learning (Reiser, 2004). For example, in a study that engaged students in evaluating peer-generated dynamic models of chemical reactions at the molecular level, detailed scripts and prompting questions were provided as scaffolds, and the activity significantly enhanced student understanding of the subject (Chang, Quintana, & Krajcik, 2010). Without such scaffolds, students may evaluate their peers' work superficially without providing substantive criticism. While acknowledging the important role of teachers in the success of student learning, we focus our discussion here on the explicit scaffolds that a learning system may provide (for implicit scaffolds, see, e.g., Podolefsky et al., 2010).
Scaffolds embedded in a learning environment need to be well aligned with the learning theory upon which the environment is built, and their effectiveness needs to be demonstrated with empirical evidence. One successful example is the Web-based Inquiry Science Environment (WISE), a system built on years of research and development (Linn, Clark, & Slotta, 2003; Linn & Eylon, 2011; Linn & Hsi, 2000; Slotta & Linn, 2009). WISE is a powerful, open-source online learning environment that supports guided inquiry, embedded assessments, peer collaboration, interactive computer models, and teacher customization. The latest version, WISE 4.0, has been under development since 2008 and incorporates new features such as Energy Stories, MySystem, and IdeaManager
to diagnose and support students' integrative understanding of important science concepts. WISE projects are designed to help students learn core science concepts and achieve knowledge integration (KI) (Linn, 2006). These standards-based curricula are developed by teams of content experts, school teachers, educational researchers, and computer scientists through iterations of refinement and revision. WISE curricula are equipped with research-based scaffolds that support students' knowledge integration processes, including eliciting ideas, adding new ideas, distinguishing similar ideas, and sorting out ideas (Linn, 2006). Here we elaborate on a few important scaffolding strategies related to scientific modeling.
Research indicates that students may have difficulty attending properly to the complex information in a scientific model (Lowe, 2004). They may not share the experience, competency, or knowledge of the producer of the scientific model and thus may fail to perceive the information represented in the model (Kress & van Leeuwen, 1996). Therefore, it is important to design scaffolds in TMBI environments that provide hints or help focus students' attention on key aspects of a model. For example, the WISE unit "Thermodynamics: Probing Your Surroundings" (Clark, 2006; Clark & Sampson, 2007, 2008) incorporates a particulate model that shows how heat transfers between two objects at the particulate level (Xie & Tinker, 2006), accompanied by a temperature graph that shows how the temperature of the objects changes over time (Fig. 41.4a). To guide students' learning with the model and graph, prompting questions embedded in the unit ask students to predict, observe, and explain the results from the model (Fig. 41.4b). The prompting questions that ask students to predict how the speeds of the particles change with temperature alert students, before their observation, that they need to pay attention to the motion of the particles in the model. After students observe the dynamic molecular model, prompting questions require students to explain the results from the model. The prompting questions are content-specific; for example, one question asks "what happened to the molecules of the objects when a hot object was placed on top of a cold object." They provide checkpoints for students to reflect on whether they have paid attention to and comprehended the key aspects of the model. A study examining students' responses to the prompts indicates that students developed an integrated understanding of heat transfer at the particulate level after learning with the model and embedded scaffolds (Chang & Linn, in press).
Students also need explicit scaffolds to help them engage productively in scientific modeling practices (McElhaney & Linn, 2011; Schwarz & White, 2005). For example, the molecular model in the WISE Thermodynamics unit was revised to include features that allow students to conduct experiments using the model. Students can change the temperature of the two objects, the material of the objects, and the length of time the objects are kept in contact with each other. However,
Fig. 41.4 Snapshots of the Thermodynamics unit: (a) the dynamic molecular model showing heat transfer between two objects; (b) embedded prompts guiding students' learning from the model
students may conduct experiments using computer models purposelessly or mindlessly (McElhaney & Linn, 2011; Parnafes, 2007). To help students develop criteria for how to conduct scientific experiments with the thermodynamics model, a critique activity was designed to engage students in critiquing a fictitious student's experiment with the model (Chang & Linn, in press). Prompting questions guide students in critiquing the purposefulness and methodology of the fictitious student's model experiment. The incorporation of the critique activity builds on the perspective that scaffolds should support students not only in structuring but also in problematizing or contextualizing the learning task (Reiser, 2004; Shen, 2010). The implementation results showed that the students who used the critique activity designed better experiments and developed a more integrated understanding of the scientific model than those who did not (Chang & Linn, in press).
Scaffolds may also be designed to help students construct abstract explanatory models based on intuitive models and prior experience. Shen and Linn (2011) described a WISE unit for high school students to develop a scientific explanatory model of electrostatics. They carefully delineated cases of how students' explanatory models of induction evolved over time. They employed key KI design principles, such as making thinking visible and making science accessible, to help students retrieve their prior knowledge, make sense of the computer models, and link these models with hands-on experience. The built-in scaffolds in the unit proceed from a basic charge-based explanatory model, to a particle-based model, and then to an energy-oriented model. The results showed that after the unit, students were able to integrate different levels of models and offer better explanations of everyday experiences and observations related to electrostatics.
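Returning to the Thermodynamics unit, the macroscopic behavior displayed by its temperature graph can be illustrated with a minimal sketch. The Python code below is our own illustration, not the WISE model itself; it assumes two objects of equal heat capacity exchanging heat at a rate proportional to their temperature difference, so that their temperatures converge to a common value, which is the shape students are prompted to predict, observe, and explain.

```python
# Illustrative two-object heat-transfer model; the rate constant k and the
# starting temperatures are arbitrary values chosen only for demonstration.
def heat_transfer(t_hot=80.0, t_cold=20.0, k=0.05, dt=1.0, steps=120):
    history = []
    for _ in range(steps):
        flow = k * (t_hot - t_cold) * dt   # heat flows from the hotter to the colder object
        t_hot -= flow
        t_cold += flow                     # equal heat capacities assumed
        history.append((t_hot, t_cold))
    return history

final_hot, final_cold = heat_transfer()[-1]
print(f"after contact: hot object {final_hot:.1f} °C, cold object {final_cold:.1f} °C")
# Both temperatures approach 50 °C, matching the converging curves in the unit's graph.
```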
Conclusion
In this chapter, we reviewed a number of high-quality programs and studies focused on providing computer-based environments for students to learn science through modeling. Given appropriate scaffolding, these TMBI environments have demonstrated effectiveness in enhancing students' model-based thinking, including qualitative and quantitative modeling, computational thinking, and system perspectives, and have helped diversify and strengthen students' collaborative learning in science. Despite this rapid development, technologies are still poorly integrated into science education curricula (Songer, 2007). Many challenges remain concerning how to best utilize these programs and implement them in different school contexts. Scaling up research-proven TMBI programs is both a meaningful and an urgent next step.
Acknowledgments This material is based upon work supported by the National Science Foundation under award number DRL-1019866. Any opinions, findings, and conclusions expressed in this work are those of the authors and do not necessarily reflect the views of the National Science Foundation.
References
Adams, W. K., Paulson, A., & Wieman, C. E. (2009). What levels of guidance promote engaged exploration with interactive simulations? PERC Proceedings. Retrieved August 23, 2011, from http://phet.colorado.edu/en/research Ardac, D., & Akaygun, S. (2004). Effectiveness of multimedia-based instruction that emphasizes molecular representations on students' understanding of chemical change. Journal of Research in Science Teaching, 41, 317–337.
Barab, S. A., Hay, K. E., Barnett, M., & Keating, T. (2000). Virtual solar system project: Building understanding through model building. Journal of Research in Science Teaching, 37(7), 719–756. Birchfield, D., & Megowan-Romanowicz, C. (2009). Earth science learning in SMALLab: A design experiment for mixed reality. International Journal of Computer-Supported Collaborative Learning, 4(4), 403–421. *Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How people learn: Brain, mind, experience, and school (Expanded edition). Washington, DC: National Academy Press. Bravo, C., van Joolingen, W. R., & de Jong, T. (2009). Using Co-Lab to build system dynamics models: Students' actions and on-line tutorial advice. Computers in Education, 53(2), 243–251. Bredeweg, B., & Forbus, K. (2003). Qualitative modeling in education. AI Magazine, 24(4), 35–46. Chang, H.-Y., & Linn, M. C. (in press). Scaffolding learning from molecular visualizations. Journal of Research in Science Teaching. *Chang, H.-Y., Quintana, C., & Krajcik, J. S. (2010). The impact of designing and evaluating molecular animations on how well middle school students understand the particulate nature of matter. Science Education, 94, 73–94. Clark, D. B. (2006). Longitudinal conceptual change in students' understanding of thermal equilibrium: An examination of the process of conceptual restructuring. Cognition and Instruction, 24(4), 467–563. Clark, D. B., & Sampson, V. (2007). Personally-seeded discussions to scaffold online argumentation. International Journal of Science Education, 29(3), 253–277. Clark, D. B., & Sampson, V. (2008). Assessing dialogic argumentation in online environments to relate structure, grounds, and conceptual quality. Journal of Research in Science Teaching, 45, 293–321. *Clement, J. (2000). Model based learning as a key research area for science education. International Journal of Science Education, 22(9), 1041–1053. Collins, A., Brown, J. S., & Newman, S. E. (1990). Cognitive apprenticeship: Teaching the crafts of reading, writing, and mathematics. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 453–494). Hillsdale, NJ: Lawrence Erlbaum Associates. Forbus, K. D. (1984). Qualitative process theory. Artificial Intelligence, 24(1–3), 85–168. Frederiksen, J. R., White, B. Y., & Gutwill, J. (1999). Dynamic mental models in learning science: The importance of constructing derivational linkages among models. Journal of Research in Science Teaching, 36(7), 806–836. *Gilbert, J. K. (1993). Models & modeling in science education. Hatfield, UK: The Association for Science Education. Gilbert, J. K., & Boulter, C. J. (1998). Learning science through models and modeling. In B. J. Fraser & K. G. Tobin (Eds.), International handbook of science education, Part 1 (pp. 53–66). Dordrecht, Netherlands: Kluwer Academic Press. Gilbert, J., Pietrocola, M., Zylbersztajn, A., & Franco, C. (2000). Science education: Notions of reality, theory and models. In J. K. Gilbert & C. J. Boulter (Eds.), Developing models in science education (pp. 19–40). Dordrecht, Netherlands: Kluwer Academic Press. Gobert, J. D., & Pallant, A. (2004). Fostering students' epistemologies of models via authentic model-based tasks. Journal of Science Education and Technology, 13(1), 7–22. Hannafin, M. J., & Land, S. (1997). The foundations and assumptions of technology-enhanced, student-centered learning environments. Instructional Science, 25, 167–202. Hart, C. (2008).
Models in physics, models for physics learning, and why the distinction may matter in the case of electric circuits. Research in Science Education, 38(5), 529–544. Hestenes, D. (1987). Toward a modeling theory of physics instruction. American Journal of Physics, 55, 440–454.
Ioannidou, A., Repenning, A., Webb, D., Keyser, D., Luhn, L., & Daetwyler, C. (2010). Mr. Vetro: A collective simulation for teaching health science. International Journal of Computer-Supported Collaborative Learning, 5(2), 141–166. Jonassen, D., & Reeves, T. (1996). Learning with technology: Using computers as cognitive tools. In D. H. Jonassen (Ed.), Handbook of research in educational communications and technology (pp. 693–719). New York, NY: Simon & Schuster Macmillan. Kauffman, S. (1995). At home in the universe: The search for the laws of self-organization and complexity. Oxford: Oxford University Press. Ketelhut, D. J. (2007). The impact of student self-efficacy on scientific inquiry skills: An exploratory investigation in River City, a multiuser virtual environment. Journal of Science Education and Technology, 16(1), 99–111. Ketelhut, D. J., & Dede, C. (2006). Assessing inquiry learning. Paper presented at the National Association of Research in Science Teaching, San Francisco, CA. Ketelhut, D. J., Dede, C., Clarke, J., & Nelson, B. (2006). A multi-user virtual environment for building higher order inquiry skills in science. Paper presented at the American Educational Research Association, San Francisco, CA. Ketelhut, D. J., Nelson, B., Dede, C., & Clarke, J. (2006). Inquiry learning in multi-user virtual environments. Paper presented at the National Association for Research in Science Teaching, San Francisco, CA. Khan, S. (2007). Model-based inquiries in chemistry. Science Education, 91, 877–905. Komis, V., Ergazaki, M., & Zogza, V. (2007). Comparing computer-supported dynamic modeling and "paper & pencil" concept mapping technique in students' collaborative activity. Computers in Education, 49(4), 991–1017. Kozma, R. B., Chin, E., Russell, J., & Marx, N. (2000). The role of representations and tools in the chemistry laboratory and their implications for chemistry learning. The Journal of the Learning Sciences, 9(3), 105–144. Krange, I., & Ludvigsen, S. (2008). What does it mean? Students' procedural and conceptual problem solving in a CSCL environment designed within the field of science education. International Journal of Computer-Supported Collaborative Learning, 3(1), 25–51. Kress, G., & van Leeuwen, T. (1996). Reading images: The grammar of visual design. New York, NY: Routledge. *Lehrer, R., & Schauble, L. (2006). Cultivating model-based reasoning in science education. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 371–388). New York, NY: Cambridge University Press. Levy, S. T., & Wilensky, U. (2008). Inventing a "mid-level" to make ends meet: Reasoning through the levels of complexity. Cognition and Instruction, 26, 1–47. Levy, S. T., & Wilensky, U. (2009a). Students' learning with the Connected Chemistry (CC1) Curriculum: Navigating the complexities of the particulate world. Journal of Science Education and Technology, 18(3), 243–254. Levy, S. T., & Wilensky, U. (2009b). Crossing levels and representations: The Connected Chemistry (CC1) Curriculum. Educational Technology, 18(3), 224–242. Li, S. C., Law, N., & Lui, K. F. A. (2006). Cognitive perturbation through dynamic modeling: A pedagogical approach to conceptual change in science. Journal of Computer Assisted Learning, 22(6), 405–422. Linn, M. C. (2006). The knowledge integration perspective on learning and instruction. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 243–264). New York, NY: Cambridge University Press. *Linn, M.
C., Clark, D., & Slotta, J. D. (2003). WISE design for knowledge integration. Science Education, 87(4), 517–538.
Linn, M. C., Davis, E. A., & Eylon, B.-S. (2004). The scaffolded knowledge integration framework for instruction. In M. C. Linn, E. A. Davis, & P. Bell (Eds.), Internet environments for science education (pp. 47–72). Mahwah, NJ: Lawrence Erlbaum. Linn, M. C., & Eylon, B.-S. (2011). Science learning and instruction: Taking advantage of technology to promote knowledge integration. New York, NY: Routledge. Linn, M. C., & Hsi, S. (2000). Computers, teachers, peers: Science learning partners. Mahwah, NJ: Lawrence Erlbaum Associates. *Linn, M. C., Lee, H. S., Tinker, R., Husic, F., & Chiu, J. L. (2006). Teaching and assessing knowledge integration in science. Science, 313, 1049–1050. Liu, X. (2006). Effects of combined hands-on laboratory and computer modeling on student learning of gas laws: A quasi-experimental study. Journal of Science Education and Technology, 15(1), 89–100. Lowe, R. (2004). Interrogation of a dynamic visualization during learning. Learning and Instruction, 14, 257–274. Manlove, S., Lazonder, A. W., & de Jong, T. (2009). Collaborative versus individual use of regulative software scaffolds during scientific inquiry learning. Interactive Learning Environments, 17(2), 105–117. Mayer, R. E. (Ed.). (2005). Cambridge handbook of multimedia learning. New York, NY: Cambridge University Press. McElhaney, K. W., & Linn, M. C. (2011). Investigations of a complex, realistic task: Intentional, unsystematic, and exhaustive experimenters. Journal of Research in Science Teaching, 48(7), 745–770. Metcalf, S. J., & Tinker, R. F. (2004). Probeware and handhelds in elementary and middle school science. Journal of Science Education and Technology, 13(1), 43–49. National Research Council. (2000). Inquiry and the national science education standards: A guide for teaching and learning. Washington, DC: National Academy Press. *National Research Council. (2011). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academy Press. Nelson, B., Ketelhut, D. J., Clarke, J., Bowman, C., & Dede, C. (2005). Design-based research strategies for developing a scientific inquiry curriculum in a multi-user virtual environment. Educational Technology, 45(1), 21–27. Pallant, A., & Tinker, R. F. (2004). Reasoning with atomic-scale molecular dynamic models. Journal of Science Education and Technology, 13(1), 51–66. Papaevripidou, M., Constantinou, C. P., & Zacharia, Z. C. (2007). Modeling complex marine ecosystems: An investigation of two teaching approaches with fifth graders. Journal of Computer Assisted Learning, 23(2), 145–157. Papert, S. (1991). Situating constructionism. In I. Harel & S. Papert (Eds.), Constructionism. Norwood, NJ: Ablex Publishing. *Papert, S. (1996). An exploration in the space of mathematics educations. International Journal of Computers for Mathematical Learning, 1(1), 95–123. Parnafes, O. (2007). What does fast mean? Understanding the physical world through representations. The Journal of the Learning Sciences, 16(3), 415–450. Passmore, C., & Stewart, J. (2002). A modeling approach to teaching evolutionary biology in high school. Journal of Research in Science Teaching, 39, 185–204. Penner, D. E. (2001). Cognition, computers, and synthetic science: Building knowledge and meaning through modelling. Review of Research in Education, 25, 1–37. Penner, D. E., Gilles, N. D., Lehrer, R., & Schauble, L. (1997). Building functional models: Designing an elbow. Journal of Research in Science Teaching, 34(2), 125–143. 
Perkins, K., Adams, W., Dubson, M., Finkelstein, N., Reid, S., Wieman, C., et al. (2006). PhET: Interactive simulations for teaching and learning physics. The Physics Teacher, 44(1), 18–23.
Podolefsky, N. S., Perkins, K. K., & Adams, W. K. (2010). Factors promoting engaged exploration with computer simulations. Physical Review Special Topics—Physics Education Research, 6, 020117-1-11. Quintana, C., Zhang, M., & Krajcik, J. (2005). A framework for supporting metacognitive aspects of online inquiry through software-based scaffolding. Educational Psychologist, 40(4), 235–244. *Reiser, B. J. (2004). Scaffolding complex learning: The mechanisms of structuring and problematizing student work. Journal of the Learning Sciences, 13(3), 273–304. Schwartz, R. S., & Lederman, N. G. (2005, April). What scientists say: Scientists' views of models. Paper presented at the Annual Conference of National Association for Research in Science Teaching, Dallas, TX. Schwarz, C. V., Reiser, B. J., Davis, E. A., Kenyon, L., Acher, A., Fortus, D., et al. (2009). Developing a learning progression for scientific modeling: Making scientific modeling accessible and meaningful for learners. Journal of Research in Science Teaching, 46(6), 632–654. Schwarz, C. V., & White, B. (2005). Meta-modeling knowledge: Developing students' understanding of scientific modeling. Cognition and Instruction, 23(2), 165–205. Sell, K. S., Herbert, B. E., Stuessy, C. L., & Schielack, J. (2006). Supporting student conceptual model development of complex Earth systems through the use of multiple representations and inquiry. Journal of Geoscience Education, 54(3), 396–407. Shen, J. (2010). Nurturing students' critical knowledge using technology-enhanced scaffolding strategies in science education: A conceptual framework. Journal of Science Education and Technology, 19(1), 1–12. *Shen, J., & Confrey, J. (2007). From conceptual change to constructive modeling: A case study of an elementary teacher in learning astronomy. Science Education, 91(6), 948–966. Shen, J., & Confrey, J. (2010). Justifying alternative models in learning the solar system: A case study on K-8 science teachers' understanding of frames of reference. International Journal of Science Education, 32(1), 1–29. Shen, J., & Linn, M. C. (2011). Connecting scientific explanations and everyday observations: A technology enhanced curriculum on modeling static electricity. International Journal of Science Education, 33(12), 1597–1623. Simpson, G., Hoyles, C., & Noss, R. (2006). Exploring the mathematics of motion through construction and collaboration. Journal of Computer Assisted Learning, 22, 114–136. Sins, P. H. M., Savelsbergh, E. R., van Joolingen, W. R., & van Hout-Wolters, B. H. A. M. (2009). The relation between students' epistemological understanding of computer models and their cognitive processing on a modelling task. International Journal of Science Education, 31(9), 1205–1229. Slotta, J. D., & Linn, M. C. (2009). WISE science: Inquiry and the internet in the science classroom. New York, NY: Teachers College Press. Songer, N. B. (2007). Digital resources versus cognitive tools: A discussion of learning science with technology. In S. K. Abell & N. G. Lederman (Eds.), Handbook of research on science education. Mahwah, NJ: Lawrence Erlbaum Associates Publishers. Stern, L., Barnea, N., & Shauli, S. (2008). The effect of a computerized simulation on middle school students' understanding of the kinetic molecular theory. Journal of Science Education and Technology, 17(4), 305–315. Stratford, S. J., Krajcik, J., & Soloway, E. (1998). Secondary students dynamic modeling processes: Analyzing, reasoning about, synthesizing, and testing models of stream ecosystems.
Journal of Science Education and Technology, 7(3), 215–234. Tobin, K. (Ed.). (1993). The practice of constructivism in science and mathematics education. Washington, DC: AAAS Press.
Tomasi, J. (1988). Models and modeling in theoretical chemistry. Journal of Molecular Structure (THEOCHEM), 179, 273–292. White, B. (1993). ThinkerTools: Causal models, conceptual change, and science education. Cognition and Instruction, 10(1), 1–100. Wieman, C., Adams, W. K., Loeblein, P., & Perkins, K. K. (2010). Teaching physics using PhET simulations. The Physics Teacher, 48(4), 225–227. *Wieman, C., Adams, W. K., & Perkins, K. K. (2008). PhET: Simulations that enhance learning. Science, 322, 682–683. Wilensky, U., & Rand, W. (2009). An introduction to agent-based modeling: Modeling natural, social and engineered complex systems with NetLogo. Cambridge, MA: MIT Press. Wilensky, U., & Reisman, K. (2006). Thinking like a wolf, a sheep, or a firefly: Learning biology through constructing and testing computational theories—An embodied modeling approach. Cognition and Instruction, 24(2), 171–209. *Wilensky, U., & Resnick, M. (1999). Thinking in levels: A dynamic systems perspective to making sense of the world. Journal of Science Education and Technology, 8(1), 3–19. Windschitl, M., Thompson, J., & Braaten, M. (2008). Beyond the scientific method: Model-based inquiry as a new paradigm of preference for school science investigations. Science Education, 92(5), 941–967.
Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33–35. Wu, H.-K. (2010). Modeling a complex system: Using novice-expert analysis for developing an effective technology-enhanced learning environment. International Journal of Science Education, 32(2), 195–219. Wu, H.-K., Krajcik, J. S., & Soloway, E. (2001). Promoting understanding of chemical representations: Students' use of a visualization tool in the classroom. Journal of Research in Science Teaching, 38(7), 821–842. Xie, C. (2010). Computational experiments for science and engineering education. Retrieved August 28, 2011, from mw.concord.org/modeler/articles/computational_experiment.pdf Xie, Q., & Tinker, R. (2006). Molecular dynamics simulations of chemical reactions for use in education. Journal of Chemical Education, 83(1), 77–83. *Xie, C., Tinker, R., Tinker, B., Pallant, A., Damelin, D., & Berenfeld, B. (2011). Computational experiments for science education. Science, 332(6037), 1516–1517. Zhang, B., Liu, X., & Krajcik, J. S. (2006). Expert models and modeling processes associated with computer-modeling tool. Science Education, 90(4), 579–604.