Designing for Science Learning and Collaborative Discourse


Todd Shimoda¹, Barbara White², Marcela Borge³, & John Frederiksen⁴

¹Pomona College; ²The Graduate School of Education, University of California at Berkeley; ³The Center for Online Innovations in Learning, College of Information Sciences and Technology, Pennsylvania State University; ⁴The College of Education, University of Washington

[email protected]; [email protected]; [email protected]; [email protected]

ABSTRACT
A prototype Web-based environment, called the Web of Inquiry, was developed that built on previous work in science learning and technology. This new system was designed to meet constructivist learning principles, support self-reflection, and meet specific interaction goals within the classroom environment. The system was tested in fifth, sixth, and seventh grade (ages 10-13) classrooms. Mixed-methods results suggest that the system met many of the initial design goals; they also identify areas that could be improved in future iterations of the system.

Categories and Subject Descriptors
I.7.2, D.3.2 [Programming Languages]: Languages used to build the system.

General Terms
Design, Experimentation, Measurement.

Keywords
Science Education, Learning Technologies, Expert Systems, Interaction Design, Child-Centered Design, Design-Based Research.

1. INTRODUCTION
Presently, a movement is underway to restructure how science is taught to young students. The National Research Council [22] has emphasized the need to develop students' abilities to understand and reason about everyday phenomena in ways similar to those of scientists. They propose a new framework for teaching K-12 science that emphasizes core ideas and practices over the traditional breadth of knowledge. The NRC argues that the lack of interest in science and engineering is likely due to the substandard experiences students have with science in K-12. To remedy this problem, they propose restructuring teaching and assessment methods in K-12 science, emphasizing a more student-centered, practice-oriented, in-depth approach to science instruction. Though we are proponents for

transforming science education to focus on the development of students' scientific reasoning and their ability to apply this form of higher order thinking to everyday experiences, there are challenges to implementing this new framework. Science practice is a complex and multifaceted endeavor that presents particular challenges for younger students. For example, adolescents may not be capable of understanding multivariate causality or other important aspects of scientific inquiry without proper guidance [20]. It is also unlikely that students will develop the type of higher order thinking processes required to conduct high quality scientific inquiry without some form of formal instructional support [19, 21]. The researchers who present these findings emphasize that they should not be mistaken for an argument against young children carrying out scientific inquiry; rather, they underscore the importance of developing ways to properly support children's cognition during these activities [20]. They highlight the need to articulate a set of cognitive competencies and set them as concrete learning goals and outcomes of instruction. Otherwise, they state, "In the absence of an explicit sequence of this nature, inquiry learning risks becoming a vacuous practice— one embraced without clear evidence of the cognitive processes or outcomes that it is likely to foster" [20, p. 520].

What these findings indicate is that there is a need to develop tools to support students' scientific exploration. As science classrooms move toward emphasizing student-centered curricula, we must also develop ways to support the higher order thinking processes inherent to scientific practice. Given that so many adults lack the higher order thinking processes associated with science inquiry [21], such tools are likely to be just as helpful for teachers as for students and could also help to increase the quality of science instruction for a wider range of students.

The development of technological tools presents both promise and challenges toward meeting these needs. New software could aid teachers by supplementing their knowledge and helping to guide activities, but this same technology can lead to another set of problems, the most likely being improper technology use, implementation, and design. Technology has the potential to change how teachers and students perceive the practice of science, the artifacts they create as they work to understand science practice, the rules they set for evaluating scientific ideas, and their entire classroom community. This vision is the basic idea behind activity theory and the power of technological tools [9, 23]. Unfortunately, when it comes to supporting instruction, technology has a history of not being well integrated in formal classroom environments [7]. This is due to many reasons, a

common one being the perception that the intent of technology is to replace the teacher [7].


Table 1. Inquiry Learning Model and Constructionist Principles
(rows: Inquiry Learning Factors; columns: Constructionist Principles for Educational Technology)

Inquiry Cycle
• Learner Control: Students can move forward & backward in the cycle
• Scaffolded Advice, Tools: Students are given task-specific but not restrictive instructions
• Individual Differences: Students can be task or knowledge oriented
• Tasks and Projects: Students complete reports & other inquiry artifacts, including data graphing

Cognitive & Social Factors
• Learner Control: Students can select the types of tools they feel are best for a task
• Scaffolded Advice, Tools: Students are given context-specific advice for completing tasks or problem solving
• Individual Differences: Students have multiple ways of getting advice and feedback and for communicating
• Tasks and Projects: Students create archived and viewable threaded discussions and feedback

Metacognitive Factors
• Learner Control: Students have anytime access to feedback and assessment items
• Scaffolded Advice, Tools: Students can determine the best type of advice or tool for their level of understanding
• Individual Differences: Students can use alternate ways of understanding their progress and learning
• Tasks and Projects: Students complete blogs (web logs) of their thinking and self-assessments

The goal of our learning system was not to replace teachers, but to supplement instructional practice with cognitive supports and added opportunities for scientific discourse. We envisioned that our system would provide teachers and students with tools to supplement their own knowledge and help to increase science discourse in the classroom. In keeping with recent literature on learning and cognition, which builds on the theories of Lev Vygotsky, Seymour Papert, Ann Brown, and Jean Lave, science learning technology should help students to think about and control their own learning while providing them with examples of science concepts and inquiry practices. In this way, students can develop the ability to think about and evaluate scientific concepts and inquiry processes on their own and as part of a community of learners [3].

This paper articulates the design and evaluation of a system built to support science learning. The system was developed and refined through the collaboration of teachers, students, and cross-disciplinary researchers. In this paper, we describe the rationale underlying the design of the system to support inquiry science, present findings from classroom implementations, discuss implications for future iterations of the system, and expand the picture first painted by Collins and Halverson [7] of the future of educational interaction design.

1.1 Foundations of the current technology
In previous work, White and Frederiksen [29, 30] proposed a conceptual environment that would support children's thinking and understanding as they engaged in the practice of inquiry and would bridge the gap between science practice and instruction. The concept emerged from several years of working with teachers, researchers, and students, work that revealed the problems and successes involved in learning scientific inquiry. These experiences provided the initial requirements for an inquiry learning support environment. At the heart of the concept is a theory of science instruction: scientific inquiry should be taught as a process consisting of iterative cycles of inquiry, with inherent cognitive, social, and metacognitive factors, and this complex form of learning can be supported with educational technology tools [11, 14, 28, 29, 30]. The inquiry cycle and curriculum should introduce students to the cyclical nature of scientific inquiry, e.g., coming up with theory-based research questions, developing hypotheses, designing and conducting investigations, analyzing data, synthesizing models or theories to explain the data, and evaluating and extending the

models to new situations that would lead to new research questions. Within each of these skills, students should be introduced to inquiry sub-skills, be given opportunities to practice the skills, and be prompted to evaluate their understanding and use of the skills. This evaluation process, termed reflective assessment, pushes students to evaluate their research throughout the inquiry process [1, 12, 27], as well as to receive feedback from peers and teachers. Our hypothesis was that reflective assessment and feedback would help students develop a better understanding of the purpose and practice of inquiry. It would also help students develop metacognitive knowledge and habits of monitoring and reflecting on their work and understanding. Reflective assessment should be especially important for low-achieving students, who tend to lack metacognitive skills [5]. Studies have shown that this form of reflection can improve inquiry learning and is particularly beneficial for low-achieving students [29].

1.2 Designing a research-based and principle-based inquiry science support environment
An educational technology prototype, called Inquiry Island, was developed and tested in classroom settings to support the model of learning inquiry described in the previous section [10, 11, 25, 26]. The functional design of the technology used constructionist principles based on the work of Seymour Papert [24]. Constructionism is based on constructivism, a framework that emphasizes learners actively building knowledge from their existing knowledge. Constructionist principles include: (a) giving students as much control over their learning as possible, (b) providing flexible and adjustable scaffolding (learning help) in the form of advice and tools for learning, (c) accounting for individual differences such as learning goals or advice preferences, (d) providing students with projects and other artifacts to complete, (e) emphasizing self and peer assessment, and (f) enabling teachers to understand, scaffold, and assess inquiry. Inquiry Island was designed to promote activities and interact with students in ways that would support three critical aspects of inquiry learning: the inquiry cycle, cognitive and social capabilities, and metacognitive capabilities. Designing the environment to facilitate the development and use of these capabilities should provide the interactions and cognitive supports students need to engage in self-regulated science learning

[4, 20]. Table 1 summarizes the constructionist principles for designing a system that supports students' learning and practicing the critical aspects of inquiry science.

Figure 1: Screen shot of one advisor, Ivy Investigator, and an excerpt from the definition of "investigate". The advisor also provides concrete examples, but these are not shown.

Though the initial prototype showed promise [11, 26], we identified additional student and teacher needs. Our students needed more collaborative supports: they wanted to share ideas with others, but did not know how to ask for help or feedback in person. In addition to student support, the system needed to better support the teaching of science inquiry learning. Our teacher participants helped us to recognize that teachers needed immediate access to activities and online support to help them guide learning during the inquiry process. The teachers also wanted access to the students' work and assessments in order to provide feedback and be able to engage in discussions about that work. Lastly, key features were yet to be designed, developed, and integrated into the system: data analysis and search tools, and group awareness and collaborative discussion features.

Evaluation of the system also needed to take place in a more realistic educational setting. Our initial trials took place in small classroom settings with one or two teachers who were actively participating in the development of the system. This limited our ability to conduct a more comprehensive examination of the system's ability to support specific aspects of the inquiry process with teachers who were not involved in its development. The challenge became to develop a usable, flexible, and principle-based environment where students could carry out and assess inquiry projects, while at the same time providing teachers with features to efficiently support their students. This system needed to be tested in multiple classrooms, with multiple teachers, in order to evaluate its ability to support inquiry learning and collaborative peer-to-peer and teacher-to-student interactions in urban and suburban classroom settings.

2. WEB OF INQUIRY
The most current iteration of the technological system is implemented in an online environment and is called the Web of Inquiry.

The Web of Inquiry provides spaces and tools for teachers to create and score science inquiry projects, for students to undertake and self-assess science inquiry projects, and for researchers to analyze project and assessment outcomes. The target audience for the system was science classes in grades 5-7 (ages 10 to 13). The Web of Inquiry is unique in that it includes a suite of advisors and an advisor editor, a project template builder, a project workspace, a set of inquiry tools, and a project assessment workspace. Features in the system were also designed to support the higher-order thinking processes that underlie scientific inquiry [20, 29]. These added supports allow teachers and students to have more control over learning content and processes than is common in formal learning systems. The system also provides students and teachers with opportunities for new forms of social interaction and discourse in classroom settings by promoting a classroom environment where both teachers and students have access to assessment tools, to student understanding via reflective tools, and to simulated expert advice to support scientific inquiry. These features were seamlessly integrated into one online system that tracks student activity and progress. Well-informed learning technology such as this can enhance and support the cognitive abilities of children by providing an innovative interactive learning environment [18].

The four main user types are teachers, researchers, individual students, and student project teams. When logged in, users are taken directly to their home page, which presents the links and workspaces available to that user. On their home page, all users can update personal information, which varies according to user type. Teachers and researchers can create or modify project templates, set up and administer a class, set up projects for that class, respond to student discussions, and view and score projects. Teachers and researchers can also create and modify inquiry advice and tool help. Students can sign up for a class, start a project, and go to projects they have previously started. Project teams can go to the tasks and subtasks for their project, see advice, use inquiry tools, and communicate with other teams, including giving feedback on their projects. Detailed descriptions of the main environment modules are provided in the following sections.

2.1 Inquiry advisors and advice
The Web of Inquiry advice system includes task advisors based on the inquiry cycle, as well as cognitive, social, and metacognitive advisors. Advice includes goals, strategies, motives, examples, concepts, and plans. Each advisor has a top-level page with links to the main categories of advice, as well as an "autobiographical" About Me. The About Me narrative presents a personal dialogue about the skill incorporated in the advisor. Figure 1 shows an example of the advice format. An advisor editor allows advisors to be created and modified. Rules for establishing when advisors or advice are displayed are created in the project template builder.
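To make the advice-display mechanism concrete, the following sketch shows one way such template-defined rules could be represented and evaluated. This is an illustrative reconstruction in Python, not the system's actual ColdFusion implementation; the AdviceRule fields, the advisors_for_task helper, and the advisor name Quentin Questioner are our assumptions (only Ivy Investigator is named in Figure 1).

from dataclasses import dataclass

@dataclass
class AdviceRule:
    # Hypothetical template-defined rule: when to show an advisor's advice.
    advisor: str    # e.g., "Ivy Investigator" (named in Figure 1)
    task: str       # inquiry-cycle task the rule applies to
    category: str   # goals, strategies, motives, examples, concepts, or plans

def advisors_for_task(rules, task):
    # Return the rules whose advice should be displayed for the current task.
    return [r for r in rules if r.task == task]

# A project template might register rules like these (names are illustrative):
rules = [
    AdviceRule("Ivy Investigator", task="investigate", category="strategies"),
    AdviceRule("Quentin Questioner", task="question", category="examples"),
]

for rule in advisors_for_task(rules, "investigate"):
    print(rule.advisor, "offers", rule.category, "advice")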

2.2 Project workspace and tools
Students record their work on science inquiry projects in the project workspace. As students work on a task or subtask, they have access to specific advice selected for the context. Figure 3 is a screen shot of the project instructions, project work, navigation, advisors, advice, and tool icons.

For each task or subtask, students or teams can enter and modify text. Also displayed in the project workspace are a team's links to non-text media, including images, video, and sound. Other features of the workspace include advice icons that link to relevant advice. The advice title is displayed when rolling the cursor over an icon; for example, rolling over "Questions" will

display "Questions you might ask yourself." Clicking on the main and secondary advisors for the task opens the advisor's top-level page. A navigation list allows students to link to other tasks or subtasks, or to go to higher-level pages such as the class home page.

A set of tools helps students complete tasks and assess their work. The tools currently available include:
• Thinker Tool (provides a blog for brainstorming ideas)
• Progress Chart (tracks a team's progress in the inquiry process and in completing assessment items)
• Discussion Space (allows threaded discussions with other teams)
• Project Reporter (builds a printable report of student work)
• Data Entry and Table Builder (allows teams to create variables, enter data, and build data tables)
• Graph Builder (helps teams create bar or line graphs from data tables)
• Dictionary (provides a glossary of inquiry concepts)
• Advice Search (allows students to search all advice)

2.3 Assessment workspace and project scoring
The assessment workspace contains the self-assessment items. These items are created using the template builder to accompany project workspaces, including those for each task and subtask. The main instructions are displayed for each assessment item. When students work on the item, the rubric items or a text entry box is displayed. Advice icons are available for an assessment item, as well as for each rubric item in radio button or checkbox formats, if established in the project template builder. When students have previously completed an item, the previous selection or text is displayed, and students can modify the previous assessment as they revise their work. As students complete self-assessment items, the results also become available in the Progress Chart.

Teachers and researchers can score projects through a "project scoring" link found on the teacher and researcher home pages. Scoring is accomplished through an assessment tool that displays all assessment items, either together or grouped by type (i.e., goal, standards, analytic, or open). Assessment items are not necessarily the same for teams as for teachers and researchers, as each may have different items created in the project template builder. After scoring, data from the assessments can be selected and formatted for use in statistical software.

Figure 3. Screen shot of the project work and assessment spaces.

2.4 Technical Specifications
The Web of Inquiry was constructed using HTML, CSS, JavaScript, ColdFusion, and SQL Server. There were approximately forty separate web pages using HTML and CSS for layout and formatting, with some JavaScript for modifiable elements such as rollover effects. ColdFusion is middleware that uses the ColdFusion markup language and a server to provide the interactivity of the environment and to communicate with the database. The ColdFusion platform had previously been used to create a content management system for the software advisors' content, and it also provides for quick deployment using pre-developed functions such as graphing and search indexing. The database architecture was designed to capture student work and assessments, as well as to allow flexibility in creating project templates, class projects, advisor calls, assessments, and calls to other features such as graphical elements. The database also recorded teacher and researcher assessments. As a result, datasets of all work, assessments, and system use logs, filtered by class, teacher, or student, can be created for statistical analyses.
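As a rough illustration of the data capture and export just described, the sketch below shows how logged work, assessment, and system-use records might be filtered by class and record type and flattened into a file for statistical software. This is a minimal Python sketch under assumed field names, not the actual SQL Server schema or ColdFusion code.

import csv

# Hypothetical log records of the kind the database captured: student work,
# self/teacher assessments, and system-use events (field names are assumptions).
records = [
    {"class_id": "5A", "student": "s01", "kind": "assessment", "item": "hypothesize", "score": 3},
    {"class_id": "5A", "student": "s01", "kind": "use_log", "item": "advice_view", "score": None},
    {"class_id": "6B", "student": "s14", "kind": "assessment", "item": "investigate", "score": 4},
]

def export_dataset(records, path, class_id=None, kind=None):
    # Filter records by class and record type, then write a flat CSV
    # suitable for loading into statistical software.
    rows = [r for r in records
            if (class_id is None or r["class_id"] == class_id)
            and (kind is None or r["kind"] == kind)]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["class_id", "student", "kind", "item", "score"])
        writer.writeheader()
        writer.writerows(rows)

export_dataset(records, "class_5A_assessments.csv", class_id="5A", kind="assessment")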

3. EVALUATING THE SYSTEM
Our primary objectives for the system were for it to serve as a resource for teachers and students to guide and evaluate three critical aspects of scientific inquiry learning: the inquiry cycle, cognitive and social factors, and metacognitive factors. We envisioned that it would not replace teachers but rather provide them with added resources to supplement their expertise and knowledge of science. We also envisioned that the interactive formative assessments would model for students the types of questions they should ask themselves when working on different components of the inquiry cycle. In this section, we evaluate the system in two ways, guided by the following research questions: (RQ1) Could use of the system help students improve their understanding and application of inquiry concepts and processes? (RQ2) Did the system aid in facilitating student-teacher discourse and in supplementing collaborative inquiry learning?

3.1 Population
Over the course of three years, seven teachers in California and Washington State used the Web of Inquiry as part of fifth, sixth, and seventh grade science classes (ages 10-13). A total of four hundred students participated in the trial use of the Web of Inquiry. The participating classes represented a range of school district profiles in the Northwest and California. Five public schools (four urban and one suburban) and two private schools (one urban and one suburban) were included in the study. Students represented a mix of ethnicities and socioeconomic backgrounds. Some of the students had prior instruction on inquiry methods and some were unfamiliar with scientific inquiry, but none of the students in the subset had previously used the Web of Inquiry. Most of the teachers in the project were familiar with inquiry and had experience engaging students in inquiry projects prior to the introduction of the Web of Inquiry. Data collected included pre-post inquiry tests, videotapes, observations, and field notes. The students' work, assessments, and other artifacts were maintained in the Web of Inquiry database. Teacher and student interviews were also conducted and recorded.

3.2 Implementation
Teachers chose to introduce the system in different ways. Some chose to walk students through the work and assessment spaces, advisors, and tools. Others introduced students to the general concepts of inquiry by doing a simple inquiry project while modeling use of the system during a whole-class activity.

3.3 Specific projects
Teachers used the Web of Inquiry for several different projects during the initial trials. These spanned the content areas of earth and ecological science, physical and materials science, biology, behavioral science, and health and nutrition. Students used the Web of Inquiry to document their entire inquiry process. In the system, students worked on each part of the inquiry cycle with intelligent advisors that explained important ideas and provided concrete examples of science concepts and practices. Each part of the inquiry cycle was also supported by reflective assessments, where students stopped to reflect on the quality of their work based on accepted scientific criteria [29, 30]. Students entered their work into the system, and the system then pieced together a report based on their inputs.

3.4 Design-based research methods
A mixed-methods approach was used to evaluate the Web of Inquiry as a means to help students learn about inquiry and discuss their thinking with peers and teachers. The overall methodology was design-based research: a method used to evaluate the potential of an educational technology, grounded in theory or prior work, in real classroom settings [2, 8]. Prominent educational researchers have argued that controlled experiments in classroom settings are neither realistic nor generalizable and propose the design-based approach as an alternative that better meets the needs of designers and users when developing tools for use in formal classroom settings [2, 8]. For this reason, our evaluation of this system was not intended to be a controlled, empirical study, but rather a feasibility study of the usefulness of the Web of Inquiry in natural classroom environments that include a mix of teachers, students, curricula, and implementations.

3.5 Evaluating the system as a learning tool
A representative subset of the total project participants, consisting of 88 students in five classrooms, was used to assess learning gains. A validated test, called the Inquiry Test [13], was administered in classrooms that had not previously used the system. The Inquiry Test has a high level of internal consistency reliability (α = .90) and measures student understanding of core concepts and methods of scientific inquiry [13, 14, 16, 17, 29, 30], the concepts and methods modeled and supported by the current system. Prior studies conducted with the Inquiry Test also show that one does not get significant learning gains simply by taking the test twice [29].
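For reference, internal consistency reliability of the kind reported for the Inquiry Test is conventionally computed as Cronbach's alpha. The following minimal sketch computes it from an item-score matrix; the scores shown are made up for illustration and are not data from the test.

import numpy as np

def cronbach_alpha(items):
    # Cronbach's alpha for an (n_respondents x n_items) score matrix.
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Toy matrix: 5 students x 4 test items (made-up numbers).
scores = [[3, 2, 3, 3], [4, 4, 3, 4], [2, 2, 1, 2], [4, 3, 4, 4], [3, 3, 3, 2]]
print(round(cronbach_alpha(scores), 2))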

There are three general parts to assessing a student's Inquiry Test performance. The first part is a detailed analysis that covers each section of our model of inquiry, discussed in the next paragraph. The second part consists of an overall assessment of how well students understand important cognitive and social goals, which include science understanding, reasoning carefully, and inventiveness. The third part of the assessment is a coherence score that measures how well the different parts of a student's responses built on and related to their answers to previous parts of the test.

The detailed analysis evaluates understanding of the five inquiry concepts from our proposed model of inquiry: (a) developing hypotheses, (b) designing investigations, (c) analyzing data, (d) synthesizing models or theories to explain the data, and (e) evaluating and extending the models to new situations that would lead to new research questions. Students were required to develop alternative hypotheses about possible answers to a research question given on the test and to design an investigation that would collect the appropriate data. Using their mock data, they were required to draw a conclusion and relate it to their hypotheses. They were then required to judge the limitations of their investigation and describe the overall usefulness of their inquiry results.

Students individually took the pretest before they used the Web of Inquiry and took the posttest after they finished their projects using the Web of Inquiry. Two of the classrooms completed one Web of Inquiry project, and three completed two projects. The tests were blind-scored, with no information as to student identity or time of test. Scorers assessed each of the test sections while reading through the students' responses, and they made their overall assessments after completing that analysis.

Along with the pretest, students completed a written assignment to determine baseline written communication ability, which could affect their Inquiry Test results. This measure was added to ensure that learning gains on the Inquiry Test were not due to overall improvement in communication ability. The writing tests were scored to provide a communication score, used in the data analysis. The correlations of the communication score with standardized test scores for English proficiency were 0.418 for grade 6 and 0.684 for grade 7 (ages 11-13). Standardized test scores were not available for fifth graders (ages 10-11).

3.6 Evaluating Interaction Design
The primary interaction design goals reported in this paper relate to the system's ability to support collaborative inquiry learning and create opportunities for scientific discourse in the classroom. From this standpoint, the goal was not to replace the teacher or discussions with classmates, but to create opportunities for students and teachers to interact with the system in ways that could enhance the types of discussions they had around scientific inquiry and what they could accomplish. We did not focus on more traditional usability testing, as much of this had been carried out as part of the original prototype evaluation. Another important design goal was for the system to supplement student cognition. For this interaction goal, we looked at the extent to which students used the advisors and cognitive tools we provided as a means to guide their work. To evaluate the utility of the system to meet these goals, we relied primarily on video and field notes captured during implementation of the Web of Inquiry during formal classroom instruction. The videos were transcribed, logged, and connected to related field notes. These transcripts were then categorized according to the

extent to which they were connected to the different requirements listed in Table 1. A discourse analytic approach was used as a means to analyze the findings and draw conclusions [6, 15, 17].

4. RESULTS
4.1 Evaluating the System as a Learning Tool
Pretest and posttest Inquiry Test results were used to determine whether use of the Web of Inquiry in classroom settings was associated with development of students' understanding of, and ability to apply, the inquiry science concepts it supported. The results also helped to identify which parts of the inquiry cycle were most problematic for students. As a reminder, students took the pretest prior to having used the system and took the posttest immediately after using the system.

A repeated measures ANOVA was conducted on the total Inquiry Test score. The between-subjects factor in the analysis was Teacher and the within-subjects factor was Time of Test (pretest versus posttest). Students' communication scores were included as a covariate. The analysis of total score showed significant main effects for Teacher, F(4, 87) = 7.8, p < .001, and Time of Test, F(1, 87) = 6.6, p = .006, such that posttest scores (mean = 26.8) were higher overall than pretest scores (mean = 24.8). There was also a significant main effect of the covariate, communication score, F(1, 87) = 47.2, p < .001; students with higher communication ability scores also had higher inquiry total scores. However, there was no significant interaction of communication with either Time of Test or Teacher. An additional repeated measures ANOVA was conducted without communication scores as the covariate; the results were not substantially different from those for the analysis that included the covariate.

A repeated measures ANOVA was also conducted for the coherence score. There was a significant effect for Time of Test, F(1, 87) = 7.0, p = .005, with posttest scores (mean = 7.63) higher than pretest scores (mean = 6.78) and an effect size of .30.

To determine which aspects of the inquiry cycle were most and least problematic, a repeated measures ANOVA for each part of the Inquiry Test was conducted. Most of the areas showed significant improvement, including hypothesis, investigate, model, and coherence between areas. The only two areas that did not show significant improvement were analyze, F(1, 87) = 1.5, and evaluate, F(1, 87) = .7. These two aspects of the inquiry cycle were also the lowest scoring areas for students overall. There were also significant effects for science understanding, reasoning carefully, and inventiveness. The largest increase was in students' overall science understanding, F(1, 87) = 12.9, with posttest scores (mean = 2.91) higher than pretest scores (mean = 2.5) and a corresponding effect size of .40.

Overall, the results suggest that there is formative value in teaching these inquiry concepts with the Web of Inquiry, but they also reveal potential improvements that could be made to the system. For example, although students seemed to improve their understanding of inquiry concepts and practice overall, the extent to which students developed these abilities depended on individual teacher interactions. For this reason, we wanted to further explore how students used the system and to characterize interactions between teacher, student, and technology in order to see if the envisioned interaction design goals were being met.
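For readers who want to see the shape of this analysis, the sketch below approximates the reported design (Time of Test within subjects, Teacher between subjects, communication score as a covariate) using a mixed-effects model in Python. The data frame and its values are placeholders rather than the study's data, and a mixed model is a stand-in for the repeated measures ANOVA actually used, not a reproduction of it.

import pandas as pd
import statsmodels.formula.api as smf

# Placeholder long-format data: one row per student per test occasion.
df = pd.DataFrame({
    "student": ["s1", "s1", "s2", "s2", "s3", "s3", "s4", "s4", "s5", "s5", "s6", "s6"],
    "teacher": ["A", "A", "A", "A", "A", "A", "B", "B", "B", "B", "B", "B"],
    "time": ["pre", "post"] * 6,
    "communication": [10, 10, 14, 14, 9, 9, 12, 12, 11, 11, 13, 13],
    "score": [22, 25, 27, 29, 20, 24, 25, 27, 23, 26, 24, 28],
})

# A random intercept per student captures the repeated measure; Time of Test,
# Teacher, and the communication covariate enter as fixed effects.
result = smf.mixedlm("score ~ time * teacher + communication",
                     df, groups=df["student"]).fit()
print(result.summary())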

4.2 Evaluating Interaction Design
Analysis of field notes and transcripts suggests that the system succeeded in meeting its initial interaction design goals: providing opportunities for collaborative science discourse and reasoning about the underlying thought processes associated with the practice of scientific inquiry [17]. Additionally, the system was able to support each of the three factors we identified as critical aspects of inquiry learning: inquiry cycle tasks, cognitive and social factors, and metacognitive factors. This learning occurred as part of students' use of the system and as part of discussions facilitated by the system. Data logs in the system showed frequent use of advisors and reflective tools. Video data showed high frequencies of students meaningfully interacting with the system. Video and field notes support the claim that the system provided children with a source of expertise and control over their own learning while working alongside the teacher to enhance learning opportunities. This was one of our objectives for the system.

As the students worked with the system, they developed an understanding of the steps necessary to work through an inquiry project. The system features guided students as they developed their projects, introducing them to the language of science as well as guiding thoughtful practice. The following excerpt illustrates how system features support the learning of important cognitive and social aspects of inquiry when young students work together. In this excerpt, two students are working on a solar oven project. The project goals are for students to learn about heat transfer with different materials, angles of the panels, and other variables. The inquiry involves changing variables and measuring differences in temperature. The students have finished collecting data and are using the Web of Inquiry to help them reason about and explain a theory they have developed from their data.

1. S2: OK, now what do we do?
2. S1: (Looks at system) Now we do explanation for our theory, we're basically….oh crap, we didn't save our current best theory. We didn't save it.
3. S2: Should we go back and save it?
4. S1: We'll go back and save it. (Continues typing their explanation) 'The reflectors'…
5. S2: just say 'direct.'
6. S1: … 'direct heat and light' (pauses)…
7. S2: Uh, 'depending'…
8. S1: … 'to places depending on'…
9. S2: 'how they're positioned,' I guess?
10. S1: …'how they are positioned'…'So they do help create more heat…'
11. S2: …'by reflecting heat toward…'
12. S1: No, I'm going to say 'by creating more heat…'
13. S2: Well, they don't create more heat, they just get more heat. They reflect it to the right place.
14. S1: 'They reflect more heat in …'
15. S2: Just say 'toward' whatever …the heat would get there…
16. S1: 'They direct more heat to the place'....no...'at a place where it is directed by the angle? No: 'depending on the angle.' We said something else about angle. (Both boys lean in towards the screen.)
17. S2: Um, we don't have to put it all in cuz I think we said some stuff over, just in different ways. So make sure you save it.

These students demonstrate how they use the technology to develop a plausible explanation of their theory, using the Web of Inquiry workspace to enter their thoughts. They depend on the system to guide them along in their thought process. For example, in turn 1, S2 asks what they do next; in turn 2, S1 looks at the system and states that they now have to explain their theory. Through this guided process, the students are carrying out an investigation on, and learning about, solar heat and how it works. In turns 12-16, they work through finding the precise language to explain their findings. The boys are building on each other's ideas and have already developed a theory, but require use of the system to remember what they wrote (turn 16). This example illustrates the extent to which the young students depend on the system to guide their work and store their ideas.

The teacher and system played reciprocal roles in enhancing the learning opportunities for students. Qualitative findings suggest that the system enhanced teachers' awareness of student activities and thinking processes, and teachers coached students to better utilize the system and to extend their thinking around system features [17]. During teacher-student-system interactions, the system served as an anchor for discussions about differing aspects of the inquiry cycle, students' self-assessments of their science practice, and how to get and utilize peer feedback to improve their work. This is important because it suggests that learning gains may not simply be due to differences in teacher instruction, but rather to differing abilities of teachers to use the system and engage in discourse around the system.

The system includes reflective assessments that promote active reflection throughout the project and encourage students to think about what they write as well as their application of inquiry practices. Use of this technology helps students learn the cognitive, metacognitive, and social practices of scientists by introducing the importance of reflection, epistemic criteria, explanation, and peer feedback as part of scientific practice. We present two examples that illustrate some of the types of interactions afforded by having the system as part of the classroom culture and the reciprocal relationship that the system and teacher had in supporting student learning. The system provided different types of collaborative opportunities and allowed students to use collaborative activities to fulfill their individual needs. These excerpts also show how the system supported development of the social and cognitive practices inherent in science and provide concrete examples of mediated discourse between teacher, student, and a technological system.

The system provides the teacher with access to students' self-assessments, which can be used to push students to think about and discuss their own work. We designed the reflective assessment questions in the Web of Inquiry to stimulate metacognitive activities in which students reflect on and evaluate the work they have just done [30]. In this example, a teacher is prompting students to do more relevant kinds of explanation of their self-assessments in their project blog space. The project concerns ideal nutrition and energy levels when performing cognitive functions such as taking a test.

1. T: OK. Well then, in this case, you guys… I would go on to the Blog and looking at your own goal assessments, how you rated yourself on each step… Look at the things that you maybe…did you do that for your…?
2. S1: Yeah we did everything. We just need to…
3. T: So, for example…
4. S2: We rated ourselves so good…
5. T: Well, let's go for accuracy rather than…(reads from the screen:) possible to investigate, our theories…OK—so here's an example where you didn't give yourself the highest. And so maybe this would be, maybe for the Blog, might be something…good. You did all those. OK. For example, maybe this is something you want help with because you didn't give yourself the highest rating. So you can say, 'for our theories…
6. S1: (Interrupts) Well, I don't know how good it…(small laugh.)
7. S2: You know…we're not certain…'?
8. S1: Can you check it—our theories?
9. T: Well, I want you to get help from each other. Um, Peggy, you're not…what did you say? You don't get it?
10. S2: No, well…(hesitates) I don't know how good it is.
11. T: You don't know. OK, so maybe you're saying I'm not sure if it's… I'm not sure if it's good or not. It's not that you think it's not very deep. It's that you're not really sure. Could it be better? Could it be clearer? Could it have more reasons why? Less clear about why. So this is saying you know, you've got your theories, you've got your variables, and how they relate, you know, how one thing affects the other, but maybe you don't have so much about why. This why is-you have energy in your system. Maybe it's not scientific about the energy coming from nutrients, you know, that kind of thing. So I think you're right. So maybe this is something you can talk about in your Blog. OK do you know how to get to the Blog?
12. S1: Yes, I do.
13. S2: Yes.
14. T: OK. So Blog away…I'm gonna move on to the other groups here. And you can say something like: "Some of the things our group thinks we may need help with are…"
15. S1: Um, we need help with our theories?
16. S2: We need help with, why…why we didn't we rate ourselves as good?
17. S1: Why (one student begins to type…)

In turns 1 and 5, the teacher is reading the students' self-assessments in the system, ensuring that they took time to reflect on their work. She questions the consistently high ratings and begins to suggest that it is better for them to be accurate and rate themselves poorly than to incorrectly characterize their work. Then, finding that the students did not rate one area highly ("Is it possible to investigate our theories"), she asks the students to elaborate further on that part of their project (turns 6-10). The students explain that they simply don't know how good their theories are and ask the teacher to evaluate them (turns 6-8). Rather than assess the children's work, the teacher pushes them to ask their classmates and think more deeply about their responses (turn 11). For example, she prompts them with self-reflection questions: "Could it be better? Could it be clearer? Could it have more reasons why?" She also says the blog tool is a good place to put down their ideas (the blog is not seen by students in other teams). In turn 14, the teacher provides a further self-reflection prompt: "Some of the things our group thinks we may need help with are…" With this prompt, the students begin to delve deeper into their self-assessments and think about why they did not rate themselves highly (turns 15-17).

In the next example, a teacher and a student are discussing feedback that the student has received on her project from other teams. The project investigates the effects of colors on appetite. The student is using the discussion tool, a feature in the system that allows students to communicate with others via the system. The format is a threaded discussion intended for feedback, soliciting or giving advice, or other social interactions. A discussion starter is a question or a request for feedback posted by the student. In this case, the student is working alone and evaluates the kind of feedback the other students are providing. The teacher begins this excerpt by asking this fifth grade (age 10-11) student for a progress report.

1. T: Hi, what is happening here?
2. S: I am working on the effect of color on appetite and I'm using the color blue because I have heard it supposed to make people less hungry…or it's supposed to make food look less appetizing. I am working alone this time. It's a little bit easier to work alone because you don't have to discuss with anybody, less distractions.
3. T: What do you do when you get stuck for an idea?
4. S: Ask for help.
5. T: Where are you now in your process?
6. S: I've just finished my detailed plan. I've posted up some feedback for other groups. And now I'm just waiting for some replies to my discussion starter.
7. T: What was your discussion starter?
8. S: Feedback on my project so far? I just kinda wanted to know where some gaps are.
9. T: Any feedback yet?
10. S: I just got two. The first one is being kinda nitpicky: "Not bad. I know I'm being nitpicky but work a little on grammar."
11. S: The next one is better: "You know you have to struggle to read the 'detailed plan.' Sounds like a fun experiment. Yummy. Also I would recommend what Tara says: How are you going to control what people eat for 24 hours? Also, can I be a test subject. Except no controlled diet!"
12. T: Now what will you do?
13. S: I'm going to wait for a few more feedback comments before I go and change anything in my work. So I'll go back and give other people feedback while I am waiting.
14. T: Do you think this feedback so far will cause you to change anything about your experimental design?
15. S: I think I'll end up changing the controlled diet for 24 hours to maybe just what they ate for breakfast.

In turn 2, the student expresses a problem with working in a group because of distractions; she recognizes that her cognitive processes work more efficiently when she is allowed to think without distractions. The teacher asks what the student does when she is stuck, since she doesn't have a teammate to turn to (turn 3). The student says she asks for help (turn 4), a term used for activating cognitive advisors in the system. In turn 5, the teacher asks where the student is in the inquiry cycle. The student responds by stating that she has just finished her detailed plan (turn 6): the space in the system that supports students as they develop an experimental design. Though the student seems to dislike face-to-face discussions, she does see the value in getting feedback via the discussion tool (turn 6). She posted her experimental design in the discussion forum in order to get feedback on its quality from her peers (turn 6). The teacher asks about the discussion starter from the original discussion post (turn 7); the discussion starter is what the system calls the initial question students post in the discussion forum. The student says she asked for feedback, specifically wanting to know if there are any potential problems with her research plan (turn 8). The teacher asks what feedback the student has received thus far and, in turns 10-11, the student evaluates the two responses she received, judging one less substantive than the other. Within the system, another student discusses potential issues associated with the current procedures (turn 11). The student finds the feedback valuable and wants more (turn 13), and decides to evaluate the other students' research plans while waiting. As a result of the feedback, the student plans to change her experimental design (turn 15).

The system has provided this student with added control over her own learning processes, including alternative ways to collaborate with peers. This is similar to the type of flexibility real scientists have; they choose whether to work on a project as part of a lab or alone. Even when choosing to work alone, a scientist receives feedback on their work from peer reviewers or colleagues. The system provides a modified version of such opportunities for young students, as well as a means for them to store their thoughts and assess their work. These features were used, discussed, and informally evaluated by the students. Such findings support the claim that the system aided young students with their cognitive, metacognitive, and socio-cognitive processes.

It was important to our design goals to provide students with alternative ways to learn and interact with classmates. This was why we designed discussion tools to support distributed collaboration. However, our findings also indicate that this type of tool may pose particular problems for younger students and those with reading disabilities. For example, one student pointed out that, due to the difficulty of typing, face-to-face feedback was often richer than computer-mediated feedback: "Well I don't know why, but when they came over here, they tended to be more descriptive about how they felt, than just typing it on the computer. It's probably easier to say stuff than type it." We also found instances where students had difficulty comprehending the textual forms of advice. These will be important considerations as we decide how to improve the system in the next iteration.

5. DISCUSSION
We designed and evaluated the Web of Inquiry in multiple school classrooms with students ages 10-13. There were various limitations to the study associated with design-based research, including lack of control over how teachers used the system and the extent to which students used the advisors and examples in the system. Nonetheless, learning gains, in combination with video of how the students and teachers used the system to support their thinking and discourse, provide evidence of the promise and feasibility of using the Web of Inquiry to support science inquiry learning in formal educational settings.

Even though the system succeeded in supporting many of the intended interactions, the findings also helped to identify some aspects of the system in need of further evaluation and potential modification. Four potential areas for improvement include: increasing teacher awareness of students' reflective assessments, refining activities and supports in the system for analyzing and evaluating data as part of the inquiry cycle, supplementing the system with training modules for teachers, and finding alternatives to text-dependent information and activity design.

The system can provide multiple ways to enhance collaborative learning opportunities for students. One of the ways it enhanced interaction was by increasing teacher awareness of student thinking and evaluation processes; this information facilitated discourse around student comprehension and deeper exploration of science concepts. Given the extent to which teachers referred to these self-reflective tools, we may want to extend this capability by providing teachers with a teacher view that affords immediate access to the ratings of multiple groups. In this way, the teacher can examine student self-assessments and decide which students may be in need of added assistance. Careful consideration will have to be given to how such a change in the system could impact the classroom culture and patterns of interaction between teachers and groups of children.

Findings also indicate that activities in the system intended to support students' comprehensive understanding of the inquiry cycle may need modification. Though results showed significant learning gains in the majority of areas assessed by the Inquiry Test, they also showed no significant gains in analyzing data and evaluating results. These findings suggest that we need to develop additional ways to meet these needs for young students. For example, we may need to add features or examples to the system to support students' understanding of these concepts. Another possibility is that students may need repeated practice (i.e., running more than one inquiry project) to improve their understanding of these important inquiry concepts and practices.

As designers, we need to consider how we can provide opportunities for faster data collection and repeated analysis as part of this system or a new system.

Another avenue for improving the system and designing fruitful interactions around it is to add supports for modeling teacher-student-system discourse. Our findings indicate that the extent of learning gains may depend on the quality of the discourse that exists around use of the system and the concepts it supports and models. Quantitative findings showed that gains varied by teacher, and qualitative findings suggest that this may be due to how well teachers use the system to start discussions on science concepts and practices or to further probe student thinking. In this project, we focused on supporting teachers' ability to assess inquiry projects and understand the system; we did not focus on supporting teacher discourse with the technological tool. Developing teachers' ability to engage in scientific discourse is not standard practice in teacher education programs, but our findings suggest that this may be an important area of needed development. There is some progress being made in this area: researchers are developing curricula and tools to develop student teachers' abilities to facilitate scientific discourse in classroom settings [31]. However, these studies do not currently include developing teachers' abilities to interact with technology as a means to support discourse. Teachers may need to develop this kind of skill in order to support classroom learning with technology. In this way, teachers can make use of powerful learning technologies and further enhance the quality of learning. For example, they can learn how to facilitate critical evaluation of ideas or problems encountered as students work with the system. These types of events can anchor collaborative discourse in the classroom and provide opportunities to model and discuss scientific thinking processes. In order to increase the likelihood of more consistent learning gains, it may be necessary to include training modules for teachers with concrete examples of how teachers can use the system to monitor students' learning and push critical thinking through discourse.

Lastly, our findings indicate that students' level of writing and typing proficiency may impact their ability to use the system. Students with lower overall writing proficiency scored lower on the Inquiry Test than students with higher writing proficiency. Video data also indicated that students with less reading fluency were less able to utilize and comprehend the advice text. There are modifications we can make to the system to address these issues, but there are trade-offs to consider. For example, the current discussion tool provides alternative ways to communicate, but students' varying typing ability comes at a cost to the richness of the information. Another important trade-off is between efficiency of advice and level of comprehension. The current form of the advice is primarily text based. This presents some problems for younger readers and those with reading disabilities; it is especially problematic in urban classrooms that often house a diverse student population with differing instructional needs. One possibility is to design new advisors that utilize video as a primary advice-giving format. The problem with this design path is that it could limit quick access to the advice: students' ability to scan or search advice quickly in order to decide if it is useful. It could also minimize the extent to which certain populations of students have access to cognitively rich text. Nonetheless, these are potential design solutions that need to be evaluated in future iterations.

In order to broaden the impact of the system for a wider range of students, it may be necessary to explore alternative forms of information and activity design. For example, future studies could compare the effects of video and text-based advice on learning gains for differing age groups and reading and writing proficiencies. We may also want to explore different forms of assessment for younger children, as writing ability may prevent the richness of their understanding of concepts and practices from being adequately captured by the Inquiry Test.

6. CONCLUSIONS
In their book, Rethinking Education in the Age of Technology, Collins and Halverson [7] paint a picture of the future of education that looks quite similar to the interactions we present in this paper. They argue that as technology continues to change and influence culture, educational systems must work to embrace technology and revise the traditional roles of the teacher. Technology, they state, can provide children with more individually suited learning opportunities. They also suggest that many teachers may be hesitant to use technology because it is often seen as a means to replace them. Nonetheless, these authors maintain that in order for the educational system to move forward, technology needs to be utilized as a means to support learning, and teachers need to shift from a model of transmitting information to one of facilitating discussion through the use of technology. This was one of the main goals behind the design of our system.

The Web of Inquiry is an innovative student-centered science environment that can be integrated into formal educational settings. As previously discussed, current educational practice in school settings can reduce student motivation and interest in science [22]. This problem can be remedied by moving towards a more student-centered model that emphasizes science practice and allows students to better understand science as a way of thinking about and exploring their world [22]. Though new technologies could help towards this aim, developing technologies to support formal science learning is a difficult task in need of new principles and methods for meeting the needs of this specific community [18]. Our approach to the design, development, and assessment of this system contributes to that effort.

Educational theory and conceptual models of science inquiry practice informed the entire design process of the Web of Inquiry. Prior learning theory and research informed the root concept of the system. Models of expert practice informed the advice provided by agents in the system. Theories of learning informed the activity and interaction design goals implemented in the system. Finally, a theory of assessment informed the assessment tools in the system and the outcome measures used to evaluate it.

The interaction design goals of our system were also specifically crafted to ensure that the system works alongside teachers and enhances student learning. Teachers and students used the system as a means to negotiate understanding and think deeply about science practice. The system helped students evaluate their own thinking and understanding of science concepts and practices, recognize when to ask for clarification, and learn how to construct knowledge as part of their own work. The system also helped students seek and evaluate information with their peers, and it provided teachers with access to students' self-perceptions of their understanding of inquiry science concepts and practices. As a result of these system features, teachers were able to step back from answering questions and checking answers and move toward facilitating scientific discourse. Findings from this study provide a framework and motivation to continue developing the Web of Inquiry and technologies like it: technologies that merge interaction design with learning theory.

7. ACKNOWLEDGEMENTS
We thank the National Science Foundation for support of this work through grant REC-0337753. The perspective offered and the views expressed are those of the authors and do not necessarily reflect the views of the NSF. We also thank our University of Washington colleagues Leslie Herrenkohl and Tammy Tasker for access to discourse transcripts, and Min Li for data analyses. In addition, we thank the Center for Online Innovations in Learning at the Pennsylvania State University for supporting this work. We also thank all of the members of the ThinkerTools research group, past and present, who contributed to this work. We would especially like to acknowledge Eric Eslinger, Suzy Loper, Leslie Stenger, Phelana Pang, Allan Collins, and Linda Shimoda. Lastly, we would like to thank all of the students and teachers who worked with us as participants and co-designers over the years to build a system to meet their needs.

8. REFERENCES
[1] Black, P. & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5(1), 7-74.
[2] Brown, A.L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. The Journal of the Learning Sciences, 2(2), 141-178.
[3] Brown, A.L., & Campione, J.C. (1996). Psychological theory and the design of innovative learning environments: On procedures, principles, and systems. In L. Schauble & R. Glaser (Eds.), Innovations in learning: New environments for education (pp. 289-325). Mahwah, NJ: Erlbaum.
[4] Butler, D. L. & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research, 65(3), 245-281.
[5] Campione, J.C. (1987). Metacognitive components of instructional research with problem learners. In F.E. Weinert & R.H. Kluwe (Eds.), Metacognition, motivation, and understanding. Hillsdale, NJ: Lawrence Erlbaum.
[6] Cazden, C. (2001). Classroom discourse: The language of teaching and learning. Portsmouth, NH: Heinemann.
[7] Collins, A., & Halverson, R. (2009). Rethinking education in the age of technology: The digital revolution and schooling in America. New York: Teachers College Press.
[8] Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. The Journal of the Learning Sciences, 13(1), 15-42.
[9] Engestrom, Y. (2000). Activity theory as a framework for analyzing and redesigning work. Ergonomics, 43(7), 960-974.
[10] Eslinger, E. (2004). Student self-assessment in an interactive learning environment: Technological tools for scaffolding and understanding self-assessment practices. Unpublished doctoral dissertation.
[11] Eslinger, E., White, B., & Frederiksen, J. (2001). A modifiable multi-agent system for supporting inquiry learning. In J. Moore, C. Redfield, & W. L. Johnson (Eds.), Artificial Intelligence in Education. Amsterdam, Netherlands: IOS Press.
[12] Frederiksen, J. & Collins, A. (1989). A systems approach to educational testing. Educational Researcher, 18, 27-32.
[13] Frederiksen, J. R. & White, B. Y. (2004). Designing assessments for instruction and accountability: An application of validity theory to assessing scientific inquiry. In M. Wilson (Ed.), Towards coherence between classroom assessment and accountability. The 103rd Yearbook of the National Society for the Study of Education, Part II (pp. 74-104). Chicago: National Society for the Study of Education.
[14] Frederiksen, J., White, B., Herrenkohl, L., Li, M., & Shimoda, T. (2008). Classroom formative assessment: Investigating models for evaluating the learning of scientific inquiry. Final Report for NSF Grant REC-0337753.
[15] Gee, J. (1990). Social linguistics and literacies: Ideology in discourses. London: Falmer Press.
[16] Herrenkohl, L.R., Palincsar, A.S., DeWater, L.S., & Kawasaki, K. (1999). Developing scientific communities in classrooms: A sociocognitive approach. Journal of the Learning Sciences, 8, 451-493.
[17] Herrenkohl, L., Tasker, T., & White, B. (2011). Pedagogical practices to support classroom cultures of scientific inquiry. Cognition and Instruction, 29(1), 1-44.
[18] Honey, M. & Hilton, M. (Eds.) (2011). Learning science through computer games and simulations. Washington, DC: National Academies Press.
[19] Kuhn, D. (2009). The importance of learning about knowing: Creating a foundation for development of intellectual values. Child Development Perspectives.
[20] Kuhn, D., Black, J., Keselman, A., & Kaplan, D. (2000). The development of cognitive skills to support inquiry learning. Cognition and Instruction, 18, 495-523.
[21] Kuhn, D., Katz, J., & Dean, D. (2004). Developing reason. Thinking and Reasoning, 10, 197-219.
[22] National Research Council. (2011). A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas. Committee on a Conceptual Framework for New K-12 Science Education Standards. Board on Science Education, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
[23] Nardi, B. (Ed.) (1995). Context and consciousness: Activity theory and human-computer interaction. Cambridge, MA: MIT Press.
[24] Papert, S. & Harel, I. (1991). Constructionism. Norwood, NJ: Ablex Publishing Corporation.
[25] Shimoda, T.A. (1999). Student goal orientation in learning inquiry skills with modifiable software advisors. Unpublished doctoral dissertation.
[26] Shimoda, T. A., White, B. Y., & Frederiksen, J. R. (2002). Student goal orientation in learning inquiry skills with modifiable software advisors. Science Education, 86, 244-263.
[27] Towler, L. & Broadfoot, P. (1992). Self-assessment in the primary school. Educational Review, 44(2), 137-151.
[28] White, B. (1993). ThinkerTools: Causal models, conceptual change, and science education. Cognition and Instruction, 10(1), 1-100.
[29] White, B. & Frederiksen, J. (1998). Inquiry, modeling, and metacognition: Making science accessible to all students. Cognition and Instruction, 16(1), 3-118.
[30] White, B. & Frederiksen, J. (2005). A theoretical framework and approach for fostering metacognitive development. Educational Psychologist, 40(4), 211-223.
[31] Windschitl, M., Thompson, J. & Braaten, M. (2008). How novice science teachers appropriate epistemic discourses around model-based inquiry for use in classrooms. Cognition and Instruction, 26(3), 310-378.