LINKING AN EDUCATIVE LEARNING PROGRESSION FOR NATURAL SELECTION TO TEACHER PRACTICE: RESULTS OF AN EXPLORATORY STUDY

Erin Marie Furtak, University of Colorado at Boulder
Sarah Roberts, University of Colorado at Boulder
Deborah Morrison, University of Colorado at Boulder
Kathleen Henson, University of Colorado at Boulder
Shealyn Malone, University of Colorado at Boulder

Abstract

How do teachers learn to navigate the complex domain of student ideas during class discussions? This paper presents one approach to scaffolding teachers' ability to identify student ideas: using a learning progression as a tool to support teacher enactment of formative assessment conversations. We analyze videotaped classroom discussions and stimulated retrospective recall interviews from six high school biology teachers enacting the same formative assessment prompt, developed to draw out common student misunderstandings about natural selection. Results indicate that teachers frequently used questions to probe students' unclear statements for the purpose of diagnosing the quality of student ideas during class discussions.

Introduction

Formative assessment, the process by which teachers elicit information about student thinking and take action to move students toward learning goals, has been identified as one of the most effective instructional approaches in science education (Black & Wiliam, 1998). Formative assessment has been summarized as consisting of three concrete steps, often posed as questions (National Research Council, 2001a):

- What should the students come to know?
- What do the students know now?
- How can students move from what they know toward learning goals?

The process of formative assessment has been highlighted in national educational documents (National Research Council, 1996, 2001a, 2001b) and in summaries of recent research (National Research Council, 2007), and as a result has been integrated formally into several reform-oriented science curricula (e.g., Lawrence Hall of Science, 2000). Despite this emphasis, simply taking pre-developed formative assessment prompts and placing them in the hands of teachers is not an effective approach to helping formative assessment happen in all classrooms. For example, Furtak et al. (2008) found a great deal of variation among a group of six teachers enacting the same set of embedded formative assessments, and Atkin et al. (2005) found that teachers needed significant support in order to develop meaningful formative assessment practices. Of course, studies indicating that 'on the ground' enactment of educational and curriculum reforms does not match what may have been intended by the authors of those reforms are not new (e.g., Cohen, 1990; Spillane, 1999).

Paper presented at the 2010 Annual Conference of the American Educational Research Association, Denver, CO.

Recent research has pointed to the importance of developing teachers' knowledge of the content and curriculum that is relevant to the contexts in which they work (DuFour, 2004; DuFour & Eaker, 1998). This research goes beyond Shulman's (1986) construct of pedagogical content knowledge, arguing that in order to successfully understand and enact reforms in their classrooms, teachers need to develop expertise that is embedded in their daily work and relevant to their own interests.

A possible tool to aid teachers' development of the knowledge they need to enact formative assessment may be found in Learning Progressions (LPs). LPs have been defined as "descriptions of the successively more sophisticated ways of thinking about a topic that can follow one another as children learn about and investigate a topic over a broad span of time" (National Research Council, 2007, p. 205). To date, LPs have been developed for multiple purposes, including guides for curriculum development and instruction (American Association for the Advancement of Science, 2001; Catley, Lehrer, & Reiser, 2005) as well as foundations for assessment development to identify student progress in a conceptual domain (Briggs, 2008; Wilson, 2005).

Formative assessments are designed to elicit a range of student ideas that are convoluted and difficult to interpret. We argue that because LPs are used as representations of how student ideas develop in a conceptual domain, they might also serve as a sort of roadmap for teachers as they navigate the complex territory that is student talk. According to Dewey (1902),

…the map...gives direction; it facilitates control; it economizes effort, preventing useless wandering, and pointing out the paths which lead most quickly and most certainly to a desired result.
Through the map every new traveler may get for his own journey the benefits of the results of others' explorations without the waste of energy and loss of time involved in their wanderings – wanderings which he himself would be obliged to repeat were it not for just the assistance of the objective and generalized record of their performances. (pp. 198-199)

Thus, a LP represents the state of knowledge about how student ideas develop in a domain and presents them in a format that has instructional utility. The teacher then adopts the role of using the map to identify where students are and determining the best path forward for them. In this sense the LP does not prescribe a particular route from point A to point B in the manner of a website like Google Maps; instead, the LP simply puts forward a 'lay of the land' through which different paths might be negotiated.

Even when viewed through this lens, LPs in and of themselves are only representations of student ideas in a particular domain; in order to scaffold changes in instructional practice, LPs need to be accompanied by tools that will support teacher development. Educative LPs include not only exhaustive listings of how student ideas develop from alternative conceptions to scientifically accepted explanations, but also formative assessment prompts to elicit developing student understanding, sample student responses, and feedback strategies (Furtak, 2009b). Educative LPs should be introduced to teachers in the context of ongoing professional development to help teachers develop the background knowledge necessary to teach the ideas represented in the LP and to practice identifying and acting upon student thinking (Black & Wiliam, 1998b).

In this paper, we take a novel approach to the use of LPs as tools to scaffold teachers' eliciting and diagnosing of student thinking during formative assessment conversations. We present criteria for how LPs may be used to support teachers, describe a small exploratory study in which a department of high school biology teachers worked with an LP for two years to develop their understanding and teaching of natural selection, and then present evidence indicating the influence of this LP on classroom practice.

Theoretical Background

Our application of LPs to support teacher enactment of formative assessment builds upon the established needs of teachers when they enact formative assessments. We first analyze the demands placed on science teachers as they enact formative assessments in their classrooms. Then, we explore Goodwin's (1994) framework for professional vision as a possible lens on professional practice that could inform what we want teachers to be able to do during formative assessment. Finally, we present criteria for what we call educative learning progressions as guidelines for teaching science.

Demands of Effective Formative Assessment Practices

If conducting formative assessment were as simple as asking students open-ended questions, it would be in wider use today. Despite 12 years having passed since Black and Wiliam's (1998a) watershed review, educational reformers are still struggling to support teachers in realizing this practice in their classrooms. This situation may be attributed to the fact that formative assessment challenges teachers in a variety of ways: it is based upon deep content knowledge, forces teachers to take the risk of finding out the effectiveness of their teaching as they go along, and requires teachers to respond to student ideas on the fly. In the sections below we take apart these difficulties step by step.

Step 1: Setting a Goal.
The first step in the formative assessment process is perhaps the most straightforward of the three: identifying a target understanding or ability for students. Much of the reform in science education in the past 20 years has focused on just this approach (American Association for the Advancement of Science, 1990, 1993, 2001; National Research Council, 1996), and every state has its own version of these standards. The mere existence of such documents, however, does not mean that their statements translate straightforwardly into classroom practice. According to Collins (1998), the standards were neither intended as, nor act as, a 'how-to' document for teachers. Thus, some measure of interpretation of the standards and what they mean falls to teachers themselves. Take, for example, the following statement from the National Science Education Standards (National Research Council, 1996):

Species evolve over time. Evolution is the consequence of the interactions of (1) the potential for a species to increase its numbers, (2) the genetic variability of offspring due to mutation and recombination of genes, (3) a finite supply of the resources required for life, and (4) the ensuing selection

by the environment of those offspring better able to survive and leave offspring. (p. 185)

This statement succinctly summarizes the 'right,' or scientifically accepted, definition of evolution as the consequence of natural selection. Teachers must translate standards such as this into statements of what students will know and be able to do in order to complete the first step of formative assessment.

Step 2: Determining what students know.

In order to know "where the students are now," the teacher needs a strategy and/or prompt for making student thinking explicit and a framework for interpreting the ideas that students share. Strategies and prompts to draw out student thinking are becoming more prevalent; in some cases they have been developed as embedded assessments in pre-existing curricula (e.g., the FOSS curriculum), and in others as parts of 'toolkits' from which teachers may pick and choose approaches that fit into their units of instruction (Keeley, 2008). Teachers may also conduct ongoing assessment by listening to student ideas in small-group talk, reading over student work, and asking specific questions. The quality of these prompts is clearly part of the equation; in order to be interpretable, formative assessments need to provide opportunities for students to share clearly stated ideas that are linked to learning goals.

Yet these strategies and prompts can only take a teacher so far. Once the teacher has this information, how is she to interpret it? Student understanding develops along a continuum and is not always easily slotted into standards-style statements of what students should ultimately know. Indeed, a view of student ideas as simply right or wrong does not acknowledge that the standards in and of themselves - the idea of 'Science for All' - imply that students must move toward right ideas, rather than considering how those ideas relate to students' prior experiences (Barton, 1998; Strike & Posner, 1992).
Take again the standard listed above: a teacher who heard a student say that 'animals change themselves to fit their environment' would know only that the student did not hold the understanding expressed in the standard and had, therefore, a 'wrong' idea. One might go further and argue that even when a student gives an answer that matches perfectly with the canonical expression of the scientific idea, that answer may not reveal deep changes in the student's thinking over time (Shemwell & Furtak, 2009). Either way, without a framework for interpreting a particular student idea, teachers will not be able to identify nuances in the gap between prior ideas and scientific understandings. Previous studies have shown that teachers' ability to identify and act upon student thinking is not necessarily correlated with their years of experience (Furtak, 2006, 2009a).

Step 3: Feedback.

Once a teacher has assessed the gap between student thinking and desired learning goals, the final step is to provide the student with feedback to help him or her improve performance in the direction of the learning goal. Feedback has been argued to be the step that has a significant, positive impact on student learning (Black & Wiliam, 1998a; Hattie & Timperley, 2007; Kluger & DeNisi, 1996). Yet this final step depends on the teacher evaluating the size of the gap between what students know and what they need to know, and then providing targeted instructional interventions in a variety of forms to help move students toward learning goals. Without an appropriate assessment of the gap, feedback will be neither timely nor specifically targeted at student ideas as they develop.

For example, Wiliam (2007) has used the metaphor of a coach giving feedback to a softball pitcher on how to improve her delivery. A coach telling a pitcher that her slider is not cutting away from the plate has given her no information about how to improve her performance, only what is wrong with it. Going further, it would be inappropriate for the coach to tell the pitcher that she needed to change her release point if the problem really had to do with her grip. The more knowledgeable other, in softball or any other setting, must be able to accurately diagnose the problem in order to provide feedback specifically tailored to it. Thus, accurate diagnosis of the problem or shortcoming is an essential first step on the way to developing competence in formative assessment.

Given the preceding analysis, we conclude that in order to successfully conduct formative assessment, teachers need high-quality assessment prompts, and frameworks to help them interpret the ideas students share in response to those prompts, as necessary precursors to providing feedback during instruction. This ability may also be conceptualized as diagnosing. A doctor observes symptoms and conducts tests to gather information about a patient's condition, and then makes a judgment based on this information: a diagnosis. Proper diagnosis is a necessary first step on the way to giving an appropriate treatment. In the same sense, teachers must develop the ability to diagnose student ideas in order to target specific feedback to help students learn.
Noticing and Attending to Student Thinking

A different paradigm for what has been called informal formative assessment, or assessment that occurs on the fly (Shavelson et al., 2008), comes from the mathematics education literature, where identifying and responding to student ideas during class has been called noticing and attending to student thinking (Sherin & van Es, 2003). This framing of teaching calls attention to the teacher's ability to pick out complex and often convoluted student ideas from the messiness of classroom talk, acting on those ideas that can be fruitfully developed into more scientific understandings.

The concept of noticing draws on the work of Goodwin (1994), who wrote of the development of professional vision, or the ability to discern specific information valued within a particular community from a complex domain. Goodwin described how an archaeologist was able to use a Munsell chart, a tool that identifies different colors of soil, to identify an area of an excavation site that was once a post; to circle that area of ground; and then to make the marking a public event that guided the perception of others. Goodwin parses this description of professional vision into three distinct categories: coding, highlighting, and producing and articulating material representations. Each of these practices is an observable action on the part of a member of a professional community with a particular way of viewing a complex field. Goodwin defines the three as follows:

- "coding, which transforms phenomena observed in a specific setting into the objects of knowledge that animate the discourse of a profession"
- "highlighting, which makes specific phenomena in a complex perceptual field salient by marking them in some fashion"
- reification, or "producing and articulating material representations" (1994, p. 606)

In education, this ability might be seen in a teacher conducting a class discussion in which multiple student ideas are shared. In some cases, an effective formative assessment opens a virtual Pandora's Box of ideas that are convoluted and difficult to interpret. Developing the ability to listen to these ideas, notice the ones linked to common student naive conceptions, tease those ideas apart, and do something about them is an essential skill in conducting formative assessment. This is, in effect, diagnosing student thinking for the purpose of identifying the gap between current and desired levels of performance.

Educative Learning Progressions

As described above, LPs are most commonly used as guides for curriculum and assessment development, yet they contain information that could be employed as a tool to develop teachers' ability to navigate the complex environment of classroom talk. In order for LPs to fulfill this role, they must be educative in the sense that they scaffold the development of teacher knowledge over time (Davis & Krajcik, 2005) so that teachers learn to independently diagnose student ideas on the fly in formative assessment situations. To be educative, LPs must be part of suites of tools (Thompson, Braaten, & Windschitl, 2009) that support teachers in eliciting, identifying, and acting upon student ideas in a conceptual domain. Educative LPs could serve as scaffolds for teachers in inquiry-based settings, helping them not only to anticipate the ideas students might share but also suggesting instructional strategies tailored to those ideas. To serve that purpose, however, LPs need to include examples of student responses for each level, as well as instructional strategies, feedback suggestions to help students proceed up the LP, and professional development to support changes in practice.
Educative LPs should contain the preceding five elements; the first four are features of the educative LP itself (Furtak, 2009), and the fifth emphasizes the importance of sustained professional development within a community of practice (Borko, 2004) to support teachers’ adoption of new instructional strategies based on the educative LP (Figure 1). The sections below describe each element in more detail, supported by the research base in science teaching and assessment.

Figure 1. Components of an Educative Learning Progression

Element 1: Accounting of student ideas

Educative LPs should combine features of both approaches to LP construction described above, including not only the multiple and sequential correct understandings that build to the big ideas in science across multiple grade spans, but also the development of student ideas on isolated constructs within the overarching curricular sequence. Development should begin with extensive conceptual analyses of the big ideas in science on the basis of prior research and new empirical studies, and involve continuous empirical validation of a preliminary progression through fine-grained analyses of student work and responses to assessment items linked to the LP (Briggs, 2008; Wilson, 2005).

Element 2: Formative assessment prompts to elicit student thinking

Before teachers can take instructional action on the basis of student ideas, they need high-quality assessment prompts to elicit student thinking (Pellegrino, Chudowsky, & Glaser, 2001). Such prompts should entail a variety of formats and be linked to the LP so as to explicitly draw out the ideas that appear in the progression, allowing teachers to identify the levels at which students are located. To serve the purpose of advancing student learning, assessment prompts linked to the LP should be formative in nature; that is, intended to generate information about students' levels of understanding so as to provide a basis for feedback rather than grades (Black & Wiliam, 1998a; National Research Council, 2001a). The prompts should also be generative in nature, inviting students to share a range of ideas. Formative assessment prompts can be described along a continuum that includes formal curriculum-embedded, planned-for, and spontaneous, on-the-fly assessments (Furtak & Ruiz-Primo, 2008; Ruiz-Primo & Furtak, 2007; Shavelson et al., 2008).
Each of these types serves the purpose of eliciting students' conceptions as a basis for teachers to make instructional decisions to reduce the gap between learning goals and students' present state of understanding. Formal curriculum-embedded assessments consist of pre-developed prompts linked to a curriculum. Planned-for formative assessment constitutes the everyday activities the teacher selects or designs to elicit and act upon student thinking (Shavelson et al., 2008). Informal, or on-the-fly, formative assessment takes any interaction in class (i.e., one-to-one interactions, small groups, or class discussion) as an opportunity to gather information about students' level of understanding in order to provide appropriate and timely feedback; the teacher can respond immediately to students' ideas, making possible the completion of multiple feedback cycles in a relatively short period of time (Ruiz-Primo & Furtak, 2006). According to this framework, educative LPs should provide not only formal prompts tightly linked to the LP but also information to support teachers' development of planned-for formative assessments and those they create on the fly.

Element 3: Age-appropriate examples of actual student responses

Current versions of LPs provide summaries or categories of student ideas; however, these are often phrased in general terms and do not reflect the ways that students actually speak about scientific ideas. For example, where density is concerned, students often speak of objects being 'heavy for their size,' or of an amount of mass being 'spread out' in a particular amount of space (Shemwell & Furtak, 2009). A LP might summarize these ideas as naïve understandings of density, but without examples of

these kinds of student statements, teachers may have difficulty properly recognizing and categorizing them. To help teachers identify student understanding in responses to a range of formative assessment prompts (as well as in informal observations and discussions), age-appropriate responses for each level of the LP should be provided. These student ideas could be collected in the process of validating the LP itself with a sample of the target student audience.

Element 4: Suggested feedback strategies to move students between levels

The key element of LPs for teacher development is the inclusion of feedback strategies to help teachers move students between levels. Lower levels of feedback, such as providing evaluation ("that's right/that's wrong") or acknowledging students' effort ("I can tell you're working very hard"), are not as effective as feedback that contains specific information about student performance and how it may be improved to reach learning goals (Hattie & Timperley, 2007). Like the assessment prompts themselves, informational feedback strategies may take many forms, from questions asked by the teacher, to additional descriptions or representations of particular ideas, to revised lesson plans, to reteaching entire units when students do not understand. Unfortunately, these probing questions, strategies, activities, and representations are difficult for most teachers to plan for and even harder to come up with on the fly. Educative LPs in effect anticipate for the teacher the different kinds of responses that students will give during instruction, and thus are an ideal starting point for suggesting informational feedback that will help advance student learning.
Element 5: Sustained professional development to support enactment of the educative LP

The previous four elements focused upon qualities of the LP and its accompanying tools; however, research has indicated that simply creating tools and providing them to teachers will not result in changed practice (Black, Harrison, Lee, Marshall, & Wiliam, 2002; Furtak et al., 2008). Therefore, the fifth element of an educative LP focuses upon the process by which teachers will be supported in reflecting upon and adapting their current practice based on the information contained in the educative LP itself: sustained professional development within a professional learning community (PLC). Such professional development is essential alongside a LP, which can otherwise easily lead teachers to assume that a particular path to competence must be followed, rather than emphasizing multiple possible paths for student learning (Corcoran, Mosher, & Rogat, 2009). Within a PLC, teachers can examine their own practice, explore examples of student work, and practice identifying and responding to the different ideas the educative LP contains. Furthermore, if the educative LP is seen as a continuously evolving tool, teachers can suggest different strategies to include in it. Sustained professional development centered on an educative LP and its accompanying tools could help teachers learn the structure and relationships of students' understandings about a particular concept, the teaching strategies to elicit student thinking, the likely student responses, and the targeted feedback for students. In this way, educative LPs can act as a guide for teachers, showing them potential steps to take in order to help students move along the LP, thereby meeting

increasingly challenging learning goals (Sadler, 1989) and positively impacting student learning (Black & Wiliam, 1998a).

Educative Learning Progressions as Tools to Support Formative Assessment Practices

This paper explores how ideas represented in a tool to aid the diagnosis of student thinking are identified by teachers in classroom discussions. The development of formative assessment capacity therefore focuses not only on teachers' enactment of the tools with which they are provided, but also on the development of the necessary knowledge – the professional vision – to effectively diagnose student thinking. In this paper we describe a project in which we engaged a department of high school biology teachers around an educative LP, and present evidence supporting the influence of the educative LP on their ability to highlight and diagnose student ideas about natural selection during class discussions.

Method

The purpose of this paper is to explore the influence of an educative LP on teachers' practice during formative assessment discussions. To do so, we draw on three sources of data: first, the educative LP itself, as the tool intended to scaffold teachers' ability to diagnose student thinking in formative assessment discussions; second, classroom videos collected by the authors as evidence of how teachers diagnosed student thinking in class discussions; and third, interviews with each teacher, which helped us capture their own descriptions and understandings of how they identified and acted upon student thinking. In this section we first explain the context of the study and the development of the LP. Then, we introduce the study participants and describe each source of data. Finally, we present the coding system we applied to these sources of data, with examples of how we used it.

Context

A complete understanding of evolution underlies many different ideas in biology.
Catley et al. (2005) envisioned a developmental sequence of concepts building toward an understanding of evolution across multiple grade bands, organized into categories; however, the multiple dimensions of this sequence are hard to track and not necessarily instructionally useful. Similarly, the AAAS' (2001) Atlas of Science Literacy contains multiple interconnected concepts that are difficult to translate into instruction. These summaries are closely related to the oft-used sequence for understanding natural selection, Mayr's (1997) five facts and three inferences, the basis for the National Science Education Standards' explication of the concept (National Research Council, 1996). However, many student understandings and naïve conceptions of natural selection and evolution have also been identified (Bishop & Anderson, 1990; Dagher & Boujaoude, 2005; Ferrari & Chi, 1998; Geraedts & Boersma, 2006; Shtulman, 2006).

We separate students' common ideas into two constructs: naive conceptions related to the origin of new traits and variation within a population, and selective force, or how those traits lead to differential survival and reproduction. With respect to the origin of traits, we differentiate students' ideas regarding how organisms are able to change themselves in response to a perceived need (anthropomorphic),

how the environment causes changes in organisms (environmentally-induced changes), or how organisms differentially mate with organisms that have characteristics that could be advantageous (eugenic). A more sophisticated, yet still not scientific, understanding involves students adopting genetic terms to describe the more traditional naïve conceptions; an example is the idea that the environment causes an organism to mutate in response to environmental changes, as compared to a scientific understanding in which students identify random changes in genes and recombination of genes as the source of new variation. With respect to selective force, students often describe species changing over time without specifically describing how organisms that are better adapted to the environment are more likely to survive and reproduce, which constitutes a complete understanding of this construct.

In addition, students often use unclear language to describe their own views of natural selection; words like 'fitness' and phrases like 'survival of the fittest' suggest that the organisms that are more successful are stronger and in some cases even beat up weaker organisms (Anderson, Fisher, & Norman, 2002). Students may also confuse the term 'adapt' in its biological sense - the fact that the proportion of organisms within a population with advantageous characteristics becomes greater over time - with its everyday usage. Unfortunately, scientists speaking to the public often use these terms in ways that sound very similar to students' own everyday usage.

In the research reported in this paper, we have developed, piloted, and revised a LP over the past two years that serves as the foundation for the educative LP (Furtak, 2009); this LP is shown in Figure 2.
The horizontal table shows how correct ideas are sequenced within the unit of instruction, based on Mayr’s (1997) framework; the extending vertical tables show how particular ideas within the horizontal table develop from alternative conceptions to correct ideas. Extended versions of the LP provide examples of real student responses and suggestions for instructional strategies to help students develop more sophisticated understandings.

Figure 2. Learning Progression for Natural Selection

Participants

Nine biology teachers at an ethnically and socioeconomically diverse school near a large city in the western US participated in the project. While all teachers had backgrounds in biology, they ranged in experience from a student teacher to a 29-year veteran, and taught multiple levels of biology ranging from sheltered biology for English language learners to AP Biology. Two of the participants were student teachers - Teacher 7 during the 2008-2009 school year, and Teacher 9 during the 2009-2010 school year. Since Teacher 9 had taken over Teacher 2's classes by the time of the study, we do not have data for Teacher 2, and we do not have videotape from Teacher 8's enactment of the lesson. Thus, in this paper, we draw upon data from six of these teachers.

Table 1. Teacher Participants

Teacher 1: Highest degree: MA or MS; Major: Environmental Biology; Minor: N/A; Credential: Professional, Secondary Science; Years teaching: 8; Years at this HS: 3; Years teaching biology: 7; Courses taught: Biology, Environmental Science; 2009-10 biology classes: 4

Teacher 3: Highest degree: MA or MS; Major: Biology & Zoology; Minor: N/A; Credential: Professional, Secondary Science; Years teaching: 8; Years at this HS: 6; Years teaching biology: 7; Courses taught: AP Biology, Earth Science; 2009-10 biology classes: 2

Teacher 4: Highest degree: MA or MS; Major: Chemistry; Minor: N/A; Credential: Professional, Secondary Science; Years teaching: 10; Years at this HS: 3; Years teaching biology: 10; Courses taught: Biology, IB HL Biology; 2009-10 biology classes: 2

Teacher 5: Highest degree: MA or MS; Major: Biology; Minor: Natural Sciences; Credential: Professional, Secondary Science; Years teaching: 11; Years at this HS: 7; Years teaching biology: 11; Courses taught: Biology, Senior Biology; 2009-10 biology classes: 2

Teacher 6: Highest degree: MA or MS; Major: Biology; Minor: Geology/Earth Sciences; Credential: Professional, K-12 Science; Years teaching: 13; Years at this HS: 11; Years teaching biology: 13; Courses taught: Biology, IB Biology; 2009-10 biology classes: 3

Teacher 9: Highest degree: BA; Major: Biology; Minor: N/A; Credential: Professional, Secondary Science; Years teaching: 0; Years at this HS: 0; Years teaching biology: 0; Courses taught: Biology, Honors Biology II; 2009-10 biology classes: 2

Professional Learning Community and the Development of Formative Assessment Tools

The seven teachers and the research team participated in monthly meetings in the 2008-2009 and 2009-2010 school years for the purpose of establishing a professional learning community (PLC) (DuFour & Eaker, 1998) organized around the

teaching of natural selection. In the first PLC meeting, the group discussed the basics of the project and their current approaches to teaching natural selection. In subsequent meetings the participants identified students' ideas related to natural selection and began mapping them onto the LP. Participants explored strategies for addressing different naive conceptions and designed formative assessment activities, based on the LP, for eliciting student ideas within the natural selection unit. Figure 3 illustrates the iterative nature of this work across the multiple years of the project and into the future. Teachers and researchers co-created formative assessments, and teachers then implemented these in their classrooms, determining appropriate places in their regular instruction to locate each assessment.

In the second year of the project, the monthly meetings began with a focus on reflection on the previous year. As the school year progressed, the focus shifted to systematically analyzing identified student ideas and linking them to the LP. Teachers viewed video from the year-one enactments of the formative assessments in order to focus on student thinking and teacher responses. The members of the PLC continued to revise the formative assessments, and teachers agreed upon a common set of seven formative assessments and an order in which to implement them. After teachers had delivered their units on natural selection in their classrooms, the meetings focused on the results of the implementation. Specifically, members of the PLC went through each formative assessment to determine how well it worked in terms of its ability to solicit student thinking. The participants looked at what student ideas were elicited and how teachers responded to those ideas.
As the school year came to a close, the focus of the PLC shifted to considering ways in which to sustain the PLC and to permit these teachers to share what they had learned with a broader audience.

Figure 3. Cycle of Activities in Professional Learning Community

Formative Assessment Two: 'How did it come to be?'

Formative assessment two, How did it come to be?, was selected as the focus of this analysis for several reasons. This assessment was designed to elicit student ideas

Furtak et al. 13 early in the unit when students were more likely to have a variety of naive ideas. This allowed teachers to identify these ideas in time to tailor their instruction to address them. Formative assessment two was developed during the second year of the PLC. This assessment evolved out of an earlier formative assessment, Individual Change Versus Natural Selection, which was developed collaboratively by the PLC in year one based on teachers' previously existing natural selection activities and included nine scenarios. The Individual Change Versus Natural Selection formative assessment solicited student ideas on the differences between individual changes and changes in a population. After the enactment of this assessment, the PLC discussed the results and concluded that the examples were too complex for their students and made suggestions on how the instrument could be revised. Researchers took all of the teacher's ideas and revised the original assessment to into two separate assessments, number two, How did it come to be? and number three, What gets passed on? The goal of the revised formative assessment two was to elicit student ideas around the origin of a trait and the selective force behind the success of a given trait. For each of four scenarios presented in How did it come to be? students were asked to explain how the organism described came to be the way it is, as seen in Figure 4 below. The four scenarios were: the pronghorn antelope, the arctic hare, the corpse flower, and antibiotic resistant bacteria. The example of the pronghorn antelope asks students to consider how and why an animal would be so fast that no known predator could catch it. The example of the arctic hare focuses students on the mechanism of camoflauge. The corpse flower example asks students to think about how a flower would come to have the horrible stench of rotting flesh. Finally, the forth example asks students to consider the mechanism behind antibiotic resistance.


Figure 4. Formative assessment two: How did it come to be?

Sources of Data

Changes in teachers' practice related to the educative LP were triangulated with two sources of data: videotapes of teachers enacting the lesson and interviews with each teacher at the end of the school year. We describe each source of evidence below. To track the ways in which teachers elicited and responded to student ideas about natural selection, we examined videotapes of each teacher leading a class discussion after students had independently responded to How did it come to be? We videotaped one class enactment for each teacher, and then, from each videotape, we excerpted the portion of class in which the teacher led a discussion of student responses to each question on the formative assessment. These video excerpts were then transcribed. The length of each video segment is shown in Table 2.

Table 2. Length of classroom enactment of "How did it come to be?"

Teacher   Segment Length (minutes)
1         6:54
3         19:04
4         8:42
5         11:28
6         10:00
9         5:56

In order to triangulate our own propositions about how teachers diagnosed and responded to student ideas in the discussions, we selected a minute or two of videotape to show each teacher during exit interviews at the end of year two. We purposively selected these clips as examples of what we considered to be instances of teachers' highlighting of student ideas in talk and, in some cases, diagnosing or giving categories to students' ideas during the discussion. How did it come to be? clips were selected to show a sequence of student-teacher interaction in which a student's naive conception of natural selection was probed by the teacher. Researchers jointly selected clips after coding formative assessment transcripts for student conceptions and teacher feedback. We presented the clips to the teachers during a semi-structured interview in which teachers responded to questions about the formative assessments, identified examples of student ideas in writing, and then described how they gave their students feedback during their natural selection unit. We introduced each clip by asking the teachers, "What kinds of feedback do you feel you provided to students during the natural selection unit this year?" We used the video clip as a way of probing teachers to be specific in their response to this question. We showed the clip to teachers and then asked them to talk about what they were thinking when they were watching the clip, and how they decided to respond that way. If teachers did not respond with the specifics we were looking for in response to this question, we asked three follow-up questions:

-

How did you interpret what the student was thinking here?
Talk a little about the way that you responded/why you responded this way.
What did you do next?

We then transcribed each interview.

Data Analysis

Beginning with the video data, we coded the transcripts according to a coding system previously developed in conjunction with the educative LP. This coding system operationalizes student ideas represented in the selective force and origin of traits constructs (see Figure 1) into codes that can be applied to written work and transcripts. These codes are listed in Table 3.

Table 3. Codes for origin of traits and selective force constructs

Selective Force

Full Mechanism (SFS+SFR=SFM): Student says population changes over time with a clear and complete statement of selective force; includes BOTH survival and reproductive forces. Example: "Okay, so the plant that smelled like rotting flesh attracted more flies which gave them more food so they could survive and pass on their genes and the ones that did not died off."

Survival (SFS): Student says population changes over time but includes only a survival mechanism (e.g., differential survival, predation). Example: "I would say that some of the antelope were slow and the cheetah were eating the slow antelope...cheetah couldn't catch the fast antelope, cheetah's going to die."

Reproduction (SFR): Student says population changes over time but includes only a reproductive mechanism. Example: "They got to make more kittens."

No Mechanism: No selective force mentioned.

Origin of Traits

Mutation random (MRA): Multiple kinds of random mutations can occur (either spontaneously or in response to environmental mutagens), some of which lead to new beneficial traits in organisms. Example: "It was...a random mutation and it wasn't a bad mutation. It was a very advantageous mutation."

Mutation needs based (MNB): Changes occur as a result of genetic mutations in direct response to the environment. Example: "I said that the pronghorn antelope have to adapt to the change in speed to be able to outrun them."

Mutation unclear mechanism (MUD): Mentions mutations leading to new traits but does not describe a mechanism for how the mutations occur. Example: "I think that the mutation occurred which caused longer legs."

Needs based change - environmental (ENV): Organisms change in direct response to environmental changes; mechanism of transmission of changes is unclear. Example: "...in order to get the flies to come they started developing their own scent based off what they used to grow on."

Needs based change - anthropomorphic (ANT): Organisms consciously change themselves to adapt to environmental changes. Things organisms learned in their lives are traits to be passed on. Example: "So in order to become faster do, do the antelope then go get treadmill and work out their heart?"

Needs based change - eugenic (EUG): Organisms choose mates intentionally to create offspring that will be well suited to the environment.

Static (CON): Species do not change/are constant.

No mechanism (NEM): Mechanism for change in population is unclear. Example: "...they needed to adapt, when their predators, they evolved..."

Trait not present: Description of differences in traits not included.

After developing shared understandings about these codes while watching two of the six videos, researchers coded the remaining four videos independently and then met and came to consensus on the final codes applied. Our next step was to examine how teachers responded to student ideas in these transcripts. To do so, we used an adaptation of Goodwin's (1994) approach to identifying instances of highlighting and coding of ideas related to a larger vision for how student thinking develops. We considered highlighting to be instances in which the teacher picked up a particular student comment related to the LP, and coding to designate instances in which the teacher actually placed a category similar to one on the LP on a student idea or collection of ideas during class discussions. These codes are summarized in Table 4. Although Goodwin's (1994) framework also includes reification, we did not track this piece in classroom talk because we consider this element of professional vision to be more visible in teachers' discussions with each other, which are not a part of the present analysis.

Table 4. Codes for teacher responses to student ideas during discussions

Highlighting: The teacher "highlights" a student comment related to the learning progression. Example:
Student: Well it could have been natural selection so the faster ones did survive better and so maybe like over time then they became faster. [SFS]
Teacher: OK so if they became faster, what do they have to do here? Go ahead, Chloe. (T6L8)

Coding/Diagnosing: The teacher places a code, similar to the codes on the learning progression, on a student idea during class discussions. Example:
Student: OK, so only the fastest ones were the ones that were surviving so they were the only ones who could breed. So the faster and faster—they kept getting faster and they just—I don’t know. It keeps going like that because still, only the fastest ones can survive.
Teacher: OK, so I heard a couple of things. The ones that survive get to reproduce. Is that pretty standard do you think? (T5L3)

Again, after establishing shared understanding of these codes, four of the five authors separately coded the transcripts for instances of coding and highlighting and came together to establish consensus. After establishing consensus, researchers created a large table in which all instances of teacher highlighting and coding/diagnosing were listed. Researchers then used this table to look for patterns in the kinds of ideas that teachers picked up on and how the teachers responded. The first author then established a final set of codes (Table 5), based on a grounded approach to the data and informed by previous studies (Ruiz-Primo & Furtak, 2006, 2007), to categorize the specific ways that teachers responded when highlighting student ideas. These responses were not viewed as mutually exclusive but captured, on the whole, the different ways that teachers chose to highlight. This final set of codes was applied in a final pass over the transcripts to better describe patterns in teacher diagnosis of student responses.

Table 5. Codes for categorizing teacher responses to the highlighting of student ideas

Revoice: Teacher repeats verbatim a part of a student answer. Example:
Student: Okay um well when (inaudible) I guess it’s kind of like the winter thing so they have the ability to (inaudible) save them from predators and what not so they gain that in order to hide from the predators and so that’s passed down to each rabbit.
Teacher: OK so they gain the ability to change their coat color so they can avoid predators?

Reconstruct: Teacher restates a student statement and makes significant changes to the student's words and/or meanings. Example:
Student: I said that, I don’t know, it’s genetics. Like if your parents are tall, you’re most likely going to be tall. And I said if your parents have good vision and hearing then you have it, or something like that. But I said that they have long legs and they could possibly have good hearing so they could hear—
Teacher: So you’re saying that the adult antelope that has the best hearing and that could run the fastest was the one that could survive the best? Is that what you’re saying?
Student: I guess (T3L24-26)

Check: Teacher asks the student a yes/no or short-answer question to be sure that the student has a particular understanding. Example:
Teacher: OK, so let me ask you guys this. Can an individual antelope, one antelope, get faster because it needs to?
Student: Yes. No. (T1L1-2)

Asking for Mechanisms: Teacher asks the student to unpack words like 'adapt' or 'had to' or asks a 'how' question to get at an underlying mechanism for change. Example:
Student: I said that the pronghorn antelope have to adapt to the change in speed to be able to outrun them. So somewhere in the genes there was a mutation where the offspring have like longer legs so they can outrun the cheetah.
Teacher: So I have a question for you. When you say adapt, what do you mean by adapt? (T3L16-17)

Informational Feedback: Teacher provides the student with specific information about how to improve his/her performance. Example:
Teacher: That’s one possible explanation. So back in the day, the original… flower. What I would like to see more in your answer if you could add that was that some flowers smelled more like rotting flesh, some flowers smelled a little bit differently that made them more attractive than others. Right. Let’s add natural selection into that. (T9L51)

To triangulate our understanding of the video on the basis of this coding approach, we showed the teachers specific clips for which we had coded teacher highlighting

and coding sequences. We then asked the teachers what they were thinking at this point in their class. We used the transcripts of the teacher interviews to match the teachers' own coding with the prior coding done by the research team. In this way we were able to confirm the teachers' use of the educative LP ideas within their instructional feedback.

Results

In this section, we begin by illustrating how teachers highlighted student ideas linked to the LP in class discussions. Then we show how some teachers used class discussions to actually diagnose or code categories of student ideas as they were occurring, but did not necessarily follow up these instances with informational feedback. Finally, we illustrate how teachers used different kinds of questions to elicit additional information from students to better understand the students’ ideas. Because we are analyzing data from the enactment of only one formative assessment, we do not make claims about individual teachers; rather, we use the data we have collected to make claims about the group of teachers as a whole, looking for patterns in the way that they showed shared understanding and/or professional vision to identify and diagnose student ideas during discussion.

Highlighting Ideas from the Educative LP

Our analysis of classroom video and interviews indicates that teachers highlighted student ideas linked to the LP during class discussions of How did it come to be? We observed instances of highlighting by all six teachers in the study, and teacher interviews supported our interpretation of these instances. As an example, we can explore Teacher 3's discussion of the pronghorn antelope example. In this scenario, the teacher has given students time to write responses to the formative assessment individually and then asks the class to share their answers.

Teacher 3: Why don’t we discuss question number one and then if you’re still not finished the whole thing, finish that as homework. Yes, Lydia.
Lydia: I said that the pronghorn antelope have to adapt to the change in speed to be able to outrun them. So somewhere in the genes there was a mutation where the offspring have like longer legs so they can outrun the cheetah. [Notes: Mutation undefined; selective force]
Teacher 3: So I have a question for you. When you say adapt, what do you mean by adapt? [Notes: Teacher highlights student's use of 'adapt']

In this excerpt, the student uses the word 'adapt' in an unclear manner that, in combination with her statement that 'somewhere in the genes there was a mutation', does not provide clear information about what she is thinking.

When asked about the student thinking and her response, Teacher 3 described the preceding excerpt this way:

I was thinking that she didn’t know whether she was using adapt in the sense that we talk about adapt, adapting to daily changes versus adaptation in natural selection. So I wanted to see if she herself recognized that or if that was something else that I would have to address later.

This response illustrates that the teacher also saw herself as targeting the student's word 'adapt' and asking for more information about what that word meant, indicating that this is an instance of the teacher identifying a student idea that is linked to the LP. The exchange continued:

Lydia: Well they have to somehow (inaudible).
Teacher 3: OK, so do you think that the fact that they were running away from the cheetahs is what made them change? Or was that a random change that led to some of them being able to run faster than the cheetahs? [Notes: Teacher narrows down to two possible ideas: need-based change versus random mutation]
Lydia: It could be both.
Teacher 3: I’m sorry?
Lydia: I don’t know. [Notes: Student admits she is unsure of the correct explanation.]

In this next exchange, we gain more specific information about the two possible interpretations of the student's thinking between which the teacher is selecting. On the one hand, the student may have a need-based change idea; on the other hand, she may have a more sophisticated view of the mutation being random. At the end of the exchange, Lydia admits that she is not sure which of those options is right, helping the teacher to see that Lydia indeed does not clearly understand the correct explanation. Teacher 3 describes her thinking as follows:

I’m assuming that when I said “Do you think that the fact that they were running away made them change versus random change,” I was trying to peg her down on whether she understood that the change wasn’t occurring as the antelope were running away from the cheetah, but that there must have been antelope already with an adaptation for better getting away from the cheetah, that they would have passed on to offspring.

In this case, we clearly see an instance of a teacher trying to 'peg down' what a student understood, picking up on the student's unclear language and narrowing in on two different student ideas about origin of traits as represented in the LP.

Teacher 3's exchange, while a clear instance of highlighting, is nevertheless a very teacher-directed exchange in which the teacher focused on one student statement and then tried to get the student to choose between two options. Other teachers in the study took a less directed approach; rather than trying to get students to pick between two ideas, they simply tried to get students to share their thinking as clearly as possible. For example, Teacher 4 picked up on a potential student naïve conception by referring the question back to the entire class in the following classroom discussion:

Teacher 4: So this one you were to read the following scenarios. Based on what you know about natural selection, describe how could these scenarios have come to be. So the first one you have the pronghorn antelope are able to run so fast that no known predator alive today is able to catch them. There once was a North American cheetah, which is now extinct. So, Raphael.
Raphael: They are able to run so fast ‘cause they were being chased by a really fast animal and over time it progressively got faster and faster so it could outrun the North American cheetah and when they died off it just maintain the fastest. [Notes: Environment causes organism to change; unclear description of selective force]
Teacher 4: They maintained the fastest. OK, can anybody, that’s great, can anybody give us a little more explanation of how the progression of speed? Craig. [Notes: Teacher highlights student's use of 'progression of speed']
Craig: The pronghorn antelope that were able to run the fastest wouldn’t be eaten and those ones were likely more faster or had some tendency to have stronger leg muscles. And the ones that survived that passed the stronger leg muscles onto their offspring were able to continue to outrun the cheetah and all the ones that weren’t able to outrun him died. [Notes: Complete description of selective force]

In this case, the teacher highlights the student's use of a need-based phrase suggesting that the pronghorn antelope progressively got faster over time in response to the cheetah chasing it. She then refers the question to another student, who gives a more complete response. She described the interaction this way:

I thought Raphael, he probably knew the mechanisms that were driving the increase in speed, but he didn’t give me enough detail to really let me know his true understanding of it... I thought Craig gave a really good, concise description...that there had to have been – that not all the antelope were super fast, but there had to be some that could run fast to not get eaten. So that there was some variation within the group and that that gave that group the ability to survive

and produce offspring with those same characteristics. So it kind of hit all the defining points of natural selection.

Here we see Teacher 4 making an assumption about Raphael having a complete understanding, but choosing to highlight the part of Raphael's statement that she thought could reveal a misconception. She then reflected that statement to another student, who gave what she considered to be a complete explanation of natural selection. Later in the lesson, the teacher was involved in another exchange with Raphael and another student regarding the corpse flower example. Teacher 4 describes these exchanges:

I was trying to direct them, I guess, to give me the exact answer that I wanted. And maybe not just let them off the hook with an explanation that was good but didn’t really give me all the information I wanted. Kate talked about that the smell helping spread seeds, which I’m not real familiar with how that would work. That was fine, even though it was more of a pollination thing. But that there was a mutated gene that caused them to smell. And Raphael talking about the smells. But I think I wanted them more – both of them to kind of express more of the idea that some – that at some point there had to have been or there was most likely variation and maybe not – and it was just the worst smelling had the advantage.

In this reflection, the teacher is able to articulate not only what she hears in the student responses, but also what she feels is missing and wants students to include. In this way, the teacher is drawing on her multifaceted understanding of the concept and identifying exactly what she wants the students to add. In contrast to the first case, here we see a teacher simply airing out student understanding, highlighting a particular phrase that she thought was unclear, and then calling on another student to provide a more complete understanding.
Overall, we observed 23 instances of highlighting across the different teachers' classes, indicating that all teachers in the study were using information from the educative LP to identify student ideas in classroom talk.

Coding and Diagnosing

The preceding examples illustrate how teachers in the study called attention to student ideas in classroom talk. When highlighting, the teachers simply pulled out examples of student talk and asked for more information. However, we also observed instances in which teachers actually coded or diagnosed student ideas in discussion in ways that were consistent with our categorization from the LP. Instances of coding or diagnosing were much less frequent than highlighting, and often occurred in response to the teacher having heard a number of different student ideas, or having read over student work and responding about it to the class as a whole. For example, Teacher 5, when talking with students about the corpse flower example, codes student ideas twice within a short exchange:

Teacher 5: How do you think this flower came to be like this? Why, Nick.
Nick: Um, I don’t know maybe it like used that smell to attract flies and if it’s like a carnivorous plant then that’s their food source that it attracted flies so it liked to use that smell ‘cause it attracted more fruit flies. [Notes: Anthropomorphic]
Teacher 5: OK so maybe it eats—OK so it does eat the flies. So let me add that bit of information. What else do you think—or why do you think it has this relationship with the flies? Nora.
Nora: Well they like pollinate it.
Teacher 5: OK they are the flower’s pollinators. How do you think this came to be? Um Becky.
Becky: Um they were dying cause they didn’t get enough pollen so they started putting off that smell so the flies would come. [Notes: Need-based change/anthropomorphic]
Teacher 5: OK. So from the previous example and also from Becky's statement it sounds like there’s this idea that things have to become a certain way to survive. Would you agree with that? [Notes: Teacher codes/diagnoses student statement: things have to become a certain way to survive]
Student: I think that they like the smelly ones were the only ones that pollinating because the flies were attracted to them so they became more and more developed. [Notes: No explicit mechanism]
Teacher 5: Good. What made them smelly to begin with?
Student: Just like in that first batch of them there’s like one or maybe two that were smellier than others and they kept breeding and they became more and more developed and (inaudible). [Notes: No explicit mechanism]
Student: Maybe they used to like grow on rotting corpses and so the ones—so in order to—and then they weren’t doing so well and so in order to get the flies to come they started developing their own scent based off what they used to grow on. [Notes: Need-based change anthropomorphic]
Teacher 5: K, so this idea of they did this to become this, right? [Notes: Teacher codes/diagnoses student ideas: They did this to become this]

Here we see the teacher summarizing what she hears in student talk, at first stating her diagnosis in general terms, and then using the exact language from the LP to describe what she hears the students saying. She has elicited a number of unclear statements that in some cases seem to involve naïve conceptions related to anthropomorphic ideas about how the flower is able to change itself. After watching this clip, the Teacher 5 described her statements as: So primarily I just heard a bunch of need-based change and this idea that...in order to attract the flies they developed the stinky smell. And so it just seemed to me that I was just trying to work away from that. So to get them to think, okay, so they became this because they needed to do this. And I was trying to get them to see sort of the oddness of that idea, I guess. Here we see the teacher using the language straight from the LP to identify this sequence of student ideas that she elicited in the discussion. She says that she was calling out the way that students were stating their ideas - the fact that in order to do this, they did that - to help them see the 'oddness' of the idea. Teacher 3 also used explicit language from the LP to identify student ideas during her discussion. Again returning to the antelope example: Speaker

Student: Why didn’t they just share food?

Teacher 3: OK, once again, remember that A word I was using? Anthropomorphizing. Making them like humans. [Teacher codes student ideas as being anthropomorphic]

In this example the teacher does not highlight the student statement at all but immediately calls the students' attention to the anthropomorphizing without much explanation. When viewing this clip, the teacher said:

It seemed…[reads from transcript] "maybe the initial generation had to find a way to survive." So that's very anthropomorphic, very much 'we need to find a way to survive this, we'd better run faster' kind of idea. And so I was trying again to lead her to the idea without just saying, "no, you're wrong. This is the way it is." Trying to get her to get to the point where she could see for herself that anthropomorphizing is not the best way to understand natural selection.

Thus, as illustrated by the instances above, we see that teachers in the study were also coding or diagnosing the particular kinds of student thinking they were seeing during class discussions. These instances indicate that teachers were not only actively listening to student thinking, they were also categorizing it based on their understanding of the educative LP and then diagnosing that kind of thinking in classroom talk.

Looking Across the Data: Teacher Responses to Student Ideas

As the preceding sections indicate, teachers were able to elicit student ideas and then highlight and code or diagnose those ideas. We also noted patterns in the ways teachers responded to students, which fell primarily into four categories: revoicing, reconstructing, checking, and asking students to provide underlying mechanisms. These approaches were not mutually exclusive; teachers combined them in a variety of ways.

Revoicing, Reconstructing & Checking. When highlighting student ideas, teachers would in many cases revoice or reconstruct a student's idea and then follow up by asking the student if that was what they thought, asking the student to choose between two alternatives, or asking a short-answer question. In this way teachers were checking whether they had indeed identified the appropriate level of student thinking. Many of the examples above illustrate these trends in teachers' use of the educative LP in classroom discussions.

Asking for underlying mechanisms. Another common response, one that often overlapped with highlighting student ideas, was a request for students to provide the underlying mechanism. In this way, teachers pushed students beyond simple descriptions of changes in traits and elicited more scientific understandings based on mechanisms of change. Take the following sequence as an example:

Student: Well it could have been natural selection so the faster ones did survive better and so maybe like over time then they became faster. [No explicit mechanism]

Teacher: OK so if they became faster, what do they have to do here? Go ahead, Chloe. [Teacher highlights student use of 'became' and asks for mechanism]

Student: Well, if the slower antelope were killed by a cheetah because they aren’t fast, then the faster ones would go on to reproduce more offspring than the slow ones and eventually the fastness pace would just take over the entire population. [Full explanation of selective force]
Here we see the teacher picking up on the student's lack of a clear mechanism for change and asking specifically for the student to provide the mechanism by which the population changes.

Interrelationship between Teacher Responses and Naive Conceptions

Teachers were more likely to ask students to provide mechanisms when students shared ideas that included some kind of need-based change, used 'mutation' in unclear ways, or provided no information about the mechanism than when student ideas were diagnosed as selective force. However, if students expressed ideas related to the selective force construct, instead of the origin of trait construct as above, there was no clear pattern of teacher highlighting or diagnosing, even when non-scientific conceptions were offered. Thus teachers appear to have focused on the origin of trait construct over the selective force construct in classroom discussions. Teachers later commented in their post-enactment interviews that they felt their students had already mastered the selective force construct and, therefore, did not require more support for developing more nuanced understandings in this area.

Feedback

Although the teachers' questions in the preceding sections could be construed as instances of teachers providing feedback to students in discussions, we nevertheless did not see many instances of informational feedback from teachers to students. This is most likely a consequence of the placement of "How did it come to be?" toward the beginning of the unit; indeed, most of the teachers referred to using the information they had collected in the assessment to inform their teaching later in the unit. Thus, feedback from this assessment flowed primarily from student to teacher in the form of naïve conceptions about natural selection. Teachers used this information to determine appropriate instructional steps within their units.
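The pattern summarized above is, at bottom, a cross-tabulation of coded discussion turns: each turn pairs a diagnosed student idea with a teacher response type. The sketch below illustrates that kind of tabulation in Python. The category labels echo the study's coding scheme, but the turn data are invented purely for illustration and do not reproduce the study's counts.

```python
from collections import Counter

# Hypothetical coded discussion turns: (diagnosed student idea, teacher response).
# The labels follow the paper's coding scheme; the data are invented examples.
turns = [
    ("need-based change", "ask for mechanism"),
    ("need-based change", "ask for mechanism"),
    ("no explicit mechanism", "ask for mechanism"),
    ("no explicit mechanism", "revoice"),
    ("unclear mutation", "ask for mechanism"),
    ("selective force", "revoice"),
    ("selective force", "check"),
]

# Cross-tabulate: how often does each response type follow each diagnosed idea?
crosstab = Counter(turns)

def p_mechanism_request(idea):
    """Proportion of turns diagnosed as `idea` that drew a mechanism request."""
    total = sum(n for (i, _), n in crosstab.items() if i == idea)
    asked = crosstab[(idea, "ask for mechanism")]
    return asked / total if total else 0.0

for idea in ("need-based change", "no explicit mechanism", "selective force"):
    print(f"{idea}: {p_mechanism_request(idea):.2f}")
# With this invented data: need-based change 1.00, no explicit
# mechanism 0.50, selective force 0.00 -- mirroring the reported trend.
```

In the actual study the codes came from the educative LP and the analysis was qualitative; this sketch only makes concrete the kind of conditional pattern being described.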
There were, however, a few instances in the enactment of formative assessment two when teachers engaged in informational feedback to students, as in the following example with Teacher 4:

Derek: So like, instead of the sweat nectar they we mostly find in North American that most bees populate, I would imagine that there aren’t many bees in Southern America. I won't lie but (inaudible) it smells like corpse. But I guess over time it started smelling ranker, er ranky, or worse and worse. Then more flies were attracted to it and then the more kind of thing you want attracted to it, I guess the better chance you had of spreading your loins, I guess— [No explicit mechanism]

Teacher 4: Yes, the fruit of your loins, right, yah. OK, all right. Derek brought up a good point with the flies and here in North America most pollination is done by bees. Can you relate something to when we looked at Darwin’s ideas about resources and competition? With what this flower is doing as opposed to what we traditionally think of. [Teacher asks student to relate his idea to prior knowledge from class]

Susan: So there might have been multiple types of flowers that they all used bad smells to attract their perspective pollinator. And the one that was able to—so in this case the corpse flower—which produced the worst smell was able to attract for more flies and pollinate better than it’s other, I guess, species. And so the one with the worst smell, it became more dominant and it became more dominant anything it put out ultimately have it smell worst. So you just kind of get that competition going where the worst smelling populates the best. [Student refines answer to incorporate variation]

In this way, Teacher 4 asks the student to draw on his prior knowledge from class to state a more scientific answer. In contrast, Teacher 9 took the opportunity to state exactly the elements that she felt were missing from the student’s response, as shown in this example:

Student: Okay, so the plant that smelled like rotting flesh attracted more flies which gave them more food so they could survive and pass on their genes and the ones that did not died off. They just died, turned over sideways. [Selective force]

Teacher 9: Okay, so what I’m hearing, what I’m hearing from your explanation is that the flower, this type of plant is a carnivorous plant. Right. That the flies are its food. [Revoice and check]

Student: Yes.

Teacher 9: That’s one possible explanation. So back in the day, the original… flower. What I would like to see more in your answer if you could add that was that some flowers smelled more like rotting flesh, some flowers smelled a little bit differently that made them more attractive than others. Right. Let’s add natural selection into that. [Feedback: teacher states criteria for correct answer (incorporate variation)]

Furtak et al. 29 In both of these instances, we see that teachers were using the information they had collected about student thinking to provide informational feedback to help students develop more sophisticated understandings. Discussion Although LPs are currently used as foundations for curriculum and assessment development, our study suggests that learning progressions may also have utility as supports for teachers’ formative assessment practices. Our data indicate that teachers participating in the study were able to highlight and diagnose student ideas represented in the learning progression, indicating that the sustained professional development centered on this tool had an influence on teachers’ thinking about natural selection, as well as enactment of formative assessment in whole-class discussions. Although student ideas about natural selection often involve many naïve conceptions related, the teachers in the study were all successful at identifying these ideas during class discussion in ways consistent with the educative LP. This suggests that the group of teachers had come to a shared understanding and had, in a way, developed a group vision of what student thinking looked like in this domain. Although we never observed teachers using the LP during discussions, the nevertheless LP as a scaffold for PLC meetings was an influence on their understanding of student ideas in this domain. Furthermore, our analyses indicate that teachers were able to consistently enact the formative assessment across classrooms in consistent ways. The four different questions, debated and carefully developed by the PLC across two years of meetings to specifically draw out student ideas related to the LP, were posed as open-ended questions by each teacher and drew out similar kinds of ideas in each classroom. 
This result suggests that LPs, when put in the hands of teachers in a sustained professional development environment, can also serve as frameworks for teacher-developed formative assessments. The formative assessments may be developed to elicit the student ideas represented in the LP, and the LP can then be used to evaluate the quality of the formative assessment (i.e., if the ideas represented in the LP are not being elicited by the formative assessment, why? If there are ideas elicited by the formative assessment that are not present in the LP, should they be somehow integrated?).

The formative assessments were given on an instructional timeline from the beginning of the unit to the end of the unit. “How did it come to be?” was enacted at the beginning of the unit and therefore provided feedback mainly from student to teacher, though it also allowed some informational feedback from the teacher to the student. The student feedback included more selective force ideas than origin of trait ideas. As such, teacher feedback (revoicing, reconstructing, checking, and asking for mechanisms) focused primarily on the origin of trait construct rather than on the selective force construct.

If LPs continue to be used as supports for classroom instruction, they should be used in concert with sustained professional development. It has been argued that LPs could create the misperception that student thinking develops in a one-dimensional, linear way from naïve ideas to sophisticated ones (Shavelson, 2009). To help teachers learn how this tool can structure their classroom practice, LPs should be introduced to teachers over long periods of time. Teachers should be given opportunities to identify student thinking represented in the LP, both in writing and in classroom talk. Furthermore, we argue that teachers should not be viewed as merely implementers of the teaching strategies suggested by LPs, but should be invited to help co-construct LPs, develop assessments linked to them, and then help other teachers come to understand the value of this tool.

Despite our success in the present study, we feel that we have not yet learned how to support teachers in adopting informational feedback strategies to use on-the-fly in classroom discussions. Although the formative assessment we analyzed for this paper was intended for use early in the unit, we would still expect to see instances of teachers making criteria for student explanations explicit or contrasting student ideas with each other. Despite our efforts to help teachers anticipate student ideas and plan feedback strategies in advance, we feel that more work needs to be done to develop shared meanings about the purpose and role of feedback in formative assessment discussions. Future research should explore how to better support teachers in developing feedback approaches once student ideas have been identified.

The teachers in this PLC have taken an important step in developing the capacity to conduct formative assessment through consistent diagnosis of student ideas during class discussion. The evidence above indicates that these teachers are in the process of developing professional vision through a shared understanding of the highlighting and coding of student ideas about natural selection. The field of science education is currently devoting considerable effort to the design of LPs (Shavelson, 2009); however, LPs are “as yet unproven tools for improving teaching and learning” (Corcoran et al., 2009, p. 5).

This paper presented the results of a study in which an educative LP was developed and piloted with a group of practicing biology teachers for the purpose of better understanding how these new tools can support improved classroom practice. The results of the study indicate that, despite concerted efforts to help teachers improve their practice, the influence of the LP varied between teachers, calling into question the efficacy of LPs as a ‘cure-all’ educational reform.

References

American Association for the Advancement of Science. (1990). Science for All Americans. New York: Oxford University Press.
American Association for the Advancement of Science. (1993). Benchmarks for Science Literacy. New York: Oxford University Press.
American Association for the Advancement of Science. (2001). Atlas of Science Literacy. Washington, D.C.: National Academies Press.
Anderson, D. L., Fisher, K. M., & Norman, G. J. (2002). Development and Evaluation of the Conceptual Inventory of Natural Selection. Journal of Research in Science Teaching, 39(10), 952-978.
Atkin, J. M., Coffey, J. E., Moorthy, S., Sato, M., & Thibeault, M. (2005). Designing Everyday Assessment in the Science Classroom. New York: Teachers College Press.
Barton, A. C. (1998). Reframing "Science for All" through the Politics of Poverty. Educational Policy, 12(5), 525-541.
Bishop, B. A., & Anderson, C. W. (1990). Student Conceptions of Natural Selection and Its Role in Evolution. Journal of Research in Science Teaching, 27(5), 415-427.
Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2002). Working Inside the Black Box: Assessment for Learning in the Classroom. London: King's College.
Black, P., & Wiliam, D. (1998a). Assessment and Classroom Learning. Assessment in Education, 5(1), 7-74.
Black, P., & Wiliam, D. (1998b). Inside the Black Box: Raising Standards through Classroom Assessment. Phi Delta Kappan, 80(2), 139-148.
Borko, H. (2004). Professional Development and Teacher Learning: Mapping the Terrain. Educational Researcher, 33(8), 3-15.
Briggs, D. C. (2008). Synthesizing Causal Inferences. Educational Researcher, 37(1), 15-22.
Catley, K., Lehrer, R., & Reiser, B. (2005). Tracing a Prospective Learning Progression for Developing Understanding of Evolution. Paper commissioned by the National Academies Committee on Test Design for K-12 Science Achievement.
Cohen, D. K. (1990). A Revolution in One Classroom: The Case of Mrs. Oublier. Educational Evaluation and Policy Analysis, 12(3), 311-329.
Collins, A. (1998). National Science Education Standards: A Political Document. Journal of Research in Science Teaching, 35(7), 711-727.
Corcoran, T., Mosher, F. A., & Rogat, A. (2009). Learning Progressions in Science: An Evidence-Based Approach to Reform. Philadelphia, PA: Consortium for Policy Research in Education.
Dagher, Z. R., & Boujaoude, S. (2005). Students' Perceptions of the Nature of Evolutionary Theory. Science Education, 89, 378-391.
Davis, E. A., & Krajcik, J. (2005). Designing Educative Curriculum Materials to Support Teacher Learning. Educational Researcher, 34(3), 4-14.
Dewey, J. (1902). The School and Society and The Child and the Curriculum. New York: Touchstone.
DuFour, R. (2004). What Is a "Professional Learning Community"? Educational Leadership, 61(8), 6-11.
DuFour, R., & Eaker, R. (1998). Professional Learning Communities at Work: Best Practices for Enhancing Student Achievement. Bloomington, IN: Solution Tree.
Ferrari, M., & Chi, M. T. H. (1998). The Nature of Naive Explanations of Natural Selection. International Journal of Science Education, 20(10), 1231-1256.
Furtak, E. M. (2006). The Dilemma of Guidance in Scientific Inquiry Teaching. Stanford University, Stanford, CA.
Furtak, E. M. (2009a). Learning Progressions to Support Teacher Learning. Paper presented at the Annual Meeting of the American Educational Research Association.
Furtak, E. M. (2009b). Toward Learning Progressions as Teacher Development Tools. Paper presented at the Learning Progressions in Science Conference, Iowa City, IA.
Furtak, E. M., & Ruiz-Primo, M. A. (2008). Making Students' Thinking Explicit in Writing and Discussion: An Analysis of Formative Assessment Prompts. Science Education, 92, 799-824.
Furtak, E. M., Ruiz-Primo, M. A., Shemwell, J. T., Ayala, C. C., Brandon, P., Shavelson, R. J., et al. (2008). On the Fidelity of Implementing Embedded Formative Assessments and Its Relation to Student Learning. Applied Measurement in Education, 21(4), 360-389.
Geraedts, C. L., & Boersma, K. T. (2006). Reinventing Natural Selection. International Journal of Science Education, 28(8), 843-870.
Goodwin, C. (1994). Professional Vision. American Anthropologist, 96(3), 606-633.
Hattie, J., & Timperley, H. (2007). The Power of Feedback. Review of Educational Research, 77(1), 81-112.
Keeley, P. (2008). Science Formative Assessment: 75 Practical Strategies for Linking Assessment, Instruction, and Learning. Thousand Oaks, CA: Corwin Press.
Kluger, A. N., & DeNisi, A. (1996). The Effects of Feedback Interventions on Performance: A Historical Review, a Meta-Analysis, and a Preliminary Feedback Intervention Theory. Psychological Bulletin, 119, 254-284.
Lawrence Hall of Science. (2000). Magnetism and Electricity. Nashua, NH: Delta Education.
Mayr, E. (1997). This Is Biology. Cambridge, MA: Harvard University Press.
National Research Council. (1996). National Science Education Standards. Washington, D.C.: National Academy Press.
National Research Council. (2001a). Classroom Assessment and the National Science Education Standards. Washington, D.C.: National Academy Press.
National Research Council. (2001b). Inquiry and the National Science Education Standards. Washington, D.C.: National Academy Press.
National Research Council. (2007). Taking Science to School: Learning and Teaching Science in Grades K-8. Washington, D.C.: National Academies Press.
Pellegrino, J. W., Chudowsky, N., & Glaser, R. (Eds.). (2001). Knowing What Students Know: The Science and Design of Educational Assessment. Washington, D.C.: National Academies Press.
Ruiz-Primo, M. A., & Furtak, E. M. (2006). Informal Formative Assessment and Scientific Inquiry: Exploring Teachers' Practices and Student Learning. Educational Assessment, 11(3&4), 237-263.
Ruiz-Primo, M. A., & Furtak, E. M. (2007). Exploring Teachers' Informal Formative Assessment Practices and Students' Understanding in the Context of Scientific Inquiry. Journal of Research in Science Teaching, 44(1), 57-84.
Sadler, D. R. (1989). Formative Assessment and the Design of Instructional Systems. Instructional Science, 18, 119-144.
Shavelson, R. J. (2009). Reflections on Learning Progressions. Paper presented at the Learning Progressions in Science Conference.
Shavelson, R. J., Yin, Y., Furtak, E. M., Ruiz-Primo, M. A., Ayala, C. C., Young, D. B., et al. (2008). On the Role and Impact of Formative Assessment on Science Inquiry Teaching and Learning. In J. Coffey, R. Douglas & C. Stearns (Eds.), Assessing Science Learning (pp. 21-36). Arlington, VA: NSTA Press.
Shemwell, J. T., & Furtak, E. M. (2009). Problems with Argumentation for Conceptual Science Learning: When Arguments and Explanations Diverge. Paper presented at the European Association for Research in Learning and Instruction Biennial Meeting.
Sherin, M. G., & van Es, E. A. (2003). A New Lens on Teaching: Learning to Notice. Mathematics Teaching in the Middle School, 9(2), 92-95.
Shtulman, A. (2006). Qualitative Differences between Naive and Scientific Theories of Evolution. Cognitive Psychology, 52, 170-194.
Shulman, L. S. (1986). Those Who Understand: Knowledge Growth in Teaching. Educational Researcher, 15(2), 4-14.
Spillane, J. P. (1999). External Reform Initiatives and Teachers' Efforts to Reconstruct Their Practice: The Mediating Role of Teachers' Zones of Enactment. Journal of Curriculum Studies, 31(2), 143-175.
Strike, K. A., & Posner, G. J. (1992). A Revisionist Theory of Conceptual Change. In R. Duschl & R. Hamilton (Eds.), Philosophy of Science, Cognitive Psychology, and Educational Theory and Practice (pp. 147-176). Albany, NY: SUNY Press.
Thompson, J., Braaten, M., & Windschitl, M. (2009). Learning Progressions as Vision Tools for Advancing Novice Teachers' Pedagogical Performance. Paper presented at the Learning Progressions in Science Conference.
Wiliam, D. (2007). Keeping Learning on Track: Classroom Assessment and the Regulation of Learning. In F. K. Lester (Ed.), Second Handbook of Research on Mathematics Teaching and Learning (pp. 1053-1098). Greenwich, CT: Information Age Publishing.
Wilson, M. (2005). Constructing Measures: An Item Response Modeling Approach. Mahwah, NJ: Erlbaum.
