Paper presented at NARST 2015, Chicago
Using Learning Progressions to Support Pre-Service Physics Teachers' Noticing

Claudia von Aufschnaiter¹, Alicia Alonzo², & Daniel Kost¹

¹ Justus Liebig University of Giessen, Germany
² Michigan State University, USA

Contact: [email protected]; [email protected]
“Noticing” enables teachers to recognize and interpret key features of classroom events and to respond flexibly to students’ thinking during instruction. This study explores the premise that learning progressions can serve as a framework for supporting teachers’ noticing and interpreting of student ideas. In pairs, pre-service physics teachers (PSPTs) discussed short video clips, each highlighting a feature of common student thinking about force and motion. PSPTs’ discussions were video-recorded before and after 150 minutes of instruction on learning progressions. Using a category-based coding scheme, we investigated whether the PSPTs focused their noticing on student thinking and, if so, whether they utilized criteria derived from learning progressions (or other frameworks) for their interpretations. We expected that, as a result of instruction, the PSPTs would pay more nuanced attention to differences in the sophistication of students’ understanding. Results from the analysis of two pairs of PSPTs indicate that both groups spent the entire time on discussion of the videos (no off-task activities) and, at both pre- and post-test, engaged with student thinking (rather than other aspects of the videos). However, whereas one group demonstrated similar approaches to and expressed similar ideas about student thinking before and after learning progression instruction, the other group engaged more deeply with student thinking prior to the instruction. After instruction, these PSPTs used the learning progression to assign student ideas to levels, without paying as much attention to the nuances of student thinking as they had prior to instruction. Thus, contrary to our expectations, use of the learning progression did not appear to improve the four PSPTs’ ability to notice student thinking. Implications for further research on teacher noticing with learning progressions are discussed.
Problem
Contemporary views of teaching and learning call for “next generation” science teachers who are able to craft instruction that both responds to and builds on students’ ideas. There is evidence that teachers develop knowledge of student ideas with classroom experience (e.g., van Driel, Verloop, & de Vos, 1998); therefore, one challenge for teacher preparation is how to support novice teachers so that they enter the classroom prepared to engage with their students’ ideas. “Noticing” (van Es & Sherin, 2002) — which includes “identifying what is important” (p. 573) and using knowledge to reason about classroom events — may serve as the foundation for teachers’ formative assessment practices and student-centered instruction. It enables teachers to recognize and interpret key features of classroom events and to respond flexibly to students’ thinking during instruction. A number of recent efforts have focused on the use of video to develop teachers’ capacity to notice student thinking (e.g., Alonzo & Kim, 2012b; Borko, Jacobs, Eiteljorg, & Pittman, 2008; Star & Strickland, 2008; van Es & Sherin, 2008). Even in the simplified context of videos, teachers may not initially focus attention on student thinking (van Es & Sherin, 2008), and frameworks have been used to guide novice teachers’ work with videos (e.g., Santagata & Guarino, 2011; Star & Strickland, 2008).

Among other uses proposed for learning progressions (LPs) – “descriptions of the successively more sophisticated ways of thinking about a topic that can follow one another as children learn” (National Research Council, 2007, p. 219) – is support for teachers’ formative assessment practices (e.g., Alonzo, 2011; Black, Wilson, & Yao, 2011; Furtak, 2012). It is not enough for a teacher to recognize that students have made a mistake or are offering the correct solution/desired response; he or she must also be able to identify the cause of the students’ error (Bonniol, 1991) or the potential of their ideas.
Specific diagnoses of students’ learning difficulties are a vital part of the formative assessment process (Nussbaum, 1981), which may draw upon teachers’ familiarity with typical student reasoning and ideas. LPs may provide teachers with a sense of the range and variety of ways students are likely to think about a topic, offering a more nuanced framework for identifying and interpreting student thinking than “gets it”/“doesn’t get it” (Alonzo, 2012). This dichotomous view of student thinking has been identified as problematic for formative assessment and, thus, for student learning (Minstrell, Anderson, & Li, 2011), yet it is common in pre-service teachers’ approaches to formative assessment (Otero, 2006). To date, research has focused primarily on in-service teachers’ use of LPs (e.g., Alonzo, de los Santos, & Kobrin, 2014; Alonzo & Elby, 2015a; Covitt, Syswerda, Caplan, & Cano, 2014; Furtak & Heredia, 2014; Furtak, Morrison, & Kroog, 2014; Gunckel, Covitt, & Salinas, 2014). Thus, in this exploratory study, we sought to investigate the possibility of using LPs as a framework for supporting pre-service teachers’ noticing. In particular, we were interested in whether a specific LP supported novice teachers’ noticing of student ideas in videos.

Procedure

An intact cohort of 40 German pre-service physics teachers (PSPTs; all traditional university students, one-third female) participated in the study while enrolled in a physics education course that focused on diagnosis of student learning and implications for instruction. During the preceding physics education course, these PSPTs responded to both the Force Concept Inventory (FCI; Hestenes, Wells, & Swackhamer, 1992) and ordered multiple-choice (OMC; Briggs, Alonzo, Schwab, & Wilson, 2006) items associated with a force and motion (FM) LP (Alonzo & Steedle, 2009). Individual test results were provided to the PSPTs in order to help them assess their own learning and identify where they lacked conceptual understanding.
A subsequent whole-class discussion focused on the test results, as well as on reasons for any difficulties the PSPTs experienced and how instruction might respond to these difficulties.
During the 10 class lessons that preceded this research, the PSPTs analyzed videos depicting students working in groups on physics tasks. PSPTs were introduced to the idea of levels of conceptualization (von Aufschnaiter & Rogge, 2010) as a framework for analysis of student thinking. The PSPTs were also prompted to notice whether student ideas were correct or incorrect and, for the latter, whether the ideas seemed to demonstrate known student misconceptions. In January 2014, 2.5 of the class sessions (a total of 210 minutes) were devoted to a set of study-related activities:

1) Video-based pre-test (30 minutes of the first session)

In pairs, PSPTs discussed two to three video clips (depending on the length of discussion per clip), each highlighting a feature of common student thinking about FM. All three video clips were excerpted from cognitive interviews with high school physics students in the United States (Alonzo & Kim, 2012a).
Clip 1 (approximately 1.5 minutes) depicts a male student discussing a rocket-propelled sled that travels with increasing speed along a horizontal frictionless track. In the video, the student appears to have the common idea that force is proportional to velocity and, thus, that for the speed of the sled to increase at a constant rate, the force exerted by the engine must also increase. However, the student also reveals some contradictions in his thinking, which may indicate confusion about the words he is using to describe the situation, leading to uncertainty as to what can be concluded about his thinking.

Clip 2 (approximately 4 minutes) depicts a female student first discussing forces acting on a ball that is bounced vertically on a floor and then discussing forces acting on a ball that is thrown vertically up into the air. In the video, the student is able to repeat what her teacher had said about these sorts of scenarios. However, she also expressed her own view, which was quite different. The student repeatedly expressed the common idea that a force must be exerted in the direction in which an object is traveling and, thus, that when the ball is thrown up into the air (or pushed down towards the ground), the force from the person’s hand is still acting after the ball has left the person’s hand.

Clip 3 was offered to those PSPTs who had some time left after discussing Clips 1 and 2. In Clip 3 (approximately 2 minutes), a female student talks about a girl pushing a heavy rock that does not move, demonstrating some conceptual difficulties with both balanced forces and Newton’s third law.
The PSPTs were asked to discuss what they noticed about student thinking in each clip, possible reasons for student thinking, and, as time permitted, how they might respond with instruction. As the clips were in English, the PSPTs were offered English transcripts along with German translations. In addition, the clips were subtitled in English to support the PSPTs in understanding speech in a language with which they were not very familiar. During the class session, about half of the PSPTs relied at least partially on the English transcript when talking about the videos. Furthermore, some PSPTs suggested minor revisions to the German translations, indicating that they were cross-checking the accuracy of the translations.

2) LP instruction (150 minutes)

Session 1 (60 minutes). Directly after the pre-test, the instruction addressed PSPTs’ conceptual understanding of force and motion, as previous testing revealed that not all PSPTs demonstrated sound knowledge of the topic. PSPTs were asked to respond to the OMC items in pairs and to discuss why a particular answer is correct whereas others are not. They were also prompted to write down the concepts that a student needs to understand in order to choose the correct answer purposefully (all together about 20 minutes). During these activities, the lecturer (the first author) made sure that pairs selected the correct answer and
explained the physics concepts with further examples when asked by PSPTs. In a second step, the pairs of PSPTs completed a template that displays the FM concepts and how these relate to each other (about 15 minutes; Figure 1 depicts the German version that was used in class). Even though the PSPTs did not know it at the time, the template depicts level 4 of the FM LP. Later in session 1, pairs of PSPTs analyzed two new short video clips (one from a student interview and one from a classroom setting) with the prompt to contrast how the students in the two videos conceptualize FM (about 15 minutes). In a subsequent classroom discussion (about 10 minutes), the student ideas identified by the PSPTs were ordered from simplest to most sophisticated. This order was used to introduce the idea of the FM LP as a means to describe student thinking at different levels.

Session 2 (90 minutes). At the beginning of the second session, the PSPTs were given completed templates for each level of the FM LP, including a completed template for level 4, and were asked to identify what changes from one level to the next. They were also prompted, if time was left, to assign levels to the transcripts of the two videos analyzed in session 1 (all together about 15 minutes). To familiarize themselves with the FM LP (about 50 minutes), PSPTs tried to assign the levels to options of OMC items (during the first third of the time) and to student thinking in up to three additional video clips. In addition, they discussed both why students in the videos might hold particular ideas and what teachers might do to support students in making progress with respect to the FM LP. Finally, the lecturer briefly discussed the potential of LPs, in general, for teaching and learning by highlighting which ideas might be addressed with instruction in order to move students from one level to the next. Including short whole-class discussions about these ideas, this last phase took about 25 minutes.
Figure 1. Template for FM concepts (German version)

3) Video-based post-test (30 minutes of a third session)

In pairs, PSPTs discussed the video clips from the pre-test, with the same request to discuss what they noticed about student thinking, possible reasons for student thinking, and how they might respond with instruction.
Data

During all three activities described above, video-recordings and written notes were collected from 12 pairs of PSPTs. PSPTs who were willing to be video-recorded were informed that they could decide whether to give consent for the analysis of the footage after all three activities were completed. In four of the 12 pairs, at least one PSPT did not give consent. Because this was an exploratory study and our goal was not to make claims about the effectiveness of an LP-based intervention, we did not adopt an experimental methodology, assign PSPTs to particular groups, or insist that they sit with the same partner throughout all study activities. Therefore, the composition of the pairs changed throughout the sessions. In the end, there were only two pairs that remained the same across all three sessions and in which both members agreed to have their video-recordings analyzed: one male and one female pair. For an initial exploration of PSPTs’ noticing, we focused on these four PSPTs. All four were in the top third of the class in terms of performance on the OMC items (answering 14 or 15 of the 15 FM OMC items at the highest possible LP level) and also performed well on the FCI. They were preparing to teach math and physics in lower and upper secondary schools and had taken similar university courses. They were also concurrently enrolled in a preparation course for a 5-week secondary school physics teaching practicum. In-class observation of these PSPTs and their exam results in physics education courses indicate that they are very committed to engaging with student thinking and learning, attend almost all class sessions, and interact frequently with each other and the lecturer about class topics. Therefore, we do not consider these PSPTs to be representative of the class as a whole in terms of either abilities or dispositions.
This is reflective of a general problem in this type of research: PSPTs with very weak test results are less likely to agree to video-recording and are more likely to skip one or more lessons associated with the research. However, by focusing on these PSPTs, as a first consideration of the LP instruction, we hoped to characterize what might be possible, rather than what might be typical. Furthermore, higher-level discussions are more likely to support the refinement of a coding system that we can later apply to more typical student discussions.

Analysis

Video data for the two groups were transcribed first in German and then translated into English for shared analysis. Transcripts were analyzed using a combination of a pre-defined category system, derived from our previous work with pre- and in-service teachers (e.g., Alonzo & Kim, 2012a; Alonzo & Elby, 2015b; Hofmann, 2015), and inductively developed additional codes (similar to approaches described in Mayring, 2000).

Table 1
Preliminary set of codes for analysis

Code    Brief description
STj     Overall judgement about student thinking as completely right or completely wrong (“gets it”/“doesn’t get it” perspective)
STd     Description of student thinking without evaluation
STm     Identifying what is missing in student thinking
STc     Identifying what is correct in student thinking
STi     Identifying what is incorrect in student thinking
STn     More nuanced analysis of student thinking, with attention to the student’s words
STe     Focus on the student’s emotions
IN      Instruction
Q       PSPTs try to make sense of the question posed in the video
-TF     Theoretical framework (either ST-TF or IN-TF)
reason  Reasons for ST or IN offered
/       Different interpretation possible; both codes apply
(…)     Corresponding excerpt from transcript
[…]     Further interpretation/remarks by coder
The pre-defined set of categories was used to identify the focus of PSPTs’ discourse: Student Thinking (ST), Instruction (IN), Other (OT; such as expressions of PSPTs’ own emotions and feelings), Physics Content (PT), Educational Content (EC), and, where it applied in addition, Theoretical Framework (TF; whenever PSPTs addressed levels of conceptualization, the LP, or any other theoretical idea). Analysis indicated that some of the codes did not appear in these particular PSPTs’ discussions, whereas others required more differentiation. After some refinement, the codes shown in Table 1 were used.

Table 2
Excerpt from data analysis (male group, pre-test, clip 1)

13:27 S2: Or else I would say, I found in the middle, where he says, wait a second, he would move with a constant velocity.
13:36 S1: Where does he say that?
13:38 S2: Down here. (Points to the material in front of them) || And he will accelerate and accelerate.
13:44 S1: Mhm [agreeing]
13:46 S2: Well in any case a contradiction.
    [Analysis: constant velocity contradicts with acceleration — STi]
13:47 S1: But I think it is interesting here, here he has, I don’t know whether that’s a ..?.. problem is, but here he has | accelerated in any case, accelerate, push. In some way he sees pushing as a type of acceleration. And he has the correct idea there and says when he goes faster || faster, exactly [here S1 doesn’t use the German translation and pauses shortly] when he goes faster (laughs) (S2 laughs) || If he goes faster, then there has to be a force acting, that pushes him, it doesn’t work any other way.
    [Analysis: pushing is acceleration — STd; PSPT curious about ST (“interesting”)]
14:10 S2: Yeah.
14:11 S1: That much he has understood.
    [Analysis: going faster → force — STc]
14:13 S2: That’s also my opinion. One can’t interpret it too much, because he above all does say at the end, he justifies the increase in velocity with an accompanying increase in force, because increasing velocity would also indicate a higher force.
    [Analysis: going faster → force — STn]
14:29 S1: Now the question is why he believes that, because he doesn’t say that, he does say down here, he doesn’t talk about friction any more that is stopping him. || Well at least he did.
14:39 S2: Ok, good, the interviewer did still ask (points to the material in front of them) whether the force is greater, less or remains the same. Then he says | Good, he says here, I think. Ok apart from that, it would increase, the force would increase.
14:50 S1: Exactly, but that requires that he has an intuition that the friction will also increase with increasing velocity, he does have to keep that in the back of his mind, otherwise he would not get the idea that it has to increase. So, either that or he says in principle he has this idea, well to get faster I have to also push harder.
15:11 S2: You mean that pushing harder has to require a higher force, because he sees the friction in opposition to it.
15:17 S1: Either that.
15:19 S2: That he has this idea.
15:20 S1: Either that or he has yet, I would say, even less. Although I find that even less likely, well |
15:23 S2: On top of that he asks there is no friction and then the interviewer, or whoever is doing that there, tells him he is right. | And he says despite this, at the end, that the force would have to increase.
15:35 S1: I am just asking myself whether what he is saying here is contradicting or not.
    [Analysis: ST, reason: friction; PSPT curious about reasons for ST (“now the question is”); force required to overcome friction & force required to get faster — STn; overinterpretation (14:50, first two lines); PSPT curious about ST (15:35)]
15:40 S2: At what point? That is the question.
15:42 S1: He says here | “Because there is no friction that stops him” | That means he knows that in order to let something go faster, I have to exert more force.
    [Analysis: more force required to get faster — STc]
15:50 S2: Yep.
In addition to coding, we briefly tried to summarize the content of PSPTs’ discussions in order to identify whether they seemed to repeat similar arguments throughout their analysis or expressed a variety of ideas. Even though we performed the analysis sentence by sentence, we grouped coherent successive ideas together into chunks (see example in Table 2). Coding was performed independently by two authors of the paper, and then results were compared and discussed for shared interpretation. At this stage of the research, aiming to establish hypotheses about PSPTs’ noticing before and after the intervention with a very small amount of data, we did not formally calculate interrater reliability, but we are well aware that further investigations would require such an analysis.

Findings

Both groups demonstrated high engagement with the video analysis, in general, and with student thinking, in particular. No off-task activities were identified; instead, several instances indicated that the PSPTs were interested in student thinking, or curious about it, but also considered it to be challenging:

The end is really great. ||| The end is great. […] Well, I can’t wait. […] There is a lot in here. (male group, pre-test, clip 2; see also 13:47 and 14:29 in Table 2)

S1: We can already load the next video.
S2: Can we discuss this again real quick? (male group, pre-test, clip 1)

Oh, this is difficult. (female group, post-test, clip 1)

Well, I don’t know, especially with this student I find it very, very difficult. (male group, post-test, clip 2)

This interest was observed even though the PSPTs had already engaged with student thinking over the preceding 10 weeks of the course. With the exception of a few isolated instances, the PSPTs spent the entire time noticing student thinking (see also the number of ST codes in Table 2).
When PSPTs addressed instruction, they mainly did so in order to identify reasons for student thinking:

This conceptual network has not been properly established in her case. || I mean, she is missing the alternative concept to it, that the force comes with it. (S1: Mhm. [agreeing]) || And accordingly that would then also speak for these learning progressions, that one so to speak has to first build up a broad conceptual network and then has to refine it from level to level. (female group, post-test, clip 2)

Even though both groups exhibited similar patterns, they also demonstrated differences in how they discussed the video clips and how their discussion changed from pre- to post-test. Therefore, results for each group are presented individually.

Female group

Pre-test

In their initial analysis of student thinking, the female group addressed whether student thinking was correct or incorrect (codes STc and STi). In addition, some student ideas were described but not evaluated (STd):

What he did correctly is that… Even though correctly is a bit exaggerated maybe, but… That friction stops him, yes. STc (female group, pre-test, clip 1)

[…] She has | that still a force has to be acting on the ball (moves finger along transcript), yes, he has put the ball into movement, ok, that means, she does say, err, force, well, force is the cause for movement (looks to S2, looks into transcript), in that sense. STd (female group, pre-test, clip 2)
Only rarely did this group perform nuanced analyses similar to those of the male group (see below). Rather, their analysis remained, at both pre- and post-test, at an evaluative level, identifying what aspects the student understood and what he or she did not understand. Even though the PSPTs discussed various ideas about student thinking, these were often repetitions of initial ideas they had offered. In discussing clip 1, for instance, they frequently noted that the student confused velocity and acceleration and did not distinguish between velocity and force. The PSPTs addressed possible reasons for student thinking at different points in their analysis but primarily, in accordance with the instructions given by the instructor, at the end. For example, in their pre-test they focused in particular on the nature of physics concepts, labelling these as model-based (a label derived from the framework offered to them in prior lessons). This abstract nature of the concepts was assumed to be one reason for student learning difficulty:

The question is now what the reasons are for him not to understand the physics idea at all. […] That the concept of force, for example, is nothing you can really touch, so, it is in the direction of a model-based concept, like with velocity, acceleration. […] STj-TF (female group, pre-test, clip 1)

Post-test

At the post-test, the PSPTs’ noticing was similar. They evaluated aspects of student thinking but, overall, were less explicit about what was correct or incorrect about student ideas. Rather, their analyses remained vague or did not address the particular student in the video but, rather, students in general:

[…] One can often observe that students can indeed use the manner of speaking, but have not really understood what is behind it. ST (female group, post-test, clip 1)

As in the pre-test, the PSPTs addressed reasons for student thinking. Here, they made reference to the FM LP four times.
In particular, they hypothesized that the instruction the student had received did not follow the learning progression, creating a gap between the student’s current understanding and concepts addressed in class (at levels that were too high for him):

In other words, that he was more or less overwhelmed in class and wasn’t | put on the right learning path, that he completes one level after another and can engage himself, but that he is then more or less overwhelmed with a level that is too high and for this reason has gaps. I mean, this was not a continuous learning path for him then. ST-TF reason: Instruction does not follow LP (female group, post-test, clip 1)

Therefore, from the PSPTs’ point of view, with such a gap, students would not be able to understand the concepts and would learn only how to express them correctly (“manner of speaking”; see example above). The PSPTs also assumed that if such students were to reach higher levels of the LP, their understanding would be fragmentary. In addition to using the FM LP as a potential resource to reason about particular student thinking, the PSPTs tried once to apply the FM LP in order to classify student thinking. In doing so, they realized that the LP could not help if all concepts are “mixed up” in the student’s mind:

Maybe it is, I mean, if we think back again to this learning progression, that it is not really that he does have the levels. I don’t know if one can really determine such a level or whether he is in general mixing it up more. STi-TF [LP does not apply well] (female group, post-test, clip 1)
Male group

Pre-test

In their initial analyses, the male group already offered a nuanced noticing of student thinking, addressing different alternative interpretations of the same student idea (see Table 2, 13:47-15:50). During their discussion, the PSPTs questioned their own interpretations (Table 2, 15:35) and critically engaged with each other’s analyses (Table 2, 15:23). Furthermore, they referred to the theoretical framework of levels of conceptualization by using “intuitively” or “intuitive rule-based” three times during their analyses. The male PSPTs offered reasons for student thinking in their pre-test analyses primarily for clip 2 (with one exception for clip 1; see Table 2, 14:29). Referring to clip 2, they described what they assumed the student might have experienced or might be imagining:

I almost have the feeling that she is imagining that. In her mind she is putting herself in the ball that was thrown up and she thinks, I will freeze everything for a moment and now I am hanging here in the air and now a force is pulling me down and if I push play again, then I will still keep speeding up into the air, and for that a force is required. STd, reason: how the student is imagining the situation (male group, pre-test, clip 2)

Post-test

In their post-test, the male group did not provide the same depth of analysis as they had in the pre-test. They paid less consideration to alternative interpretations of student ideas, treating the transcripts as a more straightforward representation of the students’ thinking (codes STi, STc, and STm). Instead, they focused on categorizing the students’ ideas and the tasks posed in the interviews according to the FM LP (see example in Table 3).

Table 3
Excerpt from data analysis (male group, post-test, clip 1)

28:29 S2: Exactly, the concepts are missing. I mean velocity in itself is not even present in his explanation. (S1: Mhm. [agreeing]) || We basically also had that last time, with these different levels, that in part || that you practically have a concept of force, but | what was that again exactly? Somehow when one continues to move in one direction one has a force, but the concept of velocity itself, it has not even been brought in there.
    [Analysis: concept of velocity missing — STm]
28:54 S1: This here was this level two, from these energy levels.
28:59 S2: Yeah, that’s how it is.
29:01 S1: It was the two and the three, I think. Everything is definitely always pointing in the same direction. Because he doesn’t even have here, level 4, that was it, that the vector of velocity can point in a direction that is different than the vector of acceleration.
    [Analysis: direction of force, direction of velocity — STm-TF; classification with LP]
29:10 S2: Exactly.
29:11 S1: But that is not even exactly the case here. No brakes are being applied here. It’s just going in the same direction. That means we are on level three and even that is not present with him, and that means he is on level two.
    [Analysis: force in same direction as motion — Q-TF (classification with LP); STm-TF (classification with LP; missing content of question associated with a particular level)]
During the post-test, the PSPTs did not offer reasons for student thinking, but one male PSPT paid some attention to the behavior of the interviewer. Rather than criticizing the questions being asked, he argued that it is difficult to continue probing student understanding even when one is aware that the student is expressing incorrect ideas. In contrast to the pre-test, in which this PSPT discussed alternative interpretations of student ideas and questioned his own
analyses, in the post-test, he expressed some frustration with the lack of definitive action being taken on the part of the interviewer:

S1: Ok I have to say something completely honestly, what I, what I as the teacher, if I were the interviewer here, what I would be itching to deal with. I couldn’t take it that the students present wrong ideas for five minutes at a time without interrupting them and posing provoking counter questions. I don’t think I could do that. I would then say, imagine, I would immediately step in, and just imagine the following situation or I would just make something up with which I could pull them away from this level. I would –
S2: But that is not the interviewer’s role.
S1: I know.
S2: Otherwise we would not be able to watch it.
S1: Of course. This is right so. But I just mean, I think, if I were the interviewer, I would intervene immediately. I wouldn’t let them present their false ideas for five minutes. And when I give private tutoring lessons, my goodness. But I really cannot. (male group, post-test, clip 2)

Discussion

Reflecting on both the original proposals for teachers’ use of LPs and the preliminary results from this study, we hypothesize that a learning progression for teachers (LPT) might help to explain our results.3 In discussions of teachers’ use of LPs, rhetoric has tended to assume that, without a LP (or other framework for formative assessment), teachers will utilize a “gets it/doesn’t get it” perspective, viewing students’ thinking as either completely right or completely wrong and not recognizing the value in obtaining additional information about the nature of students’ ideas. In very rough terms, this might be considered level 1 of a LPT for engagement with student thinking. As indicated by the pre-intervention results, even before the FM LP was introduced, the PSPTs in this study were engaging with student thinking at a higher level than this LPT level 1.
Although both groups occasionally used language that hinted at a “gets it/doesn’t get it” perspective, overall, their analyses reflected recognition that students may have alternative ideas about the content being presented. Rather than providing an overall evaluation that the students were “wrong,” the PSPTs identified what the students were thinking, describing both correct and incorrect aspects. Thus, level 2 of our hypothetical LPT might include teachers’ awareness of alternative student ideas and the ability to recognize these ideas when expressed by students. As reflected in the results above, the male PSPTs went beyond even this level in their analyses of the videos prior to the LP intervention. Their analyses of student thinking were more nuanced and reflected careful attention to the students’ words. They tried out different interpretations of student thinking, recognizing that it was not possible to fully diagnose the students’ ideas on the basis of the information in the video. We propose that these PSPTs were operating at level 3 of a hypothetical LPT. At this level, we imagine that teachers may be aware that student ideas might differ according to context and that they may engage with the contradictions and quirks of reasoning that students often exhibit. In other words, while teachers at this level can provide straightforward diagnoses, they treat student thinking as more complex (and interesting) than is reflected in a “simple” categorization.

3 We note that a LPT reported in Furtak, Thompson, Braaten, and Windschitl (2012) describes increasingly sophisticated forms of the practice of attending to students’ ideas, including teachers’ treatment of student ideas, although somewhat differently than the LPT we hypothesize here.
So, where does the FM LP fit in? Similar to the PSPTs’ efforts to consider the level of instruction or the level of the task in relation to the FM LP level of the students in the video, we think the PSPTs’ interaction with the FM LP was largely a function of their location on our hypothetical LPT. We view arguments for teachers’ use of LPs to support formative assessment practices as consistent with progress from level 1 to level 2 of our hypothetical LPT (i.e., moving from a “gets it/doesn’t get it” perspective to an awareness of a range of student ideas).4 By providing a means of categorizing student ideas, the FM LP supports engagement with student thinking at level 2 of the LPT. This is consistent with Furtak’s (2012) finding that some teachers treated a LP for natural selection “as [a] catalog of misconceptions to be ‘squashed’” (p. 1181). That the PSPTs were already engaging with student ideas at a LPT level equivalent to (or higher than) the LPT level directly supported by the FM LP may help to explain why we did not observe a clear benefit for these PSPTs from working with the FM LP framework.

The female PSPTs seemed to engage with student thinking in similar ways (what we would categorize as level 2 on our LPT) both before and after the LP intervention. We hypothesize that the LP provided them with additional ways to categorize student thinking and to relate it to classroom instruction, but not with a fundamentally different way of viewing student thinking (such as the shift from “gets it/doesn’t get it” to a greater awareness of students’ alternative ideas). In contrast, the male PSPTs engaged with student thinking at level 3 before being introduced to the FM LP. After the intervention, their conversation could be characterized as level 2: the FM LP encouraged them to categorize student thinking, and this became the focus of their interaction with the videos.
For the male PSPTs in particular, the FM LP, without additional support, did not seem to enhance their abilities to identify or interpret student thinking. We conclude that LPs may be most beneficial for teachers at the lowest level of our hypothetical LPT. In providing information about how students typically think about a topic, LPs may support engagement with student thinking beyond a “gets it/doesn’t get it” perspective. However, we may need to consider further how we introduce LPs to teachers who have already reached level 2 or level 3 of our hypothetical LPT. Simply presenting a LP as a means to classify student thinking is not likely to move teachers at these levels any further in their abilities to notice.
4 LPs are also thought to support teachers’ instructional decisions by supporting choices about how to move students from one LP level to the next; however, in this paper, our focus is only on the identification and interpretation aspects of formative assessment.
References

Alonzo, A. C. (2011). Learning progressions that support formative assessment practices. Measurement: Interdisciplinary Research and Perspectives, 9, 124-129.
Alonzo, A. C. (2012). Eliciting student responses relative to a learning progression: Assessment challenges. In A. C. Alonzo & A. W. Gotwals (Eds.), Learning progressions in science: Current challenges and future directions (pp. 241–254). Rotterdam, The Netherlands: Sense Publishers.
Alonzo, A. C., de los Santos, X. E., & Kobrin, J. L. (2014, April). Teachers’ interpretations of score reports based upon ordered multiple-choice items linked to a learning progression. In J. Masters (Chair), Diagnostic assessment: Recent advances from psychometric modeling to classroom applications. Symposium conducted at the annual meeting of the American Educational Research Association, Philadelphia, PA.
Alonzo, A. C., & Elby, A. (2015a, April). One teacher’s use of a learning progression to generate knowledge about student understanding of physics. In J. Kobrin (Chair), Teachers’ use of learning progressions for formative assessment: Implications for professional development and further research. Symposium to be conducted at the annual meeting of the American Educational Research Association, Chicago, IL.
Alonzo, A. C., & Elby, A. (2015b, April). Physics teachers’ use of learning-progression-based assessment information to reason about student ideas and instructional responses. In A. C. Alonzo (Chair), Affordances and constraints of learning progressions for informing teachers’ reasoning. Related paper set to be conducted at the annual meeting of the National Association for Research in Science Teaching, Chicago, IL.
Alonzo, A. C., & Kim, J. (2012a, March). Exploring teachers’ pedagogical content knowledge elicited with video clips focused on student thinking. In A. C. Alonzo (Chair), Multiple approaches to video as a tool for exploring teachers’ pedagogical content knowledge. Related paper set presented at the annual meeting of the National Association for Research in Science Teaching, Indianapolis, IN.
Alonzo, A. C., & Kim, J. (2012b, April). Beginning teachers learning to notice and respond to student thinking. In K. Zeichner (Chair), Tools and routines for preparing STEM teachers. Structured poster set presented at the annual meeting of the American Educational Research Association, Vancouver, BC, Canada.
Alonzo, A. C., & Steedle, J. T. (2009). Developing and assessing a force and motion learning progression. Science Education, 93(3), 389-421.
Black, P., Wilson, M., & Yao, S.-Y. (2011). Road maps for learning: A guide to the navigation of learning progressions. Measurement: Interdisciplinary Research and Perspectives, 9, 71-123.
Bonniol, J. J. (1991). The mechanisms regulating the learning process of pupils: Contribution to a theory of formative assessment. In P. Weston (Ed.), Assessment of pupils’ achievement: Motivation and school success (pp. 119-137). Amsterdam, The Netherlands: Swets and Zeitlinger.
Borko, H., Jacobs, J., Eiteljorg, E., & Pittman, M. E. (2008). Video as a tool for fostering productive discussions in mathematics professional development. Teaching and Teacher Education, 24, 417–436.
Briggs, D., Alonzo, A., Schwab, C., & Wilson, M. (2006). Diagnostic assessment with ordered multiple-choice items. Educational Assessment, 11(1), 33-63.
Covitt, B. A., Syswerda, S. P., Caplan, B. Z., & Cano, A. (2014, March). Teachers’ uses of learning progression-based formative assessment in water instruction. Paper presented at the annual meeting of the National Association for Research in Science Teaching, Pittsburgh, PA.
Furtak, E. M. (2012). Linking a learning progression for natural selection to teachers’ enactment of formative assessment. Journal of Research in Science Teaching, 49, 1181-1210.
Furtak, E. M., & Heredia, S. C. (2014). Exploring the influence of learning progressions in two teacher communities. Journal of Research in Science Teaching, 51, 982-1020.
Furtak, E. M., Morrison, D., & Kroog, H. (2014). Investigating the link between learning progressions and classroom assessment. Science Education, 98, 640-673.
Furtak, E. M., Thompson, J., Braaten, M., & Windschitl, M. (2012). Learning progressions to support ambitious teaching practices. In A. C. Alonzo & A. W. Gotwals (Eds.), Learning progressions in science: Current challenges and future directions (pp. 405–433). Rotterdam, The Netherlands: Sense Publishers.
Gunckel, K. L., Covitt, B. A., & Salinas, I. (2014, March). Teachers’ uses of learning progression-based tools for reasoning in teaching about water in environmental systems. Paper presented at the annual meeting of the National Association for Research in Science Teaching, Pittsburgh, PA.
Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force concept inventory. The Physics Teacher, 30(3), 141-166.
Hofmann, J. (2015). Untersuchung des Kompetenzaufbaus von Physiklehrkräften während einer Fortbildungsmaßnahme [Investigation of teachers’ development of competency during an in-service training]. Dissertation, Faculty of Mathematics and Informatics, Physics, Geography, Justus Liebig University Giessen.
Mayring, P. (2000). Qualitative content analysis. Forum: Qualitative Social Research, 1(2), Art. 20. http://www.qualitative-research.net/index.php/fqs/article/view/1089 [April 8, 2015]
Minstrell, J., Anderson, R., & Li, M. (2011, May). Building on learner thinking: A framework for assessment in instruction. Commissioned paper for the Committee on Highly Successful STEM Schools or Programs for K-12 STEM Education, National Academy of Sciences.
National Research Council. (2007). Taking science to school: Learning and teaching science in grades K-8. Washington, DC: The National Academies Press.
Nussbaum, J. (1981). Towards the diagnosis by science teachers of pupils’ misconceptions: An exercise with student teachers. European Journal of Science Education, 3(2), 159–169.
Otero, V. K. (2006). Moving beyond the “get it or don’t” conception of formative assessment. Journal of Teacher Education, 57, 247–255.
Santagata, R., & Guarino, J. (2011). Using video to teach future teachers to learn from teaching. ZDM: The International Journal on Mathematics Education, 43, 133-145.
Star, J. R., & Strickland, S. K. (2008). Learning to observe: Using video to improve preservice mathematics teachers’ ability to notice. Journal of Mathematics Teacher Education, 11, 107–125.
van Driel, J. H., Verloop, N., & de Vos, W. (1998). Developing science teachers’ pedagogical content knowledge. Journal of Research in Science Teaching, 35, 673-695.
van Es, E. A., & Sherin, M. G. (2002). Learning to notice: Scaffolding new teachers’ interpretations of classroom interactions. Journal of Technology and Teacher Education, 10, 571-596.
van Es, E. A., & Sherin, M. G. (2008). Mathematics teachers’ “learning to notice” in the context of a video club. Teaching and Teacher Education, 24, 244-276.
von Aufschnaiter, C., & Rogge, C. (2010). Misconceptions or missing conceptions? Eurasia Journal of Mathematics, Science & Technology Education, 6(1), 3-18.