Working with Middle School Science Teachers to Design and Implement an Interactive Data Dashboard

Abstract

Improving science instruction is a top priority at a time when experts warn that not enough students in the U.S. are pursuing the STEM careers needed to support a competitive economy. Using student data to improve educational processes, including teaching and learning, is a key step toward improving science achievement. Yet research has shown that data use is difficult and that even the most technologically advanced data systems often go unused. This paper documents interim results from the first year of data collection in a larger, three-year project to understand how science teachers use data. Following a mixed methods approach (Creswell, 2009), we collected data in three ways: a teacher survey, teacher observations during planning periods or professional learning community meetings, and interviews with teachers and instructional coaches. Here, we report only on data from the interviews and observations. The findings have implications for school and district leaders as well as for those developing educational technology for K-12 classrooms.

Objectives

Improving science instruction is a top priority at a time when experts warn that not enough students in the U.S. are pursuing the STEM careers needed to support a competitive economy (National Center for Education Statistics, 2014). Several ways to improve science instruction have been proposed, but in this research we focused on work being done to provide teachers with better and more timely information on their students' learning so that teachers can make changes to their instruction (e.g., Fuhrman & Elmore, 2004; Lachat & Smith, 2005; Supovitz, 2009; Wayman, 2005). Specifically, this paper documents interim results from the first year of data collection in a larger, three-year project to understand how science teachers use data and to apply that understanding to the enhancement of the data management and reporting tools embedded in an online PreK-12 science curriculum ("the study curriculum").

Theoretical Framework

Using student data to improve educational processes, including teaching and learning, is a key step toward improving science achievement. Research shows, however, that using data effectively to inform and improve classroom practice can be challenging for teachers (Brunner et al., 2005; Mandinach, Honey, & Light, 2006; Breiter & Light, 2006). User-friendly data systems that provide rapid access to student data are thought to hold promise for teachers because system use can be embedded in nearly every aspect of practice (Datnow, Park, & Wohlstetter, 2007; Lachat & Smith, 2005; Wayman & Stringfield, 2006). Yet research has shown that even the most technologically advanced data systems often go unused, primarily because the systems do not align with teachers' work (Cho & Wayman, 2014; Means, Padilla, & Gallagher, 2010; Wayman, Cho, & Shaw, 2009). Cho and Wayman (2014) argued that this is because designers and administrators create and implement systems with the assumption that those systems will be used as intended, and often that is not the case. This finding has important implications for the development of educational data systems, such as the one embedded in the study curriculum.


The purpose of this project was to address the inherent limitations of technology-driven change by working closely with science teachers to understand their notions of data use and how they use and want to use the dashboard, and to design, test, and then refine digital tools for the study curriculum's dashboard.

Methods

The approach we adopted to design and refine the tools was design-based implementation research (DBIR; Penuel, Fishman, Cheng, & Sabelli, 2011). DBIR focuses on pragmatic issues, emphasizing collaboration between researchers and practitioners, who in essence become co-researchers (Barab & Squire, 2004; Norros & Savioja, 2004; van den Akker, Gravemeijer, McKenney, & Nieveen, 2006). In this way, educational design-based research attends not only to the design of innovations, but also to the implementation of those innovations. The DBIR approach also addresses a common weakness shared by much of the development work in education: researchers do not bring teachers and other end users into the design and development process (Fishman, Marx, Blumenfeld, Krajcik, & Soloway, 2004).

In this paper, we report on data collected during the fall of 2014, which comprised exploratory work. During the summer and early fall of 2014, we recruited six school districts in southeast Texas to partner with us. Three of these districts were charter organizations (17 schools), and three were traditional public school districts (19 schools). From these districts, we recruited 73 teachers, 71 of whom completed all of the research activities; of these 71, 45 were from public schools and 26 were from charter schools. Over half of the participating teachers had five years of experience or less, and over three-quarters had fewer than 10 years of experience (see Table 1). A majority of participating teachers (58%) were not new to science teaching, and 66.2% of the teachers taught only science (25.4% also taught other subjects, including English language arts, math, or social studies). Participating teachers were relatively evenly distributed across the four grade levels (Table 2).

Data Sources

Following a mixed methods approach (Creswell, 2009), we collected data in three ways: a teacher survey, teacher observations during planning periods or professional learning community (PLC) meetings, and interviews with teachers and instructional coaches. Here, we report only on data from the interviews and observations. Most of the charter school teachers planned alone and did not have departmental meetings or PLCs, so we were unable to observe them plan; instead, they walked us through how they planned their lessons and assessments as part of the interview. In total, we combined the interview and observation for 32 of the 73 teachers.

We conducted two kinds of interviews with participants: one-on-one interviews and focus group interviews. In total, we conducted 12 focus groups; the remaining teachers we interviewed one-on-one. We interviewed eight of the instructional coaches one-on-one, and one coach participated in a teacher focus group. The interviews were semi-structured, which allowed the researchers some flexibility to address additional issues that arose during an interview (Merriam, 2009; Miles & Huberman, 1994). All interviews were audio-recorded and transcribed.

For the observations, we observed teachers' data use as it was embedded in their work.
We listened and observed as teachers worked together to create lessons, analyze student assessment data, and create assessments, watching to see what resources they were using (both physical resources, such as laptops or notebooks, and content resources, such as the study curriculum) and how they were using them. Finally, we also observed whether the teachers were using any kind of student data as they planned. We took extensive field notes during the observation sessions (Emerson, Fretz, & Shaw, 1995). The observations help to triangulate data collected via the interviews and survey (Yin, 2009) and should not be considered a complete picture of teacher data use.

For the purposes of this paper, we focused our analysis on the qualitative data. We analyzed our data descriptively and analytically, both across and within district cases. We coded the interview data and our observation notes using a list of a priori codes, but also added codes as themes emerged, thus incorporating a grounded approach (Strauss & Corbin, 1990). One set of codes was created to analyze the teacher interviews, and a second set was created to analyze the instructional coach interviews. For the observation data, we also created two codes based on the resources we observed teachers using: physical planning resources and curriculum resources. All interviews and observations were coded at least twice by different members of the research team to ensure reliability. Once we had coded the data, we compared results both within districts (i.e., across schools) and across districts to see what was common across the districts and what was unique.
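The paper does not report how consistency between coders was quantified beyond noting that material was double-coded; purely as an illustrative sketch, and not the authors' procedure, the Python snippet below shows how percent agreement and Cohen's kappa could be computed for a set of double-coded excerpts. The code labels and example data are hypothetical.

from collections import Counter

def percent_agreement(coder_a, coder_b):
    # Share of excerpts to which both coders assigned the same code.
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    # Chance-corrected agreement between two coders (Cohen's kappa).
    n = len(coder_a)
    p_o = percent_agreement(coder_a, coder_b)  # observed agreement
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Expected agreement if each coder assigned codes independently,
    # following their own marginal code frequencies.
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(coder_a) | set(coder_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes assigned by two coders to the same six interview excerpts.
coder_1 = ["planning", "grade_book", "planning", "assessment", "technology", "planning"]
coder_2 = ["planning", "grade_book", "assessment", "assessment", "technology", "planning"]

print(round(percent_agreement(coder_1, coder_2), 2))  # 0.83
print(round(cohens_kappa(coder_1, coder_2), 2))       # 0.77

In practice, qualitative teams often reconcile coding disagreements through discussion; a statistic like the one above is only one way to summarize how consistently a shared code list is being applied.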
Results

We focus our discussion of selected findings on our principal research question: What data and data dashboard functions do teachers use and want, and why?

Teachers' use of the study curriculum tended to be on the high side, which is not surprising given that we recruited five of the six districts because of their strong history of use of the curriculum. In fact, over 44% of the teachers reported accessing the curriculum more than once a day. Teachers told us they used the curriculum frequently to plan and teach as well, with approximately 60% of teachers using it at least once a week and over a quarter using it to plan and teach every day. We also asked them about their use of the data management and reporting tools as well as the lesson planning tool. Approximately one fifth to one third of teachers knew about these tools but reported not using them to plan. Findings from our interviews shed light on possible reasons for the low use.

Teachers offered several probable reasons for their decision not to adopt the lesson planner, including a lack of knowledge of or training on the planner. First, almost all of the teachers are required to turn in their lesson plans in a district- or campus-approved format, and they must do so through a district-sanctioned online management system. Second, all teachers use several resources to plan and teach their science classes, and the study curriculum, no doubt for proprietary reasons, does not allow teachers to add those resources to its planner.

A second tool that the vast majority of teachers did not use was the grade book. First, in most cases, teachers had district or, in one case, campus grade books they were required to use. Second, teachers used various resources for their assessments, which also could not be included in the study curriculum's grade book. Third, because technology was not one-to-one in classrooms, only a few teachers administered the curriculum's assessments online. In other words, even when teachers gave one of the automatically graded assessments, doing so on paper meant they had to go back and enter the grades manually themselves.
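The study curriculum's actual data model is not described in this paper; purely as a hypothetical sketch of the flexibility teachers described wanting, the snippet below shows a grade book record that distinguishes built-in online assessments from paper-administered and external ones, with a score field left empty until a teacher enters it. All class and field names are assumptions, not the product's design.

from dataclasses import dataclass
from enum import Enum
from typing import Optional

class AssessmentSource(Enum):
    CURRICULUM_ONLINE = "curriculum_online"  # built-in, automatically graded online
    CURRICULUM_PAPER = "curriculum_paper"    # built-in item administered on paper
    EXTERNAL = "external"                    # teacher- or district-created resource

@dataclass
class GradeBookEntry:
    student_id: str
    assessment_name: str
    source: AssessmentSource
    max_score: float
    score: Optional[float] = None  # stays None until a paper score is entered by hand

# A curriculum assessment given on paper still requires manual score entry,
# which is the extra step teachers described in the interviews.
entry = GradeBookEntry("S-014", "Unit 3 check: forces and motion",
                       AssessmentSource.CURRICULUM_PAPER, max_score=100.0)
entry.score = 87.0
print(entry)

A record along these lines would let a dashboard flag paper-administered or external assessments that are still awaiting scores rather than silently omitting them from reports.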

Another notable result was the general lack of technology in classrooms. Across both the public and charter schools, we observed very few computers in classrooms. Most schools did have computer labs, but these were usually shared across the school, and some teachers told us that math and English language arts (ELA) classes had priority. Even in the district that allowed students to bring and use their own devices (BYOD), access to devices varied, and in most cases BYOD happened only once or twice a week.

Most teachers told us that they wanted to use more technology in their teaching. When we asked them to describe their ideal digital tool, they told us they wanted more videos, and they described a few specific kinds: short videos, videos to accompany or follow up on labs, and videos of labs being demonstrated for when it is difficult to conduct a lab in the classroom. They also told us they wanted a digital tool with many interactive games and simulations.

Significance

The educational technology sector is currently experiencing rapid growth as more and more products appear and companies spring up to meet the demand for technology in schools. Just as the number of technology products has increased, so has the presence of learning analytics and of tools that aim to help educators make sense of the mass of data. Research has not kept pace with development, however, so our understanding of how teachers want to use and actually use these analytics-based tools is limited, as is our knowledge of their impact on teaching and learning (Bienkowski, Feng, & Means, 2012). Yet schools and districts are spending millions of dollars on products for which there is little empirical support, and teachers and administrators are being asked to use these products, not all of which are designed with end users in mind (Bienkowski et al., 2012).

The findings from this exploratory research have implications both for district and school leaders and for educational technology developers. A recent study of how teachers view data and data tools found that teachers still are not satisfied with the data that digital tools provide them (Gates, 2015), which means that educational technology and curriculum companies need to do more in-depth research on their users and their needs before developing their products, just as this study was designed to do. Similarly, district and school leaders in charge of procurement and adoption decisions should learn more about how the products they are investing millions of dollars in were developed, and how the developers continue to learn from their users to improve those products. Procurement decisions for educational technology should also rely heavily on input from the main users, whether that includes teachers, students, or both.

Words: 1,948

Table 1: Participants' teaching experience

Experience            Number of teachers    Percent
New to teaching       8                     11.3
2-5 years             31                    43.7
6-9 years             15                    21.2
10-15 years           9                     12.7
20-30 years           2                     2.8
More than 30 years    2                     2.8
Total                 71                    100

Table 2: Grades taught

Grades taught    Number of teachers    Percent
5th              20                    28.2
6th              12                    16.9
7th              20                    28.2
8th              22                    31.0
Other*           3                     4.2
Total            71                    100

*Of the 'other' category, one teacher teaches 4th grade, and two teach high school in addition to middle school.

References

Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. The Journal of the Learning Sciences, 13(1), 1-14.

Bienkowski, M., Feng, M., & Means, B. (2012). Enhancing teaching and learning through educational data mining and learning analytics: An issue brief. US Department of Education, Office of Educational Technology, 1-57.

Breiter, A., & Light, D. (2006). Data for school improvement: Factors for designing effective information systems to support decision-making in schools. Journal of Educational Technology & Society, 9(3), 206-217.

Brunner, C., Fasca, C., Heinze, J., Honey, M., Light, D., Mandinach, E., & Wexler, D. (2005). Linking data and learning: The Grow Network study. Journal of Education for Students Placed at Risk, 10(3), 241-267.

Cho, V., & Wayman, J. C. (2014). Districts' efforts for data use and computer data systems: The role of sensemaking in system use and implementation. Teachers College Record, 116(2).

Creswell, J. (2009). Research design: Qualitative, quantitative, and mixed methods approaches. Thousand Oaks, CA: Sage Publications.

Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high-performing school systems use data to improve instruction for elementary students. Los Angeles, CA: Center on Educational Governance, USC.

Emerson, R. M., Fretz, R. I., & Shaw, L. L. (2001). Participant observation and fieldnotes. Handbook of ethnography, 352-368.

Fishman, B., Marx, R. W., Blumenfeld, P., Krajcik, J., & Soloway, E. (2004). Creating a framework for research on systemic technology innovations. The Journal of the Learning Sciences, 13(1), 43-76.

Fuhrman, S., & Elmore, R. F. (Eds.). (2004). Redesigning accountability systems for education. New York: Teachers College Press.

Gates Foundation. (2015). Teachers know best: Making data work for teachers and students. Seattle, WA: Gates Foundation.

Lachat, M. A., & Smith, S. (2005). Practices that support data use in urban high schools. Journal of Education for Students Placed at Risk, 10(3), 333-349.

Mandinach, E. B., Honey, M., Light, D., & Brunner, C. (2008). A conceptual framework for data-driven decision making. Data-driven school improvement: Linking data and learning, 13-31.

Means, B., Padilla, C., & Gallagher, L. (2010). Use of education data at the local level: From accountability to instructional improvement. US Department of Education.

Merriam, S. B. (2009). Qualitative research: A guide to design and implementation (Revised and expanded from Qualitative research and case study applications in education).

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. Thousand Oaks, CA: Sage.

Minner, D. D., Levy, A. J., & Century, J. (2010). Inquiry-based science instruction – What is it and does it matter? Results from a research synthesis years 1984 to 2002. Journal of Research in Science Teaching, 47, 474-496.

National Center for Education Statistics. (2014). Baccalaureate degree recipients' early labor market and education outcomes: 1994, 2001, and 2009. Washington, DC: Department of Education.

Norros, L., & Savioja, P. (2004). Usability evaluation of complex systems: A literature review. Helsinki, Finland: Radiation and Nuclear Safety Authority.

Penuel, W. R., Fishman, B. J., Cheng, B. H., & Sabelli, N. (2011). Organizing research and development at the intersection of learning, implementation, and design. Educational Researcher, 40(7), 331-337.

Strauss, A., & Corbin, J. M. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Thousand Oaks, CA: Sage Publications.

Supovitz, J. (2009). Can high stakes testing leverage educational improvement? Prospects from the last decade of testing and accountability reform. Journal of Educational Change, 10(2-3), 211-227.

Van den Akker, J., Gravemeijer, K., McKenney, S., & Nieveen, N. (Eds.). (2006). Educational design research. Routledge.

Wayman, J. C. (2005). Involving teachers in data-driven decision making: Using computer data systems to support teacher inquiry and reflection. Journal of Education for Students Placed at Risk, 10(3), 295-308.

Wayman, J. C., & Stringfield, S. (2006). Technology-supported involvement of entire faculties in examination of student data for instructional improvement. American Journal of Education, 112(4), 549-571.

Wayman, J. C., Cho, V., & Johnston, M. T. (2007). The data-informed district: A district-wide evaluation of data use in the Natrona County School District. Austin, TX: The University of Texas at Austin.

Wayman, J. C., Cho, V., & Shaw, S. M. (2009). First-year results from an efficacy study of the Acuity data system. Austin, TX: The University of Texas.

Yin, R. K. (2009). Case study research: Design and methods. Applied Social Research Methods Series (4th ed., Vol. 5). Thousand Oaks, CA: SAGE.