POP CULTURE/DIGITAL LITERACIES

Understanding What Students Know: Evaluating Their Online Research and Reading Comprehension Skills
JILL CASTEK & JULIE COIRO

The Internet is increasingly central to our daily lives, transforming the ways we access, use, and exchange information (Pew Research Center, 2014). Conducting online research requires skills in questioning, locating, evaluating, synthesizing, and communicating (Leu, Kinzer, Coiro, Castek, & Henry, 2013). These skills are complex and, in some cases, unique to online reading and writing contexts. However, many of these skills are only peripherally included in the Common Core State Standards and are not specifically addressed on any state reading assessment. As a result, they are not embedded into most literacy curricula nor systematically assessed by teachers.

This piece is framed by questions we are often asked when we talk about online reading assessments and instruction with teachers. Each of us offers our perspective, framed by experiences in different classrooms and our review of how to assess digital literacies in actionable ways (see Coiro, 2009; Coiro & Castek, 2010). Our ideas are also informed by our experiences in designing, administering, scoring, and interpreting the results of a range of different online reading assessments (Castek & Coiro, 2010; Coiro & Kennedy, 2011). By sharing these perspectives, we hope to inspire new ways of tracking students' progress that can help teachers design powerful literacy instruction that integrates digital texts and new technologies.

Authors: Jill Castek is a Research Assistant Professor with the Literacy, Language, and Technology Research Group at Portland State University, Portland, OR, USA; e-mail [email protected]. Julie Coiro is an Associate Professor of Education at the University of Rhode Island, Kingston, RI, USA; e-mail [email protected].

What lessons have you learned from your work in assessing online reading?

Jill: I've learned that students' online reading ability cannot be determined solely on the basis of their reading performance in non-digital contexts (Castek, Zawilinski, McVerry, O'Byrne, & Leu, 2011; Coiro, 2011). We've both seen students who are less proficient offline readers but who can skillfully read and communicate in online spaces. We've also noticed that digital texts often require readers to engage more actively in order to navigate hyperlinks and construct their own reading paths. Skilled online readers move productively across different texts, find and make sense of multimodal resources, and use crowd-sourcing techniques to ask questions of peers and other experts to make sense of ideas. We know that students' ability to accomplish these tasks depends on their range of online experiences both in the classroom and in settings outside of school. It's critical that teachers find ways to capture and understand the full range of students' abilities so they can guide learners toward more strategic use of practices that will better prepare them to make sense of information in many different online spaces.

Julie: I learn a lot by watching online readers, but it's often difficult to know what they are thinking as they click around and make decisions about what to read. As a result, we've often encouraged students to work together in assessment spaces where strategic thinking surfaces more naturally as students share ideas, take turns, and develop expertise together. Pairing students for online assessment activities encourages dialogue and collaboration. As students discuss ideas, their cognitive processes and ways of working with others become more visible. Often, partners model and exchange strategies with each other to enhance their mutual understanding of the material (Coiro, Castek, & Guzniczak, 2011). We've also found that assessment tasks can weave in communication tools used in out-of-school contexts (text messaging, social networking, digital video) and the workplace (e-mail, video-conferencing, support networks) to foster and support collaboration as a naturally social dimension of online reading comprehension. However, because these tools aren't typically used in classrooms, many students are unfamiliar with how to use them to support academic learning. Thus, we need to consider how to use these assessments as opportunities for students to simultaneously learn new strategies while showing their teachers what they currently know about productive online literacy practices.

Journal of Adolescent & Adult Literacy 58(7) April 2015 doi:10.1002/jaal.402 © 2015 International Literacy Association (pp. 546–549)

What is important to consider when designing assessments of online research and comprehension?

Julie: Assessment tasks can be designed as scenario-based performance tasks that capture students' actions and thinking processes during reading (using screen capture software such as Jing, Camtasia, or Screencast-O-Matic) as well as the products they create with communication tools like e-mail or blogs. In our early work, almost all of our online reading assessments were designed to capture online reading processes. Examining the quality of students' cognitive strategies and their social exchanges helps us understand more and less effective online reading practices, which can, in turn, inform classroom instruction. You might wish to try your hand at assessing the quality of seventh graders' online reading strategy use by exploring student videos at the companion website to the Online Research and Comprehension Assessment (ORCA) project (2009–2014), as shown in Figure 1 (see http://www.orca.uconn.edu/professional-development/understanding/introduction/). You can also explore the "In My Class" section of the website for several classroom lessons that map onto specific online reading skills for which your students might need more support.

FIGURE 1 Screenshot of the ORCA Project website (http://www.orca.uconn.edu/professional-development/understanding/introduction/) for professional development, featuring a menu of "Show Me" tutorials and "Let Me Try" video clips with which to practice evaluating student performance on each dimension of online research and reading comprehension


Jill: More recently, we've created online assessments that ask students to work with a partner and apply their knowledge of certain content while using online resources to solve broad interdisciplinary problems (see Coiro, Sekeres, Castek, & Guzniczak, 2014). Research scenarios engage students with authentic problems, such as helping Toys "R" Us stock its shelves with ecologically friendly toys or offering advice about whether land should be set aside for leatherback sea turtles. Using curriculum-based themes for informal classroom assessments promotes a level of engagement, opportunities for content-area learning, and investment not often observed during large-scale assessments. These online research tasks (e.g., http://coiroira2013.wikispaces.com/EcotoysExample) involve the full range of online reading comprehension and communication strategies as students generate questions, locate and evaluate information, and organize and share solutions. This approach to assessment allows us to evaluate the quality of students' online reading processes in relation to multiple types of product data (dialogue, online communication, projects) and provides a rich set of data from which to draw insights and design instructional supports.


What are some challenges encountered in assessing students' knowledge?

Jill: It's difficult to design assessment tasks that are simultaneously authentic and motivating. Class-based inquiry activities designed around externally generated problems aren't quite as real or engaging as online inquiries prompted by personally meaningful questions. Tasks that are too prescriptive or constraining may decrease motivation and authenticity. While we are strong advocates for self-directed online inquiry, we've found that designing tasks that engage students in a common but authentic experience makes it possible to gauge their skill level on a common metric. While all students will undoubtedly bring something unique to the task, a standard task that all students complete yields a known set of responses and predictable pathways that can be more easily evaluated and compared. However, these tasks need to fit within the time constraints of a school's daily schedule, which can be challenging.

Julie: Being able to create reliable items is another challenge. Online texts and reading contexts change rapidly, often from one day to the next. Designers update their websites, URLs change without warning, and readers post new comments using social networking tools. These constant changes make for more valid but less reliable assessments of online research and reading comprehension. Creating your own texts and network of online spaces within which students can search and interact with digital texts can address some of these issues (since you have control over when and how the information changes), but this is incredibly time-consuming and does not reflect the complexity of ever-changing texts students are likely to encounter in unbounded online inquiry experiences.

What drives the design of assessment tasks?

Jill: One choice is to use the open Internet as a context to observe students. There are distinct advantages here: the reading context is real-time and current, reading choices are unlimited, and distractions are authentic. However, constantly changing texts make it difficult to compare student performance over time. For young children, we often create a search engine that we populate with a few dozen websites (see Google Custom Search, https://www.google.com/cse/). Some of these websites are more relevant to the topic than others to reflect an authentic online reading experience, but this customized set of resources ensures that all of the resources are appropriate for children (see more in Sekeres, Coiro, Castek, & Guzniczak, 2014). The custom search tool involves collecting the websites you wish to use and copying and pasting the URLs into a form box (see this short TeacherTube tutorial: http://tinyurl.com/mv46nt7). When students use your customized search engine, the search results appear just as they do in a regular Google search. To see how it works, try out the search engine we designed for our Eco-Friendly Toys task (see http://goo.gl/ibGzkh) by searching for natural wood toys or green toys.

Julie: It is also possible to develop what we have called a closed assessment format, with replicas of websites linked together in a simulated closed space. At the overview page of our ORCA Project's website (see http://goo.gl/J3N5Pr), you can view videos of a student engaged in online research in an open format as well as a student engaged in a parallel assessment confined to websites in a closed space. You might also notice that the closed space contains a fully functioning but customized search engine known as "Gloogle" that limits search results to the websites we collected. However, as mentioned earlier, creating replicas of websites is fairly technical and extremely time-consuming, especially for tasks that require many websites. And using a closed system is not an adequate stand-in for assessing performance in an authentic online reading context; there are fewer navigational choices and less complexity, which may artificially elevate students' performance.

Jill: We also believe future assessments should embed digital supports such as text-to-speech reading aids, annotation and translation tools, and space for reflection to provide a richer picture of what all students know and can do. These features help to collect different kinds of output data that highlight the many skills and practices that readers control as well as those that are still developing. Identifying both strengths and weaknesses can, in turn, inform the design of meaningful instructional sequences such as those available at http://coiroira2013.wikispaces.com/ and http://tinyurl.com/k4kctmf.

As we think forward about ways to better understand students' online reading and research skills, we certainly recognize that the online landscape will continue to evolve. Faced with these changes, we hope you will join us in thinking carefully about how best to capture and understand what students know and can do online.

References

Castek, J., & Coiro, J. (2010, May). Measuring online reading comprehension in open networked spaces: Challenges, concerns, and choices. In S. Sullivan (Chair), Finding common ground: Documenting and analyzing student learning with hypertext, multimedia, and hypermedia. American Education Research Association (AERA), Denver, CO. Retrieved from http://tinyurl.com/oktb5p8

Castek, J., Zawilinski, L., McVerry, J.G., O'Byrne, W.I., & Leu, D.J. (2011). The new literacies of online reading comprehension: New opportunities and challenges for students with learning difficulties. In C. Wyatt-Smith, J. Elkins, & S. Gunn (Eds.), Multiple perspectives on difficulties in learning literacy and numeracy (pp. 91–110). New York, NY: Springer.

Coiro, J. (2009). Rethinking online reading assessment: How is reading comprehension different and where do we turn now? Educational Leadership, 66, 59–63.

Coiro, J. (2011). Predicting reading comprehension on the Internet: Contributions of offline reading skills, online reading skills, and prior knowledge. Journal of Literacy Research, 43(4), 352–392.

Coiro, J., & Kennedy, C. (2011). Preparing students for common core standards and 21st century literacies: The Online Reading Comprehension Assessment (ORCA) Project. Unpublished manuscript, University of Rhode Island, Kingston, RI. Retrieved from http://www.orca.uconn.edu/research/research-reports/

Coiro, J., & Castek, J. (2010). Assessment frameworks for teaching and learning English language arts in a digital age. In D. Lapp & D. Fisher (Eds.), Handbook of research on teaching the English language arts (3rd ed., pp. 314–321). Co-sponsored by the International Reading Association and the National Council of Teachers of English. New York, NY: Routledge.

Coiro, J., Castek, J., & Guzniczak, L. (2011). Uncovering online reading comprehension processes: Two adolescents reading independently and collaboratively on the Internet. In P. Dunston, L. Gambrell, K. Headley, S. Fullerton, P. Stecker, V. Gillis, & C. Bates (Eds.), 60th annual yearbook of the Literacy Research Association (pp. 354–369). Oak Creek, WI: Literacy Research Association.

Coiro, J., Sekeres, D.C., Castek, J., & Guzniczak, L. (2014). Comparing third, fourth, and fifth graders' collaborative interactions while engaged in online inquiry. Journal of Education, 194(2), 1–16.

Leu, D.J., Kinzer, C.K., Coiro, J., Castek, J., & Henry, L.A. (2013). New literacies: A dual level theory of the changing nature of literacy, instruction, and assessment. In D. Alvermann, N.J. Unrau, & R.B. Ruddell (Eds.), Theoretical models and processes of reading (6th ed., pp. 1150–1182). Newark, DE: International Reading Association.

Pew Research Center. (2014, April). Older adults and technology use. Retrieved from http://www.pewinternet.org/2014/04/03/older-adults-and-technology-use/

Sekeres, D.C., Coiro, J., Castek, J., & Guzniczak, L.A. (2014). Wondering + online inquiry = learning. Phi Delta Kappan, 96(3), 44–48.


