Exceptionality, 17:88–102, 2009
Copyright © Taylor & Francis Group, LLC
ISSN: 0936-2835 print / 1532-7035 online
DOI: 10.1080/09362830902805848
Understanding How Adolescents with Reading Difficulties Utilize Technology-Based Tools

Matthew T. Marino
Washington State University

Correspondence should be addressed to Matthew T. Marino, Washington State University, Cleveland Hall, Room 329, P.O. Box 642132, Pullman, WA 99164-2132. E-mail: [email protected]
This article reports the findings from a study that examined how adolescent students with reading difficulties utilized cognitive tools that were embedded in a technology-based middle school science curriculum. The curriculum contained salient features of the Universal Design for Learning (UDL) theoretical framework. Sixteen general education teachers implemented the curriculum in 62 inclusive classrooms. Students' (N = 1153) tool use was monitored throughout the four-week curriculum to determine if there was a relationship between students' reading ability, use of cognitive tools, and their comprehension of scientific concepts and processes. Students were grouped into three reading ability levels based on their performance on a standardized reading test. Findings from this study indicate that students who scored below the 50th percentile on the reading assessment utilized and benefited from the tools in highly similar ways. Implications and recommendations for instructional design research and practice are discussed.
Several emergent trends in education policy and practice are converging in a manner that offers great promise for individuals with disabilities. First is the increased presence of students with disabilities in general education classrooms during content area instruction (De La Paz & MacArthur, 2003). Second is the increased availability of technology in inclusive classrooms. Third is the utilization of Universal Design for Learning (UDL) as a theoretical framework for K–12 curriculum design and implementation. Together these trends create the potential for a symbiotic relationship among inclusive practices, technology, and instruction that improves access to learning opportunities within the general education curriculum (Rose, Meyer, & Hitchcock, 2005). One group for whom this convergence holds particular merit is adolescents with reading disabilities. As students with reading disabilities enter fourth grade, they are increasingly expected to utilize complex expository texts as a primary mode of knowledge acquisition (Dyck
& Pemberton, 2002). These texts present concepts and vocabulary at a pace and readability level that students with reading disabilities struggle to achieve (Maccini, Gagnon, & Hughes, 2002). As a result, there is an increasing achievement gap between individuals with disabilities and their normally achieving peers at the secondary level that is most pronounced in the areas of science, social studies, applied problems, and passage comprehension (Wagner, Newman, Cameto, & Levine, 2006).

Instructional technology is gaining increased popularity as a method to scaffold students' learning processes and reduce the achievement gap in inclusive classrooms (U.S. Department of Education, 2004). Technology can enhance students' abilities to read, review, and access supplemental information through the use of cognitive tools. For example, if a student reads an electronic paragraph about Mars and needs an explanation of the term Kelvin, the student could click on the term, hear its definition, and see a visual representation of how the Kelvin scale compares to the Celsius scale. This type of student self-management during assignments has been shown to improve comprehension and motivation (Smith, Dittmer, & Skinner, 2002). Technology can also provide students who have knowledge deficits with tools that allow them to acquire background information quickly and efficiently (Liu, Bera, Corliss, Svinicki, & Beth, 2004; McKenna, Reinking, Labbo, & Kieffer, 1999). These tools reduce the cognitive energy students must expend gathering information and increase their ability to focus on the goals of the learning task (Liu & Bera, 2005).

Effective instructional technology includes tools that support students' cognitive processes by presenting relevant information using multiple modalities, including pictures, sound, and language (Mayer, 2003). This allows students to focus their cognitive abilities on higher-order processing tasks, such as critical analysis, rather than memorizing isolated facts (Quintana et al., 2004). Technology can streamline the learning process with tools that automate nonessential tasks. For example, students can take notes using voice recognition software or access information resources such as databases, graphic organizers, overviews of text, and dictionaries with the click of a mouse. Kirschner and Erkens (2006) noted that these types of mindtools assist learners with integrating and interrelating content so that learning outcomes are meaningful.

There has been an increased focus on the relationship between tool use and performance over the past decade. Englert, Wu, and Zhao (2005) found that while Web-based writing scaffolding tools helped many students with learning disabilities produce well-formed essays, the effects of the supports differed across individual students. This finding is supported by other researchers who have concluded that the effectiveness of technology-based tools may depend on the individual students, the classroom context (e.g., grade level or content area), the quality and type of tools, and the extent to which students use the tools (Edelson, Gordin, & Pea, 1999; Liu et al., 2004; MacArthur, Ferretti, Okolo, & Cavalier, 2001; Quintana et al., 2004). Preiss and Sternberg (2006) postulated that as the link between technology-based tools and daily activities becomes more fixed, individuals' metacognitive processes would be altered in a way that challenges how we conceptualize and measure intelligence.
Simply stated, the reliance on (and interaction with) the tools will fundamentally alter how individuals think and act. This is logical if the tools are used appropriately. Consider how a spell checker might change the written expression of a student with a learning disability. However, if the student does not know how or when to use the tool, it will not make a difference. Land (2000) reported that students who did not recognize the importance of using technology-based tools
to solve complex problems frequently disregarded the tools and found them to be a hindrance to their progress. Given this finding, a reasonable approach for improving our understanding of the types of tools that support students who struggle in the general education classroom is to focus on effective design and instruction principles.

UDL is a theoretical framework that holds promise for accomplishing this objective (Hitchcock, Meyer, Rose, & Jackson, 2002). UDL is grounded in the belief that barriers to student learning should be identified and eliminated at the outset of instructional design in order to support students' recognition, strategic, and affective learning networks (Rose et al., 2005). To achieve this, technology is incorporated throughout the instructional design as a means to (a) present multiple formats of the same data, (b) provide students with flexible means of expression, and (c) encourage engagement with the learning materials (Jackson, Harper, & Jackson, 2005). From a theoretical perspective, a universally designed curriculum has intellectual merit. From a practical standpoint, this type of curriculum design takes substantive time, energy, and commitment from the curriculum developer and instructor. Identifying which technology-based tools are most efficacious for student learning would enable the efficient development of UDL curricula. Unfortunately, there is a lack of conclusive empirical evidence that addresses this issue (Alper & Raharinirina, 2006; MacArthur et al., 2001).

The current investigation was designed to address this need for research within the context of an inclusive technology-based middle school science curriculum. Science is considered to be one of the most difficult subjects for individuals with disabilities because of its complex vocabulary and theoretical nature (Mastropieri, Scruggs, & Graetz, 2003). It was hypothesized that the rigorous content would encourage students to utilize the technology-based tools and create a means by which the researcher could understand the relationship between tool use and learning. For this study, student groups were established based on reading performance rather than disability categorization. This decision was based on precedent established in other studies (e.g., Fletcher et al., 1994; Vellutino, Scanlon, & Lyon, 2000) with the purpose of alleviating the definitional issues surrounding both the learning disabilities construct and assistive technology (AT). Specifically, this study was designed to answer the following question: Is there a relationship between students' reading ability, use of cognitive tools, and their comprehension of scientific concepts and processes? If so, what is the nature of the relationship?
TABLE 1
Student Group Membership by School

School         Participants'   Severe Reading         Poor Readers   Proficient Readers   Total
               Grade Level     Difficulties (DRP 1)   (DRP 2)        (DRP 3)              Students
A              8               43                     48             166                  257
B              8               32                     41             100                  173
C              7               23                     53             324                  400
D              6               28                     63             232                  323
Group Totals                   126                    205            822                  1153
METHOD

Setting and Participants

Students from four middle schools in the northeast participated in the four-week technology-based science curriculum. Participants (N = 1153) ranged from sixth to eighth grade. The curriculum was implemented by 16 general education science teachers in the grade level at which astronomy was included in the school districts' curriculum plan. Two schools implemented the curriculum at the eighth grade level, one school at the seventh grade level, and one school at the sixth grade level. School districts were selected based on unanimous administrative support for the project, the adequacy of technical resources and infrastructure to support the project, and consent to participate from all science teachers in the grade level at which the curriculum would be implemented. Each student in the study used an individual computer to access the curriculum throughout the intervention. Eleven teachers implemented the curriculum in a computer laboratory; five teachers utilized laptop computers in their science classrooms.

Three student groups were established based on students' scores on the Degrees of Reading Power (DRP) test from the previous year: (1) students with severe reading difficulties (n = 126), defined as scoring at or below the 25th percentile; (2) poor readers (n = 205), defined as scoring between the 26th and 50th percentiles; and (3) proficient readers (n = 822), defined as scoring above the 50th percentile. Information regarding student group membership, including cell sizes and grade level distribution, is presented in Table 1. Fifty percent of the participants were female. Ninety-one percent of the participants identified themselves as White, 1% African American, 5% Asian, and 3% Hispanic.

FIGURE 1 Screen print from the astro-engineering room showing the authentic NASA instruments students can choose for their probes. Note that students also choose a chassis type, power source, and communication device. The tool bar at the bottom of the page (e.g., field journal, spectrogram) is accessible in any of the virtual rooms.
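Purely as an illustration of the grouping rule described above (not the study's actual data handling), a minimal pandas sketch with hypothetical column names and values shows how prior-year DRP percentile scores map onto the three reading ability groups:

```python
# Sketch of the percentile-based grouping described above.
# Column names and values are hypothetical; pandas is assumed.
import pandas as pd

students = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "drp_percentile": [18, 42, 88, 55],
})

# DRP 1: at or below the 25th percentile; DRP 2: 26th-50th; DRP 3: above the 50th
students["drp_group"] = pd.cut(
    students["drp_percentile"],
    bins=[0, 25, 50, 100],
    labels=["severe_difficulties", "poor_reader", "proficient"],
    include_lowest=True,
)
print(students)
```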
Instrumentation

Alien Rescue (Center for Innovative Learning and Assessment Technologies, 2005) is a technology-based astronomy curriculum that includes critical components of the UDL framework. The curriculum utilizes problem-based learning as a means to anchor instruction, which provides students with an authentic context where they can use cognitive tools (Brown, Collins, & Duguid, 1989; Cognition and Technology Group at Vanderbilt, 1990). Students begin the curriculum by watching a video in which they learn that six alien species, each with individual characteristics, are orbiting Earth in search of a new home in this solar system. The students are charged with learning about the aliens, comparing and contrasting the aliens' needs to the planets and moons in the solar system, and ultimately recommending homes for the aliens. The curriculum includes five stages of scientific inquiry: exploration, preliminary research, advanced research, hypothesis testing, and justifying solutions.
FIGURE 2 Screen print of the expert modeling tool. In this frame the expert is reading the highlighted text and demonstrating how to take notes in the electronic field journal. The colored bars under the text “The Eolani need to breath in” represent a spectrometer representation of oxygen. Students can access information about the color signature using the spectrogram tool at the bottom of the screen.
Alien Rescue provides students with technology-based tools that scaffold the learning process through a two-layered interface. The first layer is a virtual space station consisting of five rooms. Each room provides students with access to different instruments and information. For example, the astro-engineering room allows students to design probes to send to other planets. This room provides virtual access to authentic instruments used by NASA. Students build probes that conform to budget limitations, provide rationales for their probe designs, launch the probes, collect and interpret their data, and provide teachers and classmates with their analyses. A screen print of the astro-engineering room is presented in Figure 1. Other rooms in the space station include: (a) the expert room (Figure 2), where students can have common questions answered by watching videos of an expert navigating through the program, taking notes, and asking questions; (b) the xenobiology room, where information on the aliens is found; (c) the computer core, which houses tutorials on scientific concepts; and (d) the telemetry room, where information from probes is displayed.

The second layer of the interface provides students with tools they can use in any room in the space station. These tools are available as icons at the bottom of the screen. For example, if students encounter the name of a planet they are unfamiliar with, they can click on the solar database tool and gain background information from the provided text, illustrations, pictures, animations, videos, and graphic organizers. A screen print of the solar database tool is presented in Figure 3. These tools allow students to learn at their own pace while accommodating varying degrees of background knowledge.
FIGURE 3 Screen print of the solar database tool. This tool provided students with background information on the planets and moons in our solar system.
Measures

Pre/posttest. Students' knowledge of scientific concepts, processes, and vocabulary was ascertained using a 25-item paper-and-pencil multiple-choice pre/posttest. A reliability analysis of the pre/posttest measure resulted in alpha = .85 (Pedersen & Williams, 2004).

Solutions Forms. Six open-ended paper-and-pencil solutions forms were developed by the researcher to serve as secondary measures of achievement. The solutions forms were designed to assess students' ability to analyze, synthesize, and evaluate scientific information. The forms prompted students to identify key environmental requirements for each alien species (e.g., essential elements). Once all of an alien's requirements were identified, the students formulated a hypothesis regarding the most appropriate planet for the alien. They completed a side-by-side comparison with the alien characteristics on the left side of the form and the planet characteristics on the right. Students noted whether each feature of the planet was a beneficial or limiting factor. After completing both columns, the students were required to write a two- to five-sentence narrative justifying their proposed solution. The solutions forms were scored using a 36-point rubric that was piloted with 300 students and refined for this study based on suggestions from teachers and students. Two raters scored each solution form. Inter-rater reliability for these scores (R = .90) was established by comparing rater scores on 10 randomly selected solutions forms for each teacher.

Degrees of Reading Power (DRP). The DRP is a criterion-referenced measure of students' reading comprehension that utilizes the Bormuth readability formula (r = .92) to determine students' reading ability level. The formula is based on word familiarity and an analysis of the words and sentences students can read and understand during an assessment that requires students to complete cloze procedures on paragraphs of increasing difficulty. The test was normed in 1998 with approximately 48,000 students. Validity and reliability were established using a sample of 5,000 students. Reported parallel-form reliability ranges from .86 to .91. The K-R 20 measure of internal consistency is .95 (Touchstone Applied Science Associates, 2000).

Tool Use. Students were assigned unique user IDs and passwords. Each time a student logged onto the program, a record of his or her tool use was obtained. Reliability of the tool use data (R > .95) was established by monitoring students' tool use for two class periods and comparing observations from students, teachers, and the researcher to data collected using the data collection program.

Procedures

Students' DRP scores from the fall of the previous academic year were used to establish group membership. Students who did not have DRP scores were excluded from the group assignment and data collection process. Students completed the pretest and viewed a video of the ill-structured problem during the first day of the intervention. During the following three days, they explored the program's virtual rooms and tools in groups of two to five students. Dialogue regarding the functionality and efficacy of the tools was encouraged by the teachers
in both the opening and closing discussions and as the students were working next to each other at their computers. After the three-day exploration, students began conducting preliminary research about the aliens. Students completed the solutions forms during the final two weeks of the intervention. On the last day of the intervention, students completed the paper-and-pencil multiple-choice posttest.

Fidelity of Implementation. Teachers were trained to implement the curriculum by following a teacher's manual at a one-day group session. This was followed by a two-hour individual conference with the researcher one week later. Teachers demonstrated their ability to use the technology, reviewed the manual, practiced evaluating the solutions forms, and taught a selected lesson from the teacher's manual to their peers. At the conclusion of the intervention, the researcher met with each teacher for a one-hour consultation meeting at which teachers and the researcher reviewed the solutions form rubric scoring procedure.

Three measures were analyzed to determine the extent to which the curriculum was implemented as planned in the teacher's manual. Teachers completed a self-report measure at the conclusion of each class. The form prompted teachers to identify the components of the lesson they had implemented during the class. In addition to the lesson components (i.e., group meeting, computer time, closing discussion), teachers were asked to identify salient aspects of the class discussions, the percentage of students they interacted with during the computer time, the nature of the questions asked by students with reading difficulties, and the students who were absent. Teachers attached their notes and other ancillary materials to the data forms. The researcher reviewed these forms each week and collected the forms at the conclusion of the intervention. The researcher completed two unannounced formal observations of each teacher each week, for a total of six observations. These observations were documented using a form that was identical to the teacher classroom data form. The two forms were compared at the conclusion of each observation to determine trustworthiness of the observations and fidelity of implementation by the teachers. Teachers also completed a post-intervention survey that asked them to identify the extent to which they had implemented the curriculum as it was described in the teacher's manual. Teachers implemented the three main components of the daily lessons (i.e., opening discussion, individual work at computers, and closing discussion) 82% of the time.

Analysis

Students who were absent from school for more than 20% of the intervention were excluded from the analysis. Cognitive tools were grouped into four categories based on Lajoie's (1993) conceptual framework and previous studies with Alien Rescue (Liu, 2004; Liu & Bera, 2005). The categories used during this analysis were: (1) tools that share cognitive load (xenobiology, solar system database, mission database, computer core, periodic table, and spectrogram variables), (2) tools that support the cognitive process (field journal and teleporter variables), (3) tools that support out-of-reach activities (astro-engineering variable), and (4) tools that support hypothesis testing (telemetry variable). Correlations between students' use of the cognitive tools and their scores on the posttest and solutions forms were obtained.
Separate one-way ANOVAs were conducted with cognitive tool categories as the dependent variables and DRP group as the independent variable. Statistically significant results were probed and followed up using Scheffe’s post hoc analysis.
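A minimal sketch of these two analysis steps follows, assuming per-student tool-use counts in a pandas data frame. The category groupings mirror the list above, but the column names are hypothetical, and scipy's omnibus F test stands in for the full ANOVA and Scheffe follow-up reported here:

```python
# Sketch: aggregate tool-use counts into the four cognitive tool categories,
# then run a separate one-way ANOVA for each category across the DRP groups.
# Column names are hypothetical; pandas and scipy are assumed.
import pandas as pd
from scipy import stats

CATEGORIES = {
    "share_cognitive_load": ["xenobiology", "solar_db", "mission_db",
                             "computer_core", "periodic_table", "spectrogram"],
    "support_cognitive_process": ["field_journal", "teleporter"],
    "out_of_reach": ["astro_engineering"],
    "hypothesis_testing": ["telemetry"],
}

def run_anovas(log: pd.DataFrame) -> None:
    """log has one row per student: per-tool use counts plus a 'drp_group' column."""
    for category, tools in CATEGORIES.items():
        log[category] = log[tools].sum(axis=1)            # total uses per category
        groups = [g[category].to_numpy() for _, g in log.groupby("drp_group")]
        f, p = stats.f_oneway(*groups)                    # omnibus one-way ANOVA
        print(f"{category}: F = {f:.2f}, p = {p:.3f}")
```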
Data from the cognitive tool categories were skewed during the initial analysis. Therefore, square root transformations were applied to normalize the data. Two separate simultaneous multiple linear regression analyses were conducted with the posttest score and the solutions form total score as the dependent variables and the square roots of the cognitive tool category scores, DRP group, and the interactions between DRP group and each of the tool categories as the predictor variables. Interactions were created by multiplying DRP group, which was dummy coded 0/1, with each of the cognitive tool categories.
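A sketch of how such a regression could be specified, reusing the hypothetical variable names from the earlier sketches: the 0/1 dummy (drp_low) reflects the dichotomized DRP grouping described in the Results, and the statsmodels formula interface expands the DRP-by-category interaction terms automatically.

```python
# Sketch of the regression described above: square-root transform of the tool
# category totals, a 0/1 DRP dummy, and DRP-by-category interaction terms.
# Variable names are hypothetical; numpy and statsmodels are assumed.
import numpy as np
import statsmodels.formula.api as smf

TOOL_CATEGORIES = ["share_cognitive_load", "support_cognitive_process",
                   "out_of_reach", "hypothesis_testing"]

def fit_model(df, outcome="posttest"):
    # Square-root transformation to reduce skew in the tool-use totals
    for cat in TOOL_CATEGORIES:
        df[f"sqrt_{cat}"] = np.sqrt(df[cat])

    # drp_low = 1 if the student scored at or below the 50th percentile, else 0.
    # The (...) * drp_low term expands to main effects plus all interactions.
    formula = (f"{outcome} ~ ("
               + " + ".join(f"sqrt_{cat}" for cat in TOOL_CATEGORIES)
               + ") * drp_low")
    return smf.ols(formula, data=df).fit()

# Example: model = fit_model(students_df); print(model.summary())
```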
RESULTS

Correlations between students' use of the cognitive tools across the four tool categories and their scores on the posttest and solutions forms are presented in Table 2. Correlations between tool use and the posttest were stronger than the correlations between tool use and the solutions form scores. The strongest correlation was between tools that share cognitive load and the posttest (r = .224). Tools that share cognitive load include databases that present information using graphic organizers, text, and interactive tutorials.

The results of the four one-way ANOVAs examining differences in tool use across DRP groups were significant (p < .05) for each of the tool categories: (1) tools that share cognitive load, F(2, 954) = 12.60, p < .001, η² = .03; (2) tools that support the cognitive process, F(2, 954) = 4.86, p = .008, η² = .01; (3) tools that support out-of-reach activities, F(2, 954) = 3.07, p = .047, η² = .006; and (4) tools that support hypothesis testing, F(2, 954) = 5.567, p = .004, η² = .01. Scheffe's post hoc analysis indicated that there were no significant differences in the way students with severe reading difficulties and poor readers used the cognitive tools during the intervention. A summary of the post hoc analysis is presented in Table 3.

Based on the similarities between DRP groups 1 and 2 during the post hoc analysis, the DRP group variable was dichotomized as students scoring at or below the 50th percentile on the DRP (formerly students in DRP groups 1 and 2) and students scoring above the 50th percentile on the DRP (students in DRP group 3). A simultaneous linear multiple regression was conducted to examine the relationship between the dichotomized DRP groups, the students' use of cognitive tools, and their scores on the posttest and solutions forms. After controlling for DRP group, students' use of tools that share cognitive load and tools that support out-of-reach activities were found to be statistically significant predictors of their posttest score.
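As a rough illustration (not the study's actual code), entries like those in Table 2 could be produced by correlating each tool-category total with the two outcome scores and flagging coefficients significant at the .01 level:

```python
# Sketch: Pearson correlations between tool-category use and the two outcomes,
# flagged when significant at the .01 level (two-tailed), as in Table 2.
# Column names are hypothetical; pandas and scipy are assumed.
from scipy import stats

def correlation_table(df, categories, outcomes=("posttest", "solutions_total")):
    """Print Pearson r for each tool category against each outcome (** = p < .01)."""
    for cat in categories:
        cells = []
        for outcome in outcomes:
            r, p = stats.pearsonr(df[cat], df[outcome])   # two-tailed by default
            cells.append(f"{r:.3f}{'**' if p < .01 else ''}")
        print(cat, *cells, sep="\t")
```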
TABLE 2
Correlations between Tool Categories and Posttest and Solutions Form Scores

Tool Category                                  Posttest   Solutions Forms
Tools that share cognitive load                .224**     .005
Tools that support the cognitive process       .085**     .039
Tools that support out-of-reach activities     .108**     .051
Tools that support hypothesis testing          .174**     .115**

**Correlation is significant at the .01 level (2-tailed).
TABLE 3
Mean Scores on Cognitive Tool Categories by Reading Ability Group

                               Severe Reading           Poor Readers       Proficient Readers
                               Difficulties (DRP 1)     (DRP 2)            (DRP 3)
Cognitive Tool Category        Mean      SD             Mean      SD       Mean       SD
Share cognitive load           29.41a    16.71          30.48b    16.21    36.17a,b   17.78
Support cognitive process      16.93a    15.33          17.99     16.74    21.41a     18.30
Out-of-reach activities        11.20      7.41          11.88      7.97    13.57       8.57
Hypothesis testing              4.65a     4.62           5.21      4.60     6.12a      5.38

Note. Results of Scheffe's post hoc analysis. Means in a row sharing subscripts are significantly different. For all measures, higher scores indicate a higher level of tool use.
Results of the simultaneous linear multiple regression analyses for the posttest and the solutions form total score are presented in Tables 4 and 5. On the posttest dependent variable, there were statistically significant main effects for tools that share cognitive load (p < .001), tools that support out-of-reach activities (p = .024), and the dichotomized DRP groups (p < .001). There was also a statistically significant interaction between the dichotomized DRP groups (at or below 50% vs. above 50%) and tools that share cognitive load (p = .047). Based on these findings, for every unit increase in the square root of tools that share cognitive load, we predict a .792 increase in the posttest score for students in the low DRP group (at or below the 50th percentile). For students in the high DRP group (above the 50th percentile), every unit increase in the square root of tools that share cognitive load predicts a .330 (.792 − .462) increase in the posttest score. Students' use of tools that support out-of-reach activities had a statistically significant negative impact on students' posttest scores, with each unit increase in the square root of tools that support out-of-reach activities resulting in a .676 unit decrease in the posttest score.

TABLE 4
Posttest Coefficients for Multiple Regression Using Dichotomized DRP Groups

Predictor Variable                       Unstandardized Coefficients
SQRT share cognitive load
SQRT support the cognitive process
SQRT support out-of-reach activities
SQRT support hypothesis testing
DRP Dichotomous (>50%/