DEVELOPMENT AND EVALUATION OF A REVISED DEVELOPMENTAL PROGRESSION OF A LEARNING TRAJECTORY FOR VOLUME MEASUREMENT IN THE EARLY YEARS
by
Douglas W. Van Dine

July 30, 2014

A dissertation submitted to the Faculty of the Graduate School of
The University at Buffalo, State University of New York
in partial fulfillment of the requirements for the
Degree of Doctor of Philosophy

Department of Learning and Instruction
Copyright by Douglas W. Van Dine 2014
Acknowledgements

First and foremost I need to thank my husband, Louis Moran. I met you five months into this journey and you’ve been with me every step of the way, providing loving support that only you could give. Even as the journey took us away from Buffalo, you stood by my side and walked with me. Your patience, support, caring spirit, and encouragement are without equal. I love you more and more each day and cannot imagine having completed this journey without you by my side.

Dr. Julie Sarama and Dr. Douglas H. Clements, you opened your door to me as I entered the PhD program at UB. Even though I was a middle and high school math teacher, you took a chance on me, allowing me the opportunity to work with younger children. In the process, you allowed me to uncover a previously-hidden passion for understanding how children learn to measure. The opportunities you have provided as well as the trust you have shown were instrumental in guiding me along the path to this final conclusion.

Along the way, I’ve been joined by some incredible graduate school partners, as well. I will never forget the “Math Lab Dream Team” at UB with Jennifer McDonel and Lisa Napora. The two of you inspired me to delve deeply into understanding how children think and learn. It is through your partnership that I began to see children exploring volume through the use of four different schemes. Maria Vukovich and Kate Newburgh at DU helped me continue that journey and work through the laborious process of putting together coherent thoughts for our monograph on children’s measurement.

I also have to thank my colleagues at ISU. Led by Dr. Jeffery Barrett, the team of Craig Cullen, Amanda Miller, Chepina Rumsey, Melike Kara, and Cheryl Eames were ever-present partners on the journey to understanding children’s measurement. Thank you!
Table of Contents

Acknowledgements .......... iii
List of Figures .......... vi
List of Tables .......... viii
Abstract .......... ix
Paper 1: Development of a Learning Trajectory for Volume Measurement – A review of research .......... 1
    Introduction .......... 1
    A Constructivist Framework .......... 6
    Cognitive Theories of Learning .......... 7
    Studies Focusing on Volume .......... 20
    Studies Focusing on Learning Trajectories .......... 34
    Concluding Thoughts .......... 43
Paper 2: Verifying and Refining a Developmental Progression for a Learning Trajectory for Volume Measurement – Pre-K through Grade 2 .......... 45
    Purpose .......... 45
    Theoretical Framework .......... 45
    Qualitative Methods .......... 47
    Qualitative Results .......... 51
    Consistency with Initial LT for Volume Measurement .......... 51
    Longitudinal Growth Charts .......... 70
    Discussion and Summary .......... 72
    Quantitative Method – Rasch Modeling .......... 74
    Discussion and Summary .......... 79
    Implications .......... 81
    Hypothesizing a New Developmental Progression for Volume Measurement .......... 82
    Final Thoughts and Significance .......... 93
Paper 3: Evaluation of a Revised Developmental Progression of a Learning Trajectory for Volume Measurement – Kindergarten through Grade 2 .......... 94
    Introduction .......... 95
    Theoretical Framework .......... 96
    Method .......... 99
    Results .......... 105
    Discussion .......... 109
    Implications and Further Research .......... 116
Appendices .......... 122
    Appendix A .......... 122
    Appendix B .......... 130
    Appendix C .......... 131
    Appendix D .......... 134
    Appendix E .......... 141
    Appendix F .......... 148
    Appendix G .......... 155
    Appendix H .......... 161
    Appendix I .......... 166
    Appendix J .......... 168
References .......... 175
List of Figures

Figure 1.1. Piaget’s Stages of Cognitive Development .......... 9
Figure 1.2. van Hiele levels of geometric thinking .......... 13
Figure 1.3. Siegler’s overlapping waves model .......... 18
Figure 1.4. Students’ strategies for finding the number of cubes in a rectangular prism array (from Battista & Clements, 1996, p. 263) .......... 26
Figure 1.5. Cognitive milestones in structuring 3-D arrays in order of sophistication (Battista & Clements, 1996) .......... 27
Figure 1.6. Mathematics Learning Cycle (Simon, 1995) .......... 35
Figure 1.7. Length, Area, and Volume summary from TurnOnCCMath.net (Confrey et al., 2012) .......... 37
Figure 1.8. Battista’s depiction of a learning progression .......... 40
Figure 1.9. Hierarchic interactionalism (Sarama & Clements, 2009, p. 215) .......... 43
Figure 2.1. Box and cube used in initial assessment task .......... 53
Figure 2.2. Initial assessment “ice block” task .......... 53
Figure 2.3. Lia chooses the largest cup for lemonade .......... 54
Figure 2.4. Marina compares the cups by height, aligning them and looking across the tops .......... 56
Figure 2.5. Lia compares capacity using more than one dimension .......... 57
Figure 2.6. “Sorting solids by volume” task from Spring of pre-K .......... 64
Figure 2.7. Edith iterating a cube to find the volume of a container .......... 66
Figure 2.8. Robert’s longitudinal growth chart .......... 71
Figure 2.9. Edith’s longitudinal growth chart .......... 72
Figure 2.10. Lia’s longitudinal growth chart .......... 72
Figure 2.11. Composite growth chart for three children .......... 73
Figure 2.12. Item difficulty aligned with LT for volume measurement level .......... 80
Figure 3.1. Item VQ-C2B .......... 100
Figure 3.2. Misfitting item VQ-C1B .......... 107
Figure 3.3. Combined plot for all items by learning trajectory level .......... 111
Figure 3.4. Plots of item by difficulty for filling items .......... 113
Figure 3.5. Plots of item by difficulty for packing items .......... 114
Figure 3.6. Plots of item by difficulty for building items .......... 115
Figure 3.7. Plots of item by difficulty for comparing items .......... 116
Figure 3.8. DIF by Grade .......... 119
Figure 3.9. DIF by gender .......... 120
Figure 3.10. DIF by school .......... 120
Figure F1. Probability Curves for All Items .......... 153
Figure F2. Item-Person Map for All Items .......... 154
Figure G1. Summary of Category Structure and Probability Curves for Volume Items .......... 159
Figure G2. Item-Person Map for Volume Items .......... 160
Figure J1. Standardized residual Contrast 1 Plot .......... 171
Figure J2. Category Probability Curves .......... 173
Figure J3. Wright Map .......... 174
List of Tables

Table 2.1 – A Comparison of the Levels from the Initial and the Revised Developmental Progressions for Volume Measurement .......... 87
Table 3.1 – Discrepancies in data entry .......... 104
Table 3.2 – Items with more than one unacceptable fit statistic after removal of three non-contributing items .......... 106
Table 3.3 – Assessment items from Battista (2012) .......... 118
Table A1 – Attributes, Measuring Length and Capacity by Direct Comparison from TurnOnCCMath.com .......... 122
Table A2 – Volume Measurement from TurnOnCCMath.com .......... 124
Table F1 – Table of Standardized Residual Variance for All Items .......... 148
Table F2 – Item Statistics by Misfit Order for All Items .......... 149
Table F3 – Summary of Measured (Non-extreme) Persons for All Items .......... 150
Table F4 – Summary of Category Structure for All Items .......... 152
Table G1 – Table of Standardized Residual Variance for Volume Items .......... 155
Table G2 – Table of Standardized Residual Loadings for Volume Items .......... 156
Table G3 – Item Statistics by Misfit Order for Volume Items .......... 157
Table G4 – Summary of Measured (Non-extreme) Persons for Volume Items .......... 158
Table J1 – Item Statistics: Measure order .......... 168
Table J2 – Table of standardized residual variance (in Eigenvalue units) .......... 170
Table J3 – Contrast 1 standardized residual loadings for item .......... 172
Abstract

This dissertation is a compilation of three related papers:
• Paper 1: Development of a Learning Trajectory for Volume Measurement – A review of research
• Paper 2: Verifying and Refining a Developmental Progression for a Learning Trajectory for Volume Measurement – Pre-K through Grade 2
• Paper 3: Evaluation of a Revised Developmental Progression of a Learning Trajectory for Volume Measurement – Kindergarten through Grade 2
The first paper provides a framework for the study of volume measurement within a constructivist paradigm. A review of research focused on volume as well as on learning trajectories is presented, culminating in a learning trajectory for volume measurement developed by Clements and Sarama (2009). Research with young children is sparse, however, leading to the conclusion that further evaluation and refinement of this learning trajectory for volume measurement is warranted.

The second paper begins with the learning trajectory for volume measurement developed by Sarama and Clements (2009). Qualitative analysis of longitudinal data from 8 children followed from pre-K through Grade 2, together with quantitative analysis of assessment data from a larger sample of children in pre-K through Grade 5, supported much of the initial learning trajectory but also highlighted the need for revision. As a result, a revised developmental progression for this learning trajectory was developed. This revised developmental progression incorporates four schemes related to volume identified in young children: filling, packing, building, and comparing.

Finally, the third paper presents research completed to further evaluate this revised developmental progression. Using 48 items assessing volume understanding adapted from previous research on volume (Curry & Outhred, Battista, Clements & Sarama), 82 children from
pre-K through Grade 3 were interviewed. The data were submitted to Rasch modeling and the outcome used to answer two research questions:
1. Are the hypothesized developmental progressions for filling, packing, building, and comparing volume valid for a larger sample of children?
2. Is the hypothesized developmental progression for volume, incorporating the subtrajectories for filling, packing, building, and comparing, valid when considered as a single, unidimensional developmental progression, or should there be more than one developmental progression for volume measurement?
Rasch modeling results support the unidimensionality of the volume construct as measured by the instrument. Thus, in answer to Research Question 2, there is evidence of a single, unidimensional developmental progression for volume incorporating the subtrajectories for filling, packing, building, and comparing. Results also support Research Question 1: the developmental progressions for filling, packing, building, and comparing volume are valid for a larger sample.
Paper 1: Development of a Learning Trajectory for Volume Measurement – A review of research
To measure is to take out of a whole one element, taken as a unit, and to transpose this unit on the remainder of a whole: measurement is therefore a synthesis of sub-division and change of position. However, although this way of looking at it seems clear and self-evident, the process is far more intricate in fact. As often happens in psychogenetic development, a mental operation is deceptively simple when it has reached its final equilibrium, but its genesis is very much more complex. (Piaget, Inhelder, & Szeminska, 1960, p. 3)

Introduction

Picture this: A teacher stands in front of his or her classroom, holds up an object such as a cup or a box, and asks the class, “What is the volume of this?” On the surface, this seems like a straightforward and simple query. In reality, there are several obstacles to be navigated in order to provide an answer that is correct in the eyes of the teacher.

The first challenge is the surprising complexity of the question. What is actually meant by volume? Is it about how much the container holds? Does it indicate how much space the materials making up the container occupy? Or is it in reference to the largest exterior dimensions of the object and how much space it appears to occupy? This is a dilemma many mathematicians and scientists find themselves embroiled in: the clarity of definition. In actuality, the term volume has a variety of meanings based on context, as seen in the following from Collins’ English Dictionary (2010):

Volume (ˈvɒljuːm) — n
1. the magnitude of the three-dimensional space enclosed within or occupied by an object, geometric solid, etc.
2. a large mass or quantity: the volume of protest
3. an amount or total: the volume of exports
4. fullness or intensity of tone or sound
5. the control on a radio, etc., for adjusting the intensity of sound
6. a bound collection of printed or written pages; book
7. any of several books either bound in an identical format or part of a series
8. the complete set of issues of a periodical over a specified period, esp. one year
9. history a roll or scroll of parchment, papyrus, etc.
10. speak volumes to convey much significant information

For the term volume, meaning is represented in a variety of contexts – mathematical, scientific, and otherwise; thus, identifying the correct context is of utmost importance.

Looking back into the classroom after teaching middle and high school mathematics for 15 years, I know from experience that if I were to ask the question posed in the initial paragraph, a majority of secondary students would quickly respond with “length times width times height.” Most students would demonstrate little or no recognition of the three-dimensional object (i.e., a rectangular prism) to which this formula for volume applies; it is simply what they have learned to associate with the term volume. Other students may even respond with “length times width” as a formula they know for calculating something; calculating what, however, they are uncertain.

Consequently, where does understanding of volume begin? Where within the school curriculum do we see development of volume understanding, and what does it look like? Clark, Gilbertson, and He (2012) examined how several common elementary curricula presented the ideas of volume and capacity; specifically, they studied Saxon Math (Saxon), Everyday
Page | 3 Mathematics (EM), and Scott Foresman-Addison Wesley (SFAW). Typical definitions of capacity from each curriculum included: •
“The amount something will hold” – Saxon, Grade 3
•
“The amount a container can hold” – EM, Grades 2,3,5
•
“The amount a container will hold” – SFAW, Grades 2,3,4
•
“The amount of liquid that an object can hold” – SFAW, Grades 3,4,5
•
“A measure of the amount of liquid or other substance a container can hold” – EM, Grade 4
Similarly, in examining how these curricula defined volume, Clark et al. (2012) found:
• “The amount of space a shape occupies” – Saxon, Grade 5
• “The number of unit cubes and fractions of unit cubes needed to fill the space taken up by the [3-dimensional] object” – EM, Grade 4
• “A measure of how much space an object takes up” – EM, Grade 4
• “The amount of space taken up by the figure” – SFAW, Grade 4
• “The number of cubic units needed to fill a solid figure” – SFAW, Grade 4
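The cube-counting definitions above connect directly to the formula secondary students recite. As an illustrative sketch (the 3 × 4 × 5 dimensions are my own example, not drawn from any of the curricula), counting unit cubes layer by layer yields the same result as the formula:

```latex
% Volume of a right rectangular prism: the unit-cube count equals the
% product of the linear dimensions. Dimensions are illustrative only.
\[
  V = \ell \times w \times h
\]
\[
  V = \underbrace{(3 \times 4)}_{\text{cubes per layer}} \times \underbrace{5}_{\text{layers}} = 60 \text{ unit cubes}
\]
```

Counting layer by layer in this way is the same “layers of arrays of cubes” decomposition that the Grade 5 Common Core overview describes.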
In addition to definitions specific to volume or capacity, Clark et al. (2012) also referred to “blended” and “joint” definitions. Blended definitions seek to integrate the above definitions of capacity and volume. For example, capacity may be defined as the amount of space taken up, rather than simply the amount held by a container. In the curricula, examples of blended definitions included the following:
• “Capacity is a measure of an amount of liquid” – SFAW, Grade 5
• “Volume can also refer to how much goes inside a container” – SFAW, Grade 3
• “Capacity is a measure of the amount of space something occupies or contains” – EM, Grade 2
• “The volume of a container that holds liquids is often called its capacity” – EM, Grade 3
Joint definitions, however, attempted to include the ideas of both volume and capacity within a single definition. In the curricula, joint definitions appeared as:
• “The amount of liquid a container will hold is called its ‘capacity’ or ‘volume’” – Saxon, Grade 3
• “Both volume and capacity are measures of the amount of space something occupies” – EM, Grade 3
• “Volume (or capacity) is the measure of the amount of space inside a 3-dimensional geometric figure” – EM, Grade 5
Again, the issue of lack of clarity in definition is apparent. Even at different grade levels, the same curriculum presented different definitions for volume and capacity, sometimes providing separate definitions and at other times a blended or joint definition.

Looking beyond the definitions, the typical progression followed by most curricula, textbooks, and teachers paralleled what might be considered the mathematical progression from one to two, then three dimensions. That is, in the early years the focus of instruction was on length; in the next couple of years it moved to area; and it finally arrived at volume in the later elementary years. Thus, US children often have not been exposed to volume topics and contexts until at least third or fourth grade (National Council of Teachers of Mathematics, 2000), or possibly even later (CCSSO, 2010; National Council of Teachers of Mathematics, 2008). What about the deeper understandings of volume? Further, how can educators help children to see volume as more than simply a detached formula floating in space?

The idea that mathematics instruction has often resulted in children missing early volume concepts is not new. Wirszup (1976) summarized the work of Piaget and Inhelder published in the late 1950s and asserted that traditional geometry instruction began too late. Consequently, the concept of measurement is introduced right away and, in doing so, important
qualitative phases that allow children experience with transforming spatial operations into logical ones are omitted from early mathematics instruction. Wirszup stated,

…instruction is realized in a sequence corresponding to the historical development of geometry – from the ‘geometry of measurements’ to the ‘geometry of shape’ – from geometry of position to theoretical geometry. But the development of geometric operations in children actually proceeds in the opposite direction – from the qualitative to the quantitative (p. 76).

Similarly, van Hiele (1999) identified that

[s]econdary school geometry was for a long time based on the formal axiomatic geometry that Euclid created more than 2000 years ago. His logical construction of geometry with its axioms, definitions, theorems, and proofs was – for its time – an admirable scientific achievement. School geometry that is presented in a similar axiomatic fashion assumes that students think on a formal deductive level. However, that is usually not the case, and they lack prerequisite understandings about geometry. This lack created a gap between their level of thinking and that required for the geometry they are expected to learn. (p. 310)

Still today, although the Common Core State Standards (CCSSO, 2010) have introduced the idea of volume in Grade 2 through a general focus on building, drawing, and analyzing three-dimensional figures to help students develop a foundation for understanding volume (p. 17), it is not until Grade 5 that a description and definition are presented. In the general overview to the Grade 5 standards, we find the following:

Students recognized volume as an attribute of three-dimensional space. They demonstrated an understanding that volume can be measured by finding the total number
of same-size units of volume required to fill the space without gaps or overlaps. They understood that a 1-unit by 1-unit by 1-unit cube was the standard unit for measuring volume. Students selected appropriate units, strategies, and tools for solving problems that involved estimating and measuring volume. Further, they decomposed three-dimensional shapes and found volumes of right rectangular prisms by viewing them as decomposed into layers of arrays of cubes. Finally, students measured necessary attributes of shapes in order to determine volumes to solve real world and mathematical problems. (p. 33)

Although Grade 2 provided an introduction to volume, it was not until Grade 5 that volume became apparent as an instructional focus. This instructional focus remained parallel to what Piaget and the van Hieles posited; that is, students began with counting the number of cubes and very quickly turned to calculation of volume using linear dimensions. It was not a far leap, then, to understand how many middle and high school students may have been lacking in deeper understandings of volume and, consequently, only saw volume as a formula or simply a number.

A Constructivist Framework

At this point, the focus shifted to exploring the ways in which different researchers have sought to explain how children learn to understand and measure volume. The first step on that journey was to look at the process of learning itself. Three major frames exist through which scientists, philosophers, researchers, educators, and others have conceptualized children and learning: nativism, empiricism, and interactionalism/constructivism.

In nativism, the child is viewed as born with knowledge and skills “hard wired” into his or her brain. Learning, and the goal of education, is to help the child uncover his or her inborn knowledge. In contrast, the empiricist saw the child born as a blank slate, a “tabula rasa”.
Page | 7 Learning occurred as a result of others, who are more knowledgeable at presenting information to the child. In the empiricist view, everything is thought to enter into the brain through the senses; thus, the child’s blank slate is gradually filled through his or her experiences. The interactionalist or constructivist stance is positioned somewhat between the other two, hypothesizing that the child has certain innate abilities through which he or she constructs knowledge from experience with the world around. As a researcher, I find myself aligning with the interactionalist/constructivist framework. That is, I do not view children as born with all knowledge “hard wired” into their brains, instead children learn as a result of experiences with their environment. What has separated me from the empiricist view, then, is that I do not see that everything needs to be “taught” to children. Adults, or “experts,” are not the purveyors of all knowledge, with children simply opening up their minds to receive information. Instead, children construct knowledge through their experiences, including interactions with “experts” and their own interactions with the environment. It is looking through this lens of constructivism that the development of learning trajectories can be seen as a way to explain the development of knowledge in children. Cognitive Theories of Learning Piaget In the early part of the twentieth century, Jean Piaget set the stage for what would lead to the development of learning trajectories by breaking away from many of his contemporaries, and focusing on qualitative changes in thinking. Whereas many behaviorists looked at learning through quantitative changes, Piaget believed that “changes in quality of thought refer to its nature or structure, not its quantity.” (DeVries, 2008, p. 187). Piaget saw development as
characterized by qualitative changes and fought against the idea of quantitative change; that is, the accumulation of bits of information. Not only did Piaget open the door to a qualitative understanding of how children learn, he also “persuasively demonstrated how qualitatively different a child’s thinking is from an adult’s” in his theory of cognitive development (Inagaki, 1992, p. 116). Inagaki (1992) summarized Piaget’s conception of cognitive development as having two fundamental aspects – structuralist and constructivist. Specifically, Inagaki emphasized that Piaget’s conceptualization incorporated an invariant sequence of general stages or structures of thinking and the belief that cognitive development consisted of a series of constructions.

Structuralist aspect of Piaget’s theory. The structuralist aspect of Piaget’s conception of cognitive development was evident in a sequence of four qualitatively different stages, each characterized by different structures of thinking (operations) or logico-mathematical structures (see Figure 1.1). Piaget believed that “the child progresses through a series of stages, each characterized by different psychological structures, before attaining adult intelligence” (Ginsburg & Opper, 1988, p. 20). Not only were these stages evident in all children, Piaget viewed them as uniformly applicable within individuals. That is, every person moved through the stages in the same order as he or she grew from birth to adulthood.
Stage I – Sensori-motor (0-2 years): At birth, a child’s cognitive system begins with simple reflexes. Through the first two years of life, the child builds on these reflexes to develop more sophisticated procedures – such as object permanence, cause/effect, and symbolic functioning. Knowledge is developing and is based on physical interactions and experiences.

Stage II – Preoperational (2-7 years): The child is egocentric and intuitions are formed based upon one perspective at a time (centration). The child acquires representational skills in mental imagery and the use of language. Rapid cognitive development occurs, but the child struggles with the idea of conservation.

Stage III – Concrete operational (7-11 years): The child is able to take another’s point of view, taking into account more than one perspective simultaneously. He/she can represent transformations as well as static situations and develops conservation of number, length, liquid, mass, weight, area, and volume. Intelligence is demonstrated through logical and systematic manipulation of symbols related to concrete objects. Reversibility in thinking is demonstrated; thus, the ability to apply logical thought to concrete problems appears. A child cannot yet perform on abstract problems and does not consider all of the logically possible outcomes.

Stage IV – Formal operations (11-15 years): The child is capable of thinking logically and abstractly, demonstrating his/her intelligence through the logical use of symbols related to abstract concepts. This allows the ability to reason theoretically. Piaget considered this the ultimate stage of development, one that some would never reach at any age, and stated that although children would still have to revise their knowledge base, their way of thinking was as powerful as it would get.

Figure 1.1. Piaget’s Stages of Cognitive Development
Piaget viewed a child’s cognitive development as an ongoing, dynamic process by which he or she organized experiences and adapted to the external environment to make sense of the world around him or her. Again, it should be stressed that Piaget theorized every child advanced through these stages in the same order with no stage bypassed, but clarified that there was no set age at which a child should be in a specific stage. Some children advanced more quickly, as a result of experience or genetic factors, while other children advanced more slowly. Further, Piaget also held that not all individuals would reach the highest, or formal operations, stage.

In explaining why similar cognitive developments happened at different ages in children, Piaget referred to what he called décalage – a time lag or minor disparity of action by the child (Gruber & Voneche, 1995). These lags could be considered either vertical or horizontal. Vertical décalage referred to a child using increasingly sophisticated approaches on the same or similar tasks where, as the child moved to higher stages of development, the approach changed. For example, a child initially might act physically on a problem or dilemma, but be unable to explain in words what he or she did, then later move to acting symbolically. In turn, Piaget considered this as evidence that the ability to represent symbolically lagged behind the physical. Another example would be a child’s approach to seriation: initially using sensori-motor exploration, then progressing through trial and error, using a simple concrete plan, developing an abstract plan,
and finally, a theoretical approach. Horizontal décalage, or “lack of immediate transfer” (Ginsburg & Opper, 1988, p. 153), referred to a child acquiring a certain cognitive function, but not necessarily demonstrating the ability to apply this to all problems. His or her reasoning appeared tied to a particular situation and did not generalize to others. For example, a child may demonstrate conservation in one context (e.g., conservation of mass) but not another (e.g., conservation of volume).

Constructivist aspect of Piaget’s theory. The constructivist aspect of Piaget’s theory dealt with how knowledge came about in learning. Piaget asserted that knowledge developed little by little through a laborious process of an individual acting on his or her environment, which resulted in elaboration of his or her prior knowledge. Through interaction with his or her environment, a child might assimilate the environment or accommodate to it as a result of prior knowledge. Additionally, knowledge was acquired through reflective abstraction (i.e., by coordinating various components of actions on objects) as well as through empirical abstraction (i.e., by observing the nature of objects and/or their changes due to the actions). Empirical abstraction was often referred to as “physical experience”; similarly, reflective abstraction might be referred to as “logico-mathematical experience” (Gruber & Voneche, 1995, p. 727). According to Piaget, abstraction – a conscious realization of an abstract concept – required considerable time, practice, and varied experiences in order to become knowledge.

To better understand Piaget’s theory of cognitive development, it is important to define four integral concepts: schemata, assimilation, accommodation, and equilibrium. Schemata are the cognitive structures by which an individual intellectually has adapted to and organized his or her environment. For a child, a schema is reflected in his or her pattern of reasoning.
Piaget asserted that children’s schemata might not be identical to those of adults, but each child’s organization would always be internally consistent.

Assimilation is the cognitive process by which an individual integrates new stimuli into his or her existing schemata, thus incorporating aspects of the world around into his or her cognitive structures. In the process of assimilation, a child “…remains unaware of, or disregards, whatever does not fit into the conceptual structure [he or she] possesses” (von Glasersfeld, 1995, p. 63). If the child has not assimilated new stimuli or experiences into his or her existing schemata, then perturbation (i.e., agitation or tension) may result and lead to either rejection or accommodation. Accommodation involves the creation of new schemata or the modification of old schemata in response to external demands. Assimilation and accommodation go hand-in-hand and account for intellectual adaptation in cognitive structures – the schemata. “The individual not only modifies structures in reaction to external demands (accommodation), he also uses his structures to incorporate elements of the external world (assimilation)” (Ginsburg & Opper, 1988, p. 18). When a balance between assimilation and accommodation is met, there is equilibrium, a state of cognitive balance. When there is an imbalance, or disequilibrium, there is cognitive conflict and a state ripe for change.

Factors affecting cognitive development. Piaget’s primary goal could be seen “as the study of children’s gradual attainment of intellectual structures which allow for increasingly effective interactions with the environment” (Ginsburg & Opper, 1988, p. 13). Thus, experience (both physical and logico-mathematical), maturation, and social interaction were integral to Piaget’s theory of cognitive development. Experience resulted from the cumulative influences of the physical environment on the individual through direct interaction with objects as well as
observation of actions upon those objects. Maturation dealt with what a child genetically inherited, and the development of that genetic material. Social interaction included social transmission, the effects of social influences, and the interchange of ideas between the individual and others, whether with peers, parents, or other adults. Later, Piaget added equilibration as a fourth factor, explaining that “development involves a conflict among existing schemes, the child’s assimilation of new problems into those schemes, and a self-regulated adjustment or progression of the current modes of thought” (Ginsburg & Opper, 1988, p. 212). Thus, a child did not experience a static condition, but rather an expanding equilibration which resulted in cognitive development characterized by “…an increase in the range of perturbations the organism is able to eliminate” (von Glasersfeld, 1995, p. 67). Equilibration could be seen as an individual’s internal self-regulating system.

Piaget, along with this overall theory of cognitive development, also focused on exploring and explaining the child’s conceptions of space and geometry. Further commentary will be provided on both of these concepts in later sections, as the discussion is shifted to specifically understanding how children develop an understanding of volume.

van Hiele

The work of Pierre van Hiele and his wife Dina van Hiele-Geldof focused on the role of instruction, namely geometry instruction, in cognitive development. Of importance is that many of the van Hieles’ tenets paralleled Piaget’s theory of cognitive development, as well as his commentary on mathematics education. “Piaget’s point of view, which I support affectionately, was that ‘giving no education is better than giving it at the wrong time.’ We must provide teaching that is appropriate to the level of children’s thinking” (van Hiele, 1999, pp. 310-311).
The van Hiele Levels of geometric thinking. Through their research on geometry instruction, the van Hieles identified five levels of thinking in the development of students’ understanding of geometric concepts (Figure 1.2).

Level 1: Visual
• Can name and recognize shapes by their appearance, but cannot specifically identify properties of shapes
• Although able to recognize characteristics, do not use them for recognition and sorting

Level 2: Descriptive/Analytic
• Begin to identify properties of shapes and use appropriate vocabulary related to properties, but do not make connections between different shapes and their properties
• Irrelevant features, such as size or orientation, become less important
• Able to focus on all shapes within a class
• Begin to talk about the relationship between shapes and their properties

Level 3: Abstract/Relational
• Recognize relationships between and among properties of shapes or classes of shapes and are able to follow logical arguments using such properties

Level 4: Formal Deduction
• Go beyond just identifying characteristics of shapes and are able to construct proofs using postulates or axioms and definitions

Level 5: Rigor/Mathematical
• Can work in different geometric or axiomatic systems
• Describe the effect of adding or deleting an axiom on a given geometric system
• Can compare, analyze, and create proofs under different geometric systems

Figure 1.2. van Hiele levels of geometric thinking.

Clements and Battista (1992) later asked the question, “What is the most basic level; that is, does a Level 0 exist?” (p. 429) and cited evidence that many students failed to demonstrate thinking at the visual level. Therefore, they proposed the addition of the following to provide a full and complete picture of geometric thinking:

• Level 0 – Pre-recognition: Children perceive shapes, but may only attend to a subset of a shape’s visual characteristics. Additionally, they are unable to identify many common shapes and may not distinguish between figures in the same class (e.g., can distinguish between a circle and a square, but not between a square and a triangle).
It should be noted that although the van Hieles’ research focused on students in secondary geometry classes, their theory is applicable from elementary school through high school and beyond (Burger & Shaughnessy, 1986; Fuys, Geddes, & Tischler, 1988; Wirszup, 1976). The van Hieles believed that if a student’s levels of thinking were addressed with appropriate levels of instruction, the student would be more invested in the learning process, resulting in a better likelihood of developing insight and moving to higher levels of understanding. Additionally, as with Piaget’s stage of formal operations, the van Hieles believed that few would reach the highest level, that of rigor or mathematical thinking.

Assisted by appropriate instructional experiences, the model asserts that the learner moves sequentially from the initial, or basic, level (visualization), where space is simply observed – the properties of figures are not explicitly recognized, through the sequence…to the highest level (rigor), which is concerned with formal abstract aspects of deduction. Few students are exposed to, or reach, the latter level. (Crowley, 1987, p. 9)

The van Hieles’ view of development through levels of understanding, however, did diverge slightly from that of Piaget. For Piaget, age and biological maturation were the main factors in advancing from one level to a higher level, with experience playing a smaller role. The van Hieles put a much stronger emphasis on instruction.

How do students develop such thinking? I believe that development is more dependent on instruction than on age or biological maturation and that types of instructional experiences can foster, or impede, development… (van Hiele, 1999, p. 311)

In the van Hiele levels of geometric thinking, as with Piaget’s décalage, students similarly may be at different levels or vacillate between levels for different geometric concepts or with different tasks (Burger & Shaughnessy, 1986; Mayberry, 1983).
For instance, Gutiérrez,
Jaime, and Fortuny (1991) discovered some students “used several levels at the same time, probably depending on the difficulty of the problem” (p. 250). However, Gutiérrez et al. did not reject the hierarchical structure of the van Hiele levels, but suggested that the theory should be adapted to reflect “the complexity of the human reasoning process” (p. 250). The levels as described by van Hiele appeared to be discrete in nature, but according to Gutiérrez et al., most researchers would “…consider that the movement from one level to the following one is a continuous process, since the acquisition of a thinking level by a student is gradual…” (p. 32). Similarly, in a longitudinal study, Lehrer, Jenkins, and Osana (1998) examined the spatial reasoning of elementary students and were unable to identify or classify children at a single level of development. They concluded that “level mixture was therefore the most typical pattern of response, and children’s justifications often ‘jumped’ across nonadjacent levels of the van Hiele hierarchy” (p. 142).

Phases in the learning process associated with the van Hiele levels. “Van Hiele stressed the importance of teachers assisting students through phases of an exploratory and cumulative nature that build concepts and related language” (Martin, 2009, p. 11). With that, van Hiele strived for concrete and practical methods:

Instruction intended to foster development from one level to the next should include sequences of activities, beginning with an exploratory phase, gradually building concepts and related language, and culminating in summary activities that help students integrate what they have learned into what they already know. (van Hiele, 1999, p. 311)

The final statement in the quote above hearkened back to the interplay between Piaget’s assimilation and accommodation.
When students were able to access new information, there was the potential they would integrate it into what they already understood and, thus, learning was
advanced. Comparably, the processes of assimilation and accommodation, induced by interaction with the environment, accounted for an individual’s intellectual adaptation, therefore resulting in a change in the individual’s cognitive structures or schemata.

Following from the above quote more specifically, van Hiele asserted that “to promote the transition from one level to the next, instruction should follow a five-phase sequence of activities” (van Hiele, 1999, p. 315). He felt that instruction should begin with an inquiry phase, in which materials led children to explore and discover certain structures. This would be followed by direct orientation (the second phase), where tasks were presented so that the characteristic structures appeared gradually to children (e.g., using puzzles that reveal symmetry of pieces). In the third phase, explication, the teacher introduced terminology and encouraged its use in conversations and written work about geometry. In free orientation, the fourth phase, the teacher presented tasks with multiple solution paths, enabling children to become more proficient with what they already knew (e.g., explorations of making different shapes with various pieces). The fifth phase was that of integration. This was where the student had oriented himself, but still needed to acquire an overview of all the methods at his disposal. During integration, he would work to condense all that he had explored into one integrated whole. The teacher could aid this by providing global overviews. It was important, however, that overviews did not present anything new to the student; they simply summarized what the student had already learned. The final result was that a new level of thought was attained at the close of this fifth phase. The student had at his disposal a system of relations related to the whole of the domain explored. This new domain of thought was then substituted for the previous domain of thought.
Van Hiele pointed out that traditional instruction often involved only this last phase, which he felt explained why students did not master the material. Appropriate instruction,
according to van Hiele, was key. “Children whose geometric thinking you nurture carefully will be better able to successfully study the kind of mathematics Euclid created” (van Hiele, 1999, p. 316).

Overlapping Waves Theory – Siegler

Paralleling the work of Gutiérrez et al. (1991) and Lehrer, Jenkins, et al. (1998), Siegler (1996) demonstrated that children’s thinking had greater variability than recognized within the majority of theories of cognitive development. His research showed variability of several different types, including (a) different children using different strategies, (b) individual children using different strategies on different problems within a single session, and (c) individual children using different strategies to solve the same problem on two occasions close in time. As a result, Siegler proposed his overlapping waves theory, the basic assumption of which was that development occurred through a process of variability, choice, and change (Siegler, 1996, 2006). Figure 1.3 provides a pictorial display of Siegler’s overlapping waves model in which he

…posited that children typically know and use varied strategies for solving a given problem at any one time. With age and experience, the relative frequency of each strategy changes, with some strategies becoming less frequent (Strategy 1), some becoming more frequent (Strategy 5), some becoming more frequent and then less frequent (Strategy 2), and some never becoming very frequent (Strategy 3). In addition to changes in relative use of existing strategies, new strategies are discovered (Strategies 3 and 5), and some older strategies abandoned (Strategy 1). (Siegler, 2006, p. 478)
Figure 1.3. Siegler’s overlapping waves model.

One of the benefits of the overlapping waves model was that it allowed for the integration of both qualitative and quantitative aspects of learning within a single framework. That is, as children demonstrated novel and qualitatively different strategies, new waves were added to the model to represent these strategies. In addition, as a strategy increased or decreased in frequency of use – representing a quantitative change – that particular wave also increased or decreased in amplitude, respectively. Siegler stressed that “learning clearly involves both qualitative and quantitative changes; there is no reason for developmental theories to focus on one to the exclusion of the other” (Siegler, 2006, p. 478).

There are, however, other aspects of Siegler’s overlapping waves theory that cannot be depicted pictorially in a figure such as Figure 1.3. First, even in the earliest learning, children make adaptive choices from among known strategies, choosing strategies to fit the demands of problems and circumstances. These choices often depend upon finding a desirable combination of speed and accuracy from the available strategies and knowledge a child possesses. Second, and related to the first, is that these choices generally become even more adaptive and targeted with experience in the content area. Third, improvements in children’s performance flow from
a combination of generating new superior approaches, relying more heavily on relatively advanced approaches already known, moving to increasingly adaptive choices from among available approaches, and improving the execution of all approaches.

In his overlapping waves theory, developed in conjunction with his microgenetic analysis, Siegler (2006) ascertained that cognitive change can be analyzed along five dimensions: source, path, rate, breadth, and variability.

The source of change referred to the causes that set the change in motion. The path of change is the sequence of knowledge states, representations, or predominant behaviors that children use while gaining competence. The rate of change concerns how much time or experience separates initial use of a new approach from consistent use of it. The breadth of change involves how widely the new approach is generalized to other problems and contexts. The variability of change refers to differences among children in the other dimensions of change, as well as to the changing set of strategies used by individual children. (p. 479)

Therefore, Siegler considered it of utmost importance to examine all five of these dimensions in order to obtain a full understanding of cognitive change in individuals.

In summary, Piaget set the stage for constructivism through looking at qualitative changes in children’s thinking and learning. More recent and continued research has built on that foundation, and constructivism has continued to open the door to a more comprehensive understanding of how children learn and to how children grow to understand volume and volume measurement. Before turning to the idea of learning trajectories, which may seem a natural step at this point, I want to take a moment to look at previous research specific to spatial structuring and volume understanding.
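Although Siegler’s figure is schematic rather than mathematical, the changing strategy mix his model describes can be illustrated with a small computational sketch. Everything here is a hypothetical assumption made for illustration – the bell-curve form, the peak times, and the widths are my choices, not part of Siegler’s theory – but the sketch captures the core idea that several strategies coexist at any moment while their relative frequencies shift with experience.

```python
# Illustrative toy model of overlapping waves: several strategies coexist,
# each with a usage curve that rises and falls over time; relative
# frequencies at any moment are the normalized mixture of those curves.
# All numeric parameters below are hypothetical, chosen only for illustration.
import math

def usage(peak, width, t):
    """Unnormalized bell-shaped usage curve for one strategy at time t."""
    return math.exp(-((t - peak) ** 2) / (2 * width ** 2))

# Five hypothetical strategies peaking at different points of experience.
peaks = {"S1": 0, "S2": 3, "S3": 5, "S4": 7, "S5": 10}
widths = {"S1": 2, "S2": 2, "S3": 4, "S4": 2, "S5": 3}

def mixture(t):
    """Relative frequency of each strategy at time t (frequencies sum to 1)."""
    raw = {s: usage(peaks[s], widths[s], t) for s in peaks}
    total = sum(raw.values())
    return {s: raw[s] / total for s in raw}

early = mixture(0)
late = mixture(10)
# Early in learning the earliest-peaking strategy dominates the mix;
# with experience, a later-discovered strategy becomes most frequent.
print(max(early, key=early.get))  # S1
print(max(late, key=late.get))    # S5
```

The point of the sketch is only the qualitative pattern Siegler describes: at every time step all five strategies have nonzero frequency (level mixture), while the dominant strategy changes gradually rather than in discrete jumps.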
Studies Focusing on Volume

A number of geometricians believe that instruction in elementary notions of space should begin with the idea of volume, because that idea is less abstract, since all the objects we encounter in our everyday experience are in fact three-dimensional rather than two-dimensional or linear. There is some justification for their point of view where the argument is limited to early topological intuitions of space (although, it should be borne in mind that volumes are always bounded by areas and areas in turn are bounded by lines, so that even at the earliest levels of representation, linear considerations are far more important than they are in perception). (Piaget et al., 1960, p. 360)

Spatial awareness develops quickly in newborn infants. Initially, an infant’s world is limited to the space close around him or her. Gradually, as objects are explored within the child’s reach, he or she may begin to develop concepts, such as near/far, up/down, and big/little. At first, these concepts are relative to himself or herself – “I reach out, but can’t touch it” or “I can touch it, but it won’t fit in my hand.” As the child gains mobility, his or her world expands and new spatial ideas such as here/there, in/out, under/over, and around emerge. Eventually these concepts are linked with language, and later refined and clarified through application in a variety of situations.

Piaget and children’s conception of space

Piaget and Inhelder (1967/1997) contended that space was understood initially by topological relations and later by projective and Euclidean relations – Piaget’s topological primacy theory. This theory represented a reversal of the historical development of geometry: Euclidean geometry arose in antiquity, followed centuries later by the development of projective geometry, and later still by topology.
Since its inception, Piaget’s topological primacy theory has found support from some researchers (Lehrer, Jacobson,
et al., 1998) while being contested by others (Clements & Battista, 1992; Clements, Swaminathan, Hannibal, & Sarama, 1999; Lovell, 1968). Piaget contended that children initially relate to the objects around them in a topological fashion; for example, distinguishing an object’s general properties of inside and outside. Perspective and metric relationships were ignored, resulting in children initially understanding shapes topologically; that is, in terms of four basic concepts:

• Proximity – the relative nearness of an object or event to any other;
• Order – the sequence of objects or events (in time) according to size, color, or some other attribute;
• Separation – an object, event, or ‘space’ coming between other objects or events allowing for distinguishing between objects and parts of objects; and
• Enclosure – an object or event surrounded by other objects or events, involving the ideas of inside, outside, and between.
With maturity and experience, the child began to relate to objects by taking other viewpoints into consideration. Consequently, this led to the conservation of straight lines. Finally, the more mature child took on a Euclidean perspective, demonstrating the ability to conserve distance, angles, and parallel lines. As such, the development of representational space required time. As an example, a child in the preoperational stage would consider a square, a circle, and a rhombus similarly based on his or her topology. The child’s drawings most likely would be closed, but would lack details of angle size and congruent sides. It is only with maturity and experience that the child began to develop a more sophisticated mental structure to represent objects spatially. Piaget would assert that it is important for adults to recognize that their assumptions of young children perceiving geometric shapes in a way similar to their own understanding of a three-dimensional coordinate system may be flawed. In fact, vertical-
horizontal coordination is not fully developed until age 8 or 9 (Piaget et al., 1960, p. 4). Contradictory studies by Lovell, however, have indicated that “…horizontal and vertical, and hence axes of reference, were understood by some 7-year-olds far better than one would expect from the Geneva results” (Lovell, 1968, p. 103). Moreover, Piaget asserted that motor activity was of enormous importance for the understanding of spatial thinking; that is, “…to recognize geometrical shapes the child has to explore the whole contour” (Piaget & Inhelder, 1967/1997, p. 23).

After the child made sense of the shape of objects topologically, he or she would be confronted with the dilemma of how to make sense of objects as viewed from different perspectives. Of course, this dilemma could only exist once the child realized that different perspectives could exist, thus challenging the child’s egocentrism. Children between the ages of four and six begin to probe shapes through haptic perception – through the sense of touch, sometimes even in the absence of visual stimulation – yet, initially, they do so in a haphazard manner. They may happen upon cues or pointers that help to distinguish curved and straight sides, leading to recognition of the presence or absence of angles. Children’s drawings at this point demonstrate an attempt to represent these angles, but the drawings still lack true representation of the shape. It is not until the age of seven or eight, and through the development of more complex perceptual activity, that a child may return systematically to a stable reference point of a shape, and thus achieve “reversible co-ordination” (Piaget & Inhelder, 1967/1997, p. 36). “…drawing, like the mental image, is not simply an extension of ordinary perception, but is rather the combination of the movements, anticipations, reconstructions, comparisons, and so on, that accompany perception and which we have called perceptual activity” (Piaget & Inhelder, 1967/1997, p. 33).
Up to ages seven and eight, children increasingly coordinate their physical actions and correspondingly show an internal coordination of their schemata. This coordination is present in conjunction with the evidence of anticipation of result, but still may demonstrate a lack of long-range planning. At this stage, the “linking together of intuitions, the formation of trains of ideas” is beginning to occur (Piaget & Inhelder, 1967/1997, p. 454). The coordination of schemata is an indication that the child is capable of mental reversibility. Therefore, an “initial equilibrium state is reached by internalized actions, [which] constitutes the first truly operational system” (p. 455).

Piaget, through his observations and explanations of a child’s conception of geometry and space, opened the door for further research on children’s understanding of volume. As previously stated, some researchers have agreed with Piaget’s conclusions while others have contested them. In the end, however, and possibly of the greatest importance, was that Piaget’s research spurred other researchers to explore more deeply the developmental processes by which children understand space and geometry, or, specifically for purposes here, volume.

Piaget and children’s conception of geometry

Piaget spent a great deal of time studying children engaged in what might appear to many adults as simple measurement tasks, many built on the idea of conservation. Conservation is the recognition that the attributes of an object (e.g., length, area, and volume) remain constant even though the appearance of the object may be changed. Piaget, Inhelder, and Szeminska (1960), in their studies of children’s conception of geometry, and Piaget and Inhelder (1967/1997), in their studies of children’s conception of space, questioned whether the development of conservation precedes measurement or, conversely, if conservation was an outcome of measurement understanding.
In his studies, Piaget presented children, ranging in age from three to nine years, with measurement tasks that required the children to use manipulative materials to solve story-embedded problems involving length, area, and volume. For volume, Piaget presented three tasks: the first involved the conservation of continuous quantity, the second focused on conservation of substance, and the third on conservation of volume (Ginsburg & Opper, 1988; Piaget et al., 1960). From these studies with children, Piaget related:

…the conservation of a quantity of matter appears as early as level IIIA, but the conservation of volume as a physical concept is not elaborated until stage IV, being the level of formal operations. (Piaget et al., 1960)

Piaget concluded that children conserved length around the age of six, area around eight, and volume around eleven. In discussing volume, Piaget distinguished two ways to look at volume. He referred to "occupied space" as a "moveable 'container'" and "interior volume" as a "fixed spatial 'container'" (Piaget et al., 1960, p. 394). Lovell clarified these distinctions, relating occupied space to passing "…our hands around a box, block, or ball, to indicate the amount of space taken up by the object" (1968, p. 122). Similarly, he related interior volume to moving "…our hands about inside, say, a box or cupboard, in order to indicate the amount of space within" (1968, p. 122). Piaget determined that in order for a child to be successful on a conservation task, he or she needed to demonstrate three aspects of reasoning: (a) the identity of the object, (b) compensation of dimensions, and (c) reversibility of actions. First, through understanding identity, the child realized an object remained the same if nothing was added to or subtracted from it. Second, compensation indicated the child realized that changes in one dimension could
be offset by changes in another. Third, with reversibility, the child understood that mentally reversing steps would return an object to its original state and cancel out a change. Piaget concluded that children below the formal operations stage conserved area based on the primitive conception of area (and volume) as "bounded by lines (or faces)" (Piaget et al., 1960, p. 355). At the early concrete operational stage, children's intuitions were "…still topological in character" (p. 368) and "…conservation is still limited to interior volume and does not as yet extend to the spatial relations between the object and its surroundings" (p. 374). He observed the first sign of the child "making full use of the concept of a unit of measurement or, alternatively, that he equated his measurements of boundary sides with his notion of volume" at the higher end of the concrete operational stage (p. 377).

Clements & Battista

Battista and Clements (1996) studied how 3rd-, 4th-, and 5th-grade students conceptualized and enumerated cubes in three-dimensional arrays of rectangular prisms by asking children to determine the number of unit cubes required to construct various rectangular prisms. Findings indicated that children's use of spatial structuring – "the mental act of constructing an organization or form for an object or set of objects" (p. 282) – in layering arrays of cubes to determine the volume was not intuitive; rather, the process was created by mental actions (cognitive operations). Thus, successful enumeration strategies required children to coordinate the views of the rectangular prism's faces, then integrate the information gained from those views "to form a coherent conception of the whole" (Battista & Clements, 1996, p. 271), which resulted in a global structuring of the cubes. In their results, they identified several strategies students utilized in the enumeration tasks.
These strategies were classified into five categories as shown in Figure 1.4.
Category A. Conceptualizes the set of cubes as forming a rectangular array organized into layers
1. Layer multiplying: Computes or counts the number of cubes in one layer (vertical or horizontal) and multiplies by the number of layers.
2. Layer adding/iteration: Computes the number of cubes in one layer (vertical or horizontal) and uses addition or skip counting (pointing to successive layers) to obtain the total.
3. Counting subunits of layers: Cube counting is organized by layers, but counts by ones or skip-counts by a number other than the number of cubes in a layer.

Category B. Conceptualizes the set of cubes as space-filling but does not utilize layers
1. Column/row iteration: Counts the number of cubes in one row or column and uses skip-counting (pointing to successive rows or columns) to obtain the total.
2. Counting subunits of columns or rows: Cube counting is organized by row or column, but counts by ones or skip-counts by a number other than the number of cubes in a row or column.
3. Systematic counting: Counts cubes systematically, attempting to count both inside and outside cubes.*
4. Unsystematic counting: Counts cubes in a random manner, often omitting or double-counting cubes, but clearly tries to account for inside cubes.*

Category C. Conceptualizes the set of cubes in terms of its faces
1. Counting subset of visible cubes: Counts all, or a subset of, the cubes on the front, right side, and top – those visible in the picture.*
2. Counting all outside cubes: Counts outside cubes on all six faces.*
3. Counting some outside cubes: Counts outside cubes on some visible and some hidden faces but does not count cubes on all six faces of the prism.*
4. Counting front-layer cubes: Counts outside cubes in the front layer.
5. Counting outside cubes, but not organized by faces.

Category D. Uses the formula L x W x H
Explicitly says a formula is used, or implies it by saying, "Multiply this times this times this" [pointing to relevant dimensions]. No indication of understanding in terms of layers. (Note: If a formula was used, the interviewer followed up with, "Why did you multiply these numbers together? Why does this work?")

Category E. Other
Uses a strategy other than those described in A–D, such as multiplying the number of squares on one face times the number on another face.

* When this strategy was used, cubes on some edges were double-counted.
Figure 1.4. Students’ strategies for finding the number of cubes in a rectangular prism array (from Battista & Clements, 1996, p. 263).
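The Category A layer strategies amount to simple iterative arithmetic. As a rough computational sketch (the function names and the 4 x 3 x 2 box are illustrative, not from the study), layer multiplying and layer adding/iteration can be rendered as:

```python
# Sketch of two Category A strategies from Figure 1.4, applied to a
# hypothetical 4 x 3 x 2 box of unit cubes. Names are illustrative only.

def layer_multiplying(length, width, num_layers):
    """Strategy A1: count the cubes in one (horizontal) layer,
    then multiply by the number of layers."""
    cubes_per_layer = length * width
    return cubes_per_layer * num_layers

def layer_adding(length, width, num_layers):
    """Strategy A2: compute one layer, then add (skip-count)
    layer by layer, as if pointing to successive layers."""
    cubes_per_layer = length * width
    total = 0
    for _ in range(num_layers):
        total += cubes_per_layer
    return total

# Both strategies structure the same 3-D array and agree on the count.
assert layer_multiplying(4, 3, 2) == layer_adding(4, 3, 2) == 24
```

Both functions return 24 for the 4 x 3 x 2 box; the difference between them mirrors the multiplicative versus additive coordination of layers that Battista and Clements observed.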
Battista and Clements (1996) found a fairly consistent distribution of strategies across problems. Therefore, they concluded that the use of non-layering strategies could not be attributed to any misinterpretation of diagrams, as had been suggested in some studies (Ben-Haim, Lappan, & Houang, 1985). Battista and Clements (1996) found the following:
• About 60% of fifth graders, but less than 20% of third graders, used layering strategies.
• No student used the volume formula meaningfully.
• Double counting of cubes was the cause of many errors. (64% of the third graders and 21% of the fifth graders double-counted cubes at least once; 33% of the third graders and 6% of the fifth graders double-counted on all three items.)
• Students who applied a local, rather than global, structuring would lose track of what they were doing, and thus were unable to make a correct enumeration. These students would often group some of the cubes that made up a portion of a side, column, or row, but overall demonstrated no scheme for organizing these groups.
Battista and Clements (1996) proposed that students must be able to spatially structure a three-dimensional array of cubes to enumerate the cubes in a meaningful way. This structuring involved (a) recognition of a unit, (b) establishing relationships between the units, and (c) recognizing that a subset, if repeated properly, can generate the whole (the repeating subset forming a composite unit). This thinking can be traced back to Piaget's findings:

…the reconstruction of shapes is not just a matter of isolating various perceptual qualities, nor is it a question of extracting shapes from the objects without more ado. The reconstruction of shapes rests upon an active process of putting in relation, and it therefore implies that the abstraction is based on the child's own actions and comes about through their gradual co-ordination. (Piaget & Inhelder, 1967/1997, pp. 78-79)

Battista and Clements (1996) also identified cognitive milestones in structuring three-dimensional arrays. These are summarized in Figure 1.5 below.

Medley of views: Student organizes cubes or sets of cubes seen in only one face of the prism at a time (local structuring) – Strategy C
Composite units: Student conceptualizes one or more faces of the prism as a composite of 2-D arrays of cubes
Coordination: Student recognizes the interrelatedness of the different views of the 3-D array and coordinates accordingly
Integration: Student constructs a coherent mental model and coordinates the orthogonal views of the prism
Figure 1.5. Cognitive milestones in structuring 3-D arrays in order of sophistication (Battista & Clements, 1996)
Although there were differences in how students structured an entire 3-D array, it appeared that the most effective approach was layering the composite 2-D array of one face. Included in this cognitive milestone was "maintaining the faces-as-composites," which required that a student recognize that a 2-D array on one face was a representation of "composite units of cubes." Thus, to advance from a "medley of views" perspective, a student had to apply mental coordination of the orthogonal views. The initial result was that shared cubes were often double-counted when the cubes in the 3-D array were enumerated. Students needed to consider that different views of the prism would show different faces of particular cubes that were shared among views. Finally, there was the integration of the views, in which the student constructed "a coherent mental model" of the object that encompassed these views, thus coordinating the orthogonal views. Battista and Clements (1996) hypothesized two processes by which a mental, spatial structuring model could be formed by an integration operation: the first depended on "recall" coordination; the second required transformation and coordination of images of objects (e.g., visualizing the iteration of a single layer through a distance determined by the third dimension). It is interesting to note that they suggested this "…spatial structuring provides the input and organization for enumeration. However, …sometimes it seems that attempts at enumeration engender spatial structuring or restructuring" (Battista & Clements, 1996, p. 288). This two-way synergy of enumeration supporting structuring and vice versa was also suggested by Ben-Haim et al. (1985). Their study indicated that middle school students, after an instructional intervention, showed improvement in solving problems involving orthogonal views. Ben-Haim et al. (1985) contended that spatial visualization can be improved by "training."
Battista and Clements (1996) seem to be in agreement with Cobb, Yackel, and Wood (1992) that "…neither spatial structure nor mathematical meaning is inherent in objects …which are intended to embody certain concepts. Such meaning must be constructed by the individual" (Battista & Clements, 1996, p. 289). Thus, Battista and Clements (1996), as well as Cobb et al. (1992), parallel Piaget and Inhelder (1967/1997) in that the visual reconstruction of shapes occurs when children are able to abstract the structure of the object as a result of coordination of their own actions. Battista and Clements (1996) proposed a "…rudimentary, and by no means complete, theory of the development of students' understanding of 3-D cube arrays" (p. 291): (a) the notion of spatial structuring, (b) the forming of composites, and (c) the coordination and integration operations. Battista (1999) verified the findings of Battista and Clements (1996) and also concluded that efficiently enumerating the cubes in a 3-D array, which required both coordination and integration, was complex. Battista (1999), though, went beyond the scope of the earlier study (Battista & Clements, 1996), in that he also investigated the sociocultural aspects evident in an inquiry-based, problem-centered classroom. Similar to the Battista, Clements, Arnoff, Battista, and Borrow (1998) two-dimensional study, which involved students making predictions prior to manipulation of square tiles, Battista (1999) required the students to predict the number of cubes needed to fill various boxes. Again, in order for a student to develop a full understanding of spatial structuring, Battista (1999) contended that reflective abstraction in Piaget's sense was necessary. This also correlated with the van Hiele levels, in which what is implicit at one level is thought to become explicit at the next.
The interchange between the pairs of students in Battista's study had a considerable effect on the mental processes of each partner. In the discussions that arose between pairs of students
and small groups, students were forced to justify their thinking, while at the same time extending their own conceptualizations to make sense of a partner's justifications. While all five pairs of students in Battista's (1999) case studies initially structured mental models of 3-D arrays as uncoordinated sets of orthogonal views, those who gained the most understanding of how to fill the box with cubes did so by coordinating the views. Those students who did not coordinate the views were not able to correct their structuring errors. Further, students who were able to interiorize the layer structure did so recursively, "…cycling through sequences of acting (structuring and enumerating), reflecting, and abstracting" (Battista, 1999, p. 442). However, students were often not consistent in using the layering structure.

Curry and Outhred

Curry and Outhred (2005) were interested in understanding the interrelationships between the development of length, area, and volume understanding. In this, they described two different ways to look at volume:

(Internal) volume may be measured in two ways. In one method, the space is packed with a three-dimensional array consisting of a two-dimensional array of units which is iterated in the third dimension. In the second method, the space is filled by iterating a fluid unit which takes the shape of the container. In this method, the unit structure is one-dimensional. To differentiate these two methods, we shall call them volume (packing) and volume (filling) respectively. (pp. 265-266)

The study provided confirmation that students developed an understanding of length, area, and volume measurement from Grade 1 to Grade 4, and also provided insight into how this development occurred. Overall, Curry and Outhred (2005) claimed confirmation of the general order of length-area-volume development, but also presented some other interesting findings. Specifically, their work indicated the following:
• Students were able to measure volume by filling as well as they did length, using a similar unit iteration procedure;
• Measurement of length was not a prerequisite for measurement of area, partly because both seemed to be affected by a general tendency towards precision in recording the unit iteration; and
• Understanding of the unit structure for area provides the foundation for understanding measurement of volume by packing. (Curry & Outhred, 2005, p. 271)
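Curry and Outhred's distinction between volume (packing) and volume (filling) can be made concrete in a short sketch (the function names and quantities are illustrative assumptions, not from the study): packing iterates a 2-D array of units through the third dimension, while filling iterates a single fluid unit along one dimension.

```python
def volume_by_packing(length, width, height):
    """Packing: a 2-D array of unit cubes (one layer) is iterated
    through the third dimension, layer by layer."""
    layer = length * width            # the iterated 2-D array of units
    return sum(layer for _ in range(height))

def volume_by_filling(capacity, cup_size):
    """Filling: a single fluid unit (e.g., one cupful) is iterated
    until the container is full -- a one-dimensional unit structure."""
    cups = 0
    poured = 0
    while poured < capacity:
        poured += cup_size            # one more cupful
        cups += 1
    return cups

assert volume_by_packing(4, 3, 2) == 24   # 2 layers of 12 cubes
assert volume_by_filling(24, 3) == 8      # 8 cupfuls of 3 units each
```

The contrast in the two loops is the point: packing requires coordinating a two-dimensional composite unit with a third dimension, whereas filling only requires repeating one unit along a single count.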
In terms of specific conclusions regarding volume, they expressed surprise at how well students could measure volume by filling, yet how lacking students appeared to be in their understanding of volume by packing. To explain this, the researchers pointed to either a potential lack of experience with volume (packing) on the part of the students or a lack of emphasis on volume (packing) by teachers. In their study, Curry and Outhred (2005) also presented what they called the first three levels of the learning framework for measurement, including the following:
• Level 1 – Identification of the attribute includes conserving, directly comparing, and ordering quantities.
• Level 2 – Informal measurement includes choosing and using appropriate units for measuring quantities, and comparing and ordering quantities by using identical units to cover, fill or pack objects without gaps or overlaps.
• Level 3 – Structure of repeated units includes using one unit or a composite unit to work out how many will be needed altogether when making indirect comparisons, and explaining the relationship between unit size and the number of units required to measure an object. (Curry & Outhred, 2005, p. 267)
It is important to note that these three levels refer to measurement in general and are not specific to length, area, or volume. As part of their findings relating specifically to volume, they stated that "a clear distinction should be made between volume (filling) and volume (packing)" (Curry & Outhred, 2005, p. 272), adding that filling volume should be included much earlier in the curriculum. Curry and Outhred (2005) also made the claim that instruction focusing on measurement of packing volume should be delayed until after students have mastered measurement of area. Utilizing their framework for measurement, Curry and Outhred discovered that Level 1 for volume (packing) appeared to be identical to Level 1 for volume (filling), and thus argued for introducing both concepts at Level 1 as early as Kindergarten. Volume (filling) should then be reinforced with activities at Levels 2 and 3 in Grades 1 and 2. Levels 2 and 3 for volume (packing), however, they saw as quite different, claiming that students needed more practical experience involving packing with multiple units (i.e., Level 2) in Grades 2 and 3 to gain a better understanding of the three-dimensional array structure. Curry and Outhred (2005) followed this line of thought by stating that it might then be advantageous to delay Level 3 until Grade 4. Curry, Mitchelmore, and Outhred (2006), in a follow-up article from the same study, compared five basic principles of measurement, which included the following components:
• The need for repeated units that do not change,
• The appropriateness of a selected unit,
• The need to use the same unit to compare objects,
• The inverse relationship between unit size and the number of units, and
• The structure of the repeated units.
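The fourth principle, the inverse relationship between unit size and the number of units, amounts to simple arithmetic for a fixed quantity; a minimal sketch (names and values are illustrative assumptions):

```python
def units_needed(object_volume, unit_volume):
    """For a fixed object, the unit count varies inversely with
    unit size: halving the unit doubles the count."""
    return object_volume // unit_volume

BOX = 24  # fixed volume of the object being measured

assert units_needed(BOX, 8) == 3   # large units: few are needed
assert units_needed(BOX, 4) == 6   # half-size units: twice as many
assert units_needed(BOX, 2) == 12  # half again: twice as many again
```

Grasping this principle means recognizing that a smaller count does not imply a smaller object when the units differ in size.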
Regarding the first two principles, in both experiments designed to explore them, the tasks focused on length, area, and volume were deemed not parallel; hence, comparison across the three could not be accurately completed. For example, "students often rejected the possibility of using different sized units for area and volume on the basis that it was physically impossible to fit the different sized tiles or blocks together to fill the rectangle or box, not because it would produce an invalid measurement" (Curry et al., 2006, p. 380). Curry, Mitchelmore, and Outhred's (2006) findings demonstrated that by Grade 4 the majority of students seemed to have mastered the fifth principle for length and area; however, only about 50% had done so for volume. Further, they pointed out that on the volume task where students were not given enough units to pack completely, many students could only cover the base of the given box using iteration, while other students said they did not know what to do or that they needed more blocks. Students who were successful, they claimed, seemed to have a mental picture of the unit structure. These students often found the number of units in the base (or the bottom layer) and then multiplied by the number of layers to find the total volume. Curry et al. (2006) pointed to two factors that seemed to make the iteration of volume units more difficult than the iteration of area units: (a) the unit had to be moved around in empty space instead of on a hard surface, and (b) it was not possible to mark successive positions of the unit as it was moved around. Overall, the Curry et al. (2006) study revealed that students developed in their understanding of length, area, and volume as they progressed from Grade 1 to Grade 4. Additionally, understanding of length appeared to be more developed than that of area, which in turn was more developed than that of volume.
The study established the following implications for teaching: "Young students appear to have a much poorer understanding of the need for identical units that leave no gaps than teachers often assume, and they may indeed have no clear concept of what they are measuring" (Curry et al., 2006, p. 383).

Studies Focusing on Learning Trajectories

Although constructivism has provided mathematics educators with useful ways to understand learning and learners, the task of reconstructing mathematics pedagogy on the basis of a constructivist view of learning is a considerable challenge, one that the mathematics education community has only begun to tackle. Although constructivism provides a useful framework for thinking about mathematics learning in classrooms and therefore can contribute in important ways to the effort to reform classroom mathematics teaching, it does not tell us how to teach mathematics; that is, it does not stipulate a particular model. (Simon, 1995, p. 114)

The theories of constructivism provide a solid framework upon which research regarding growth in student understanding can be built. In addition, previous research focusing on volume measurement has demonstrated consistency with a constructivist framework. A logical next step is the development of learning trajectories for volume measurement. Simon (1995) stated, "The consideration of the learning goal, the learning activities, and the thinking and learning, in which students might engage make up the hypothetical learning trajectory" (p. 133). He further posited that these "hypothetical learning trajectories" are an integral component of the Mathematics Learning Cycle, as shown in Figure 1.6.
Figure 1.6. Mathematics Learning Cycle (Simon, 1995)
It is important to note that Simon utilized the adjective hypothetical in his description. He went on to explain, "The path that you travel is your 'trajectory.' The path that is anticipated at any point in time is one's own 'hypothetical trajectory'" (Simon, 1995, p. 137). His choice of the word trajectory is purposeful, meant to refer to a path. As a journey down a particular path begins, a plan for the route to follow is formed; this plan is the hypothetical trajectory. Looking back, it is possible to see the path that was actually followed, and that is one's trajectory. For instance, consider the following:

You have decided to sail around the world, in order to visit places that you have never seen. One does not do this randomly (e.g., go to France, then Hawaii, then England), but neither is there one set itinerary to follow. Rather, you acquire as much knowledge relevant to planning your journey as possible. You then make a plan. You may initially plan the whole trip or only part of it. You set out sailing according to your plan. However, you must constantly adjust because of the conditions that you encounter. You continue to
acquire knowledge about sailing, about the current conditions, and about the areas that you wish to visit. You change your plans with respect to the order of your destinations. You modify the length and nature of your visits as a result of interactions with people along the way. You add destinations that prior to your trip were unknown to you. (Simon, 1995, pp. 136-137)

Over the past decade or so, researchers have developed learning trajectories to help explain children's development in many different areas, not only in mathematics. For the purposes here, three learning trajectories will be presented, each focusing on the development of volume understanding.

Confrey

Confrey et al. (2009) define a learning trajectory as:

A researcher-conjectured, empirically-supported description of the ordered network of experiences a student encounters through instruction (i.e., activities, tasks, tools, forms of interaction and methods of evaluation), in order to move from informal ideas, through successive refinements of representation, articulation, and reflection, towards increasingly complex concepts over time. (p. 3)

Although much of Confrey's research on learning trajectories is focused on equipartitioning, she and her fellow researchers (see TurnOnCCMath.net) have also worked on developing complete learning trajectories built around the Common Core State Standards (CCSSO, 2010; Confrey et al., 2012). As the primary purpose here is volume measurement, attention will be focused there. A summary of the measurement concepts explicated in the Common Core is provided in Figure 1.7 below.
Figure 1.7. Length, Area, and Volume summary from TurnOnCCMath.net (Confrey et al., 2012)

Confrey et al. (2012) described what they see presented in the Common Core as the beginnings of a learning trajectory. In this learning trajectory, students learn to identify the spatial attributes of objects and shapes for measurement. Driven by comparison of length, area, and volume, students master the conceptual principles underlying length measurement, area measurement, and volume measurement. For each of these, a progression in understanding has been outlined, including the following components:
• Define attributes
• Directly compare and represent
• Indirectly compare
• Measure with no gaps or overlaps
• Compensatory principle for unit size and measure
• Additive principle
As with many areas of the Common Core, Confrey et al. identified pieces of the measurement progression that were lacking; that is, they saw gaps in the progression laid out in the Common Core that needed to be filled. Thus, they introduced what they call "bridging standards" to help fill the gaps and complete the learning trajectory for volume. As stated on the TurnOnCCMath website:

Bridging standards have been created to introduce concepts involving length, area, and volume that cover any gaps in the above module and also in earlier grades (Grades K-2) for length, area or volume up to Grade 3, though according to the CCSS-M document, area and volume standards start at Grade 3. Subsequently, students learn to convert units of measurement (Grades 4-5), and then advance to finding the area and volume of geometric shapes and solids (Grades 6-8). (Confrey et al., 2012)

For the purposes of this review, details of the Confrey et al. (2012) learning trajectory for volume are included in Appendix A. In compiling these details, the pieces of the learning trajectory focused on volume measurement were extracted and presented in three tables. Note that the descriptions and descriptors shown in black are those originally presented in the Common Core State Standards (CCSSO, 2010), whereas those in brown are the bridging standards developed by Confrey and her fellow researchers (Confrey et al., 2012). It is also important to note that since the Common Core is mainly focused on content development, the completed learning trajectory for volume presented here also incorporates a content-development focus. Additionally, lacking from the overall learning trajectory is a focus on instruction.

Battista

In his work, Battista (2011) has defined "a learning trajectory as a detailed description of the sequence of thoughts, ways of reasoning, and strategies that a student employs while involved in learning a topic, including specification of how the student deals with all instructional tasks and social interactions during this sequence" (p. 510). Battista further stated,
P a g e | 39 “One critical difference between my definition of learning progressions and my definition of learning trajectories is that trajectories include descriptions of instruction, progressions do not” (Battista, 2011). Interestingly, as Battista continued on to present examples of “learning trajectories,” he outlined a slightly different picture. In particular, Battista presented developing understanding of volume through what he identified as “levels of sophistication,” that are described as the levels a student passes through as he or she moves from an intuitive idea and reasoning to a more formal understanding of a mathematical concept. The levels of sophistication in which Battista theorized make up a developing understanding of volume are illustrated in Appendix B. These levels of sophistication consist of both non-measurement and measurement levels. Non-measurement levels involve reasoning about volume without quantification, while measurement levels involve quantification. A picture of these levels of sophistication, with a taller column representing a higher level of sophistication in reasoning is presented in Figure 1.8. As you can see, Battista described a learning trajectory as the path a student takes as he or she progressed through the levels of sophistication. Different students followed different trajectories involving various combinations of non-measurement and measurement levels. Although Battista incorporated a more child-development focus to build his level of sophistication, his focus has remained more of a learning progression concentration. He has provided the levels of development children go through, however in presenting the idea of a learning trajectory, he has referred to the various paths students can take through the levels of sophistication. As with that presented by Confrey et al. (2012), the focus on instruction is somewhat lacking.
Figure 1.8. Battista's depiction of a learning progression

Sarama & Clements

The work of Sarama and Clements parallels Simon's definition of a learning trajectory more closely than does that of Confrey et al. (2012) or Battista (2011). Like Simon, Sarama and Clements have viewed learning trajectories as an integration of psychological science with the science of teaching (Clements & Sarama, 2004b; Simon, 1995). In particular, they have posited learning trajectories as providing specific instructional interventions that fit with the ways of thinking characteristic of specific levels of strategy or thinking about the domain (Clements & Sarama, 2007a; Sarama & Clements, in press). As such, Sarama and Clements have conceptualized learning trajectories as descriptions of children's thinking and learning in a specific mathematical domain and a related, conjectured route through a set of instructional tasks designed to engender those mental processes or actions hypothesized to move children through a developmental progression of levels of thinking, created with the intent of supporting children's achievement of specific goals in that mathematical domain (Clements & Sarama, 2004a, p. 83).
Thus, Clements and Sarama's (2009) learning trajectories have three components: (a) a goal (that is, an aspect of a mathematical domain children should learn), (b) a developmental progression, or learning path, through which children move through levels of thinking, and (c) instruction that helps them move along that path. As with the learning trajectories of Confrey and Battista, the learning trajectory for volume measurement developed by Sarama and Clements (Sarama & Clements, 2009) is presented in the appendices (see Appendix C).

Hierarchic Interactionalism.

Sarama and Clements have hypothesized that learning trajectories (in particular, children's development of geometric measurement understanding) develop through a theoretical lens termed hierarchic interactionalism (Sarama & Clements, 2009); that is, the influence and interaction of both global and local (domain-specific) cognitive levels and the interactions of innate competencies, internal resources, and experience (including instruction and available tools from the culture). In general, children are expected to progress through levels of understanding for measurement of volume or capacity in ways that can be characterized by specific mental objects and actions (i.e., both concept and process), with the most visible progress occurring through levels for domain-specific topics. In their theory, Clements and Sarama (2009) have postulated that various models and types of thinking grow in tandem to a degree, but a critical mass of ideas from each level must be constructed before the thinking characteristic of the subsequent level becomes ascendant in the child's thinking and behavior (Clements, Battista, & Sarama, 2001b). This has often involved "fall back" to prior levels of thinking under increasingly complex demands, as typified in findings about children's developing sophistication of units and units of units for measuring perimeter (Barrett, Clements, Klanderman, Pennisi, & Polaki, 2006). With experience, the level
of thinking may become robust, and progressions follow a predictable pattern of learning activity: first, there are sensory-concrete levels wherein perceptual, concrete supports are necessary and reasoning proceeds through limited cases; second, verbally-based generalizations follow as the child abstracts ideas beyond the immediate sense; third, there are integrated-concrete understandings that rely on internalized mental representations that serve as mental models for operations or abstractions. They also have viewed concepts and skills as proceeding in tandem (Siegler & Opfer, 2003), with a benefit accruing where concepts are established to give context and foundation to skills. Hierarchic interactionalism is depicted graphically in Figure 1.9. In hierarchic interactionalism, types of knowledge usually develop simultaneously. In this figure, dominance of a particular level of thinking is indicated through darker shading. Thus, Level A is dominant in the earliest years, but Level B thinking is developed and eventually enmeshed with Level A thinking, although inadequately at first (symbolized by the small double arrow at the left). Once Level B thinking is more developed and more strongly connected with Level A, it may become the dominant pattern of thinking. Shading indicates the unconscious probabilities of instantiation. In a related vein, but not illustrated, are the executive processes that also develop over time, serving to integrate these types of reasoning and, importantly, to determine increasingly through development which level of reasoning will be applied to a particular situation or task.
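The "probability of instantiation" idea can be made concrete with a toy computational sketch. This is purely illustrative and is my own construction, not part of Sarama and Clements's theory: the logistic growth curves, the ceilings, and the demand penalty that models "fall back" are all assumptions chosen only to reproduce the qualitative pattern described above.

```python
import math

def level_strength(t, onset, ceiling, rate=1.0):
    """Logistic growth of one level's strength over developmental time t."""
    return ceiling / (1.0 + math.exp(-rate * (t - onset)))

def instantiation_probs(t, onsets, ceilings, demand=0.0):
    """Normalized probabilities that each level is instantiated at time t.

    Higher task demand penalizes later (more sophisticated) levels,
    a crude stand-in for "fall back" to earlier levels of thinking.
    """
    strengths = [
        level_strength(t, o, c) * math.exp(-demand * i)
        for i, (o, c) in enumerate(zip(onsets, ceilings))
    ]
    total = sum(strengths)
    return [s / total for s in strengths]

# Level A develops early but plateaus lower; Level B develops later but higher.
ONSETS, CEILINGS = [1.0, 4.0], [0.8, 1.0]

early = instantiation_probs(2.0, ONSETS, CEILINGS)                  # Level A dominant
later = instantiation_probs(6.0, ONSETS, CEILINGS)                  # Level B now dominant
stressed = instantiation_probs(6.0, ONSETS, CEILINGS, demand=1.0)   # "fall back" to Level A
```

Under these assumed parameters, Level A dominates early, Level B becomes ascendant once a critical mass of strength is built, and a sufficiently demanding task shifts dominance back to Level A, mirroring the fall-back phenomenon described above.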
Figure 1.9. Hierarchic interactionalism (Sarama & Clements, 2009, p. 215)
Concluding Thoughts
The primary purpose of this paper was to present the theory and previous research leading to the development of learning trajectories for volume measurement. Piaget and other constructivists set the stage by helping to transition from a quantitative understanding of development to a qualitative one. Previous research in volume has helped to identify various understandings integral to a complete understanding of volume. Much of that research, however, has focused specifically on spatial structuring. Additionally, correlations between length, area, and volume understanding have been explored in the existing body of research. In that exploration, however, a strong distinction has been made in separating volume (filling) from volume (packing) (Curry et al., 2006; Curry & Outhred, 2005). I agree that these two are conceptually different, but are they so entirely
different that one can be separated from the other? Should commonalities between the two be further explored in order to provide a more cohesive picture of volume understanding? All in all, this previous research has more recently led to the development of learning trajectories for volume. Although there has appeared to be a lack of consensus on what constitutes a complete learning trajectory, there has been general consensus that the mathematical goal and developmental progression must be included in the definition. Some researchers have focused their theories more around the content, while others have focused more on child development. This has seemed to be an ever-present struggle in mathematics education, one that has been apparent throughout the current review of research. I have to agree with Simon, as well as Sarama and Clements: it is not exclusively about the mathematical goal and the developmental progression. In addition to the developmental progression reflecting the development of the child, an important component of any learning trajectory must be instruction designed to move students through the developmental progression. For that reason, I have continued my own research through that lens. In my next paper, I begin with the learning trajectory for volume as developed by Sarama and Clements (see Appendix C). Further, the learning trajectory for volume is supported through the presentation of qualitative and quantitative evidence to verify and refine the existing learning trajectory.

One promise of common state standards is that over time, they will allow research on learning progressions to inform and improve the design of Standards to a much greater extent than is possible today. (CCSSO, 2010, p. 5)
Paper 2: Verifying and Refining a Developmental Progression for a Learning Trajectory for Volume Measurement – Pre-K through Grade 2
Purpose
The purpose of the research presented here is to provide qualitative and quantitative evidence in verification of the volume learning trajectory (LT) developed by Sarama and Clements (2009), as well as to highlight areas in need of revision and refinement. Qualitative evidence comes from a longitudinal teaching experiment following the development of eight children from pre-K through Grade 2. Quantitative evidence comes from assessments given to a larger sample of children in pre-K through Grade 5. These assessment data were submitted to Rasch modeling, and the results are presented here. In conclusion, I present a revised developmental progression for a LT for volume measurement that incorporates results from both the verification and refinement.
Theoretical Framework
I use learning trajectories (LTs) as defined by Sarama and Clements (2009) as the foundation for this research. In addition, since the work is built around the LTs of Sarama and Clements, their theoretical framework of hierarchic interactionalism also becomes the basis of mine. Explanations of both are presented here. Hypothetical LTs (Simon, 1995) serve as the core of multiple research projects, curricula, and professional development projects (Bredekamp, 2004; Clements & Sarama, 2004b; Simon, 1995; Smith, Wiser, Anderson, & Krajcik, 2006). Sarama and Clements (2004a) define LTs as …descriptions of children's thinking and learning in a specific mathematical domain and a related, conjectured route through a set of instructional tasks designed to engender those mental processes or actions hypothesized to move children through a developmental
progression of levels of thinking, created with the intent of supporting children's achievement of specific goals in that mathematical domain. (p. 83)
Thus, LTs have three components:
• a goal (that is, an aspect of a mathematical domain children should learn),
• a developmental progression or learning path wherein children move through levels of thinking, and
• instruction that helps children move along that path.
Teachers can use LTs to facilitate an understanding of (a) children's mathematical thinking and learning, (b) how that learning is supported by curricula and pedagogy, and (c) appropriate mathematics content. By illuminating potential developmental progressions, consideration of LTs effects coherence and consistency within three main components of education: assessment, professional development, and instruction. Sarama and Clements (2009) view LTs through a theoretical lens they term hierarchic interactionalism. In their theory they highlight the influence and interaction of both global and local (domain-specific) cognitive levels, as well as the interactions of innate competencies, internal resources, and experience (including instruction and available tools from the culture). For volume measurement in general, hierarchic interactionalism indicates that children progress through levels of understanding for measurement of volume or capacity in ways that can be characterized by specific mental objects and actions (i.e., both concept and process), with the most visible progress through a specific progression of levels. In their theory, they postulate that various models and types of thinking grow in tandem to a degree, but a critical mass of ideas from each level must be constructed before thinking characteristic of the subsequent level becomes ascendant in the child's thinking and behavior (Clements et al., 2001b). This often involves "fall back" to prior levels of thinking under
increasingly complex demands, as typified in findings about children's developing sophistication of units and units of units for measuring perimeter (Barrett et al., 2006). With experience, a level of thinking becomes robust and progressions follow a predictable pattern of learning activity: first, there are sensory-concrete levels wherein perceptual, concrete supports are necessary and reasoning proceeds through limited cases; second, verbally-based generalizations follow as the child abstracts ideas beyond the immediate sense; and third, there are integrated-concrete understandings that rely on internalized mental representations that serve as mental models for operations or abstractions. Sarama and Clements also view concepts and skills as proceeding in tandem (Siegler & Opfer, 2003), with a benefit accruing where concepts are established to give context and foundation to skills. Behaviors are the observable signs of a complex system of cognitive schemes and networks within the ecology in which a child is operating. Thus, behaviors are indicators of mental actions on objects; we recognize actions as processes and objects as concepts. For example, a mental object such as a concept image of a shape or a ruler might undergo processes such as iterating, geometric transformation, combining, separating, and so on. In summary, actions on objects are mental actions; observable behaviors signify those mental actions.
Qualitative Methods
Participants
Participants for this research were from a small, urban, parochial school in Western New York (78% White, 9% Black, 9% Hispanic, 1% Asian, 3% two or more races). At the beginning of the study, we assessed all children in pre-K through Grade 5 who returned consent forms using a tool designed to assess understanding of measurement (including length, area, volume,
weight, and angle); then, using results from this assessment and in consultation with teachers, we selected a cohort of eight pre-K children as our focus children for a four-year, longitudinal study. As a representative subset for both school sites, we selected two low-, four middle-, and two high-performing students. In our selection, we also made an effort to vary SES, cultural background, gender, and achievement. These eight children were the main focus participants in the individual teaching episodes for whom qualitative data was collected. Another eight pre-K children were selected as background children using the same process; these children were used for pilot testing of tasks, collection of additional data, and group teaching episodes. Two of the focus children moved out of the district after the first year and a third after the second year; each time a child left the cohort, another was selected as a replacement from the background child group, to maintain a cohort of eight focus children for data collection. Three children remained in the study for the four-year duration and I use these as the main focus of the qualitative analysis: Edith, a high-performing female student; Robert, a middle-performing male student; and Lia, a low-performing female student. To provide additional details at points, we turn to two children for whom we have between two and three years of longitudinal data: Zola, a middle-performing female student, and Marina, a low-performing female student. It should be noted that Lia was retained in Kindergarten due to a weak recognition of letters, while the other four children progressed yearly through the grades. In addition to the individual teaching episodes, each year we conducted whole-class observations and classroom teaching experiments involving between 18 and 22 participants. During the fourth year of the study, we ran monthly Measurement Club meetings at the school.
All the second graders were invited to the club meetings and at each meeting between six and nine children were in attendance. Finally, toward the end of the fourth year of the study, we
again assessed all children in pre-K through Grade 5 using an instrument designed to assess understanding of length, area, and volume. In this report, my focus is on the development of the eight focus children's measurement thinking and learning about volume as they progressed from pre-K (February 2008) through Grade 2 (June 2011), with a particular focus on the three children for whom there are four years of longitudinal data. Additional supporting evidence is presented from the assessment instruments.
Data Sources
Teaching episodes. The bulk of the qualitative data used to examine longitudinal growth came from the teaching episodes (TEs) conducted over the four years (Steffe & Thompson, 2000). TEs were conducted with selected individuals, pairs of children, and small groups (sometimes as a follow-up to classroom-based teaching experiments). Some were similar to clinical interviews, but most involved various instructionally-relevant actions. Planning for each TE was informed by all previous observations, with a focus toward clearer understanding of children's mental actions on objects and understanding of volume. To ensure the research was well grounded within classroom instruction and teacher practices, classroom instructional sequences were co-designed and implemented with collaborating teachers (Jacobson & Lehrer, 2000; Lehrer & Chazan, 1998; Lehrer & Nitabach, 1996; Swafford, Jones, & Thornton, 1997). We also observed lessons or sequences of lessons taught by the classroom teacher that included substantial attention to measurement. In addition, during the final year of the study, a monthly "Measurement Club" was implemented as a way to provide children with exposure to and informal experiences with measurement in a group environment. This integration of individual and classroom foci was
intended to address the naturalistic setting of children's experiential learning from multiple perspectives, including viewing learning as both a psycho-cognitive and a socio-cognitive phenomenon (Fischer & Bidell, 2006). Thus, data sources included video records of the initial and final interview assessments, clinical interviews, TEs, and teacher-led instruction on measurement-related science and math lessons; written classroom observations; researchers' journals from assessments and TEs; and classroom teachers' reflective journals. Each data source was reviewed to identify behaviors specific to volume understanding. These behaviors were coded, then analyzed using the LT for volume measurement (Sarama & Clements, 2009) to identify the levels at which a child was demonstrating knowledge or to highlight gaps and/or inconsistencies in the LT for volume measurement. As consistent behaviors were evidenced, revisions of the LT were considered and checked against new data.
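The coding step described above can be sketched computationally. This is a minimal illustration only; the shorthand level codes and the tie-breaking rule are my assumptions for the sketch, not the study's actual coding scheme.

```python
from collections import Counter

# Hypothetical shorthand codes for levels of the volume LT, in developmental order.
LT_ORDER = ["VQR", "CDC", "CIC", "PAC"]

def modal_level(coded_behaviors):
    """Return the level most frequently evidenced among an episode's coded behaviors.

    Ties are broken here in favor of the later (more sophisticated) level; a real
    analysis would instead re-examine the episode qualitatively.
    """
    counts = Counter(coded_behaviors)
    return max(LT_ORDER, key=lambda lvl: (counts[lvl], LT_ORDER.index(lvl)))

# One child's coded behaviors from a single teaching episode (illustrative data).
episode = ["VQR", "CDC", "VQR", "CDC", "CDC"]
dominant = modal_level(episode)  # "CDC"
```

A tally like this only flags the dominant level; the qualitative analysis reported below is what identifies gaps and inconsistencies in the LT itself.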
Data Collection Cycle
Year 1
• Initial Assessment
• 4-Day CTE focusing on length
• 4 TEs – 4 involving length and 1 involving area
Year 2
• 4-Day CTE focusing on area and perimeter
• 13 TEs – 11 involving length, 4 involving area, and 7 involving volume
Year 3
• 5-Day CTE focusing on length
• 14 TEs – 8 involving length, 5 involving area, and 6 involving volume
Year 4
• 9 TEs – 2 involving length, 6 involving area, and 4 involving volume
• 8 Measurement Club meetings – 5 involving length, 5 involving area, and 6 involving volume
• Final Assessment
Qualitative Results
Throughout the four-year longitudinal study, children exhibited actions on objects consistent with the learning trajectory (LT) for volume proposed by Sarama and Clements (2009). However, I also observed other volume/capacity-related behaviors I felt necessitated both revision and expansion of the LT for volume measurement. These behaviors led to reexamination of the LTs for length and area measurement, opening up consideration of possible parallels with the LT for volume measurement. My focus here is to present evidence using the three focus students (Edith, Robert, and Lia) for whom there are four complete years of longitudinal data, while also using other focus students to fill in more detail, as needed.
Consistency with Initial LT for Volume Measurement
Analysis of the initial assessment indicated that pre-K children in the study already identified volume as an attribute, although the nature of the clinical interview method used on the initial assessment did not afford an opportunity to probe children's thinking or verbalization competencies in depth. On the initial assessment, I also observed children using direct comparison strategies to compare capacities of containers and volumes of objects. As they moved through the longitudinal study and their spatial structuring skills increased, children progressed through various levels of volume quantification, including recognition of (a) cube faces, (b) cubes as blocks, (c) 1 x 1 x 1 cubes as units, (d) 1 x 1 x n cores or rows/columns (units of units), and then (e) 1 x n x m layers or slices (units of units of units), as well as the development of additive thinking and a movement toward multiplicative thinking while structuring volume.
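The progression from counting single cubes to composing rows, then layers, and finally a fully multiplicative structure can be illustrated with a short sketch. The function names are mine, chosen for illustration; each computes the same volume of an l x w x h packed array, but the computation mirrors a different structuring level.

```python
def count_by_units(l, w, h):
    """Unit-by-unit structuring: count 1 x 1 x 1 cubes one at a time."""
    count = 0
    for _ in range(l * w * h):
        count += 1
    return count

def count_by_rows(l, w, h):
    """Composite rows (units of units): add one 1 x 1 x l row at a time."""
    total = 0
    for _ in range(w * h):   # one row per position in the cross-section
        total += l
    return total

def count_by_layers(l, w, h):
    """Composite layers (units of units of units): add one l x w layer per unit of height."""
    layer = l * w            # multiplicative structuring within a layer
    return sum(layer for _ in range(h))

def multiplicative(l, w, h):
    """Fully multiplicative structuring: volume = l * w * h."""
    return l * w * h
```

The movement from the first function to the last parallels the shift from additive to multiplicative thinking described above: the same quantity is reached with progressively larger composite units and fewer counting acts.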
Volume Quantity Recognizer
Children at the Volume Quantity Recognizer (VQR) level identify capacity or volume as an attribute, perceiving space and objects within three-dimensional space. Children at this level articulate statements such as, "That box holds a lot of blocks" and often use general terms such as "big" and "small" but are not necessarily able to relate their observations to the dimensions of the box or to quantify a number of blocks. At the beginning of the study, in the Spring of pre-K, both Edith and Robert demonstrated reasoning above the VQR level. Lia and Marina, however, each demonstrated reasoning consistent with the VQR level. For example, on one initial assessment task, children were given a small cube and asked to predict how many would be needed to measure the space inside a juice box (materials for this task are shown in Figure 2.1). Lia looked at the box and said, "You make some room and you put it in there," as she placed the cube inside the box, never indicating a quantity nor seeking any more cubes to fill the box. Marina picked up the box and looked inside, then announced, "2," clearly indicating that she was considering the interior of the box, but not clearly relating the size of the cube with that of the box. I interpreted these behaviors as indicative that children perceived the (intuitively, three-dimensional) space inside the box (i.e., that the box could hold objects) and that the cubes would take up space inside the box, both of which reflected mental actions on objects consistent with the VQR level. I would not say they are higher than VQR because their focus was simply on recognizing the interior of the box; neither Lia nor Marina looked to fill the box completely nor structure the cubes inside the box.
Figure 2.1. Box and cube used in initial assessment task
On the initial assessment, several children also demonstrated VQR-level thinking when they were shown four objects (see Figure 2.2), told to imagine they were blocks of ice, and then asked to determine which block would melt into the largest puddle. Rhonda looked at the blocks, pointed to the 3 x 2 x 1 array then the S-shape, and said, "This is small; this is big." Lia, on the same task, ignored the L-shape, but then pointed to the other three solids moving from left to right saying, "This one's fat, this one is large, and this one is tiny." I interpreted this thinking as indicative of the VQR level because children made gross comparisons of the blocks and identified volume as an attribute, but did not attempt to align them to compare directly, discuss them in terms of any of their dimensions, or otherwise quantify them.
Figure 2.2. Initial assessment "ice block" task
Similarly, I observed children making gross comparisons of capacity. For example, in the Fall of Kindergarten, children were presented with cups of varying sizes and asked which cup they would want if all were filled with lemonade. On this task, Edith, Robert, and Lia each chose the cup with the largest capacity. When asked why she chose the blue cup (see Figure 2.3), Lia
P a g e | 54 put her hand inside the top of the blue cup then continued to the yellow then green then red cups saying, “‘Cause that one is bigger, and this one is tiny, and this one is tiny, and this one is tiny.” Notice that the only differentiation Lia makes is between the largest of the cups and the others, with no reference to dimensions. I interpreted these intuitive comparisons of capacity as further indication of the VQR level, similar to those of the length trajectory, in which children make gross comparisons of length without physically aligning objects nor verbalizing a mental alignment.
Figure 2.3. Lia chooses the largest cup for lemonade
These examples provided evidence that children had an awareness of volume (or capacity) as an attribute at the beginning of the study; however, most children were functioning beyond the VQR level. The data also indicate that children treated tasks focused on capacity differently than those focused on structuring. Although much of the previous research has framed the initial phases of volume understanding around the idea of capacity, even at these initial levels we began to notice the need to expand the LT to include not only capacity but also volume as spatial structuring.
Capacity Direct Comparer
Moving beyond simply recognizing volume as an attribute, at the Capacity Direct Comparer (CDC) level children demonstrate the ability to compare the volumes of two
containers or objects. Children exhibiting thinking at the CDC level are able to compare the capacities of two containers, often through the action of pouring from one container into another. Initially, children may compare the containers using perceptual cues and focusing only on linear extent (e.g., the height of cylinders, as in the seminal work on conservation, Piaget & Inhelder, 1967/1997). Eventually, when pouring from one container into another, children can recognize "overflow" as indicating the container "poured from" contains more than the container "poured into." When asked to compare the volumes of objects and capacities of containers directly, children in the study physically manipulated objects, initially aligning by the longest dimension and referring to linear extent alone when comparing, using terms such as taller, bigger, shorter, and smaller to describe differences, a shift away from the gross comparisons used at the VQR level where no reference to dimension was utilized. For example, as a follow-up to the initial assessment task in which children compared the volumes of the four solids in Figure 2.2, children were specifically asked to compare the volumes of the L-shape and the 3 x 2 x 1 array, each of which was constructed from six cubes. Robert laid the two solids so they were side by side on the table, put his hand on the L-shape, and said, "This one's larger." Similarly, Edith stood the two solids on the table next to each other to compare their heights, and responded, "This one's larger [holding up the L-shape]. This one's smaller [holding up the 3 x 2 x 1 array]." Marina, in the cups task (described above and depicted in Figure 2.4), was initially indecisive about which cup she would choose. She initially chose the yellow cup, and then changed her mind to the blue.
Researcher: Why do you want the blue one?
Marina: When I do like this [pushing the cups together with her hands then laying her head down sideways to look across the tops of the cups], this one is short, medium…
[pointing to the yellow, then the green cups]. This one is small, medium, large… [pointing to the yellow, green, then red cups] …and two of these mediums. (See Figure 2.4)
R: Two of those are mediums?
M: …and this one is large [pointing to the green, red, then blue cups].
R: And why do you want the large one?
M: Because the large is the biggest.
R: So does that mean the blue one would hold the most lemonade?
M: Yes.
Although she struggled to find unique words to describe all the cups, I interpreted her actions as indicative of the CDC level because she physically aligned the cups, then visually compared and attempted to label them by one dimension, height.
Figure 2.4. Marina compares the cups by height, aligning them and looking across the tops
At this same point, some children also began referring to more than one dimension in their capacity comparisons, and often combined vocabulary with a motion or gesture to indicate the compared dimensions. For example, during a TE in the Fall of pre-K, children were presented with a graduated cylinder and a large, fluted glass, then asked which one they would like if they were both filled with lemonade. Lia chose the glass, "because that one is tiny [using her thumb and forefinger to refer to the opening at the top of the graduated cylinder; see Figure
2.5]; this one is that big [referring again to the opening of the glass]. This one is bigger and that one is littler [sweeping her hand up the glass, then down the cylinder to indicate height]." Although she struggled to find the vocabulary to describe what she was seeing, Lia clearly referred to more than one dimension in her comparison, as indicated by her hand gestures. However, at that point, she seemed to neglect the gross comparison of the two containers (i.e., that the glass had a stem for the bottom half that would not hold any liquid) in favor of comparing dimensions. Although Lia referred to two dimensions, I took this as evidence of the CDC level because her focus was on comparing volume using only the linear extent.
Figure 2.5. Lia compares capacity using more than one dimension.
Both Robert and Edith demonstrated thinking at the CDC level in tasks that involved pouring from one container into another during the Fall of Kindergarten. In one TE, for example, children were presented with two graduated cylinders of different diameters and asked which one had the larger volume, with the clarification that volume meant how much each would hold. Water was provided, with which both Edith and Robert filled one cylinder, poured it into the other, and correctly determined the one with larger volume. In a second task, two weeks later, they were presented with two cups and rice as the filling material. Again, both correctly determined the container with the larger volume by filling one container and pouring it into the other.
In the Fall of Grade 2, children were presented with tasks asking them to choose which of two containers they thought had the larger volume without doing the direct comparison first, mainly as verification of a level they had shown previously. Children used different strategies for deciding which was larger and were encouraged to make a choice. Once a child had selected one container as "larger" (whether it was a correct determination or not), the "larger" container was filled with water and children were asked what would happen if that container was poured into the other. Both Edith and Robert thought the "smaller" container would be filled exactly to the top, not overflowing; Lia said that she was unsure of what would happen. Thus, although children could directly compare capacities at younger ages, they did not mentally "run" the procedure (cf. Simon, 2013) to anticipate the result during Grade 2, nor did we find evidence that they anticipated the outcomes of comparisons for capacity before Grade 2. Hierarchic interactionalism posits the continued existence of earlier levels, and the role of intentionality and social influences in their instantiation, which explains why in some contexts even adults fall back to earlier levels; for example, failing to conserve in certain situations. The lack of experience with this problem context may have evoked earlier levels of thinking in this case (see the theory's postulation of the construct of nongenetic levels in Sarama & Clements, 2009). Alternatively, the children may not have reflected the physical actions of comparing capacities into a more abstract representation, perhaps due to a lack of articulation during the physical experiences.
That is, their procedures for comparing capacities remained at the physical-concrete level because the children were not provided with experiences (tasks and interactions) that engendered the reflective abstraction necessary to rebuild the procedures at the integrated-concrete level that would support anticipation.
Evidence supported the claim that children move into direct comparison shortly after recognition of volume as a quantity; thus, the CDC level is placed appropriately within the LT. However, as with the VQR level, I observed children's development not only in capacity, but also in spatial structuring. Thus, I see a need for expansion of the level description to include spatial structuring at this early level.
Capacity Indirect Comparer
An indirect comparison strategy is required in situations where two objects cannot be compared directly (physically or mentally aligned). For capacity, this may involve using a third container to pour into each. From my observations, this thinking seemed to be a special strategy children used on occasion and often only when prompted; most children used direct comparison whenever the option was available. In tasks designed specifically to elicit behaviors indicative of indirect comparison, children in the study nearly always attempted to make visual direct comparisons instead; thus, as we discussed in the length chapter, it was difficult to determine if children were unable to compare indirectly or if they simply chose not to. Consequently, the placement of indirect comparison within the LT for volume measurement currently is undetermined. Nonetheless, I did observe children utilizing indirect comparison strategies during the study. In a TE in the Fall of Kindergarten, children were presented with several tasks asking them to compare the volumes (capacities) of pairs of containers by filling with rice or water. Specifically, the children were given a container filled with rice or water and asked if they could use that container to compare the volumes/capacities of the other two; that is, could they figure out which of the two containers had the largest volume. In some of the tasks, the initially-filled container was smaller than either of the two containers to be compared and, in other tasks, the
initially-filled container had a volume/capacity between the other two. During this TE, both Edith and Robert consistently demonstrated transitive reasoning and were able to correctly compare the volumes indirectly. Lia, however, was not able to utilize the initially-filled container to compare the other two. Additionally, when the researcher took the lead in pouring from the filled container into the others, Lia did not demonstrate the ability to reason transitively about the volumes/capacities of the two containers she was asked to compare. Later, during a TE in the Spring of Kindergarten, Robert was presented a similar task, but the task began with one scoop of water being poured into each container. Robert incorrectly identified the container in which the water reached higher after one scoop as the larger container. The researcher continued to put one scoop at a time into each container, asking Robert each time which he thought was the larger; Robert persisted in saying the container in which the water was higher was larger. After the fifth scoop of water, the container Robert indicated as larger overflowed with water while the other was not yet filled. Robert persisted in his determination that this container was larger than the one that was not yet filled, even though in previous TEs he had consistently determined that the one that overflowed was the smaller container. Similarly, in a TE two weeks later, Lia also demonstrated an inability to compare capacities indirectly. Given two containers, she visually inspected them and mentally determined they were the same size. When given a scoop, she found that one container held five scoops of rice while the other held six. Rather than change her determination that both were the same size, Lia attempted to describe the difference in terms of one container being made of glass and the other of plastic.
Lia, although demonstrating higher levels of volume understanding, never solidly demonstrated the ability to compare capacities indirectly during the four years of the study.
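The transitive logic these tasks were designed to elicit can be sketched as follows; the container names, volumes, and overflow rule here are my own illustration, not the study’s protocol.

```python
# A sketch of the transitive reasoning behind indirect comparison: pour a
# filled reference container into each of two containers and observe whether
# it overflows. All volumes below are invented for illustration.

def pour_overflows(container_volume, reference_volume):
    """True if the reference container's contents overflow the container."""
    return reference_volume > container_volume

def indirect_compare(volume_a, volume_b, reference_volume):
    """Return 'A' or 'B' for the larger container, or None when overflow
    alone is uninformative (the reference overflows both or neither)."""
    over_a = pour_overflows(volume_a, reference_volume)
    over_b = pour_overflows(volume_b, reference_volume)
    if over_a and not over_b:
        return "B"  # B holds the reference but A does not: B > reference > A
    if over_b and not over_a:
        return "A"
    return None

# Reference between the two volumes: transitivity settles the comparison.
print(indirect_compare(6, 10, 8))  # 'B'

# Reference smaller than both: neither overflows, so overflow alone
# decides nothing -- the child must compare leftover space instead.
print(indirect_compare(6, 10, 5))  # None
```

Note that when the reference container is smaller than both targets, a single observation is not enough; this is consistent with those tasks demanding more of the children than the between-sized-reference tasks.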
Overall, Edith was the only focus child who demonstrated thinking at the Capacity Indirect Comparer (CIC) level throughout the study, and Lia was the only child who did not solidly demonstrate thinking at the CIC level in any task. The other focal children wavered back and forth, as Robert did: in some tasks demonstrating transitive reasoning and the ability to indirectly compare capacities, in other tasks using faulty reasoning, and in others simply focusing on comparing capacities directly through visual inspection. Thus, further research is required to accurately determine the placement of indirect comparison of volume/capacity within the LT for volume measurement. It may well develop in parallel to the other levels, depending on specific experiences.
Primitive 3-D Array Counter
Although the previous Capacity Direct Comparer and Capacity Indirect Comparer levels describe children mainly producing qualitative comparisons, children at the Primitive 3-D Array Counter (PAC) level begin to quantify the volume and capacity of 3-D objects. At the PAC level, children show a partial understanding of cubes as filling space. They are able to visualize that a three-dimensional space can be filled with cubes and, with strong guidance and perceptual support, can direct the filling and recognize when the filling is complete. When presented with an object structured from cubes and asked to determine its volume, children at the PAC level may initially only count the faces, possibly double-counting at the vertices and/or edges; additionally, they do not account for hidden or internal cubes. With guidance and support, they eventually count one cube at a time in carefully structured contexts, such as packing a small box with cubes. On the follow-up to the initial assessment, Zola demonstrated an initial transition between direct comparison and quantification strategies when asked to compare the volumes of the L-shape and the 3 x 2 x 1 solid from Figure 2.2.
She identified the L-shape as having a larger
volume and then tried to count, but when her count did not match her visual comparison, Zola struggled to resolve the dissonance it created.
Researcher: [Placing two sets of blocks on the table in front of Zola – one constructed of six cubes in an L-shape; the other, six cubes in a 3 x 2 x 1 rectangular prism] Let’s look at these two [pointing alternately at each set of cubes]. Please compare these two sets of blocks for volume. Do they have the same volume or is one larger?
Zola: This one [holding up L-shape] is larger than this one [holding up 3 x 2 x 1 prism].
R: How do you know that?
Z: Because this one [holding up L-shape], it has more blocks than this one [holding up 3 x 2 x 1 prism].
R: This one [pointing at L-shape] has more blocks than this [pointing at 3 x 2 x 1 prism]?
Z: Yes.
R: All right. Well, how do you know it has more blocks? Show me.
Z: Because…
R: How many blocks does it have?
Z: [Pointing at and counting the cubes in the L-shape] Five. (Note: There are six cubes in the shape.)
R: Okay.
Z: …but this one [holding up 3 x 2 x 1 prism] has more blocks than this one, but this is more taller [standing the L-shape up on the table] because… This one is short [standing the 3 x 2 x 1 prism on end], and this one is tall [standing the L-shape on end].
R: This one is short, and this one is tall, okay, but how many blocks does this one have? [pointing at L-shape]
Z: [Pointing and counting the cubes again incorrectly] Five.
R: Five. And how many blocks does this one have? [indicating the 3 x 2 x 1 prism]
Z: [Counts the six faces on one side then flips it over and counts the six faces on the other side] Twelve. This one has more blocks [holding up the 3 x 2 x 1 prism], but this one is the tallest [standing the two side by side and placing a hand on top of each].
I interpreted this to indicate that Zola had begun to recognize cubes as filling space (“Because this one, it has more blocks than this one” when asked how she knew it was a larger volume) and attempted to use the number of cubes as a means to compare the volumes. However, because she double-counted cubes and counted incorrectly, she was unable to resolve the disparity between her comparison by number and her comparison by height. Ultimately, she fell back to direct comparison by linear extent, a behavior consistent with the theory of hierarchic interactionalism, which recognizes that children move into increasingly sophisticated levels of thinking without abandoning prior levels; rather, children may fall back to prior levels of thinking when it is either convenient or necessary to cope with a complex task. Robert demonstrated an initial foray into the PAC level on the initial assessment in pre-K during the first year of the study. On the initial assessment item depicted in Figure 2.1, Robert looked inside the small box and predicted it would take three cubes to fill it. Given more than enough cubes, he placed them one at a time into the box, counting as he did so, and counted four cubes as he completely filled the bottom layer. He did not, however, continue to place a second layer of cubes on top of this one to fill the box completely. Later, in the same initial assessment, Robert was presented with a 2 x 2 x 3 rectangular prism, constructed of wooden cubes, and asked how many cubes were in the prism. He pointed to and counted the individual squares in only the top face and said there were six cubes in the entire prism. Thus, I saw evidence that Robert recognized the cubes as filling space, but lacked a complete understanding of the space as consisting of a specific quantity of cubes. He counted faces, rather than cubes; this is thinking indicative of PAC.
Lia, on the same initial assessment task, started pointing at squares on the top face and, double-counting some, counted “1, 2, 3, 4, 5, 6, 7, 8.” She then turned the solid 90 degrees so
she was seeing another face and restarted counting on that face. She counted “1, 2, 3, 4, 5, 6” on that face and then, rotating another 90 degrees, counted “1, 2, 3, 4, 5, 6” on the next face. With each turn, she counted the total number of squares on that face, double-counting the cubes at each edge. I saw this as evidence of PAC-level thinking because she demonstrated a partial understanding of cubes as filling space; she made the choice to count when asked to compare the volumes, but had difficulty maintaining an accurate count, focusing on counting squares on faces, which resulted in the double-counting of cubes. During pre-K and Kindergarten, children transitioned from counting squares on the faces of a 3-D solid constructed of cubes to eventually counting one cube at a time in carefully structured and guided contexts. When children counted squares on faces, it often caused them either to miss cubes that were not on the top layer or were hidden in other layers, or to double-count cubes (as Zola did, described previously). As another example, in the Spring of pre-K, children were asked again to organize the shapes in Figure 2.6 from the smallest number of blocks to the largest, a task similar to the one on the initial assessment. In the two shapes that had only a single layer of cubes (the S-shape and the L-shape), Marina counted accurately; whereas, in the shapes that had more than one layer of cubes, she miscounted, pointing to and counting only the top of each cube. It appeared that she only counted the faces that were in her line of sight.
Figure 2.6. “Sorting solids by volume” task from Spring of pre-K
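The face-counting errors described above can be made concrete with a small sketch (my own illustration, not a task from the study): counting unit squares on the faces of a cube-built prism counts edge and corner cubes repeatedly and, for larger prisms, misses cubes hidden in the interior.

```python
# Why counting squares on faces over- and under-counts the cubes in an
# l x w x h rectangular prism built from unit cubes.

def true_volume(l, w, h):
    """Number of unit cubes in an l x w x h rectangular prism."""
    return l * w * h

def face_square_count(l, w, h):
    """Total unit squares on all six faces -- what a child at the PAC
    level may count instead of cubes."""
    return 2 * (l * w + w * h + l * h)

def hidden_cubes(l, w, h):
    """Cubes fully enclosed inside the prism, invisible from any face."""
    return max(l - 2, 0) * max(w - 2, 0) * max(h - 2, 0)

# The 2 x 2 x 3 prism from the initial assessment:
print(true_volume(2, 2, 3))        # 12 cubes
print(face_square_count(2, 2, 3))  # 32 face squares (edge cubes recounted)
print(hidden_cubes(2, 2, 3))       # 0 -- every cube touches a face

# A larger 3 x 3 x 4 prism:
print(true_volume(3, 3, 4))        # 36
print(hidden_cubes(3, 3, 4))       # 2 internal cubes face-counting misses
```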
Evidence indicated that children progressed into the PAC level after solidly demonstrating the ability to directly compare volumes. Additionally, thinking at this level was evident prior to that of capacity relating and repeating for all children; thus, I feel the PAC level is accurately placed within the LT for volume measurement.
Capacity Relater and Repeater
As children transition from the PAC level to the Capacity Relater and Repeater (CRR) level, they become more aware of the interrelationships between volume, number of cubes, and space. Although at this level children may still make mistakes with volume, unlike children at the PAC level, they understand the need for equal-sized units when determining a volume and keep track of those units. At the CRR level, children use simple units to fill containers with accurate counting, measuring volume by repeated use of the unit. At this level, children can iterate a single unit to measure the volume of an object and recognize, at least intuitively, that identical units should be used, but may not initially appreciate the need for identical units in every situation. Additionally, with teaching, a child at this level is able to relate the size and number of units, understanding that increasing the size of the unit will result in fewer units being needed. Based on what I had already observed regarding the parallel development of capacity understanding and spatial structuring, I decided not to limit my focus on unit relating and repeating actions simply to units of capacity, even though the LT level focused only on capacity. Instead, we included tasks in the TEs that focused not only on capacity, but also on spatial structuring, with the expectation that these would need to be integrated in the revised LT for volume measurement.
Edith showed an initial foray into the CRR level in pre-K on the initial assessment task depicted in Figure 2.1. She took the cube, held it inside the top of the box, and iterated it along the opening, counting “1, 2, 3, 4.” Although I would not claim that she was dominantly thinking at the CRR level, because she did not recognize that there would be a second layer, she clearly was recognizing a unit, maintaining that unit mentally, and iterating to fill the space. Edith more solidly demonstrated CRR reasoning in a TE during Spring of Grade 1. In the TE she was tasked with finding the volume of a 3 x 3 x 4 clear, plastic box and given one 1 x 1 x 1 wooden cube. She began by iterating the cube in rows along the bottom of the container, then iterated it up and down each of the sides, and finally iterated in the interior (see Figure 2.7). Although she miscounted slightly and arrived at a total of 34, she made a clear attempt to maintain equal units as she iterated throughout the container, consistent with thinking at the CRR level.
Figure 2.7. Edith iterating a cube to find the volume of a container
Also in Spring of Grade 1, Edith was presented with a situation in which two children had measured the volume of the same box, but each used a different unit and so arrived at a different count; one used a large cube while the other used a smaller cube. Edith quickly said that it was not okay that the two children arrived at different answers. She explained that we needed to
know the correct volume, so each child must use the same size cube. Further, she clarified that the child who used the larger cube would get a smaller number than the child who used the smaller cube. I saw this as evidence of reasoning at the CRR level. Other children also began to demonstrate thinking at the CRR level toward the end of Grade 1. Robert and Lia, in the Spring of Grade 1, were given a small, 1 x 1 x 1 cube scoop and asked to estimate the number of scoops it would take to completely fill a 2 x 2 x 3 container. Although both children estimated inaccurately, each thought it would take four scoops. When asked to fill the container, both Robert and Lia filled the scoop completely through all iterations. Both children accurately counted as they poured each scoop into the larger container to arrive at a total of 12. Additionally, when shown a larger scoop, both children articulated that it would take fewer of the larger scoops to fill the container. The children’s use of simple units to fill a container with accurate counting, and their recognition that fewer of the larger units than of the smaller were needed to fill the given container, provided evidence characteristic of thinking at the CRR level. In this sample, however, children did not appear to remain at the CRR level for very long, as the tasks presented here occurred during the last interview in the spring of the third year of the study. Robert, Lia, and Edith all exhibited dominant thinking at the CRR level, but were demonstrating thinking at the Partial 3-D Structurer level by the beginning of the fourth year. Likewise, during the first interview in Fall of Grade 2, a majority of children were already exhibiting thinking at the Partial 3-D Structurer level.
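The inverse relation between unit size and count that Robert and Lia articulated can be sketched as simple arithmetic; the 2 x 2 x 3 container comes from the TE, while the larger scoop size is an illustrative assumption.

```python
# Sketch of the CRR-level unit relation: for a fixed container, a larger
# scoop means fewer scoops. The doubled scoop size is hypothetical.

def scoops_to_fill(container_volume, scoop_volume):
    """Whole scoops needed when the scoop divides the container evenly."""
    return container_volume // scoop_volume

container = 2 * 2 * 3                # the 2 x 2 x 3 container: 12 cubic units
print(scoops_to_fill(container, 1))  # 12 of the 1 x 1 x 1 scoops
print(scoops_to_fill(container, 2))  # 6 scoops with a scoop twice the size
```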
Partial 3-D Structurer
While children at the CRR level are able to determine the volume of an object by repeatedly filling with a single unit, at the Partial 3-D Structurer (PS) level children begin to develop the ability to visualize and keep track of cubes, which they may not have directly
handled. By the time children reach the PS level, they demonstrate an understanding of cubes as filling space, but do not yet use layers or multiplicative thinking. Their counting strategies show a recognition of internal cubes and move from somewhat unsystematic counting to using the row and column structure and skip counting to get a total. They are able to build, maintain, and manipulate mental images of a unit of units (i.e., a row or column) and apply the composite unit repeatedly, with additive thinking. Unlike at the CRR level, where the focus is on the individual unit, thinking at the PS level is characterized by the ability to create a mental image of a unit of units and operate on that mental unit. Lia did not begin to move into the PS level until the final year of the study and, even then, was unable to recognize internal cubes without strong guidance from the researcher. Robert and Edith, however, showed evidence of thinking at the PS level before Grade 1. Robert made an initial venture into the PS level in the Spring of Kindergarten. When asked what the volume of a 2 x 2 x 3 solid constructed from cubes was, he played with the solid for several seconds, turning it around on the table several times before picking it up and attempting to count around the bottom layer, turning the solid in his hands as he reached the corners and trying to keep track of which cubes he had already counted. He restarted his counting a couple of times, double-counted at corners, and arrived at a total of nine. Then, unprompted, he laid the solid on the table, counted the six cubes in the top layer, flipped the solid over, and counted the six cubes in the bottom layer from left to right in two rows, arriving at a correct total of 12. Although he began to show more advanced structuring as well as recognition of cubes as filled space, because this task did not provide any “internal” cubes, I could not claim that Robert was dominantly at the PS level.
In fact, it was not until a TE during the Fall of Grade 1 that
Robert demonstrated behaviors convincingly indicating the PS level by accounting for internal cubes. Robert was asked to find the volume of a 3 x 3 x 4 rectangular prism that had been constructed using multilink cubes. He immediately recognized that there were 9 cubes on the top of the solid. He then followed up by saying “there are nine in each.” Although not asked to calculate a total for 9 + 9 + 9 + 9, his thinking was clearly additive and he was accounting for internal cubes, even though he could not see them. I see his thinking as indicative of the PS level. Edith, in the Fall of Kindergarten, showed initial movement into the PS level when she was asked to find the volume of a 2 x 2 x 3 rectangular prism made of small wooden cubes. In contrast to Robert, who picked up the prism, Edith left it sitting on the table with a 2 x 3 face facing her. She counted cubes in columns, starting in the front, left corner, moving around the back of the prism without moving it, and then across the front, correctly determining the volume as 12. Although this is a 3-D figure without interior cubes, Edith indicated recognition of “hidden” cubes because she was counting cubes on the bottom layer of the back face that she could not see. Additionally, she used the column as a structure to guide her counting. I interpreted these strategies as evidence of thinking at the PS level. Edith continued to demonstrate thinking at the PS level into Grade 2. In the same task as described for Robert above, Edith was given the 4 x 3 x 3 rectangular prism made of multilink cubes. She held the prism so she was looking at a 4 x 3 face with the 3 x 3 face on the top and began counting on the front face, counting four in each column. She then turned and counted the remaining columns on the other faces, making sure not to double-count at the corners.
When she arrived at the middle column (the one with hidden cubes), she pointed to the middle cube on the top face, then pointed and counted down the side. When the researcher asked for clarification of what she did, Edith indicated that she knew that squares at the corners on different faces of the
prism belonged to the same cube, so she only counted them once. She went on to say that, for the middle column, she used the number of cubes in the column on the side that she could see to guide her counting of the “hidden” cubes. When asked how many cubes were in the center column, Edith stated there were four. Again, the use of columns as a structure and recognition of internal cubes were interpreted as indicative of thinking at the PS level.
More Sophisticated Levels than Partial 3-D Structurer
No children in the study demonstrated consistent thinking at levels more sophisticated than PS, so I move to further explanation of the qualitative data already presented and leave discussion of higher levels of the LT for volume measurement to other research.
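Before turning to the growth charts, the contrast between CRR- and PS-level counting strategies described above can be summarized in a short sketch (my own illustration, not study materials), using the 3 x 3 x 4 prism from the TEs.

```python
# CRR-like strategy: iterate a single unit cube, one at a time.
# PS-like strategy: treat a column as a composite unit and add it
# repeatedly (4 + 4 + ... for each of the nine columns), as Edith did --
# additive, not yet multiplicative, thinking.
length, width, height = 3, 3, 4   # the 3 x 3 x 4 prism

count_by_ones = 0
for _ in range(length * width * height):
    count_by_ones += 1            # one unit at a time

column = height                   # composite unit: 4 cubes per column
count_by_columns = sum(column for _ in range(length * width))

print(count_by_ones, count_by_columns)  # 36 36
```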
Longitudinal Growth Charts
To help provide a summary of the above qualitative data, I present individual, longitudinal growth charts for three of our focus children (Robert, Edith, and Lia) for whom I have four years of data. The development and formatting of these charts are attributed to Dr. Craig Cullen. In each chart, vertical lines represent the levels of thinking demonstrated during each volume TE over the four-year period. Additionally, a thicker section of each line represents what I identified as the dominant level of thinking demonstrated during that TE. Along with each child’s chart, I provide a short summary and interpretation of what the growth chart presents.
Robert’s Growth
Figure 2.8 represents the volume understanding demonstrated by Robert as he moved from pre-K through Grade 2. In pre-K, Robert demonstrated thinking predominantly at the CDC level, although there was evidence that his thinking was reaching into levels as high
as PAC. As he moved into Kindergarten, his dominant level of thinking began at the CDC level, but progressed up to the PAC level, indicative of development in his spatial structuring ability. As Robert entered Grade 1, his spatial structuring ability had already developed to the point that his dominant level of thinking was at the PS level, and he remained at this level for the remainder of the study (through the end of Grade 2).
Figure 2.8. Robert’s longitudinal growth chart
Edith’s Growth
Edith’s dominant level of volume thinking during pre-K was representative of thinking at the PAC level (see Figure 2.9), demonstrating that her spatial structuring ability was already developing. During Kindergarten, this continued to develop as her dominant level of thinking moved to the PS level by the end of the school year. Although it seems, in the figure, that her ability dipped for most of Kindergarten, this is more representative of the fact that a majority of tasks during that year focused on capacity rather than spatial structuring; thus, the levels of the LT for volume measurement focusing on spatial structuring were not addressed. Edith’s thinking remained dominantly at the PS level throughout Grades 1 and 2, although toward the end of Grade 2 she was beginning to reach into the VRCS level.
Figure 2.9. Edith’s longitudinal growth chart
Lia’s Growth
Lia, like Edith, demonstrated PAC as her dominant level of thinking in pre-K (see Figure 2.10). Unlike Edith, however, Lia maintained this level of thinking throughout the second year and into the first half of the third year of the study before her thinking progressed to the CRR level. During the final year of the study, she demonstrated inconsistent thinking, sometimes dominantly at the PAC level, while at other times at CRR or even PS.
Figure 2.10. Lia’s longitudinal growth chart
Discussion and Summary
Figure 2.11 presents a composite growth chart for the three focus children summarized above. Through examination of this chart, as well as the individual charts above, I made several observations regarding children’s thinking about and understanding of volume, as well as the LT for volume measurement itself. First, there is overall growth in the dominant level of thinking as children moved from pre-K through Grade 2. This is evidence that the LT is consistent overall with behaviors demonstrated by the children in our study. Thus, we have qualitative evidence through longitudinal data supporting the developmental progression in the initial LT for volume measurement.
Figure 2.11. Composite growth chart for three children
Second, the Kindergarten year seems to be a year when volume understanding was developing, as indicated by the fact that the dominant levels of thinking demonstrated by these children during that year were the most varied and widely distributed. This may be evidence that children’s spatial structuring ability is developing during this time, but it may also highlight the fact that the initial LT for volume measurement is inconsistent over this developmental period. That is, the initial level of the LT focuses on understanding volume as a quantity, then the LT moves to focus on capacity for the next two levels before shifting to spatial structuring. Third, both Robert and Edith showed a shift to PS as a dominant level of thinking during the third year of the study, while in Grade 1. Lia, on the other hand, did not show this same shift. Unlike the other focus children, Lia was asked to repeat Kindergarten because the school felt she did not have a solid grasp of her alphabet. Although this would not have affected her development, it did affect her curriculum, as she was not exposed to the Grade 1 curriculum until the final year of the study. Lia also demonstrated the most inconsistency in her dominant level of thinking during the final year of the study. Further examination of the videotapes of her TEs revealed that, overall, her spatial structuring ability was fairly stable and well-developed for less complex tasks; however, she often struggled to maintain focus as the tasks became more complex.
By the end of the study, all children demonstrated an understanding of and fluency with units of units; that is, they could build and maintain mental images of rows or columns and use additive reasoning to find the volume of an object. Only Edith had begun to visualize layers, but even she resorted to rows/columns and additive thinking when asked to determine volume. Thus, the dominant level of thinking most representative of children in our study at the end of Grade 2 was PS.
Quantitative Method – Rasch Modeling
I turn now to a larger sample of children assessed at the end of the four-year longitudinal study to provide quantitative data supporting what has already been shared qualitatively.
Participants
To provide triangulation on the qualitative findings, and to ascertain if those findings could be generalized to other children, we administered an assessment instrument we created to samples of children (n = 256) representing pre-K to Grade 5 during the Spring of 2011. The cohort of children followed in the longitudinal teaching experiment was included in this sample.
Measurement Instrument
The instrument analyzed here was developed to align with the levels of the measurement LTs of Sarama and Clements, specifically those of length, area, and volume. Tasks included the measurement items from a previously-developed and validated interview-based assessment (for validation, see Clements, Sarama, & Liu, 2008; Clements, Sarama, & Wolfe, 2011), as well as tasks from previous empirical studies (Clements & Sarama, 2007b; Szilágyi, Sarama, & Clements, 2010). Modifications of these tasks, along with newly created tasks, were designed to elicit specific strategies and responses at particular levels of the developmental progressions.
Data Collection
Two items were utilized to assess each level within each of these LTs (for a total of 52 items; see Appendix G for volume items) with the understanding that a correct response on an item demonstrated that a child was at least at that level. Additionally, for the lower levels of each LT, items were designed for presentation through interview to limit the confounding effect of reading ability for younger children. Children in pre-K, Kindergarten, and Grade 1 were assessed entirely through interview. Children in Grades 2 and 3 were assessed through a combination of interview and written items. Children in Grades 4 and 5 were assessed entirely through written items. All interviews were videotaped and conducted one-on-one with an assessor and child. All written items were presented to children in groups, with all work and written responses collected.
Data Scoring
Each item was coded for correctness using the following rubric: 0 = “Incorrect”, 1 = “Correct with prompt”, 2 = “Fully correct”. Any item coded as “1” was considered “partially correct” for the purpose of the Rasch analysis. (Note: Prompts were given only according to assessment protocol.) An additional code of “999” was used to indicate any item for which a child’s response could not be attributed to his/her understanding (e.g., assessor error affected the response of the child, the child’s response was unintelligible/unreadable, or he/she did not provide a response for the item). These items were recorded as missing data for the purpose of analysis.
Data Analysis
The analysis of the measurement instrument designed for this study was performed using Rasch modeling with Winsteps software version 3.75.1 (Linacre, 2011). Because the focus here is on volume measurement, a separate analysis using only those items focusing on volume was also conducted, resulting in 14 volume items (see Appendix G) being submitted to Rasch
modeling. To analyze adherence to the Rasch model, the following data are presented: a fit statistics table and bubble chart, item response patterns/category structure, ICCs and discrimination indices, the Wright map including the agreement between predicted and actual item order, dimensionality, and reliability (SEM, separation, alpha). Because this research was an initial cycle in the evaluation of the designed measurement instrument, the primary interest was in the item estimates. Therefore, the current analysis focuses on item characteristics and behaviors. After confirming fit to the Rasch model, item difficulty estimates will be used to analyze children’s understanding. Presented here are the results for the entire instrument, as well as those specific to volume measurement.
Rasch Modeling of Overall Assessment
Table F1 shows that the raw variance explained by the overall assessment measure was 51.3%. Additionally, the unexplained variance in the first contrast was only 2.3%, with an eigenvalue of 2.4. While the raw variance explained by the measure implied unidimensionality of the construct, an eigenvalue above 2.0 is potentially problematic and may indicate that another dimension is being measured in the first contrast (Bond & Fox, 2007). However, one explanation for the higher eigenvalue was that the overall instrument included items from each level of all three geometric measurement LTs – length, area, and volume. Findings in the independent Rasch modelings of length, area, and volume provided solid evidence of unidimensionality; that is, each individual LT strand measured a single latent trait (Bond & Fox, 2007). The MNSQ infit of items (see Table F2) ranged from 0.65 to 1.39, all values falling within the 0.5 to 1.5 range of productive measurement (Linacre, 2011). The person separation index (1.93) and person reliability (0.79) implied that the item difficulty hierarchy was suitable
for the sample and that the instrument was able to distinguish between children of low and high ability (see Table F3). Although it would be advantageous to improve person separation, the initial findings, overall, indicated that the instrument had satisfactory unidimensionality, as well as good fit for persons and items to the model (Linacre, 2011). Probability curves displayed symmetric s-shaped curves crossing at a midpoint of 0.5 (see Figure F1). This indicated that the probability of a child obtaining a correct response was the complement of the probability of earning an incorrect response (Bond & Fox, 2007). The probability curves established that if the difficulty of an item was above the predicted ability of the child, the child had a lower likelihood of responding correctly to that item. Conversely, if the difficulty of an item was lower than the predicted child ability, the child had a greater likelihood of correctly answering the task. Ascending categories demonstrated acceptable scale use and fit (see Table F1), which further confirmed the validity of the LTs (Linacre, 2011). From the item-person map (Figure F2), the distributions of persons and items appeared to span a similar range. Therefore, it can be inferred that the instrument was targeted well to the population. However, the above-zero mean of person abilities indicated that the instrument was slightly easy for the sample used (Bond & Fox, 2007). This was understandable given that the items were designed to align with our LTs identifying growth in measurement understanding through an approximate age of 9, but the instrument was administered to children through Grade 5.
Rasch Modeling of Volume Items
Looking specifically at the items assessing volume measurement, a total of 46.5% of the raw variance was explained by the measures, with only 3.5% of the unexplained variance in the first contrast and an eigenvalue of 1.6 (see Table G1).
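The s-shaped probability curves and the 0.5 crossing point described above follow directly from the Rasch model itself. As a hedged illustration, here is the dichotomous form in logits (the actual analysis used Winsteps with partial-credit scoring, so this is a simplification):

```python
import math

def rasch_probability(theta, b):
    """Probability of a correct response under the dichotomous Rasch model,
    given person ability theta and item difficulty b (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Ability equal to difficulty gives exactly 0.5 -- the crossing midpoint.
print(rasch_probability(1.0, 1.0))  # 0.5

# An item harder than the child's ability: probability below 0.5.
print(round(rasch_probability(0.0, 2.0), 2))  # 0.12

# An item easier than the child's ability: probability above 0.5.
print(round(rasch_probability(2.0, 0.0), 2))  # 0.88
```

Note also the complement relation discussed above: the probabilities of a correct and an incorrect response at any ability level sum to 1.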
The low first-contrast eigenvalue provided evidence of unidimensionality of the volume construct. In considering a second potential construct (see Table
G2), one item (CIC-2) showed a loading greater than 0.4 and two items (VRCS-1 and PAC-4) showed loadings less than -0.4. Examination of these items showed that each was not significantly different from the other item designed to measure the same level of understanding; that is, CIC-2 was not significantly different from CIC-1, VRCS-1 was not significantly different from VRCS-2, and PAC-4 was not significantly different from PAC-2. Further examination of these items revealed that the variation appeared to be due to variation in assessor. Table G3 indicated an overall MNSQ fit of 0.93 and an infit MNSQ range of 0.58 to 1.10; both demonstrated appropriate fit (Linacre, 2011). One item, PAC-2, was found to be overfitting, with an MNSQ value of 0.58. The infit MNSQ range was 0.77 to 1.10 after excluding this item, thus demonstrating a strong fit to the model. Person reliability was reported at 0.72 (see Table G4), which exceeded the desired value of 0.7 (Bond & Fox, 2007), with a separation index of 1.60. Item reliability was reported at 0.99 (see Table G4) with a separation index of 9.92. This finding indicated a superior fit of item difficulty to the sample used. In addition, Figure G1 indicated good use of scale, as evidenced by symmetric s-shaped curves crossing at a midpoint of 0.5 and increasing categorical order. This also confirmed that item difficulty appeared to be well matched to child ability overall (Bond & Fox, 2007; Linacre, 2011). Overall fit of the item-person map (Figure G2) demonstrated a balanced distribution of children’s abilities and item difficulties, and reflected that volume items in the final assessment were well targeted, overall, to the children in the sample. Further, the spread of items and children was found to be consistent with the developmental progression hypothesized in the LT for volume measurement, with the exception of the very top.
More specifically, child ability appeared to be higher than AR-1, as nearly 36 children displayed abilities above the highest level
included in the LT for volume measurement. This provided further evidence that the final assessment was slightly easy for the sample of children in pre-K through Grade 5. Again, this is understandable, as the LT for volume measurement is targeted at children through age eight (Sarama & Clements, 2009). Additionally, there is a somewhat larger gap between the PAC items and PS-1, indicating that further items are needed between these item difficulties. Discussion and Summary As explained here, the initial Rasch modeling provided solid evidence that the instrument was well developed and that the construct satisfies the assumption of unidimensionality. Additionally, infit statistics are within acceptable ranges, indicating that the instrument is well aligned with the Rasch model. From the dimensionality analysis in the results, I identified three items as having potential correlation with a secondary construct: CIC-2, VRCS-1, and PAC-4. Examination of these items showed that each was not significantly different from the other item designed to measure the same level of understanding. Similarly, the overfitting item, when compared with the pair of underfitting items, did not highlight separate constructs, just different levels of the volume construct. My overall purpose here was to verify the developmental progression of the LT for volume measurement using quantitative data. Thus, I looked at each pair of items designed to assess a given level within the LT for volume measurement. Intervals were constructed for each pair by marking the difficulty level of each. For example, the Partial 3-D Structurer level was assessed by items PS-1 and PS-2. The predicted difficulty of PS-1 was 2.94 and that of PS-2 was 3.74. Thus, the interval for this level is from 2.94 to 3.74.
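The interval construction described above can be sketched as follows. The PS item difficulties (2.94 and 3.74) are taken from the text; the PAC values are hypothetical placeholders for illustration only.

```python
# Level intervals from paired item difficulties.
# PS-1/PS-2 values come from the text; PAC values are invented for illustration.
difficulties = {
    "PS-1": 2.94, "PS-2": 3.74,    # Partial 3-D Structurer (from the dissertation)
    "PAC-1": 1.10, "PAC-2": 1.60,  # hypothetical
}

def level_interval(level_prefix, diffs):
    """Return (min, max) difficulty over all items for a given level prefix."""
    vals = [d for item, d in diffs.items() if item.startswith(level_prefix)]
    return (min(vals), max(vals))

print(level_interval("PS", difficulties))   # (2.94, 3.74)
```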
Figure 2.12 depicts all items arranged in increasing difficulty order (from bottom to top) with corresponding level intervals marked to represent the range of difficulty for each pair of items designed to assess the same level of understanding. Each interval, moving from left to right in the figure, corresponds to a successively higher level of the LT for volume measurement (see Appendix E for the original LT); that is, in the developmental progression, Capacity Direct Comparer (CDC) is followed by Capacity Indirect Comparer (CIC), which is, in turn, followed by Primitive 3-D Array Counter (PAC), and so on. In general, estimates indicate a progression in difficulty of the items consistent with our hypothesized developmental progression. That is, an increase in item difficulty corresponds to a move to higher levels in the LT for volume measurement.
Figure 2.12. Item difficulty aligned with LT for volume measurement levels
Two areas of concern are also evident and highlight the need for refinement of the LT for volume measurement. First, the interval for the Capacity Relater and Repeater (CRR) level is nearly parallel with that for the CIC level and is below that for the PAC level, which was hypothesized to be a lower level in the LT for volume measurement. Second, there is a large gap between the interval for PAC and that for PS. Although the original LT for volume measurement addressed the development of unit relation and iteration, the focus of the level at which unit relating and iterating solidify (that is, Capacity Relater and Repeater) in the original LT is solely on capacity. As evidenced by our qualitative data, the development of unit understanding also encompasses spatial structuring, and this needs to be recognized within a revised LT for volume measurement. Implications A major strength of this project is the integration of qualitative and quantitative research designs. The aggregate data afforded by the Rasch model validated the descriptive empirical results from the teaching experiment. Findings may represent a significant and generalizable contribution to researchers and educators alike. Additionally, finding positive confirmation of the LT for volume measurement at two unrelated sites gives confidence that it may be generalizable beyond a single specific group of children. Several important implications should be noted for future LT and early math development research. Rasch modeling results of the overall instrument, as well as those from the individual volume items, demonstrated a strong relationship between expected item fit and observed item fit in the Rasch model. Consequently, this further supported the unidimensional nature of the latent trait measured in the LT for volume measurement and the cohesiveness of the overall instrument (Linacre, 2011).
Despite the range in item difficulty being slightly below that of children’s ability, the overall spread of items and persons in the item-person maps confirmed
the progression of development and thinking found in the LT for volume measurement (Bond & Fox, 2007). Additionally, the gap evident where the hypothesized CRR level should have been needs to be addressed through the development of items assessing unit relation and iteration with respect to spatial structuring. As a result, the findings may be used to further refine current LT theories and improve instructional methods to bridge gaps in early volume competencies (Clements et al., 2008; Sarama & Clements, 2009). Additionally, it will be useful to pilot the potentially problematic items after revising them or creating new tasks that better reflect the respective levels in the LT for volume measurement. In turn, this may shed light on whether the misfitting items were indeed mis-targeted to children in Grades 4 and 5 or whether other confounding factors were at play in the study. Overall, adequate item fit, dimensionality, and additivity of item position in the Rasch model demonstrated construct validation of the LT for volume measurement. The appropriate scale use, supported by increasing categories and symmetric s-shaped probability curves, provided further evidence of construct validity. Additionally, the strong fit statistics confirmed the legitimacy of the LT for volume measurement and the underlying volume construct. In the future, it will be helpful to have the LT level assessment tasks reviewed by other content experts, ranked, and correlated with each other to strengthen the validity of the data (Bond & Fox, 2007).
Hypothesizing a New Developmental Progression for Volume Measurement Extensions in Terms of Schemes Related to Volume As children begin to develop an understanding of volume they initially demonstrate four separate schemes related to volume – filling, packing, building, and comparing (Curry & Outhred, 2005; Sarama & Clements, 2009). Eventually children integrate these separate schemes
into a superordinate scheme for volume with subschemes for filling, packing, building, and comparing. The first scheme, volume filling, is related to capacity, the awareness that objects can be filled. Children initially understand that an amount of material (e.g., sand or water) can be poured to fill a container and that different containers hold different amounts. The unit structure for the volume-filling scheme initially may be psychologically one-dimensional for most children (i.e., simple iterative counting that is not processed as three-dimensional) – for example, in filling a cylindrical jar in which the (linear) height corresponds with the volume. The second and third schemes, volume packing and volume building, are related to discrete units and an awareness that objects take up space. The volume-packing scheme relates to filling containers with discrete units; the volume-building scheme relates to objects constructed of discrete units. Unlike volume filling, volume packing and volume building connect to the construct of spatial structuring (Battista & Clements, 1998): 3-D units must be defined, coordinated, and integrated in three dimensions (or layers, each of which consists of rows and columns). As a simple illustration of the developmental progression involving spatial structuring, children initially count faces of cubes in a two-dimensional understanding of volume (at some level of awareness, we posit they believe they are counting "blocks"), then move to counting whole cubes, eventually recognizing that some objects have "hidden" cubes that also must be counted. A fourth scheme, volume comparing, relates to coordinating the ideas of the volume-filling, volume-packing, and volume-building schemes to compare the volumes of different containers or objects. Children may make visual and/or direct comparisons – first in one, then in two and three dimensions.
In volume filling, children develop the ability to compare volume by pouring (sand, water, etc.) from one container into another and noting differences. In volume packing and
volume building, children compare volume by counting cubes, eventually recognizing the space taken up by the cubes as volume. The participants in the study exhibited each of the volume filling, packing, building, and comparing schemes separately. For example, I observed that some children recognize that objects can be filled (capacity recognition), while not necessarily recognizing that objects take up space (volume recognition). Further, number sense must also be connected to (and eventually integrated with) these volume schemes; during early TEs, children in the study did not appear to connect counts of discrete objects with volume (the space occupied by an object). For example, when comparing objects using different-sized units, children often showed no dissonance about the size of the unit when determining the volume and simply used the number of units counted to compare the volumes of the objects; they were confident that the container with more cubes was bigger, regardless of the size of the cubes. Further analysis of children's behaviors revealed that the schemes for filling and packing did not develop distinctly, but rather concurrently. The idea of comparison, as well, did not fit into a distinct level, as had been previously hypothesized. I saw that as children's volume understanding became more sophisticated, their comparisons of volume became more sophisticated, moving from simple direct comparison to comparison of units needed to fill the amount of space within each container. Thus, I recognized a single developmental progression, but at each level saw schemes related to volume filling, packing, building, and comparing demonstrated by children. Therefore, I propose extension and revision of the early levels of the initial LT to incorporate subtrajectories for volume filling, packing, building, and comparing as different strands within a single, coherent LT for volume measurement.
Connections Across the Trajectories In looking to solidify the LT for volume measurement, I also noticed inconsistencies between the developmental progressions established for the LTs for length and area measurement when compared with that for volume. For example, in the LT for length measurement, children demonstrated end-to-end length measurement consistently, but although a similar behavior occurred with volume (i.e., children packed containers with cubes completely), this was not represented in the initial LT for volume measurement. Thus, I sought to identify consistent behaviors and mental actions on objects across all three trajectories to ensure all were represented in the revised LT for volume measurement. Indirect Comparison Another important note is that children demonstrated indirect comparison of volumes only a few times (and not consistently) in the TEs. From observations, this seems to be a special strategy children use on occasion; however, most children use direct comparison, whether physical or mental, whenever the option is available. An indirect comparison strategy is required in situations where two objects cannot be compared directly (physically or mentally aligned). In tasks designed specifically to elicit behaviors indicative of indirect comparison, children in the study nearly always attempted to make mental direct comparisons instead. Thus, it is difficult to determine if children were unable to compare indirectly or if they simply chose not to; consequently, the placement of indirect comparison within the LT currently is undetermined. In the next sections, I will examine the initial LT for volume measurement level by level incorporating the qualitative and quantitative data already presented, as well as additional analyses, to present a refined and revised LT for volume measurement. My focus at each level will be on both confirmation and modification.
Revisiting Volume Quantity Recognizer Evidence supported the claim that children do begin their progression toward a complete understanding of volume by recognizing volume as a quantity. Their mental actions on objects were characterized as perceiving space and knowing that objects occupy space. Children in the study demonstrated thinking at and beyond this level on the initial assessment in pre-K. Even at this initial stage of volume recognition, I see the subtrajectories of filling, packing, building, and comparing beginning to distinguish themselves. In both filling and packing, children make statements such as "This glass holds a lot of water" or "This box holds a lot of toys." In describing or building objects, they use simple vocabulary to describe overall size, such as "big" or "small," and, when given an object constructed of cubes, children often count only what they see on one face of the object. When comparisons of volume are made, they are gross comparisons, or children focus on linear extent and identify an object that is larger in one dimension (often height) as having the larger volume. Revisiting Capacity Direct Comparer and Capacity Indirect Comparer As previously stated, comparison of volume does not happen at one distinct level. Instead, as children develop a more sophisticated understanding of volume, the way they compare volume from one object to another also becomes more sophisticated. Children may begin by making gross comparisons of volume, then move to comparing directly in only one dimension, and eventually compare by inter-relating quantity involving three dimensions. Additionally, comparison of capacity is only one type of volume comparison, so including individual levels focusing specifically on capacity was not consistent with my analysis of children's behaviors. The initial LT posited that the one-dimensional nature of filling implied the
existence of an early-developing, separate level; evidence from this study did not support the existence of separate, prerequisite levels. Instead of individual levels, I have incorporated the beginnings of comparison, as well as the movement toward more sophisticated comparison, into each of the levels of the revised LT for volume measurement. Additionally, I have included the comparison of volume using spatial structuring, as well as the comparison of capacity. Thus, Capacity Direct Comparer and Capacity Indirect Comparer are no longer independent levels within the revised LT for volume measurement, but rather are incorporated into other levels (see Table 2.1).

Table 2.1
A Comparison of the Levels from the Initial and the Revised Developmental Progressions for Volume Measurement

Initial Developmental Progression    Revised Developmental Progression
Volume Quantity Recognizer           Volume Quantity Recognizer
Capacity Direct Comparer             Volume Filler
Capacity Indirect Comparer           (incorporated into Volume Filler)
Primitive 3-D Array Counter          Volume Quantifier
Capacity Relater and Repeater        Volume Unit Relater and Repeater
Partial 3-D Structurer               Initial Composite 3-D Structurer

Adding Volume Filler (replacing Direct and Indirect Comparer in the Initial LT) As a further modification, I observed behaviors in children representative of mental actions on objects more sophisticated than those in the Volume Quantity Recognizer level and less sophisticated than those in the Primitive 3-D Array Counter level (this gap in the initial LT was occupied by the Capacity Direct Comparer and Capacity Indirect Comparer levels). Thus, a modification in the revised LT for volume measurement is the placement of the Volume Filler level between the Volume Quantity Recognizer level and the Primitive 3-D Array Counter level.
At this level, children can directly compare both capacities and volumes using multiple dimensions, but may not explicitly recognize and integrate all three dimensions of an object or container. This
recognition of more than one dimension is also evident in their counting to determine the volume of a 3-D object constructed of cubes. At this level, children recognize and count on more than one face, but still do not count exhaustively on all faces. The Volume Filler level meets another goal for designing learning trajectories: that the leveled account should emphasize common aspects of development within related content topics (as is true here for three related goals: measures of length, area, and volume). The Volume Filler level has been shown to parallel children's actions of repeating small portions of a measured object (unit pieces) until the object to be measured is completely exhausted or filled (with units). The typical actions observed at the Volume Filler level are parallel to children's actions described as the End-to-End Length Measurer for the domain of length and the Side-to-Side Area Measurer in area. Children thinking at this level can be expected to visualize and to recognize that 3-D space can be filled with objects; further, children at this level typically exhibit an ability to build, maintain, and manipulate a mental image of an amount of material that is space filling. Thus, the Volume Filler level incorporates the idea that children can fill a container and recognize the container as being filled. Revisiting Primitive 3-D Array Counter (Volume Quantifier in revised LT) The Primitive 3-D Array Counter level is the level at which spatial structuring begins to be evident and children develop an explicit, but partial, understanding of how cubes can be arranged to fill space. As described previously, I found confirmation of this level of thinking occurring after Volume Filler, as children exhibited this level as a dominant level toward the end of Kindergarten and into Grade 1.
Children counted the faces of a cube building, often double-counting at the edges and vertices, but did not demonstrate recognition of hidden or internal cubes in their counting. With guidance and instruction, however, the children were able to count
cubes one at a time in structured contexts (i.e., if they were given prisms to measure having grids marked upon the faces as indicators of cube arrangements within, they counted the cubes correctly). As with previous levels, however, packing and building constituted different contexts for young children, and each context is addressed within this level. When packing, children were able to pack a box completely and neatly with cubes, but were only able to determine a total if counting occurred while they were packing. If asked to determine the volume after the packing was complete, children failed to account for cubes they could not see. When building, children at this level of thinking moved to account for all faces of a 3-D object, where previously they had only shown partial recognition of multiple faces. They still, however, did not account for hidden or internal cubes in their counting. In looking at how children compared volumes at this level, I continued to see extensive use of direct comparison with a tendency toward the recognition of three dimensions. Beyond the direct comparison of 3-D structures constructed of cubes, comparisons made by children focused on the number of cubes in each structure, with inconsistent recognition of the size of the cubes themselves. That is, children compared volume by how many cubes they counted in each structure and did not account for the fact that a larger cube unit would result in a larger volume, given the same number of cubes in a structure. In capacity tasks, many of which involved cylindrical containers, children continued to refer to the height as one dimension, but moved beyond simply referring to the width as an additional dimension. Some children recognized the circumference by placing their fingers into a circular shape, while others appeared to refer to the cross-sectional area by placing a hand inside the top of the container and saying, "It's bigger here."
Overall, I found solid evidence to confirm this level. I did, however, feel that the name Volume Quantifier was more appropriate to describe this level of thinking, as the focus for most children seemed to be on quantity as volume, per se, and children were not making an explicit connection between number and space. Revisiting Capacity Relater and Repeater (Volume Unit Relater and Repeater in revised LT) Children in this study exhibited thinking at the Capacity Relater and Repeater (CRR) level beginning in Grade and helped confirm that children at this level filled a container by repeatedly filling with a unit and counting how many units would be needed to fill the container. Also, children recognized that fewer larger units than smaller ones would be needed to fill a given container. Additionally, I saw that children related the number of units needed to fill a container to the height of the container; for example, when determining how many glasses of water it would take to fill a container, some of the children iterated a height interval for the number of glasses filled after one glass had been poured in. I suggest an extension of the action scheme of this level to include relating the number of units with the height of the container to be filled with that unit. Generally, when posing CRR-level tasks, my data were coded "CRR level" or "not CRR level". Due to the uniqueness of this level (divergent from the rest of the levels), it could be observed over a longer time span because it addressed a strong focus on the filling scheme, whereas the majority of the remaining levels addressed the building or packing schemes, schemes involving discrete units. Examination of my data suggested the need for expansion of the level to include these other spatial structuring schemes related to volume. For example, children were presented with two rectangular prisms, one constructed using wooden, 1-inch
cubes (2 x 4 x 2) and the other of yellow, 1-centimeter cubes (2 x 3 x 4), along with the following situation: "Another student, when asked to compare the volume [of these two blocks], said that the yellow block (2 x 3 x 4) has a greater (bigger) volume because it is made up of more cubes. Do you agree with this student?" In this case, the prism made of centimeter cubes was obviously smaller in volume (Figure 2.13). The results indicated that children who attended to the unit size in relation to the expected overall measure were able to explain the role of the size of the unit in their comparison of volumes as an inverse relationship. Based on this and similar evidence, I have extended the level to include all four schemes related to volume measurement (filling, packing, building, and comparing).
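The arithmetic behind this comparison task can be made explicit. The prism dimensions come from the text; the conversion (1 inch = 2.54 cm) is standard, and the variable names are illustrative only. The sketch shows why counting cubes alone misleads: the centimeter-cube prism contains more cubes yet has the smaller volume.

```python
# Comparing the two prisms from the task: cube counts alone mislead,
# because unit size must be factored in (1 inch = 2.54 cm).
inch_prism_cubes = 2 * 4 * 2           # 16 one-inch cubes
cm_prism_cubes = 2 * 3 * 4             # 24 one-centimeter cubes

inch_cube_cm3 = 2.54 ** 3              # volume of one inch cube, in cm^3
vol_inch_prism = inch_prism_cubes * inch_cube_cm3
vol_cm_prism = cm_prism_cubes * 1.0    # each cm cube is 1 cm^3

print(cm_prism_cubes > inch_prism_cubes)   # True: more cubes...
print(vol_cm_prism < vol_inch_prism)       # True: ...but smaller volume
```

The inverse relationship the children articulated is visible here: the smaller the unit, the more units are needed for the same volume.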
Figure 2.13. Volume comparison with different sized units. Revisiting Partial 3-D Structurer (Initial Composite 3-D Structurer in revised LT) As children’s spatial structuring abilities continued to develop, I saw their thinking move to align with that of the Partial 3-D Structurer level. The children demonstrated thinking at this level toward the end of the study in Grade 2. At this level, children began to recognize hidden or internal cubes and their cube counting became more accurate in reporting volume. At first, they merely attempted to account for the hidden cubes when counting; next they came to recognize the number of cubes in a row or a column (what I refer to as a “core” or 1 x 1 x n composite unit) and used additive reasoning or “skip counting” to determine the total. Thus, children moved from
recognizing individual units to building, maintaining, and manipulating mental images of composite units. My evidence confirmed that this recognition was evident in both packing and building tasks, so the decision was made to rename the level Initial Composite 3-D Structurer as a more accurate description of this shift in thinking and its placement within the LT. Similarly, children continued to develop in their thinking regarding volume filling, or capacity, as well as in their ability to compare volumes. Over time, children moved beyond characterizing volume by reporting only a number and came to understand volume as space. For example, in filling and packing contexts, they began to focus both on the amount filled and on the space remaining to be filled, developing a sense of when a container was half full. Additionally, they began making connections between filling volume and packing volume, understanding that the cubes actually take up space, as liquid or fluid material does. As an example, during Grade 2, children were presented with tasks involving measuring containers marked in units equivalent to the blocks with which they were packing (i.e., if packing with inch cubes, the measuring container was marked in cubic inches). After the equivalency was established, children were asked to fill one container with rice and then use the measuring container (marked in cubic inches) to determine its volume by pouring the rice from the filled container into the measuring container. Additionally, children were asked to measure the volume of another container by packing it with inch cubes. In these tasks, children demonstrated an ability to relate the number of cubes with the amount of space filled. Higher Levels Because this study was limited to children in pre-K through Grade 2, higher levels of the LT for volume measurement were examined with an older cohort participating in companion studies at Illinois State University.
Results from their analyses will include an exploration of how children
relate the linear dimensions of an object (length, width, and height) to the volume/capacity of an object. These results will be utilized to further verify and refine the higher levels of the LT for volume measurement. Final Thoughts and Significance A major strength of this project is the integration of qualitative and quantitative research designs. Overall, this study served to verify the developmental progression for the LT for volume measurement proposed by Sarama and Clements (2009) and depicted in Appendix C. Additionally, areas were highlighted in which the developmental progression needed revision based on a longitudinal sample of students. The aggregate data afforded by the Rasch model validated the descriptive empirical results from the teaching experiment. Thus, a revised developmental progression is proposed (see Appendix E) utilizing the results. Findings may represent a significant and generalizable contribution to researchers and educators alike.
Paper 3: Evaluation of a Revised Developmental Progression of a Learning Trajectory for Volume Measurement – Kindergarten through Grade 2 The first paper of this series provided a summary of previous and current research on volume measurement, along with that on learning trajectories, to establish a rationale that further research in the development of learning trajectories for volume measurement was warranted and, indeed, necessary. The second paper presented a learning trajectory for volume measurement as developed by Sarama and Clements (2009) and, using qualitative and quantitative evidence from a longitudinal study, hypothesized a revised developmental progression for volume measurement incorporating four subtrajectories – filling, packing, building, and comparing. The current paper focuses on evaluation and verification of that revised developmental progression utilizing Rasch modeling of assessment data with a larger sample of children. Analysis with Rasch models is a systematic process in which items are purposefully constructed according to a theory and empirically tested through Rasch models to produce a set of items that define a linear construct or scale; in this case, volume measurement. Unlike in Classical Test Theory, the task in developing a measurement instrument through Rasch modeling is to create a set of items that produce data consistent with the theory. Developing measurement instruments using Rasch modeling, then, is not a matter of rejecting a hypothesis; rather, the focus is on constructing items that result in data in agreement with the hypothesis (Liu, 2010). The purpose of this research is to answer the following research questions: Research Question 1. Are the hypothesized developmental progressions for filling, packing, building, and comparing volume in Kindergarten through Grade 2 valid for a larger sample of children?
Research Question 2. Is the hypothesized developmental progression for volume in Kindergarten through Grade 2, incorporating the subtrajectories for filling, packing, building, and comparing, valid when considered as a single, unidimensional developmental progression, or should there be more than one developmental progression for volume measurement? Introduction The idea that current educational practices result in children missing early volume concepts has long been established in the field of early math research. In the late 1950s, Piaget and Inhelder asserted that traditional geometry instruction began too late and thus introduced the concept of measurement right away (Piaget, 1952; van Hiele, 1959/2004, 1999; Wirszup, 1976). In doing so, instruction has traditionally omitted important qualitative phases, which allow children experience with transforming spatial operations into logical ones. Traditional education has continued to follow the path of Euclidean (or axiomatic) geometry and required students to reason at a formal deductive level that is often beyond their abilities (CCSSO, 2010; van Hiele, 1999; Wirszup, 1976). As such, instruction has tended to follow a mode of quantitative understanding with the hope that qualitative understanding develops as a result. Some researchers, however, have found evidence to support that children's understanding actually flows in the opposite direction (Piaget, 1952; Piaget et al., 1960; Wirszup, 1976). Although the Common Core State Standards (CCSSO, 2010) have introduced the idea of volume in Grade 2, in which building, drawing, and analyzing three-dimensional figures is targeted toward helping students develop a foundation for understanding volume (p. 17), it is not until Grade 5 that volume is introduced as an instructional focus and an actual description or definition is presented (p. 33). Paralleling what Piaget and the van Hieles have claimed, this
focus begins with recognizing volume as an attribute, then transitions to counting the number of cubes, then very quickly turns to calculation of volume utilizing linear dimensions. It is not a far leap, then, to understand how many secondary students lack a deeper understanding of volume and may see volume only as a formula or simply a number. Fortunately, recent research in both children's thinking and mathematics instruction has indicated that young children are capable of learning and understanding spatial measurement (i.e., volume) when given appropriate support (Battista, 2012; Clements & Sarama, 2007a; Lehrer, 2003; Stephan, Bowers, Cobb, & Gravemeijer, 2003). In addition, other research has identified key conceptual challenges that current U.S. curricula have overlooked or taken as intuitively obvious (Battista et al., 1998; Kamii & Kysh, 2006; Lehrer, Jenkins, et al., 1998). Taken together, this work has provided solid evidence that traditional classroom approaches to measurement are inadequate, highlighting the fact that such approaches must evolve. Much work remains, however, in developing effective instructional sequences to more effectively support student learning, as well as in developing assessments that yield more fine-grained and educationally useful results. Theoretical Framework Learning trajectories (Simon, 1995) have served as the core of multiple research projects, curricula, and professional development projects (Battista, 2012; Bredekamp, 2004; Clements & Sarama, 2009, 2004b; Confrey & Maloney, 2010; Mojica & Confrey, 2009; Simon, 1995; Smith et al., 2006). At a minimum, three learning trajectories for volume measurement have been developed to help explain children's development of volume understanding (Battista, 2012; Clements & Sarama, 2004a; Confrey et al., 2012; Sarama & Clements, 2009). Of those, Clements and Sarama (2004a) have defined learning trajectories as developmental progressions
that include descriptions of children's thinking and learning, as well as a related, conjectured route through a set of instructional tasks. Thus, learning trajectories include the following three components:
• A goal (that is, an aspect of a mathematical domain children should learn),
• A developmental progression, or learning path, wherein children move through levels of thinking, and
• Instruction to help children move along that path.
Additionally, Clements and Sarama have viewed learning trajectories through a theoretical lens they term hierarchic interactionalism (Sarama & Clements, 2009). Rather than postulating that thinking proceeds through stages, such as Piagetian stages, hierarchic interactionalism instead hypothesizes levels of thinking; that is, periods of qualitatively distinct cognition within a specific domain. Within hierarchic interactionalism, they have postulated the construct of nongenetic levels (Clements, Battista, & Sarama, 2001a). Similar to the van Hiele theory, progress is determined more by social influences, specifically instruction, than by age-linked development (cf. van Hiele, 1986, 1999). Consistent with other theories, hierarchic interactionalism posits that each level is built hierarchically upon the concepts and processes of the previous levels. Although each higher nongenetic level is built on the knowledge developed and solidified in lower levels, its nongenetic nature does not preclude earlier levels of thinking in certain contexts, especially during more demanding or stressful situations. Likewise, developmental progressions within a domain may repeat themselves in new contexts (e.g., Siegler & Booth, 2004). In particular, although levels of thinking are coherent and often characterized by increased sophistication, complexity, abstraction, power, and generality, the learning process is more often incremental and gradual, as opposed to intermittent and tumultuous. Growth is evident when a critical mass of ideas is
constructed, resulting in thinking characteristic of the subsequent level becoming principal in the child's thinking and behavior (Clements et al., 2001a). However, under conditions of increased task complexity, stress, or failure, an earlier level may serve as a more comfortable and consistent fallback position (Hershkowitz & Dreyfus, 1991; Siegler & Alibali, 2005). The continued existence of earlier levels explains why, in some contexts, even adults fall back to earlier levels of thinking. With experience, the level of thinking becomes robust, and progressions follow a predictable pattern of learning activity. Examining the learning trajectories for geometric measurement (see Clements & Sarama, 2009; Clements, Sarama, Barrett, Van Dine, & McDonel, 2011; Sarama & Clements, 2009; Sarama, Clements, Barrett, Van Dine, & McDonel, 2011), the progression for geometric measurement might be summarized as the following:
• Recognition of attribute
• Filling
• Quantification
• Unit Relating and Repeating
• Measuring
In volume measurement, volume is first recognized by children as an attribute of an object. From that recognition, the idea of filling is developed; for example, a container can be filled with sand or water, or packed with objects. As this understanding progresses, children move to count and quantify as they fill objects. This quantification, however, may not encompass the recognition that there should be no gaps between packed objects, nor the understanding that equal-sized units should be used. The recognition of unit is the next level of understanding; this is typified in the ability to iterate a unit through a given volume, as well as the explicit relation of
size and number of units – that a larger unit will take fewer repetitions to fill. Finally, the idea of measurement is developed, at which point a child is thought to possess an "internal" measurement tool (e.g., a child is able to visualize a quart or liter and can act on this visualization in measurement tasks). Additionally, children at this level explicitly understand the connection between the number of units and the volume of the container or object.
Method
Assessment Items
In the current study, items for the instrument analyzed were compiled primarily from previous research on children's understanding of volume. First, two items were modified from the work of Curry and Outhred (2005), who utilized activities designed to assess both volume (filling) and volume (packing). Second, modifications of several items from the work of Battista and Clements (Battista, 1999, 2012; Battista & Clements, 1996, 1998), who examined the structuring of three-dimensional arrays, were included in the measure. In his Cognition-Based Assessment, Battista (2012) aligned items with his levels of sophistication for volume measurement. Finally, initial and final assessment items from a prior study (the focus of Paper 2) were included in the measure. In particular, each of these items was designed to target and identify specific levels of the initial learning trajectory for volume measurement (Sarama & Clements, 2009). Subsequently, items were pooled into a single instrument of 48 items, in which each item was matched to a level of thinking and scheme (filling, packing, building, or comparing) within the revised learning trajectory as presented in Paper 2, with the purpose of presenting items to children through one-on-one interviews (see Appendix H for items). For example, item VQ-C2B presented children with two buildings constructed of inch cubes – one a 3 x 2 x 2 building, and
the other a 1 x 6 x 2 building (see Figure 3.1). Children were asked, "Which of the two buildings has more room inside, or do they have the same amount of room?" For the instrument analyzed here, this item was assigned to the Volume Quantifier level under the building scheme because correctly answering the item involved children quantifying that each building was built from 12 cubes, and recognizing that both will have the same volume because they are built from the same number of cubes.
Figure 3.1. Item VQ-C2B
The Rasch Model
Item Response Theory (IRT) is a psychometric method in which researchers create an interval scale of scores for both the difficulty of items and the ability of the persons assessed by the instrument. These scores are reported in units called logits, which are typically placed on a vertical scale and can be visualized in what is known as a Wright Map (Wright & Linacre, 1994). The vertical scale is utilized similarly to a yardstick measuring length in inches. That is, just as two inches are twice as long as one inch, two logits are twice as big as one logit. This differs from scores such as percentile ranks, where it cannot be ascertained whether a person at the 50th percentile has twice as much ability as a person at the 25th percentile.
A simple and efficient IRT technique is called the Rasch model (Bond & Fox, 2007). The Rasch model can be utilized to provide evidence of both the validity and the reliability of an assessment. In turn, researchers have the ability to mathematically estimate both the probability that a person will get an item correct (person ability) as well as the probability that an item will be answered correctly by a person (item difficulty). The probability of a person answering a question correctly then depends solely on the difference between that person's ability and the item's difficulty, and can be expressed as

P_n(x = 1) = e^(B_n − D_i) / (1 + e^(B_n − D_i)),

where B_n is the person's ability and D_i is the item difficulty. This model is more often expressed in log-odds form as

ln(P / (1 − P)) = B_n − D_i.

When the probabilities are very different from
what actually has occurred, the results can be inferred as evidence that the data do not fit the expectations of the mathematical model (Bond & Fox, 2007). Rasch results are interpreted through the use of two fit statistics – infit and outfit. Fit statistic indices estimate the extent to which the observed responses adhered to the Rasch-modeled expectations. Infit is weighted to give more value to on-target observations; that is, when the person ability and item difficulty are closely aligned. Outfit is un-weighted and, therefore, is influenced by off-target observations, such as outliers. The mean square (MNSQ) statistic is a transformation of the difference between the predicted and the observed scores (the residuals) that indicates the degree of fit of an item or a person. The expected value is 1, with values between 0.5 and 1.5 regarded as productive for measurement (Wright & Linacre, 1994). The Z-statistic (ZSTD) is a standardized fit statistic with a mean of 0 and a variance of 1. For ZSTD, the range of acceptable values for a 95% confidence interval is between -2 and 2 (Bond & Fox, 2007; Linacre, 1994; Liu, 2010). For these analyses, I used the WINSTEPS 3.75 software to compute these statistics (Linacre, 2011).
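As an illustration of the model just described, the dichotomous Rasch probability can be sketched in a few lines of Python. This is a minimal sketch for exposition only; the function name is my own and is not part of WINSTEPS.

```python
import math

def rasch_probability(ability, difficulty):
    """P(correct) = e^(B - D) / (1 + e^(B - D)), with person ability B and
    item difficulty D both expressed in logits."""
    return math.exp(ability - difficulty) / (1 + math.exp(ability - difficulty))

# When ability equals difficulty, the probability of success is exactly 0.5.
print(round(rasch_probability(0.0, 0.0), 2))  # 0.5
# A person one logit above an item's difficulty succeeds about 73% of the time.
print(round(rasch_probability(1.0, 0.0), 2))  # 0.73
```

Note that only the difference B − D enters the model, which is what allows persons and items to be placed on the same logit scale of the Wright Map.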
Sample Size
One of the key questions regarding analyses using the Rasch model is that of sample size. Although no consensus among researchers has been established, the required minimal sample size is essentially an issue of the standard error of measures (Liu, 2010); in the Rasch model, these are the person and item parameter estimates. Wright (1977) has demonstrated that for a typical test with a raw score between 20% and 80% correct (a 5-logit difference range), the minimal sample size can be calculated as

N = 6 / SE²,

where SE is the standard error of Rasch measures. As a result, if the SE is to be smaller than 0.35 – considered adequate for pilot testing – then the required minimal sample size is 50. Further, for the SE to be smaller than 0.25, the required minimal sample size is approximately 96 participants. It is important to note that this is the level considered acceptable for low-stakes testing situations. For higher-stakes testing, the SE should be smaller than 0.15; the required minimal sample size for this would be 267.
Participants
Participants came from five elementary schools in a large, suburban district in the Rocky Mountain region. From each of these schools, any child in pre-K through Grade 3 who returned a consent form was considered for inclusion in the study. As the focus was on children in Kindergarten through Grade 2, a majority of the sample (22 Kindergarteners, 24 first graders, and 24 second graders) interviewed were children from these grades. Additionally, 5 children in pre-K and 7 children in Grade 3 were interviewed in an attempt to ensure a wide range of abilities was included in the sample.
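Wright's relation between sample size and standard error can be sketched as follows; this is a minimal illustration under the N = 6/SE² rule stated above, and the function names are my own.

```python
import math

def min_sample_size(target_se):
    """Minimal N so the standard error of Rasch measures stays below target_se,
    per Wright's (1977) relation N = 6 / SE^2."""
    return math.ceil(6 / target_se ** 2)

def standard_error(n):
    """Standard error of Rasch measures implied by a sample of size n."""
    return math.sqrt(6 / n)

print(min_sample_size(0.25))         # 96, adequate for low-stakes testing
print(min_sample_size(0.15))         # 267, needed for higher-stakes testing
print(round(standard_error(82), 2))  # 0.27, the SE for the present sample of 82
```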
The overall sample size interviewed and included in analyses was 82 children, providing a standard error of 0.27. This is slightly higher than what is considered acceptable for low-stakes testing, but satisfactory for the present research purposes.
Data Collection
As previously stated, a total of 48 items were included on the instrument. To minimize dependency between items due to item order, three individual forms of the instrument were created by randomizing the item order. To accomplish this, items were initially grouped by scheme (filling, packing, building, and comparing). Next, items were ordered within each scheme by learning trajectory level. Finally, items were numbered consecutively in this arrangement. For example, VQR-F was numbered as 1, followed by VF-F1 as 2, VF-F2 as 3, and so on. Next, randomizer.org was used to obtain three randomized lists of numbers from 1 through 48, which were utilized as the order of items in creating Form A, Form B, and Form C, respectively (see Appendix I). Each time data collection began with a new set of children, one of the forms was selected and used for all children in that round of data collection. This decision was made to minimize the setup of manipulatives as well as to balance the number of children assessed with each form. The final results indicated that 26 children were interviewed using Form A, 26 using Form B, and 30 using Form C. Data were entered into an Excel spreadsheet, which included the child's response for each item along with any notes from the researcher regarding the child's performance. After the data entry was finished, a data check was completed to verify the accuracy of the data. This involved a second researcher comparing the data collection sheets completed during each interview with the data entered into the Excel spreadsheet. Each entry on the data sheets was
compared to the data in the corresponding Excel cell; of the 82 children and 48 items, only 7 discrepancies were noted and corrected (see Table 3.1), resulting in 99.82% agreement.

Table 3.1
Discrepancies in data entry

Child ID    Item
111319      VF-P2
111319      VF-F2
131711      VURR-P1
131715      VURR-F2
131312      VQ-F1
141520      VURR-F2
121214      VURR-F2

Coding/Scoring
For each item-level match, it was assumed that a correct answer on the item indicated a child was capable of thinking at that level or above in the learning trajectory. For example, for the item in Figure 3.1, correctly identifying that both buildings had the same amount of room inside because the buildings were built from the same number of cubes would indicate that a child was at least at the Volume Quantifier level of understanding in building volume. During the coding process, it was observed that some children demonstrated correct thinking but may not have arrived at the correct answer. For example, one girl on item ICS-B1 (see Appendix H) clearly recognized hidden cubes and also that each column contained three cubes. She counted by pointing at each top cube and counting up three before moving to the next top cube. In her counting process, however, she made two counting mistakes at different decade points. She counted, "… 37, 38, 39, 50, 51 …" (skipping 40 through 49) and then "… 67, 68, 69, 90, 91, 92 …" (skipping 70 through 89). As a result, she skipped over 30 numbers in her counting and arrived at a total of 102, rather than the correct answer of 72. Another example is a child who responded with an answer of 200 on item VCRS-4B (see Appendix H). After giving
his answer, he explained, "I multiplied 6 by 5 to get 30, so I know there are 30 cubes in the bottom layer. Then I multiplied that by 8 to get 200." He clearly demonstrated an understanding of layers and an ability to utilize the volume formula, but made a calculation error in multiplying 30 by 8. Due to cases like these two, the decision was made to provide partial credit. That is, a code of 4 indicated complete correctness, while 0 indicated a completely incorrect response. If a child clearly demonstrated correct thinking at the level the item was designed to assess, but made a computation or counting error, then the response was coded as 3. The decision to use a 4-point scale was made to allow a partially-correct response that clearly indicated correct thinking to be scored closer to correct than to incorrect.
Results
Initial Rasch Modeling
An initial modeling of the assessment results revealed that three items did not contribute to the model based on their point-measure correlations. Items VRCS-3B and VRCS-2B both had point-measure correlations of 0, due to no children answering these items correctly. As such, the items did not add anything to the model and should be removed from further models in this study. In future work with older children, however, the items should be included in the instrument. Additionally, one item had a negative point-measure correlation, indicating it was not adding to the model. This item, VURR-C2, had a point-measure correlation of -0.1645 and should be eliminated. Unlike the first two items discussed, however, this item should not be included in future data collection or models.
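A point-measure correlation of the kind used to screen these items is, in essence, the correlation between children's scores on one item and their overall measures. The following is a minimal sketch with invented data, not values from the present study; the function and variable names are my own.

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

person_measures = [-1.2, -0.4, 0.1, 0.8, 1.5]  # invented abilities, in logits
item_scores = [0, 0, 1, 1, 1]                  # invented responses to one item

# A positive value means higher-ability children tended to score higher on the
# item; a zero or negative value means the item added nothing to the model.
print(round(pearson(item_scores, person_measures), 2))
```

An item answered correctly by no child has constant scores, so no such correlation can be computed, which is why items VRCS-3B and VRCS-2B reported point-measure correlations of 0.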
Rasch Modeling After Removal of Three Items
After removing these three items, a second model was estimated, in which every item reported a point-measure correlation above zero; therefore, the results implied that each item was contributing to the Rasch model. Examination of the fit statistics, however, revealed that some items demonstrated a need for further consideration (see items highlighted in Table 3.2). One item, VQ-C1B (see Figure 3.2), was predicted to have all four fit statistics outside the acceptable ranges of 0.5 to 1.5 for the infit/outfit mean squares and -2 to 2 for the infit/outfit ZSTD (Bond & Fox, 2007; Liu, 2010). This finding suggested the item should be considered for removal from future models. It is not entirely clear why this item was such a poor fit to the model, as it was similar to other items in asking children to compare the volumes of two cube buildings. One explanation may be that a correct answer on this item did not require correctly quantifying the volume: a correct answer could be determined by direct comparison of the lengths of the buildings, as only one of the three dimensions differed between the two buildings. In looking at the Wright Map for the second model, there were two other items at the same difficulty level as VQ-C1B, so removal would not leave a gap in the Wright Map. With this supporting evidence, it is recommended that the item be removed from future models. Two additional items (VURR-F2 and VQR-C1B) demonstrated two fit statistics outside the acceptable ranges, indicating the need for further evaluation regarding fit to the Rasch model. Current evidence from examination of the Wright Map and dimensionality, however, did not indicate the need for removal from future models.
Table 3.2
Items with more than one unacceptable fit statistic after removal of three non-contributing items

NAME          MEASURE  MODLSE  IN.MSQ  IN.ZSTD  OUT.MSQ  OUT.ZSTD
35 - VQ-C1B   -0.76    0.10    1.5428  2.0215   4.6891   3.1647
7 - VURR-F2    0.30    0.07    1.4960  3.0015   1.7385   1.6717
30 - VQR-C1B  -0.89    0.11    1.5088  1.5615   2.6177   1.5926
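The screening rule reflected in Table 3.2 can be sketched as follows. The function is my own illustration, applying the acceptable ranges stated above (0.5 to 1.5 for MNSQ, -2 to 2 for ZSTD) to the values reported in the table.

```python
def count_misfits(in_msq, in_zstd, out_msq, out_zstd):
    """Count how many of an item's four fit statistics fall outside the
    acceptable ranges: MNSQ in [0.5, 1.5], ZSTD in [-2, 2]."""
    flags = 0
    for msq in (in_msq, out_msq):
        if not 0.5 <= msq <= 1.5:
            flags += 1
    for zstd in (in_zstd, out_zstd):
        if not -2 <= zstd <= 2:
            flags += 1
    return flags

# VQ-C1B: all four statistics fall outside the acceptable ranges.
print(count_misfits(1.5428, 2.0215, 4.6891, 3.1647))  # 4
# VURR-F2: two of the four statistics fall outside the acceptable ranges.
print(count_misfits(1.4960, 3.0015, 1.7385, 1.6717))  # 2
```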
Figure 3.2. Misfitting item VQ-C1B
Rasch Modeling After Additional Removal of VQ-C1B
After removal of the four potentially problematic items, a model using the remaining 44 items demonstrated that all items reported better fit to the Rasch model (see Table J1). Looking at the fit statistics, no item had two or more infit or outfit statistics outside acceptable ranges. Specifically, the MNSQ infit of items ranged from 0.65 to 1.58, with all values falling within the 0.5 to 1.5 range of productive measurement (Linacre, 2011) except for one (VQR-C1B). For item VQR-C1B, the MNSQ was only slightly outside the acceptable range (0.5 to 1.5) at 1.58; the infit ZSTD, however, was within the acceptable range (-2 to 2) at 1.74. As this was the only indication of misfit for this item, there was not sufficient evidence to support its removal from the model. Table J2 indicated that the raw variance explained by the assessment measures was 57.5%. Additionally, the unexplained variance in the first contrast was only 2.9%, with an eigenvalue of 3.0. Although the raw variance explained by the measure implied unidimensionality of the construct, an eigenvalue above 2.0 is potentially problematic and may indicate that there is another dimension being measured in the first contrast (Bond & Fox, 2007).
In considering a second potential construct (see Figure J1 and Table J3), three items (VURR-B2, VQ-B3, and VQ-B2) reported loadings above 0.4 and two items (VQR-P and VF-C1) reported loadings below -0.4 (see Appendix H for items). Of these items, only VURR-B2 was determined to report fit statistics outside of the acceptable ranges. Examination of the other four items revealed they did not substantially differ from other items designed to assess the same level and scheme. Further examination of item VURR-B2 also indicated that the item was not substantially different from other similar items on the instrument. The only apparent difference seemed to be in asking children "If the cube has a volume of one, what is the volume of the rectangular prism?" rather than asking how many cubes would be needed to build the prism. Perhaps not directly asking for quantification is an area for further investigation. Additionally, removal of this item from the model did not substantially affect the fit statistics of any other items; thus, there was not enough evidence to remove the item from the model. The person separation index (2.95) and person reliability (0.90) implied that the item difficulty hierarchy was suitable for the sample, and the instrument was able to distinguish between children of low and high ability. Additionally, item reliability was reported at 0.97 with a separation index of 5.98 (see Table J1). This finding indicated a superior fit of item difficulty to the sample used in the study. Overall, the findings suggested the instrument had acceptable unidimensionality, as well as good fit for persons and items to the Rasch model (Linacre, 2011). Probability curves displayed symmetric s-shaped curves crossing at a midpoint of 0.5 (see Figure J2), as well as increasing categorical order. This finding indicated the probability of a child obtaining a correct response corresponded with the complementary and equal probability of
a child earning an incorrect response (Bond & Fox, 2007). The probability curves established that if the difficulty of an item was above the predicted ability of the child, the child had a lower likelihood of responding correctly to that item. Conversely, if the difficulty of an item was lower than the predicted child ability, the child had a greater likelihood of correctly answering the task. From the item-person or Wright map (Figure J3), the distributions of persons and items appeared to span a similar range; therefore, it can be inferred that the instrument was targeted well to the population. However, the below-zero mean of person abilities indicated that the instrument was slightly difficult for the sample used in the study (Bond & Fox, 2007). This was understandable given that items designed to assess the Volume Row and Column Structurer level were targeted more to Grade 3 and above, while the instrument was administered primarily to children through Grade 2. These items were included to ensure the predicted difficulty range of items reached high enough to accurately evaluate the current knowledge of the children included in the present study.
Discussion
Discussion of results will focus first on Research Question 2, looking at the developmental progression as a whole. Following that, Research Question 1 will be discussed by looking at the results for the individual schemes of filling, packing, building, and comparing.
Looking at Research Question 2
Is the hypothesized developmental progression for volume in Kindergarten through Grade 2, incorporating the subtrajectories for filling, packing, building, and comparing, valid when considered as a single, unidimensional developmental progression or should there be more
than one developmental progression for volume measurement? In looking to answer this question, the overall results from the Rasch modeling, as well as the predicted difficulties of the items from the Rasch modeling, are informative. Results from the Rasch modeling showed evidence of unidimensionality of the instrument. This indicated that the items included in the instrument measured a unidimensional construct; in this case, volume measurement. The evidence of unidimensionality, as well as the good fit of items to the Rasch model, supports that, indeed, the hypothesized developmental progression incorporating subtrajectories for filling, packing, building, and comparing was valid. Looking specifically at the predicted item difficulties (see Table J1), Figure 3.3 was constructed to determine whether the developmental progression demonstrated growth as the level of thinking increased in the children. That is, as a child progressed through the developmental progression for volume, did his/her thinking also become more sophisticated? To examine this, predicted difficulties for items designed to assess a particular level were grouped together. In Figure 3.3, item difficulty was plotted along the y-axis, while items assessing the same level of thinking within the developmental progression were plotted with the same x-value, with a larger x-value representing a more sophisticated level of thinking within the developmental progression. Difficulty ranges for each level were constructed using the predicted difficulties for items assessing that level, along with the predicted standard error. That is, a point was plotted for each item representing the item difficulty, then a range for that item was calculated by adding and subtracting the standard error from the item difficulty. An overall range was determined for each level of the developmental progression by combining the difficulty ranges for all items designed to assess that particular level.
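The construction of these level ranges can be sketched as follows. This is a minimal illustration with invented item values, not data from Table J1; the function name is my own.

```python
def level_range(items):
    """Combine item ranges [difficulty - SE, difficulty + SE] into one overall
    range for a level. items: list of (difficulty, standard_error) pairs."""
    lows = [d - se for d, se in items]
    highs = [d + se for d, se in items]
    return min(lows), max(highs)

# Two invented items assessing the same level: difficulties with their SEs.
example_level_items = [(-0.8, 0.10), (-0.5, 0.15)]
low, high = level_range(example_level_items)
print(round(low, 2), round(high, 2))  # -0.9 -0.35
```

Plotting one such range per level against the level's position in the developmental progression is what produces the increasing trend read off Figure 3.3.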
Figure 3.3 showed overall difficulty ranges for each level of the revised developmental progression up to and including Initial Composite Structurer.

[Figure 3.3 appears here, with difficulty ranges labeled by level: Volume Quantity Recognizer, Volume Filler, Volume Quantifier, Volume Unit Relater and Repeater, and Initial Composite Structurer.]

Figure 3.3. Combined plot for all items by learning trajectory level

As can be seen in the figure, increasing item difficulty correlated with an increase in level within the revised developmental progression; that is, a general increasing trend in the graph was observed in the data. Evidence of unidimensionality of the construct, as well as a correlation between item difficulty and level of thinking in the developmental progression, provided support for a single developmental progression incorporating subtrajectories for filling, packing, building, and comparing volume. Recall that the data collected here focused on Kindergarten through Grade 2; therefore, the results can only be verified against the levels of the developmental progression associated with children in those grades. The revised developmental progression for volume, however, described children's thinking from pre-K through Grade 5, so further data collection
including children with higher and lower ability will be required to fully verify the revised developmental progression over the entire range.
Looking at Research Question 1
Are the hypothesized developmental progressions for filling, packing, building, and comparing volume in Kindergarten through Grade 2 valid for a larger sample of children? To answer this question, it is necessary to look at the predicted item difficulties of the items designed to assess each specific scheme related to volume – filling, packing, building, and comparing. The items arranged according to predicted item difficulty are provided in Table J1. Two plots using the predicted difficulties of the items designed to assess filling are presented in Figure 3.4. For the plot on the left, the items were arranged according to learning trajectory level first and then, within each level, by increasing item difficulty. As can be seen, in general, an increase in item difficulty corresponded to an increase in learning trajectory level. The only exception to this is at the more sophisticated levels of Volume Unit Relater and Repeater (VURR) and Initial Composite Structurer (ICS). The predicted item difficulties for VURR-F2 (0.29) and VURR-F3 (0.66) are almost identical to those for ICS-F1 (0.28) and ICS-F2 (0.58), respectively. The plot on the right in Figure 3.4 represented the item difficulty plotted along the y-axis with a circle whose size was representative of the standard error of measurement for the item. The top two circles on this plot represented items VURR-F3 and ICS-F2. It is important to notice the amount of overlap between the two circles, indicating that the difficulty levels of these two items overlap. The third and fourth circles, representing VURR-F2 and ICS-F1, almost completely overlap. As the items are designed to assess different levels, this is problematic.
[Figure 3.4 appears here: two plots of the filling items (VQR-F, VF-F1, VF-F2, VQ-F1, VQ-F2B, VURR-F1, VURR-F2, VURR-F3, ICS-F1, ICS-F2), the left ordering items by learning trajectory level and the right placing item measures on a scale from -2.5 to 2.5.]
Figure 3.4. Plots of item by difficulty for filling items
Another notable characteristic of the plot on the right is the large gap around zero. This indicated that no item was accurately measuring ability at this level. Looking more closely, this gap occurred between VURR-F1 and VURR-F2. One possible conclusion is that the items designed to assess the Volume Unit Relater and Repeater level need to be revisited, and possibly rewritten or added to, before future administrations. VURR-F2 and VURR-F3 (see Appendix H) each asked children to iterate to determine the number of scoops or buckets needed to fill a container. Item VURR-F1, however, asked children to relate the size of units to determine that a unit half the size will take twice as many to hold the same amount of orange juice. These two components of the Volume Unit Relater and Repeater level – iterating and relating unit size – need to be explored further with the inclusion of additional assessment items. A second possible conclusion is evident from looking at the graphs for the other schemes related to volume, specifically those for packing and comparing (see the left plot in Figure 3.5 and Figure 3.7). For these two schemes, as with filling, notice a similar "dip" between items designed to assess the Volume Unit Relater and Repeater level and those designed to assess the Initial Composite Structurer level. This finding may suggest the two levels need to be more clearly defined. It might also indicate, as Curry et al. (2006) hypothesized, that iteration of
volume units was more difficult than the iteration of area or length units because (a) the unit had to be moved around in empty space instead of on a hard surface and (b) it was not possible to mark successive positions of the unit as it was moved around. In either case, more research is needed to clarify what is happening between these two levels of thinking.
[Figure 3.5 appears here: two plots of the packing items (VQR-P, VF-P1, VF-P2, VQ-P1, VQ-P2, VQ-P3, VURR-P1, VURR-P2B, VURR-P3, ICS-P1, ICS-P2B), the left ordering items by learning trajectory level and the right placing item measures on a scale from -2.5 to 2.5.]
Figure 3.5. Plots of item by difficulty for packing items
The plots for items designed to assess the packing scheme are detailed in Figure 3.5. Similar to the plot for filling, the plot on the left demonstrated a general increase in item difficulty as the transition is made to higher levels of sophistication in volume measurement. One notable exception to this is the peak at item VF-P2. On this item, children were presented with a net from which a box without a top can be made (see Appendix H). Children were asked how many cubes were needed to fill the box completely. A correct response at the Volume Filler level indicated the child recognized which portions of the net formed the sides of the box and which formed the bottom. Inclusion of the net required the child to have a more sophisticated spatial-structuring ability to answer correctly and consequently increased the predicted difficulty of this item. Thus, the use of nets within the learning trajectory needs to be further explored to determine exactly where they fit among the LT levels. Additionally, in looking at the plot on the right in Figure 3.5,
gaps between items toward the bottom indicated more items are needed to completely assess the less sophisticated levels for packing volume. The plot using the predicted difficulties for the building items is shown in Figure 3.6. This plot illustrated an increase in difficulty associated with an increase in learning trajectory level. Thus, the developmental progression for building volume was supported in the data. The plot on the right, however, demonstrated gaps in the middle. Comparing this to the plot on the left revealed that more items are required to fully assess the building scheme of the Volume Quantifier level.
Figure 3.6. Plots of items by difficulty for building items

Finally, turning to the items designed to assess comparing volume, the plot on the left confirmed that an increase in item difficulty corresponded to an increase in learning trajectory level. As stated previously, the one exception was between the Volume Unit Relater and Repeater and Initial Composite Structurer levels. Similar to the plots for the other schemes, the plot on the right in Figure 3.7 indicated that additional items might be needed at the Volume Filler level.
Figure 3.7. Plots of items by difficulty for comparing items

Overall, the evidence supported that the developmental progressions for filling, packing, building, and comparing are valid for a larger sample of children. Additional items are necessary, however, to completely assess each of these schemes, to fill in the gaps between items currently on the instrument, and to delineate where the use of nets is most appropriate. Finally, the Volume Unit Relater and Repeater and Initial Composite Structurer levels must be revisited to determine whether the "dips" necessitate rewriting items or whether the levels themselves need to be clarified.

Implications and Further Research

In looking to answer Research Question 1, evidence supported the individual developmental progressions for filling, packing, building, and comparing. Further research, however, is warranted regarding the role and placement of nets in the developmental progression. Additionally, the Volume Unit Relater and Repeater and Initial Composite Structurer levels need to be revisited, as the distinctions between these levels for filling, packing, and comparing were not as solid as those between the other levels. Further research is required to determine whether the items themselves were insufficient or whether the levels need to be re-examined. Because three of the four schemes related to volume demonstrated similar discrepancies, either case could be argued from the data.
This study also provided further verification of the developmental progression hypothesized by Sarama and Clements (2009). Items from assessments used in previous research on their learning trajectory for volume were included on the instrument analyzed in the current study, and the results presented here parallel results from those previous studies (e.g., Clements, Sarama, Barrett, et al., 2011; Eames et al., 2013; Kara, Eames, & Van Dine, 2013; Sarama & Clements, 2009; Van Dine, Sarama, Clements, & Vukovich, 2013).

In looking to answer Research Question 2, evidence of unidimensionality of the construct, together with a positive correlation between item difficulty and level of the developmental progression, supported a single developmental progression incorporating subtrajectories for filling, packing, building, and comparing volume at least up to Grade 2. Further data collection is required, however, with children of higher ability to verify the revised developmental progression up through Grade 5. Additionally, further analysis using multidimensional Rasch modeling is warranted and should be pursued to examine the correlations between the subtrajectories.

In comparing the results from this study with previous studies, the filling scheme did appear to be easier for children than the building and packing schemes. This supported the findings of Curry and Outhred, who observed that children developed volume skills in filling prior to packing (Curry et al., 2006; Curry & Outhred, 2005). Whether this difference is significant, and what correlation exists, must be further explored through the multidimensional IRT analysis. Also of interest is further examination of the assessment items from Battista and Clements (Battista, 1999, 2012; Battista & Clements, 1996, 1998) used on the instrument in the present study.
These items along with their predicted difficulties from the current research and the problem number Battista used in his Cognition-Based Assessment
are listed in Table 3.3. Although Battista made no claim that the items were ordered according to difficulty, he did state that problems 1 through 8 are appropriate for Grades 1 and 2, while problems through number 14 are appropriate for Grades 3 and 4 (Battista, 2012, p. 373). Evidence from the children in my sample demonstrated agreement with this placement, although problem 8 appeared to be more difficult than Battista previously anticipated in his work. To identify a child's thinking according to his levels of sophistication, Battista analyzed children's responses on these problems qualitatively; that is, a single item could be used to identify several different levels of sophistication depending upon the way in which the child responded to the item. Further qualitative research on the sample of children in this study may provide further insight into the possible relationships between Battista's levels of sophistication and the revised developmental progression for volume examined in the present study.

Table 3.3 – Assessment items from Battista and Clements with Rasch difficulty predictions and problem number from Battista (2012)

Item Name        Measure   Battista Problem Number
45 - VRCS-4B       1.01    11
17 - VURR-P2B      0.81     8
19 - ICS-P2B       0.66     9
29 - ICS-B2B       0.62    12
42 - VRCS-1B       0.49    10
25 - VURR-B1B      0.41     7
27 - VURR-B3B      0.32     6
31 - VURR-C3B      0.23     5
36 - VQ-C2B        0.03     2
37 - VQ-C3B       -0.03     3
34 - VF-C3B       -0.04     4
30 - VQR-C1B      -0.93     1

Finally, the Differential Item Functioning (DIF) from the Rasch model can be utilized to explore how different subsets of participants responded to items. DIF can be used, for example, to see whether males and females respond differently to items on an instrument (Bond & Fox, 2007). For this sample, DIF was examined across grade (Figure 3.8), gender (Figure 3.9), and
school (Figure 3.10). In all three cases, there was not evidence of a large discrepancy in the way persons from different groups responded to items. In Figure 3.9, there are a couple of items to which females responded slightly differently than males, but overall the trend is the same across gender. Additionally, four sections of increasing slope were evident in each of the graphs. Because the items for these graphs were grouped by scheme (filling, packing, building, and comparing) and then ordered by learning trajectory level within scheme, the graphs provided further verification of the developmental progressions for each scheme. Thus, Research Question 1 was further supported in the data.
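As a supplementary illustration (not part of the original analysis), the agreement between the Rasch difficulty measures in Table 3.3 and Battista's problem numbering can be quantified with a Spearman rank correlation, computed here from the table's values using only the Python standard library:

```python
# Rasch difficulty measures and Battista (2012) problem numbers from Table 3.3.
items = [
    ("45 - VRCS-4B", 1.01, 11), ("17 - VURR-P2B", 0.81, 8),
    ("19 - ICS-P2B", 0.66, 9),  ("29 - ICS-B2B", 0.62, 12),
    ("42 - VRCS-1B", 0.49, 10), ("25 - VURR-B1B", 0.41, 7),
    ("27 - VURR-B3B", 0.32, 6), ("31 - VURR-C3B", 0.23, 5),
    ("36 - VQ-C2B", 0.03, 2),   ("37 - VQ-C3B", -0.03, 3),
    ("34 - VF-C3B", -0.04, 4),  ("30 - VQR-C1B", -0.93, 1),
]

def ranks(values):
    """Rank values from largest (rank 1) to smallest; no ties occur here."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

measure_ranks = ranks([m for _, m, _ in items])
problem_ranks = ranks([p for _, _, p in items])

# Spearman's rho = 1 - 6*sum(d^2) / (n*(n^2 - 1)) for untied ranks.
n = len(items)
d2 = sum((a - b) ** 2 for a, b in zip(measure_ranks, problem_ranks))
rho = 1 - 6 * d2 / (n * (n ** 2 - 1))
print(f"Spearman rho = {rho:.3f}")
```

The strong positive correlation is consistent with the observation above that Battista's problem ordering broadly tracks the predicted difficulties, with problem 8 (item 17) as the most visible exception.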
Figure 3.8. DIF by Grade
Figure 3.9. DIF by gender
Figure 3.10. DIF by school
This study is just the first step in the process of analyzing and verifying the hypothesized
revised developmental progression for volume measurement. Although evidence was found to
support the development of the schemes of filling, packing, building, and comparing volume,
more research with a larger and higher-ability sample as well as inclusion of additional
assessment items is needed. Additionally, evidence supporting a single developmental progression for volume was found, but multidimensional analyses should be conducted to further explore this progression and the correlations between the subtrajectories for filling, packing, building, and comparing.
Appendices

Appendix A
Measurement Standards from CCSS-M with Linking Standards (Confrey et al., 2012)

Table A1 – Attributes, Measuring Length and Capacity by Direct Comparison from TurnOnCCMath.com (Confrey et al., 2012)

CCSS-M Description
K.MD.1 Describe measurable attributes of objects, such as length or weight. Describe several measurable attributes of a single object.
Descriptor
Section 1: Attributes, Measuring Length by Direct Comparison
For a measurement to be carried out on any object, it is important to identify the attribute to be measured. Attributes are characteristics of objects. Different attributes of any object may be measured in different ways. These attributes include, among others, length, capacity, mass, time, temperature, and volume. Measurement activities provide natural contexts for building understanding of mathematical language of comparison and equivalence. These concepts include: heavier (more mass), lighter (less mass), larger, smaller, longer, shorter, wider, narrower, same length, and same mass. Students demonstrate an emerging sense of measurement as they interpret and represent the physical world and link mathematics and science by identifying, for example, length and mass as attributes of objects. For example, students interpret plant growth as represented by the height of the plant or the size of its leaves.
K.LAV.A Describe and distinguish spatial attributes (length, area and volume) of shapes and objects.
This Bridging Standard is introduced to emphasize students' early abilities to distinguish the attributes of length, area, or volume that are foundational to the measurements of these attributes. Before the formal measurements of length, area, or volume are introduced, students identify them as attributes that are qualitatively different from each other. Research shows that young students learn to distinguish these three attributes simultaneously rather than separating their introduction over elementary school. Students build proficiency with measuring length, area, and volume gradually and incrementally over time, but since they live in a 3-D world, making the basic distinctions begins early. Students first identify volume as an attribute informally using terms such as bigness or spaciousness. They then come to see volume as bounded by a closed surface or filling a 3-D shape.
Informal explorations identifying lengths on surfaces or as edges on 2-D and 3-D objects and surface areas on 3-D objects help to connect and contrast the ideas of length, area, and volume. Students in Kindergarten may also benefit from discussing that objects used to represent length actually have volume (e.g., sticks and string) or area (e.g., strips). Students learn that it is important to carefully identify which attribute is under study when looking at 1-D, 2-D, and 3-D shapes and objects.

K.LAV.B Indirectly compare two objects by representing the attribute with, for example, another object and then directly comparing.
This Bridging Standard is introduced here to describe how students' learning of measurement emerges from representing the attributes of objects and comparing the representations. At the heart of the measurement learning trajectory is the movement from identifying attributes, to representing attributes, directly and indirectly comparing attributes, and finally unitizing attributes using constructed units and wisely choosing common units. For instance, students could be given an opportunity to compare the size of a set of pumpkins and allowed to discuss what it means to be the "bigger pumpkin." If asked to represent the set of pumpkins, they might draw figures that vary in height, use a string to measure the circumference, or weigh the pumpkins. Students could use label strips to compare the height of two pumpkins by cutting the strips to correspond to the height of the pumpkins and lining the strips end to end to establish a relation on length. In this example, the strip is used as a stand-in for length without the need for conventional units, which are introduced later. It is important for students to explore representations and inscriptions for recording measurements on the way to learning to quantify them along a single dimension. This process of movement from identifying attributes, to representing, and to comparing helps students establish a broad theory of measurement, and helps them develop systematic processes to compare the amounts of two or more quantities rather than just thinking about measurement as the association of a number of units with a given quantity.
Table A2 – Volume Measurement from TurnOnCCMath.com (Confrey et al., 2012)

CCSS-M Description
1.LAV.D Understand capacity as an attribute that describes the amount of space a three-dimensional object can hold.
Descriptor
Section 4: Volume Measurement
This Bridging Standard is added to highlight students' early abilities to work with capacity and volume and to emphasize the importance of developing these related concepts at an early stage. Even before first grade, students begin school with emerging abilities to identify capacity or volume as attributes because they bring experiences with the 3-D world. Some students are able to compare volumes in vague visual ways; for example, they may examine a box and say, "This box holds a lot of blocks!" Recall that in kindergarten, students first identified volume as an attribute informally using terms such as bigness or spaciousness. They then came to see volume as bounded by a closed surface or filling a 3-D shape. Students should be formally introduced to volume beginning from the concept of capacity. Capacity describes the quantity of liquid (or pourable substance such as cereal, rice, or sand) that fills the amount of space a three-dimensional object can hold. The quantity of liquid is measured in customary or metric units of liquid volume such as ounces or liters.
2.LAV.C Directly compare two or more quantities by their capacity or volume using various strategies.
This Bridging Standard is added here because students' comparison of volumes motivates their learning of capacity or volume measurement. It is also consistent with how the comparisons of area and length are emphasized in students' learning of length and area measurement in the earlier Standard K.MD.2 and the Bridging Standard 1.LAV.B in this LT. Students compare the capacity or volume of two or more 3-D objects. At this grade level, they may do so in a variety of ways given the problem context:
1. Students may use visual comparison to compare two objects that are adjacent to each other.
2. When the objects can be completely nested within each other, students nest them and order the objects by their capacity.
3. When two objects are adjacent to each other, students may compare them by identifying corresponding parts and comparing the parts in systematic ways using one-to-one correspondence, compositions and decompositions, and/or transformations.
4. When the objects can be completely nested within each other but are not adjacent to each other, students may use a third object to compare them by separate nesting processes.
Similar to length and area, Conservation of Volume over Rigid Transformations is a prerequisite to strategies 2-4 above (see Standards K.MD.2 and 1.LAV.B earlier in this LT). They must understand that the volume of an object is invariant under simple rigid transformations including translation, rotation (including tipping), and reflection (including turning an object on its side). Students also
3.MD.2 Measure and estimate liquid volumes and masses of objects using standard units of grams (g), kilograms (kg), and liters (l). Add, subtract, multiply, or divide to solve one-step word problems involving masses or volumes that are given in the same units, e.g., by using drawings (such as a beaker with a measurement scale) to represent the problem. (Excludes compound units such as cm3 and finding the geometric volume of a container.)
5.MD.3.b A solid figure which can be packed without gaps or overlaps using n unit cubes is said to have a volume of n cubic units.
understand that the volume of an object is not conserved through expanding.
Students learn to compare the capacity of, for example, two boxes of cereal by emptying the contents of each box into a pair of identical containers and comparing the height of the cereal in the new containers. Students also address the misconception that height is automatically a good indicator of capacity by comparing the capacity of two glasses (with different radii) that are filled to equal heights. Students could also come to recognize that they can compare heights if and only if the containers have the same-sized base. For example, students know how to share equal amounts of a drink among themselves using identical plastic cups by comparing the height of the drink poured into the cups. Students apply their knowledge of addition and subtraction, and multiplication and division, to solve one-step word measurement problems involving masses and volumes (see Standard 2.OA.1 in the Addition and Subtraction LT and the referent-preserving problems in Standard 3.OA.3 in the Division and Multiplication LT). For example, "A measuring cup holds an amount of water leveled at the 350-milliliter mark. At which mark should the water level be if John wants to pour 150 milliliters of water from the beaker?" Liquid volume as used in the standards is one version of the quantities that represent capacity. Capacity can refer to amounts of flour, rice, or cereal, as well as liquids held by a container. The unit used to measure mass at this grade level is the same as for weight; the distinction between mass and weight will be pursued in science. Volume is a measure of how much space is enclosed by a particular 3-dimensional shape. It is measured in cubic units. Units of measure for volume involve solids that tessellate space. Common units in U.S. customary measure are cubic inches, cubic feet, and cubic miles. The standard units in the metric system are cubic meters and cubic decimeters.
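The beaker problem above reduces to a single subtraction on measures given in the same unit; a minimal sketch (the function name is illustrative, not from the source):

```python
def pour_out(start_ml: int, poured_ml: int) -> int:
    """Water level (in milliliters) after pouring out of a graduated beaker."""
    remaining = start_ml - poured_ml
    if remaining < 0:
        raise ValueError("cannot pour out more than the beaker holds")
    return remaining

# Leveled at the 350 ml mark; pour out 150 ml -> the level drops to the 200 ml mark.
print(pour_out(350, 150))
```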
In earlier grades, students may have a partial understanding of cubes as filling a space. They may attempt to iterate cubes or scoops but may not understand the need for equal units. At the same time, students may count the faces of a cube building, possibly double-counting cubes at the corners and usually not counting internal cubes. Eventually, they learn to count one cube at a time in carefully structured and guided contexts, such as packing a small box with cubes.
The next step for students is to formally learn to associate a number of volume units with a volume. Students should experiment with measuring volumes using nonstandard units such as boxes or other 3-D solids that can pack a space without gaps. For example, if students use boxes to measure the volume of a wardrobe, they might use different-sized boxes, create overlaps, and/or miscount the number of boxes. Such activities help students come to appreciate the need for standard units to make volume comparisons possible across situations. They discover the advantages of using boxes that occupy the spaces without leaving gaps, such as cubes and rectangular prisms.
5.MD.3.a A cube with side length 1 unit, called a “unit cube,” is said to have “one cubic unit” of volume, and can be used to measure volume.
A relatively small unit is then wisely selected because it can iteratively pack different volumes and is more likely to result in a whole-number comparison between two volume measurements. The standard only requires students to measure a single volume, but comparisons can help motivate students' measurement activities. At this grade level, measurement of volume is typically restricted to measuring rectangular prisms or parallelepiped solids (i.e., solids formed by three pairs of parallelograms). In this context, a 1 × 1 × 1 cube becomes a convenient standard unit, sometimes called a unit cube. Students need to understand that a 1 unit × 1 unit × 1 unit cube is said to have 1 cubic unit of measure, and this should be explicitly compared to volumes of combinations of rectangular prisms that have side lengths of 1 or 2 (e.g., 1 × 1 × 2, 1 × 2 × 1, …, 2 × 2 × 2). Students should also contrast cubic units with square units and linear units and learn to predict which unit is appropriate in which situations.
Although Standards 5.MD.3.b (earlier in this LT) and 5.MD.3.a only require students to measure a single volume, a comparison of volumes can help to motivate students' measurement activities.

5.LAV.A Measure the volume of an object twice, using cubic units of different volumes for the two measurements; describe how the two measurements relate to the size of the unit chosen.
This Bridging Standard is added to provide students opportunities for understanding the inverse relationship between the unit and the measure in volume measurement. The Bridging Standard is also consistent with how the Compensatory Principles of Length and Area Measurement are emphasized in Standard 2.MD.2 and Bridging Standard 3.LAV.B earlier in this LT. Students explore and understand the Compensatory Principle of Volume Measurement: There is an inverse relationship between size of unit and number of units required to match the volume of an object. If they measure an object using a larger volume unit, the measure is smaller; likewise, a smaller volume unit yields a larger measure of the same object. The Compensatory Principle rests upon students’ understanding of the Conservation Principle of Volume (see Bridging Standard 2.LAV.C earlier in this LT).
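The Compensatory Principle described above can be illustrated numerically: measuring the same object with a larger unit yields a smaller measure, and the count scales inversely with the unit's volume. A minimal sketch with hypothetical dimensions:

```python
def measure_in_cubes(l, w, h, unit):
    """Number of cubes of edge `unit` that exactly pack an l x w x h box.
    Assumes each box edge is a whole-number multiple of the unit edge."""
    for dim in (l, w, h):
        if dim % unit != 0:
            raise ValueError("box edges must be multiples of the unit edge")
    return (l // unit) * (w // unit) * (h // unit)

# Same 8 x 4 x 2 box, two different units: the larger unit gives the smaller count.
small_unit_count = measure_in_cubes(8, 4, 2, unit=1)  # 64 unit cubes
large_unit_count = measure_in_cubes(8, 4, 2, unit=2)  # 8 cubes of edge 2
print(small_unit_count, large_unit_count)
```

The count-times-unit-volume product is the same in both cases (count × unit³ = 64 = 8 × 8), which is exactly the inverse relationship the principle names.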
5.MD.4 Measure volumes by counting unit cubes, using cubic cm, cubic in, cubic ft, and improvised units.
Students experiment with how many cubes fill the interior space of rectangular prisms in measuring the volume. They also construct models out of snap cubes or other cubes, or fill boxes with cubes, to see how they get packed in and arranged in layers. Use of various manipulatives and models helps students build fluency in volume measurement.
5.MD.5.a Find the volume of a right rectangular prism with whole-number side lengths by packing it with unit cubes, and show that the volume is the same as would be found by multiplying the edge lengths, equivalently by multiplying the height by the area of the base. Represent threefold whole-number products as volumes, e.g., to represent the associative property of multiplication.
To lay the foundation for middle school math, students use previous experiences with capacity to investigate the volume of rectangular prisms. Students use manipulatives to fill spaces and count how many they use. Based on their experience with area, students recognize that the number of unit cubes along a side corresponds to the length of the side. They generalize this in three ways:
1. Seeing the base of the right rectangular prism as a rectangular surface of height 1, composed of a × b cubes, and then seeing that there are c of these layers in the vertical structure. Therefore the volume of the right rectangular prism is (a × b) × c.
2. Seeing a vertical slice of the right rectangular prism as b × c cubes and seeing there are a of these slices, to get a total of (b × c) × a cubes.
3. Seeing a vertical slice of the right rectangular prism as a × c cubes and seeing there are b of these slices, to get a total of (a × c) × b cubes.
From these experiences, students conclude that the total number of unit cubes in the volume of a right rectangular prism is a × b × c, which can be justified as the product of the length, width, and height. The associative property of multiplication is illustrated in analyzing this activity because the total volume, a × b × c cubic units, is the same whether the prism is decomposed along any of the three dimensions (length, width, or height): into a vertical slices of b × c, i.e., (b × c) × a; into b vertical slices of a × c, i.e., (a × c) × b; or into c horizontal layers of the base a × b, i.e., (a × b) × c.
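The three decompositions just described (horizontal layers and the two vertical-slice orientations) can be checked computationally for any whole-number dimensions; a sketch with illustrative values:

```python
def volume_by_layers(a, b, c):
    """Count cubes as c horizontal layers of a*b cubes each: (a*b)*c."""
    return sum(a * b for _ in range(c))

def volume_by_bc_slices(a, b, c):
    """Count cubes as a vertical slices of b*c cubes each: (b*c)*a."""
    return sum(b * c for _ in range(a))

def volume_by_ac_slices(a, b, c):
    """Count cubes as b vertical slices of a*c cubes each: (a*c)*b."""
    return sum(a * c for _ in range(b))

# All three decompositions agree, illustrating the associative property.
a, b, c = 3, 4, 5
print(volume_by_layers(a, b, c),
      volume_by_bc_slices(a, b, c),
      volume_by_ac_slices(a, b, c))  # all equal a*b*c = 60
```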
They also interpret the volume as the layer of the rectangular base being swept through a number of times corresponding to the height.
Discoveries regarding conservation are made as students recognize that two rectangular prisms may have the same volume although the dimensions may vary.

5.MD.5.b Apply the formulas V = l × w × h and V = b × h for rectangular prisms to find volumes of right rectangular prisms with whole-number edge lengths in the context of solving real-world and mathematical problems.
Students can apply the volume formulas, V = l × w × h and V = B × h, to solve the following problem: "Carl is buying a new fish tank to replace an old fish tank that is 24 inches long, 16 inches wide, and 20 inches tall. If he is looking for a new fish tank that can hold the same amount of water and fits into a rectangular space of 480 square inches, what would be the height of the fish tank?"
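The fish-tank problem above combines conservation of volume with V = B × h; a worked sketch of the arithmetic:

```python
# Old tank: 24 in long, 16 in wide, 20 in tall.
old_volume = 24 * 16 * 20               # V = l * w * h, in cubic inches

# The new tank must hold the same amount of water over a 480-square-inch base.
new_base_area = 480                     # B, in square inches
new_height = old_volume / new_base_area # from V = B * h, so h = V / B
print(new_height)  # 16.0 inches
```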
5.MD.5.c Recognize volume as additive. Find volumes of solid figures composed of two non-overlapping right rectangular prisms by adding the volumes of the non-overlapping parts, applying this technique to solve real-world problems.
Students find the volumes of rectangular prisms and other rectilinear objects formed by combinations of rectangular prisms, emphasizing composition and decomposition of rectilinear shapes.
As students solve the problem, they also apply the principle of conservation of volume to find another rectangular prism with different dimensions but the same volume.
Students learn to identify the joining of volumes as sums based on composition and symbolically code this as addition. Likewise, students identify the comparison of two volumes based on decomposition and symbolically code this as subtraction. That is, for two volumes A and B, volume measure of (A joined with B) = volume measure of (A) + volume measure of (B), and volume measure of (A excluding B) = volume measure of (A) – volume measure of (B).
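The additive coding above can be sketched for an L-shaped solid decomposed into two non-overlapping right rectangular prisms (the dimensions here are illustrative, not from the source):

```python
def prism_volume(l, w, h):
    """Volume of a right rectangular prism with whole-number edge lengths."""
    return l * w * h

# An L-shaped solid split into two non-overlapping prisms A and B:
# volume(A joined with B) = volume(A) + volume(B).
vol_a = prism_volume(4, 2, 3)   # 24 cubic units
vol_b = prism_volume(2, 2, 3)   # 12 cubic units
total = vol_a + vol_b
print(total)  # 36 cubic units

# Decomposition (comparison) codes as subtraction:
# volume(whole excluding B) = total - volume(B) = volume(A).
assert total - vol_b == vol_a
```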
A prerequisite for understanding that volumes are quantitatively additive is another form of conservation known as the Conservation of Volume over Composition/Decomposition and Joining or Equipartitioning and Reassembly. Students recognize that when a solid volume is decomposed or equipartitioned into components and the components are rejoined or reassembled into the original whole, the object’s volume does not change regardless of the orientation of the rejoined or reassembled whole. (Note: The reference to solid volume is because volume enclosed by a container is arguably not conserved if one cuts a hollow cylinder in half and joins them together back-to-back).
Appendix B
Levels of Sophistication in Student Reasoning: Volume (Battista, 2012, p. 207)

Non-Measurement Reasoning About Volume
N0: Student compares objects' volumes in vague visual ways.
N1: Student correctly compares objects' volumes directly or indirectly.
  N1.1: Student compares whole objects' volumes directly.
  N1.2: Student uses a third object to compare whole objects' volumes indirectly.
N2: Student compares objects' volumes by systematically manipulating or matching their parts.
  N2.1: Student rearranges parts to directly compare whole shapes.
  N2.2: Student matches parts one-to-one to compare volumes.
N3: Student compares objects' volumes using geometric properties or transformations.
Measurement Reasoning About Volume
M0: Student uses numbers in ways unconnected to appropriate volume-unit iteration.
M1: Student incorrectly iterates volume-units.
  M1.1: Student iterates single volume-units incorrectly.
  M1.2: Student decomposes shapes into parts incorrectly.
  M1.3: Student iterates volume-units incorrectly, but eliminates double-counting errors.
M2: Student correctly iterates all volume-units one-by-one.
M3: Student correctly operates on composites of visible volume-units.
  M3.1: Student correctly iterates non-layer composites.
  M3.2: Student correctly iterates layer composites.
M4: Student correctly and meaningfully operates on volume using only numbers (no visible units or iteration).
M5: Student understands and uses procedures and formulas for determining volumes of rectangular boxes.
Appendix C
Developmental Progression for Volume Measurement (Sarama & Clements, 2009, pp. 306-308)

Age 0-3
Developmental Progression

Volume/Capacity: Volume Quantity Recognizer
• Identifies capacity or volume as attribute, beyond implicit acting on materials.
- Says, "This box holds a lot of blocks!"
4
Capacity Direct Comparer
• Can compare two containers.
- Pours one container into another to see which holds more.
• Using perceptual objects, internal bootstrap competencies to compare linear extent (see the length trajectory for “Direct Comparer”) or recognize “overflow” as indicating the container “poured from” contains more than that “poured into.”
5
Capacity Indirect Comparer
• Can compare two containers using a third container and transitive reasoning.
- Pours one container into two others, concluding that one holds less because it overflows, and the other is not fully filled.
• A mental image of a particular amount of material ("stuff") can be built, maintained, and manipulated. With the immediate perceptual support of the containers and material, such images can be compared. For some, explicit transitive reasoning may be applied to the images or their symbolic representations (i.e., object names).
Volume/Spatial Structuring: Primitive 3-D Array Counter
• Partial understanding of cubes as filling a space.
- Initially, may count the faces of a cube building, possibly double-counting cubes at the corners and usually not counting internal cubes.
- Eventually counts one cube at a time in carefully structured and guided contexts, such as packing a small box with cubes.
• With perceptual support, can visualize that 3-D space can be filled with objects (e.g., cubes). With strong guidance and perceptual support from pre-structured materials, can direct the filling of that space and recognize that filling as complete, but often only intuitively. Implicit visual patterning and constraints of physical materials guides placement of cubes.
Age 6
Action on Objects
• Action schemes, both physical/kinesthetic and visual, implicitly operate on 3-D spatial extent from the earliest years. The child then conceptualizes space and objects within the space explicitly. Action schemes are connected to spatial vocabulary.
Age 7

Capacity Relater and Repeater
• Uses simple units to fill containers, with accurate counting.
- Fills a container by repeatedly filling a unit and counting how many.
- With teaching, understands that fewer larger than smaller objects or units will be needed to fill a given container.
Action on Objects
• Action schemes include the ability to iterate a mental unit along a perceptually-available object. The image of each placement can be maintained while the physical unit is moved to the next iterative position (initially with weaker constraints on this placement). With the support of a perceptual context, the scheme can predict that fewer units will be required to measure an object's volume. These action schemes allow counting-all addition schemes to be applied to measures.

Volume/Spatial Structuring: Partial 3-D Structurer
• Understands cubes as filling a space, but does not use layers or multiplicative thinking. Moves to more accurate counting strategies, e.g.:
- Counts unsystematically, but attempts to account for internal cubes.
- Counts systematically, trying to account for outside and inside cubes.
- Counts the number of cubes in one row or column of a 3-D structure and uses skip counting to get the total.
Action on Objects
• Builds, maintains, and manipulates mental images of composite shapes, structuring them as composites of individual shapes and as a single entity: a row (a unit of units), then a layer (a "column of rows," or unit of units of units). Applies this composite unit repeatedly, but not necessarily exhaustively, as its application remains guided by intuition.

Age 8

Volume/Spatial Structuring: 3-D Row and Column Structurer
- Counts or computes (row by column) the number of cubes in one row, and then uses addition or skip counting to determine the total.
- Computes (row times column) the number of cubes in one layer and then multiplies by the number of layers to determine the total.
Action on Objects
• Builds, maintains, and manipulates mental images of composite shapes, structuring them as composites of individual shapes and as a single entity: a layer (a unit of units of units) of congruent cubes. Applies this composite unit repeatedly and exhaustively to fill the 3-D array, coordinating this movement in 1-1 correspondence with the elements of the orthogonal column. If in a measurement context, applies the concept that the length of a line specifies the number of unit lengths that will fit along that line. May apply a skip counting scheme to determine the volume.
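The two strategies described for the 3-D Row and Column Structurer, skip counting layer by layer versus multiplying cubes-per-layer by the number of layers, can be contrasted in a minimal sketch (illustrative only; the 3 x 4 x 2 block is a made-up example, not an item from the study):

```python
# Illustrative sketch: two ways a "3-D Row and Column Structurer" might
# quantify a 3 x 4 x 2 block of unit cubes (rows x columns x layers).
rows, columns, layers = 3, 4, 2

# Strategy 1: compute one layer (row by column), then skip count by layers.
cubes_per_layer = rows * columns
total_by_skip_counting = 0
for _ in range(layers):
    total_by_skip_counting += cubes_per_layer

# Strategy 2: multiply the cubes in one layer by the number of layers.
total_by_multiplying = cubes_per_layer * layers

# Both structurings quantify the same 3-D array.
assert total_by_skip_counting == total_by_multiplying == 24
```

The equivalence of the two totals is exactly what the later levels curtail into the volume formula.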
Age 9
Volume/Spatial Structuring: 3-D Array Structurer
• With linear measures or other similar indications of the three dimensions, multiplicatively iterates cubes in a row, column, or layer to determine volume.
- Constructions and drawings are not necessary. In multiple contexts, children can compute the volume of rectangular prisms from their dimensions and explain how that multiplication creates a measure of volume.
Action on Objects
• Builds, maintains, and manipulates composites (a 3-D array: units of units of units of units) and operates on three dimensions. Mentally de/composes the 3-D array into layers, which themselves are de/composed into rows by columns. The mental image may be of a spatial array or, at this level especially, a symbolic array. Applies repeated addition or multiplication to composites. Curtails the process to use volume formulas with understanding.
Appendix D
Volume Assessment Items from Longitudinal Study
CDC-1 (Capacity Direct Comparer)
Show the student the two containers shown below.
"Pay attention because I am going to ask you a question about these two containers in a minute." (Point to the two containers.) Completely fill one of the containers with water. Pour the water from one container into the other. "Which of these two containers can hold more water?" (Point to the two containers again.)

CDC-2 (Capacity Direct Comparer)
Show the student the two cups shown below.
"Pay attention because I am going to ask you a question about these two cups in a minute." (Point to the two cups.) Completely fill one of the cups with water. Pour the water from one cup into the other. "Which of these two cups can hold more water?" (Point to the two cups again.)
CIC-1 (Capacity Indirect Comparer)
Show the student the three containers below.
"Pay attention because I am going to ask you a question about these two containers in a minute." (Point to the two rectangular prism containers.) Pour the full measuring cup into the larger container. Then pour the larger container back into the measuring cup. Pour the full measuring cup into the smaller container. "Which of these two containers can hold more water?" (Point to the two rectangular prism containers again.)

CIC-2 (Capacity Indirect Comparer)
When Amy pours an entire package of rice into Container A, the rice overflows. If she pours an entire package of rice into Container B, it is not filled completely. [Pictures of Container A and Container B.] Which container can hold more rice?
PAC-2 (Primitive 3-D Array Counter)
Place a 2 x 3 x 2 solid ("glued together") on the table. Ask, "This is made of cubes. How many cubes do you think would be needed to make this?"

PAC-4 (Primitive 3-D Array Counter)
Show the picture of a 2 x 4 x 2 solid ("glued together"). Ask, "The block shown below is made of cubes. How many cubes would be needed to build this block?"
CRR-1 (Capacity Relater and Repeater)
Show the picture below and ask:
Two small juice boxes are as large as one big juice box. If we put all of the orange juice into the big boxes, we will need 5 big boxes. If we put the juice into small boxes, how many small boxes will we need?

CRR-2 (Capacity Relater and Repeater)
Show the picture below and say:
[Picture of a water tank.] So far, one bucket of water has been poured into the water tank. How many more buckets are needed to fill up the water tank? How do you know?
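The unit relation behind CRR-1 is a simple 1:2 conversion, the same relation the revised progression later describes as "can accurately convert units in 1:2 ratio." A minimal arithmetic sketch (illustrative only):

```python
# Illustrative arithmetic for item CRR-1: a 1:2 unit conversion.
# Two small juice boxes hold as much as one big juice box, so the
# count of small boxes is twice the count of big boxes.
small_per_big = 2
big_boxes_needed = 5
small_boxes_needed = big_boxes_needed * small_per_big
assert small_boxes_needed == 10
```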
PS-1 (Partial 3D Structurer)
If the cube has a volume of one, what is the volume of the rectangular prism?

PS-2 (Partial 3D Structurer)
How many cubes altogether will it take to fill the box?

VRCS-1 (3D Row and Column Structurer)
Some cubes are shown within the outline of a box. How many cubes altogether will it take to fill the box? Explain how you found the number of cubes.

VRCS-2 (3D Row and Column Structurer)
This net (on the left) folds up to make the box (in the middle). How many cubes (on the right) would it take to fill the box?

AR-1 (3D Array Structurer)
[Figure: a one-cubic-centimeter cube (1 cm x 1 cm x 1 cm) on the left and a rectangular prism measuring 6 cm x 4 cm x 3 cm on the right, with dimension labels.]
This cube (on the left) has a volume of one cubic centimeter. What is the volume of the rectangular prism (on the right)? Record your answer on the line below.
AR-2 (3D Array Structurer)
[Figure: Tower A and Tower B, each built from one-cubic-centimeter cubes.]
The volume of Tower B is 32 cubic centimeters. What is the volume of Tower A?
Appendix E
Revised Developmental Progression for Volume Measurement

Volume Quantity Recognizer

Developmental Progression:
Filling and Packing. Recognizes capacity as an attribute, beyond implicit acting on materials. "I can pour lots of sand into this can." "This box holds a lot of toys."
Building. Builds with blocks, associating more blocks with terms like "big" and fewer blocks with terms like "small." "I used a lot of blocks and made a big house. See?"
Comparing. Initially, recognizes volume as an attribute and describes objects with words such as big, small, and tiny. Eventually, may compare volume while recognizing only one dimension. "This one's bigger, with more blocks. See, it's taller." (Comparing two buildings with the same cross section.)

Mental Actions on Objects:
Perceives 3-D space and objects within the space.

Anticipated Misconceptions or Partial Conceptions:
When given a 3-D object constructed of cubes, may count only one face.
When building, may not accurately recreate a given shape in size (e.g., number of blocks) or dimensions.
May not compare at first; then begins to compare capacities, but makes no reference to dimensions.

Volume Filler

Developmental Progression:
Filling. Fills a container using another (smaller) container and counts the number needed to completely fill the larger container.
Packing. Places cubes into a rectangular box to fill it; eventually, may pack the entire box with a focus on leaving no gaps.
Building. Given a 3-D object constructed of cubes, recognizes and counts cubes (the child may be counting "blocks" or even "squares") on multiple faces.
Comparing. Compares objects by physically or mentally aligning them; refers to at least two dimensions of objects. Places two objects next to each other, saying, "This one holds more because it's taller and it's wider." May begin to associate number of scoops/cubes in comparisons, simply using a bigger number for the larger container.

Mental Actions on Objects:
With perceptual support, can visualize that 3-D space can be filled with objects (e.g., cubes). A mental image of a particular amount of material ("stuff") can be built, maintained, and manipulated. With the immediate perceptual support of the containers and material, such images can be compared at an intuitive level.
Uses perceptual objects and internal innate competencies to compare the extent of 3-D space, with attention usually focused on only one or two dimensions.
Recognizes "overflow" as indicating the container "poured from" contains more than that "poured into." May apply explicit transitive reasoning in volume comparison situations.

Anticipated Misconceptions or Partial Conceptions:
May count when filling or packing but does not focus on quantifying the total volume or capacity. When filling, counts each "scoop" as one although the scoops may not be completely filled (or may be overfilled) each time.
Attends to space filled without comparing it to the total capacity of the container. For example, when filling or packing, may not recognize the container as half full.
When packing, the focus is on filling the container rather than on spatial structuring. May use the walls of the container to constrain the covering of the bottom but may see the container as filled when only the bottom is covered.
Given a 3-D object constructed of cubes, may not count on all faces consistently nor follow a pattern, may double-count, and may ignore internal cubes.

Volume Quantifier

Developmental Progression:
Filling. Able to estimate the number of scoops needed to fill. Able to attend to both the portion of the container filled and the portion remaining unfilled. Recognizes when the container is half full.
Packing. Exhibits initial spatial structuring. Packs a box neatly and completely with cubes; may count one cube at a time, while packing, to determine the total.
Building. Exhibits initial understanding of cubes as filling a space (completely, without gaps). Counts on all faces of a 3-D object constructed of cubes, has a developing sense of the cube as a unit, and begins to recognize that squares on adjacent faces of a rectangular prism that share a side are faces of the same cube.
Comparing. Compares objects by physically or mentally aligning them, explicitly recognizing three dimensions. (Note: Although cylindrical containers have only two dimensions to vary, height and radius, children at this level demonstrate understanding of 3-D space.) "This cup [cylindrical container] holds more because it's taller and it is bigger here" (putting a hand into the opening at the top to indicate the size of the opening). Compares the volume of objects by counting the number of cubes, showing initial understanding of cubes as filling space (as described above); may break a larger object into smaller pieces in order to "see" all the cubes. Recognizes that objects can look different but still contain the same number of cubes.

Mental Actions on Objects:
With strong guidance and perceptual support from pre-structured materials, can direct the filling of 3-D space with objects (e.g., cubes) and recognize that filling as complete. Implicit visual patterning and the constraints of physical materials guide placement of cubes.
Begins to associate multiple exposed faces of a single cube ("block") with a single cubic unit.
Builds, maintains, and manipulates mental images of an amount of material ("stuff") such as liquid. With the immediate perceptual support of the containers and the material, the scheme can compare (gross) images.
Note on the psychological foundation: Not yet able to understand the basis for the multiplicative transformation of lengths into volume measurements.

Anticipated Misconceptions or Partial Conceptions:
When attempting to reproduce rectangular prisms, considers only some characteristics of the shape (e.g., one or two dimensions); therefore, may not produce identical shapes. May not recognize two shapes as having equal volume when oriented or constructed differently. May not recognize the need for equal-size units when measuring. May attempt to iterate a given unit throughout a 3-D space, but not maintain equal unit size or spacing.
May not be able to accurately visualize and calculate the total; may double-count cubes at the corners and often does not count internal cubes.
May not make an explicit connection between number and space (volume as number).

Volume Unit Relater and Repeater

Developmental Progression:
Relates size and number of units explicitly; understands that fewer larger than smaller units will be needed to fill or pack a given container. Can accurately convert units in a 1:2 ratio.
Filling. Uses simple units to fill containers with accurate counting, completely filling the scoop each time. After one unit has been poured into the container, can anticipate the volume of the container by iterating the height filled by the unit exhaustively along the height of the container.
Packing. Uses discrete units to pack a container without gaps and with accurate counting. Able to iterate a unit throughout a volume, maintaining equal unit size and spacing.
Building. Exhibits developing understanding of cubes as filling a space. Counts cubes, not faces (or "faces-as-cubes").
Comparing. When comparing two 3-D objects in cases such as congruent objects containing different numbers of units, non-congruent objects containing the same number of units, or non-congruent objects containing different numbers of units, correctly describes the relative volumes of the objects by reasoning about unit size.

Mental Actions on Objects:
Action schemes include the ability to iterate a mental unit through a perceptually-available object. The image of each placement can be maintained while the physical unit is moved to the next iterative position (initially with weaker constraints on this placement). Stronger constraints on object counting (counts all objects once and only once), and use of rows as an intuitive structure or explicit application of labeling as a marker, allow the child to keep track. These action schemes allow counting-all addition schemes to be applied to measures.
Notes on the psychological foundation: This level is foundational to the later development of the multiplicative transformation of lengths into volume measurement. This type of multiplicative structure is analogous to n times 1 cubic unit (e.g., 12 cubic units placed and counted one at a time can be thought of as 12 times 1 cubic unit).

Anticipated Misconceptions or Partial Conceptions:
May overgeneralize unit iteration of length to containers with non-uniform cross-sections, i.e., focusing only on one dimension (e.g., height) and not on variations in cross-sectional area. May not accurately convert units in a ratio other than 1:2. In packing or building, may not account for "internal/hidden" cubes. In comparing tasks, such as those described above, may not explicitly quantify the discrepancy.

Initial Composite 3-D Structurer

Developmental Progression:
Understands cubes as filling a space. Explicitly relates size and number of units to volume. Uses additive reasoning (e.g., skip-counting to obtain a total). Conversion of units develops to include ratios other than 1:2.
Filling. Relates number of cubes to cubic units as measured by capacity. Given a graduated cylinder marked in cubic-inch units, the child understands that sand filled to the 10 in the cylinder would fill a box that holds ten 1-inch cubes.
Packing. Begins to visualize and operate on composite units such as rows or columns (what we call a 1 x 1 x n core). Iterates to pack the space completely, accounting for "internal/hidden" cubes. Decomposes space, allowing for accurate use of units and subunits. Recognizes when a box is half full; visualizes remaining rows or columns.
Building. Develops more accurate counting strategies. Counts systematically, accounting for internal/hidden cubes, and moves to operating on composites, including rows and columns.
Comparing. Connects volume as number and volume as space. Develops a sense of conservation for the cases in which transformation is involved.

Mental Actions on Objects:
Builds, maintains, and manipulates mental images of composite 3-D shapes, structuring them as combinations of individual shapes and as a single entity: a row or column (a unit of units). Applies this composite unit repeatedly, but not necessarily exhaustively, as its application remains guided by intuition.
Notes on the psychological foundation: This level anticipates the later development of the multiplicative transformation of lengths into volume measurement. This type of multiplicative structure can be analogous to n rows times m cubic units per row (e.g., 12 cubic units built, packed, and/or counted, in part, as 3 rows with 4 cubic units per row).

Anticipated Misconceptions or Partial Conceptions:
May not recognize the disparities between the measure of length of an edge and the number of volume units fitting along that edge. Focuses on either a row or a column but cannot transition from one to the other. Does not visualize and operate on a layer as a composite.

3-D Row and Column Structurer

Developmental Progression:
Able to coordinate flexibly the filling, packing, and building aspects of volume. Shows a propensity for additive comparisons (e.g., "this one has 12 more") but may show some nascent multiplicative comparisons (e.g., "this one is four times as big"). Initially counts or computes (e.g., number of rows times number of columns) the number of cubes in one layer (1 x m x n), and then uses addition or skip counting by layers to determine the total volume. Eventually moves to multiplication (e.g., number of cubes in a layer times number of layers). Operates fluidly and flexibly on units (cubes), units of units (rows or columns), and units of units of units (layers). Composes and decomposes array ←→ layers ←→ rows/columns ←→ units. With perceptual support, can decompose 3-D arrays into other, complex 3-D arrays (not only layers, rows, or columns) and calculate the number of these smaller arrays in the larger array.

Mental Actions on Objects:
Builds, maintains, and manipulates mental images of composite units, operating on them as composites of individual shapes and as a single entity: a layer (a unit of units of units) of congruent cubes. Applies this composite unit repeatedly and exhaustively to fill the 3-D array, coordinating this movement in 1-1 correspondence with the elements of the orthogonal column. May apply a skip counting scheme to determine the volume. In a measurement context, applies the concept that the length of an edge specifies the number of volume units that will fit along that edge, but may need to create a perceptual layer to support the reasoning.
Notes on the psychological foundation: This level is the foundation for the multiplicative transformation of lengths into volume measurement. This type of multiplicative structure can be analogous to n layers times m cubic units per layer.

Anticipated Misconceptions or Partial Conceptions:
When decomposing 3-D arrays into complex arrays, may still make iterating or counting mistakes, especially if the decomposition requires mental transformations (e.g., rotations). Although the child visualizes and operates on horizontal or vertical layers, may not operate flexibly on both. When determining volume, may add the edge lengths rather than applying an appropriate mental structuring.

3-D Array Structurer

Developmental Progression:
Has an abstract understanding of the rectangular prism volume formula. Shows a propensity for multiplicative comparisons; coordinates multiplicative and additive comparisons flexibly. With linear measures or other similar indications of the three dimensions, multiplicatively iterates cubes in a row, column, and/or layer to determine volume. In multiple contexts, can compute the volume of rectangular prisms from their dimensions and explain how that multiplication creates a measure of volume. Develops the ability to visualize and operate on both horizontal and vertical layers, even without perceptual support.

Mental Actions on Objects:
Builds, maintains, and manipulates composites (a 3-D array: units of units of units of units) and operates on three dimensions. Mentally composes and decomposes
a) array ←→ layers ←→ rows/columns ←→ units
b) array ←→ smaller complex arrays ←→ units.
Can visualize and operate on arrays spatially and/or symbolically. Can connect (l x w) x h to the number of cubes in a horizontal layer times the number of horizontal layers, and l x (w x h) to the number of cubes in a vertical layer times the number of vertical layers. Curtails the process to use volume formulas with understanding. In a measurement context, applies the concept that the length of an edge specifies the number of volume units that will fit along that edge, even without perceptual support. Can decompose 3-D arrays into other, complex 3-D arrays (not only layers, rows, or columns) and calculate the number of these smaller arrays in the larger array (by using repeated addition, multiplication, or division). Coordinates the spatial and symbolic decompositions.
Notes on the psychological foundation: The previous levels were foundational for the development of the multiplicative transformation of lengths into volume measurement, which has been developed in tandem.
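The connection the 3-D Array Structurer level draws between layer structurings and the groupings of the volume formula can be sketched in a few lines (illustrative only; the 6 x 4 x 3 dimensions are borrowed from item AR-1):

```python
# Illustrative sketch: the rectangular prism volume formula grouped two
# ways, matching the layer structurings of the 3-D Array Structurer level.
l, w, h = 6, 4, 3  # length, width, height in unit cubes (as in item AR-1)

# (l x w) x h: cubes in one horizontal layer, times horizontal layers.
by_horizontal_layers = (l * w) * h

# l x (w x h): cubes in one vertical layer, times vertical layers.
by_vertical_layers = l * (w * h)

# Counting every unit cube one at a time gives the same total.
one_by_one = sum(1 for _ in range(l) for _ in range(w) for _ in range(h))

assert by_horizontal_layers == by_vertical_layers == one_by_one == 72
```

Associativity of multiplication is what lets the child treat either layering as the "same" volume, which is the curtailment the level describes.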
Appendix F
Rasch Modeling Results for All Items – Final Assessment from Longitudinal Study
Table F1
Table of Standardized Residual Variance for All Items

TABLE 23.0 Post-assessment Rasch Data Cleaned (r mathpost.outC Mar 13 7:57 2013Rasch.xlsx)
INPUT: 258 PERSON 52 ITEM REPORTED: 258 PERSON 52 ITEM 2 CATS WINSTEPS 3.75.1

Table of STANDARDIZED RESIDUAL variance (in Eigenvalue units)
                                                  -- Empirical --    Modeled
Total raw variance in observations  = 104.8   100.0%            100.0%
Raw variance explained by measures  =  53.8    51.3%             50.6%
Raw variance explained by persons   =  20.1    19.2%             18.9%
Raw variance explained by items     =  33.7    32.1%             31.7%
Raw unexplained variance (total)    =  51.0    48.7%  100.0%     49.4%
Unexplned variance in 1st contrast  =   2.4     2.3%    4.7%
Unexplned variance in 2nd contrast  =   1.9     1.8%    3.8%
Unexplned variance in 3rd contrast  =   1.9     1.8%    3.8%
Unexplned variance in 4th contrast  =   1.7     1.7%    3.4%
Unexplned variance in 5th contrast  =   1.7     1.6%    3.3%
Table F2
Item Statistics by Misfit Order for All Items

TABLE 10.1 Post-assessment Rasch Data Cleaned (r mathpost.outC Mar 13 7:57 2013Rasch.xlsx)
INPUT: 258 PERSON 52 ITEM REPORTED: 258 PERSON 52 ITEM 2 CATS WINSTEPS 3.75.1
PERSON: REAL SEP.: 1.84 REL.: .77 ... ITEM: REAL SEP.: 5.57 REL.: .97

ITEM STATISTICS: MISFIT ORDER
|ENTRY TOTAL TOTAL MODEL| INFIT | OUTFIT |PT-MEASURE |EXACT MATCH| |
|NUMBER SCORE COUNT MEASURE S.E. |MNSQ ZSTD|MNSQ ZSTD|CORR. EXP.| OBS% EXP%| ITEM |
| 40 37 96 .43 .25|1.21 1.7|2.52 5.1|A .35 .53| 73.1 74.5| PAC-4 |
| 16 12 84 3.61 .36|1.39 1.6|2.37 1.9|B .24 .47| 82.1 88.0| CLM-3 |
| 23 10 43 1.06 .44|1.28 1.2|1.91 1.3|C .34 .49| 78.0 82.3| ASC-3 |
| 46 103 127 -1.68 .26|1.01 .2|1.68 1.4|D .39 .44| 84.0 83.3| CRR-2 |
| 1 37 39 -4.89 .84|1.67 1.1|1.18 .6|E .14 .39| 89.5 94.8| LQR-1 |
| 45 104 131 -1.47 .26| .83 -1.2|1.65 1.3|F .54 .51| 88.3 84.0| CRR-1 |
| 17 10 86 4.17 .40|1.12 .5|1.63 .9|G .42 .46| 88.4 90.6| CR-1 |
| 31 38 46 -2.57 .44|1.20 .9|1.37 .8|H .30 .42| 81.0 82.6| CDC-1 |
| 39 28 73 .71 .29| .90 -.8|1.33 .9|I .54 .52| 77.5 74.2| PAC-2 |
| 13 58 76 -1.87 .31|1.21 1.3|1.31 .9|J .36 .48| 76.7 80.2| LURR-2|
| 43 5 43 4.42 .54|1.30 .9| .66 .0|K .35 .42| 86.0 89.4| AS-1 |
| 37 55 75 -1.63 .32|1.28 1.6|1.13 .4|L .43 .54| 71.2 80.7| CIC-1 |
| 19 2 46 6.30 .77|1.22 .6| .84 .4|M .18 .28| 95.7 95.6| ICPM-1|
| 22 4 45 5.24 .60| .88 -.2|1.22 .6|N .42 .41| 95.6 91.2| CAM-2 |
| 30 99 125 -1.55 .26|1.07 .5|1.21 .6|O .43 .47| 83.1 83.2| AURR-3|
| 10 72 96 -1.17 .29|1.11 .7|1.12 .4|P .48 .53| 78.7 82.2| EE-3 |
| 15 34 89 1.81 .27|1.07 .6|1.08 .4|Q .54 .57| 75.3 77.0| CLM-1 |
| 9 67 75 -2.97 .40|1.08 .4|1.02 .3|R .28 .32| 88.9 88.9| EE-2 |
| 47 26 83 2.28 .29|1.05 .4| .95 .0|S .54 .56| 74.7 79.1| PS-1 |
| 38 68 94 -1.41 .26|1.05 .5| .96 .0|T .44 .46| 71.4 76.5| CIC-2 |
| 6 76 91 -2.36 .34|1.05 .3| .76 -.4|U .50 .51| 86.4 87.0| ILC-3 |
| 29 111 133 -1.83 .28|1.03 .3| .93 .0|V .44 .46| 85.3 85.9| AURR-2|
| 44 21 46 2.29 .36|1.01 .1| .88 -.3|W .57 .56| 76.1 75.0| AS-2 |
| 12 66 98 -.72 .26| .98 -.2| .99 .1|X .54 .54| 80.2 77.4| LURR-1|
| 24 35 42 -2.75 .46| .99 .0| .70 -.4|Y .43 .40| 82.1 82.7| ASC-4 |
| 28 109 127 -2.27 .30| .98 .0| .65 -.6|Z .47 .45| 85.7 88.2| PC-2 |
| 3 38 39 -5.05 1.02| .98 .3| .47 .0|y .18 .13| 97.3 97.3| LDC-1 |
| 33 66 83 -.94 .31| .98 -.1| .92 .0|x .41 .40| 80.7 81.3| PRS-1 |
| 18 26 87 2.26 .28| .97 -.2| .90 -.2|w .55 .53| 79.3 79.0| CR-2 |
| 21 29 44 .87 .38| .96 -.1| .85 -.2|v .55 .53| 72.7 77.1| CAM-1 |
| 42 16 45 2.53 .38| .96 -.1| .89 -.2|u .57 .55| 73.3 77.4| AC-2 |
| 11 69 87 -.83 .32| .95 -.3| .92 .1|t .50 .49| 89.7 83.1| EE-5 |
| 50 16 84 3.13 .35| .93 -.2| .95 .1|s .57 .55| 86.9 86.6| VRCS-2|
| 5 60 78 -1.85 .30| .94 -.3| .75 -.5|r .46 .42| 80.3 78.9| ILC-1 |
| 34 58 90 .30 .27| .80 -1.6| .92 -.1|q .62 .55| 88.9 78.5| PRS-2 |
| 41 41 44 -1.64 .64| .90 -.1| .30 -.5|p .39 .30| 93.2 93.2| AC-1 |
| 36 50 87 .65 .26| .89 -1.0| .75 -.8|o .58 .52| 78.2 74.0| ARCS-2|
| 32 29 39 -2.31 .43| .88 -.5| .69 -.7|n .58 .51| 84.2 79.7| CDC-2 |
| 49 16 88 3.32 .33| .88 -.6| .69 -.5|m .54 .48| 88.6 85.6| VRCS-1|
| 26 26 39 -1.78 .38| .88 -.8| .78 -.9|l .52 .42| 78.4 71.2| SS-2 |
| 14 75 84 -1.61 .39| .86 -.5| .62 -.3|k .41 .35| 91.7 89.4| LURR-3|
| 8 69 89 -1.78 .30| .86 -.9| .78 -.4|j .54 .48| 86.4 81.5| SO-2 |
| 35 27 86 2.21 .29| .86 -1.0| .62 -1.2|i .63 .55| 80.2 79.1| ARCS-1|
| 48 26 90 2.35 .29| .84 -1.1| .68 -.9|h .63 .55| 83.3 80.5| PS-2 |
| 27 112 131 -1.95 .29| .83 -1.0| .61 -.7|g .49 .42| 91.3 87.0| PC-1 |
| 20 12 43 2.92 .41| .82 -.8| .56 -1.0|f .64 .54| 81.4 80.1| ICPM-2|
| 25 39 46 -2.95 .51| .82 -.5| .47 -.7|e .62 .53| 86.0 87.7| SS-1 |
| 7 74 80 -3.54 .46| .74 -.8| .26 -1.1|d .49 .33| 93.4 92.5| SO-1 |
| 2 42 46 -3.54 .56| .73 -.7| .30 -.8|c .47 .32| 90.5 90.4| LQR-2 |
| 51 9 43 3.47 .44| .70 -1.3| .44 -.9|b .65 .50| 86.0 83.6| AR-1 |
| 52 7 46 4.58 .48| .65 -1.3| .56 -.4|a .61 .47| 93.5 87.0| AR-2 |
| MEAN 45.5 74.5 -.14 .42| .99 .0| .96 .1| | 83.7 83.5| |
| S.D. 30.5 28.3 2.88 .26| .19 .8| .46 1.0| | 6.7 6.1| |
Table F3
Summary of Measured (Non-extreme) Person for All Items

TABLE 3.1 Post-assessment Rasch Data Cleaned (re mathpost.outO Mar 13 7:57 2013asch.xlsx)
INPUT: 258 PERSON 52 ITEM REPORTED: 258 PERSON 52 ITEM 2 CATS WINSTEPS 3.75.1

SUMMARY OF 253 MEASURED (NON-EXTREME) PERSON
|      TOTAL SCORE  COUNT  MEASURE  MODEL ERROR | INFIT MNSQ ZSTD | OUTFIT MNSQ ZSTD |
| MEAN   9.1  15.0    .50   .78 |  .97   .0 | 1.00   .2 |
| S.D.   2.7    .8   1.83   .12 |  .43  1.0 | 1.12   .8 |
| MAX.  15.0  16.0   5.46  1.17 | 2.74  2.9 | 7.83  3.2 |
| MIN.   1.0  14.0  -4.92   .60 |  .21 -2.1 |  .12 -1.5 |
| REAL RMSE .84 TRUE SD 1.62 SEPARATION 1.93 PERSON RELIABILITY .79 |
|MODEL RMSE .78 TRUE SD 1.65 SEPARATION 2.11 PERSON RELIABILITY .82 |
| S.E. OF PERSON MEAN = .12 |
MAXIMUM EXTREME SCORE: 5 PERSON
VALID RESPONSES: 29.1% (APPROXIMATE)

SUMMARY OF 258 MEASURED (EXTREME AND NON-EXTREME) PERSON
| MEAN   9.2  15.0    .54   .80 |  .97   .0 | 1.00   .2 |
| S.D.   2.7    .8   1.84   .20 |  .43  1.0 | 1.12   .8 |
| MAX.  15.0  16.0   5.46  1.92 | 2.74  2.9 | 7.83  3.2 |
| MIN.   1.0  14.0  -4.92   .60 |  .21 -2.1 |  .12 -1.5 |
| REAL RMSE .88 TRUE SD 1.61 SEPARATION 1.84 PERSON RELIABILITY .77 |
|MODEL RMSE .82 TRUE SD 1.64 SEPARATION 2.00 PERSON RELIABILITY .80 |
| S.E. OF PERSON MEAN = .11 |
PERSON RAW SCORE-TO-MEASURE CORRELATION = .73 (approximate due to missing data)
CRONBACH ALPHA (KR-20) PERSON RAW SCORE "TEST" RELIABILITY = .72 (approximate due to missing data)

SUMMARY OF 51 MEASURED (NON-EXTREME) ITEM
| MEAN  45.5  75.0    .00   .39 |  .99   .0 |  .96   .1 |
| S.D.  30.8  28.3   2.74   .16 |  .19   .8 |  .46  1.0 |
| MAX. 112.0 133.0   6.30  1.02 | 1.67  1.7 | 2.52  5.1 |
| MIN.   2.0  39.0  -5.05   .25 |  .65 -1.6 |  .26 -1.2 |
| REAL RMSE .44 TRUE SD 2.70 SEPARATION 6.10 ITEM RELIABILITY .97 |
|MODEL RMSE .42 TRUE SD 2.70 SEPARATION 6.43 ITEM RELIABILITY .98 |
| S.E. OF ITEM MEAN = .39 |
MAXIMUM EXTREME SCORE: 1 ITEM
UMEAN=.0000 USCALE=1.0000

SUMMARY OF 52 MEASURED (EXTREME AND NON-EXTREME) ITEM
| MEAN  45.5  74.5   -.14   .42 |  .97   .0 | 1.00   .2 |
| S.D.  30.5  28.3   2.88   .26 |  .19   .8 |  .46  1.0 |
| MAX. 112.0 133.0   6.30  1.88 | 2.74  2.9 | 7.83  3.2 |
| MIN.   2.0  39.0  -7.20   .25 |  .65 -1.6 |  .26 -1.2 |
| REAL RMSE .51 TRUE SD 2.84 SEPARATION 5.57 ITEM RELIABILITY .97 |
|MODEL RMSE .49 TRUE SD 2.84 SEPARATION 5.79 ITEM RELIABILITY .97 |
| S.E. OF ITEM MEAN = .40 |
ITEM RAW SCORE-TO-MEASURE CORRELATION = -.62 (approximate due to missing data)
3760 DATA POINTS. LOG-LIKELIHOOD CHI-SQUARE: 2796.08 with 3457 d.f. p=1.0000
Global Root-Mean-Square Residual (excluding extreme scores): .3420
Capped Binomial Deviance = .1572 for 3870.0 dichotomous observations
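The separation and reliability coefficients reported in these summaries are linked by a standard Rasch identity, reliability = separation^2 / (1 + separation^2). A quick sketch checking it against the reported person and item statistics:

```python
def reliability_from_separation(sep):
    """Standard Rasch relation between the separation index and the
    reliability coefficient: rel = sep^2 / (1 + sep^2)."""
    return sep**2 / (1 + sep**2)

# Person: REAL SEP. 1.84 corresponds to the reported reliability of .77.
assert abs(reliability_from_separation(1.84) - 0.77) < 0.01

# Item: REAL SEP. 5.57 corresponds to the reported reliability of .97.
assert abs(reliability_from_separation(5.57) - 0.97) < 0.01
```

The check confirms the internal consistency of the tabled values; it does not recompute them from the raw data.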
Table F4 – Summary of Category Structure for All Items
TABLE 3.2 Post-assessment Rasch Data Cleaned (re mathpost.outO Mar 13 7:57 2013asch.xlsx
INPUT: 258 PERSON 52 ITEM REPORTED: 258 PERSON 52 ITEM 2 CATS WINSTEPS 3.75.1 ----------------------------------------------------------------------------------------SUMMARY OF CATEGORY STRUCTURE. Model="R" ----------------------------------------------------------------------|CATEGORY OBSERVED|OBSVD SAMPLE|INFIT OUTFIT| COHERENCE |ESTIM| |LABEL SCORE COUNT %|AVRGE EXPECT| MNSQ MNSQ| M->C C->M RMSR |DISCR| |-------------------+------------+------------+-----------------+-----| | 0 0 1508 39| -1.45 -1.44| .99 .94| 82% 76% .4042| | 0 | 1 1 2365 61| 2.12 2.11| .98 1.10| 85% 89% .2930| 1.01| 1 |-------------------+------------+------------+-----------------+-----| |MISSING 9143 70| .42 | | | | ----------------------------------------------------------------------OBSERVED AVERAGE is mean of measures in category. It is not a parameter estimate. M->C = Does Measure imply Category? C->M = Does Category imply Measure?
[Dichotomous category probability curves for all items: probability of response (.0 to 1.0) on the vertical axis against person-minus-item measure (-2 to 2) on the horizontal axis. The category 0 and category 1 curves cross at probability .5, where the person measure equals the item measure.]

Figure F1. Probability Curves for All Items
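The shape of the curves in Figure F1 follows directly from the dichotomous Rasch model, under which the probability of a correct response depends only on the difference between person measure θ and item measure b. A minimal sketch (illustrative, not part of the WINSTEPS output):

```python
import math

def rasch_p(theta, b):
    """P(correct) under the dichotomous Rasch model: logistic in (theta - b)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# At theta == b the category 0 and category 1 curves cross at .5,
# the crossing point visible at the center of Figure F1.
assert rasch_p(0.0, 0.0) == 0.5
# Two logits above the item difficulty, success probability is about .88.
print(round(rasch_p(2.0, 0.0), 2))  # 0.88
```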
P a g e |154 TABLE 12.2 Post-assessment Rasch Data Cleaned (r mathpost.outC Mar 13 7:57 2013Rasch.xlsx INPUT: 258 PERSON 52 ITEM REPORTED: 258 PERSON 52 ITEM 2 CATS WINSTEPS 3.75.1 ----------------------------------------------------------------------------------------MEASURE PERSON - MAP - ITEM | 7 .## + | | | ICPM-1 6 + | . |T .# | CAM-2 5 + . | .# | AR-2 AS-1 .# T| CR-1 4 .# + # | # | AR-1 CLM-3 # | VRCS-1 VRCS-2 3 .## + ICPM-2 ## |S .### | AC-2 ### S| ARCS-1 AS-2 CR-2 PS-1 PS-2 2 .##### + ##### | CLM-1 .###### | .###### | 1 #### + ASC-3 ########### | ARCS-2 CAM-1 PAC-2 .#### M| PAC-4 .######## | PRS-2 0 ######## +M .#### | .###### | ##### | EE-5 LURR-1 -1 .######## + PRS-1 #### S| EE-3 ###### | AURR-3 CIC-2 CRR-1 LURR-3 .# | AC-1 AURR-2 CIC-1 CRR-2 ILC-1 LURR-2 SO-2 SS-2 -2 ## + PC-1 .### | CDC-2 ILC-3 PC-2 | CDC-1 # |S ASC-4 -3 .# + EE-2 SS-1 . T| | LQR-2 SO-1 | -4 + . | | | -5 # + LDC-1 LQR-1 | |T | -6 + LDC-2 | EACH "#" IS 2.
EACH "." IS 1.
Figure F2. Item-Person Map for All Items
Appendix G
Rasch Modeling Results for Volume Items – Final Assessment from Longitudinal Study
Table G1 – Table of Standardized Residual Variance for Volume Items
TABLE 23.2 Post-assessment Rasch Data Cleaned (r ZOU276WS.TXTd Apr 25 6:30 2013ed.xlsx
INPUT: 258 PERSON 14 ITEM REPORTED: 257 PERSON 14 ITEM 2 CATS WINSTEPS 3.75.0

Table of STANDARDIZED RESIDUAL variance (in Eigenvalue units)
                                              -- Empirical --         Modeled
Total raw variance in observations  =  46.5   100.0%                  100.0%
Raw variance explained by measures  =  32.5    69.9%                   67.0%
Raw variance explained by persons   =  13.3    28.6%                   27.4%
Raw variance explained by items     =  19.2    41.3%                   39.6%
Raw unexplained variance (total)    =  14.0    30.1%   100.0%          33.0%
Unexplned variance in 1st contrast  =   1.6     3.5%    11.5%
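The percentage columns in Table G1 are each component's share of the total raw variance in eigenvalue units; the first contrast is additionally reported as a share of the unexplained variance. A quick arithmetic check, with values transcribed from the table above:

```python
total = 46.5          # total raw variance in observations (eigenvalue units)
measures = 32.5       # raw variance explained by measures
persons, items = 13.3, 19.2
unexplained = total - measures  # 14.0, the unexplained remainder

def share(part, whole):
    """Percentage share, rounded as in the WINSTEPS output."""
    return round(100 * part / whole, 1)

print(share(measures, total))      # 69.9, matching the Empirical column
print(share(unexplained, total))   # 30.1
print(share(persons, total), share(items, total))  # 28.6 41.3
```

The first-contrast figure (1.6 of 14.0 unexplained) reproduces the reported 11.5% only to rounding, since 1.6 is itself a rounded eigenvalue.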
Table G2 – Table of Standardized Residual Loadings for Volume Items
TABLE 23.3 Post-assessment Rasch Data Cleaned (r ZOU276WS.TXTd Apr 25 6:30 2013ed.xlsx
INPUT: 258 PERSON 14 ITEM REPORTED: 257 PERSON 14 ITEM 2 CATS WINSTEPS 3.75.0

CONTRAST 1 FROM PRINCIPAL COMPONENT ANALYSIS
STANDARDIZED RESIDUAL LOADINGS FOR ITEM (SORTED BY LOADING)
---------------------------------------------------
|CON-  |       |        INFIT OUTFIT| ENTRY        |
| TRAST|LOADING|MEASURE MNSQ  MNSQ  |NUMBER ITEM   |
|------+-------+--------------------+--------------|
| 1    |  .71  | -3.13   .98  1.57  |A  4   CIC-2  |
| 1    |  .36  | -4.31   .99   .43  |B  3   CIC-1  |
| 1    |  .31  |  3.74   .88   .55  |C 10   PS-2   |
| 1    |  .25  | -5.82   .96   .49  |D  1   CDC-1  |
| 1    |  .20  |  5.33  1.04   .58  |E 13   AR-1   |
| 1    |  .14  | -3.36  1.10  9.90  |F  8   CRR-2  |
|------+-------+--------------------+--------------|
| 1    | -.51  |  4.60   .85   .67  |a 11   VRCS-1 |
| 1    | -.41  | -1.23   .84   .78  |b  6   PAC-4  |
| 1    | -.34  |  2.94   .93   .71  |c  9   PS-1   |
| 1    | -.26  | -1.21   .58   .80  |d  5   PAC-2  |
| 1    | -.24  |  4.30   .93  7.79  |e 12   VRCS-2 |
| 1    | -.21  | -5.22   .77   .20  |f  2   CDC-2  |
| 1    | -.16  |  6.85  1.08   .73  |g 14   AR-2   |
| 1    | -.08  | -3.49  1.10  9.90  |G  7   CRR-1  |
---------------------------------------------------
Table G3 – Item Statistics by Misfit Order for Volume Items
TABLE 10.1 Post-assessment Rasch Data Cleaned (r ZOU276WS.TXTd Apr 25 6:30 2013ed.xlsx
INPUT: 258 PERSON 14 ITEM REPORTED: 257 PERSON 14 ITEM 2 CATS WINSTEPS 3.75.0
PERSON: REAL SEP.: 1.40 REL.: .66 ... ITEM: REAL SEP.: 9.76 REL.: .99
ITEM STATISTICS: MISFIT ORDER
-------------------------------------------------------------------------------------------|ENTRY TOTAL TOTAL MODEL| INFIT | OUTFIT |PT-MEASURE |EXACT MATCH| | |NUMBER SCORE COUNT MEASURE S.E. |MNSQ ZSTD|MNSQ ZSTD|CORR. EXP.| OBS% EXP%| ITEM | |------------------------------------+----------+----------+-----------+-----------+-------| | 7 103 128 -3.49 .30|1.10 .7|9.90 8.3|A .49 .54| 84.3 85.0| CRR-1 | | 8 101 122 -3.36 .32|1.10 .6|9.90 8.1|B .47 .54| 87.9 86.9| CRR-2 | | 12 16 75 4.30 .42| .93 -.3|7.79 3.5|C .63 .65| 90.0 87.6| VRCS-2| | 4 156 183 -3.13 .29| .98 -.1|1.57 1.1|D .52 .54| 90.0 88.0| CIC-2 | | 14 7 37 6.85 .79|1.08 .3| .73 .3|E .66 .68| 93.8 93.7| AR-2 | | 13 9 36 5.33 .60|1.04 .2| .58 -.1|F .70 .70| 87.1 87.0| AR-1 | | 3 143 158 -4.31 .37| .99 .0| .43 -.7|G .49 .48| 93.0 92.1| CIC-1 | | 1 211 219 -5.82 .45| .96 .0| .49 -1.1|g .32 .32| 96.8 96.7| CDC-1 | | 9 26 76 2.94 .38| .93 -.2| .71 -.1|f .73 .72| 88.4 86.8| PS-1 | | 10 22 81 3.74 .38| .88 -.6| .55 -.2|e .69 .67| 90.7 85.1| PS-2 | | 11 16 80 4.60 .42| .85 -.6| .67 -.1|d .65 .63| 91.9 89.4| VRCS-1| | 6 126 185 -1.23 .26| .84 -1.2| .78 -.3|c .70 .69| 91.5 85.8| PAC-4 | | 5 114 158 -1.21 .30| .58 -2.8| .80 -.1|b .75 .70| 96.9 88.5| PAC-2 | | 2 200 211 -5.22 .40| .77 -.8| .20 -2.0|a .40 .36| 95.6 95.5| CDC-2 | |------------------------------------+----------+----------+-----------+-----------+-------| | MEAN 89.3 124.9 .00 .41| .93 -.3|2.51 1.2| | 91.3 89.2| | | S.D. 70.3 59.7 4.26 .13| .14 .8|3.54 3.1| | 3.6 3.7| | --------------------------------------------------------------------------------------------
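In Table G3, CRR-1 and CRR-2 show acceptable INFIT (about 1.10) but extreme OUTFIT (9.90). OUTFIT is an unweighted mean of squared standardized residuals, so a few surprising responses from persons far from the item inflate it, while information-weighted INFIT discounts them. A sketch of the standard mean-square definitions (illustrative, not the WINSTEPS source; the response triples are invented):

```python
import math

def p_rasch(theta, b):
    """Dichotomous Rasch success probability."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def infit_outfit(responses):
    """responses: (x, theta, b) triples for one item across persons."""
    zsq, resid_sq, info = [], [], []
    for x, theta, b in responses:
        p = p_rasch(theta, b)
        w = p * (1.0 - p)              # model variance (information) of the response
        zsq.append((x - p) ** 2 / w)   # squared standardized residual
        resid_sq.append((x - p) ** 2)
        info.append(w)
    outfit = sum(zsq) / len(zsq)       # unweighted mean square: outlier-sensitive
    infit = sum(resid_sq) / sum(info)  # information-weighted: outlier-resistant
    return infit, outfit

# Two on-target responses fit perfectly: both mean squares equal 1 ...
print(infit_outfit([(1, 0.0, 0.0), (0, 0.0, 0.0)]))
# ... but one surprise success by a person 4 logits below the item
# inflates OUTFIT far more than INFIT.
print(infit_outfit([(1, 0.0, 0.0), (0, 0.0, 0.0), (1, -4.0, 0.0)]))
```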
P a g e |158 Table G4 – Summary of Measured (non-extreme) Persons for Volume Items Summary of Measured (non-extreme) Persons for Volume Items TABLE 3.1 Post-assessment Rasch Data Cleaned (re ZOU276WS.TXTy Apr 25 6:30 2013d.xlsx INPUT: 258 PERSON 14 ITEM REPORTED: 257 PERSON 14 ITEM 2 CATS WINSTEPS 3.75.0 -------------------------------------------------------------------------------------SUMMARY OF 216 MEASURED (NON-EXTREME) PERSON ------------------------------------------------------------------------------| TOTAL MODEL INFIT OUTFIT | | SCORE COUNT MEASURE ERROR MNSQ ZSTD MNSQ ZSTD | |-----------------------------------------------------------------------------| | MEAN 4.7 6.9 -.31 1.51 .90 -.2 .88 -.1 | | S.D. 2.3 2.3 2.94 .39 1.34 1.1 2.09 1.4 | | MAX. 9.0 10.0 5.88 4.20 9.68 4.2 9.90 9.9 | | MIN. 1.0 4.0 -5.39 1.07 .02 -1.3 .01 -1.2 | |-----------------------------------------------------------------------------| | REAL RMSE 1.88 TRUE SD 2.26 SEPARATION 1.20 PERSON RELIABILITY .59 | |MODEL RMSE 1.56 TRUE SD 2.49 SEPARATION 1.60 PERSON RELIABILITY .72 | | S.E. OF PERSON MEAN = .20 | ------------------------------------------------------------------------------MAXIMUM EXTREME SCORE: 36 PERSON MINIMUM EXTREME SCORE: 5 PERSON LACKING RESPONSES: 1 PERSON VALID RESPONSES: 49.3% (APPROXIMATE) SUMMARY OF 257 MEASURED (EXTREME AND NON-EXTREME) PERSON ------------------------------------------------------------------------------| TOTAL MODEL INFIT OUTFIT | | SCORE COUNT MEASURE ERROR MNSQ ZSTD MNSQ ZSTD | |-----------------------------------------------------------------------------| | MEAN 4.9 6.8 .00 1.59 | | S.D. 2.5 2.4 3.28 .40 | | MAX. 10.0 10.0 7.90 4.20 | | MIN. .0 4.0 -7.11 1.07 .02 -1.3 .01 -1.2 | |-----------------------------------------------------------------------------| | REAL RMSE 1.91 TRUE SD 2.67 SEPARATION 1.40 PERSON RELIABILITY .66 | |MODEL RMSE 1.64 TRUE SD 2.84 SEPARATION 1.73 PERSON RELIABILITY .75 | | S.E. 
OF PERSON MEAN = .20 | ------------------------------------------------------------------------------PERSON RAW SCORE-TO-MEASURE CORRELATION = .94 (approximate due to missing data) CRONBACH ALPHA (KR-20) PERSON RAW SCORE "TEST" RELIABILITY = .70 (approximate due to missing data SUMMARY OF 14 MEASURED (NON-EXTREME) ITEM ------------------------------------------------------------------------------| TOTAL MODEL INFIT OUTFIT | | SCORE COUNT MEASURE ERROR MNSQ ZSTD MNSQ ZSTD | |-----------------------------------------------------------------------------| | MEAN 89.3 124.9 .00 .41 .93 -.3 2.51 1.2 | | S.D. 70.3 59.7 4.26 .13 .14 .8 3.54 3.1 | | MAX. 211.0 219.0 6.85 .79 1.10 .7 9.90 8.3 | | MIN. 7.0 36.0 -5.82 .26 .58 -2.8 .20 -2.0 | |-----------------------------------------------------------------------------| | REAL RMSE .43 TRUE SD 4.23 SEPARATION 9.76 ITEM RELIABILITY .99 | |MODEL RMSE .43 TRUE SD 4.23 SEPARATION 9.92 ITEM RELIABILITY .99 | | S.E. OF ITEM MEAN = 1.18 | ------------------------------------------------------------------------------UMEAN=.0000 USCALE=1.0000 ITEM RAW SCORE-TO-MEASURE CORRELATION = -.95 (approximate due to missing data) 1491 DATA POINTS. LOG-LIKELIHOOD CHI-SQUARE: 684.14 with 1262 d.f. p=1.0000 Global Root-Mean-Square Residual (excluding extreme scores): .2563 Capped Binomial Deviance = .0848 for 1749.0 dichotomous observations
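WINSTEPS derives the SEPARATION index it reports above as the ratio of the error-adjusted ("true") SD to the RMSE, and RELIABILITY as that separation converted to a proportion of true variance. A quick check against the person figures in the summary above (REAL RMSE 1.91, TRUE SD 2.67; agreement is to rounding):

```python
def separation(true_sd, rmse):
    """Spread of 'true' measures in standard-error units."""
    return true_sd / rmse

def reliability(sep):
    """True variance as a proportion of observed variance."""
    return sep ** 2 / (1.0 + sep ** 2)

sep = separation(2.67, 1.91)  # person figures transcribed from Table G4
print(round(sep, 2), round(reliability(sep), 2))  # ~1.4 and .66, as reported
```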
TABLE 3.2 Post-assessment Rasch Data Cleaned (re ZOU276WS.TXTy Apr 25 6:30 2013d.xlsx
INPUT: 258 PERSON 14 ITEM REPORTED: 257 PERSON 14 ITEM 2 CATS WINSTEPS 3.75.0

SUMMARY OF CATEGORY STRUCTURE. Model="R"
-----------------------------------------------------------------------
|CATEGORY OBSERVED|OBSVD SAMPLE|INFIT OUTFIT|    COHERENCE    |ESTIM|
|LABEL SCORE COUNT %|AVRGE EXPECT| MNSQ  MNSQ| M->C C->M RMSR |DISCR|
|-------------------+------------+------------+----------------+-----|
|  0    0    499 29| -2.46 -2.46|  .91  4.15| 88%  87% .3342|     | 0
|  1    1   1250 71|  4.41  4.41|  .92  1.95| 94%  94% .2096| 1.01| 1
|-------------------+------------+------------+----------------+-----|
|MISSING    1533 47| -2.75      |            |                 |     |
-----------------------------------------------------------------------
OBSERVED AVERAGE is mean of measures in category. It is not a parameter estimate.
M->C = Does Measure imply Category? C->M = Does Category imply Measure?

[Dichotomous category probability curves for volume items: probability of response (.0 to 1.0) against person-minus-item measure (-2 to 2); the category 0 and category 1 curves cross at probability .5.]
Figure G1. Summary of Category Structure and Probability Curves for Volume Items
TABLE 12.2 Post-assessment Rasch Data Cleaned (r ZOU276WS.TXTd Apr 25 6:30 2013ed.xlsx
INPUT: 258 PERSON 14 ITEM REPORTED: 257 PERSON 14 ITEM 2 CATS WINSTEPS 3.75.0

[Item-person (Wright) map on a measure scale from 7 down to -6. Items in order from hardest to easiest: AR-2, AR-1, VRCS-1, VRCS-2, PS-2, PS-1, PAC-2, PAC-4, CIC-2, CRR-2, CRR-1, CIC-1, CDC-2, CDC-1, consistent with the measures in Table G3. The person distribution is shown alongside, with each "#" representing 3 persons and each "." 1 to 2 persons.]
Figure G2. Item-Person Map for Volume Items
Appendix H
Assessment Items Used for Evaluation of Revised Developmental Progression

Items Designed to Assess Filling Volume

[The item prompts in this section were set in a symbol font and are illegible in the source; the recoverable scoring keys are listed in the order they appear.]

Correct - Completely fills container using scoop
Correct - Talks about cup interior or what it will hold
Correct - Completely fills container using scoop
Correct - Fills container to the halfway point
Correct - Estimate between 2 and 4
Correct - 10
Correct - 4 (or 5 if includes bucket already poured)
Correct - 8 or 9
Correct - 8
Correct - 3 x 2 x 2 box
Items Designed to Assess Packing Volume

[As above, most item prompts are illegible in the source; the legible fragments and scoring keys are listed in the order they appear.]

Correct - 4 or 8
Correct - Child refers to interior of box
Correct - 3 or 6

How many blocks this size [indicate] do you think would be needed to measure the space inside the box? Can you show me how you would use these blocks to check your answer? (Give child the blocks.)

Correct - 24
Correct - 4
Correct - 6
Correct - 36
Correct - 36
Correct - 16
Correct - 54 or 80

[Diagram of a box with faces labeled Top, Right side, and Front]

Correct - 24
Correct - 4 x 3 x 4 or 2 x 6 x 4 or 2 x 3 x 8
Correct - 30
Correct - 72
Appendix I
Randomized Order of Items per Assessment Form

Form A: 41 21 45 42 38 29 31 2 33 5 39 30 27 9 7 46 44 11 18 24 22 8 36 1 16 40 26 6 23 15 13 12 4 37 25 35 3 14 34 19 32 20 28 43 10 17

Form B: 23 36 34 44 27 39 18 38 4 41 1 6 24 25 11 9 29 16 12 46 19 26 15 37 2 31 45 17 40 28 22 43 10 5 35 3 21 14 7 13 33 20 42 32 30 8

Form C: 31 25 24 14 12 28 30 8 21 43 27 37 2 3 16 6 42 33 7 13 44 1 22 41 38 26 15 11 20 34 36 17 4 39 19 5 45 32 29 9 35 18 46 10 23 40
Appendix J
Rasch Modeling Results from Instrument Used to Evaluate Revised Developmental Progression

Table J1 – Item Statistics: Measure order
TABLE 13.1 2014-06-19 Assessment Results (fixed) ZOU520WS.TXT Jul 6 9:09 2014
INPUT: 82 PERSON 44 ITEM REPORTED: 82 PERSON 44 ITEM 5 CATS WINSTEPS 3.75.0
PERSON: REAL SEP.: 2.95 REL.: .90 ... ITEM: REAL SEP.: 5.98 REL.: .97
ITEM STATISTICS: MEASURE ORDER
--------------------------------------------------------------------------------------------------|ENTRY TOTAL TOTAL MODEL| INFIT | OUTFIT |PT-MEASURE |EXACT MATCH| | |NUMBER SCORE COUNT MEASURE S.E. |MNSQ ZSTD|MNSQ ZSTD|CORR. EXP.| OBS% EXP%| ITEM | |------------------------------------+----------+----------+-----------+-----------+--------------| | 41 10 81 1.01 .17| .88 .0| .57 .1| .22 .21| 91.4 91.9| 45 - VRCS-4B | | 16 16 81 .88 .14| .85 -.2| .46 .0| .34 .25| 90.1 88.5| 16 - VURR-P1 | | 18 16 81 .88 .14| .93 .0| .38 -.1| .32 .26| 87.7 86.4| 18 - ICS-P1 | | 17 20 80 .81 .12|1.06 .3|2.78 1.4| .20 .28| 81.3 82.3| 17 - VURR-P2B| | 28 26 81 .73 .11| .66 -1.2| .29 -.6| .47 .31| 80.2 78.3| 28 - ICS-B1 | | 8 32 82 .66 .10|1.36 1.3|2.65 1.6| .07 .34| 68.3 73.4| 8 - VURR-F3 | | 39 32 81 .66 .10|1.27 1.0|1.01 .3| .23 .34| 71.6 73.2| 41 - ICS-C2 | | 19 32 80 .66 .10| .99 .1|1.33 .6| .35 .35| 75.0 72.9| 19 - ICS-P2B | | 29 36 81 .62 .10| .98 .0| .52 -.4| .41 .36| 69.1 67.3| 29 - ICS-B2B | | 10 40 80 .58 .09|1.15 .7|1.33 .6| .31 .38| 58.8 60.1| 10 - ICS-F2 | | 15 48 80 .51 .09| .93 -.3|1.08 .3| .42 .41| 56.3 57.3| 15 - VURR-P3 | | 40 52 82 .49 .09|1.08 .4|1.36 .7| .34 .41| 51.2 53.6| 42 - VRCS-1B | | 37 52 81 .49 .09|1.09 .5| .81 -.1| .39 .41| 54.3 54.2| 38 - VURR-C1 | | 25 64 81 .41 .08| .92 -.4| .69 -.4| .50 .44| 48.1 47.0| 25 - VURR-B1B| | 38 70 80 .36 .08| .72 -1.8| .65 -.6| .59 .46| 41.3 40.7| 40 - ICS-C1 | | 43 72 80 .34 .08| .95 -.2| .66 -.6| .51 .46| 42.5 40.6| 47 - VQ-P3 | | 26 75 80 .33 .08| .65 -2.5| .47 -1.2| .64 .47| 41.3 40.5| 26 - VURR-B2 | | 27 79 82 .32 .08|1.00 .0|1.39 .9| .43 .47| 36.6 39.4| 27 - VURR-B3B| | 7 84 82 .29 .07|1.50 3.0|1.81 1.7| .21 .48| 34.1 38.2| 7 - VURR-F2 | | 9 84 81 .28 .07|1.16 1.1|1.17 .5| .41 .48| 37.0 38.6| 9 - ICS-F1 | | 24 92 82 .25 .07| .81 -1.5| .65 -.9| .60 .49| 36.6 35.7| 24 - VQ-B2 |
P a g e |169 | 31 96 82 .23 .07|1.34 2.4|1.74 1.8| .32 .50| 34.1 34.5| 31 - VURR-C3B| | 13 108 80 .15 .07|1.01 .1| .92 -.1| .52 .52| 31.3 31.5| 13 - VF-P2 | | 35 136 81 .03 .07| .75 -2.2| .64 -1.5| .68 .55| 23.5 23.1| 36 - VQ-C2B | | 36 148 81 -.03 .07|1.24 1.8|1.16 .7| .45 .55| 18.5 18.7| 37 - VQ-C3B | | 34 148 80 -.04 .07| .73 -2.3| .58 -1.9| .70 .56| 20.0 18.9| 34 - VF-C3B | | 32 168 80 -.13 .07| .95 -.4|1.07 .4| .58 .56| 20.0 21.1| 32 - VF-C1 | | 44 180 82 -.16 .07| .90 -.8| .78 -1.0| .62 .56| 24.4 22.7| 48 - VQ-B3 | | 42 192 81 -.23 .07| .87 -1.0| .93 -.2| .62 .56| 19.8 21.0| 46 - VQ-P2 | | 6 208 81 -.31 .07| .66 -2.7| .52 -2.1| .73 .55| 27.2 25.5| 6 - VURR-F1 | | 14 208 81 -.31 .07| .87 -.9|1.25 .9| .57 .55| 28.4 25.5| 14 - VQ-P1 | | 5 216 81 -.35 .07| .86 -1.0| .73 -.9| .62 .55| 32.1 31.1| 5 - VQ-F2B | | 4 248 80 -.55 .08|1.41 2.2|1.76 1.6| .30 .50| 45.0 48.3| 4 - VQ-F1 | | 23 248 80 -.55 .08| .71 -1.8| .49 -1.3| .65 .50| 51.3 48.3| 23 - VQ-B1 | | 12 268 81 -.67 .09| .96 -.1|1.65 1.2| .48 .47| 56.8 55.4| 12 - VF-P1 | | 22 284 82 -.76 .09| .74 -1.1| .42 -.9| .57 .43| 63.4 62.1| 22 - VF-B2 | | 3 284 81 -.80 .10|1.02 .2| .64 -.3| .44 .42| 61.7 62.0| 3 - VF-F2 | | 33 288 82 -.80 .10|1.32 1.3|1.45 .8| .27 .42| 62.2 66.6| 33 - VF-C2 | | 2 292 82 -.84 .10|1.31 1.2|1.52 .8| .27 .41| 64.6 67.0| 2 - VF-F1 | | 21 296 82 -.88 .11| .99 .1| .79 .0| .41 .39| 72.0 72.7| 21 - VF-B1 | | 30 300 82 -.93 .11|1.58 1.7|2.93 1.7| .04 .37| 74.4 78.4| 30 - VQR-C1B | | 20 308 80 -1.20 .16|1.36 .8|3.54 1.6| .04 .26| 90.0 91.3| 20 - VQR-B | | 1 312 81 -1.20 .16| .66 -.6| .12 -.6| .44 .26| 93.8 91.4| 1 - VQR-F | | 11 312 81 -1.20 .16|1.35 .8|1.47 .8| .10 .26| 90.1 91.4| 11 - VQR-P | |------------------------------------+----------+----------+-----------+-----------+--------------| | MEAN 142.7 81.0 .00 .09|1.01 -.1|1.12 .1| | 53.6 53.8| | | S.D. 
103.7 .8 .63 .03| .24 1.3| .73 1.0| | 22.8 23.0| | ---------------------------------------------------------------------------------------------------
Table J2 – Table of Standardized Residual Variance (in Eigenvalue units)

Table of STANDARDIZED RESIDUAL variance (in Eigenvalue units)
                                              -- Empirical --         Modeled
Total raw variance in observations  = 103.6   100.0%                  100.0%
Raw variance explained by measures  =  59.6    57.5%                   57.1%
Raw variance explained by persons   =  15.6    15.1%                   15.0%
Raw Variance explained by items     =  44.0    42.5%                   42.2%
Raw unexplained variance (total)    =  44.0    42.5%   100.0%          42.9%
Unexplned variance in 1st contrast  =   3.0     2.9%     6.8%
Unexplned variance in 2nd contrast  =   2.7     2.6%     6.2%
Unexplned variance in 3rd contrast  =   2.4     2.4%     5.6%
Unexplned variance in 4th contrast  =   2.4     2.3%     5.5%
Unexplned variance in 5th contrast  =   2.1     2.1%     4.8%
[Figure J1 plots each item's Contrast 1 loading (vertical axis, -.5 to .6) against item measure (horizontal axis, -2 to 2); the letters A-V and a-v identify the items listed in Table J3, with item counts and cluster assignments given along the plot margins.]

Figure J1. Standardized Residual Contrast 1 Plot
Table J3 – Contrast 1 Standardized Residual Loadings for Item

CONTRAST 1 FROM PRINCIPAL COMPONENT ANALYSIS
STANDARDIZED RESIDUAL LOADINGS FOR ITEM (SORTED BY LOADING)
----------------------------------------------------------
|CON-  |       |        INFIT OUTFIT| ENTRY               |
| TRAST|LOADING|MEASURE MNSQ  MNSQ  |NUMBER ITEM          |
|------+-------+--------------------+---------------------|
| 1    |  .59  |   .33   .65   .47  |A 26   26 - VURR-B2  |
| 1    |  .50  |  -.16   .90   .78  |B 44   48 - VQ-B3    |
| 1    |  .42  |   .25   .81   .65  |C 24   24 - VQ-B2    |
| 1    |  .36  |  -.55   .71   .49  |D 23   23 - VQ-B1    |
| 1    |  .35  |   .41   .92   .69  |E 25   25 - VURR-B1B |
| 1    |  .31  |   .49  1.08  1.36  |F 40   42 - VRCS-1B  |
| 1    |  .28  |   .51   .93  1.08  |G 15   15 - VURR-P3  |
| 1    |  .26  |   .28  1.16  1.17  |H  9    9 - ICS-F1   |
| 1    |  .24  |   .66  1.36  2.65  |I  8    8 - VURR-F3  |
| 1    |  .24  |  1.01   .88   .57  |J 41   45 - VRCS-4B  |
| 1    |  .22  |  -.35   .86   .73  |K  5    5 - VQ-F2B   |
| 1    |  .22  |  -.88   .99   .79  |L 21   21 - VF-B1    |
| 1    |  .20  |   .73   .66   .29  |M 28   28 - ICS-B1   |
| 1    |  .17  | -1.20  1.36  3.54  |N 20   20 - VQR-B    |
| 1    |  .16  |   .34   .95   .66  |O 43   47 - VQ-P3    |
| 1    |  .13  |   .66   .99  1.33  |P 19   19 - ICS-P2B  |
| 1    |  .13  |  -.04   .73   .58  |Q 34   34 - VF-C3B   |
| 1    |  .09  |   .15  1.01   .92  |R 13   13 - VF-P2    |
| 1    |  .06  |  -.23   .87   .93  |S 42   46 - VQ-P2    |
| 1    |  .05  |   .32  1.00  1.39  |T 27   27 - VURR-B3B |
| 1    |  .04  |   .49  1.09   .81  |U 37   38 - VURR-C1  |
| 1    |  .03  |   .36   .72   .65  |V 38   40 - ICS-C1   |
| 1    |  .03  |  -.31   .66   .52  |v  6    6 - VURR-F1  |
| 1    |  .02  | -1.20   .66   .12  |u  1    1 - VQR-F    |
| 1    |  .01  |  -.80  1.32  1.45  |t 33   33 - VF-C2    |
| 1    |  .00  |   .81  1.06  2.78  |s 17   17 - VURR-P2B |
|------+-------+--------------------+---------------------|
| 1    | -.50  | -1.20  1.35  1.47  |a 11   11 - VQR-P    |
| 1    | -.50  |  -.13   .95  1.07  |b 32   32 - VF-C1    |
| 1    | -.40  |  -.84  1.31  1.52  |c  2    2 - VF-F1    |
| 1    | -.34  |  -.55  1.41  1.76  |d  4    4 - VQ-F1    |
| 1    | -.31  |   .23  1.34  1.74  |e 31   31 - VURR-C3B |
| 1    | -.30  |   .03   .75   .64  |f 35   36 - VQ-C2B   |
| 1    | -.29  |   .66  1.27  1.01  |g 39   41 - ICS-C2   |
| 1    | -.21  |   .62   .98   .52  |h 29   29 - ICS-B2B  |
| 1    | -.21  |  -.03  1.24  1.16  |i 36   37 - VQ-C3B   |
| 1    | -.20  |  -.80  1.02   .64  |j  3    3 - VF-F2    |
| 1    | -.19  |  -.67   .96  1.65  |k 12   12 - VF-P1    |
| 1    | -.17  |   .29  1.50  1.81  |l  7    7 - VURR-F2  |
| 1    | -.16  |  -.93  1.58  2.93  |m 30   30 - VQR-C1B  |
| 1    | -.16  |   .58  1.15  1.33  |n 10   10 - ICS-F2   |
| 1    | -.16  |  -.76   .74   .42  |o 22   22 - VF-B2    |
| 1    | -.14  |   .88   .93   .38  |p 18   18 - ICS-P1   |
| 1    | -.11  |  -.31   .87  1.25  |q 14   14 - VQ-P1    |
| 1    | -.05  |   .88   .85   .46  |r 16   16 - VURR-P1  |
----------------------------------------------------------
Figure J2. Category Probability Curves
[Item-person (Wright) map for the revised instrument on a measure scale from 1 down to -2. The hardest items sit at the top (45 - VRCS-4B, then 16 - VURR-P1 and 18 - ICS-P1) and the easiest at the bottom (1 - VQR-F, 11 - VQR-P, 20 - VQR-B), consistent with the measures in Table J1; the person distribution is shown alongside as X's.]

Figure J3. Wright Map
P a g e |175 References Barrett, J. E., Clements, D. H., Klanderman, D., Pennisi, S.-J., & Polaki, M. V. (2006). Students' coordination of geometric reasoning and measuring strategies on a fixed perimeter task: Developing mathematical understanding of linear measurement. Journal for Research in Mathematics Education, 37(3), 187-221. doi: 10.2307/30035058 Battista, M. T. (1999). Fifth graders' enumeration of cubes in 3D arrays: Conceptual progress in an inquiry-based classroom. Journal for Research in Mathematics Education, 30(4), 417448. doi: Doi 10.2307/749708 Battista, M. T. (2011). Conceptualizations and issues related to learning progressions, learning trajectories, and levels of sophistication. Montana Mathematics Enthusiast, 8(3), 507569. Battista, M. T. (2012). Cognition-Based Assessment & teaching of geometric measurement: Building on students' reasoning. Portsmouth, NH: Heinemann. Battista, M. T., & Clements, D. H. (1996). Students’ understanding of three-dimensional rectangular arrays of cubes. Journal for Research in Mathematics Education, 27, 258292. Battista, M. T., & Clements, D. H. (1998). Students' understanding of three-dimensional cube arrays: Findings from a research and curriculum development project. In R. Lehrer & D. Chazan (Eds.), Designing learning environments for developing understanding of geometry and space (pp. 227-248). Mahwah, NJ: Lawrence Erlbaum Associates. Battista, M. T., Clements, D. H., Arnoff, J., Battista, K., & Borrow, C. V. A. (1998). Students' spatial structuring of 2D arrays of squares. Journal for Research in Mathematics Education, 29(5), 503-532. doi: Doi 10.2307/749731
P a g e |176 Ben-Haim, D., Lappan, G., & Houang, R. T. (1985). Visualizing rectangular solids made of small cubes: Analyzing and effecting students' performance. Educational Studies in Mathematics, 16, 389-409. Bond, T. G., & Fox, C. M. (2007). Applying the Rasch model: Fundamental measurement in the human sciences (2nd ed.). Mahwah, NJ: Lawrence Erlbaum Associates, Inc. Bredekamp, S. (2004). Standards for preschool and kindergarten mathematics education. Engaging young children in mathematics: Standards for early childhood mathematics education, 77-82. Burger, W. F., & Shaughnessy, J. M. (1986). Characterizing the van Hiele levels of development in geometry. Journal for Research in Mathematics Education, 17(1), 31-48. CCSSO. (2010). Common Core State Standards for Mathematics. Retrieved from http://www.corestandards.org/assets/CCSSI_Math Standards.pdf. Clark, D., Gilbertson, N., & He, J. (2012). Elementary curricula's treatment of volume and capacity. Paper presented at the 90th Annual Meeting of the National Council of Teachers of Mathematics Research Presession, Philadelphia, PA. Clements, D. H., & Battista, M. T. (1992). Geometry and spatial reasoning. In D. A. Grouws (Ed.), Handbook of Research on Mathematics Teaching and Learning: A Project of the National Council of Teachers of Mathematics (pp. 420-464). New York: Macmillan Publishing Company. Clements, D. H., Battista, M. T., & Sarama, J. (2001a). Logo and geometry. Journal for Research in Mathematics Education Monograph Series, 10. doi: 10.2307/749924 Clements, D. H., Battista, M. T., & Sarama, J. (2001b). Logo and geometry. Journal for Research in Mathematics Education, 10, i+1-177.
P a g e |177 Clements, D. H., & Sarama, J. (2004a). Learning trajectories in mathematics education. Mathematical Thinking and Learning, 6(2), 81-89. Clements, D. H., & Sarama, J. (2007a). Early childhood mathematics learning. In J. F. K. Lester (Ed.), Second Handbook of Research on Mathematics Teaching and Learning (pp. 461555). New York: Information Age Publishing. Clements, D. H., & Sarama, J. (2007b). Effects of a preschool mathematics curriculum: Summative research on the Building Blocks project. Journal for Research in Mathematics Education, 38(2), 136-163. Clements, D. H., & Sarama, J. (2009). Learning and teaching early math: The learning trajectories approach. New York: Routledge. Clements, D. H., & Sarama, J. (Eds.). (2004b). Hypothetical learning trajectories (Vol. 6). Clements, D. H., Sarama, J., Barrett, J. E., Van Dine, D. W., & McDonel, J. S. (2011). A hypothetical learning trajectory for volume in the early years. Paper presented at the Annual Meeting of the American Educational Research Association, New Orlean, LA. Clements, D. H., Sarama, J., & Liu, X. (2008). Development of a measure of early mathematics achievement using the Rasch model: the Research‐Based Early Maths Assessment. Educational Psychology, 28(4), 457-482. doi: 10.1080/01443410701777272 Clements, D. H., Sarama, J., & Wolfe, C. B. (2011). TEAM—Tools for early assessment in mathematics. Columbus, OH: McGraw-Hill Education. Clements, D. H., Swaminathan, S., Hannibal, M. A. Z., & Sarama, J. (1999). Young children's concepts of shape. Journal for Research in Mathematics Education, 30(2), 192-212.
Cobb, P., Yackel, E., & Wood, T. (1992). A constructivist alternative to the representational view of mind in mathematics education. Journal for Research in Mathematics Education, 23(1), 2-33.

Confrey, J., & Maloney, A. (2010). The construction, refinement, and early validation of the equipartitioning learning trajectory. In Proceedings of the 9th International Conference of the Learning Sciences (Vol. 1).

Confrey, J., Maloney, A., Nguyen, K., Mojica, G., & Myers, M. (2009). Equipartitioning/splitting as a foundation of rational number reasoning using learning trajectories. Paper presented at the 33rd Conference of the International Group for the Psychology of Mathematics Education, Thessaloniki, Greece.

Confrey, J., Nguyen, K. H., Lee, K., Panorkou, N., Corley, A. K., & Maloney, A. P. (2012). Turn-on Common Core math: Learning trajectories for the Common Core State Standards for mathematics. Retrieved from http://www.turnonccmath.net

Crowley, M. L. (1987). The van Hiele model of the development of geometric thought. In M. M. Lindquist (Ed.), Learning and teaching geometry, K-12. 1987 Yearbook of the National Council of Teachers of Mathematics (pp. 1-16). Reston, VA: National Council of Teachers of Mathematics.

Curry, M., Mitchelmore, M., & Outhred, L. (2006). Development of children's understanding of length, area, and volume measurement principles. In J. Novotná, H. Moraová, M. Krátká, & N. Stehliková (Eds.), Proceedings of the 30th Conference of the International Group for the Psychology of Mathematics Education (Vol. 2, pp. 377-384). Prague: PME.

Curry, M., & Outhred, L. (2005). Conceptual understanding of spatial measurement. In P. Clarkson, A. Downton, D. Gronn, M. Horne, A. McDonough, R. Pierce, & A. Roche (Eds.), Building connections: Research, theory and practice (Proceedings of the 28th annual conference of the Mathematics Education Research Group of Australasia) (pp. 265-272). Melbourne, Australia: MERGA.

DeVries, R. (2008). Piaget and Vygotsky: Theory and practice in early education. In T. L. Good (Ed.), 21st century education: A reference handbook (pp. 184-193). Thousand Oaks: SAGE Publications, Inc.

Eames, C. L., Miller, A. L., Kara, M., Van Dine, D. W., Cullen, C. J., Barrett, J. E., . . . Sarama, J. (2013). The longitudinal development of children’s conceptions of spatial measurement. Paper presented at the 43rd Annual Meeting of the Jean Piaget Society, Chicago, IL.

Fischer, K. W., & Bidell, T. R. (2006). Dynamic development of action and thought. In W. Damon & R. M. Lerner (Eds.), Theoretical models of human development. Handbook of child psychology (6th ed., Vol. 1, pp. 313-399). New York: Wiley.

Fuys, D., Geddes, D., & Tischler, R. (1988). The van Hiele model of thinking in geometry among adolescents. Journal for Research in Mathematics Education. Monograph, 3, i-196.

Ginsburg, H. P., & Opper, S. (1988). Piaget's theory of intellectual development (3rd ed.). Englewood Cliffs, NJ: Prentice Hall.

Gruber, H. E., & Voneche, J. J. (1995). The essential Piaget: An interpretive reference and guide (100th Anniversary ed.). Northvale, NJ: Jason Aronson, Inc.

Gutiérrez, A., Jaime, A., & Fortuny, J. M. (1991). An alternative paradigm to evaluate the acquisition of the van Hiele levels. Journal for Research in Mathematics Education, 22(3), 237-251.
Hershkowitz, R., & Dreyfus, T. (1991). Loci and visual thinking. In F. Furinghetti (Ed.), Proceedings of the fifteenth annual meeting of the International Group for the Psychology of Mathematics Education (Vol. II, pp. 181-188). Genova, Italy: Program Committee.

Inagaki, K. (1992). Piagetian and post-Piagetian conceptions of development and their implications for science education in early childhood. Early Childhood Research Quarterly, 7(1), 115-133.

Jacobson, C., & Lehrer, R. (2000). Teacher appropriation and student learning of geometry through design. Journal for Research in Mathematics Education, 31(1), 71-88. doi: 10.2307/749820

Kamii, C., & Kysh, J. (2006). The difficulty of “length × width”: Is a square the unit of measurement? The Journal of Mathematical Behavior, 25(2), 105-115. doi: http://dx.doi.org/10.1016/j.jmathb.2006.02.001

Kara, M. E., Eames, C. L., & Van Dine, D. W. (2013). Children’s reasoning about volume invariance. Paper presented at the Annual Meeting of the American Educational Research Association, San Francisco, CA.

Lehrer, R. (2003). Developing understanding of measurement. In J. Kilpatrick, W. G. Martin, & D. Schifter (Eds.), A research companion to principles and standards for school mathematics (pp. 179-192). Reston, VA: National Council of Teachers of Mathematics.

Lehrer, R., & Chazan, D. (Eds.). (1998). Designing learning environments for developing understanding of geometry and space. Mahwah, NJ: Lawrence Erlbaum Associates.

Lehrer, R., Jacobson, C., Thoyre, G., Kemeny, V., Strom, D., Horvath, J., . . . Koehler, M. (1998). Developing understanding of geometry and space in the primary grades. In R. Lehrer & D. Chazan (Eds.), Designing learning environments for developing understanding of geometry and space (pp. 169-200). Mahwah, NJ: Lawrence Erlbaum Associates.

Lehrer, R., Jenkins, M., & Osana, H. (1998). Longitudinal study of children’s reasoning about space and geometry. In R. Lehrer & D. Chazan (Eds.), Designing learning environments for developing understanding of geometry and space (pp. 137-167). Mahwah, NJ: Lawrence Erlbaum Associates.

Lehrer, R., & Nitabach, E. (1996). Developing spatial sense through area measurement. Teaching Children Mathematics, 2, 473+.

Linacre, J. M. (1994). Sample size and item calibration stability. Rasch Measurement Transactions, 7(4), 328.

Linacre, J. M. (2011). A user's guide to Winsteps/Ministep Rasch-model computer program: Version 3.75.0. Chicago: Winsteps.com.

Liu, X. (2010). Approaches to developing measurement instruments. In Using and developing measurement instruments in science education: A Rasch modeling approach (pp. 35-66). Charlotte, NC: Information Age Publishing, Inc.

Lovell, K. (1968). The growth of basic mathematical and scientific concepts in children. Australia: Hodder Headline.

Martin, J. D. (2007). Children’s understanding of area of rectangular regions and volumes of rectangular shapes and the relationship of these measures to their linear dimensions. PhD Program in MSTE Education, Tufts University.

Martin, J. D. (2009). A study of fourth grade students' understanding of perimeter, area, surface area, and volume when taught concurrently. (Ph.D. 3354724), Tufts University, Massachusetts, United States. Retrieved from http://proquest.umi.com/pqdweb?did=1742044591&Fmt=7&clientId=39334&RQT=309&VName=PQD

Mayberry, J. (1983). The van Hiele levels of geometric thought in undergraduate preservice teachers. Journal for Research in Mathematics Education, 14(1), 58-69.

Mojica, G., & Confrey, J. (2009). Pre-service elementary teachers' understanding of an equipartitioning learning trajectory. In S. L. Swars, D. W. Stinson, & S. Lemons-Smith (Eds.), Proceedings of the 31st annual meeting of the North American Chapter of the International Group for the Psychology of Mathematics Education (pp. 1202-1210). Atlanta, GA: Georgia State University.

National Council of Teachers of Mathematics. (2000). Principles and standards for school mathematics. Reston, VA: National Council of Teachers of Mathematics.

National Council of Teachers of Mathematics. (2008). Curriculum focal points for prekindergarten through grade 8 mathematics: A quest for coherence. Reston, VA: National Council of Teachers of Mathematics.

Piaget, J. (1952). How children form mathematical concepts. Scientific American, 189(5), 74-79.

Piaget, J., & Inhelder, B. (1967/1997). The child's conception of space (F. J. Langdon & J. L. Lunzer, Trans.). London: Routledge.

Piaget, J., Inhelder, B., & Szeminska, A. (1960). The child's conception of geometry (E. A. Lunzer, Trans.). New York, NY: Routledge and Kegan Paul.

Sarama, J., & Clements, D. H. (2009). Early childhood mathematics education research: Learning trajectories for young children. New York: Routledge.

Sarama, J., & Clements, D. H. (in press). Learning trajectories: Theoretical approaches supporting the development of curriculum, standards and assessment research in mathematics education. In J. Confrey (Ed.), Learning trajectories: Tools for research in mathematics education (tentative title).

Sarama, J., Clements, D. H., Barrett, J., Van Dine, D. W., & McDonel, J. S. (2011). Evaluation of a learning trajectory for length in the early years. ZDM, 43(5), 667-680.

Siegler, R. S. (1996). Emerging minds: The process of change in children's thinking. New York: Oxford University Press.

Siegler, R. S. (2006). Microgenetic analyses of learning. In W. Damon & R. M. Lerner (Eds.), Handbook of child psychology: Volume 2: Cognition, perception, and language (pp. 464-510). Hoboken, NJ: Wiley.

Siegler, R. S., & Alibali, M. W. (2005). Children's thinking. Englewood Cliffs, NJ: Prentice-Hall.

Siegler, R. S., & Booth, J. L. (2004). Development of numerical estimation in young children. Child Development, 75, 428-444. doi: 10.1111/j.1467-8624.2004.00684.x

Siegler, R. S., & Opfer, J. E. (2003). The development of numerical estimation: Evidence for multiple representations of numerical quantity. Psychological Science, 14(3), 237-243. doi: 10.2307/40063895

Simon, M. A. (1995). Reconstructing mathematics pedagogy from a constructivist perspective. Journal for Research in Mathematics Education, 26(2), 114-145.

Simon, M. A. (2013). The need for theories of conceptual learning and teaching of mathematics. In K. R. Leatham (Ed.), Vital directions for mathematics education research (pp. 95-118). New York: Springer.

Smith, C. L., Wiser, M., Anderson, C. W., & Krajcik, J. S. (2006). Implications of research on children's learning for standards and assessment: A proposed learning progression for matter and the atomic-molecular theory. Measurement: Interdisciplinary Research and Perspectives, 4(1-2), 1-98.

Steffe, L. P., & Thompson, P. W. (2000). Teaching experiment methodology: Underlying principles and essential elements. In R. A. Lesh & A. E. Kelley (Eds.), Handbook of research design in mathematics and science education (pp. 267-307). Hillsdale, NJ: Erlbaum.

Stephan, M., Bowers, J., Cobb, P., & Gravemeijer, K. (2003). Supporting students' development of measuring conceptions: Analyzing students' learning in social context. Reston, VA: National Council of Teachers of Mathematics.

Swafford, J. O., Jones, G. A., & Thornton, C. A. (1997). Increased knowledge in geometry and instructional practice. Journal for Research in Mathematics Education, 28(4), 467-483.

Szilágyi, J., Sarama, J., & Clements, D. H. (2010). Young children’s understandings of length measurement: A developmental progression. Submitted for publication.

Van Dine, D. W., Sarama, J., Clements, D. H., & Vukovich, M. (2013). Validation of a developmental progression for volume using Rasch modeling. Paper presented at the Annual Meeting of the American Educational Research Association, San Francisco, CA.

van Hiele, P. M. (1959/2004). The child’s thought and geometry. In T. P. Carpenter, J. A. Dossey, & J. L. Koehler (Eds.), English translation of selected writings of Dina van Hiele-Geldof and Pierre M. van Hiele (pp. 60-65). Reston, VA: National Council of Teachers of Mathematics.

van Hiele, P. M. (1986). Structure and insight: A theory of mathematics education. Orlando, FL: Academic Press.
van Hiele, P. M. (1999). Developing geometric thinking through activities that begin with play. Teaching Children Mathematics, 5(6), 310-316.

volume. (2010). Collins English Dictionary - Complete & Unabridged (10th ed.). Retrieved from Dictionary.com website: http://dictionary.reference.com/browse/volume

von Glasersfeld, E. (1995). Radical constructivism: A way of knowing and learning (Vol. 6). London: Falmer Press.

Wirszup, I. (1976). Breakthroughs in the psychology of learning and teaching geometry. In J. L. Martin & D. A. Bradbard (Eds.), Space and geometry: Papers from a research workshop (pp. 75-97). Columbus, OH. Retrieved from http://www.eric.ed.gov/ERICWebPortal/detail?accno=ED132033

Wright, B. D. (1977). Misunderstanding the Rasch model. Journal of Educational Measurement, 14(3), 219-225.

Wright, B. D., & Linacre, J. M. (1994). Reasonable mean-square fit values. Rasch Measurement Transactions, 8(3), 370. Retrieved from http://www.rasch.org/rmt/rmt83b.htm