COMPUTER AS A TOOL OF VERIFICATION IN A GEOMETRY PROBLEM-SOLVING CONTEXT. Ioannis Papadopoulos, Vassilios Dagdilelis, University of Macedonia, Thessaloniki, Greece

Abstract. Given that verification is one of the three functions of proof in mathematics, in this paper we record and distinguish a series of different kinds and strategies of verification that pupils use when they work in a computational geometry problem-solving environment. The central idea is the use of the computer as a tool of verification. We distinguish three different kinds of verification and a number of strategies applied within each kind.

1. Introduction. A survey of the way computers are used in the educational system shows that their use is occasional rather than systematic. At the primary level in particular, it is largely restricted to drill-and-practice software. Alongside this dominant tendency of daily practice, research has focused on various aspects of the capabilities of Dynamic Geometry Software (DGS). One of the most interesting research fields is the role of technology in the teaching of proof, where the majority of the research concerns secondary education. All this software was initially designed to help students make and test conjectures. Relevant research shows that DGS allows students to generate conjectures that can later be verified or rejected through formal proof (Healy & Hoyles, 2001). This verification of conjectures takes place much more easily in a DGS environment than in other computational environments or in the more traditional setting of paper and pencil (Marrades & Gutierrez, 2000). Indeed, in many cases it appears that DGS is used in a purely verifying role, in the sense that students are expected to confirm empirically at the computer geometric facts which are more or less given (Holzl, 2001; Gawlick, 2002; Laborde, 2001), or to refute conjectures and check properties (Arzarello et al., 2002). For pupils of the middle school, the dominant tool for progressing through verification to formal proof is usually the "dragging mode". Holzl claims that the drag mode can already be seen as a test mode, although this usage is still shaky (Holzl, 2001). Jones considers that the verification process at this level is controlled by the drag mode (Jones, 2000). We find the same in Nunokawa: "…The Cabri software seemed to make it possible to operate upon or 'touch' the problem situation. The students could alter the situation by dragging the figure and check their results with measurement on the screen…" (Nunokawa & Fukuzawa, 2002).

2. Towards verification. At the primary school level there is not much room for formal proof. Students (who are usually at van Hiele Levels 1 and 2) do not doubt the validity of their empirical observations. Proof is meaningless to them; they see it as justifying the obvious (De Villiers, 1987). What we observe when pupils work in a computational environment is a rather verifying manner of using the computer. Having in mind that: 1) every student just entering the world of mathematics must start with the fundamental functions of verification and explanation (Hanna, 2000); 2) verification is one of the three functions of proof in mathematics, in the sense that it is concerned with establishing the truth of a proposition (Bell, 1976); and 3) three of the six possible mediating functions of the DG environment are (a) to serve as a learning environment for raising conjectures, (b) to refute (or confirm) an initial conjecture (formulated with or without the DG tool), and (c) to lead students to be convinced that a conclusion is right (Hadas & Hershkowitz, 2000), we decided to record and distinguish the different kinds of verification, and the strategies, that pupils use when they feel the need to verify or check their results in geometry problem-solving situations.

3. Description of the study. The study was carried out with 36 students of the 5th and 6th grade of a primary school in an urban area of Greece. The topic was the calculation and comparison of areas of complex (non-regular) shapes. The tasks we used were non-routine in that "they are not typically covered in school geometry courses… The non-standard nature of the problems ensures that the students will not be able to solve them by simply recalling and applying familiar solution patterns" (Schoenfeld, 1985). These tasks also lend themselves to a dynamic approach in a computational environment. The problems required exact calculation, estimation and comparison of areas.
We relied on three different applications (Cabri, GeoComputer and MS Paint). The students used either a single piece of software or a combination of them, according to the needs and purposes of the specific task; the researchers, not the students, made this decision. We also used the Camtasia Studio suite to record each student's on-screen activity while he or she was working on the task. Finally, we used a video camera to record the whole group working in this environment.

4. Kinds and strategies of verification. Dagdilelis and Papadopoulos (2004) classified the different ways students use the computer. Among them was a category under the title “Use of computer

as a means of verification”. The intention of this paper is to explore this category in more detail and to describe the specific kinds and strategies that cover the students' initiatives during the problem-solving sessions. The difference between kinds and strategies is that strategies are parts of kinds: a certain strategy can be recognized during the application of each kind of verification.

4.1 Kinds of verification. We distinguished three different kinds of use of the computer as a means of verification.

4.1.1 Verification step by step. In this kind of verification students divided the initial area into sub-areas. They calculated the area of each sub-shape and immediately resorted to the tool of automatic area measurement to see whether their calculation was correct. If it was, they stopped; otherwise they kept on until they found the correct result. The main characteristic of this kind of verification was that the students did not continue to the next sub-shape until they had verified that their calculation was correct.

4.1.2 Reaction-based verification. In the laboratory students had the possibility to make direct comparisons and to find relations between what they thought or expected as a result and what they saw on their computer screen. Thus, visualizing the results of their activities on the screen, they could immediately react to them so as to finally reach the correct result (Papadopoulos, 2004). A student divided a non-regular polygon into sub-shapes (2 triangles and one trapezium) (Fig. 1).

Figure 1. Division of an irregular shape

In order to estimate the area of the trapezium she divided it into a rectangle and a triangle. The student made the division by eye; she did not use the appropriate tool of the software for drawing a perpendicular line. The consequence was that the triangle was obtuse instead of right, yet she continued to handle it as a right triangle.
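The right angle the student assumed can be tested numerically with a dot product, which is essentially what the software's perpendicularity tools do behind the scenes. The following sketch uses invented coordinates, not data from the study.

```python
# The by-eye division assumed a right angle where the triangle meets the
# rectangle. A dot-product test of the two sides meeting at the supposed
# right angle exposes an obtuse (or acute) angle. Coordinates are illustrative.

def is_right_angle(vertex, p1, p2, tol=1e-9):
    """True if the angle at `vertex`, formed with p1 and p2, is 90 degrees."""
    v1 = (p1[0] - vertex[0], p1[1] - vertex[1])
    v2 = (p2[0] - vertex[0], p2[1] - vertex[1])
    return abs(v1[0] * v2[0] + v1[1] * v2[1]) < tol

print(is_right_angle((0, 0), (4, 0), (0, 3)))   # genuine right angle: True
print(is_right_angle((0, 0), (4, 0), (-1, 3)))  # obtuse angle: False
```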
But when she saw her result on the screen she realized that something was wrong. This led her to a first reaction: she measured with the "distance and length" tool the common side of the triangle and the rectangle and compared it with the opposite side of the rectangle. They were not equal. Additionally she drew the altitude of the triangle (she was sure that it would be congruent with the

side of the triangle). The new result led her to a further reaction: she repeated the division of the trapezium, this time using the software's tools in a suitable manner.

4.1.3 Verification in the end. In this kind of verification the students completed the task and only then used the "Area" tool, which gave them the correct answer for the area of the whole shape. If there was a disagreement between what they had found and what the computer gave them, they had to re-examine the whole process to find the possible mistakes.

4.2 Verification strategies. Analyzing the way students used the computer as a tool of verification and confirmation, to control their conjectures and conclusions, we classified their strategies into three main categories.

4.2.1 Visual verification. In essence this is the absence of any strategy: students rely on what they see, and this is by itself the verification. In various problem sessions students have rejected correct solutions because they did not look sufficiently accurate, and have accepted incorrect solutions because they looked good (Schoenfeld, 1986). They accepted the legitimacy of their results without doubt, and this habit led them to overlook results that could not withstand a logical check (e.g. different outcomes from their own calculations and the computer's tools, or unequal opposite sides in a "rectangle").

4.2.2 Idiosyncrasy-based verification. This concerns a series of strategies closely related to personal choices. We identified four of them.

4.2.2.a Copy-paste verification. The students used this approach when they had to work on an electronic grid whose area unit was a square. The initial shape consisted of complete as well as partial square units. These partial square units, combined in pairs, formed a whole square or a rectangle equal to two square units.
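The grid arithmetic behind this pairing can be sketched in a few lines; the counts below are illustrative, not data from the study.

```python
# Area counting on a square grid: a shape consists of whole square units
# plus half-square triangles, and the triangles pair up, two halves per
# whole square unit. Counts are illustrative.

full_squares = 30
half_square_triangles = 12   # must pair up evenly into whole squares

if half_square_triangles % 2 != 0:
    raise ValueError("an unpaired half square remains - recheck the division")

area = full_squares + half_square_triangles // 2
print(area)   # 36 square units
```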
So the students, in order to verify what was a conjecture in their minds, selected the partial square units (triangles) and transferred them by copy and paste to the remaining part of the grid on the computer screen. They then changed the orientation of these triangles so that they fitted one another.

Figure 2. Copy-paste verification

Thus they verified their initial thought that the combination of these triangles results in a

whole square unit or in a rectangle that includes two square units (Fig. 2). We have to add, however, that in some cases this strategy had different outcomes. First, some students conceived the strategy correctly but, when they tried to apply it, found it impossible because of their inability to execute the required operations correctly. Second, the triangles they transferred were not accurate copies of the initial ones, so it was difficult to verify what was obviously true in their minds.

4.2.2.b Verification through erasing-redrawing. This could be regarded as an extension of the previous copy-paste strategy, but the students followed a different series of steps, because they felt they could not successfully carry out the transfer of the pieces. So, once they had identified the pieces that fitted one another, they preferred to erase the first partial square unit and then redraw it in its new place so as to fit the second partial square unit. One negative consequence of this strategy was that in the process the students lost squares.

Figure 3. Losing squares

In Figure 3 the result of the transfers should have been a 6x6 square; instead, the student lost 2 squares. Another consequence was the wrong combination of partial square units, which produced overlaps between them and thus again the loss of a certain number of squares.

4.2.2.c Transformation-based verification. In this strategy the students tried to transform the unfamiliar shape, through copy and paste, into a familiar one (triangle, square, rectangle) (Fig. 4). The conceptual content of this activity is summarized by "area is preserved under an action of cut and paste". Their intention was to end up with a known shape whose area formula was also known, so as to check the correctness of their initial estimation.

Figure 4. Transformation to familiar shapes

For example, after some students finished the estimation of the unfamiliar shape (36 square units), they tried to transform it either into a square (6x6) or a rectangle (9x4). Thus they verified whether their initial estimation was correct or not.

4.2.2.d Properties-based verification. This category is indicative of rather advanced mathematical thinking. Students used the tools of the software to check whether certain shapes bear certain properties. For example, in the case of a square in a non-traditional orientation, a student used the tools "Distance and Length" and "Angle". His intention was to check whether the shape had four equal sides and four angles of 90 degrees (Fig. 5). After he had verified that the shape was a square, it was easy for him to apply the known formula for the calculation of its area. He applied the same strategy to the isosceles and the right trapezium as well.

Figure 5. Usage of properties

4.3 Numerical verification. 4.3.1 Formula-based verification. In this case the students initially calculated the area of a known shape (e.g. the isosceles trapezium of the figure above) using the "Calculate" tool of the software and applying the known formula E = (B+b)*h/2. Afterwards, in order to verify whether their calculation was correct, they split the trapezium into three sub-shapes (2 triangles and a square), measured bases and heights with the appropriate tools, calculated the area of each sub-shape, added the partial results, and compared the new final result with the initial one.

4.3.2 Outline and auto-measure verification. This strategy emerged as an answer to a problem the students faced when they felt the need for verification. The "Area" tool of Cabri would give them an answer concerning the area of the whole shape in Figure 5. They initially split the shape into sub-shapes using the "Segment" tool and could then recognize the known sub-shapes on their screen.
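The formula-based cross-check can be sketched numerically. The trapezium dimensions below are invented for illustration, and a shoelace routine stands in for the software's automatic "Area" tool.

```python
# Formula-based verification: compute a trapezium area once with
# E = (B + b) * h / 2, once by splitting it into two right triangles and
# a rectangle, and once by a shoelace polygon-area routine (a stand-in
# for the software's "Area" tool). Dimensions are illustrative.

def shoelace(points):
    """Polygon area from vertex coordinates (the 'Area' tool analogue)."""
    s = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2

B, b, h = 10, 4, 3                   # isosceles trapezium: bases and height

area_formula = (B + b) * h / 2       # the "Calculate" tool with E = (B+b)*h/2
overhang = (B - b) / 2               # horizontal leg of each flanking triangle
area_split = 2 * (overhang * h / 2) + b * h

# automatic measurement of the whole trapezium from its vertices
area_auto = shoelace([(0, 0), (B, 0), (B - overhang, h), (overhang, h)])

print(area_formula, area_split, area_auto)   # all three agree: 21.0 21.0 21.0
```

If the three numbers disagreed, the student would know, without any formal proof, that one of the calculations had to be redone.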
But it was not the same for the software: one has to define a shape with the appropriate tools ("Triangle", "Polygon") so as to make it identifiable by the software. So when the students calculated the areas of these sub-shapes numerically and tried to verify their calculations, they found that the program could only measure automatically the area of the initial shape, and they had somehow to overcome this difficulty. They thought, then, to re-draw these sub-shapes with the appropriate tools. Thus they made them recognizable by the program and could now use the "Area" tool to verify whether they had calculated the sub-areas correctly.

5. Conclusions. Students of primary school are far from feeling the need for a mathematically acceptable proof. During their effort to solve the problems we posed, we noticed uncertainty about whether their results were correct, given that they could not apply known procedures or formulas due to the irregularity of the shapes. They had to follow a specific strategy (e.g. splitting the shape into familiar sub-shapes) in order to calculate the area of the initial shape. During this complex process students had the chance to verify the correctness of their results only in an empirical way. We refer to this way with the general term "verification"; it consists in the use of alternative processes for the re-calculation of the area. The initial process results in a number for the area, and the alternative ones aim at the derivation of a new result, which is compared with the former one. DG environments favor the development of trial-and-error processes as well as the repetitive use of verification processes. Our results seem to fully confirm these findings, as our students used the verification process very frequently. We also concluded that DGS not only seems to favor the verification process, but also allows this process to develop in a wider variety of forms than in the traditional paper-and-pencil environment; this is why we recorded so many different verification processes and attempted to classify them into a series of kinds and strategies. However, and this is an important part of our research, these processes are not equivalent as far as their mathematical power is concerned.
Some of them rest mainly on pure empiricism and are limited, for example, to the visual impression of the coincidence of two shapes. Others, however, use more advanced techniques, such as performing the operations in an alternative way, dividing a shape, or checking the properties some shapes bear with the support of the software's tools (e.g. perpendicularity or equality between line segments). So, some verification processes can be regarded as indicative of advanced mathematical thinking, versus others that are closer to empiricism. Obviously, it would be an exaggeration to claim that there is a sharp separation between these categories. However, we could propose a hierarchical ordering of these verification processes in a DG environment, according to how empirical they can be considered. We believe that this issue demands a more extensive and focused experimental design and, at the same time, more specialized analysis instruments. We intend to continue our research in this direction.

Reference list

Arzarello F., Olivero F., Paola D. and Robutti O. (2002). A cognitive analysis of dragging practices in Cabri environments. ZDM, 34(3), 66-72.
Bell A.W. (1976). A study of pupils' proof-explanations in mathematical situations. Educational Studies in Mathematics, 7, 23-40.
Dagdilelis V. and Papadopoulos I. (2004). An open problem in the use of software for educational purposes. In Elspeth McKay (Ed.), Proceedings of the International Conference on Computers in Education 2004 (pp. 919-924). Melbourne, Australia.
De Villiers M. (1987). Research evidence on hierarchical thinking, teaching strategies and the van Hiele theory: Some critical comments. Paper presented at the Learning and Teaching Geometry: Issues for Research and Practice working conference, Syracuse University.
Gawlick T. (2002). On Dynamic Geometry Software in the regular classroom. ZDM, 34, 85-92.
Hadas N., Hershkowitz R. and Schwarz B. (2000). The role of contradiction and uncertainty in promoting the need to prove in dynamic geometry environments. Educational Studies in Mathematics, 44, 127-150.
Hanna G. (2000). Proof, explanation and exploration: An overview. Educational Studies in Mathematics, 44, 5-23.
Healy L. and Hoyles C. (2001). Software tools for geometrical problem solving: Potentials and pitfalls. International Journal of Computers for Mathematical Learning, 6, 235-256.
Holzl R. (2001). Using Dynamic Geometry Software to add contrast to geometric situations - A case study. International Journal of Computers for Mathematical Learning, 6, 63-86.
Jones K. (2000). Providing a foundation for deductive reasoning: Students' interpretations when using dynamic geometry software and their evolving mathematical explanations. Educational Studies in Mathematics, 44, 55-85.
Laborde C. (2001). Integration of technology in the design of geometry tasks with Cabri-geometry. International Journal of Computers for Mathematical Learning, 6, 283-317.
Marrades R. and Gutierrez A. (2000). Proofs produced by secondary school students learning geometry in a dynamic computer environment. Educational Studies in Mathematics, 44, 87-125.
Nunokawa K. and Fukuzawa T. (2002). Questions during problem solving with dynamic geometric software and understanding problem situations. Proc. Natl. Sci. Counc. ROC(D), 12(1), 31-43.
Papadopoulos I. (2004). Geometry problem solving in a computational environment: Advantages and reservations. Paper presented at the 10th International Congress on Mathematical Education (ICME-10), Copenhagen, Denmark. http://www.icmeorganisers.dk/tsg18/
Schoenfeld A.H. (1985). Mathematical Problem Solving. Academic Press.
Schoenfeld A.H. (1986). On having and using geometric knowledge. In J. Hiebert (Ed.), Conceptual and procedural knowledge: The case of mathematics (pp. 225-264). Hillsdale, NJ: Lawrence Erlbaum.
