The development and nature of problem solving among first-semester calculus students

Paul Christian Dawkins and James A. Mendoza Epperson

Northern Illinois University, DeKalb, IL, USA; University of Texas at Arlington, Arlington, TX, USA

Abstract: This study investigates interactions between calculus learning and problem solving in the context of two first-semester undergraduate calculus courses in the USA. We assessed students' problem solving abilities in a common US calculus course design that included traditional lecture and assessment with problem solving-oriented labs. We investigate this blended instruction as a local representative of the US calculus reform movements that helped foster it. These reform movements tended to emphasize problem solving as well as multiple mathematical registers and quantitative modeling. Our statistical analysis reveals the influence of the blended traditional/reform calculus instruction on students' ability to solve calculus-related, non-routine problems through repeated measures over the semester. The calculus instruction in this study significantly improved students' performance on non-routine problems, though performance improved more regarding strategies and accuracy than it did for drawing conclusions and providing justifications. We identified problem-solving behaviors that characterized top performance or attrition in the course, respectively. Top-performing students displayed greater algebraic proficiency, calculus skills, and more general heuristics than their peers, but overused algebraic techniques even when they proved cumbersome or inappropriate. Students who subsequently withdrew from calculus often lacked algebraic fluency and understanding of the graphical register. The majority of participants, when given a choice, relied upon less-sophisticated trial-and-error approaches in the numerical register and rarely used the graphical register, contrary to the goals of much US calculus reform.
We provide explanations for these patterns in students' problem solving performance in view of both their preparation for university calculus and the courses' assessment structure, which preferentially rewarded algebraic reasoning. While instruction improved students' problem solving performance, we observe that current instruction requires ongoing refinement to help students develop multi-register fluency and the ability to model quantitatively, as is called for in current US standards for mathematical instruction.

Keywords: calculus; problem solving; calculus reform; multi-register reasoning; modeling

1. Introduction

Calculus has been celebrated as being "by just about any standard, one of the greatest intellectual achievements of western civilization. The subject drips with power and beauty. It rendered thousand-year-old questions immediately transparent" [1]. Though mathematicians might so laud calculus for its power for solving problems, research indicates that many students struggle to succeed in calculus courses [2] and even successful calculus students often fail to use calculus concepts to solve non-routine problems [3-5]. Calculus instruction thus appears to fall short of providing students with access to calculus' full potential. In this study we extend prior research into the complementary interactions between calculus learning and mathematical problem solving. In one direction, we seek further insight into how calculus instruction fosters students' abilities in problem solving. In the other, we investigate problem-solving behaviors that correlate with excellence or attrition in a first-semester calculus course.
This study extends the insights of prior studies on calculus students' problem solving development through two primary means: (1) statistical analysis of students' problem solving performance four times across the semester and (2) comparative analysis of students' problem solving behaviors that correspond to key aspects of problem solving, including procedural understanding, conceptual understanding, and heuristics. A number of mathematics education research and curriculum design projects have targeted calculus instruction because calculus is both the capstone of secondary mathematics and the gateway to advanced study in science and engineering. Examples include Uri Treisman's Emerging Scholars Program (ESP) in the 1970s, the calculus reform movement of the 1980s and 1990s [6], and the ongoing investigations of Calculus I instruction by the Mathematical Association of America in the United States [2]. The "natural history" of our research is rooted in the ESP [1, 7-10]. ESP replications across the USA achieved notable success [7, 8] with the following components for calculus instruction: 1) challenging problems, 2) extra class time, and 3) collaborative group work [10].
The failure rate in US calculus courses is commonly over 25% [2], whereas ESP programs show markedly lower failure rates. In light of this contrast, we intend to provide more detailed insight into students' problem solving development in a teaching context that blends more traditional lecture structures with problem-focused instruction reflective of the calculus reform efforts. We investigate blended traditional/reform instruction as a local reflection on the efficacy and limitations of the calculus reform movements that fostered the state of this instruction. We shall thus address the following two research questions:

• How does supplementing traditional first-semester calculus instruction with problem solving laboratory experiences (blended traditional/reform instruction) foster students' problem solving abilities?

• What problem solving behaviors characterize students who subsequently excel in or withdraw from first-semester calculus?

2. Literature Review

In this section, we review literature regarding problem solving and calculus teaching and learning. This survey should accomplish two goals. First, we situate our study within previous literature to clarify how our study design builds on prior research, in part by delineating which aspects of problem solving we do and do not attempt to measure. Second, we review some research findings relevant to our study to place our research findings appropriately in the ongoing conversation about calculus instruction.

2.1. Problem solving

Santos-Trigo [11] pointed to the use of "tasks that offer diverse challenges" to describe successful problem solving class environments (p. 645). Schoenfeld [12] argued, though, that many students go through years of mathematics instruction solving few or no problems. This striking claim arises from his dichotomy between exercises and problems.
Problems require students to construct understanding or combine previously learned understandings to reach a solution, unlike exercises that ask students to repeat solution procedures provided by the teacher. We use the term "non-routine problem" after the studies by Selden et al. [3-5], who use the term similarly to Schoenfeld's "problems." As part of the reform movement of the last decades, mathematics educators increasingly value engaging students in solving non-routine problems. Reformers argue that solving rich problems is an integral part of doing mathematics and students should see it as such. Accordingly, two of the most influential USA standards for mathematics instruction, the National Council of Teachers of Mathematics (NCTM) Standards and the Common Core State Standards for Mathematics (CCSSM) [13, 14], endorse problem solving as a mathematical process through which students should learn the range of mathematics they encounter. To identify the various elements of problem solving that should be developed through calculus instruction, we draw strongly from Carlson and Bloom's framework [15]. They synthesized a large body of research into their problem-solving framework, with which they modeled expert problem solvers' behavior at a fine-grained level. The following sections outline key elements of their framework.

2.1.1. Problem solving attributes

Carlson and Bloom [15] delineate four main attributes of problem solving: resources, heuristics, monitoring, and affect. Resources entail knowledge of facts and procedures as well as conceptual understandings in a content domain. Heuristics describe general problem solving methods that might be used to solve a problem or make it tractable. Whereas previous attempts to teach general heuristics directly found mixed results [16-18], Selden et al. [3] suggest that heuristics should grow with the extent and diversity of students' problem-solving experiences.
Monitoring processes include metacognitive reasoning about understanding a problem, a solution approach, efficacy of a chosen solution path, and the reasonableness of a solution. Monitoring can occur anytime in problem solving, but is especially important when choosing a solution method, when a method becomes fruitless, or when checking a solution. Students’ monitoring often appears underdeveloped without explicit training or guidance [19]. Finally, affect includes the emotional and belief-related aspects of problem solving such as frustration,
confidence, persistence, enjoyment, and views about mathematics. Because this study examined artifacts from a written problem solving assessment, we do not report on students' affect in problem solving.

2.1.2. Problem-solving phases

Carlson and Bloom [15] divided the problem solving process into four phases: orienting, planning, executing, and checking. Orienting involves interpreting the nature and parameters of the question itself. Planning involves proposing possible solution paths and anticipating their likely efficacy. Executing involves carrying out a chosen solution approach via some form of computation, algorithm, explanation, etc. Checking describes the solver's efforts to interpret and assess the outcome of the executing phase. These latter three stages (planning, executing, and checking) form a cycle that repeats until students solve or abandon the task. The written problem-solving assessments in this study did not capture many details of the orienting stage of the framework, but we investigated the planning/executing and checking phases from the problem solving artifacts collected.

2.2. Calculus learning

Ongoing efforts to "reform" calculus instruction arise in large part from concerns that university students learned calculus as a series of algorithms without understanding the underlying concepts and meanings [20]. This persistently raises questions about the value of secondary calculus for tertiary learning. Ferrini-Mundy and Gaudard [21] found that secondary calculus tended to support higher achievement in tertiary calculus. To improve tertiary instruction itself, reform curricula tended to follow two paths. Some explored the intra-mathematical meanings rooted in analyzing properties and representations of functions in various mathematical registers. Others emphasized extra-mathematical meanings rooted in modeling real-world phenomena. Figure 1 portrays these two complementary means of providing conceptual meaning to calculus concepts and procedures.
Our problem solving assessment reflects both emphases: 1) coordinating calculus-related properties of functions across native mathematical registers and 2) applying calculus to tasks in real-world contexts. We discuss research relating to these two calculus reform themes in the following sections.

[Figure 1 diagram: the native mathematical registers (algebraic, numerical, and graphical) are connected by translation among registers (intra-mathematical reasoning); the algebraic and graphical registers are connected to the situational register by algebraic and graphical modeling, respectively (extra-mathematical reasoning).]
Figure 1. Relating two traditions of calculus reform in terms of translations between registers.

2.2.1. Multiple mathematical registers in calculus

Mathematical functions are commonly embodied in mathematics classes and texts in one of several "representations" [22] or what we shall call registers: numerical/tabular, symbolic/algebraic, graphical, and verbal. The most common meanings for derivatives in calculus texts reflect the graphical register: the slope of a tangent line. However, the processes for evaluating derivatives most often appear in the algebraic register (e.g. power rule, product rule, chain rule). This indicates that for students to understand calculus in many classrooms, they must coordinate a web of meanings relating these various registers
[22]. Thus Hughes-Hallett [23, 24] said of her reform-oriented calculus text, "One of the guiding principles is the 'Rule of Three,' which says that wherever possible topics should be taught graphically and numerically, as well as analytically" (p. 121; recent versions of the text also include "verbal"). Some studies indicate that common instruction (both in calculus and elsewhere) emphasizes non-visual (and thus non-graphical) reasoning [25-27]. As a result, students and teachers display reluctance to visualize in certain types of problem solving [25, 28], though affect interacts with this phenomenon [27]. Presmeg [29] found that top-performing secondary students prefer analytical reasoning more than the general population. Some infer from this that mathematics instruction unequally rewards non-visualizers, depending also upon the instructor's preferences [30]. Others instead interpret Presmeg's findings to suggest that conventional instruction fails to support students who prefer reasoning visually in employing visualization productively [25]. Visualization entails multiple cognitive processes that may require instruction [27, 30]. Such trends in secondary and tertiary mathematics may have shifted due to increased emphases on multiple registers in reform mathematics instruction [31]. We designed our problem solving assessments to investigate this by ensuring that our tasks could be solved in different registers.

2.2.2. Understanding calculus concepts through modeling quantitative situations

Calculus is essential to the science and engineering modeling toolkit. A study conducted by the Mathematical Association of America (MAA) with representatives of other scientific and humanities disciplines recommended modeling as a valuable and needed emphasis for future mathematics education [32].
However, the prevalent organization of mathematics texts often relegates "word problems" or "applications" to the end of a section or treats them as justification for the abstract mathematical principles being taught. This implicit distinction between abstract mathematics and contextual applications appears to manifest in a long-term divergence of meaning among various end-users of calculus. After experiencing university instruction, mathematics students tend to hold "slope of tangent line" meanings for the derivative while science and engineering students appear to prefer "rate-of-change" meanings [33-35]. The full implications of these differences are not fully understood, but research on quantitative reasoning indicates that there are significant differences between students' cognitive processes when they reason about mathematical expressions as measurable attributes of objects, as opposed to a-contextual numbers and symbols [36-38]. Consistent reference to the situation in mathematical problem solving can provide students with powerful resources for reasoning [39], which addresses the perceived need for meaning in calculus learning. One of the strong symptoms of students failing to reason quantitatively is when students interpret graphical representations as "pictures" whose shapes should match some visual aspect of the situation (conflating the graphical and situational registers) [40, 41]. Due to our attention to multiple mathematical registers, such "shape thinking" influenced student performance in our study.

3. Methodology

In this section we describe the context of calculus instruction in which the study occurred. Next, we present the problem solving assessments and the method of administration during the semester. Finally, we outline our method of scoring and analyzing the students' problem solving performance both by statistical analysis and by comparing problem-solving behaviors.

3.1. Teaching context and study participants

The study took place at a large (25,000 students), suburban institution in the Southwestern United States. The mathematics department strongly coordinated the teaching of calculus by instituting a common text [42], common examinations, and shared homework lists. Instructors taught using a lecture/recitation structure in which each week students attended 150 minutes of large lecture (2-3 sessions, class size ≤ 60) and 100 minutes of recitation (2 meetings, class size ≤ 40). In one weekly recitation, graduate students answered homework questions and provided supplemental explanations. We say this course blended traditional and reform-oriented instruction because of the other weekly, problem solving-focused recitation session. These sessions entailed challenge problems solved in small groups, small and large-group discussions, student and group presentations, constant group and instructor feedback, and reflection upon in-depth mathematical tasks, all reflective of the ESP model for instruction. To further emulate the challenge,
extended time, and collaboration elements of the ESP model, one of the sections in the study instituted 1.5 supplemental hours per week of challenging problem solving activities similar to the other reform-influenced recitation hour. We thus define "blended traditional/reform instruction" as the combination of conventional calculus instruction (150 minutes of lecture and 50 minutes of traditional recitation) with problem solving-oriented instruction (50-140 minutes of "reform" recitation and supplemental sessions). The data presented in this study came from two of the large lecture sections (four recitation sections) made up of 135 total participants. The analyzed cohort consisted of 64.9% males and 35.1% females. Quantitative scores on the Scholastic Aptitude Test (SAT), the most common college entrance examination used in the southwestern USA, were only available for 55% of the group. This lower frequency is a result of both students who took other college entrance exams and differences in university records for students who transferred from other colleges. SAT quantitative scores use an 800-point scale with all scores being multiples of 10. For this data, SAT quantitative scores of 500, 600, and 700 represented the national 43rd, 74th, and 93rd percentiles, respectively. Of our sample whose SAT quantitative scores were available, 38.2% scored between 500-590, 50% between 600-690, and 11.8% greater than or equal to 700. There is no indication that this smaller sample was not representative of the overall calculus population at this university, though the SAT data may have been more readily available for students who entered university directly from high school. White students represented 47.2% of the group and African American, Hispanic, and Asian students represented 13.4%, 9.4%, and 22.8% of the group, respectively. Over 75% of the cohort majored in science or engineering.

3.2. Research Assessments

To measure students' levels and development of key problem solving competencies, we selected four problem-solving tasks from previous research [40, 41, 43] on pre-calculus/calculus understanding. Following the assessment structure of White and Mitchelmore [20], we adapted four versions of each task differing in difficulty relative to the problem solving competencies required by each problem (Item 1 – most sophisticated, Item 4 – least sophisticated). Study participants completed a problem solving assessment up to four times during the calculus course, solving one version of each task each sitting. Each test included one task at each level of sophistication (1-4) and the tests were cycled so that no student saw the same version of a task twice. Students saw the versions of the tasks in ascending order of sophistication (modulo the cycle) to minimize any task-specific learning, though the time between administrations should have rendered any improvement from repeated measures negligible. These tasks were not closely approximated by any regular course tasks. Table 1 presents a sample progression of four tests as they were administered across the semester.

Table 1. Sample problem solving test progression across administrations.

Test A | Test B | Test C | Test D
Item 1A: Projectile task | Item 1B: Bottle task | Item 1C: Inequality task | Item 1D: Car task
Item 2A: Bottle task | Item 2B: Inequality task | Item 2C: Car task | Item 2D: Projectile task
Item 3A: Inequality task | Item 3B: Car task | Item 3C: Projectile task | Item 3D: Bottle task
Item 4A: Car task | Item 4B: Projectile task | Item 4C: Bottle task | Item 4D: Inequality task
The four tasks were chosen because they could be solved using pre-calculus, so that even beginning calculus students could complete them successfully. However, each relates to calculus concepts such that understanding calculus concepts might improve success. Also, these tasks afforded differentiation into four levels of sophistication. Only the Projectile task (Figure 2) closely resembles tasks students might be expected to encounter in standard calculus curricula. Students tend to solve this task using resources and heuristics including: using the zeros of the function and the symmetry of parabolas to locate the maximum height (t = 3 seconds), converting the height function to standard form (y = a(x − h)² + k) to find the vertex, setting the height function equal to 140 to see whether the projectile reaches that height (Items 1-2), or maximizing the height function using the derivative. The presence of the pavilion in Items 1 and 2 often revealed students' "shape thinking." If students conflate the graph of the height/time function (graphical register) with the path of the projectile through space (situational register), they conclude that the projectile misses the pavilion because of its horizontal displacement.

ITEM 1. A projectile is launched and travels according to the law s(t) = at − bt², where a and b are constants, t is the time in seconds after it is launched, and s(t) is the height in feet above the ground at time t. We know that after 1 second the projectile is 80 feet high and that after 2 seconds it is 128 feet high. There is a pavilion structure over the launch site that extends 2 feet and is 140 feet high. Does the projectile hit the roof of the pavilion?

ITEM 2. A projectile is launched and travels according to the law s(t) = 96t − 16t², where t is the time in seconds after it is launched, and s(t) is the height in feet above the ground at time t. There is a pavilion structure over the launch site that extends 2 feet and is 140 feet high. Does the projectile hit the roof of the pavilion?

ITEM 3. A projectile is launched and travels according to the law s(t) = 96t − 16t², where t is the time in seconds after it is launched, and s(t) is the height in feet above the ground at time t. Find the maximum height reached by the projectile.

ITEM 4. Given s(t) = 96t − 16t². Find the vertex.
Figure 2. The four versions of the Projectile task [43].

Item 1 from the Bottle task (Figure 3) previously appeared in the work of Monk [41] and Carlson [40] and assesses students' ability to imagine the covariation of two quantities to produce a height/volume graph. The task differs strongly from the bulk of calculus problems and exercises because only Item 4 of the task affords an algebraic solution. In most cases successful solvers must translate directly from the situational register to the graphical register without algebraizing. As such, students are encouraged toward non-numeric quantitative reasoning to produce a graph. The most common responses to the Bottle task included: linear graphs through the origin (both quantities increase with no attention to relative amounts of change), single concavity curves (recognizing that the rate of change varies, or using "shape thinking"), or graphs with an inflection point (Items 1-2).
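Returning to the Projectile task (Figure 2): the algebraic solution path described there, finding the coefficients from the two given heights and then locating the vertex, can be checked with a short script (our own illustrative sketch, not part of the study materials):

```python
# Projectile task, Item 1: s(t) = a*t - b*t**2 with s(1) = 80 and s(2) = 128.
# The two conditions give a - b = 80 and 2a - 4b = 128, i.e. a - 2b = 64.

def solve_coefficients():
    b = 80 - 64             # subtracting (a - 2b = 64) from (a - b = 80) leaves b
    a = 80 + b              # back-substitute into a - b = 80
    return a, b

def max_height(a, b):
    t_vertex = a / (2 * b)  # vertex of the downward-opening parabola s(t)
    return t_vertex, a * t_vertex - b * t_vertex ** 2

a, b = solve_coefficients()
t_max, s_max = max_height(a, b)
print(a, b, t_max, s_max)   # 96 16 3.0 144.0
```

Since the maximum height of 144 feet exceeds the pavilion's 140 feet and the pavilion sits over the launch site, the intended conclusion is that the projectile does hit the roof; as noted above, "shape thinking" students instead reason about horizontal displacement and conclude otherwise.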
Imagine this [container] filling with water. Sketch a graph of the height (of the water) as a function of the amount of water that's in the container. Explain (in writing) your answer.

[Figure 3 shows the four container shapes used as Items 1-4.]

Figure 3. The four versions of the Bottle task [40, 41].
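The covariational reasoning the Bottle task targets can be made concrete numerically. In the sketch below, the container shape (a cone filling point-down with radius equal to water height) is a hypothetical example of ours, not one of the study's items; it shows how equal increments of volume produce shrinking increments of height, i.e. an increasing, concave-down height/volume graph:

```python
# Illustration of height/volume covariation for a hypothetical container:
# a cone filling point-down whose radius equals the water height h.
# Then V = (pi/3) * h**3, so h = (3V / pi)**(1/3).
import math

def height_from_volume(volume):
    return (3 * volume / math.pi) ** (1 / 3)

# Successive unit increments of volume yield successively smaller
# increments of height: the height/volume graph is concave down.
steps = [height_from_volume(v + 1.0) - height_from_volume(v) for v in range(5)]
print([round(s, 3) for s in steps])
```

A cylinder, by contrast, would give constant increments (a linear graph), which is why attention to relative amounts of change, not just to "both quantities increase," distinguishes the successful responses described above.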
Students overall made the least progress toward satisfactory solutions of the Inequality task (Figure 4). Item 1 of this task comes from Carlson's function assessment [40]. The task asks students to verify or contradict an inequality similar to the definition of convexity. Expressed intuitively, the question asks whether the average of the outputs exceeds the output at the average of the inputs. The inequality appears in the algebraic register and includes a function whose argument is an expression rather than a single variable. Many students failed to interpret this appropriately, since it is inconsistent with the belief that "f(x)" is the name of the function. Students approached the task in three main registers: numerically (via trial-and-error), algebraically (which proved very cumbersome), and graphically (the most elegant and intuitive solution). If students appropriately interpreted the notation, trial-and-error was successful on Item 4 because the function is concave down, or on any item when the student set v = w. The task requested multiple justifications, which was intended to encourage solutions in multiple registers.

ITEM 1. Assume F is any quadratic function.
ITEM 2. Assume F(x) = ax².
ITEM 3. Assume F(x) = 2x² − 3x − 5.
ITEM 4. Assume F(x) = −2x² − 3x − 5.
a. True or False: For all real numbers v and w, F((v + w)/2) ≤ (F(v) + F(w))/2.
b. Justify your answer in at least two different ways.
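The trial-and-error (numerical register) approach described above can be made concrete. In the sketch below (our own illustration), the inequality is taken in the form suggested by the intuitive description, that the output at the average of the inputs is at most the average of the outputs:

```python
# Numerical trial of the Inequality task: is F((v+w)/2) <= (F(v)+F(w))/2?
def midpoint_inequality_holds(F, v, w):
    return F((v + w) / 2) <= (F(v) + F(w)) / 2

F3 = lambda x: 2 * x ** 2 - 3 * x - 5    # Item 3: concave up
F4 = lambda x: -2 * x ** 2 - 3 * x - 5   # Item 4: concave down

print(midpoint_inequality_holds(F3, 0, 2))   # True
print(midpoint_inequality_holds(F4, 0, 2))   # False: a counterexample
print(midpoint_inequality_holds(F4, 1, 1))   # True: v = w always gives equality
```

Note the asymmetry this reveals: a single trial can refute the statement (as any v ≠ w does on Item 4), but confirming instances, including the degenerate v = w case, cannot establish it for all v and w, which is part of why the task rewards graphical or algebraic justification over trial-and-error.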