ICM2002 Satellite Conference on Mathematics Education Lhasa, Tibet, August 2002.
Encouraging higher level mathematical learning using computer aided assessment C. J. Sangwin, LTSN Maths, Stats & OR Network, School of Mathematics and Statistics, University of Birmingham, Birmingham, B15 2TT, UK. Telephone 0121 414 6197, Fax 0121 414 3389. Email:
[email protected]
http://www.sangwin.com/
Abstract: This article defines “higher level mathematical skills” and details an important class: that of creating instances of mathematical objects satisfying certain properties. We explain how such questions may be assessed in practice, under some circumstances, without imposing an onerous marking load on staff. We report reactions to these questions and discuss their design and impact. Included are working examples, with some source code, which have been implemented on a free computer aided assessment system.

Keywords: high level mathematical skills, computer aided assessment, deep learning

1 Introduction
This paper will briefly define higher level skills, give examples of questions believed to test them, discuss their use in learning and teaching, and then demonstrate how a free and practical automatic marking system may be used to assess them under some circumstances. The Oxford English Dictionary defines a ‘skill’ as a “practised ability, expertness, technique, craft, art, etc.”. Therefore any definition of ‘mathematical skill’ must be based on an analysis of what in reality we ask the students to do. We set students a range of assignments, and there have been many attempts to classify mathematical tasks, for example [14]. The study [11] found that the majority of tasks set to first year undergraduate students at a United Kingdom university may be classified into the categories shown in Table 1. An understanding of assessment methods and strategies is important since assessment strategies dictate learning styles; see, for example, [2]. Hence we may influence learning styles through our choice of assessment strategies [5]. Conversely, if we fail to ask students to practise some technique we can hardly complain afterwards that they have failed to do so!

1. Factual recall
2. Carry out a routine calculation or algorithm
3. Classify some mathematical object
4. Interpret situation or answer
5. Prove, show, justify – (general argument)
6. Extend a concept
7. Criticize a fallacy
8. Construct an instance of something

Table 1: The mathematical question taxonomy of [11]

Educational research [3, 8, 9, 16, 17] consistently shows that “learning events” are stimulated when students create for themselves examples of objects, i.e. instances of something, that satisfy some pre-specified properties. This is required by those questions in the last category of Table 1, and we concentrate on questions of this type for the remainder of this paper. Examples and discussions of other types of questions may be found elsewhere [11, 13, 14].
2 Constructing an instance of something
Tasks which require the student to construct an instance of something come in a number of forms:

1. examples of objects,
2. counter-examples to a theorem or proposition,
3. objects with specified properties, e.g. “Find a change of basis for ...”, “Find a substitution that ...”, “Find a Lyapunov function for ...”.
Questions of this style appear to be rare in undergraduate work. The recent study of tasks set to undergraduate students in their first year at a UK university found only 18 such questions from 463, ≈ 3.9% [11]. This category is mentioned by [14] who, departing from the usual pattern in that paper, provide no examples of such questions. One might think of such tasks as being “hard” or “advanced”, such as the following problem of real analysis.

Question 1 Provide a counter example to the following statement: “If $(a_n)$ is a sequence of non-negative real numbers that converges to zero then $\sum_{n=1}^{\infty} (-1)^n a_n$ converges.”

This problem, which is related to the alternating series test, is one which in some real sense defines the very subject itself [4], and would pose a real and definite challenge to even the best students. Instead, consider the following examples of questions, which might be tackled by undergraduate students.

Question 2 Find a cubic polynomial with a critical point∗ at x = 1 passing through the point (−1, 2).

Question 3 Find a singular 5 by 5 matrix with no repeated entries.

Question 4 By giving the formula for the general term $a_n$ in a sequence, give an example of a sequence with limit $\lim_{n\to\infty} a_n = e$.
Question 5 Find three different examples of functions that are tangent to $y = x^2$ at the point x = 2.

These questions do not require particularly advanced theory. They are of a particular style: that of creating instances of objects that satisfy certain properties. This style of question is not restricted to university level mathematics. It is firmly believed that (i) questions of this style are possible and desirable at all levels of mathematical education, and (ii) many questions and topics have the potential to be assessed in this way. To support these assertions two specific examples follow.

∗ That is to say, a point at which the gradient is zero.
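Answers to questions such as these can be checked mechanically, a theme developed in Section 4. As a purely illustrative sketch (not part of AIM, and using the Python library sympy rather than Maple), the candidate $x^3 - 3x$, one of many possible answers to Question 2, can be verified as follows.

```python
import sympy as sp

x = sp.symbols('x')

# A candidate answer to Question 2: a cubic with a critical point
# at x = 1, passing through the point (-1, 2).
p = x**3 - 3*x

assert sp.degree(p, x) == 3            # it is a cubic
assert sp.diff(p, x).subs(x, 1) == 0   # critical point at x = 1
assert p.subs(x, -1) == 2              # passes through (-1, 2)
```

Each of the three defining properties is checked independently, which is precisely what makes such questions mechanically markable despite having infinitely many correct answers.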
Question 6 How may you write a half? Give as many examples as possible.

Answers to this question include ‘a half’, $\frac{1}{2}$, 0.5, 50%, $\frac{2}{4}$ (and many other fractions, including non-integer and non-positive numbers), $2^{-1}$, $4^{-\frac{1}{2}}$, $1/\sqrt{4}$, $\frac{\log(\sqrt{2})}{\log(2)}$, and so on. Furthermore, one may use 0.1 in base 2. Now, of course, you ask: what is ‘a half’ in base 3? Also, 0.4999··· – opening a can of worms – or even 0.0111··· in base 2. There is plenty of scope here for real and serious discussion of the simplest questions of number representation, and we haven’t even begun to consider historical perspectives on this question.

The following two questions† were taken from the TIMSS observational study and illustrate well how a particular question may be changed in style to require the construction of something. In this case the problem is that of calculating the area of an irregular plane shape.

Question 7 (Germany) [Figure: an irregular shape with sides of length 5, 2, 3 and 8, divided by two internal lines a and b.] Find the area of this shape, first using the line a, then using the line b.

Question 8 (Japan) [Figure: the same shape, with sides of length 5 and 8 and unknown lengths x and y.] Find the area of this shape in as many ways as you can.
In the second case the student is required to construct a number of different dissections of the given shape into elementary components, the areas of which can be calculated more easily. Notice also the presence of the unknown variables x and y in the problem. There are many different such dissections in addition to the two offered in Germany, such as those shown below. [Figure: two further dissections of the shape.]
Requiring the student to solve a particular problem in more than one way is an approach that the Japanese seem to use systematically [1, 10]. This often requires different constructions of steps in the solution, for example here different dissections. To what extent does this account for their success in TIMSS?
3 Effect on learning
† The author is indebted to Prof. Michael Neubrand for suggesting this example during valuable discussion at the ICM2002 Satellite Conference on Mathematics Education, Lhasa.

The fact that there often exist non-unique correct answers provides an opportunity for genuine mathematical discussion. That is to say, there is something to talk about. For example, as reported in [13], the answers to Question 4 supplied by a focus group comprising six students were

$a_n := e$ [1 student]
$a_n := e + \frac{1}{n}$ [2 students]
$a_n := e + \frac{1}{n} + \frac{2}{n} + \frac{3}{n} + \frac{7}{n^{10}}$ [1 student]
$a_n := e\,\frac{1+n}{n}$ [1 student]
$a_n := \left(1 + \frac{1}{n}\right)^n$ [1 student]

The first of these is the constant sequence; the others are of the form $a_n = e + p(n)$ or $a_n = e^{1+p(n)}$,
where p(n) → 0 as n → ∞. The student who answered with the constant sequence was disappointed with the question, which he regarded as “trivial”. However, none of the other students had realized the constant sequence was sufficient to answer the question. Similarly, only one student remembered that e was defined as the limit of the sequence given in the last answer. Discussing the different answers allows students to encounter or recall information that they had perhaps forgotten or never known. It also allows the instructor to introduce naturally topics that students might not have been exposed to yet.

This same group of students were asked the following question.

Question 9 Give a differentiable function with a critical point at x = 1.

Two different strategies emerged:

JL: Ok, just take the parabola and shift it one.
···
B: I said, x − 1 = 0, then integrated it.

Generally the “graph shifters” had not thought about integrating and the “integrators” had not considered shifting graphs of known functions. Both strategies are effective, and rather different. Knowing and having facility with both would be advantageous. These questions provide a natural opportunity to discuss both, and perhaps other methods. Such questions provide room for manoeuvre that more traditional model answer based questions do not. It appears that a certain level of familiarity with the topic is required. It may be that weaker students find this approach off-putting, and so more research in this area is needed to fine-tune the technique.

Experienced school teachers were asked to provide three different examples in answer to Question 9. Their responses appeared to show great variation, but they may be grouped as follows:

$c_1(x-1)^2 + c_2$, $c_1(x^2/2 - x) + c_2$, $c_1(x-1)^n + c_2$, n ≥ 3.

Furthermore, individuals showed little variation themselves and the participants appeared to deploy only “one idea”. A typical sequence was $(x-1)^2$, $(x-1)^3$, $(x-1)^4$.
Essentially, here only the power has been increased. Others added different constants or used varying multiples. Two individuals multiplied and added, still only modifying their original function. All participants responded with polynomial expressions, so that all responses are infinitely differentiable. Furthermore, all coefficients belong to the set $\{\pm 1, \pm 2, \pm 3, \frac{1}{2}, \frac{3}{2}\}$. All but one response ($-(x-1)^2$) had a positive coefficient for the highest power of x. No one used trigonometric functions, although for example $\cos(x-1)$ would suffice. Constant functions, in particular the zero function, were absent.
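A quick mechanical check, sketched here in Python with the sympy library (an illustration only; the study itself used no such tool), confirms that the typical responses and the overlooked examples alike all satisfy the single requirement $f'(1) = 0$.

```python
import sympy as sp

x = sp.symbols('x')

# Typical teacher responses, plus the overlooked examples:
# the trigonometric cos(x - 1) and the zero function.
candidates = [
    (x - 1)**2,
    -(x - 1)**2,
    x**2 / 2 - x,       # the "integrator" strategy: integrate x - 1
    (x - 1)**3,
    sp.cos(x - 1),
    sp.Integer(0),
]

# Every candidate is differentiable with a critical point at x = 1.
for f in candidates:
    assert sp.diff(f, x).subs(x, 1) == 0
```

The variety of forms that pass the same check is exactly what gives these questions their discussion value.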
One wonders whether familiarity with tasks of this style would encourage students to be more ambitious and creative. These issues are discussed more fully in [13]. The general characteristics of these problems are that:
• non-unique correct solutions exist, • no general or unique methods exist for finding solutions, • one needs to reverse engineer the problem, • solutions are routine but time consuming to mark. Furthermore, there are some questions, such as Question 3, which would be practically impossible to mark by hand for large groups. In some circumstances the marking may be performed by computer algebra systems.
4 Alice Interactive Mathematics — AIM
AIM is a computer aided assessment system which uses the Maple‡ computer algebra system to grade students’ answers. Such automatic marking provides new opportunities to pose questions which educational theory has demonstrated to be greatly beneficial, but which would otherwise be impractical to set. The use of computer algebra allows the expression entered by the student to be manipulated and tested symbolically, using the impressive collection of genuine mathematical operations available within Maple. The system is web based; students access it using a normal browser and enter their responses using Maple syntax. Students do not need Maple, or any special software such as a browser plug-in. Specific details are available elsewhere [6, 7, 15]. This system is becoming widely used in higher education.

The AIM system is currently being integrated into the core of a first year mathematics degree at the University of Birmingham, UK, with the help of an institutional Learning Development Unit grant. Currently it is used as a tool for self testing and more routine calculus and algebra work. However, as identified in [13], the intention is to use the system to pose questions which involve student generated examples. These are stored by the system and will subsequently be used to attempt to generate tutorial debate. Therefore, the use of the computer aided assessment system is envisaged to be only one part of the learning cycle.

There are two versions of AIM. The original AIM system was designed by Theodore Kolokolnikov [7]. AIM-TTH is the second version, and was designed by Neil Strickland [15]. This may be viewed at http://www.mat.bham.ac.uk/aim or may be downloaded from http://maths-physics.org.uk/aiminfo/. There is also a JISC mailing list,
[email protected], for staff and developers. See http://www.jiscmail.ac.uk/lists/aim.html
4.1 Overview of AIM
AIM has the ‘usual’ question types such as multiple choice, multiple response, numerical, and matrix entry. More importantly, any Maple expression may be entered. Maple syntax needs to be used, e.g. “3*x” not “3x”, although syntax errors by the student are not penalized: students may simply re-enter their answer with no penalty. From the students’ point of view, AIM is web based and they may use any browser. AIM uses Maple, although only one copy is needed, on the server; thus students do not need Maple, and no plug-in or special software is needed. From the staff point of view AIM is free. It is web based (again, staff use any browser) and links may be added to web based resources. All administrative tasks are available via the web, including authoring questions, creating quizzes (access, due date), and generating grade sheets and statistics. Questions are (relatively) easy to author, as will be shown below, and questions may be shared.

‡ Maple is a registered trademark of Waterloo Maple Software.
4.2 AIM itself
AIM-TTH uses a combination of LaTeX and Maple code to author questions. This combines the typesetting advantages of LaTeX with the mathematical sophistication of Maple. Maple checks for the symbolic equivalence of answer and solution, so great sophistication is possible. In addition, the use of Maple allows the following features.

• Random questions for practice may be generated, see Section 5.2.
• Sophisticated feedback, based on the student’s answer, may be given, and partial credit may be awarded. See Section 5.3.
• Hints and sub-questions (original AIM only at the moment) are possible, see [7].

Again, for full details see [6, 7, 15].
5 In practice
There are three levels of operation: subjects (whole courses), quizzes, and questions. We concentrate on the questions, which are authored using a web interface and are written as plain text files containing lines of code. Each line begins with an AIM flag. Some of these question flags are:

t>   text displayed on the screen, which may include LaTeX
h>   hidden Maple commands
v>   question value (number of marks)
a>   denotes the unique answer
s>   marking procedure for more complex situations
ap>  answer prompt

As a simple example consider the question

Differentiate $(x-1)^3$ with respect to x.

This may be authored using the following three lines of code:

t> Differentiate $(x-1)^3$ with respect to $x$.
a> 3*(x-1)^2
end>

Note that the system correctly equates $3(x-1)^2$ and $3x^2-6x+3$, marking both correct. The AIM system is ideally suited to marking such routine questions. Since multiple choice, or the equivalent, is not used, staff need not author lists of incorrect distracters, and students must provide an answer and may not strategically eliminate answers from a list provided. Issues of question distortion by the multiple choice format, and their elimination by the use of computer algebra marking, are treated more fully in [12]. Of course, we may do significantly more with the computer algebra system, as we shall see.
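The key step, deciding that two algebraically different expressions denote the same answer, is symbolic equivalence testing. The following sketch shows the idea in Python using the sympy library; it is a hypothetical stand-in for the Maple test inside AIM, not AIM’s actual code.

```python
import sympy as sp

x = sp.symbols('x')

# The model answer held by the question author.
model_answer = 3*(x - 1)**2

def mark(student_answer):
    # Two expressions denote the same function precisely when
    # their difference simplifies to zero.
    return sp.simplify(student_answer - model_answer) == 0

assert mark(3*(x - 1)**2)        # the model answer itself
assert mark(3*x**2 - 6*x + 3)    # the expanded form is also accepted
assert not mark(3*x**2 - 6*x)    # a genuinely different answer is rejected
```

Because equivalence is decided symbolically rather than by string matching, students are free to enter any algebraic form of the correct answer.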
5.1 Operating on the answer
Maple may perform mathematical operations on an answer. The question

Give an example of a differentiable function f(x) which has a turning point at x = 1, i.e. $f'(1) = 0$.
may be authored as

t> Give an example of a differentiable function
t> $f(x)$ which has a turning point at $x=1$, ie
t> \[ f'(1)=0. \]
v> 2
ap> $f(x):=$
s> [(ans)->`aim/Test`( eval(subs(x=1,diff(ans,x))),0 ),x^2-2*x+3]
end>
The line beginning with the s> flag contains the marking procedure. This takes the answer (ans), differentiates it, substitutes x = 1, and simplifies. The result is compared to 0 using the function `aim/Test`( , ). One correct answer, $x^2-2x+3$, is also supplied, should the student ask for it.
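The same marking idea (differentiate the student’s answer, substitute x = 1, and compare with zero) can be sketched outside AIM; here is a hypothetical Python/sympy version of the s> procedure above.

```python
import sympy as sp

x = sp.symbols('x')

def mark(ans):
    # Differentiate the student's expression, substitute x = 1,
    # and test whether the result simplifies to zero.
    return sp.simplify(sp.diff(ans, x).subs(x, 1)) == 0

assert mark(x**2 - 2*x + 3)    # the supplied correct answer
assert mark((x - 1)**3)        # any other valid answer is also accepted
assert not mark(x**2)          # f'(1) = 2, so this is rejected
```

Note that the marking procedure never needs a list of correct answers: it checks the defining property directly, so every one of the infinitely many correct responses is accepted.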
5.2 A randomly generated question
Randomization is possible; in fact genuinely random questions are relatively easy. If we wish to practise many similar integration questions we could:

1. generate a random polynomial,
2. set a question with this as the integrand,
3. mark the work by taking the student’s answer, differentiating it, and comparing it to the integrand.

By choosing suitable values for the coefficients and degree of the polynomial, the corresponding AIM code for this question might be

h> p := x^4 + randpoly(x, degree=3, coeffs=rand(-3..3));
t> Evaluate the following integral: \[ \int @p@ dx \]
s> [ (ans) -> `aim/Test`(diff(ans,x),p), int(p,x) ]
end>

The code @p@ simply instructs AIM to make the substitution for the Maple variable p before displaying the result.
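The three steps above can be sketched in Python with sympy. This is an illustration of the scheme only, not AIM’s implementation; Maple’s randpoly is imitated with a small hypothetical helper.

```python
import random

import sympy as sp

x = sp.symbols('x')

def random_integrand():
    # Imitates the Maple call randpoly(x, degree=3, coeffs=rand(-3..3)):
    # x^4 plus a random polynomial of degree at most 3.
    coeffs = [random.randint(-3, 3) for _ in range(4)]
    return x**4 + sum(c * x**k for k, c in enumerate(coeffs))

def mark(ans, integrand):
    # A student's antiderivative is correct exactly when its derivative
    # equals the integrand; any constant of integration then cancels.
    return sp.simplify(sp.diff(ans, x) - integrand) == 0

p = random_integrand()
assert mark(sp.integrate(p, x), p)        # the model answer is accepted
assert mark(sp.integrate(p, x) + 7, p)    # ... as is any "+C" variant
assert not mark(p, p)                     # the integrand itself is rejected
```

Marking by differentiating the answer, rather than comparing against a stored antiderivative, automatically copes with both the random coefficients and the arbitrary constant of integration.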
5.3 Tailored feedback & partial credit
The question

Give an example of a cubic polynomial p(x) with the following properties:
1. p(0) = 1,
2. p(x) = 0 at x = a and x = b
has a number of separate criteria. Not all of these may be satisfied by a student’s answers and we may wish to provide tailored feedback and partial credit. This may be done, although writing such tailored feedback does, of course, become more time consuming. The following code contains the randomized parameters a_ and b_.

1.  h> a_ :=rand(1..3)();
2.  h> b_ :=a_+rand(1..2)();
3.  t> Give an example of a cubic polynomial $p(x)$ with the
4.  t> following properties \begin{itemize}
5.  t> \item $p(0)=1$
6.  t> \item $p(x)=0$ at $x=@a_@$ and $x=@b_@$. \end{itemize}
7.  s> [ proc(ans) local marks_; marks_:=1;\
8.  if not `aim/Test`(subs(x=0,ans),1) then\
9.  `aim/t`("Your polynomial fails to satisfy $p(0)=1$.\\\\ ");\
10. marks_:=marks_-0.25; fi;\
11. if not `aim/Test`(subs(x=a_,ans),0) then\
12. `aim/t`("Your polynomial fails to satisfy $p(x)=0$ at $x=@a_@$.\\\\");\
13. marks_:=marks_-0.25; fi;\
14. if not `aim/Test`(subs(x=b_,ans),0) then\
15. `aim/t`("Your polynomial fails to satisfy $p(x)=0$ at $x=@b_@$.\\\\");\
16. marks_:=marks_-0.25; fi;\
17. if not `aim/Test`(degree(ans,x),3) then\
18. `aim/t`("Your polynomial is not a cubic.\\\\");\
19. marks_:=marks_-0.25; fi;\
20. RETURN(marks_);\
21. end ,(1-x/a_)*(1-x/b_)*(1-x)]
22. end>
Note that the line numbers themselves do not constitute part of the code. Lines 7–21 comprise the marking scheme. Each of the conditions in the question is checked separately by the marking scheme. Lines 8–10, for example, perform the marking of criterion 1 of the question. If the student’s answer fails to satisfy p(0) = 1 then line 9 provides tailored feedback and line 10 deducts one quarter of the total marks for the question. Question templates such as this are adaptable.
6 Conclusion
Genuine mathematical answers may now be evaluated by computer aided assessment systems with sophisticated tailored feedback and with partial credit. Questions may be set which encourage deeper learning. It is confidently predicted that in the near future all CAA systems will link computer algebra and assessment to perform automatic marking [12]. We have only begun to explore the potential provided by this system.
References

[1] J. P. Becker and S. Shimada (eds). The open-ended approach: a new proposal for teaching mathematics. National Council of Teachers of Mathematics, 1997.
[2] C. Beevers and J. Paterson. Assessment in mathematics, chapter 4, pages 49–61. Kogan Page Ltd, London, 2002.
[3] R. P. Dahlberg and R. P. Housman. Facilitating learning events through example generation. Educational Studies in Mathematics, 33:283–299, 1997.
[4] B. R. Gelbaum and J. M. H. Olmsted. Counterexamples in analysis. Holden-Day, 1964.
[5] G. Gibbs. Using assessment strategically to change the way students learn. In Assessment Matters in Higher Education: choosing and using diverse approaches, pages 41–53. Society for Research into Higher Education & Open University Press, 1999.
[6] D. F. M. Hermans. Embedding intelligent web based assessment in a mathematical learning environment. In Proceedings of the International Symposium on Symbolic and Algebraic Computation (ISSAC 2002), Internet Accessible Mathematical Computation 2002 workshop, Lille, July 2002.
[7] S. Klai, T. Kolokolnikov, and N. Van den Bergh. Using Maple and the web to grade mathematics tests. In Proceedings of the International Workshop on Advanced Learning Technologies, Palmerston North, New Zealand, 4–6 December 2000. http://allserv.rug.ac.be/~nvdbergh/aim/docs/ (viewed August 2002).
[8] J. Mason and A. Watson. Getting students to create boundary examples. MSOR Connections, 1(1):9–11, 2001. http://ltsn.mathstore.ac.uk/ (viewed August 2002).
[9] E. R. Michener. Understanding understanding mathematics. Cognitive Science, 2:361–381, 1978.
[10] M. Neubrand. Geometrische Aufgaben aus dem japanischen “open-ended approach”. Beiträge zum Mathematikunterricht, pages 483–486, 1998.
[11] A. Pointon and C. J. Sangwin. An analysis of undergraduate core material in the light of hand held computer algebra systems. International Journal for Mathematical Education in Science and Technology, 34(5):671–686, 2003.
[12] C. J. Sangwin. Assessing higher skills with computer algebra marking. JISC Technology and Standards Watch, TSW 02–04, September 2002. http://www.jisc.ac.uk/techwatch/ (viewed September 2002).
[13] C. J. Sangwin. New opportunities for encouraging higher level mathematical learning by creative use of emerging computer aided assessment. International Journal for Mathematical Education in Science and Technology, accepted 2003.
[14] G. Smith, L. Wood, M. Coupland, and B. Stephenson. Constructing mathematical examinations to assess a range of knowledge and skills. International Journal of Mathematics Education in Science and Technology, 27(1):65–77, 1996.
[15] N. Strickland. Alice interactive mathematics. MSOR Connections, 2(1):27–30, 2002. http://ltsn.mathstore.ac.uk/ (viewed August 2002).
[16] D. Tall. Advanced mathematical thinking. Dordrecht: Kluwer, 1992.
[17] A. Watson and J. Mason. Student-generated examples in the learning of mathematics. Canadian Journal of Science, Mathematics and Technology Education, 2(2), 2002.