Developing questions for Maple TA using Maple library modules and non-mathematical computation

Bruce Char
Department of Computer Science
Drexel University
Philadelphia, PA 19104
[email protected]

August 1, 2011

Abstract. Maple TA is a system for authoring, administering, and auto-grading web-based student assignments that has the Maple technical computation system as a back-end service. Its strength is its ability to facilitate authoring, presenting, and automatically grading technical questions with symbolic or numeric answers. We have used it to develop questions for a course introducing problem-solving through technical computation and programming to over 1000 freshman engineers per year at Drexel University. We have found that additional Maple programming can extend the flexibility of question generation and answer checking. We explain how to use Maple modules to generate questions, answers, and answer-checking algorithms in Maple TA, and discuss how this has changed our software engineering practices in question development and maintenance.

1 Introduction

Interactive computer-based testing has been around since the early days of interactive computing with the PLATO system [1, 11]. Its widespread use in university instruction had to await favorable economic conditions: institutional budgets for reliable computing infrastructure (networking and web servers), and software systems that make development of computerized study aids, assignments, and quizzes possible with a sustainable, conventional amount of effort by instructional staff. For the past four years, we have run Computation Lab, a three-quarter, one-credit course introducing problem-solving via technical computing at Drexel University. During the period 2008-2011, it was given to all freshman engineering students, resulting in enrollments of 600-1000 students per term. The course introduces the use of a technical computation system (in our case, Maple [7, 13]). Maple is an interactive system whose development began in the 1980s to support symbolic mathematical computation. It has become a commercial system with strengths in symbolic and numeric computation. It also has a document worksheet interface geared toward technical word processing and result presentation. As a programming language, it has grown to include general-purpose features similar to other interactive languages such as Perl or Python. We chose Maple because it allows palette/menu-driven, WYSIWYG entry of mathematical formulae, and computation using a robust computational library for standard technical tasks. Since technical word processing, visualization, and formula-oriented derivations can be combined with computation in a single document, we can, in the first lab, present and work on engineering, mathematical, or scientific computations that the students see in their other courses. Students are taught to use interactive calculation to gain insight and to answer questions involving the application of mathematical modeling. They are also taught basic scripting and programming to handle problems that require more elaborate modeling or simulation. Major course objectives are to have students develop problem-solving, planning, and troubleshooting skills that should transfer from Maple to any similar interactive computational system. The course has four components:


1. Pre-lab out-of-class activities that prepare students for the upcoming lab. These include reading the course materials and taking an on-line quiz that checks conceptual comprehension and the ability to do simple tasks that students are expected to master on their own.

2. In-class lab activities with face-to-face instruction and support from instructional staff. These activities build on the understanding established by the pre-lab activities. They typically provide both warm-up exercises that check the student's mastery of the new concepts in an easy situation, and more challenging applications of the concepts to more realistic modeling and simulation problems.

3. An on-line post-lab quiz/assignment that typically requires more practice with the cycle's concepts.

4. An end-of-term proficiency exam that asks the students to demonstrate in class that they can use Maple on their own to solve variations of problems that they have seen earlier in the term.

We rely on a proprietary system, Maple TA [12], to provide automated support for the testing that occurs in items 1, 3, and 4 above. Maple TA uses Maple as a back-end math engine for a web-based system for automatic testing and grading. Because the language that Maple TA uses for its computation is the same one being used by the students, we can ask programming questions as well as questions with mathematical or numeric answers. Instructors fill the web server's back-end database with formatted questions and construct assignments built from collections of questions. Maple TA account administration gives each student in a class an account to log onto the system and do an assignment interactively through a web browser. Student responses are logged and automatically graded if the question's author has supplied a grading algorithm for the question. Instructors (and, by permission, students) can retrieve individual and aggregated results for an assignment. Assignments can be configured to allow multiple attempts with instructor-configurable amounts of feedback. At Drexel, we use a dual-core 2 GHz system (Linux, with Apache Tomcat as the web service) with 4 GB of memory to serve a class of 1000 students in a technical course on computing and programming. Close to quiz or assignment deadlines, there may be up to 150 simultaneous quiz-takers. Maple TA has been used primarily for on-line mathematics assignments and examinations. Our use of it for instruction in computing has led us to use extended amounts of Maple programming to generate and automatically grade problems. In this paper, we describe:

1. Our instructional objectives for these extensions.

2. Our use of Maple modules and other Maple programming techniques to achieve them.

3. The software engineering rationale that justifies the extra work typically needed to implement these extensions.

2 Blending on-line use of Maple TA with face-to-face instruction

We run our course as a blend of face-to-face instruction in a computer lab with on-line Maple TA activities both in and outside of the classroom. We use Maple TA to do things that we could not afford to do with the human staffing resources available to us:

1. It provides a way of requiring and monitoring student involvement with the course between class meetings. Pre-lab work with course content is crucial to making the in-class lab time productive. Post-lab out-of-classroom practice can consolidate and expand on what has been learned in the lab.

2. We can ask questions that implicitly require procedural "how to do" knowledge in a strong way, since we can establish question-generation and answer-checking algorithms of arbitrary complexity: anything that Maple can be programmed to do. We use these capabilities so that our questions often give a different version of the problem to every student and have answers that are difficult to guess correctly without deep insight or non-trivial computation.

3. It is no longer necessary to ask the same question of everyone for the purposes of grading convenience. It is possible to generate questions where the coefficients used in an equation, or whether a problem condition is "greater than" or "less than or equal to", vary from student to student under program control. This encourages students to become familiar with the solution process rather than working from one particular version of the problem. It also makes it easy for the instructor to ask students to practice solving problems that are highly similar in their solution technique.

4. We estimate that each year Maple TA grades over 120,000 questions (some with multiple parts) for our class. We estimate that this frees up several hundred hours of staff time per year for face-to-face instruction or tutoring instead of grading. We can ask many different questions (or different versions of the same question, or repeated tests) of a class without incurring more staff time in grading.

5. It provides immediate feedback to students. Computerized feedback may not be as extensive as that given by a human evaluator, but it is available 24/7 and ready whenever the student is willing to work.

6. On assignments, we permit retries, with immediate feedback given about whether the answer is or isn't correct. This allows us to expect that students will "work until correct" rather than "stop when you have an answer to turn in, and hope that you can pass the course through partial credit". This is confirmed by the summary statistics, where problems posted for out-of-class work each typically have a success rate in excess of 90% of items fully correct.

7. We can arrange multiple versions of a quiz or assignment. This allows us to scale our assigned work to dozens of sections scheduled across many days of the week.

8. With access to performance statistics, in summary form or at a question-by-question, student-by-student level, we can evaluate learning progress as soon as an exam is over, or even while it is in progress.

9. Since an electronic record of student-submitted answers is kept until the end of the term, review of grading results (with possible manual override) is easy to do even several weeks after a test has been taken.

10. The availability of assignments can be limited to a pre-set length and window of administration. This allows us to create an in-class proficiency exam where computers are used "live" to answer versions of questions previously seen in assignments and exercises in Maple TA. The results of the exam are available to instructors and students immediately after it concludes. Because we ask dozens of questions via Maple TA during the term, it is straightforward to make up multiple exams from selected subsets of the questions. Since we have the summary statistics for every question, we have data that provides assurance that the exams should be of comparable difficulty; furthermore, we have the information immediately after an exam is over to confirm this. Students may do well by being prepared to correctly answer any of the questions, but cannot predict which questions will be asked on their version of the exam. The students can practice with the questions for a week before the exam, with the same feedback given during practice as would be given after an assignment. This allows us to run the exam across over 20 time slots without giving later exam-takers extra information from advance knowledge. The fact that the questions vary from instance to instance is the key feature that makes unlimited practice with feedback possible.

11. Hints and supplementary explanation can be given conditionally.

Maple TA's limitations include:

1. Maple TA is a web-based application. The server only records and responds to what is sent to it (answers to questions).
It does not reside on the student's computer, so it cannot gather information on how students are working on the desktop, including what they may have done with Maple to generate the answers they input into Maple TA. Furthermore, access to performance or student profile information (e.g., the student's identity, or their performance on prior questions or quizzes) is not available to the Maple TA question author. Because of this, Maple TA does not award partial credit for an answer (one can still ask multi-part questions and award partial credit for getting parts correct). Because of the limited access to information on student cognitive processes, we have not tried to create automatic tutoring or remediation such as might be possible with software like the Cognitive Tutors developed by Anderson et al. [2, 15].


2. It does not allow "survey" questions that ask for opinions or sentiments. We can't use it to poll what the students think or feel about their learning, or even about the test questions they are getting. Of course, there are many other ways of conducting on-line polls, but they would require more administration work and another tool for instructors and students to get accustomed to.

3. We are fortunate to be teaching a subject at a level where the functionality provided by Maple TA auto-grading can effectively cover much of the skills and knowledge involved. However, we are not suggesting that Maple TA is a replacement for all forms of instruction. Work with Maple TA provides experience working with problems, but it isn't particularly effective at explaining or introducing concepts, unless the point is for the student to use methods of their own devising and reading materials of their own selection to figure things out. In addition to the contact with instructional staff provided during lab time, we provide staffed optional tutoring and homework help sessions that are popular with many students during each assignment cycle, particularly as the Maple TA out-of-class assignment due dates loom.

3 Basic Maple TA scripting

In this section, we discuss how Maple TA works and what its programming language looks like. We assume familiarity with Maple's programming language [5]; Maple TA's scripting language is distinct from and unrelated to it. Maple TA provides on-line testing and question-authoring capabilities similar in concept to those provided by well-known course systems such as Respondus [17] or Blackboard Vista [10, pp. 80-110], [6]. Questions are composed in a WYSIWYG Question Designer, which facilitates the input and layout of question content. The Question Designer supports the creation of a variety of automatically or manually checked questions: multiple choice, matching, numeric, text, or mathematical, including responses that use mathematical formulas or the syntax of the Maple programming language. Once the questions have been created, assignments can be built from them. Reports can be generated whose scope ranges from the results of one student's assignment to summary statistics for the entire class on a number of assignments.

Maple TA has a scripting language that allows question authors to specify algorithms for computing or checking answers. An example of a short script is given in Figure 3. A Maple TA script can assign dynamically defined variables values which can be numeric, string, or derived from Maple results. Random generation and selection of values is allowed. This allows questions to vary between students through the use of randomly selected coefficients or other components. The variables can be used in the text of the question or in the specification or grading of student answers. The Question Designer also allows specification of how answers should be checked for correctness if they are free-form text, algebraic, or numeric responses. An important feature of Maple TA is that the answer checking can reference user-provided Maple programming as well as randomly generated values of script variables. For example, in Figure 3, Maple's numerical solver fsolve is invoked to solve an equation. This feature permits questions that require the student to do non-trivial amounts of computation and programming. It also allows answer checking for questions that are generated on the fly through randomization.

Figures 1 through 6 illustrate a sample question we have developed for our course using Maple TA. The problem is an enhanced version of a problem taken from a calculus text [3, p. 99], which we use in our course to revisit and review pre-calculus concepts used in first-year technical courses. Figures 1 and 2 show the top and bottom portions of the screen that the student sees when they connect to Maple TA and view the question. Figure 3 shows the Maple TA script (algorithm) for the question. It uses random number generation (the Maple TA built-in function range, which selects a randomly generated integer from the specified range) to create a formula. It then invokes Maple to calculate some of the quantities used in the posing of the problem and the answers the student is asked to calculate on their own. Those results are stored in the Maple TA script variables $lowtime, $hightime, and $midTime. Maple is also invoked to dynamically generate portions of the problem's presentation to students: $eq, the textual (Maple syntax) form of the equation to be solved, as well as $qml, the MathML [4] format of the cooling law used for this instance of the problem. Portions of the presentation of the question as developed in the Maple TA question editor are shown in Figures 4 and 5, which also show the placement of the MathML result $qml in the Question Designer.
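Since Figure 3 is reproduced only as a screenshot, the following is a minimal sketch of what the algorithm section of such a question might look like. The variable names, coefficients, and cooling-law formula here are our own invention rather than the actual question's; range and maple are the Maple TA built-ins discussed above, while convert(..., string) and MathML:-ExportPresentation are standard Maple library calls:

    $room = range(65, 75);
    $k = range(2, 9);
    $target = range(140, 160);
    $eq = maple("convert($room + 300*exp(-$k*t/100) = $target, string)");
    $lowtime = maple("fsolve($room + 300*exp(-$k*t/100) = $target, t = 0 .. 500)");
    $qml = maple("printf(MathML:-ExportPresentation($room + 300*exp(-$k*t/100)))");

(The Maple TA scripting language has no comment syntax, so any explanation of such a script has to live outside it, a limitation we return to in Section 6.)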


Figure 1: A Maple TA question, part 1

The student sees the presented form of the question (Figure 1), with the MathML rendered as two-dimensional mathematics. A screenshot from the Question Designer displaying the answer-checking options for one part of the question is shown in Figure 6. It specifies that the student's answer should be compared against the computed reference answer $lowtime. The Question Designer dialog box allows the question author to specify that the student's response should agree with the calculated answer $lowtime to within 0.1% (roughly 3 decimal digits of agreement) in order to be graded as correct. This typically permits minor variations due to floating-point round-off or differing math library implementations. In this way such questions can be used even if computation is being done with hand calculators, a different version of Maple, or another system such as Matlab. (Particular care must be taken in setting up randomized questions that use floating-point computation. Randomization can lead to tricky issues in numerical or mathematical analysis; it is the responsibility of the question author to ensure that the domain of randomly generated questions does not lead to situations where numerical results make fair automatic grading difficult due to ill-conditioning or exceeding the limits of what the math libraries can handle.) Figure 7 describes the software architecture of this simple way of using Maple within Maple TA.

4 Five aspects of question development in Maple TA

1. Dynamic problem creation. Maple TA developers can choose between developing static problems, which are the same each time they are given, and problems whose content may change each time a test or assignment is given to a student. For the latter, we characterize questions as involving either dynamic generation of the problem (e.g., using the random number generation facilities to pick parameter values for a formula as each student's test is given) or dynamic selection (randomly selecting from a list of versions of the problem established by the question author). Of course, questions can combine the two approaches as well.



Figure 2: Maple TA question, part 2

Figure 3: Maple TA question – the algorithm


Figure 4: Initial portion of question in Maple TA question editor

Figure 5: Another portion of question in Maple TA question editor


Figure 6: Maple TA question – an answer checking specification

Figure 7: Schematic of a simple Maple TA problem implementation


(a) Dynamic generation through randomized coefficients. The script in Figure 3 illustrates dynamic generation where randomly chosen numbers are used in formula coefficients and numerical conditions. (The script does not go as far with randomization as it could; for example, in addition to randomly generating each numerical coefficient or exponent in the heating formula, it could also randomly generate the numbers for the beginning and end of "doneness".) Generating a problem then becomes selecting values for each parameter, typically through random number generation. Visualizations (graphs or animations) based on the formulae that thus arise can also be generated dynamically.

(b) Dynamic selection. As an example of dynamic selection, the multiple choice problem in Figures 19-21 selects its options randomly from a collection of correct and incorrect responses built into the problem's algorithm (see the sketch following this item). Maple's combinat[randperm] function is used to scramble the lists of correct and incorrect options. The first four options of the scrambled list $named are selected, and the first element of the scrambled list $unnamed is picked. The "permuting" choice in the question editor specifies that the five options will be randomly permuted between versions of the question. Students taking a quiz more than once, or a collection of students taking the same quiz, will see different options in their versions of the question, with the correct answer's position varying.

(c) Dynamic formula and code manipulation. Maple can perform symbolic manipulation of expressions and of programs. The "first class citizenship" of expression and procedure data types facilitates the programming of such manipulation. This allows operations, not just coefficients, to be changed dynamically; for example, a problem might specify the use of either "greater than" or "less than or equal to" as the selection criterion for searching through a list of values. Symbolic manipulation can also be used to vary the expressions used, or to vary the order or nature of the steps in a solution procedure. This advanced technique is discussed further in Section 8. Varying the problem in this way also typically requires dynamic generation of the problem description, discussed in item 3 below, since such changes typically require going beyond substitution of coefficients to rewording the text of the problem.
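As a sketch of the dynamic-selection pattern in item (b), with option strings invented here for illustration (combinat[randperm] is the Maple function named above):

    correct := ["a correct response", "another correct response"]:
    incorrect := ["plausible wrong response 1", "plausible wrong response 2",
                  "plausible wrong response 3", "plausible wrong response 4",
                  "plausible wrong response 5"]:
    # Scramble both pools, then take four wrong options and one correct one.
    wrongPicks := combinat[randperm](incorrect)[1 .. 4]:
    rightPick := combinat[randperm](correct)[1]:
    options := [op(wrongPicks), rightPick]:

The "permuting" setting in the question editor then shuffles the displayed order of the five options, so neither the option set nor the correct answer's position is predictable.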


2. Answer generation. Whenever problems are being generated on the fly, the problems' answers need to be calculated dynamically as well. We discuss this further in item 3 of Section 6.

3. Presentation generation. A variety of physical situations often have identical or highly similar mathematical models. For example, one can use exponential growth/decay formulas in word problems having to do with baking (as in Figure 1), the voltage of a charged capacitor, time-of-death body-cooling problems, etc. The computational algorithm to generate problems and answers can be used for all of these. Once this is realized, it is a small step to add a little more machinery to generate the text of the problems as well as the mathematical formulae and answers, for example setting up the problem to talk of electricity and units of electrical current, versus fluid flow and units of liters/minute. In this way, problems can be devised that ask students to practice their skill at mapping problem descriptions to mathematical models, rather than just doing calculations. Rather than typing a fixed problem statement into the WYSIWYG Maple TA Question Editor, we include Maple programming to dynamically create a string with HTML coding that presents the question. A Maple TA variable is assigned the result of a Maple printf of the string, and that variable is then referenced in the Question Editor. We can use this technique to dynamically generate tables with formulas or numerical data, or references to dynamically selected JPG figures or animations. The "Spline 3" problem in Appendix D illustrates presentation generation.

4. Grading (answer checking). Automatically checking a student's mathematical answer is not typically just a matter of seeing whether the student input is exactly the same as the solution computed by the author or the answer generator. When numeric results are computed through floating point arithmetic, the non-associativity of floating point operations may produce small variances in student results from applying steps in a slightly different order, or even from changes in the floating-point solvers between different versions of Maple. For these reasons, question authors often decide that anything within a small range around the reference answer should be accepted, using notions of relative error or a certain number of correct digits. If an answer is an algebraic formula, the student may give a mathematically equivalent but syntactically different response, e.g. x^2 - y^2 versus (x - y)(x + y). Fortunately, Maple TA provides built-in options for handling this [12, "A question with Infinitely Many Correct Answers", p. 97; "Numeric Question using the Question Designer", p. 35]. However, there are situations where the built-in support for answer checking does not go far enough. Since part of our curricular agenda is to teach about data structures, some of our computational problems involve answers that are expressed as lists, vectors, or matrices of results. Sometimes the answers involve nested data structures, such as a list of lists. Each item in the structure needs to be checked itemwise against the reference answer, possibly using numerical tolerances, or filtered through equivalence testing. (An example of tolerance checking on structures occurs in $withinPairlist, lines 53-56 of section C.2; a sketch of this kind of checking follows this list.)

5. Grading (procedure testing). Another part of our curricular agenda is to have students construct and use procedures as part of their training in programming. Therefore, some of our questions ask students to supply a procedure as an answer. It can be difficult to determine the functional equivalence of a student's procedure to the reference answer. We usually settle for checking that the student's response agrees (within tolerance) with the reference result when executed on a set of test data. As part of the design of questions with procedure testing, the question author needs to decide whether to reveal all test data in the problem description, or leave some to be revealed after testing is complete. They also need to decide whether the test data and results will be generated statically or dynamically.
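The $withinPairlist procedure itself appears in Appendix C.2; the following is only a sketch of the two kinds of checking described in items 4 and 5, with procedure names, tolerances, and test data of our own choosing. It compares (possibly nested) list answers itemwise within a relative tolerance, and grades a student procedure by comparing its output with a reference procedure's output on test inputs:

    # Itemwise comparison of nested list structures within a relative tolerance.
    withinList := proc(student, reference, reltol := 1e-3)
        if type(reference, list) then
            type(student, list) and nops(student) = nops(reference) and
                andmap(i -> withinList(student[i], reference[i], reltol),
                       [seq(i, i = 1 .. nops(reference))]);
        else
            # max(1, ...) guards the tolerance test when the reference is near zero
            type(student, numeric) and
                abs(student - reference) <= reltol * max(1, abs(reference));
        end if;
    end proc:

    # Grade a student-supplied procedure by running it on a set of test inputs
    # and comparing its results to a reference procedure's results.
    gradeProc := proc(studentProc, referenceProc, tests::list)
        andmap(t -> withinList(studentProc(op(t)), referenceProc(op(t))), tests);
    end proc:

For example, gradeProc(sp, L -> min(op(L)), [[[3, 1, 2]], [[9, 9, 0]]]) tests a student's minimum-finding procedure sp on two input lists.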

5 Anti-gaming strategies and the complexity of question authoring

Any on-line system, when used in a loosely supervised way outside the classroom, can be subject to student manipulation of results. Here are some of the gaming strategies we have considered, and how we dealt with them in Maple TA.

5.1 Static versus dynamically created questions

Statically generated questions can be answered without knowledge of the problem-solving process if the student prepares by memorizing or copying the answer of someone else who has already taken the test and knows that their results were graded as correct. Randomization ensures that the problem-solving process (rather than "character string copying") must be applied to each instance of student work.

5.2 Avoiding the "manual processing" strategy

As mentioned in item 5 of Section 4, the difficulty of automatically evaluating the correctness of code naturally leads question authors to ask students to input the results of executing the procedure, which are typically much easier to grade automatically. However, in our course about computation, the question author needs to set up the question in a way that requires the student to do the desired execution on a computer. For example, when asking the student to write a procedure to find the minimum of a list of numbers, it is important to avoid giving as the only tests lists that have just a handful of items. Given the amount of time it may take a novice to write a program, scanning a list of even fifty or 100 items for the minimum will often take less time than writing a program to do the processing. Giving very long lists (e.g., hundreds of items) changes the balance. This requires more work in presentation to ensure that the tests are clearly presented.
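As a small sketch of this balance (sizes and value ranges invented here), a test list can be generated on the desktop that is far too long to scan by eye but trivial for a correct procedure to process:

    # A 500-element list of random integers; manual scanning is unattractive,
    # but a student-written minimum-finding procedure handles it instantly.
    r := rand(1 .. 10^6):
    testList := [seq(r(), i = 1 .. 500)]:
    referenceAnswer := min(op(testList));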

5.3 When feedback is a deterrent to learning – discouraging solution by exhaustive enumeration

Usually feedback on errors is a good thing, as it helps students learn how to avoid them. However, the presence of feedback changes the way questions must be authored. The nature of grading in Maple TA means that the student is given either full credit or no credit at all for a particular graded item. Since Maple TA cannot see the work done to produce the answer, and it is difficult to get it to detect when an answer, while wrong, is close to correct, it is not possible to give partial


credit. This places the grader in the position of giving no credit for answers that would be quickly corrected in "real life" where there is feedback about syntactic correctness. For example, requiring completely accurate input the first time would give no credit to someone who, through imprecise mouse motion, omits to paste in the leading bracket ( [ ) of a long list of items copied in from an answer calculated in a Maple worksheet. To avoid extra work manually overriding such "unfair" evaluation, we typically set up assignments to allow indefinitely many retries with "how did I do" feedback turned on. While this makes it reasonable to insist that the student try until they get things completely correct, we have found that it allows students to answer questions by exhaustive enumeration rather than the desired problem-solving process. For example, if the question is "for how many of the following list of times does the simulation return a positive result?", it may be known that the answer must be between 0 and 15 if there are fifteen numbers in the list of times. The question could be answered by the student trying the possibilities exhaustively, using "how did I do" after entering each of 0, 1, 2, etc. While exhaustive trial is a useful problem-solving strategy in some situations, we don't want it to significantly distract students from learning about the modeling and simulation that is our course domain. This strategy can be applied even when the desired answer is a floating point decimal number. For example, if a problem gives a function described by a formula and then asks "at what value of x does the function achieve its maximum?", a student could come up with an imprecise estimate of the result just from real-world considerations associated with the problem. If the tolerance is set too broadly (say, two digits of accuracy), then the student can come up with a correct answer by exhaustively trying every two-digit possibility in the range. This says that problems set up this way are not the right way to use Maple TA to ask the student to demonstrate that they can use an optimization routine. One strategy is to ask for two or more answers at once in a list, and use the greater number of possibilities to reduce the chances of successful guessing. For example, if we break up the list of 15 times into a list of 7 and a list of 8, and require an answer such as [2,3], then guessing must cover 56 possibilities rather than just 15. (We have seen students work through dozens of possibilities in an exam situation, so even 56 may not deter the desperate.) We could probably reduce the amount of such behavior if Maple TA had a way of limiting the number of retries to some reasonable value between "none" and "infinite", but until that happens, we aim to write questions that make exhaustive trial an unlikely strategy for success. In addition to asking for numerical answers to sufficient accuracy, one can ask for multiple answers in a single list, which increases the number of possibilities combinatorially. (Other possible policies that would simplify anti-gaming: allowing grading to take multiple answers into account, or allowing assignment/quiz authors to turn off "how did I do" selectively for some questions.)

5.4 Anti-gaming strategies require more elaborate programming

Our anti-gaming measures typically rely on dynamic creation and more elaborate answer checking, both of which increase the amount of programming that goes into a Maple TA question. This has naturally led us to consider programming practices that lessen the costs of Maple TA question development. In the next sections, we discuss these practices.

6 Software engineering considerations in the development of Maple TA questions

We observe the following limitations in developing questions that use Maple embedded in a Maple TA script and entered or edited in the Maple TA question development environment:

1. Multi-line Maple procedures need the conventional tools for textual software development. Extensive amounts (dozens or even a few hundred lines) of Maple code may be needed for some of our questions. It is difficult to read or maintain that amount of code in the Maple TA question development environment, since source code manipulation is limited to the point-and-click capabilities of its text editor, which lacks tools such as regular expression search, global replacement, and other code reformatting or refactoring features.


2. Multi-line procedures need documentation. The Maple TA scripting language does not yet have the ability to comment code, except through workarounds such as defining dummy script variables whose string values contain comments. Maple source code allows conventional programming comments.

3. Random creation of problems needs support for automated testing. Dynamic selection or generation of questions often requires non-trivial testing of a large number of cases. We have found that despite our best efforts at analysis, it is not easy to avoid randomly generated coefficients that result in nonsense versions of a problem (versions whose solutions would not describe a physically realizable situation, equations that have no solution, equations that the Maple solvers find intractable or solve only after a large amount of time, etc.). Testing is needed to have confidence that problem generation almost always results in a solvable and sensible version of the problem. The Maple TA question development environment, like much of Maple TA, has a GUI-driven, user-operated philosophy involving multiple clicks to navigate through question generation, algorithm/script execution, test answer entry, and answer checking/grading. Manual case-by-case testing is uneconomically labor-intensive for most organizations. The bottom line is that more elaborate problems need an environment that supports automated testing.

4. More robust answer checking needs support for automated testing. If student answers are data structures or Maple procedures, as they are in our course, then the built-in tools for answer checking and student feedback must be supplemented with author-supplied procedures. For example, if a student is expected to enter a list of 1000 rational numbers as the answer to a problem, then it is usually insufficient to tell them "no, you didn't enter a correct answer". They may have forgotten to include the opening or closing bracket of the list, put in a trailing semicolon, or included some floating point numbers because they haven't learned that these are different entities than exact fractions. Adequate testing of the answer-checking algorithm requires that it be given a variety of test cases, to ensure that it doesn't inadvertently grade an off-beat or unusual answer as correct. This leads to more need for automated multiple-case testing.

5. Extensive use of random numbers in problems benefits from control of the random number seed. Debugging questions that use random number generation requires access to and control of the random number seed, so that it is easy to reproduce the situation where bugs arise. Maple's random number generation facilities (e.g., randomize) allow such control, although apparently Maple TA's facilities such as range do not [12, p. 74, pp. 81-87]. We have adopted the practice of setting the seed at the start of the Maple TA script by calling rint, a Maple TA procedure designed to produce random values. Unlike Maple's randomize, Maple TA's rint guarantees that different random number sequences are used even in concurrent student sessions [9]. We try to ensure that subsequent random number usage is based on the original seed.

6. Development of multi-line Maple procedures needs access to Maple-generated error messages. Maple TA scripts do not show error messages arising from Maple computations performed within them [14, pp. 140-145]. For example, running $result=maple("solve(x=x/0,x)"); results in the message java.io.IOException: Maple computation error, rather than the more descriptive message numeric exception: division by zero that results when the solve is executed in Maple itself. More seriously, Maple TA does not indicate where within a multi-line Maple procedure an error occurs. This leads one to want to do most debugging of Maple code outside, in the ordinary Maple environment, where the full diagnostic information from error messages can be seen. (A change in how Maple TA deals with Maple errors would change the balance of costs and benefits between developing in the Maple TA environment versus ordinary Maple. Some changes, though worthwhile, might be expensive to implement, as they would require significant changes to Maple's run-time system. Changes could also prove inadequate if they haven't been verified to meet the needs of developers.)

These considerations lead us to conclude that while Maple TA scripting is an environment conducive to the development of problems handled by a few lines of Maple TA or Maple code, more complicated questions should be developed in the ordinary Maple development environment, where most of the testing can then occur. Only after most of the code has been developed should a move to the Maple TA execution environment


be attempted, based on the assumption that such movement won't require major code revisions. Some testing within Maple TA is still necessary to check appearance and the correct invocation of Maple, of course. In the next section, we discuss a technique for doing this: Maple programming using Maple library modules.

7 Creating questions in Maple TA with Maple library modules

The use of Maple library modules in Maple TA questions provides a way to address the development and testing issues described in Section 6. While Maple TA has features specifically included to encourage the use of library modules in questions, the methodology is not well explained in the Maple TA User Guide. In the next few sections, we present details of how to use modules for questions. We also present further issues that we have seen arise when this style of Maple programming is used.

7.1 Creating a Maple module in a library file and testing it on the desktop

Briefly, the method is:

1. Create the Maple procedures for your Maple TA question and package them together as a Maple module on your own computer [5, Chapter 8, "Programming with Modules", pp. 323-364]. The procedures in the module should provide the means for problem generation, question presentation, and answer checking. Use the standard Maple development environment to develop the source files defining the module. The procedures you will want to invoke from the Maple TA script should be exported members of the module.

2. Create the library files (.lib and .ind) for the module using the LibraryTools package from the Maple library, and then upload them to the Maple TA server. See [5, Ch. 10, "Writing Packages", pp. 387-404] for a description of setting up a library package.

3. Develop test suites for your Maple module and run them on the desktop. These tests can check the correctness and timeliness of randomized problem generation, question presentation, and the robustness of answer checking. Appendix E describes a shell script that automatically includes the Maple code from LibraryTools to do this; it also automatically invokes the library on designated tests after the library has been created. A sketch of all three steps follows.
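The module body below is an invented stand-in (we reuse the name Sensor2 from the example in Appendix C.3, but not its actual code); LibraryTools:-Create and LibraryTools:-Save are the Maple library commands for building the archive, though the details of our usage should be checked against the LibraryTools documentation for the Maple version at hand:

    # Step 1: package the question's procedures as a module with exported members.
    Sensor2 := module()
        export problemGen, checkAnswer;

        # Generate one version of the problem from a seed: coefficients,
        # question text, and reference answer.
        problemGen := proc(seed::posint)
            local a, b;
            randomize(seed);          # seed control makes failures reproducible
            a := rand(2 .. 9)();
            b := rand(20 .. 80)();
            [a, b, sprintf("Solve %a*x = %a for x.", a, b), evalf(b/a)];
        end proc;

        # Accept any student answer within a relative tolerance of the reference.
        checkAnswer := proc(student, reference, reltol := 1e-3)
            evalb(abs(student - reference) <= reltol * abs(reference));
        end proc;
    end module:

    # Step 2: create the library files and save the module into them.
    LibraryTools:-Create("Sensor2.lib");
    LibraryTools:-Save('Sensor2', "Sensor2.lib");

    # Step 3: a (very small) desktop test suite over many seeds.
    for s from 1 to 1000 do
        spec := Sensor2:-problemGen(s);
        if not (type(spec[4], numeric) and spec[4] > 0) then
            printf("seed %d generated a bad version: %a\n", s, spec);
        end if;
    end do: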

7.2 Using the Maple module in a Maple TA question

Once the module has been developed and tested, the technique for using it in a Maple TA question is:

1. To make the package accessible from Maple TA, it must be uploaded to the developer's Maple TA server. Using the Web Editor feature of Maple TA (see [12, pp. 67-68]), create a folder on the server for the library (see Figures 8 and 9). Then upload the .lib, .ind, and any other supporting files for the question to that folder on the Maple TA server (Figures 10 and 11). The code is not directly readable by student users of the Maple TA server, so details of the problem can be securely stored on the server. The Web Editor feature allows uploading and automatic unpacking of .zip files, so one can often save a few clicks by creating a .zip file containing the .lib and .ind files together, along with any other files, such as .gif graphics files, for the question. The shell script described in Appendix E automatically creates such a .zip file with the .lib and .ind files after they have been created.

2. To use the module in a Maple TA question, use the Maple TA Question Editor to begin creating your question. Use the GUI interface of the Question Editor to select the .lib file of the module in the directory you created in step 1. This will insert the proper name through its point-and-click interface, as illustrated by Figure 12. Once the name has been revealed, you can textually copy and paste the reference to the module into subsequent invocations of module procedures within the Maple TA script, using the editable text fields displaying the question's algorithm. A sketch of the resulting script appears below.
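With the module in place, the algorithm section of the question can shrink to a few lines. The following sketch uses the hypothetical Sensor2 module from Section 7.1; the exact form of the repository reference that ties Sensor2 to the uploaded .lib file is inserted by the Question Designer as described in step 2, so it is not shown here:

    $seed = 1 + rint(32767);
    $problemSpec = maple("Sensor2:-problemGen($seed)");
    $qtext = maple("($problemSpec)[3]");
    $answer = maple("($problemSpec)[4]");

The script seeds the randomization with rint (see item 5 of Section 6), hands the seed to the module's generator, and then pulls the question text and reference answer out of the returned list with further Maple calls, the disaggregation pattern discussed in Section 10.1.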


Figure 8: Maple TA Web Site editor. Folder Creation icon highlighted in pink; upload icon next to it.

Figure 9: Sensor2 directory after it’s been created. Subfolder, upload, and delete icons highlighted in pink.

Appendix C.3 lists the code for “Sensor2”, the name of the module encapsulating the computation for a problem. Appendix C.2 shows a complete Maple TA script that references procedures from a module stored on a Maple TA server.

8 An example of symbolic manipulation used in dynamic creation of questions

Maple permits symbolic manipulation of expressions and of programs. This allows dynamic generation to go beyond substitution of numerical coefficients. Symbolic manipulation can also be used to vary the expressions used, or to vary the order or nature of the steps in a solution procedure. This typically requires dynamic generation of the problem description as well, since such changes usually require rewording the problem rather than just substituting numerical values into formulae or the problem statement. The possibility of creating significantly different problems through this kind of manipulation means that the question author usually needs to do more analysis to avoid variants that are too difficult or too easy. Figures 13 and 14 give two versions of a problem created dynamically.

Figure 10: A zip file being uploaded

Figure 11: After zip file uploaded. It contained .lib and .ind files, as well as copies of source to the questions and tests.


Figure 12: Maple TA question designer with reference to module Sensor2

The designer has specified that the variable $problemSpec should be assigned the result of the Maple computation Sensor2:-problemGen($seed), where $seed is the value of the random number seed established in a prior step of the Maple TA script. The code for Sensor2 is in .lib and .ind files on the Maple TA web server in the LIB location, which has been generated using a pop-up window (not shown) invoked by clicking on the Maple Repository button.

Not only do the two versions in Figures 13 and 14 use different numbers in the list and in the selection condition, they also mandate different selection criteria. The problem generator must manipulate the code templates for the solution, substituting in the dynamically selected operation that performs the specified selection. It then invokes the instantiated code template to generate the numerical answers (a sketch of this technique appears below). The anti-gaming considerations mentioned in Section 5 led us to create lists that are long enough that the problem cannot easily be solved just through visual inspection, but short enough to be easily copied between the browser and whatever screen or file the calculation is using for input. Appendix B lists the Maple TA and Maple programming for this question.
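The code for this question is listed in Appendix B; the following is only a schematic illustration of the template-substitution technique, with a template, placeholder name, and data of our own invention:

    # A solution code template whose comparison operation is the placeholder _OP.
    template := proc(L::list, c)
        select(x -> _OP(x, c), L);    # keep the items satisfying the criterion
    end proc:

    ops := [`>`, `<=`]:
    chosen := ops[rand(1 .. 2)()]:                      # dynamically select the operation
    instance := subs('_OP' = chosen, eval(template)):   # instantiate the template
    answer := instance([3, 8, 1, 9, 4], 5);             # [8, 9] if `>` was chosen

The subs into eval(template) exploits the "first class citizenship" of Maple procedures mentioned in Section 4: the instantiated procedure is an ordinary Maple value that can be invoked immediately to produce the reference answers.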

9 Software engineering advantages of question development through modules

We have found that most of the question development described in Section 4 can be carried out with Maple programming rather than Maple TA programming. If the Maple programming is placed in modules, then the Maple TA script for a question can be reduced to initializing a few key values (typically the random number seed and the specification of problem size/difficulty parameters) and disassembling the results from Maple into individual Maple TA variables that are used in the presentation or in answer checking. If this is done, then most of the debugging and testing can occur on the question developer's desktop machine in the conventional program development environment. Using this environment has the following advantages over the Maple TA question developer:

1. We can use a full-featured text editor (global search and replacement, reformatting tools, etc.) to develop the Maple programming.

2. We can use shell scripts to automate compound tasks such as module creation and testing. Our shell script makelib2.sh (see Appendix E) is an example of this. It automatically combines module creation with execution of a test script for the module.

3. We gain access to Maple parse-time and execution-time error messages (recall item 6 in Section 6) during testing.

Figure 13: A temperature calculation


Figure 14: Another version of the temperature problem


Figure 15: Schematic of a Maple TA problem invoking a Maple module

This reduces an iteration of edit-generate-test for the Maple programming of a question to a matter of seconds, compared to the minute or two it takes to upload Maple programming to a Maple TA server and then manually execute a test in Maple TA. Even though Maple lacks an integrated development environment such as Eclipse [8] or extensive tools for testing, we have become more productive. We note that making most of the question's source code executable on the desktop does not address all testing needs. The Maple TA script and answer-checking specifications must still be tested. Typically this must be done manually through the standard Maple TA interface, although web-testing harnesses such as Selenium [16] hold promise for further testing automation in the future. In addition, the presentation of the problem description must be checked for technical correctness, and the problem as presented must be verified to work well for the intended (student) audience in the production environment. For the foreseeable future, some amount of manual testing and feedback from human testers will always be necessary.

10 System limitations and programming workarounds

While Maple modules facilitate ambitious question authorship, there are downsides to larger computations in Maple TA questions. In this section, we discuss two of the more significant ones we have run into.

10.1 The work of disaggregation

Maple TA limitations make it difficult to return multiple results from Maple without a tedious sequence of additional operations. The natural impulse in problem generation is to have a single procedure that generates all or many parts of the answer and returns everything needed to Maple TA in a single data structure such as a list. Using switch [14, p. 232], parts of the data structure can be assigned to individual Maple TA variables and used for presentation text, answers, answer-checking algorithms, etc. While this works in simple cases, the Maple TA scripting language makes it difficult to disassemble aggregated information from Maple when results contain subparts, such as a list of lists. For example, in Figure 16, a Maple result consisting of a list of lists is created and assigned to the Maple TA script variable $result. For the first few operations, the Maple TA operation switch can be used to extract a part of the result computed by Maple. However, the third item of the list is itself a list, and part extraction by Maple TA gives only a piece of it. The fourth item extracted through switch shows that Maple TA uses fundamentally different semantics from Maple, in that "e" is interpreted as the base of the natural logarithm and approximated, rather than treated as a symbol as it is in Maple. The workaround is to pass the result back to Maple and extract the part there, as occurs with $r3m in Figure 22, but this involves the overhead of another invocation of Maple; a sketch of this workaround appears below. Our difficulties would be eased significantly if Maple TA adopted a feature allowing multiple assignment from a Maple result. We discuss our techniques for facilitating disaggregation in Section 10.5.

It is also difficult to break a large Maple computation into several parts in the Maple TA execution environment. Conventional software engineering says that a good way to handle task complexity is to decompose it into separate parts, and to execute each part in sequence, or in parallel, as appropriate. However, invocation of Maple in Maple TA is "stateless": each call to Maple starts with a fresh copy of Maple. Thus prior results must either be recomputed within the subsequent call, or be fed back into Maple from Maple TA variables that recorded the previous invocation's results. We have not yet found a convenient way of doing this kind of daisy-chaining that avoids tedious amounts of coding and non-trivial computational costs.
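As a schematic sketch of the round trip (the list here is invented, and our rendering of switch is only illustrative; its exact semantics are described in the Maple TA manual [14, p. 232]):

    $result = maple("[42, 3.14, [5, 7], x + y]");
    $r1 = switch(0, $result);
    $r3m = maple("($result)[3]");
    $r3first = maple("($result)[3][1]");

The first extraction is done on the Maple TA side; the last two pass $result back into a fresh Maple session so that Maple's own indexing semantics apply, at the cost of an extra Maple invocation each.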

10.2 Three operational limits to returning Maple results to Maple TA

In addition to the "pain of coding", we have encountered three limits on problem generation results:

1. An ambitious Maple computation may run too slowly in a production test-giving environment. If the author of a question gets sufficiently ambitious, generation and answer checking for a question can take several seconds of CPU time. In our production environment, where over 100 students may be taking a quiz at the same time on a dual-processor server, any question that takes more than a second of system time to produce may cause noticeable delays in response to quiz-takers. Lengthy disaggregation or daisy-chaining sequences that pass large results back and forth between Maple and Maple TA cause further slowdowns. We discuss our techniques for dealing with good problems that take too long to run in Section 10.4.

2. Maple results, if too large, cannot be returned to Maple TA. There is a currently undocumented (and hence, to the Maple TA question author, unknown) limit on the size of results that can be returned from Maple to Maple TA. The limit appears to be less than a hundred kilobytes. This means that some things that can be computed in less than a second of CPU time on a contemporary computer system cannot be used in a Maple TA question. We have run into this limit with elementary list-processing programming problems with non-"toy" input sizes. We have also encountered it when generating Maple animations used for question presentation. We discuss our responses in Sections 10.6 and 10.7.

3. It is difficult to return enough diagnostic information to students when running tests on student-submitted procedures. As mentioned in item 6 of Section 6, there is no easy way to convey Maple run-time or parse-time error messages for student-submitted programs. While the programmable feedback mechanism in Maple TA allows one to provide some automatic guidance (e.g., that the type of result returned by the student's procedure does not meet the type specifications, or that certain tests are failing), there are few tools to make this easy.

10.3 Workarounds to avoid grading code

Even though a major objective of the course is to get students to develop their own computational scripts and procedures to solve technical problems, Maple TA does not strongly support grading student code. First, the "Maple syntax" entry box for answers does not accept scripts, that is, multi-line sequences of code that are not Maple procedures. Secondly, as mentioned in item 6 of Section 6, Maple TA does not provide diagnostic messages when student-submitted procedures are executed during answer checking. Thus students have little information about why their code is rejected if it has syntax or run-time errors. While one could argue that this reinforces the lesson that students need to be responsible for getting their code to work before submitting it, the reality is that hundreds of students come for help during consultation hours because they don't think they have the means to figure out on their own why their code isn't working. (There are also limitations in the kinds of procedures that Maple TA can accept. For example, procedures that contain the double-quote character, such as any procedure that uses a string constant, cannot be given as input to Maple TA even if they are valid procedures in Maple.)


Figure 16: Maple TA script extracting parts of a list

Because of this, we have adopted the strategy of asking questions in a way that requires execution of code only on the student's machine rather than on Maple TA. In other words, we ask the student about the results of running the program (or about implications of the results). We do not completely avoid submission of code to Maple TA, but it becomes feasible for our large class only once students have reached the point where basic procedure creation and execution is not a major problem for most of them. Having the ordinary run-time and parse-time messages from Maple displayable from Maple TA when grading answers given in Maple syntax would allow us to shift back to asking code and code-testing problems more straightforwardly.

10.4 Workarounds for lengthy generation times

It may not be necessary to generate problem versions on the fly. Pre-computing several dozen different versions, storing them on the server, and using dynamic selection may meet pedagogical and exam security needs. Pre-generation requires an extra layer of Maple and shell programming to drive the generation and storage of results. This extra work is balanced by an advantage of pre-computing: one can exhaustively test the problem versions, rather than trying to show through theoretical analysis or statistical sampling that the generation process is without fault. A sketch of such a pre-generation driver appears below.

Retrieval of pre-computed problems may create its own timing/performance problems. If one wanted 1000 different versions of a problem, with each version's description needing 10 KB of file space, then dynamic selection would require reading in and parsing/processing 10 MB of Maple expressions. We experimented by generating Maple lists of integers of varying sizes and timing how long it took to generate a simple question that involved the list. Table 1 shows that the time for file I/O and Maple TA processing for files of this size is significant. Using questions that access stored data in a production environment with many simultaneous students could lead to response problems, or to time-outs without a quiz being generated. If it takes too long to read in all the precomputed problems, the approach can be refined by spreading the data for the problem versions across multiple files. Dynamic selection would then include computing the name of the file containing the problem to be selected.
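As a sketch of a pre-generation driver (reusing the hypothetical Sensor2 module of Section 7.1; the names and the number of versions are arbitrary):

    # Pre-compute 50 versions of the problem on the desktop, where they can be
    # exhaustively tested, then store them in a library for upload to Maple TA.
    versions := [seq(Sensor2:-problemGen(s), s = 1 .. 50)]:
    LibraryTools:-Create("Versions.lib");
    LibraryTools:-Save('versions', "Versions.lib");

On the Maple TA side, the question's script would then pick a random index, say $v = range(1, 50);, and fetch that version with a call such as maple("versions[$v]").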

10.5 Workarounds for limits to the size of results that can be returned from Maple

There are undocumented limits to the length of results returned from Maple to Maple TA. We have encountered these limits when generating animations during question generation. Often it is not possible to return multiple animations in a single result. To work around this, we have either resorted to daisy-chaining the problem generation (see Section 10.1) or to precomputation (Section 10.4). Either way, there are additional costs in programming.


Length of list in question          10^4        10^6        10^7
Size of .lib file (kilobytes)       89.2        8,796.5     87,958.8
Size of .ind file (kilobytes)       40.9        40.9        40.9
Size of .zip file (kilobytes)       66.5        6,562.1     65,615.2
Time to generate assignment         < 2 sec     ~3 sec      ~21 sec

Table 1: Time to generate a question reading data from a Maple library file stored on Maple TA. Details of this experiment are given in Appendix F.

10.6  Workarounds due to length limits in answers returned to Maple TA from Maple

In Section 5 we discussed the need to use larger problem sizes when the subject of the question is a simple-to-understand computational task. While Maple TA makes it easy to grade numerical or formula answers, it does not make it easy to grade code answers. The issue is that if one checks test results without inspecting the student's code, there is no explicit proof that the student used computation to get the answer: small problem instances for simple computational tasks can be done by hand. But if one sets up a data-structuring problem whose solution is a list of 50,000 floating point numbers, there is currently no simple way to get the answer computed by the problem generator in Maple into Maple TA so that it can be used to check the student's answer. The most obvious ploy is to generate smaller problems. However, for some classes of problems, where the programming activity is non-trivial even if the motivation is not, it is important to have problems and results large enough that computation cannot be avoided. For example, if one gave a sorting, searching, or counting exercise with only ten items in it, the student could do the problem manually, "by inspection"; a problem requiring the processing of several thousand items necessitates computation. One hopes to find problems that are large enough to force computation but small enough not to run into the Maple–Maple TA size limitation; if a problem is too large, the only recourse is to shrink it until it fits. Even if the problem description fits within the limit, the answer may still be too large to pass from Maple to Maple TA. However, it may be possible to be reasonably certain that the student's answer is correct even without access to the entire answer. For example, one could check a randomly generated subset of several items from the student's answer. The answer generator can easily return the indices of the subset as well as the corresponding items from its reference answer. This kind of probabilistic checking is obviously different from what we would do with a mathematics problem having a short fixed answer, but it reflects the different needs of a computational course that demands the solution of problems that cannot be done with paper and pencil.
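A sketch of how such a check could be generated on the Maple side follows; spotCheck is an illustrative helper, not part of our published modules.

# Illustrative probabilistic checker: instead of shipping a 50,000-item
# reference answer to Maple TA, return a few random probe indices and the
# reference values at those positions.
spotCheck := proc(refAnswer::list, numProbes::posint)
    local idx;
    idx := sort(combinat[randperm](nops(refAnswer))[1 .. numProbes]);
    return idx, [seq(refAnswer[i], i in idx)];
end proc:

Grading then compares the student's submitted list only at those indices; an answer that agrees at, say, ten randomly chosen positions is unlikely to be wrong elsewhere unless it was fabricated by hand.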

10.7  An example of using workarounds: sizable animations

We have on occasion generated animations to include as part of the narrative of a question, to help the student better understand the problem being solved. However, we have found either that the animation generated is too large to send between Maple and Maple TA, or that it takes too long to generate dynamically. Figure 17 shows an example. The problem builds off a lab exercise where students learn how to write control programs for a model robotic car moving around an arena with obstacles (shown in green) and targets (shown in blue). The current position of the car is shown in red, and the path taken so far in the simulation is shown in grey. The object of the problem is to write a control program that sends the car from its starting point through the hole in the barrier to a post, where it then turns left and heads towards a target. Two examples of the desired control program's behavior on test cases are given. Each case is presented as an animated gif, displayed in the web browser as a continuous animation of a few dozen frames. We have found it too time-consuming to generate these animations on the fly.


Furthermore, the Maple–Maple TA size limit would not let us transmit the animations through the Maple–Maple TA connection even if we were to compute them in this way. We have handled this limitation by combining the ideas of the previous few sections: pre-compute the animations as animated gif files, storing them in separate files whose names differ only in an index. The problem generator dynamically selects the version of the problem; the names of the files are assigned to Maple TA variables and referred to in the presentation. Maple TA does not appear to have performance issues sending the animated gif files to the student's browser. The Maple code segment in Figure 18 generates an animation in Maple and saves it as an animated gif in the specified file, after printing the full path of the gif file at the terminal. This segment is part of a larger script in which the value of id1 is varied as the various animation cases are generated. The image tool in the Maple TA question editor can be used to display a file whose name is generated dynamically (e.g. $gifLocation). The Maple TA code randomly selects values for id1 and numTrials from the permissible ranges and generates the file name of the appropriate gif. These values are also given to the Maple module for the problem, allowing it to retrieve the other information necessary for presentation and grading of the selected problem.
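In Maple TA script form, the selection step can be as simple as the following sketch; the ranges and the file-name pattern are illustrative and would have to match whatever the pre-generation script actually produced.

$id1=rint(1,5);
$numTrials=rint(10,20);
$gifLocation=maple("printf(convert(cat(`SecondTurn1-`, $id1, `-`, $numTrials, `.gif`), string))");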

11  Conclusion

We have presented a way of developing Maple TA questions so that Maple's mathematical, random number generation, and text processing capabilities replace most of the use of Maple TA. This reduces the role of Maple TA to labeling results computed through Maple, generating the original random number seed, and providing its GUI for question layout and answer checking. Software engineering advantages include a better organized and documented code structure, access to better code development tools, and more easily automated testing. We have also discussed some of the issues in developing good Maple TA questions for our course – anti-gaming issues, and workarounds for size and performance limitations. We believe that the advantages of the approach outweigh the work needed, but we look forward to system enhancements that would further facilitate such software development.

Acknowledgments

We gratefully acknowledge the help of Mark Boady in the preparation of the coding examples used in this paper. The Department of Computer Science of Drexel University (Jeremy Johnson, department head) provided the support for the development of the course materials. Jeremy Johnson and David Augenblick provided thoughtful feedback on the course, the course questions, and this paper. Carl Hickman and others in technical support at Maplesoft have answered many questions from us, including some on the technical details of creating and using Maple modules in Maple TA.

References

[1] D. Alpert and D. L. Bitzer. Advances in computer-based education. Science, 167(3925):1582–1590, 1970.
[2] John R. Anderson, Albert T. Corbett, Kenneth R. Koedinger, and Ray Pelletier. Cognitive tutors: Lessons learned. Journal of the Learning Sciences, 4(2):167, 1995.
[3] Howard Anton, Ira Bivens, and Stephen Davis. Calculus. Wiley, 8th edition, 2005.
[4] Ron Ausbrooks et al. Mathematical Markup Language (MathML) Version 3.0. Technical report, W3C, June 2011.
[5] L. Bernardin and others. Maple 15 Programming Guide. Maplesoft, 2011.
[6] Peter Bradford, Margaret Porciello, Nancy Balkon, and Debra Backus. The Blackboard Learning System: The Be All and End All in Educational Instruction? Journal of Educational Technology Systems, 35(3):301–314, 2007.


Figure 17: Example of gif animations used for an assignment problem in Maple TA

Figure 18: Maple code segment for saving an animated gif in an indexed file

filename := cat("SecondTurn1-", id1, "-", numTrials, ".gif");
print(currentdir(), filename);
plotsetup(gif, plotoutput=filename);
print(interface(plotdevice), interface(plotoutput));
print(carMovie(stateTable));


Figure 19: A randomized multiple choice question

[7] Bruce W. Char, Gregory J. Fee, Keith O. Geddes, Gaston H. Gonnet, and Michael B. Monagan. A tutorial introduction to Maple. J. Symb. Comput., 2:179–200, June 1986.
[8] Eclipse Foundation. Eclipse documentation (3.7 Indigo), 2011.
[9] Carl Hickman. Personal communication. Multi-threaded/multi-process nature of rint versus Maple random number facilities, July 2011.
[10] Blackboard Inc. Blackboard Learning System designer and instructor reference – Vista Enterprise License (release 8). Manual, documentation version 8.0.0, 2006.
[11] Kirk L. Kroeker. Celebrating the legacy of PLATO. Communications of the ACM, 53(8):19–20, August 2010.
[12] Maplesoft. Maple T.A. 6 User Guide. Maplesoft, 2010.
[13] Maplesoft. Maple 15 User Guide. Maplesoft, 2011.
[14] Maplesoft. Maple T.A. 7 User Guide. Maplesoft, 2011.
[15] Peter Pirolli and Margaret Recker. Learning strategies and transfer in the domain of programming. Cognition and Instruction, 12(3):235, September 1994.
[16] Selenium Project. SeleniumHQ web application testing system – Selenium documentation. http://seleniumhq.org/docs/.
[17] Respondus. Respondus 4.0 User Guide. Respondus, Inc., July 2011.

A  A dynamically selected multiple choice question

A.1  Problem description

The problem tests whether students have read a particular passage in the course readings. The question asks for the selection of one correct answer out of several possibilities. The question author creates lists of correct and incorrect answers. When a version of the problem is generated for a student, correct and incorrect answers are randomly selected from the lists. In the question layout the incorrect answers are listed before the correct one, but the question format specifies that the student will see them in randomly scrambled order, making it difficult to predict the position of the correct answer.


A.2  A screenshot of the question

A.3  Maple TA script selecting options

$seed=rint(10,100000);
$named=["Python", "Matlab", "C", "Mathematica", "C++", "Java", "MuPad", "Octave", "Sage", "Macsyma", "Axiom"];
$unnamed=["C Sharp", "Lisp", "Visual Basic", "Perl"];

Figure 20: Maple TA answer layout for multiple choice question


$named=maple("[`Python`, `Matlab`, `C`, `Mathematica`, `C++`, `Java`, `MuPad`, `Octave`, `Sage`, `Macsyma`, `Axiom`]");
$unnamed=maple("[`C Sharp`, `Lisp`, `Visual Basic`, `Perl`]");
$pickfour=maple("StringTools[Randomize]($seed); combinat[randperm]($named)");
$pickFunction=maple("(p, i) -> printf(convert(p[i], string))");
$a=maple("apply($pickFunction, $pickfour, 1)");
$b=maple("apply($pickFunction, $pickfour, 2)");
$c=maple("apply($pickFunction, $pickfour, 3)");
$d=maple("apply($pickFunction, $pickfour, 4)");
$pickone=range(0,3);
$e=maple("StringTools[Randomize]($seed+1); printf(convert(combinat[randperm]($unnamed)[1], string))");

A.4  Maple TA specification of answers

B  The Temperature problem

B.1  Description of problem

This problem selects a computational task having to do with temperatures. A typical task might be "given the following list of daily temperatures, count the number of days where the temperature was at least 0.5 degrees Fahrenheit warmer than the previous day". The temperature difference is randomly generated, and whether the counting comparison should be "greater", "less than", "greater than or equal to", etc. is randomly selected.

Figure 21: Maple TA answer layout for multiple choice question


B.2  Maple TA script for temperature problem

$seed=rint(1,10000);
$v=310;
$M=60;
$A=40;
$P=365;
$N=310;
$picks=maple("randomize($seed); StringTools[Randomize]($seed); combinat[randperm](4);");
$diffTarget=maple("randomize($seed+1); RandomTools[Generate](float(range=1.0..3.5, digits=2))");
$pick1=switch(0, $picks);
$TV=maple("temperatures1:-genProblem($v, $seed, $M, $A, $P, $N, $diffTarget), libname=/mapleta/web/SanForCs002/Public_Html/temperatures1/temperatures1.lib");
$count=maple("temperatures1:-countProc($diffTarget, $pick1, $TV), libname=/mapleta/web/SanForCs002/Public_Html/temperatures1/temperatures1.lib");
$avgCount=maple("temperatures1:-avgProc($diffTarget, $pick1, $TV), libname=/mapleta/web/SanForCs002/Public_Html/temperatures1/temperatures1.lib");
$pickmsg=maple("printf(temperatures1:-pickI($diffTarget, $pick1)[2]), libname=/mapleta/web/SanForCs002/Public_Html/temperatures1/temperatures1.lib");
$pick2=switch(1, $picks);
$diffTarget2=maple("randomize($seed+2); RandomTools[Generate](float(range=1.0..3.5, digits=2))");
$count2=maple("temperatures1:-countProc($diffTarget2, $pick2, $TV), libname=/mapleta/web/SanForCs002/Public_Html/temperatures1/temperatures1.lib");
$avgCount2=maple("temperatures1:-avgProc($diffTarget2, $pick2, $TV), libname=/mapleta/web/SanForCs002/Public_Html/temperatures1/temperatures1.lib");
$pickmsg2=maple("printf(temperatures1:-pickI($diffTarget2, $pick2)[2]), libname=/mapleta/web/SanForCs002/Public_Html/temperatures1/temperatures1.lib");


B.3  Maple library module for temperature problem

temperatures1 := module()
export genProblem, pickI, countProc, avgProc;

genProblem := proc(v, seed, M, A, P, N, diffTarget)
    local R, s, tempF, T, temps, LValues, TValues, vs, tv;
    randomize(seed);
    R := Statistics[RandomVariable](Normal(0, 5));
    vs := Statistics[Sample](R, N);
    s := convert(vs, list):
    tempF := (t) -> A*sin(t/P*2*Pi) + M;
    T := ([seq(evalf(tempF(i)), i = 1 .. N)]):
    print("length of T=", nops(T), "length of s=", nops(s));
    temps := s + T:
    LValues := sort(combinat[randperm](N)[1 .. v]):
    tv := x -> evalf(x, 4);
    TValues := map(tv, [seq(temps[LValues[i]], i = 1 .. v)]);
    return TValues;
end; #genProblem

#Pick the question given the index.
pickI := proc(diffTarget, i)
    local possibilities;
    possibilities := [
        ['deltaT >= diffTarget', sprintf("the temperature is at least %.3f degrees higher than the previous day", diffTarget)],
        ['-deltaT >= diffTarget', sprintf("the temperature is at least %.3f degrees lower than the previous day", diffTarget)],
        ['abs(deltaT) >= diffTarget', sprintf("the difference between the day and the previous day's temperatures is at least %.3f degrees in magnitude", diffTarget)],
        ['abs(deltaT) <= ...
...
        ['deltaT >= diffTarget', sprintf("the temperature is at least %.3f degrees higher than the previous day", diffTarget)],
        ['-deltaT >= diffTarget', sprintf("the temperature is at least %.3f degrees lower than the previous day", diffTarget)],
        ['abs(deltaT) >= diffTarget', sprintf("the difference between the day and the previous day's temperatures is at least %.3f degrees in magnitude", diffTarget)],
        ['abs(deltaT) <= ...
...
        ['deltaT >= diffTarget', sprintf("the temperature is at least %.3f degrees higher than the previous day", diffTarget)],
        ['-deltaT >= diffTarget', sprintf("the temperature is at least %.3f degrees lower than the previous day", diffTarget)],
        ['abs(deltaT) >= diffTarget', sprintf("the difference between the day and the previous day's temperatures is at least %.3f degrees in magnitude", diffTarget)],
        ['abs(deltaT) <= ...
...
        tableHeader := "<table>"; tableEnd := "</table>";
        rowHeader := "<tr>"; rowEnd := "</tr>";
        itemHeader := "<td>"; itemEnd := "</td>";
        message := cat(tableHeader, rowHeader,
            itemHeader, sprintf("time in %as", pickTime), itemEnd,
            itemHeader, sprintf("%a", picksUnit), itemEnd, rowEnd,
            seq('rowHeader, itemHeader, convert(xlist[i], string), itemEnd,
                 itemHeader, convert(ylist[i], string), itemEnd, rowEnd',
                i = 1 .. nops(xlist)),
            tableEnd);
        return message;
    end proc; #end makeTable
    return makeTable(xlist, ylist);

end proc;

answerGen := proc(seed)
    $include "Spline3Include.mpl"
    curve := CurveFitting[Spline](xlist, ylist, v, degree = k);
    #Pick a random evaluation point
    pickAPoint := RandomTools[Generate](float(range = z1/2 .. z1 - 0.1e-1, digits = 3));
    #Calculate the area
    valueAtPoint := eval(curve, v = pickAPoint);
    area := int(curve, v = 0 .. z1);
    areaInUnits := convert(area, units, picksUnit*pickTime, pickdUnit);
    return pickAPoint, valueAtPoint, area, areaInUnits;
end proc;

end module;

D.3.2  Spline3Include code

local n, numIntervals, xlist, ylist, curve, pickAPoint, z1, h, k, valueAtPoint,
    framework, questionType, pickTime, picksUnit, pickdUnit, area, question,
    areaInUnits, question2, degreeText, splineDesc, makeTable;

framework[1,1] := "You are measuring current flow, measured in %as for a period ...
You wish to estimate the total charge carried by the flow over the entire period, expressed ...
C should be an expression in the variable t (for time). Integrate C with respect to t ...
To get the curve, use the CurveFitting[Spline] operation of Maple to fit a %s spline ...";
framework[1,2] := "What is the total charge, measured in %as, for the period t=0 to ...";
framework[1,3] := [ampere, biot, abampere];

framework[1,4] := [second, minute];
framework[1,5] := [coulomb, ampere*hour, biot*hour];
framework[2,1] := "You are measuring power generated by a device, measured in %...
You wish to estimate the total energy generated over the entire period, expressed in units ...
C should be an expression in the variable t (for time). Integrate C with respect to t ...
To get the curve, use the CurveFitting[Spline] operation of Maple, you curve fit a ...";

You a r e measuring t h e amount o f f l o w o f a l i q u i d c h e m i c a l , m You wish t o e s t i m a t e t h e t o t a l mass c a r r i e d by t h e f l o w o v e r t h e e n t i r e p e r i o d , e x p r e s s e d i C s h o u l d be an e x p r e s s i o n i n t h e v a r i a b l e t ( f o r time ) . I n t e g r a t e C with r e s p e c t t To g e t t h e curve , u s e t h e C u r v e F i t t i n g [ S p l i n e ] o p e r a t i o n o f Maple , you c u r v e f i t a framework [ 3 , 2 ] := "What i s t h e t o t a l mass , measured i n %as , f o r t h e p e r i o d t=0 t o t framework [ 3 , 3 ] := [ t o n s / hour , k i l o g r a m s / hour ] ; framework [ 3 , 4 ] := [ second , minute ] ; framework [ 3 , 5 ] := [ pound , gram , ounce ] ; framework [ 4 , 1 ] := "You have a \" b l a c k box \" o f an e l e c t r i c v e h i c l e which i s g i v i n g You wish t o e s t i m a t e t h e t o t a l d i s t a n c e t r a v e l l e d i n t h e p e r i o d , a s measured i n %a . To do t framework [ 4 , 2 ] := "What i s t h e t o t a l d i s t a n c e , measured i n %a , f o r t h e p e r i o d t=0 t framework [ 4 , 3 ] := [ m i l e s / hour , k i l o m e t e r s / hour ] ; framework [ 4 , 4 ] := [ second , minute ] ; framework [ 4 , 5 ] := [ i n c h e s , f e e t , m e t e r s ] ; d e g r e e T e x t := [ " l i n e a r " , " q u a d r a t i c " , " c u b i c " ] ; randomize ( s e e d ) ; q u e s t i o n T y p e := rand ( 1 . . 4 ) ( ) ; #Pick a q u e s t i o n v a r i e t y n := rand ( 1 . . 8 ) ( ) ; #s e l e c t o r d e r o f B e s s e l f u n c t i o n k := rand ( 1 . . 3 ) ( ) ; #s e l e c t i o n o f s p l i n e v a r i e t y s p l i n e D e s c := d e g r e e T e x t [ k ] ; pickTime := framework [ questionType , 4 ] [ rand ( 1 . . nops ( framework [ questionType , 4 ] ) ) ( ) ] ; p i c k s U n i t := framework [ questionType , 3 ] [ rand ( 1 . . nops ( framework [ questionType , 3 ] ) ) ( ) ] ; p i c k d U n i t := framework [ questionType , 5 ] [ rand ( 1 . . nops ( framework [ questionType , 5 ] ) ) ( ) ] ; # Pick number o f data i n t e r v a l s n u m I n t e r v a l s := rand ( 5 . . 9 ) ( ) ; #Pick t h e end p o i n t ( t r u n c a t e d t o 2 d e c i m a l p l a c e s ) . z1 := e v a l f ( B e s s e l J Z e r o s ( n , 1 ) ) ; z1 := e v a l f ( t r u n c ( z1 ∗ 1 0 0 ) / 1 0 0 . 0 ) ; h := 1/ n u m I n t e r v a l s ∗ z1 ; x l i s t := [ s e q ( ’ RandomTools [ Generate ] ( f l o a t ( r a n g e = h∗ i . . h ∗ ( i +1) −0.1 e −1, d i g i t s = y l i s t := map ( ( x ) −>B e s s e l J ( n , x ) , x l i s t ) ; #Tack on an end p o i n t that ’ s 0 . x l i s t := [ op ( x l i s t ) , z1 ] ; y l i s t := [ op ( y l i s t ) , 0 . 0 ] ;

E  Bash script for library and zip file creation and testing

The script makelib2.sh, listed in Figure 24, is a bash script that takes as its single argument the name of a module, M. The programming follows the convention that the source code for the module is in M.mpl, and that there are Maple commands for a test of the module in the file MTest.mpl. It also compresses a copy of the library code, the source (including include files of the form XXInclude.mpl), the test script, and any gif files of the form MXX.gif (such as are generated in Figure 18) in the directory into a zip file M.zip. The script, when executed, feeds M.mpl through the Maple processor, along with script-generated prefix and suffix code that will save M's definition in the library files M.lib and M.ind. Currently, the results of


Figure 24: makelib2.sh

#Remove all existing .lib and .ind files in this directory.
rm -f *.lib *.ind
#Run script to create .lib and .ind files. Assumes module name is the
#same as the name of the .mpl file with the source.
maple -q
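The generated prefix and suffix lines themselves are not reproduced here; the following is a rough sketch of what they could look like, using Maple's march and savelib (file names illustrative, not taken from the actual script).

# Prefix: create an empty .lib/.ind archive pair and point savelib at it.
march('create', "M.lib", 100):   # archive with room for 100 entries
savelibname := "M.lib":
# Body: define the module.
read "M.mpl":
# Suffix: store the module's definition into M.lib / M.ind.
savelib('M'):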