A web-based testing system with dynamic question generation

Jeff McGough, Jeff Mortensen, Jerry Johnson, Sami Fadali

Abstract—The use of the web for a variety of educational needs has exploded recently, providing one approach to the increasing use of electronic media. HTML, the lingua franca of the Internet, provides great platform independence. However, for the mathematically intensive applications seen in engineering, HTTP and HTML cannot compete with non-web computer-based programs, at least not without server-side or client-side enhancement. In the following, we describe one solution to the problem of presenting students with dynamically generated, browser-based exams with significant engineering mathematics content. This proof of concept is called WTML, the Web Testing Markup Language. WTML is an extension of HTML. The web server is extended to process the WTML and support the features necessary for mathematics exams. The result is then supplied to the client (the student's browser). The student completes an exam and returns it for grading and database recording. The system is used in a calculus class and then assessed.

Index Terms—Computer testing, Web, Distance learning.

Introduction

Automated student testing is not a new idea; it is a very old one. We are all familiar with machine-scanned answer keys for variations on multiple choice tests. Testing of this form has received a lot of criticism, but it remains prominent due to the sheer numbers of students at many institutions.

Our goal was to address some of the problems and concerns with existing multiple choice paper exams. For some institutions, managing the mix of paper exams with computerized databases is overwhelming. A second concern is having enough versions of an exam to prevent exam memorization. Further concerns involve being locked into a rigid test form like multiple choice instead of a free response system. Ideally, one would like a system that takes in the student's work and assigns a grade based on the entire solution process, not just a number in a box; in short, a system that could give partial credit on free response questions. Clearly we are not at this last stage, so no faculty positions are at risk yet. However, the system we describe below does address the concerns above.

This work was initiated at the University of Nevada, Reno and now involves a collaboration with the South Dakota School of Mines and Technology. The work is currently supported by a grant from the National Science Foundation, DUE 9980687, "A New On-Line Mathematics Testing, Remediation and Assessment Strategy for Engineering Majors."

Jeff McGough is with the South Dakota School of Mines and Technology in the Math and Computer Science Department; e-mail: [email protected]. Jeff Mortensen and Jerry Johnson are with the University of Nevada - Reno in the Mathematics Department. Sami Fadali is with the University of Nevada - Reno in the Electrical Engineering Department.

Webtest system

The webtest system is a new testing system that allows dynamic generation of exams. These exams are assumed to have significant mathematical content, and support for this content is provided. The questions delivered by the exam system may contain content generated on the fly by the web server, so custom or unique exams may be created. This is the novel part of the system, as there are now many computerized testing environments available (WebCT, Authorware, etc.); however, most of these are tuned for delivery and testing of material without a great deal of mathematical content. None of the existing tools address the main new features of the system we describe: on-the-fly generation of unique questions, randomization of question components, high-level support for advanced mathematics typesetting, and equation parsing of student responses.

Browsers, web servers and programming

When developing this system, we needed to answer some basic questions. First, why make the system web based? There are several reasons. There is not a single computer vendor, operating system and configuration: faculty in many engineering and mathematics departments use Unix workstations, the labs down the hall hold rows of Windows machines, and the students may have Macs. A flexible system will not be locked into a single platform. The system must have a client-server design, with a client that includes some type of graphical user interface. This left two options, a stand-alone Java application or a web browser application. The latter was chosen since the browser already provides a graphical user interface and the client-server aspect is already established.

The decision was made to use a web server programming (scripting) language and deliver exams to a browser. No special software installs are then required, and fairly standard computers will be ready to use the system. This idea was taken one step further in that the server side was developed using freely available software to ensure the best possible portability. The web server used in the tests is Apache (http://www.apache.org), which is now fairly standard on Linux systems.


The web server scripting language PHP (http://www.php.net) was chosen to provide the dynamic generation of test pages [2]. PHP also provides the link to the database, MySQL (http://www.mysql.com), which keeps all the records. PHP allows for inline code with a C-style syntax and is very easy to use. It has a devoted following and is under rapid development. Most of the functions required to build the web testing system are intrinsic PHP functions; only the mathematical expression evaluation had to be added. We plan to integrate this latter functionality into PHP and make it available for a future release. Overall we kept the system based on open source software or freeware. Since Linux is now a common operating system, our two test server platforms were Linux and Solaris; the system performed well on both. As with any client-server application, if large numbers of users access the system there may be delays in the response. In our experience, this was not a problem.

One of the most important aspects of a testing system is the actual test questions, the exams. No matter how well formatted or quickly delivered the exams are, if they cannot assess the student, the system is worthless. To be useful, it was clear that a large number of questions were required; this would make the system more attractive to instructors. There is no way to include all possible questions, and occasionally instructors will have to author questions. However, if enough question types are provided, instructors can simply modify them. To make question authoring as simple as possible, most of the code required for dynamic exam generation was placed in libraries, and a macro language was created to invoke the functions. This macro language includes extra tags which supplement the HTML tags, and is called WTML, for "Web Test Markup Language." Question authors need not understand how to program in PHP or know anything about the web server: one learns the macro language and then modifies existing questions. Once the questions have been created, it is a simple matter to build an exam.
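Under the hood, "dynamic question generation" is ordinary server-side scripting. The following is a minimal PHP sketch of the kind of work the WTML preprocessor performs for a question with random coefficients; the variable and field names here are ours, not the system's:

    <?php
    // Draw random coefficients and substitute them into the
    // question text; the expected answer is computed server side.
    $I = rand(5, 15);      // current in amperes
    $R = rand(100, 1000);  // resistance in ohms
    $answer = $I * $R;     // V = I*R, stored for the grader

    echo "If a current of $I amperes passes through a resistance ",
         "of $R ohms, what is the voltage across the resistance?\n";
    echo '<input type="text" name="answer1">', "\n";
    ?>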


Webtest system overview

The typical interaction with this system is much the same as with any dynamic web site. The student connects to the system, most often from a class web page link, selects the exams link, and is directed to the login page. The logins and passwords are controlled by the webtest database, which is kept in MySQL. The login page also presents the student with a list of test options. Following a successful login, the student is asked whether she is ready to take an exam or needs help with the system. Selecting "start exam" instructs the server to build the exam. Although no restrictions on the exam form are enforced other than that it consist of material that can be placed in HTML, we will assume that some of the test is not static material. The test software will run the test generation code and read an exam configuration file, which describes which questions are to be placed on the exam. Each question is a template for creating a specific problem; the question can have elements, like function coefficients, that are generated by a random number generator (restricted to some range or type). The student then receives a unique exam (if the instructor so wishes). This exam is sent to the client, the student's browser. The test is standard HTML (with images and optional applets), so there is no need for browser modifications or even plugins. The test is an HTML form: answers are typed in, and at completion the student submits the exam back to the testing system. The testing system grades the exam, records the score, and gives the student feedback in the form of a graded exam with correct solutions. Cookies are used to keep track of the database keys. Students may look at the exam source without gaining anything, since answers are not passed as hidden variables; the database keys are large random strings and provide no useful information.

Unique questions

As mentioned above, each question in the database is described by a template. The template is written in WTML and could be a static HTML question like: "All cats are mammals? True or False." However, students have a tendency to discuss exams and memorize items that may appear on an exam. Thus the system allows any number of random numbers to be generated. It also allows shuffling of lists, so that multiple choice questions can have the answers appear in random order. Formulas built on random integers and floats can create a vast array of problems, rendering memorization ineffective.

Typesetting

HTML and current browsers do not support mathematical typesetting beyond superscripts, subscripts, and changes in font size or type. To get complicated equations set, HTML authors would have to set the equations in some other system, convert them to GIF/JPEG and include them as graphics. This system allows any number of images to be included, but that does not solve the problem when there is dynamic question generation. The solution in the short term is to use an applet that typesets LaTeX. Thus we can generate an equation dynamically and present it to the client browser without conversion to an image. The typesetting is not integrated, but is treated as a module. This typesetting applet will be replaced by MathML (http://www.w3.org/Math/), the new mathematical markup language for the web, once that standard is supported by the popular web browsers.
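For example, the server can emit the dynamically built LaTeX source straight into the page for the applet to render. The sketch below illustrates the idea only: the applet class and parameter name are invented for illustration, and only the notion of shipping a LaTeX string to a rendering applet comes from the system:

    <?php
    // Build a LaTeX fragment with random coefficients and hand it
    // to a typesetting applet. "LatexApplet" and "formula" are
    // made-up names, not the system's actual interface.
    $a = rand(2, 9);
    $n = rand(2, 5);
    $latex = '\\frac{' . $a . '}{x^{' . $n . '}}';
    echo '<applet code="LatexApplet.class" width="220" height="60">', "\n";
    echo '  <param name="formula" value="', htmlspecialchars($latex), '">', "\n";
    echo "</applet>\n";
    ?>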


Equation parsing

The webtesting system includes the ability to parse equations entered by the student. Using this, we can compare the free-form response of a student with the response in the answer section and grade a symbolic answer. This is also an essential component if an exam is to be completed without a calculator, for then the student may return an unevaluated symbolic answer, which is first parsed, evaluated and then graded correctly. The parser is currently an external routine (C++) which will compile under GNU C++ and the Sun C++ compiler. It can handle multivariate strings, each with the possibility of multiple evaluation points. To determine whether a student response is correct, the student-supplied equation is subtracted from the correct string and the difference is handed to the parser. The parser then evaluates the difference at an author-specified range of points. If the difference evaluates to zero (or to within a specified tolerance ε of zero) at every point, we say the strings are the same and return a correct response. The system does not attempt to show analytically that the student answer is the same as the one supplied by the question author. This is indeed a source of possible error, but one with low probability; in practice the approach works well.
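The subtract-and-sample comparison is easy to prototype. The sketch below is in PHP for illustration only (the production parser is the external C++ routine described above); it fakes expression evaluation with eval() over a whitelisted character set and then applies the sampling test:

    <?php
    // Evaluate a tiny expression grammar at the point x.
    // Illustrative stand-in for the system's C++ parser.
    function safe_eval($expr, $x) {
        if (!preg_match('/^[0-9x+\-*\/(). ]+$/', $expr)) {
            return null;  // reject anything outside the whitelist
        }
        $code = str_replace('x', '(' . $x . ')', $expr);
        return eval("return $code;");
    }

    // Subtract the student string from the reference string and
    // evaluate the difference at several sample points; if every
    // value is within $tol of zero, call the answers equal.
    function answers_match($student, $correct, $tol = 1e-6) {
        $diff = '(' . $student . ') - (' . $correct . ')';
        foreach (array(-1.3, 0.5, 1.7, 2.9) as $x) {
            $d = safe_eval($diff, $x);
            if ($d === null || abs($d) > $tol) {
                return false;
            }
        }
        return true;
    }

    var_dump(answers_match('2*x + 3', '3 + x + x'));  // bool(true)
    var_dump(answers_match('2*x + 3', '2*x - 3'));    // bool(false)
    ?>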

Example Questions

Sample questions are shown in figure 1. The first question invokes the random number generation for the numbers that appear. The second question is a sample of the symbolic response mode.

Webtest system details

WTML is a mix of HTML and custom tags forming an HTML macro language with a C-like syntax. A question is made up of three or four sections and will look like:

    # Variables and commands for the preprocessor
    $MODE = mode type - string;
    $NVAR = number of variables - integer;
    $VAR[1] = variable 1;
    $VAR[2] = variable 2;
    ...
    $VAR[n] = variable n;
    $NANS = number of answers - integer;
    $ANS[1] = formula for answer 1;
    $ANS[2] = formula for answer 2;
    ...
    $ANS[n] = formula for answer n;

    # Body of the question; this is the template.
    Question text and variables go here. Variable 1 is accessed
    by @VAR[1], variable 2 by @VAR[2], etc.

    # The reply section (HTML form inputs)
    HTML form:

    # What to display for the answer: this is not displayed on
    # the exam; it is for the answer pages.

Example: The WTML for one of the questions in figure 1 appears below.

    $MODE = "APPROX_ABS";
    $NVAR = 2;
    $VAR[1] = rand_int(5,15);     // I
    $VAR[2] = rand_int(100,1000); // R
    $NANS = 1;
    $ANS[1] = ($VAR[1])*($VAR[2]);

    For a voltage V volts across a resistance, the current through it
    is given by i = V/R, where R is the resistance in ohms and i is
    the current in amperes. If a current of @VAR[1] amperes passes
    through a resistance of @VAR[2] ohms, what is the voltage across
    the resistance?



The reply section of this question prompts "V =" followed by an HTML input field for the student's numerical answer.

Section tags above:
- The first tag delimits the preprocessor section of the question, which directs the dynamic variable construction.
- The second delimits the body of the question; this is where the specific content goes.
- The third delimits the HTML form input section; this is the student response section.
- The fourth delimits the worked solution section, which is optional. For interactive online learning systems, this will be an important component.

Webtest section variables:
- $MODE = mode type - string; The mode describes the type of question: multiple choice, numerical entry (exact or approximate), etc. The options (a grading dispatch sketch follows this list) are:
  - APPROX_ABS takes a numerical response and computes the absolute error; if it falls within a specified tolerance ε, the answer is correct.
  - APPROX_REL takes a numerical response and computes the relative error; if it falls within a specified tolerance ε, the answer is correct.
  - MULTIPLECHOICE for multiple choice answers.
  - EQUATION takes a string and evaluates it as a mathematical expression.
- $NVAR = number of variables - integer;
- $VAR[n] = n-th variable;
- $NANS = number of answers - integer;
- $ANS[n] = n-th answer; may be a formula;
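Grading then reduces to a dispatch on $MODE. A minimal sketch of such a dispatcher follows; the function and variable names are ours, not the system's, and answers_match is the sampling comparison sketched under Equation parsing:

    <?php
    // Hypothetical grading dispatcher keyed on the question mode.
    function grade($mode, $response, $answer, $tol) {
        switch ($mode) {
            case 'APPROX_ABS':      // absolute error within tolerance
                return abs($response - $answer) <= $tol;
            case 'APPROX_REL':      // relative error within tolerance
                return abs($response - $answer) <= $tol * abs($answer);
            case 'MULTIPLECHOICE':  // compare the selected choice key
                return $response === $answer;
            case 'EQUATION':        // symbolic comparison by sampling
                return answers_match($response, $answer, $tol);
            default:
                return false;       // unknown mode: fail safe
        }
    }
    ?>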

Student database MySQL is used for student records [3]. It keeps track of the specific exams, grades, etc. The testing system provides a web interface for looking at the records, but any MySQL GUI would work. With the installation of an ODBC driver on a client PC, the data could also be accessed from within Excel on MS Windows machines. An additional interface to the database can be set to allow students to view their own scores and progress. It uses the same authentication system as the test delivery part. This alleviates a main concern of students: whether their test was received and scores recorded.
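For concreteness, a record lookup in the PHP/MySQL API of the period might look like the following sketch; the table and column names are hypothetical, not the system's actual schema:

    <?php
    // Connect to the (hypothetical) webtest database and list the
    // scores recorded for one student.
    $db = mysql_connect('localhost', 'webtest', 'secret');
    mysql_select_db('webtest', $db);

    $id = mysql_real_escape_string($_GET['student_id']);
    $result = mysql_query(
        "SELECT exam_name, score, taken_on
           FROM scores WHERE student_id = '$id'", $db);

    while ($row = mysql_fetch_assoc($result)) {
        echo "{$row['exam_name']}: {$row['score']} ({$row['taken_on']})\n";
    }
    mysql_close($db);
    ?>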

Server code

The server code for the test program is nearly all PHP (with a few external routines in C). The basic components (pages) in the test system are shown in figure 2. After the user selects a test and is authenticated, the HTML for the exam is constructed by the test building page, unless the test is of the proctored variety, in which case the test must first be enabled by a proctor, usually after checking ID. Sending the HTML to the client browser, and resending it upon reload requests, is the responsibility of the test display page. When the student presses the SUBMIT button, the test grading page and the graded test display page are called in turn.
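The cookie mechanism mentioned earlier is equally plain in PHP. A sketch of tagging an exam instance with a large random key (the cookie name is hypothetical):

    <?php
    // Generate a large random string and hand it to the browser as
    // a cookie; the key carries no answer data, it only indexes the
    // exam instance in the database.
    $key = md5(uniqid(rand(), true));
    setcookie('exam_key', $key);  // must be sent before any output
    ?>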

[Fig. 2. Test Components. The flowchart shows the login and test selection page leading, via the proctor page for proctored exams, to the test building page, then to the test display page (which services browser reloads), the test grading page, and the graded test display page; the database backs both login and grading.]

Environment and Tools

Question Authoring

Since WTML is extended HTML, a question may be written in any HTML composition program. The HTML source may then be pasted into an editor where the WTML tags are inserted. HTML is very easy to use, and the question author can keep an HTML reference on hand [1]. The system currently provides a web interface for filesystem navigation, question editing, question preview and exam construction. The graphical user interface for question authoring could be greatly extended to facilitate question development; the system does not currently support a full GUI, like FrontPage, for question construction.


Question database

The questions are not stored in MySQL. Exam questions are currently stored as text files, grouped in Unix directories according to category. Test question authors can generate questions in a variety of ways and then copy them to the correct directory. The system has an extensive collection of calculus questions, typical mid-section homework problems of a computational nature.
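Because questions are plain files, the exam construction tools can enumerate a category with ordinary directory calls. A sketch in PHP, where the path and file extension are illustrative assumptions:

    <?php
    // Collect the question files in one category directory.
    $dir = '/usr/local/webtest/questions/calculus3';
    $questions = array();
    $handle = opendir($dir);
    while (($file = readdir($handle)) !== false) {
        if (substr($file, -5) === '.wtml') {  // assumed extension
            $questions[] = "$dir/$file";
        }
    }
    closedir($handle);
    sort($questions);
    print_r($questions);
    ?>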



Classroom results

The webtest was first used in the Fall semester of 2000 at both the University of Nevada-Reno (UNR) and the South Dakota School of Mines and Technology (SDSMT). At SDSMT one of the authors used the system for the homework/quiz component of a Calculus III course. The course required seven online quizzes, roughly one every two weeks. The questions were similar to typical Calculus III textbook questions, but the coefficients in the problems were generated by calls to random number functions. In this course the quizzes were treated as homework in that they were not proctored or restricted to a specific lab; students were able to complete the quizzes from on or off campus. The initial problems were typical of getting any group onto a computer system. Most were small, like logins and passwords, which were assigned in class but forgotten by 20% of the students by the time of the first quiz. Feedback from student surveys indicated the following:

- Many students are very nervous about computers or new technology and will hesitate to try the system. If possible, we suggest that new computer testing or online systems be demonstrated in class.
- We found that nearly all problems were not with the exam system but with a lack of understanding of the computer or the browser. It is also important to have a very clear instruction page as well as some form of demo.
- There is a tendency to put off work. This is not new or unique to engineering and math. However, getting the students onto the system by the end of the second week of classes seems to make a difference in how they approach using it.
- No matter how closely the problems match those in the text or the lecture, students will believe that the problems are different. Even when the exact problems are lifted from the text (examples of homework), it will be reported that these problems are not related to the lecture or the text. These complaints arose with paper quizzes as well; however, an exam on a computer screen changes the environment and feels less familiar. Starting with a quiz of a very elementary nature, like an algebra review for a calculus class, will smooth the transition.
- Many students preferred paper quizzes because they were accustomed to partial credit. This can be partially addressed by having more problems with a finer granularity. One benefit is that the computer expects a correct answer, not a nearly correct answer, forcing students to complete entire problems correctly. Students did not like the zero-partial-credit approach that the test forced, but it is not necessarily a bad thing. Indeed, one strategy is to allow limited retakes in place of partial credit.
- Students were in uniform agreement about the speed of the system: they thought it was fast and efficient in delivering and grading the quizzes.

Online learning systems

Webtools

The software for the question database can be used in other applications very easily. It has been used to build instructional code for teaching numerical algorithms; this project is currently in progress. Currently we cover the bisection method, the secant method, Newton's method, fixed point iteration, Gauss elimination, Gauss elimination on the normal equations, Jacobi's method, the Gauss-Seidel method, the SOR method, a numerical Jacobian and a nonlinear systems solver (a sketch of one such routine appears at the end of this section). These codes were used as lecture supplements for an engineering numerical methods course at SDSMT in the Fall semester of 2000. The main advantage of this type of approach is that one does not present static, presolved problems in lecture. The system can walk the class or an individual through a process using text and graphics with user-defined inputs (functions or data).

Weblearn

To make the next step from using the system for assessment to using it for tutoring or teaching, several components need to be added. The basic content of the subject must be entered; this is a rather large undertaking. Following this, some sort of expert system (in a rather minimal sense here) must be employed to direct the student through the content. The technology is currently available. Publishers now have online texts. Combining these with the assessment tools provided here and a layer of software to direct the student, a complete online learning system could be constructed.
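To give the flavor of the instructional routines listed under Webtools, here is a minimal sketch of the bisection method in modern PHP syntax; it is our illustration, not the system's code:

    <?php
    // Bisection: repeatedly halve an interval [a, b] on which f
    // changes sign until the half-width falls below the tolerance.
    function bisect($f, $a, $b, $tol = 1e-8, $maxit = 100) {
        $fa = $f($a);
        $m = $a;
        for ($i = 0; $i < $maxit; $i++) {
            $m  = 0.5 * ($a + $b);
            $fm = $f($m);
            if ($fm == 0.0 || 0.5 * ($b - $a) < $tol) {
                return $m;          // converged
            }
            if ($fa * $fm < 0) {
                $b = $m;            // root lies in [a, m]
            } else {
                $a = $m;            // root lies in [m, b]
                $fa = $fm;
            }
        }
        return $m;                  // best estimate after maxit steps
    }

    // Root of f(x) = x^2 - 2 on [1, 2]; prints roughly 1.41421356.
    echo bisect(function ($x) { return $x * $x - 2; }, 1.0, 2.0), "\n";
    ?>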

Conclusion

Several questions frame an evaluation of our experience. Can a web-based exam system be built (proof of concept)? Does it require a great deal of resources? How effective is it? Does it lead to collateral tools? As a proof of concept, the project was a complete success: the authors were able to construct a web-based testing system which supports the needs of mathematics-intensive courses, and they completed the required coding in roughly a calendar year. The latter two questions are really the most important ones and will be addressed in later papers that study the use of the tools. Preliminary student surveys indicate that students saw the system as a way to provide additional homework or review material, and they thought it had real promise in the area of online tutoring. As for the collateral tool question, we have already employed the server codes in other contexts, so the answer appears to be yes.

Acknowledgment

The authors would like to acknowledge the coding assistance of Ajaykumar Poondla and Mallikarjun Reddy Adamala.


References

[1] E. Castro, HTML for the World Wide Web, Peachpit Press, Berkeley, CA.
[2] L. Atkinson, Core PHP Programming: Using PHP to Build Dynamic Web Sites, Prentice Hall PTR/Sun Microsystems Press.
[3] R. Yarger, G. Reese and T. King, MySQL & mSQL, O'Reilly & Associates, Sebastopol, CA.

