FRONTIERS IN EDUCATION 2001


A Web-Based Testing System with Dynamic Question Generation

Jeff McGough, Jeff Mortensen, Jerry Johnson, Sami Fadali

Abstract: The use of the web for a variety of educational needs has exploded recently. It provides one approach in the increasing use of electronic media. HTML, the "lingua franca" of the internet, provides great platform independence. However, for mathematically intensive applications as seen in engineering, HTTP and HTML cannot compete with non-web computer-based programs, at least not without server-side or client-side enhancement. In the following, we describe one solution to the problem of presenting students with dynamically generated browser-based exams with significant engineering mathematics content. This proof of concept is called WTML, Web Testing Markup Language. WTML is a dynamic extension of HTML. The web server is extended to process the WTML and support features necessary for mathematics exams. The result is then supplied to the client (the student's browser). The student completes an exam and returns it for grading and database recording. The system was used in a calculus class and then assessed.

Keywords: Computer testing, Web, Distance learning.

I. Introduction

Automated student testing is not a new idea; it is a very old one. We are all familiar with machine-scanned answer keys for variations on multiple choice tests. Testing of this form has received a lot of criticism, but remains prominent due to the sheer numbers of students at many institutions. Our goal was to address some of the problems or concerns with existing multiple choice paper exams. For some institutions, managing the mix of paper exams with computerized databases is overwhelming. A second concern is having enough versions of the exam to prevent exam memorization. Further concerns involve being locked into a rigid test form like multiple choice instead of a free response system. Ideally, one would like a system that takes in the student's work and assigns a grade based on the entire solution process and not just a number in a box. In short, the system could give partial credit to free response questions. Clearly we are not at this last stage, so no faculty positions are at risk yet. However, the system we describe below does address the concerns described above. This work was initiated at the University of Nevada, Reno and now involves a collaboration with the South Dakota School of Mines and Technology. The work is currently supported by a grant from the National Science Foundation, DUE 9980687, A New On-Line Mathematics Testing, Remediation and Assessment Strategy for Engineering Majors.

II. Webtest system

The webtest system is a new testing system that allows for dynamic generation of exams. These exams are assumed to have significant mathematical content, and support for this content is provided. The questions delivered in the exam system may contain content which is generated on-the-fly by the web server. This means that custom or unique exams may be created. This is the novel part of the system we describe, as there are now many computerized testing environments available (WebCT, Authorware, etc.). However, most of these are tuned for delivery and testing of material without a great deal of mathematical content. None of the existing tools addressed the main new features of the system we describe: on-the-fly generation of unique questions, randomization of question components, high-level support for advanced mathematics typesetting, and equation parsing for student responses.

A. Browsers, web servers and programming

When developing this system, we needed to answer some basic questions. First, why make this system web based? There are several reasons. There is not a single computer vendor, operating system and configuration. Faculty in many Engineering and Mathematics departments are using Unix workstations, while probably down the hall in labs you find rows of Windows machines, and the students may have Macs. A flexible system will not be locked into a single platform. The system must have a client-server type of design and have the client include some type of graphical user interface. This left two options: a stand-alone Java application or a web browser application. The latter was chosen since the browser already provides a graphical user interface and the client-server aspect is already established. This is not a comment about not using Java, more about the authors' skills. The decision was made to use a web server programming language and deliver exams to a browser. No special software installs are then required, and it ensures that fairly standard computer systems will be ready to use the system. This idea was followed one step further in that the server side was developed using freely available software to ensure the best possible portability. The web server used in the tests is Apache with the PHP module compiled in. This is fairly standard on Linux systems now. PHP is set to connect to MySQL (the database), which is also shipped with Linux systems and is widely available.

Jeff McGough is with the South Dakota School of Mines and Technology in the Math and Computer Science Department. E-mail: [email protected]. Jeff Mortensen and Jerry Johnson are with the University of Nevada - Reno in the Mathematics Department. Sami Fadali is with the University of Nevada - Reno in the Electrical Engineering Department.


PHP [2] is a web server scripting language with functionality similar to CGI/Perl. It allows for inline code with a C-style syntax and is very easy to use. PHP has a devoted following and is under rapid development. Most of the functions that are required to build the web testing system are intrinsic functions in PHP. Only the mathematical expression evaluation had to be added. We plan to integrate this latter functionality into PHP and make it available for a future release of PHP. Overall we kept the system based on open source software or freeware. Since Linux is now a common operating system, our two test server platforms were Linux and Solaris. The system performed well on both. As with any client-server application, if large numbers of users are accessing the system, smaller PCs may have delays in the response. In our experience, this was not a problem.

One of the most important aspects of a testing system is the actual test questions - the exams. No matter how well formatted or quickly delivered the exams are, if they cannot assess the student, then the system is worthless. To be useful, it was clear that a large number of questions were required. This would make the system more attractive to instructors. However, there is no way to include all possible questions. Instructors would have to author questions, which, if the question database was large enough, would usually mean modifying an existing question. To make question authoring as simple as possible, most of the coding that was required for dynamic exam generation was placed in libraries, and a macro language was created to invoke the functions. This macro language ended up being an extension of HTML and was called WTML, which stands for Web Test Markup Language. It is a simple extension of HTML and includes some extra tags to support the dynamic exam generation features. Test authors need not understand how to program in PHP or know anything about the web server.
One learns the macro language and then modifies existing questions. Once the questions have been created, it is a simple matter to build an exam.
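To make the division of labor concrete, the following sketch illustrates the idea of question templates plus an exam assembled from a configuration list. This is illustrative Python only; the real system stores WTML text files and executes PHP on the server, and the question names and helper functions below are invented for the example.

```python
import random

# Hypothetical question bank: in the real system these are WTML text files
# grouped in Unix directories by category.
QUESTION_BANK = {
    "calc3/ohms_law_1": "A current of @VAR[1] amperes flows through a "
                        "resistance of @VAR[2] ohms. Find the voltage V.",
    "calc3/power_rule_1": "Differentiate x^@VAR[1] with respect to x.",
}

def fill_template(template):
    # Replace each @VAR[n] placeholder with a freshly drawn random integer,
    # so every generated exam is (very likely) unique.
    out, n = template, 1
    while f"@VAR[{n}]" in out:
        out = out.replace(f"@VAR[{n}]", str(random.randint(2, 9)))
        n += 1
    return out

def build_exam(config):
    # config plays the role of the exam configuration file: it names the
    # questions to place on the exam, not the questions themselves.
    return [fill_template(QUESTION_BANK[name]) for name in config]

for question in build_exam(["calc3/ohms_law_1", "calc3/power_rule_1"]):
    print(question)
```

The point of the split is that a question author only edits the template text; the randomization and assembly machinery lives in the shared libraries.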


B. Webtest system overview

The typical interaction for this system is much the same as for any dynamic web page site on the internet. The student will connect to the system, most often from a class web page link. They will select the exams link, and this will direct the student to the login page. The logins and passwords are controlled by the webtest database, which is handled by MySQL. The login page also presents the student with a list of test options. Following a successful login, the student is then asked if they are ready to take an exam or if they need help with the system.

Selecting "start exam" instructs the server to build the exam. Although no restrictions on the exam form are enforced other than that it be material that can be placed in html, we will assume that some of the test is not static material. The test software will run the test generation code and read an exam configuration file. This exam configuration file describes which questions are to be placed on the exam, not the questions themselves. Each individual question is a template for creating a specific problem. Thus the question can have elements like function coefficients that are generated by a random number generator (restricted to some range or type). The student then receives a unique exam (if the instructor so wishes). This exam is sent to the client, or student's browser. The test is standard html (with images and optional applets), so there is no need for browser modifications or even plugins. The test is an html form, so answers are typed in, and at completion the student will submit the exam back to the testing system. The testing system will grade the exam, record the score and give the student feedback as a graded exam with correct solutions. Cookies and hidden variables are used to keep track of the database keys. Students may look at the exam source without gaining anything, since answers are not passed as hidden variables. The database keys are large random strings and will not provide any useful information.

C. Unique questions

As mentioned above, each question in the database is described by a template. The template is written in WTML and could be a static HTML question like: All cats are mammals? True or False. However, students have a tendency to discuss exams and memorize items that may appear on an exam. Thus the system allows for any number of random numbers to be generated. It also allows for shuffling of lists, so that even multiple choice questions can have the answers appear in random order. Formulas built on random integers and floats can create a vast array of problems, rendering memorization ineffective.

D. Typesetting

HTML and current browsers do not support mathematical typesetting beyond (super)/(sub)scripting and changes in font size or type. To get complicated equations set, html authors would have to set the equations in some other system, convert them to GIF/JPEG and include them as graphics. This system will allow any number of images to be included, but this does not solve the problem when there is dynamic question generation. The solution in the short term is to use an applet that typesets LaTeX. Thus we can generate the equation dynamically and then present it to the client browser without the conversion to an image. The typesetting is not integrated, but is treated as a module. MathML, the Math Markup Language, is in beta release. The authors feel it will become the typesetting language for the web. Thus as soon as Netscape and Explorer support MathML, it will be added as a module.

E. Equation parsing

The webtesting system includes the ability to parse equations entered by the student. Using this, we can compare the free-form response by a student with the response in the answer section and grade a symbolic answer. This is also


an essential component if an exam is to be completed without a calculator, for then the student may return an answer of the form sin(.2674), which is first parsed, evaluated and then graded correctly. The parser in use is currently an external routine (C++) which will compile under GNU C and the Sun C++ compiler. It can handle multivariate strings, each with the possibility of multiple evaluation points. To determine if a student response is correct, the student-supplied expression is subtracted from the correct string and handed to the parser. The parser then evaluates the difference expression at an author-specified range of points. If the difference expression evaluates to zero (or within a specified error, like 10^-3, of zero) at each point, then we say the strings are the same, and the system returns a correct response. So the system does not attempt to analytically show the student answer is the same as the one supplied by the question author. This is indeed a source of possible error, but with low probability, and in practice it works well.

F. Example Questions
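As a brief aside before the examples, the subtract-and-sample check of the previous section can be sketched as follows. Python's eval stands in for the external C++ parser, and the sampling range and tolerance are placeholders for the author-specified values; nothing here is the actual server code.

```python
import math
import random

def expressions_agree(student, correct, var="x", samples=10, tol=1e-3):
    # Webtest-style equivalence test: evaluate the difference of the two
    # expressions at several sample points; accept if every difference is
    # within the tolerance.  No attempt is made at a symbolic proof.
    env = {"sin": math.sin, "cos": math.cos, "exp": math.exp, "pi": math.pi}
    for _ in range(samples):
        env[var] = random.uniform(-2.0, 2.0)   # author-specified range in the real system
        diff = eval(student, env) - eval(correct, env)  # stand-in for the C++ parser
        if abs(diff) > tol:
            return False
    return True

# Equivalent forms agree; genuinely different expressions are caught.
print(expressions_agree("2*sin(x)*cos(x)", "sin(2*x)"))  # True
print(expressions_agree("x**2", "x**2 + 1"))             # False
```

Note the failure mode discussed above: two different expressions could happen to agree at every sampled point, but the probability of that is low.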


Sample questions are shown in figure 1. The first question is one that invokes the random number generation for the numbers that appear. The second question is a sample of the symbolic response mode.

Fig. 1. Sample questions

III. Webtest system details

WTML is a mix of HTML and custom tags for an HTML macro language which has a C-like syntax. A question is made up of three or four sections and will look like:

# variables and commands for the preprocessor
$MODE = mode type - string;
$NVAR = number of variables - integer;
$VAR[1] = variable 1;
$VAR[2] = variable 2;
...
$VAR[n] = variable n;
$NANS = number of answers - integer;
$ANS[1] = formula for answer 1;
$ANS[2] = formula for answer 2;
...
$ANS[n] = formula for answer n;

# body of the question, this is the template
Question text and variables go here. Variable 1 is accessed by @VAR[1],
variable 2 would be accessed by @VAR[2], etc.

# The reply section (HTML form inputs)
HTML form:

# What to display for the answer: this is not displayed on the exam,
it is for the answer pages.

Example: The WTML for one of the questions in figure 1 appears below.

$MODE = "APPROX_ABS";
$NVAR = 2;
$VAR[1] = rand_int(5,15);      // I
$VAR[2] = rand_int(100,1000);  // R
$NANS = 1;
$ANS[1] = ($VAR[1])*($VAR[2]);

For a voltage V volts across a resistance, the current through it is given by i = V/R, where R is the resistance in ohms and i is the current in amperes. If a current of @VAR[1] amperes passes through a resistance of @VAR[2] ohms, what is the voltage across the resistance?



V=
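Putting the pieces of this example together: the preprocessor section draws the random values and computes the answer formula, and the body's @VAR[n] placeholders are then substituted before delivery. The Python below only illustrates that substitution step; rand_int mirrors the WTML call above, and nothing here is the actual server code.

```python
import random
import re

def rand_int(lo, hi):
    # Mirrors the WTML rand_int(lo, hi): a random integer in [lo, hi].
    return random.randint(lo, hi)

def substitute(body, variables):
    # Replace each @VAR[n] placeholder in the question body with its value.
    return re.sub(r"@VAR\[(\d+)\]",
                  lambda m: str(variables[int(m.group(1))]),
                  body)

# Preprocessor section: $VAR[1], $VAR[2] and $ANS[1] from the example above.
var = {1: rand_int(5, 15),       # I, amperes
       2: rand_int(100, 1000)}   # R, ohms
ans = var[1] * var[2]            # $ANS[1] = ($VAR[1])*($VAR[2])

# Body section: the template text with placeholders.
body = ("If a current of @VAR[1] amperes passes through a resistance "
        "of @VAR[2] ohms, what is the voltage across the resistance?")

print(substitute(body, var))
print("expected answer:", ans)
```

Each student draws different values for I and R, so memorizing a classmate's numbers is of no use.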

Section tags above:

• <...>: Defines the section of the question which directs the dynamic variable construction.
• <...>: Defines the body of the question. This is where the specific content goes.
• <...>: Defines the html form input section. This is the student response section.
• <...>: Defines the worked solution section. This is optional. For interactive online learning systems, this will be an important component.

Webtest section variables:

• $MODE = mode type string; Mode describes what type of question we are asking. Specifically, this is multiple choice, numerical entry (exact or approximate), etc. The options are: APPROX_ABS, which takes a numerical response and computes the absolute error; if it falls within 10^-4, the answer is correct. APPROX_REL, which takes a numerical response and computes the relative error; if it falls within 10^-4, the answer is correct. MULTIPLECHOICE, for multiple choice answers. EQUATION, which takes a string and evaluates it as a mathematical expression.
• $NVAR = number of variables - integer;
• $VAR[n] = n-th variable;
• $NANS = number of answers - integer;
• $ANS[n] = n-th answer - may be a formula;

IV. Environment and Tools

A. Question Authoring

Since WTML is extended HTML, the question may be written in any HTML composition program. The HTML source may then be pasted into an editor where the WTML tags are inserted. HTML is very easy to use, and the question author can have any HTML text on hand [1]. The system currently supports a web interface for basic file and editing functions. This graphical user interface provides for filesystem navigation (a Unix command replacement) through the GUI. It also provides editing, question preview and exam construction utilities. The graphical user interface for question authoring could be extended greatly to facilitate question development. The system does not currently support a full GUI like FrontPage for question construction.

B. Question database

The questions are not stored in MySQL. The exam questions are currently stored as text files grouped in Unix directories according to category. Test question authors can generate questions in a variety of ways and then copy them to the correct directory. The system has an extensive collection of calculus questions. Questions are typical mid-section homework problems of a computational nature.

C. Student database

MySQL is used for student records [3]. It keeps track of the specific exams, grades, etc. Since the data is tracked in


MySQL, any front-end to MySQL may be used to access the data. The testing system provides a web interface for looking at the records, but any MySQL GUI would work. With the installation of an ODBC driver, the data can also be accessed from within Excel on MS Windows machines. The system keeps grades for all students in the system; it is not broken down by class or group. To keep track of student grades organized by class, a small gradesheet application is available which can read from the webtest database and act as a simple spreadsheet. An additional interface to the database can be set up to allow students to view their own scores and progress. This uses the same authentication system as the test delivery part. One of the main concerns of students is whether their test was received and the scores recorded. The ability to view scores alleviates this concern.

V. Server code

The server code for the test program is nearly all PHP code (with a few external routines in C). The basic components (pages) in the test system are shown in figure 2. After the user selects a test and is authenticated, the HTML for the exam is constructed by the test building page, unless the test is of the proctored variety, in which case the test must first be enabled by a proctor, usually after checking ID. Sending the HTML to the client browser, and resending it upon reload requests, is the responsibility of the test display page. When the student presses the SUBMIT button, the test grading and display pages are then called in turn.
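The reload behaviour deserves a note: the display page resends the exam already built for the session rather than building a new one, so a reload cannot yield fresh random numbers. A minimal sketch of the idea follows, with hypothetical names; the real pages are PHP and the keys live in MySQL.

```python
import secrets

# Built exams keyed by a large random string, as with the database keys the
# real system passes via cookies and hidden form variables.
_exam_cache = {}

def build_exam_page(student):
    # Stand-in for the test building page: build the HTML once per session.
    key = secrets.token_hex(16)   # long random key; reveals nothing if seen
    _exam_cache[key] = f"<form>exam for {student} ...</form>"
    return key

def display_exam_page(key):
    # Stand-in for the test display page: send, and on reload resend,
    # the same HTML instead of regenerating the exam.
    return _exam_cache[key]

key = build_exam_page("student42")
first = display_exam_page(key)
assert display_exam_page(key) == first  # a browser reload gets the identical exam
```

Because the key is a large random string, a student inspecting the page source or cookies learns nothing useful about the answers.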

[Figure 2: flow from the login and test selection page, through the proctor page and test building page, to the test display page (which handles browser reloads), then on submit to the test grading page and graded test display page, with database connections at login and grading.]

Fig. 2. Test Components

VI. Classroom results

The webtest was first used in the Fall semester of 2000 at both the University of Nevada-Reno (UNR) and the South Dakota


School of Mines and Technology (SDSMT). At SDSMT one of the authors used the system for a homework/quiz component of a Calculus III course. The course required 7 online quizzes to be completed, roughly one every two weeks. The questions were similar to typical Calculus III textbook questions, but had coefficients in the problems that were generated by calls to the random number functions. In this course, the quizzes were treated as homework in that the quizzes were not proctored or restricted to a specific lab. The students were able to complete the quizzes from on or off campus.

The initial problems were typical of getting any group onto a computer system. Most problems were small, like that of logins or passwords, which were assigned in class but forgotten by 20% of the students by the time of the first quiz. Feedback from student surveys indicated the following items:

• Many students are very nervous about computers or new technology. They will hesitate to try the system. If possible, we suggest that new computer testing or online systems be demonstrated in class.

• We found that nearly all problems were not with the exam system, but from a lack of understanding about the computer or the browser. It is also important to have a very clear instruction page as well as some form of demo.

• There is a tendency to put off work. This is not new or unique to engineering and math. However, getting the students on the system by the end of the second week of classes seems to make a difference in how they approach using the system.

• No matter how closely the problems match those in the text or the lecture, the students will believe that the problems are different. Even when the exact problems are lifted from the text (examples of homework), it will be reported that these problems are not related to the lecture or the text. These complaints arose with paper quizzes as well. However, with the exam on the computer screen, the environment tends to change and feel less familiar. Starting with a quiz of a very elementary nature, like an algebra review for a Calculus class, will smooth the transition.

• Many students preferred paper quizzes because they were accustomed to partial credit. This can be partially addressed by having more problems with a finer granularity. One benefit is that the computer expects a correct answer, not a nearly correct answer, forcing students to complete entire problems correctly. Students did not like the zero-partial-credit approach that the test forced, but it is not necessarily a bad thing. Indeed, one strategy is to allow limited retakes in place of partial credit.

• Students were in uniform agreement about the speed of the system. They thought it was fast and efficient in terms of delivering and grading the quizzes.

VII. Online learning systems

A. Webtools

The software for the question database can be used in other applications very easily. It has been used to build instructional code for teaching numerical algorithms. This project is currently in progress. Currently we cover the


Bisection method, the Secant method, Newton's method, the Fixed point method, Gauss Elimination, Gauss Elimination on the Normal equations, Jacobi's method, the Gauss-Seidel method, the SOR method, Numerical Jacobians and a Nonlinear systems solver. These codes were used as lecture supplements for an engineering numerical methods course at SDSMT in the Fall semester of 2000. The main advantage of this type of approach is that one does not present static presolved problems in the lecture. The system can walk the class or an individual through a process using text and graphics with user-defined inputs (functions or data).

B. Weblearn

To make the next step from using the system for assessment to using it for tutoring or teaching, several components need to be added. The basic content of the subject must be entered. This is a rather large undertaking. Following this, some sort of expert system (in a rather minimal sense here) must be employed to direct the student through the content. The technology is currently available. Publishers now have online texts. Combining these with the assessment tools provided here and a layer of software to direct the student, a complete online learning system may be constructed. A prototype is in the works.

VIII. Summary

There are several points on which to evaluate and summarize our experiences. Can a web-based exam system be built (proof of concept)? Does it require a great deal of resources? How effective is it? Does it lead to collateral tools? The project was a complete success in terms of proof of concept. The authors were able to construct a web-based testing system which supported the needs of mathematics-intensive courses. The authors completed the required coding in roughly a calendar year. The latter two questions are really the most important ones here and will be addressed in later papers which study the use of the tools. Preliminary student surveys indicate that students saw this as a way to provide additional homework or review material. They thought the system had real promise in the area of online tutoring. As far as the collateral tool question is concerned, we have already employed the server codes in other contexts, so it seems the answer is yes.

Acknowledgment

The authors would like to acknowledge the coding assistance of Ajaykumar Poondla and Mallikarjun Reddy Adamala.

References

[1] E. Castro, HTML for the World Wide Web, Peachpit Press, Berkeley, CA.
[2] L. Atkinson, Core PHP Programming: Using PHP to Build Dynamic Web Sites, Prentice Hall PTR/Sun Microsystems Press.
[3] R. Yarger, G. Reese and T. King, MySQL & mSQL, O'Reilly & Associates, Sebastopol, CA.
