Session F2C

THE DESIGN OF A WEB-BASED COMPUTER PROFICIENCY EXAMINATION

Nancy Kinnersley1, Scott Mayhew1, and H. Scott Hinton1

Abstract - In 1996, the Kansas Board of Regents passed a requirement that, beginning in 2001, a student applying for admission to a Kansas university must have earned one high school unit of computer technology. One option a student has for satisfying this requirement is to pass a computer proficiency examination. This paper reports on the development of a web-based proficiency test designed and developed at the University of Kansas. It discusses the initial design decisions that were made, as well as unanticipated problems that arose and had to be resolved to bring the project to fruition. Two types of security issues had to be addressed: maintaining the integrity of the question source and ensuring that the testing environment at the remote site was strictly monitored. Having to deal with different versions of browsers as well as different computing platforms presented additional challenges.

Index Terms - Computer proficiency examination, web-based test.

MOTIVATION

Many state Boards of Education ([2], [3], [6], [8], [9]) as well as individual high schools and universities ([5], [10]-[13]) now require that their students demonstrate a minimal level of computer proficiency. While this requirement may be met by passing an appropriate course, many students have, on their own, acquired most of the skills taught in an introductory computer class. Thus, the option of taking an examination allows such students to take additional courses in other areas, and it alleviates the burden that would be placed on high schools if every student were required to take a proficiency course in order to graduate. Tests currently available, both commercial ([1], [14]) and those developed by individual schools or states [7], have other drawbacks. They are often given only at prescribed times and may require a student to travel to a remote site to take the examination. Moreover, most, if not all, of the questions on a typical proficiency examination are multiple-choice. Such questions are usually insufficient to assess how well the student taking the examination can perform word processing, spreadsheet, and database tasks. Asking a student to submit documents adhering to a set of specifications requires that someone examine and grade that work. Thus, a web-based computer proficiency examination that includes interactive skills tests and is entirely computer-graded has several advantages. In an interactive skills test, the user performs word processing or spreadsheet tasks in a window that has been designed to look and function like those found in Microsoft Word or Excel. Students can take the tests in their own schools rather than having to travel to a remote testing site, and they have more flexibility in scheduling the time of the examination. The results can be returned immediately at the completion of the test. This paper describes the design and development of a web-based computer proficiency examination that the Kansas Board of Regents has approved as a way for students to satisfy the computer technology requirement of the state's Qualified Admissions Precollege Curriculum.

TEST REQUIREMENTS AND INITIAL DESIGN DECISIONS

In 1996, the Kansas Board of Regents approved a qualified admissions precollege curriculum that includes one high school unit of computer technology. The requirement first applies to students entering a state university in the fall of 2001. The proficiencies specified by the Kansas Board of Regents are divided into four knowledge content areas: Operating Systems and Hardware; Computer Software; Networking and the Internet; and Social and Ethical Issues in Computing. Included in each area are a glossary of terms the student should know and a list of the relevant skills to be attained. Under the sponsorship of the Division of Continuing Education at the University of Kansas, work on a web-based computer proficiency examination began in early June of 2000. Our first goal was to have an examination ready for beta-testing by the end of September. Several of the design decisions we made were motivated by this time constraint. Specifically, we decided to:
• design skills tests only for word processing with Microsoft Word and spreadsheets using Microsoft Excel;
• not accept alternative ways of answering questions on the interactive tests;
• support only the PC platform running Windows 95 or later; and
• require that the user's browser be Netscape 4.7 or Internet Explorer 5.0 or higher, since the skills tests were to be written using Macromedia's Flash [4].

1 Department of Electrical Engineering and Computer Science, University of Kansas, Lawrence, KS 66045-7523

0-7803-6669-7/01/$10.00 © 2001 IEEE October 10 - 13, 2001 Reno, NV 31st ASEE/IEEE Frontiers in Education Conference F2C-3

We also elected to develop the multiple-choice portions of the test using Macromedia's Dreamweaver and CourseBuilder. A MySQL database was used to hold name, password, and score information, and extensive use of both JavaScript and PHP was required to integrate all parts of the project.
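The JavaScript-to-PHP hand-off can be sketched roughly as follows. This is an assumed illustration, not the paper's actual code: the field names and the script name `record_score.php` are invented; the idea is only that the client packages a page score as form-encoded data for a PHP script to record in MySQL.

```javascript
// Build the form-encoded payload a PHP script would receive and write to
// MySQL (field names and the target script are hypothetical).
function scorePayload(user, page, score) {
  return "user=" + encodeURIComponent(user) +
         "&page=" + page +
         "&score=" + score;
}

// e.g. scorePayload("jdoe", 3, 8) could be posted to record_score.php,
// which would insert or update the student's row in the scores table.
```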

SECURITY ISSUES

In developing the examination, two types of security issues had to be addressed: ensuring that the testing environment at the remote site was strictly monitored, and maintaining the integrity of the question source. Responsibility for test registration, assignment of user names and passwords, identification and instruction of proctors, and reporting of scores to the schools was assigned to the Office of Continuing Education at the University of Kansas. All student passwords are for one-time use; if a student needs to retake the test, a new password is issued. Passwords are maintained in a database that is used to authenticate users. Most of the individuals who have signed up to take the examination are high school students, so the proctors were chosen from among the teachers at their high schools. Each proctor is assigned a name and password that must be entered before a student is permitted to begin a section of the test. After the student completes each of the three test sections, the proctor must again enter a name and password before the test window can be closed and a score entered in the database.

To maintain the integrity of the examination, several controls that disable certain browser operations had to be incorporated into the test. The main concern is that a student could determine the answers to JavaScript-based questions by viewing the HTML page source with the browser's View Source menu option. To prevent this, when a user begins a section of the test, the questions are displayed in a new browser window that contains no toolbars or menus. The right mouse button is disabled, preventing the test taker from viewing the HTML page source through the context menu. Keyboard shortcuts in the skills tests are disabled so that the test taker may only use menus to answer the interactive Word and Excel questions.
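The two browser controls described above can be sketched as follows. This is an assumed reconstruction, not the paper's code: a feature string for `window.open` suppresses the toolbars and menus, and a `contextmenu` handler suppresses the right-click menu.

```javascript
// Feature string for a chrome-less exam window: no toolbar, menubar, or
// location bar, so the View Source menu cannot be reached from the window.
function examWindowFeatures(width, height) {
  return "toolbar=no,menubar=no,location=no,status=no," +
         "width=" + width + ",height=" + height;
}

// In the browser, a test section would be launched roughly like this
// (page name is hypothetical):
//   var w = window.open("section1.html", "exam", examWindowFeatures(800, 600));
//   w.document.oncontextmenu = function () { return false; }; // block right-click
```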

FORMAT OF THE EXAMINATION

The examination consists of three sections. The sections may be taken in any order and, if desired, in separate sessions on different days. The first section consists of fifty multiple-choice questions that test general computer knowledge. It is divided into five pages of ten questions each. When a page is completed, the student must click a submit button, after which the score earned on that page is reported. Once the answers have been submitted, they cannot be changed.

When all fifty questions in the section have been answered, the total score is reported. Each page in the section is randomly selected from one of the several test versions available. Sections two and three have the same structure: each contains twenty multiple-choice questions, again in pages of ten, and seven skills tests, each on a separate page. Again, at the end of each page the submit button must be clicked. Sections two and three cover Microsoft Word and Microsoft Excel, respectively. When the window for a section is closed, the score is written to the database and a flag is set to prevent the student from retaking that section. After a student has taken all three sections, his or her scores are put into a file. Once a day, a script sends the information in the file to a computer at Continuing Education.
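The page-scoring and version-selection logic just described can be sketched as follows. This is a minimal sketch under stated assumptions: the paper does not give its internal representation, so the function names and the arrays of answer labels are invented for illustration.

```javascript
// Score one submitted page by comparing the student's choices against the
// answer key; once submitted, the page total is fixed (answers are arrays
// of choice labels such as "a".."e" -- an assumed representation).
function scorePage(answers, key) {
  var correct = 0;
  for (var i = 0; i < key.length; i++) {
    if (answers[i] === key[i]) correct++;
  }
  return correct;
}

// Pick one of the available versions of a page at random, so corresponding
// pages across versions test the same concepts with different questions.
function pickVersion(numVersions) {
  return Math.floor(Math.random() * numVersions);
}
```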

IMPLEMENTATION DETAILS

A person taking the test logs onto a server in the e-Learning Design Lab that supports this project. The server is currently set up to handle 150 simultaneous test takers. At this time, there is insufficient data on the performance of the program (such as the time to load pages and the speed of the connection) to determine whether this number should be decreased.

The initial step in developing the content of the examination was creating several files of multiple-choice questions, each file corresponding to all or part of one of the aforementioned areas of knowledge. Questions were then matched with specific objectives and incorporated into an examination. The pages of multiple-choice questions were created using the CourseBuilder extension to Macromedia's Dreamweaver [4]. Each multiple-choice question contains four or five possible answers, and the user must click a radio button to choose one of the alternatives. Because the software does not permit one to easily randomize either the questions or the answer alternatives, the multiple-choice sections of the test were subdivided into pages of ten questions each. Multiple versions of the test permit random selection of each page during the testing process. A page template was created so questions could be easily entered by cutting them from the source files and pasting them into an HTML document. Corresponding pages in each version of the test contain questions testing the same concepts.

As described in more detail below, the web pages that comprise the skills portion of the test were created using Macromedia's Flash, HTML, and JavaScript. JavaScript is used to calculate scores, time the test, and disable some user actions that might compromise the integrity of the exam. Each time a page of questions is submitted, JavaScript calculates the total score for the page and posts that total through a PHP script. It also manages the user's timer by displaying the time remaining to complete the test and, when the user runs out of time, properly scoring and closing the test. The timer, based on the system clock, is updated

every second, but is suspended while new pages are being loaded. When finished, the user is shown the total score for the section just completed as well as the scores on each of the pages.

The server-side scripts are written in PHP. These scripts receive the scoring information from the client-side JavaScript and write the results to the proper student's record in the database. PHP is also used to generate an HTML table of a user's total scores when the test is completed. The server also has the responsibility of entering students into the database and reporting their scores once all sections of the test have been completed; these tasks are performed by Perl scripts, and the Linux crontab file is modified to execute them daily. As students register for the test, their assigned names and passwords are sent (in a file) to the server to be entered into the database. A Perl script parses the information from the file and includes it in the database. The reporting of scores is handled by a Perl script that searches the database for students who have completed the test, retrieves their scores, and writes the results to a file, which is then transmitted to the Division of Continuing Education. Shown below are screen shots from the Microsoft Word and Microsoft Excel skills tests.

THE INTERACTIVE SKILLS TESTS

The interactive movies that emulate Microsoft Word and Excel were created using Macromedia's Flash software. As illustrated in the screen shots above, the question to be answered is presented to the user in a window designed to look and act like the corresponding Microsoft windows. In each movie, the user sees the menu bar and the standard and formatting toolbars. Although not all functions on the toolbars are implemented, some are active and must be used to answer certain questions. All menus are active and contain many, if not all, of the options available in their Microsoft counterparts. Dialog boxes and checkboxes appear when the appropriate commands are chosen from a menu.

Each Flash movie is a timeline of Word or Excel images. As the user triggers mouse events within the movie, the actions associated with the triggered events are performed. In the proficiency exam movies, the action is usually to move to a different index in the timeline. An example of this is when a user clicks the mouse over a menu item: Flash detects that the area set for the menu item has been clicked and moves to a frame in the movie that displays the menu over the original image. A frame in a movie can also contain another movie, which has its own events and actions. By using nested movies, every operation to be performed in Word or Excel can be contained in a separate movie and displayed over any other frame.

Creating the Word and Excel movies for the examination was very time-consuming. All of the images had to be gathered or created, cropped and sized, and then added to the frames in the movie timeline. Invisible buttons and rollover/rollout areas had to be created and placed in the proper position over the image for each frame. Actions to handle user events had to be assigned to each button and rollover/rollout area. Each action is used to move to a different frame in the movie, check values in text fields, move from a frame in an inner movie to a different frame in the main movie, check for correct completion of a question, or call a JavaScript function to post the score to the database. A lot of time was required to fine-tune the button and rollover/rollout area actions to have the same look and feel as the Microsoft products.

In the discussion of the skills tests below, the method(s) by which the answers may be obtained are included to illustrate the number and variety of movies that had to be created to accomplish the various tasks. The seven Microsoft Word skills included in the test and the way(s) in which correct answers can be obtained are:
• name and save a text file by selecting Save or Save As from the File menu, typing the name of the file into a dialog box, and then clicking Save;
• open a file by selecting its name from a list that appears in response to clicking File Open;
• replace a phrase in a text passage by choosing either Find or Replace from the Edit menu and then entering the appropriate phrases in a dialog box;
• justify a line of text by clicking an icon on the toolbar;
• change the font of a selected passage either by selecting Font from the Format menu or by choosing the new font from the popup menu on the formatting toolbar;
• insert a page break by choosing Break from the Edit menu; and
• cut out a block of text by selecting that option from the Edit menu.
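The timeline-and-events mechanism behind these tasks can be modeled in a few lines. The sketch below is plain JavaScript, not actual Flash/ActionScript, and the frame and hotspot names are invented: it shows only the core idea that a mouse event on a hotspot jumps the movie to another frame, the way clicking the File menu label displays the opened-menu frame.

```javascript
// Toy model of a Flash movie timeline: each frame maps hotspot names to
// target frames, and a click performs the gotoAndStop-style jump.
function Movie(frames) {
  this.frames = frames;   // frame name -> { hotspot name -> target frame }
  this.current = "base";
}
Movie.prototype.click = function (hotspot) {
  var targets = this.frames[this.current] || {};
  if (targets[hotspot]) this.current = targets[hotspot]; // like gotoAndStop()
  return this.current;
};

// Hypothetical fragment of the Word movie: clicking the File menu shows the
// open-menu frame; clicking Save there shows the Save dialog frame.
var wordMovie = new Movie({
  base:         { fileMenu: "fileMenuOpen" },
  fileMenuOpen: { saveItem: "saveDialog" },
  saveDialog:   {}
});
```

In the real examination each such dialog was a nested movie with its own events and actions, layered over the current frame.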
The seven Microsoft Excel skills included in the test and the way(s) in which the correct answers can be obtained are:
• change the format of a cell by choosing Cells from the Format menu and then selecting the requested format from the Number tab in a dialog box;
• create a 3-D column chart from selected cells, title the chart, and place it on a new sheet by selecting the Chart Wizard on the standard toolbar and choosing options from the dialog boxes that follow;
• sort a set of data in ascending order on one field and descending order on another by selecting Sort from the Data menu and choosing the appropriate columns in the dialog box;
• autofit a selected column by double-clicking the border between two columns or by using the Format menu;
• delete highlighted cells and shift the remaining cells to the left by choosing Delete from the Edit menu and clicking the appropriate radio button in the dialog box;
• insert a new row by highlighting the row below the insertion point and then selecting Rows from the Insert menu; and
• insert a formula to average the contents of a range of cells by choosing Function from the Insert menu or by clicking the paste function icon on the standard toolbar and then making the appropriate choice of function in the dialog box.
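Several of the tasks above require typed input, such as a file name in the Save dialog. The paper does not show its checking code, so the following is a hypothetical example of how such an answer might be validated before a question is credited; the function name and the normalization rules are assumptions.

```javascript
// Compare the value typed into an emulated dialog's text field against the
// expected answer, ignoring case and surrounding whitespace (assumed rules).
function checkTypedAnswer(typed, expected) {
  var clean = typed.replace(/^\s+|\s+$/g, "").toLowerCase();
  return clean === expected.toLowerCase();
}
```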

CONCLUSION

The beta-testing phase of the project began in November 2000. Two groups of students from Kansas high schools, as well as personnel from the Division of Continuing Education and the Office of Admissions and Scholarships, participated. Their feedback indicated that both the multiple-choice questions and the user interface required revision. As a result, many questions were reworded or replaced, and the login process and screens were redesigned. Although there were reports of portions of the tests being inaccessible, these problems were a direct result of users employing older versions of the browsers. By early January the modifications suggested by the beta-testers had been implemented. A second round of beta-testing was completed by early February. The Kansas Board of Regents approved use of the test on February 13, and it was put online on February 21, 2001. Although over 300 students have expressed interest in taking the test, at this time few have had the opportunity to do so. In the next few months, we will analyze the test results and use the feedback received to improve both the content and robustness of the examination. We hope to be able to present this additional data at FIE 2001.

ACKNOWLEDGMENT

The authors would like to thank Alan Cerveny and the staff at the Office of Admissions and Scholarships, and Sharon Graham, Lynne Lipsey, and Barbara Watkins from the Division of Continuing Education at the University of Kansas for their cooperation in this project.

REFERENCES

[1] Association of Computer Support Specialists, http://www.acss.org/quizhome.htm
[2] Georgia, http://www.fcboe.org/gradreq/NewGRADREQ.html
[3] Indiana, http://www.k12.in.us/super/082799/graduation.htm
[4] Macromedia, http://macromedia.com
[5] Minnesota, http://www.isd77.k12.mn.us/resources/infocurr/infolit.html
[6] North Carolina, http://www.dpi.state.nc.us/testing
[7] North Carolina Test of Computer Skills, http://www.catawba.k12.nc.us/pa…trac10/brenda/statecomptest.htm
[8] South Carolina, http://www.state.sc.us/sde/bts/gradreq.htm
[9] Texas, http://www.rules.state.ut.us/publicat/code/r277/r277-700.htm
[10] University of Florida, http://www.cnre.ufl.edu/computer%20requirement.htm
[11] University of Missouri, http://gep.ps.missouri.edu/ComInfProf.html
[12] University of Texas at Arlington, http://www.uta.edu/testing/info.htm
[13] Virginia Commonwealth University, http://www.mas.vcu.edu/info/cpexam/
[14] Virginia Foundation for Independent Colleges Tek.Xam, http://www.tekxam.com
