Digital System Description Knowledge Assessment
K. Jelemenská, P. Čičák, M. Jurikovič and P. Pištek
Institute of Computer Systems and Networks, Faculty of Informatics and Information Technologies, Slovak University of Technology Bratislava, Ilkovičova 3, 842 16 Bratislava, Slovakia
[email protected]
Abstract— The paper presents a knowledge assessment approach developed to support the Digital system description course. In the course, the students should comprehend the methods and techniques used in digital system design and gain skills in digital systems modelling using hardware description languages (HDL). For many years the course exams were carried out only in pen-and-paper form. The students hated writing HDL programs on paper, and the teachers equally disliked correcting programs on paper. What is more, this type of exam did not allow verification of the student's ability to debug the created model, which is a substantial part of the student's practical skills. That is why, some years ago, we concentrated our work on designing a knowledge assessment system that would enable reliable evaluation of practical skills in the area of digital system description without substantial teacher involvement. Several applications have been developed and tested in the course. One of the best is presented here.
I. INTRODUCTION

Knowledge and skills assessment has always been a problem in courses devoted to programming languages, whether assembly or higher-level languages [3,4]. The problem is very similar in the case of hardware description languages (HDL). To master a programming or modelling language, the students should solve various problems using the language. It is useless to learn language constructs or exact syntax by heart without writing any program. That is why programming courses often adopt the following assessment scheme. The lab exercises are based on more or less individually allocated assignments that the students can solve at school or at home. The solutions are evaluated by teachers and usually form a part of the overall course assessment. Unfortunately, there are always some students who try to find the easiest way to get the lab work results (e.g. they search for a complete solution on the internet, copy the solutions from their classmates, etc.). The lab teachers usually have no chance to find out whether the students really did the programming work themselves or not. They can only check whether the students understand their solutions and whether they are able to perform some minor changes in them, which is not an easy task. Another problem is that different teachers might use different scales for assignment evaluation. As a result, many students do not acquire the required knowledge and practical skills, even though they get the highest evaluation for the lab assignments. This part of the course assessment is often neither objective nor realistic, although it requires a lot of the teachers' work. The situation is even
worse when distance learning is considered, where teachers do not have the possibility to check the students' understanding of the solutions. The second part of the course assessment is usually formed by an exam, sometimes supplemented by midterm tests. The tests and exams are often performed in a traditional way – using pen and paper. Although this type of exam gives minimum chance for cheating, there are many reasons why it is not suitable for practical skills assessment. First, the students have no chance to check the correctness of their programs or HDL models before handing them in for evaluation. Secondly, this type of exam does not allow verification of the student's ability to debug the created program or model, which is a substantial part of the student's practical skills. Finally, handwritten works are very difficult to read, correct, and evaluate. It is very demanding and time-consuming work for teachers to evaluate programs or models written on paper, and the result is often uncertain.

In order to make the course assessment process more objective, fair, and effective, there was an urgent need to redesign the assessment. One of the solutions assumes that the substantial part of the course assessment will be shifted from the assignments to skills-based midterm tests. At the same time, new technology will be developed to perform the exam and the midterm tests. The new applications, specially designed for this purpose, will enable the testing of not only knowledge but also skills, all in an automated manner.

II. LEARNING MANAGEMENT SYSTEMS CAPABILITIES

To ease the work needed for knowledge and skills assessment in the course, various Content Management Systems (CMS), Learning Management Systems (LMS), Computer-Based Training (CBT) or Web-Based Training (WBT) systems can be used which support automated knowledge and/or assignment assessment. Nowadays there is a vast number of such systems, but most of them are unable to evaluate a student's skills using an online test. Various systems supporting knowledge testing and/or assignment assessment have been reviewed in [2]. The emphasis was placed on the supported question and test types and on automated assessment. The existing systems can be divided into two groups.

Test Creation Tools: These tools represent standalone solutions. The tests created using these tools cannot usually be connected to learning content and must be used en bloc. The created tests conform to e-learning standards and can therefore be integrated into an LMS. Another advantage is that these systems
are better elaborated and support more question types, and test creation is easier and more flexible.

Learning Management Systems (LMS): These systems support e-learning and provide assignment management subsystems as well as testing subsystems. All considered LMS support the e-learning standards SCORM, ADL and IMS.

The capabilities of the reviewed systems are summarized in Table I.

TABLE I. ASSIGNMENT AND QUESTION TYPES SUPPORT IN REVIEWED SYSTEMS
Reviewed systems: (1) QUIZIT, (2) Toolbook Instructor, (3) eDoceo, (4) Program Autor, (5) Macromedia Authorware 7, (6) ATutor, (7) Dokeos, (8) Ilias, (9) HotPotatoes, (10) Moodle, (11) WebToTest. (+ = YES, - = NO)

                                   (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11)
Assignment management               -   -   +   -   -   -   +   -   +   -    +
Question types:
  Program                           +   -   -   -   -   -   -   -   -   -    -
  Multi-choice single answer        -   +   +   -   +   +   +   +   +   +    +
  Multi-choice multi answer         -   +   +   -   +   +   +   +   +   +    +
  True/False                        -   +   +   -   +   +   +   -   +   +    +
  Gap filling                       -   +   +   -   +   +   +   +   +   +    +
  Matching                          -   +   +   -   -   -   -   +   +   +    -
  Slider                            -   -   +   -   -   -   -   -   -   -    -
  Arrange objects                   -   -   +   -   -   -   -   -   +   +    -
  Place object                      -   -   +   -   -   +   -   -   -   +    -
  Identify correct picture region   -   -   -   -   +   +   -   -   +   -    -
  Identify correct object           -   -   -   -   -   +   -   -   -   -    -
  Crossword                         -   -   -   -   -   -   -   -   -   +    -

As we can see, most of the reviewed systems support several types of test questions, but only WebToTest enables students to edit, compile, and debug a program and submit it using a special form. We were not able to find any system supporting skills-based assessment in the area of HDL modelling. This fact inspired us to develop several skills assessment systems supporting languages like VHDL and SystemC. The standalone testing application presented in [5,12] and the Drag&Drop Moodle module [13], which have been used for students' knowledge and skills assessment in the last academic year, are two examples of them. The special VHDL testing module presented in this paper is one of the latest solutions, suitable for online skills-based examination in the area of digital systems modelling using VHDL. The module, integrated into the Moodle LMS, is already a part of the set of assessment tools used in the Digital system description course.
III. VHDL MOODLE MODULE DESIGN
A. Functional Requirements

There are three types of users in the module: student, teacher, and system.

1) Student: The student is the user with minimum rights. He or she can perform only the actions allowed by users with higher rights and therefore cannot modify the functionality of the system.

Authentication – Students must log in to the system using the Moodle built-in authentication system. They will have access to the Digital system description course, where a VHDL test is created as a special activity.

Source code editing – A test consists of a test assignment and a VHDL source code editing window, which can contain a program core created by the teacher. Comfortable work during the test is supported by VHDL syntax highlighting, similar to that used in ModelSim.

Source code compilation – Students can compile their source code an unlimited number of times. The compilation errors are returned to the students by the system.

Source code simulation – The compiled model can then be simulated, using a test bench prepared by the student or by the teacher. The student receives the signal waveforms as output.

Source code submission – Students can upload their partial solutions to the server at any time. After the final submission, the system further processes the student's solution so that it can be evaluated automatically. The final submission of the source code is activated automatically when the time allocated for the test elapses.

Results review – Students can see the results of all their VHDL tests in a separate module.

2) Teacher: The teacher can administer his or her teaching module and adapt it to his or her own needs (e.g. adding teaching aids, activities, and tests, creating forums, etc.). The VHDL test is created by adding the special new activity to the course. Several activity attributes can be set during this process.

Instance name – Using this name the teacher can locate and download or review the results of the respective activity.

General instructions – These instructions are optional and can be used to display any remarks concerning the test.

Start of the test – This attribute can be used to schedule the test at any date and time. The test will not be accessible to students before this time.

Test duration – The teacher has to set the time that the students can spend working on the test. After this time, the final source code submission, followed by the test evaluation, is activated automatically.

Test assignment – The teacher creates a text form of the assignment describing what the students should design.

Program core – The teacher can copy parts of the solution into the core, which will be displayed in the student's VHDL source code editing window. Typically, at least the entity declaration of the designed circuit is provided to make the student's design consistent with the reference. The teacher's test bench can also be provided here, in case it should be made visible to students. Although this part is optional, if the teacher does not
use it, the system will usually not be able to perform automated evaluation.

Reference test bench – This test bench will replace the student's test bench for the final simulation after the final source code submission.

Reference solution – The complete solution, including the reference test bench, should be provided here. This solution is used to generate the reference waveform used for automated evaluation. The reference solution can also be used for source code comparisons.

Other parameters of the test include the maximum number of points, the difference penalty, the syntactical correctness gain, and the figure to be displayed. After the test, the teacher can use another module to review the solutions and possibly alter their evaluation. There is also the possibility to download all the solutions of the selected test for further revision or archiving.

3) System: Automated tasks are performed by the system. Its rights must be chosen carefully to prevent system malfunction.

Source code storage – The system must guarantee storage of the student's solution any time it is required to do so. The source code must be stored correctly even though it may contain special characters such as apostrophes, etc.

Source code compilation – The source code is compiled each time a student requests it. The actual compilation is done by invoking the vcom compiler, an integrated part of ModelSim.

Source code simulation – The compiled model is simulated each time a student requests it. Simulation is done by invoking the vsim simulator, an integrated part of ModelSim.

Final source code submission – This process is activated by the system either automatically, when the time allocated to the test elapses, or interactively by the student's command. As part of this process, the student's test bench is replaced by the teacher's, and the test is automatically evaluated by comparing the resulting waveform to the reference. For each difference, the assigned penalty is subtracted from the maximum number of points. If the source code compiles but there are too many differences, the syntactical correctness gain is returned as the result. If the source code is not syntactically correct, it can still be compared to the reference solution to find partial similarities.

B. System Architecture

A client-server architecture model was used, as shown in Fig. 1. The LMS Moodle, together with the new module, is installed on the server. This web-based application requires a web server – in this case the Apache HTTP Server is used. A database server is also required. Although several types of databases can be used, the MySQL database is probably best supported. On the client side, a web browser is used to communicate with the server through a specially developed user interface. The most important module functions are source code syntax verification and design simulation. To compile and simulate VHDL code, the commercial ModelSim software from Mentor Graphics is used. The compiler and the simulator can be launched from the command line and therefore initiated from another application. The vcom output – the compilation errors – can be redirected to a file. The output of vsim is a file in the .wlf format, containing the simulation results.
Fig. 1. The system architecture
ModelSim is installed on the server. It is called by a PHP application in console mode, with the provided source code as input. ModelSim processes the input and returns the output to the PHP application. With this solution, the problem of server overloading might arise when there are too many users. In that case, the requests could be processed in parallel using, for example, a connected cluster of processors.
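For illustration, such a console-mode run could look roughly like the following shell sketch. The file names and the design unit name are hypothetical examples, and only standard ModelSim commands and options (vlib, vcom, vsim -c, -wlf, -do) are used; this is not the module's actual code.

    # minimal sketch of a non-interactive compile-and-simulate run (names are examples)
    vlib work                               # create the working library
    vcom source.vhd > compile.log 2>&1      # compile the model; messages are redirected to a file
    # run the simulation in console mode and log all signals into a .wlf file
    vsim -c -wlf result.wlf -do "log -r /*; run -all; quit -f" work.testbench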
TABLE II. TABLE MDL_VHDL: STORES DATA FOR MODULE INSTANCES

Name               Description
ID                 Identifier of the module instance
Course             Identifier of the course where the instance belongs
Name               Name of the course where the instance belongs
Nadpis             Assessment title
Intro              Opening instructions and test directives
Zaciatok_testu     Time (in epoch format) when the test will start
Dlzka_testu        Test duration in minutes
Text_zadania       Text of the assessment
Kostra_kodu        Source code scheme (core) in VHDL
Testovacia_vzorka  Test sample which will be used for compilation and simulation
Obrazok            Name of the picture which will be used during the test
Pocet_bodov        Maximum number of points
Bodova_zrazka      Points deducted for one difference
Body_kompilacia    Base number of points given to the student in the case of successful compilation
IV. VHDL MOODLE MODULE IMPLEMENTATION

The system implementation uses the MySQL database. This database is used by the Moodle system as a storage place for all data except files, which are stored on disc. The VHDL testing module uses data from this database as well. However, some new database tables had to be created to store the data associated with this module.

A. VHDL Testing Module

Two new database tables were created in order to store the data associated with this module: mdl_vhdl (Table II) stores data for module instances, and mdl_vhdl_odovzdane (Table III) stores the source code of particular students. The fields identify the specific test and student, as well as the full source code and the results of the evaluation. The first names and surnames of students are stored for test archiving purposes – even if the Moodle user database is cleared, the complete tests remain available.

B. Module for Reviewing Submitted Assignments

This module also needs its own table, where all created instances are stored. The table is called mdl_prehlad and contains only three items: ID, Course, and Name. No other tables are necessary, since the module uses data from the mdl_vhdl_odovzdane table, which holds all of the VHDL code submitted by students.

C. Shell Script for Compilation and Simulation

To compile and simulate the VHDL model, a shell script is called from the PHP application. It consists of three main branches.

1) The compilation branch creates a new subdirectory in the directory /tmp/data/, named using the user ID. In this subdirectory, the library work and the file source.vhd are created. Thereafter, the file is compiled with the vcom compiler. The compiler output is returned to the PHP application.
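As an illustration only, the compilation branch could be sketched as follows; the argument handling and file names are assumptions based on the description above, not the module's actual code.

    #!/bin/sh
    # sketch of the compilation branch: arguments are the user ID and the submitted VHDL file (assumed)
    USER_ID="$1"
    SRC="$2"
    DIR="/tmp/data/$USER_ID"

    mkdir -p "$DIR" && cd "$DIR" || exit 1
    cp "$SRC" source.vhd

    vlib work                            # create the library work
    vcom source.vhd > compile.log 2>&1   # compile source.vhd with vcom
    cat compile.log                      # the output is read back by the PHP application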
TABLE III. TABLE MDL_VHDL_ODOVZDANE: STORES SUBMITTED SOLUTIONS

Name             Description
ID               Identifier of the submission
Course           Identifier of the course which was used for the examination
Test             Name of the test which was submitted
User_id          Identifier of the user who submitted the test
Zadanie          Assignment of the test which was solved by the user
Kod              Code which was sent by the user
Cas_zacatia      Time (in epoch format) when the test was started
Cas_odovzdania   Time (in epoch format) when the test was submitted
Hodnotenie       Points evaluation of the student
Result           Results of the compilation and simulation
Meno             First name of the student
Priezvisko       Surname of the student
Fig. 2. The correction algorithm
2) The simulation branch is a bit more complicated. First, the existence of the library work is checked. If it is present, the simulation .do script is dynamically created. This script is passed to the vsim simulator and causes the test bench simulation to be run. It is necessary to name the test bench “TESTBNCH”. If the simulation runs without errors, the reference and the student's waveforms (the simulation outputs) are compared. A comparison .do script is dynamically created for this purpose and passed to the vsim simulator. The outputs of the simulation and the comparison are returned to the application.

3) The final evaluation branch cleans up all the temporary files created during the test. Then the file source.vhd is created once more, this time using the reference test bench. The file is compiled and simulated analogously. The outputs are then returned to the PHP application, where they are processed and
the results are calculated. The algorithm for the evaluation of student solutions is given in Fig. 2.
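A rough sketch of the simulation branch, under the same assumptions as above, is given below. The generated .do commands and file names are illustrative; the dynamically created comparison script that matches the student's waveform against the reference is only indicated by a comment.

    #!/bin/sh
    # sketch of the simulation branch, run in the user's working directory (assumed layout)
    cd "/tmp/data/$1" || exit 1
    [ -d work ] || { echo "library work not found"; exit 1; }

    # dynamically create the simulation .do script; the test bench must be named TESTBNCH
    printf 'log -r /*\nrun -all\nquit -f\n' > sim.do

    # console-mode simulation; the signal waveforms are written to student.wlf
    vsim -c -wlf student.wlf -do sim.do work.TESTBNCH > simulate.log 2>&1
    cat simulate.log

    # a second, dynamically generated .do script would then compare student.wlf
    # with the reference waveform and return the differences to the PHP application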
D. Test Creation Shell Script

After the instance of the testing activity is created by the teacher, a reference waveform in the form of a .wlf file is automatically generated. A shell script is used for this purpose, called after the test creation form is submitted. This script compiles and simulates the reference solution entered by the teacher. The output of the simulation, a .wlf file, is then stored together with the other files associated with this testing activity instance (such as the image). They are all stored in the directory moodledata/courseid/instanceid.
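Under the same assumptions, the reference waveform generation could be sketched as follows; the location of the moodledata directory and the file names are hypothetical, while the overall flow follows the description above.

    #!/bin/sh
    # sketch of the test creation script: build the reference .wlf from the teacher's solution
    COURSE_ID="$1"
    INSTANCE_ID="$2"
    REF_SRC="$3"                                     # reference solution including the reference test bench
    OUT="/var/moodledata/$COURSE_ID/$INSTANCE_ID"    # assumed moodledata location

    TMP=$(mktemp -d) && cd "$TMP" || exit 1
    cp "$REF_SRC" source.vhd

    vlib work
    vcom source.vhd > compile.log 2>&1
    vsim -c -wlf reference.wlf -do "log -r /*; run -all; quit -f" work.TESTBNCH > simulate.log 2>&1

    mkdir -p "$OUT"
    cp reference.wlf "$OUT"/                         # stored with the other files of this activity instance
    rm -rf "$TMP"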
V. VHDL MOODLE MODULE USER INTERFACE
A. New Instance of VHDL Test Creation

The screenshot given in Fig. 3 represents a part of the test creation form. This form is accessible only to the teacher or administrator. The teacher sets all the test attributes here – for example, the duration of the test, the test bench, the reference solution, etc.

B. VHDL Test Elaboration

The screenshot shown in Fig. 4 illustrates the student interface of the designed module. Shown at the top of the screen are the remaining time, the test assignment text, and an optional scheme. Below these parts is the VHDL source code editing window, where the provided core should be completed with the student's VHDL code. VHDL syntax highlighting, similar to that used by ModelSim (used during the term), is provided to make the editor familiar to students.
Fig. 3. The teacher screen – Test creation

Fig. 4. The student screen – Test elaboration
VI. RESULTS AND EXPERIENCE

We started to use the new testing module in the course Digital system description to evaluate students' practical skills during midterm tests. In the course, the students take three midterm tests and one final exam. In each midterm test, the students are divided into three groups based on the testing environment – TabletPC, Drag&Drop, or VHDL Moodle test. Each student takes each type of test once. However, the test assignment is the same regardless of the test type. This constraint ensures that the severity of all the test types is more or less the same. Fig. 5 shows the results of the first and the second midterm tests – the distribution of average test attainment. As we can see, the students reached the best attainment in the Drag&Drop test. The reason is that in this case the correct solution is already given in the test and the students just have to find the correct statements and place them correctly into the VHDL model.

Fig. 5. The first and the second midterm test results

After the test, the students were asked several key questions. The answers to these questions are shown in Fig. 6. As we can see, most of the students think that the Drag&Drop test, composing a model from given statements, is the best way of testing practical skills.

Fig. 6. Survey results

VII. CONCLUSION

The first results prove that the kind of online test described previously provides a more realistic image of students' knowledge of digital systems modelling, compared to tests based on the types of questions commonly used in online tests. Because the VHDL Moodle test requires creativity from the students, the test results are worse compared to the Drag&Drop test. The presented testing solution brings a number of advantages. First, there is a substantial reduction in the demanding and time-consuming work of the teacher, relative to the correction of paper exams. At the same time, the level of difficulty of the exam is preserved. Second, it enables the teacher to check the students' skills in the area of model debugging, which was previously not possible. All participants in the examination have at their disposal a source code editor with syntax highlighting, a VHDL language compiler, and the simulator outputs. The application gives the teacher a means to comfortably manage the testing system. Finally, the students have the opportunity to verify their designs before sending them for evaluation.

ACKNOWLEDGMENT

The support by the Slovak Science Grant Agency (VEGA 1/0649/09 "Security and reliability in distributed computer systems and mobile computer networks") and the HP Technology for Teaching grant 2007 "Mobile Education Center" is gratefully acknowledged.

REFERENCES

[1] M. Mayer, "Distance Learning Support for Digital Systems Description Course," master's thesis, FIIT STU Bratislava, 2009.
[2] P. Polačko, "Learning Support for Specification and Description Languages Course," master's thesis, FIIT STU Bratislava, 2007, 86 pp.
[3] Ch. Douce, D. Livingstone and J. Orwell, "Automatic Test-Based Assessment of Programming," ACM Journal of Educational Resources in Computing, Vol. 5, No. 2, September 2005.
[4] C. A. Higgins, G. Gray, P. Symeonidis and A. Tsintsifas, "Automated Assessment and Experience of Teaching Programming," ACM Journal of Educational Resources in Computing, Vol. 5, No. 3, September 2005.
[5] Š. Beleš, K. Jelemenská, M. Jurikovič, P. Pištek, T. Polák, J. Zeman and M. Žucha, "Mobile Testing System," in ICETA 2008: 6th International Conference on Emerging e-Learning Technologies and Applications, High Tatras, Slovakia, September 11-13, 2008. Košice: Elfa, 2008, ISBN 978-80-8086-089-9, pp. 133-138.
[6] Š. Beleš, M. Jurikovič, P. Pištek, T. Polák, J. Zeman and M. Žucha, "Learning Support for Specification and Description Languages," team project, FIIT STU Bratislava, 2008, 83 pp.
[7] J. Kytka, "Learning Support for Specification and Description Languages Course," master's thesis, FIIT STU Bratislava, 2006, 85 pp.
[8] N. Pollock and J. Cornford, "Theory and Practice of the Virtual University, report on UK universities use of new technologies," http://www.ariadne.ac.uk/, June 2000.
[9] J. Bruggeman, H. M. Long and C. Misra, "Effective IT Security Practices" (ID: EDU06165), presented at EDUCAUSE Annual Conference (10/09/2006).
[10] M. B. Brown, J. M. Smith and C. Windham, "From Today's CMS to Tomorrow's Learning Support System," presented at EDUCAUSE Annual Conference (10/11/2006), winner of the 2006 EDUCAUSE Catalyst Award.
[11] A. Hemmi, N. Pollock and C. Schwarz, "If not the Virtual University then what? Co-producing E-learning and Configuring its Users," 8th European Conference on Media in Higher Education (Best Overall Paper), GMW, University of Duisburg, September 16-19, 2003.
[12] K. Jelemenská, M. Jurikovič, P. Pištek and P. Čičák, "Automated Skills Assessment in Mobile Environment," in Research, Reflections and Innovations in Integrating ICT in Education, Vol. 1, V. International Conference on Multimedia and Information & Communication Technologies in Education (m-ICTE 2009), Lisbon, Portugal. Badajoz: Formatex, 2009, ISBN 978-84-692-1789-4, pp. 487-491.
[13] K. Jelemenská, E. Tomalová and P. Čičák, "Knowledge Assessment Improvement in Moodle Environment," in Research, Reflections and Innovations in Integrating ICT in Education, Vol. 2, V. International Conference on Multimedia and Information & Communication Technologies in Education (m-ICTE 2009), Lisbon, Portugal. Badajoz: Formatex, 2009, ISBN 978-84-692-1790-0, pp. 979-983.
[14] J. Genči, "Knowledge Assessment – Practical Example in Testing," Journal of Communication and Computer, Vol. 6, No. 8, 2009, pp. 65-69, ISSN 1548-7709.