ENHANCING ABET EC2000 PREPARATION USING A WEB-BASED SURVEY/REPORTING TOOL

Fong Mak 1, Stephen Frezza 2, Wook-Sung Yoo 3

Abstract - This paper describes the use of an on-line survey and reporting tool to enhance ABET EC2000 preparation and to support consistent student evaluation of instruction. The tool includes interfaces to support the chair, faculty, and student processes necessary to publish, collect, and report survey information on learning outcomes from individual courses. This work includes tracking the relationships of course outcomes to departmental and program outcomes, as well as use of the tool to collect and analyze data to support continuous quality improvement. The paper also describes an EC2000 process, the integration of the survey and reporting tool into that process, and the benefits derived from its use.

Index Terms - EC2000 process, ABET assessment tools, course-exit survey, web-based survey.
INTRODUCTION

The Department of Electrical and Computer Engineering (ECE) of Gannon University has recently completed a major effort in restructuring and refining its departmental goals and objectives, catalog, and operations in preparation for its next ABET visit under the EC2000 criteria [1]. To ensure a coordinated effort, Gannon's School of Engineering formed an ABET steering committee consisting of representative faculty from the two departments that support ABET-accredited programs: ECE and Mechanical Engineering (ME). Representatives from the Computer and Information Science (CIS) department also participate, as CIS plans to seek ABET accreditation in the near future. The committee is responsible for evaluating the process in place in each department and for ensuring consistency among departments in meeting the ABET criteria. Among other coordinating efforts, the committee developed a common format for end-of-semester course evaluation (i.e., a course-exit survey) for each course that pertains to ABET evaluation. This survey consists of the course name, instructor information, course objectives, and assessment methods, and includes sections containing a qualitative questionnaire, a quantitative questionnaire, and an ABET criteria checklist. Both faculty and students are required to complete these surveys to support the quality improvement of the programs. While valuable for providing timely and useful data, the effort involved in conducting the surveys and in analyzing and reporting the results seemed prohibitive.
To streamline the survey process, the ECE department developed a web-based survey and reporting tool to collect feedback data from faculty and students for every course taught. The system also enhances our ability to compile and analyze the data used in our EC2000 assessment process. A key component of this effort is the measurement of course outcomes by identified program constituents [2], among them faculty and students. Early estimates of the effort needed to collect data from these constituents effectively, analyze the data, and make the results available to constituents for evaluation in a reasonable time were well beyond the limited resources available. Such resource constraints are consistent factors in encouraging programs to adopt on-line survey systems [3]. The web-based survey and reporting tool facilitates collecting, analyzing, and publishing the survey data in a timely manner. The ECE, ME, and CIS departments used this web-based tool in their EC2000 processes during the spring 2002 semester with an acceptable level of additional effort. This paper describes the use of the tool in support of data collection and analysis, and the enhancement it brings to the departments' continuous quality improvement process. The following section first describes the general features of the web-based survey/reporting tool; its detailed design is reported in [4]. We then describe the ABET process in use by the ECE program. The subsequent section describes how the tool has been integrated into the ABET evaluation process and the impact it has had on that process.
WEB-BASED SURVEY/REPORTING TOOL

The web-based survey/reporting tool was conceived to help the department Chair reduce the effort required to (a) assist faculty in collecting objective evidence to satisfy the ABET EC2000 criteria, (b) maintain and track that objective evidence, (c) automate parts of the faculty members' ABET documentation process, (d) facilitate the reporting of survey results, and (e) improve student feedback rates relevant to program goals assessment. Since the Chair has the primary duty of overseeing the operation and process of program evaluation, and given the size and limited resources of the program, the web-based tool was designed around the Chair as the primary administrator of the ABET process.
1 Fong Mak, Gannon University, Electrical and Computer Engineering, Erie, PA 16541, [email protected]
2 Stephen Frezza, Gannon University, Electrical and Computer Engineering, Erie, PA 16541, [email protected]
3 Wook-Sung Yoo, Gannon University, Computer and Information Science, Erie, PA 16541, [email protected]
Figure 1 depicts the set of constituent relationships and the corresponding web-based survey/reporting tool features that support the ABET process.

FIGURE 1: GENERAL RELATIONS FOR THE WEB TOOL SETUP
[Reconstructed from the original diagram:]
• Super User Site: set up new survey; set up/edit email preferences; modify/edit users' course access; modify/edit survey dates; edit user rights.
• Chair/Dean Site: monitor course info setup status; monitor survey assessment status; perform trend line analysis; all features listed in the faculty site.
• Faculty Site: view course objectives/assessment methods; set up course-exit survey; view instructor's responses; view students' responses; extend survey date; print outcome matrix; print survey report in Word; assess survey; view past course info; view past courses taught.
• Student Site: view course objectives/assessment methods; assess course-exit survey; view past course info.

Each arrow in Figure 1 indicates a "forward relation" between sites: for example, the Chair/Dean can view information entered by faculty and students, whereas instructors can view only certain student information. The super user is responsible for (a) setting up the site for use at the beginning of the semester, (b) website maintenance and updates, (c) email generation, and (d) system administration. The next level down is the Chair or Dean level; the Chair and Dean roles share the same privileges in accessing information. At this level, each Chair can oversee his or her own department's activities related to the web-based surveys (such as viewing survey results for each course, printing reports, etc.). At the faculty level, each instructor is responsible for his or her own course information setup, self-evaluation, and report generation for the courses taught. The student role allows students to answer the course-exit surveys.

Faculty Site

At the beginning of each semester, the Chair sends a general email reminding faculty members to set up their course information for that round of surveys. Course instructors access the site using their university ID and password. Once logged in, instructors have access to the features listed in the faculty site box in Figure 1. The "Setup course-exit survey" feature links to three important pages:
• Course objectives and assessment page (Page 1)
• Qualitative and quantitative assessment page (Page 2)
• Criteria fulfillment page (Page 3)

Each of these pages is discussed in order. Figure 2 outlines how faculty members use Page 1 to define the course objectives and assessment methods for each course.

FIGURE 2: PAGE 1 --- COURSE OBJECTIVES AND ASSESSMENTS
[Page layout: navigation links (Home, Past Courses Taught, Past Course Info); general instructions, with the option to start from a past course template; editable entries for course objectives (Obj1, Obj2, ...), assessment methods (homework, examinations, ...), and textbook & supporting tools (text, references, software); a table relating each assessment method to the objectives it measures; a link to continue to Page 2.]

The underlined texts in Figure 2 are links that allow instructors to enter course objectives, assessment methods, and textbook & supporting tools, for example. A dynamic table that relates assessment methods to objectives is provided on this page as well. Once set up, this page is generally available for viewing throughout the semester. Qualitative and quantitative assessment (Page 2) is an essential aspect of the survey. The information provided by the instructor on this page defines the main assessment page for both faculty and students at the end of the semester. Figure 3 shows the quantitative and qualitative sections, with each objective rated from "strongly disagree" (1) to "strongly agree" (5). An additional quantitative section is pre-defined to rate course items such as (a) presentations/class notes, (b) homework & project assignments, (c) textbook(s), (d) software resource(s), and (e) other resource(s). Each course item is rated 1-5, from "not at all" to "extensively." A pre-defined qualitative section is included here as well, containing questions that the ABET committee believes will benefit the evaluation process for each course offered. The last key information provided by the instructor is found on the criteria fulfillments page (Page 3), shown in Figure 4. There are three sets of defined criteria: (a) the ABET a-k criteria, (b) defining-society (e.g., IEEE, ASME, etc.) program-specific criteria, and (c) department competencies criteria. Figure 4 depicts how program-specific and department-specific competencies are selected from among those defined for the ECE program. Instructors mark the relationship(s) between their course objectives and the criteria (competencies) accordingly.
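To make this criteria bookkeeping concrete, the following minimal sketch shows one plausible way to persist the checkmarks, one row per (objective, criterion) pair. It is written in Java with JDBC; the table and column names (objective_criteria, criterion_code, etc.) are illustrative assumptions, not the tool's actual schema.

```java
import java.sql.*;

/**
 * Sketch: persist the checkmarks from the criteria-fulfillments page
 * as one row per (course objective, criterion) pair. The table and
 * column names are illustrative, not the tool's actual schema.
 */
public class CriteriaMappingDao {
    private final Connection con;

    public CriteriaMappingDao(Connection con) { this.con = con; }

    /** Record that an objective addresses a criterion, e.g. ABET "a". */
    public void mark(String courseId, String term, int objectiveNo,
                     String criteriaSet, String criterionCode) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                "INSERT INTO objective_criteria (course_id, term, " +
                "objective_no, criteria_set, criterion_code) VALUES (?, ?, ?, ?, ?)")) {
            ps.setString(1, courseId);      // e.g. "ECE_346_01"
            ps.setString(2, term);          // e.g. "02/FA"
            ps.setInt(3, objectiveNo);      // 1 for Obj1, 2 for Obj2, ...
            ps.setString(4, criteriaSet);   // "ABET", "IEEE", or "DEPT"
            ps.setString(5, criterionCode); // e.g. "a", "k", "C5", "D2"
            ps.executeUpdate();
        }
    }
}
```

Storing the pairs in this normalized form would make the outcomes matrix of Figure 6 a simple pivot query later.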
Once the faculty portion is completed, the system automatically generates all surveys and makes them available on the web site at the appropriate time.

FIGURE 3: PAGE 2 --- QUANTITATIVE AND QUALITATIVE SECTION
[Page layout: navigation links (Home, Past Courses Taught, Past Course Info) and instructions. Quantitative section: "Please rate this group of students' ability with respect to the following aspects of the course" -- each objective (Obj1, Obj2, ...) on a 1-5 scale with a "not applicable" button; then "Rate how much each of the following course items contributed to their learning so far" -- presentations/class notes, homework & project assignments, textbook(s), software resource(s), other resource(s), each on a 1-5 scale with a "don't know" button. Qualitative section: comment on any material that should be omitted from the course in the future; comment on any material whose presentation should be shortened. A link continues to Page 3.]
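The paper does not detail how the surveys are instantiated; as a hedged illustration, a single INSERT...SELECT over a hypothetical enrollment table could create one pending survey record per enrolled student once the instructor finalizes Pages 1-3:

```java
import java.sql.*;

/**
 * Sketch: once an instructor finalizes Pages 1-3, create one pending
 * survey record per enrolled student so the form appears automatically
 * during the assessment window. The survey and enrollment table names
 * are assumptions for illustration.
 */
public class SurveyGenerator {
    public static void generate(Connection con, String courseId, String term,
                                Date opens, Date closes) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                "INSERT INTO survey (course_id, term, student_id, opens, closes) " +
                "SELECT e.course_id, e.term, e.student_id, ?, ? " +
                "FROM enrollment e WHERE e.course_id = ? AND e.term = ?")) {
            ps.setDate(1, opens);    // start of the evaluation window
            ps.setDate(2, closes);   // deadline (extendable by the Chair)
            ps.setString(3, courseId);
            ps.setString(4, term);
            ps.executeUpdate();
        }
    }
}
```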
Student Site

Toward the end of the semester, each student begins to receive periodic emails containing (a) a request to complete his or her survey form(s) online, (b) the period available for evaluation, and (c) a list of current courses that need evaluation (including hyperlinks). These emails are repeated until the student has completed the surveys for all registered courses. Once a student completes a course evaluation, that course is removed from the list sent to the student to ensure data integrity; the student thus knows exactly how many course evaluations remain. The main features available to students are listed in the student box in Figure 1. Students can access Page 2 (see Figure 3) during course evaluation and enter the quantitative and qualitative data at their own pace. However, students have only one chance to submit a completed form. From the students' perspective, the survey site is quite simple and straightforward. The data provided by the students is kept strictly anonymous: individual instructors have access only to the summary of information from their courses, along with a list of which students have not completed the survey. As this data serves as the basis for survey-based program evaluation, only Chairs and Deans can view the data across a particular program. This feature is available through the Chair/Dean site.
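A minimal sketch of the reminder logic described above, assuming hypothetical enrollment and survey_response tables; actual email delivery (e.g., via JavaMail) is omitted:

```java
import java.sql.*;
import java.util.*;

/**
 * Sketch of the reminder logic: list, per student, the registered
 * courses that still lack a completed course-exit survey. The
 * enrollment and survey_response tables are hypothetical.
 */
public class ReminderJob {
    public static Map<String, List<String>> pendingSurveys(Connection con)
            throws SQLException {
        Map<String, List<String>> pending = new HashMap<>();
        try (PreparedStatement ps = con.prepareStatement(
                "SELECT e.student_id, e.course_id FROM enrollment e " +
                "WHERE NOT EXISTS (SELECT 1 FROM survey_response r " +
                " WHERE r.student_id = e.student_id " +
                " AND r.course_id = e.course_id)");
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                // Group un-evaluated courses by student for one email each.
                pending.computeIfAbsent(rs.getString(1), k -> new ArrayList<>())
                       .add(rs.getString(2));
            }
        }
        return pending; // completed courses drop out of the list automatically
    }
}
```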
FIGURE 4: PAGE 3 --- CRITERIA FULFILLMENTS
[Page layout: navigation links (Home, Past Courses Taught, Past Course Info) and the instruction "Identify how this course contributes to the program objectives." A criteria-fulfillments grid marks each course objective (Obj1, Obj2, ...) against the ABET criteria, from (a) "apply knowledge of mathematics, science, and engineering" through (k) "ability to use techniques, skills and modern tools"; the IEEE competencies for ECE-related degree programs; and the department-specific competencies. A "Final submission" link completes the setup.]
Chair/Dean Site

The Chair and the Dean share the same privileges in accessing the site, which helps promote consistency in monitoring the information being set up; the Chair has the primary oversight responsibility. The supporting features of interest are "Monitor course info setup status" and reporting features such as "Print outcome matrix" and trend line analysis (see Figure 1). Figure 5 depicts the course monitoring page. Marks in the "Setup Status" column indicate that the instructor has completed setting up the course survey; marks in the "Assess Status" column indicate that the instructor has completed his or her own self-evaluation at the end of the semester. This feature allows the Chair to remind any instructor to complete these tasks (e.g., when submission deadlines get close); a status query of the kind sketched below could drive this page. The "change" feature allows the Chair to respond to instructor requests for a deadline extension for either the survey setup or the self-assessment period, and the "print" feature prints the entire report for each course in that semester. Figure 5 also shows the link to the reporting features, which generate a program outcomes matrix. Figure 6 shows a sample outcomes matrix (in Excel format). The matrix tabulates all course objectives for the selected term versus the criteria fulfillments entered by the instructors (see Figure 4). The course outcomes matrix is a great convenience for the Chair: it is formed as soon as instructors enter their criteria fulfillments (Page 3) for each course. The Chair can print a single-semester outcomes matrix or combine courses from multiple semesters into one matrix; to capture all courses offered by the department, the multiple-semester format is appropriate. This matrix helps form a closed-loop picture of how the department's course offerings fulfill the overall department goals.
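The sketch below shows such a status query; the course table with setup_done/assess_done flags is an assumption for illustration:

```java
import java.sql.*;

/**
 * Sketch: the data behind the Chair's monitoring page (Figure 5) --
 * per course, has the instructor finished the survey setup and the
 * end-of-semester self-assessment? Schema names are hypothetical.
 */
public class MonitorPage {
    public static void printStatus(Connection con, String dept, String term)
            throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                "SELECT c.course_id, c.setup_done, c.assess_done " +
                "FROM course c WHERE c.dept = ? AND c.term = ? " +
                "ORDER BY c.course_id")) {
            ps.setString(1, dept);
            ps.setString(2, term);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // One line per course/section, mirroring Figure 5's columns.
                    System.out.printf("%-12s setup:%s assess:%s%n",
                            rs.getString(1),
                            rs.getBoolean(2) ? "Y" : "-",
                            rs.getBoolean(3) ? "Y" : "-");
                }
            }
        }
    }
}
```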
FIGURE 5: CHAIR'S COURSE MONITORING PAGE
[Page layout: navigation links (Home, Report, Past Courses Taught, Past Course Info), instructions, and a department selector. A table lists each course/section number with a checkmark or question mark in the "Setup Status" and "Assess Status" columns, a "change" link for extending dates, and a "print" link per course. "Continue" and "Department List" controls complete the page.]
FIGURE 6: SNAPSHOT OF A COURSE OUTCOMES MATRIX
[Excel layout: one row per course objective, grouped by course ID and term (e.g., 02/FA ECE_346_01 Obj1-Obj3 and ECE_345_01 Obj1-Obj4, for courses such as "Circuit" and "Adv Digital Design"); one column per criterion -- ABET Crit (a), Crit (b), ..., Crit (k), IEEE criteria (C1, C2, C5, ...), and department outcomes (D1, D2, D6, ...) -- with an X marking each objective/criterion fulfillment.]
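The matrix itself is a pivot of the stored objective-criterion pairs. A sketch, reusing the hypothetical objective_criteria table from the earlier example and emitting tab-separated text rather than the tool's actual Excel output:

```java
import java.sql.*;
import java.util.*;

/**
 * Sketch: pivot the stored (objective, criterion) pairs into the
 * outcomes matrix of Figure 6 -- rows are course objectives, columns
 * are ABET/IEEE/department criteria, cells hold an X. Schema names
 * are hypothetical; output is tab-separated text, not Excel.
 */
public class OutcomesMatrix {
    public static void print(Connection con, String term) throws SQLException {
        // Row key "course Obj_n" -> set of criterion codes it fulfills.
        Map<String, Set<String>> rows = new TreeMap<>();
        SortedSet<String> columns = new TreeSet<>();
        try (PreparedStatement ps = con.prepareStatement(
                "SELECT course_id, objective_no, criterion_code " +
                "FROM objective_criteria WHERE term = ?")) {
            ps.setString(1, term);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    String row = rs.getString(1) + " Obj" + rs.getInt(2);
                    String col = rs.getString(3);
                    rows.computeIfAbsent(row, k -> new TreeSet<>()).add(col);
                    columns.add(col);
                }
            }
        }
        System.out.println("Course/Objective\t" + String.join("\t", columns));
        for (Map.Entry<String, Set<String>> e : rows.entrySet()) {
            StringBuilder line = new StringBuilder(e.getKey());
            for (String col : columns) {
                line.append('\t').append(e.getValue().contains(col) ? "X" : "");
            }
            System.out.println(line);
        }
    }
}
```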
Another useful analysis feature is trend line analysis. With a click of a button, the Chair can select one or more courses for trend line analysis across multiple semesters/offerings. This feature generates an Excel table that captures the overall average rating for each objective specified in each course (Figure 3) on the y-axis versus the terms offered on the x-axis; a trend line curve can then be plotted from the generated table. If the same course has been offered for multiple terms, the average rating trend line across terms can be plotted. In our case, this feature becomes useful at the end of spring 2003, when we will have data for our first set of repeated courses. The trend line serves as a guide for judging how well a particular course is being taught: an upward trend indicates improvement, while a downward trend flags the course for attention.
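The underlying table is a straightforward aggregation; a sketch, assuming a hypothetical survey_rating table with one row per student rating:

```java
import java.sql.*;

/**
 * Sketch: per-term average rating for each objective of one course --
 * the table behind the trend-line plot. The survey_rating table is
 * hypothetical; the real tool exports this data to Excel and plots
 * the trend line there.
 */
public class TrendLine {
    public static void print(Connection con, String courseId) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                "SELECT term, objective_no, AVG(rating) " +
                "FROM survey_rating WHERE course_id = ? " +
                "GROUP BY term, objective_no ORDER BY term, objective_no")) {
            ps.setString(1, courseId);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // One point per (term, objective): the y-values of the plot.
                    System.out.printf("%s Obj%d avg=%.2f%n",
                            rs.getString(1), rs.getInt(2), rs.getDouble(3));
                }
            }
        }
    }
}
```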
All of these analysis and reporting capabilities are key to supporting the chairs in ABET preparation. However, some common activities (including error correction) extend beyond the role of an individual chair or dean, and are assigned to a site super user.

Super User Site

The super user is responsible for setting up the site at the beginning of each semester. Figure 1 depicts the features available to the super user. The "Setup new survey" feature automatically archives the previous semester's survey results before initiating a new survey. The "Modify/edit survey dates" feature allows the super user to change the survey dates as well as the deadlines for the faculty-site features "Setup course-exit survey", "Assess course-exit survey", "View instructor's responses", and "View students' responses". The "Setup/edit email preferences" feature provides links to set up the reminder emails sent to both faculty and students, including their interval, frequency, and target dates. The "Edit user's rights" feature is where the super user modifies a user or adds a new user with Chair privileges to the site. The "Modify/edit user's course access" feature provides the ability to extend deadlines for courses of instructors from any department; these modifications override the Chair's privileges for his or her department.

The survey tool was designed to run on the Windows platform. Both Active Server Pages (ASP) and JavaServer Pages (JSP) techniques were employed to compare the performance of different implementations, and the survey tool runs using either Tomcat or JRun as a server. The system relies on MS SQL Server for database management, but it can be configured to work with Microsoft Access as well. The overall development effort involved primarily the ECE Chair, a CIS faculty member, and five graduate students on a rotating basis over a period of a year and a half. The extended timeline was primarily due to changing requirements, as the needs and feature requests of the users (faculty from all three departments) required the continued addition of new features. For details on the system design, please refer to [4]. Now that we have presented the core features of the tool, the following section outlines the EC2000 process that the tool enhances.
PROGRAM ASSESSMENT PROCESS

The ECE department's program assessment process begins with the university mission, as shown in Figure 7. The department goals are defined well within the bounds of the Gannon mission. The curricular outcomes are judged against the ABET criteria and the IEEE department-specific criteria, as well as our department-specific competencies. With these in mind, each course's objectives and assessment methods are carefully examined and set up for better coordination among courses, in order to reach complete coverage of the program outcomes that drive toward the program goals. As for the program implementation, input from faculty and staff, as well as the support of the department facilities and web-based tools, are part of the equation in the process of quality improvement. As indicated in Figure 7, there are two main assessment components: the department goals assessment and the curricular outcomes assessment. For the department goals assessment, we use (a) alumni survey results, (b) senior-exit survey results, and (c) the course outcomes matrix generated by the web-based tool as guides in assessing the health and quality growth of the program. We believe that this set of tools gives a longer-term perspective for measuring our program goals.
As for the curricular outcomes assessment, there are two mechanisms: (a) faculty & facilities resources assessment and (b) course assessment. The ECE department conducts its laboratory and software needs evaluation for the following semester in the middle of the prior semester. This process is instituted to give better control and coordination of the laboratory and software resources, as well as a smoother software implementation process for both the lab technician and the faculty. Results from the senior-exit surveys are explicitly included in these reviews and affect the lab needs evaluation. Lastly, the tools used for course assessment are quizzes/tests, homework, labs, and projects, whose summary information and evaluation are managed most effectively by the web-based course-exit survey. Once the data is collected, it is analyzed and evaluated to generate action items or proposed changes to the department goals and/or curricular outcomes. The proposed action items or changes are reported to the ABET committee before they are disseminated to the student body via student meetings (e.g., IEEE/HKN) and to the college's Engineering Advisory Council (EAC), respectively. The changes are then implemented accordingly, forming a closed-loop process for quality improvement of our ECE program. Other departments implement a similar process.
OPERATIONAL FLOW OF PROGRAM ASSESSMENT PROCESS

To better understand how the web-based survey/reporting tool is integrated into the program assessment process, we follow the process from the beginning of the fall semester to the end of the following summer. Figure 8 depicts the entire operational flow of this process, which repeats annually. At the beginning of the fall semester, the web-based survey tool is first set up by the super user (currently the ECE Chair). Individual instructors are given a specified period to set up the course objectives and assessment methods for each course they teach and to have them ready for that semester. At an early-semester welcome gathering for returning students and freshmen, we remind students of the department goals and expectations, as well as the importance of their role in our quality improvement process. Once the semester settles in, the department faculty analyze the data collected during the previous spring semester. As indicated in Figure 8, data from the senior-exit survey, alumni survey, course-exit survey, the outcomes matrix, and the action items from the last semester's evaluation process are all available. We feel that this data analysis task would have been overwhelming without the web-based tool; with it, the quantitative and qualitative results for each course are readily available. Using the report feature, the Chair can print the survey results for the entire department's courses into a single MS Word document. The document contains, for each course, the survey results (Pages 1-3, as shown in Figures 2-4) as completed by faculty as a self-evaluation and by the students in the class at the end of the semester. It is a fairly large document, running five or six pages per course, depending on the number of courses offered per term. The results are then reviewed in the department meetings. In addition, the outcomes matrix shown in Figure 6 is also generated for that term. In general, special sessions of the department meetings are allocated for the curricular outcomes assessment and for the department goals assessment.
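The paper does not say how the Word document is produced; one period-typical technique, shown here only as an assumption, is to emit HTML under a Word content type from a servlet, so that the browser opens the report in Microsoft Word:

```java
import java.io.*;
import javax.servlet.http.*;

/**
 * Sketch: serve a report that opens in Microsoft Word by emitting
 * HTML with a Word content type. Whether the actual tool uses this
 * technique or builds a native .doc file is not stated in the paper;
 * the servlet name and report body are hypothetical.
 */
public class WordReportServlet extends HttpServlet {
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        resp.setContentType("application/msword");
        resp.setHeader("Content-Disposition",
                "attachment; filename=course-exit-report.doc");
        PrintWriter out = resp.getWriter();
        out.println("<html><body>");
        out.println("<h1>Course-Exit Survey Report</h1>");
        // ... append Pages 1-3 plus aggregated student responses per course ...
        out.println("</body></html>");
    }
}
```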
FIGURE 7: ECE PROGRAM ASSESSMENT PROCESS FLOW
[Flow: GU Mission -> ECE Department Goals -> ECE Program Curricular Outcomes (judged against the ABET criteria and the IEEE department-specific criteria) -> ECE Courses, Course Goals & Objectives, Assessment Methods -> ECE Program Implementation (faculty, staff, facilities, web tools) -> ECE Program Assessment Plan: department goals assessment (alumni survey, senior-exit survey, general outcomes matrix) and curricular outcomes assessment (faculty & facilities resources assessment -- lab evaluation, software needs evaluation, senior-exit survey -- and the course assessment plan: quizzes/tests; homework, labs, projects; web-based course-exit survey) -> Data Analysis/Evaluation -> Proposed changes to course goals, objectives, prerequisites, or assessment -> Data dissemination to constituents (students and EAC) -> Implement changes.]
Faculty openly review each other's course survey results, both qualitatively and quantitatively. Open-session discussion of this nature has benefited our department tremendously. Toward the end, action items are generated for the betterment of the program's health. As a byproduct, this process actually helps bring the faculty closer together.
FIGURE 8: OPERATIONAL FLOW OF PROGRAM ASSESSMENT PROCESS
[Fall semester: survey site setup (course goals/objectives, assessment methods); department-wide review/analysis of the previous spring's data (course-exit survey, senior-exit survey, alumni survey, general matrix evaluation) for action items and proposed changes, and review/closure of action items from the previous fall's analysis; ABET committee report/review on proposed changes for the next spring; informing constituents (students and faculty) of the department goals/objectives for both undergraduate and graduate programs; review of the department resources assessment plan (lab software needs, equipment/instrument needs); semester-end course survey assessed by students and faculty. Spring semester: survey site setup; department-wide review/analysis of the previous fall's data and closure of action items from the previous spring; ABET committee report/review on proposed changes for the next fall; review of the department resources assessment plan; data dissemination to the IEEE and HKN student bodies; Engineering Advisory Council subcommittee on curriculum/objectives provides feedback on the overall outcome assessments; semester-end course survey assessed by students and faculty. Summer: annual final report on program assessment, including data for both fall and spring.]
Towards the middle of the semester, a lab planning session evaluates lab, software, and equipment resource needs for the next semester and the near future. The results and action items are then presented and discussed at the ABET committee meeting to promote consistency among departments. During the last two weeks of the semester, the survey site is open for students and faculty to complete the surveys. Similar activities repeat in the following spring semester, as indicated in Figure 8, except that the action items and/or proposed changes are disseminated to the IEEE/HKN student body and the EAC members. By doing so, we close the loop of informing our constituents, students and industry members, of our quality improvement process. Last but not least, the annual ABET-style report on the year's progress is prepared during the summer break.

In sum, the operational flow of data collection, analysis, and publication would be a heavy burden, if not impossible, for a department of our size and resources without the aid of the web-based tool. Faculty members can once again focus on their teaching and scholarly activities without spending excessive time on data compilation to satisfy ABET preparation. The Chair can also readily review faculty teaching performance with the tool and assist faculty in addressing weak areas where appropriate. This web tool has certainly enhanced our ABET preparation process.

REFERENCES

[1] Engineering Criteria 2000, 3rd Edition: Criteria for Accrediting Programs in Engineering in the United States, Accreditation Board for Engineering and Technology (ABET), Baltimore, MD, 1997. Available: http://www.abet.org/EAC/eac2000.html.
[2] Royer, E., Wright, C., and Peterson, D., "Assessment for Electrical Engineering Programs - Processes Implemented at the United States Air Force Academy," IEEE Transactions on Education, Vol. 43, No. 2, May 2000.
[3] McGourty, J., Scoles, K., and Thorpe, S., "Web-based course evaluation: comparing the experience at two universities," 32nd ASEE/IEEE Frontiers in Education Conference, T1B-17, Boston, MA, November 2002.
[4] Mak, F., Frezza, S., and Yoo, W.-S., "Web-based course-exit survey for ABET EC2000," ASEE Annual Conference and Exposition, Nashville, TN, June 2003.