Evaluation of a Computer Networking Class in Information Technology

C. Richard G. Helps
Information Technology, Brigham Young University, Provo, UT, USA
+1 (801) 422-6305
[email protected]

J. J. Ekstrom
Information Technology, Brigham Young University, Provo, UT, USA
+1 (801) 422-1839
[email protected]

ABSTRACT

Information Technology (IT) is a rapidly developing discipline. IT instructors often design new courses to meet changing needs. There is also a need to evaluate courses once they have been designed and implemented. Evaluation of courses leads to improvements in the learning experiences for students and to a better understanding of the educational process and outcomes for course designers and instructors. Evaluation and improvement of quality is also a requirement for programs accredited under ABET (and other accrediting bodies).

A networking class in IT was selected for evaluation. Networking was selected because it is one of the core required topics in the IT curriculum. The course was compared to national standards for curriculum in the networking area. A single class was evaluated, but methods were developed which can be applied to other courses in the IT discipline. The evaluation study included evaluation of the course content and the course structure. Student input was obtained through a survey instrument, and the validity of that input was considered.

Several questions were identified for this evaluation: Did the networking class meet student needs and expectations in terms of content and teaching approach? What are the students' preferred learning styles? What coursework should a networking class include compared to what was actually taught? Did success in the earlier foundation class lead to success in the networking class? How well did students perform in this class relative to their performance in the IT major as a whole?

Pursuing these questions involved gathering data from students as well as researching student records. Appropriate mechanisms were developed to protect student privacy. The evaluation of this class led to a number of useful insights and recommendations concerning technical class content, teaching and learning styles, and the evaluation process itself. The methods of the study can be used for other courses and for other IT programs.

Categories and Subject Descriptors
K.3 [Computers and Education]: Networking, Evaluation

General Terms
Management, Measurement, Documentation, Human Factors

Keywords
Course Evaluation, Course Design, Networking

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. SIGITE'08, October 16–18, 2008, Cincinnati, Ohio, USA. Copyright 2008 ACM 978-1-60558-329-7/08/10...$5.00.

1. Introduction
Courses in higher-education technology disciplines are typically designed by the faculty who teach them. Frequently these faculty members have little or no formal training in educational design; they tend to follow their own experience or organizational guidelines when creating courses. Faculty designers share new course designs with colleagues at conferences and in journals. At the 2007 SIGITE Conference, multiple papers described new courses or new aspects of courses [1, 2, 3, 4, 5, 6, 7, 8].

Most of these papers included some evaluation of the effect of the new design on student learning, often using a student survey, but they do not include in-depth evaluation of the course structure and have few references to course outcomes. Many similar papers can be found in other conferences and journals, e.g., Courte, 2007 and Strooper & Meinicke, 2007 [9, 10].

Descriptions of detailed evaluations of IT courses are much rarer, in particular evaluations which include not only some measure of the effectiveness of the course but also how the course meets program goals and outcomes, together with an evaluation of the structure of the course and the effect this may have on its short- and long-term success. Some studies do address issues such as learning styles and course structure, e.g., Bryant, 2004 [11]. These are more useful for evaluating both the immediate efficacy of a course and its underlying structure and design.

For programs accredited under ABET, and for instructional designers who wish to achieve specific learning outcomes, course design and evaluation, including outcomes assessment, is much more significant. Evaluation of the course design must include evaluation of the design of the outcomes.

The evaluation reported here was performed on a computer networking class. Computer networking is one of the required core disciplines in Information Technology (IT), and developing evaluation techniques for this class is not only valuable in itself but can be generalized and applied to other core classes in the IT major.

The present study was designed to address issues of student satisfaction as well as course design, course structure and the selection of course outcomes. In addition, the study considers how the course compares to established curriculum standards for IT required courses, and it considers the effect of course design on the inevitable changes that occur in the rapidly changing field of IT. A series of evaluation questions was developed in consultation with the instructor and the course evaluator, and various evaluation techniques were applied, including surveying the students. These are discussed in detail in Section 3.

An interesting aspect of this study was the importance of student opinion. The authors felt that the students were an important and informed source of information on the course content, and in order to estimate the value of the students' opinions, parts of the survey addressed the students' computing experience and background.

2. Student Experience in Computing
In evaluating a networking class the students represent a key stakeholder population. It was felt that many of them were deeply involved in computing and would have valuable, relevant input into both the curriculum and the instructional strategy. The validity of their opinion was assumed to be somewhat dependent on their background in the discipline. Although they are still undergraduate students, anecdotal evidence suggested that they have significant depth of experience in computing and that their opinions should be considered informed. Therefore it was decided to measure how much time and effort they had previously invested in their chosen field outside of the classroom. The results were as follows:

- Hours/week in computer use at work: 20-40 hours (median response)
- Hours/week in computer use at home: 20-40 hours (median response)
- Hours/week in computer use in college labs: 10-19 hours (median response)

The sum of these three numbers indicates that these students are typically spending 50-100 hours per week actively working with computers.

- Experience in IT (excluding academic involvement): 1-4 years (median response)
- Personal IT exploration: "At least monthly" (median response)

This last item was the response to the question "How often do you typically try out new IT developments at your own expense and in your own time (not class work)?"

Combining this data with their experience in IT suggests that a typical student has been involved in tens of independent computer explorations prior to entering the networking class. This evaluation included multiple observations of class sessions and discussions with students. Anecdotal discussions with students indicate that many of their explorations involve setting up ad-hoc computer networks; connecting networks to television or audio-visual systems to create multi-mode and recording systems; installing experimental operating systems; developing new websites; exploring new ways of communicating over networks; creating home computer systems involving multiple machines; building or modifying computer systems using cards, motherboards and other components; and trying (and often discarding) new software packages. In-class, "show-of-hands" surveys also indicate that more than half the students are the local "computer experts" for their local community, i.e., for their student housing, family or neighborhood communities.

The general picture presented by this measured and informal data is that working with computers consumes a large percentage of the students' time, that they are enthusiastic about their chosen field, and that they are recognized, at least by ordinary people, as being skilled in what they do. This depth of experience adds weight to their opinions in any survey of computing education.

3. The Evaluation Questions
The evaluation questions were selected to gain a deeper understanding of the learning in the class and the students' reactions to the course structure. After considering the needs of the students and the instructors, the following evaluation questions were chosen:

1. Did the networking class meet student needs and expectations in terms of technical content?
2. What coursework should a networking class include compared to what was actually taught?
3. What are the teaching and learning styles employed and how do these styles relate to student preferences?
4. Does success in the earlier foundation class predict success in the networking class?
5. How well did students perform in this class relative to their performance in the IT major as a whole?

In a broader sense there are many questions that would be of interest to course designers and faculty in the IT program, such as relationships between performance in all the earlier and later courses in the major, as well as consideration of the appropriateness of the content of all courses. Such an evaluation would require many resources, and it would be very difficult to track the performance of students longitudinally. The validity of longitudinal studies would also be compromised by the relatively frequent changes in curriculum and course content, due to the newness and nature of the IT discipline. This evaluation project explores a few of these topics, both to answer a few specific questions and to explore the need and opportunity for more extensive studies. As a result this evaluation is based on a 'snapshot': specifically, it is based on the performance of one section of students in one class, with retrospective consideration of their performance in a previous prerequisite class.

4. BYU Networking Courses
Like other IT programs, BYU has offered required coursework in networking since the program's inception. BYU has several courses that contribute to the students' understanding of networking. Three of the courses are key contributors:

IT210: Fundamentals of Web-Based Information Technology
IT327: Digital Communications
IT347: Computer Networks

The IT327 course is relatively independent of the other two courses and focuses on the physical, modulation and encoding aspects of networking.

It is mostly concerned with the first two layers of the ISO seven-layer network model, the physical and data link layers [12]. The IT210 course, despite the web emphasis in the title, is a key contributor to students' understanding of networks in general. This course addresses several basic technical issues within the discipline and includes modules on operating systems, databases and network configuration. Since networking is a fundamental technology for web systems, this course teaches networking concepts as well as web system concepts. It is a required prerequisite for the IT347 course.

IT347, built on the foundations of IT210, covers the core topic areas of networking, including most of the seven layers of the networking model, but with minimal coverage of those network layers addressed by the IT327 course. Since IT347 is the core course for teaching networking, it was chosen as the primary target for the evaluation. The effects of the IT210 prerequisite course on IT347 were also considered.

5. Evaluation Methodology
Course documentation in the form of syllabi, course outlines, assignments, exams, textbooks, etc. was available and was used in the study, particularly for evaluating course structure. The class was taught as normal by the regular instructor, who was also the course designer. The evaluator, in consultation with the instructor, designed a survey instrument to assess aspects of the class performance and address the selected evaluation questions. The survey instrument was administered to the class near the end of the semester. In addition to the survey instrument, students also responded to the standard class evaluation survey which is administered to every IT class at the end of each semester.

To explore student performance relationships, several grade data points were gathered. These included the students' grades in the networking (IT347) class, their grades on the final exam and their grades for the laboratory work. Their grades for the prerequisite class (IT210) and their major GPA were also collected so that grades could be related to previous work. Student anonymity was protected as discussed below. Although the data gathered in the survey was largely course-related rather than personal, it was felt that if the students knew the responses were anonymous they might be more comfortable providing frank and truthful responses, since both the instructor and the evaluator are members of the IT faculty. The assistance of the college student-advisement center was enlisted. Each student was provided with an alias, which the student used to respond to the survey instrument. However, the authors wanted to relate student responses to performance data. Therefore the advisement center researched the necessary performance data (grades etc.) for each student and linked it to the aliases. The complete package of anonymous data was then returned to the research team with all the necessary data linked to alias names.

Parenthetically, the evaluator has been involved in exit interviews with students from this program for some years and has generally found them to be very willing to express their opinions openly, both positive and negative, when invited to do so.

The data was collected through an on-line web site. This allowed the students to respond at their leisure from any point where the Internet was available. The web site collected the data, which was downloaded into a spreadsheet for further processing. Once the data was collected, statistical analysis was performed on it. In addition, the student responses to the survey instrument questions were analyzed.

The response rate was high: 32 of the 36 students responded to the survey instrument and all 36 responded to the standard class survey. Response is encouraged but voluntary; whether or not a student responded was not tracked by the instructor or the evaluator, and no compensation was provided for responding. The class was also observed. The evaluator visited the class and labs a couple of times a week throughout the semester to observe teaching styles and student-instructor interactions, and to better understand the technical domain content of the course. Informal notes were kept of these observations and were used to validate or confirm some of the other results.

6. Evaluation of the Course Technical Domain Content
One of the evaluation questions was whether the technical domain content was appropriate for the course. This was evaluated through three mechanisms: the content was compared against a nationally recognized model curriculum, and then two different sets of student responses were used.

Two different responses were obtained from the students asking for their opinion of the course content. The first is from the standard survey each IT class is asked to take each semester, which is narrowly focused on the stated class outcomes. The second is part of the survey instrument designed for this evaluation.

6.1 Comparison with National Standards
The course outcomes were compared against the IT model curriculum [13]. The standard for comparison is the model curriculum for IT programs recommended by the Joint Task Force for Computing Curricula. There have been several versions of this recommendation document: it was first issued in 2001 (CC2001), and the current version was issued in 2005 (CC2005) and is being updated. These documents are widely accepted nationally as providing a good basis for technical-domain content for various computing disciplines. The task force is sponsored by the largest professional organizations in computing: the Association for Computing Machinery (ACM), the IEEE Computer Society (IEEE-CS) and the Association for Information Systems (AIS). The ACM and IEEE claim hundreds of thousands of members and are nationally and internationally recognized as leaders in the field of computing.

The specific topics recommended for networking in IT programs from CC2005 are as follows [13, version: April 22, 2008]:

NET1. Foundations of Networking (3)
NET2. Routing and Switching (8)
NET3. Physical Layer (6)
NET4. Security (2)
NET5. Network Management (2)
NET6. Application Areas (1)

There are six networking sub-domains listed. The numbers after each section in the above list represent the minimum number of lecture hours that should be spent on each sub-topic. The standard stresses that traditional lecture approaches are neither required nor specifically recommended; however, they provide a recognizable basis for estimating the effort of presenting topics. Each "lecture hour" unit is assumed to incorporate three out-of-class hours of labs or homework. Details of the expectations in each of these six sub-domain areas can be found in the full report, which is available through the Internet.

Using this as a basis, the IT347 course outcomes were evaluated. The course has 28 outcomes. Each outcome was compared against the standard as shown in Table 1 below.

Table 1. Defined course outcomes for IT347 (CC2005 equivalent in parentheses)

1. Understands computer networking concepts and vocabulary (NET1)
2. Has received experience with real implementations of the concepts (-)
3. Has built an intuition of the relationship between standards bodies and technology (NET1)
4. Has built confidence through problem solving (-)
5. Describe the role of a node and a link in an abstract network model (NET1)
6. Define bandwidth, latency and throughput (NET1)
7. Explain how bandwidth * latency = data volume is related to the concept of traditional "volume" (NET1)
8. Describe the operation of a packet-based sliding window protocol (NET2)
9. List the 7 layers of the OSI model and compare them to the layering used in the Internet model (NET1)
10. Explain the concept of encapsulation and its relationship to layering in the network models (NET2)
11. Explain the operation of CSMA/CD and its implementation in Ethernet (NET2)
12. Describe how an IP packet is carried in an Ethernet frame (NET2)
13. Explain how TCP's byte-stream sliding window is related to a traditional packet-based sliding window algorithm (NET2)
14. Explain the differences between a hub, switch, bridge, and router (NET2)
15. Explain the relationship between an 802.1D bridge and a modern switch (NET2)
16. Describe how a learning bridge "learns" what port should be used to forward a packet to a device (NET2)
17. Describe how Spanning Tree works and explain why it is necessary (NET2)
18. Compare and contrast distance-vector and link-state routing algorithms (NET2)
19. Describe the evolution of Internet addressing, including class-based and classless (CIDR) addressing and their relationship to subnetting (NET2)
20. Explain the operation of the components of a SOHO wireless edge router, including DHCP, NAT/PAT, the routing function, the switching function and the relationship to the wireless access point (NET2)
21. Describe how DNS works in the global Internet, including caching and root servers (NET5)
22. Explain why guaranteed delivery is not desirable in certain multimedia protocols (NET2)
23. Implement a simple communications protocol using sockets and experience the issues associated with getting it to work with others' implementations (NET2)
24. Implement a simple web server (NET6)
25. Explain how 802.11 collision avoidance works (NET2)
26. Troubleshoot connectivity problems in a host occurring at multiple layers of the OSI model (NET3)
27. Access MIBs from devices using SNMP on a workstation (NET5)
28. Describe the operation and configuration of Virtual LANs on an 802.1Q switch (NET5)

Although lecture hour units are not listed in the table, a cursory examination of the curriculum and observation of the class showed that the minimum hours required were easily exceeded for NET1 and NET2. Noticeably missing from this list are entries for NET3 (Physical Layer); these aspects are covered extensively in IT327. Also missing is NET4 (Security); security aspects are covered in a separate required class in the program, Information Assurance and Security (IT466). Content for NET6 (Application Areas) is only addressed in course outcome #24; however, that outcome is taught through several multi-hour lab assignments, and thus the course meets and exceeds the 1-hour minimum expected for this sub-domain. Although NET5 (Network Management) is addressed in two of the outcomes (#27, #28), it does not appear to meet all the detailed requirements of the CC2005 model, and therefore it is recommended that this area be strengthened. With this exception, all the minimum requirements of the model curriculum in the area of networking are more than satisfied. It is also noted that some of the course outcomes (e.g., #2 and #4) do not directly address areas of the recommended networking model curriculum; however, they address more general areas of learning, such as "problem solving."

6.2 Standard Class Assessment
Each semester all students taking IT courses are asked to evaluate the class in terms of achievement of the stated course outcomes. For the IT347 class the outcomes are listed in Table 1. Students responded on a 1 to 5 scale, where 1 is the poorest achievement of the outcome and 5 is the best. After the survey data was collected and analyzed, the instructor commented on the student responses (this is also standard departmental practice).

There was an error in the standard class survey for this particular semester.

Although 28 outcomes were listed and surveyed in the class survey, two of the outcomes, as listed, were identical: #25 was repeated in place of #27. This was a typographical error made when the survey was created. Interestingly, 5 of the 34 students answered the two duplicate questions differently. A few statistics were calculated with the error included and with it excluded, and the differences as a fraction of the overall class response are minimal (fourth significant digit). For simplicity's sake this analysis used both versions of student response for question #25, thus smoothing out the different responses of the 5 students.

Generally students rated the achievement of outcomes highly, with an average across all outcomes and all students of 4.3/5.0. The lowest-scoring outcome was #11, with an average score of 3.5/5 ("Explain the operation of CSMA/CD and its implementation in Ethernet"). The instructor's comment was that this topic was de-emphasized since it is not very relevant to the field today, and he queries whether it should be removed from the list of objectives. This low-rated item and the instructor's comment are interesting indicators of an issue that arises in higher-education technology classes. The technology evolves rapidly and continuously. The changes in specific technologies are exponential and are governed by Moore's Law [14, 15]. If course outcomes are defined too strongly in terms of specific technologies, then the courses have to be re-designed every time the technologies change. On the other hand, if the course does not include the current leading-edge technologies it will not serve the needs and interests of the students. It is recommended that course outcomes and course design be focused, as far as reasonably possible, around concepts and learning goals, and that specific technologies be implemented and discussed within the scope of those concepts. This helps to reduce the constant "churn" in course redesign [16, 17]. This is discussed further under the analysis of outcome types below.

Student comments included a few concerns that will be discussed in more detail in the context of the more detailed survey instrument.

6.3 Survey Instrument Assessment
The survey instrument designed for this study did not address the technical domain topics from the viewpoint of objectives or outcomes; rather, it listed 17 technical topics as taught in the class through lectures and labs and asked the students to rate the relative importance of each topic. The students rated the topics from two viewpoints: first, how important each topic was as presented in the class, and second, how important each topic should be, in their opinion. A five-point Likert scale was used for each of these assessments. The 17 technical topics assessed are shown in Table 2 below.

In order to highlight the differences between these two assessments, the average score was calculated for each topic and then the difference between the averages was calculated (the 'diff' value in Table 2). Positive numbers indicate the students felt the topic should receive more emphasis; negative numbers indicate that the students felt the topic should receive less emphasis. The average difference between actual and desired emphases was 0.7, with a standard deviation of 0.5.

Table 2. List of technical topics taught in IT347 (diff = desired minus actual emphasis)

1. Socket programming for network communication/interoperability issues (-0.9)
2. Designing and setting up network systems (+2.1)
3. History and impact of standards on technology development and adoption (-0.1)
4. IETF and ISO-OSI 7-layer networking models (-0.1)
5. Layer 2 (MAC) addressing (+0.5)
6. Hubs/switches/bridges/algorithms/comparisons (+0.7)
7. IP addressing (+1.0)
8. Distance vector protocols/link state protocols (+0.8)
9. MPLS (+0.3)
10. Autonomous systems and BGP (+0.8)
11. TCP, UDP and sliding windows (+0.3)
12. DNS, DHCP (+1.2)
13. Threads (-0.1)
14. Client-server models (-0.5)
15. Switching and forwarding, bridge learning algorithms (+0.3)
16. Network management and SNMP (+0.9)
17. NAT, PAT (+0.9)

The topics that the students most strongly felt should receive more emphasis were the following six, ranked from most important to least important according to the student responses:

1. "Designing and setting up network systems" (+2.1) (4 standard deviations): Not only was this the largest single result, but student comments also reinforced this conclusion.
2. "DNS, DHCP" (+1.2) (2.3 standard deviations): This too should be emphasized more.
3. "IP addressing" (+1.0) (1.9 standard deviations)
4. "Network management and SNMP" (+0.9) (1.8 standard deviations)
5. "NAT, PAT" (+0.9) (1.8 standard deviations)
6. "Distance vector protocols/link state protocols" (+0.8) (1.6 standard deviations)

There were several other topics that the students felt should be emphasized more, but the six above had the largest differences.

The topic that the students most strongly felt should be de-emphasized was:

1. "Socket programming for network communication/interoperability issues" (-0.9) (1.7 standard deviations). Although this was not the biggest difference of all the topics, it was the topic most mentioned by students in their comments. They felt that too much emphasis was placed on this area.
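The emphasis comparison above can be reproduced from the published diff values alone. A minimal sketch follows, assuming (since the raw per-student responses are not published) that the quoted "standard deviations" figures are each diff divided by the sample standard deviation of the absolute diffs; that assumption reproduces the quoted 0.7 average, 0.5 standard deviation, and per-topic rankings to within rounding:

```python
from statistics import mean, stdev

# 'diff' values from Table 2: mean desired emphasis minus mean emphasis
# as taught, per topic (five-point Likert scales).
diffs = {
    "Socket programming for network communication/interoperability issues": -0.9,
    "Designing and setting up network systems": 2.1,
    "History and impact of standards on technology development and adoption": -0.1,
    "IETF and ISO-OSI 7-layer networking models": -0.1,
    "Layer 2 (MAC) addressing": 0.5,
    "Hubs/switches/bridges/algorithms/comparisons": 0.7,
    "IP addressing": 1.0,
    "Distance vector protocols/link state protocols": 0.8,
    "MPLS": 0.3,
    "Autonomous systems and BGP": 0.8,
    "TCP, UDP and sliding windows": 0.3,
    "DNS, DHCP": 1.2,
    "Threads": -0.1,
    "Client-server models": -0.5,
    "Switching and forwarding, bridge learning algorithms": 0.3,
    "Network management and SNMP": 0.9,
    "NAT, PAT": 0.9,
}

# Average and spread of the magnitude of the differences; these reproduce
# the 0.7 average and 0.5 standard deviation quoted in the text.
avg_abs = mean(abs(d) for d in diffs.values())
sd_abs = stdev(abs(d) for d in diffs.values())

# Ranking statistic (an assumption about the method, not stated in the
# paper): each diff expressed in units of that standard deviation, so
# +2.1 comes out at about 4 standard deviations, as quoted.
ranked = sorted(diffs.items(), key=lambda kv: kv[1], reverse=True)
for topic, d in ranked:
    print(f"{d:+.1f} ({d / sd_abs:+.1f} sd)  {topic}")
```

Small discrepancies against the quoted per-topic values (e.g. 4.2 vs. 4.1 standard deviations for the top topic) are explained by the one-decimal rounding of the published diffs.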

rather than changing outcomes and other more structural aspects of the course design.

The students only recommended de-emphasizing 5 of the 17 topics and only for the “Socket programming” issue did the response exceed one standard deviation. It should be noted that although this is only one topic it involves several lab experiences and is thus a substantial issue.

7. Student Learning Styles A topic of interest in IT is that of student learning styles. Most IT programs value experiential learning models and education theorists such as John Dewey, Kurt Lewin, Jean Piaget and David Kolb have shown its efficacy and discussed its limitations [18]. Experiential learning is widely used in many technology disciplines, including Information Technology. The efficacy of experiential learning is often assumed but seldom measured.

The primary recommendation from this analysis is to significantly increase the topic of designing and setting up network systems, either in this course or in another course in the major. This received such an overwhelming response from the students (4.1 standard deviations) that it must be considered as a very strong request. For all the areas that the students felt should be strengthened the instructor/course designer should evaluate them to see what effect spending more time or effort with these topics might have on the overall course structure, however the table and list above does provide topics to focus on as changes are made.

The IT 347 course structure is similar to that of other IT courses in the program. Two class lecture hours are presented each week and there is a lab assignment each week. The laboratory assignment is nominally scheduled for three hours per week but the students are free to work on the assignments independently and many of them take more than the nominal three hours.

The “socket programming” topic, which the students would prefer to deemphasize, was discussed with the instructor. He indicated that this is an important topic that addresses several of the course outcomes. The students, however, do not perceive the value of the exercises. A restructuring of the exercises or a different presentation to the students would achieve better student motivation and lead to better learning.

For the class lecture hours the students are supposed to come to class having completed pre-class reading assignments and then participate in the class. Based on observation by the evaluator, most lecture hours were a mixture of lecture and discussion. It was apparent from the class observations that the instructor has a rich background of experience as a practicing professional. This background illuminated the class discussions. The atmosphere was relatively relaxed and students were comfortable in asking questions and in joining the discussion. No formal mechanisms were used to require discussion or to monitor who did and who did not participate. The instructor endeavored to use the class time to focus on areas where the students seemed to have the most confusion. He did not attempt to rigorously “cover” all the material.

Some students commented in the survey that they felt excluded by some of the class discussions. They felt that the instructor was talking to a small minority of students who were knowledgeable in the specific technical area. It would have been desirable to discover whether the students who complained had completed the assigned pre-class reading, but this was not included in the survey. It would also have been interesting to discover whether the students who did participate in the discussions had done the pre-class reading. These questions may be addressed in a future survey instrument.

6.4 Analysis of Course Outcomes

The desired course outcomes, listed in Table 1, range from narrow, domain-specific technical outcomes to broad learning outcomes. It was decided to analyze these outcomes further to determine their nature. There are several purposes for such an analysis. Any course will have a mixture of broad and narrow desired outcomes. It is characteristic of high-tech courses, especially in IT, that some of the technical domain content will evolve rapidly and need to be updated. If changing technical domain content is embedded in the defined course outcomes, then the outcomes themselves will also have to be updated. This tends to cascade into further changes to the course structure and larger efforts in keeping courses current [16, 17].

The specified outcomes vary from broad to narrowly specific. They also vary from domain-specific (e.g., “Explain how 802.11 collision avoidance works”) to general (e.g., “Has built confidence through problem solving”). All the narrow outcomes are highly technical and domain-specific.

Fourteen of the 27 outcomes were found to be narrowly domain-specific; eleven more are domain-specific but more general (i.e., concepts such as listing the seven layers of the general networking model). Four are more general concepts (one overlaps the general and domain-specific categories). Ten of the outcomes are applications of other concepts, and four of them explicitly call for experiential strategies. The outcomes are all expressed in behavioral terms, although some would be hard to measure (e.g., “Has built an intuition …”). It was noted earlier, in the instructor’s analysis of student responses, that outcome #11 should possibly be removed from the list of outcomes as the technology is no longer as relevant as it used to be. It is recommended that each of the narrowly domain-specific outcomes be examined to see whether it can be expressed in terms of the underlying concepts of that area of the networking discipline, such that specific technologies and standards become the chosen method of teaching the outcome (specified in the curriculum) rather than the outcome itself. Current technology examples or exercises can then be included in the lab or class exercises that address the topic. Such exercises will be easier to change as the technology evolves.

The IT 347 course includes significant experiential learning experiences. The primary vehicle for these experiences is the set of laboratory assignments. The assignments are in the form of short projects. Students are asked to complete a task in computer networking and are provided with guidelines and resources. They are then expected to solve the problems independently. They are supported in this task by the instructor and a teaching assistant, who act as coaches and mentors. They are expected to find additional resources on their own, typically through Internet searches or by researching the textbook. These experiences are authentic, in that the problems and the problem-solving approach mirror professional IT practice.

This evaluation did not perform a full assessment of the efficacy of experiential learning, but it did ask the students to self-report their preferred learning styles. The average responses of the students to the following statements, on a scale of 1 (worst) to 5 (best), were as follows:

    I learn best by listening and taking notes in class: 2.71
    I learn best by independent reading or studying alone: 2.97
    I learn best by discussing material in groups (in or out of class): 4.10
    I learn best by working in the lab: 4.35
    I learn best by completing the assigned reading before class: 3.19

This data shows a clear preference for experiential learning in the labs. This confirms, at least from a preference viewpoint, the value of experiential learning models in this discipline.

Perhaps a little more surprising is that the second-strongest preferred learning style is discussing material with peer groups. This method is significantly favored over solo studying and class lectures. It also offers a contradictory datum for the stereotypical view of students in computing fields as antisocial and solitary. The least popular learning style is the class-lecture learning model.

A recommendation following from this data is that more attention could be paid to cooperative and cognitive-apprenticeship learning applications in technology classes. There is abundant theoretical and practical research on these theories which could be applied here [19, 20, 21, 22 p. 152, 23]. There are also indications that combining interactive modes of learning with experiential learning can be effective [24].

8. Class Performance Analysis

As part of this evaluation we studied how well various grades correlate with each other. There is an aphorism sometimes heard among instructors that “A students get A’s and C students get C’s,” implying that students tend to produce very consistent results in all classes in a given major. If this is true, there should be very strong correlations between the different scores of each student. We hypothesized that students who did well in the prerequisite course (IT210) would also do well in the studied course (IT347), and we felt we should confirm or refute this hypothesis. This correlation, or lack thereof, might lead to a better understanding of the role of the prerequisite course and the consistency of student performance, relative to the two courses and relative to students’ GPA in the major.

We also wished to gain a better understanding of the correlation between the various grades students receive and the final grade they receive for the course. Students take a variety of exams, complete homework assignments and complete lab assignments, all of which contribute to their final course grade. Given that students have a preference for lab-based learning, we felt we could improve our understanding of student learning and course design by looking for correlation between students’ final exam scores, their lab scores and their final grade.

Statistics were used to analyze the relationships between several of these measured quantities.

8.1 Class Grade vs. Lab and Exam Grades

The graph (Figure 1 below) shows the relationship between the final grade in the class (x axis) and the final exam grade and the lab grade.

[Figure 1: “Class vs Lab & Exam grades.” Scatter plot of IT347 class grade (2.0 to 4.0) against exam and lab grades (50% to 100%), with fitted regression lines: lab grade y = 0.0848x + 0.6365 (R² = 0.3459); exam grade y = 0.1652x + 0.3047 (R² = 0.69).]

Figure 1: IT347 class grades vs. lab and exam grades

Linear regression lines were fitted to the two sets of data and plotted on the graph. Note that the lab and exam grades contribute to the class grade, so these are not independent variables. Some factors are immediately apparent. The exam grade correlates much better with the final grade (r² = 0.69) than the lab grade does (r² = 0.35). This is despite the fact that the grading weighting for the class places more emphasis on labs than on the exam. However, the grades for the labs are higher and have less variability (avg = 93%, sdev = 6.9%) than those for the exam (avg = 88%, sdev = 9.6%).

The explanation we propose is that the labs are not graded as carefully for content as the exam and are therefore a less powerful discriminator of student performance. Since students express a preference for learning with lab exercises, we recommend that a more discriminating assessment method for the labs could lead to better learning outcomes. The effectiveness of the discrimination could be measured by its correlation with the final class grade. The practicalities of implementing this we leave to a future design and analysis. It is acknowledged that more discrimination generally requires more time from the instructor; however, mechanisms such as grading rubrics could be used to prevent overburdening the instructor.

Initially it was felt that students’ preferred learning styles might have an impact on their performance in the different parts of the course. A linear least-squares correlation of course grades, as well as lab and exam grades, against the preferred learning styles was performed. No relationship was readily apparent in any of these three performance measures: the correlation coefficients were very low, with some of them negative, and the linear regression r² values ranged from 0.00 to 0.04. A visual examination of graphs of the learning styles against the various grades showed considerable scatter and no apparent patterns. It was concluded that students’ expressed learning-style preferences did not correlate with measured performance in this class, and no further investigation was done.

8.2 Comparison of Prerequisite Course and Major GPA to Class Grade

One of the evaluation questions concerned the relationship of grades in the studied class (IT347) to the prerequisite class (IT210) and to students’ major GPA.

The data was gathered and linear regressions were calculated. The correlations were not as strong as we expected: correlation to the prerequisite class had r² = 0.29, and correlation to the major GPA had r² = 0.20. Our explanation is that the content of the two classes differs significantly and that students have a preference for one sub-domain of IT over another. Further research could illuminate this topic. There are no specific recommendations arising from these results, as there is no specific goal to make the correlations either stronger or weaker. However, this does provide insight into the degree of linkage between the various performance indicators. Grades can change significantly: the worst case was a change from an A- to a D+, but the average grade change was only 0.5 (a drop from A to A- represents 0.33). The grades therefore do not vary much on average, but significant variation does occur (standard deviation 0.55). Overall, from the 210 class to the 347 class, grades rose slightly (average rise = 0.23).
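The linear least-squares fits used throughout this analysis can be reproduced with a short computation. The sketch below uses made-up grade data (the study’s raw data is not published), assuming class grades expressed as GPA points and exam grades as fractions:

```python
from statistics import mean

def linregress(xs, ys):
    """Ordinary least-squares fit: returns (slope, intercept, r_squared)."""
    mx, my = mean(xs), mean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    syy = sum((y - my) ** 2 for y in ys)
    slope = sxy / sxx
    intercept = my - slope * mx
    r_squared = (sxy * sxy) / (sxx * syy)  # square of Pearson's r
    return slope, intercept, r_squared

# Hypothetical data: final class grade (GPA points) vs. exam grade (fraction).
class_grades = [2.0, 2.5, 3.0, 3.3, 3.7, 4.0]
exam_grades = [0.62, 0.71, 0.80, 0.83, 0.90, 0.97]

slope, intercept, r2 = linregress(class_grades, exam_grades)
print(f"exam = {slope:.4f} * class + {intercept:.4f}, r^2 = {r2:.2f}")
```

The paper’s reported fits (e.g., exam grade y = 0.1652x + 0.3047 with r² = 0.69) were presumably obtained with an equivalent regression over the actual class data.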



The major GPA is calculated from the class grades contributing directly to the student’s major.¹ The major GPA had a lower correlation to the IT347 grade than was found for the prerequisite course. This is not very surprising, since the major GPA calculation includes several classes that students are probably less interested in (e.g., Technical Writing, Economics) than those directly focused on their major interests, as IT210 is.

¹ Major GPA calculation: any class specifically required for the major is included in the major GPA calculation. This includes classes such as Physics, Technical Writing, Math and Statistics, as well as all the specific IT classes, but not classes which BYU requires, such as “History of Civilization,” that the major does not list as a requirement.

9. Recommendations

A number of recommendations arise from this evaluation study. This project has evaluated a single IT class by developing several evaluation questions relevant to the stakeholders and using a variety of evaluation techniques. Two general sets of recommendations are derived from this study: the first applies to this specific class, and the second relates to generalizing the evaluation techniques to other classes in IT.

9.1 Specific Course and IT Program Recommendations

A number of strengths of the current course were noted as part of this evaluation.

• The students showed a clear preference for experiential learning through lab assignments, projects and exercises. The course design caters very well for this, in that the students spend more time in labs than in all other parts of the course combined. The labs are carefully designed to give a structured and graduated learning experience. Support mechanisms with teaching-assistant and instructor mentoring are provided. The students’ creativity is challenged through an ambitious lab project at the end of the semester.

• The atmosphere is collegial. The relatively small class size (36 students) and the open and engaging attitude of the instructor allow discussion in the class to explore individual topics of interest. Material is largely presented in a standard lecture mode, but this is interspersed with question-and-answer sessions between students and instructor. The instructor encourages this interaction by assigning reading in advance and encouraging students to bring questions to class. The instructor’s years of experience in industry enrich these discussions considerably.

• The students on the whole feel that the class outcomes are being achieved. This is indicated by the survey score of all class outcomes rated by students at 4.3 on a 1-to-5 scale.

• On the meta-level, a further important strength is the IT department’s, and this specific instructor’s, desire to improve the course through evaluation and refinement.

Several recommendations based on the findings are presented in the report. The most significant of these recommendations are summarized below.

• Network management as defined in the model curriculum (NET 5) was relatively weak, and we are not aware of this topic being addressed significantly in other courses in the IT major. It is recommended that this topic area be strengthened, either in this course or in another course in the major. This will help bring the overall IT program into alignment with the model curriculum.

• Feedback from the students indicated a strong desire for more emphasis on designing and setting up networks. This would overlap the previous recommendation.

• Feedback from the students also indicated a desire to do fewer “socket programming” exercises. This indicates that the students do not perceive the benefits of this learning exercise. As discussed earlier, this may be as much an issue of presentation and student expectations as of content. A different presentation of the exercise may help motivate students and improve their learning of the concepts involved in the socket-programming exercises.

• Students requested several other changes. The statistics in Table 2 rank these in order of perceived importance. These changes should be reviewed by IT347 course designers to look for further opportunities to engage the students in the learning process. There is the potential to improve learning opportunities, but the implications for the rest of the course, and for the instructor, of implementing these suggestions must also be considered.

• The narrow domain-specific outcomes should be evaluated to determine whether they can be restructured in more general terms so that the course outcomes are less likely to become obsolete. This will help keep the course focused on its primary learning objectives and avoid redesigning course outcomes and course structure as specific technologies evolve.

• The students’ strongest preferred learning style is lab-based learning, and this is well catered for in the course design. However, their second-strongest preferred learning style is discussion and group interaction. At present very little is done in the course design to take advantage of this. Class discussions are used but could be structured to take more advantage of peer learning. Students do form ad-hoc learning communities, but specifically addressing this learning style in the course design could probably reward both students and instructors. Course designers should ensure that the course design supports and encourages this learning mode appropriately.

• Despite the student preference for lab-based learning, the lab grades are not a clear discriminator of student performance. Course designers should seek a design for lab exercises that allows their grades to be a better discriminator of student performance. This will better align students’ preferred learning style with their course grade.

Implementing these recommendations will lead to further improvements in student learning. This report outlines methods by which the improvements in learning could be measured and verified.

9.2 Application of Evaluation Principles to Other IT Courses

Perhaps more importantly, the techniques used in this evaluation can be applied to other courses at this or other institutions. Issues specifically relevant to IT course evaluations include the following:

• Comparison against national and international standards provides a strong basis for validity.

• Considering the design of the course relative to the changeability of IT allows courses to evolve over time with the inevitable progress of technology, i.e., evaluate course outcomes and objectives to see whether they allow for change.

• Consideration of the students as relatively informed stakeholders, and of the relative weight their opinions should carry in the evaluation.

• Evaluation of the course outcomes (both stated and implied) in terms of desired and actual achievement.

• Evaluation of the learning and teaching styles of the course.

• Involvement of the instructor in the development of the evaluation questions increases the probability that the evaluation will lead to changes and improvement of the course.

Reviewing the various correlations that were performed, we noted that several of them revealed very little useful information; since they required significant effort in data collection, these could be omitted in future evaluations. On the other hand, several questions that emerged were not addressed by the survey and could be included in future evaluations.

10. Conclusion

Developing and carrying out this evaluation provided significant insights into the networking class. The methods used and results found apply here, and we suspect similar results would be found in other courses at BYU and in other IT programs. The insights into student learning styles particularly suggest other possible course designs to take advantage of the student preference for discussion. We also recommend that the methods developed be applied to future evaluations.

We were able to satisfactorily answer all our evaluation questions. The research required considerable resources, principally time for the evaluator, the instructor, the support staff in the advisement office and the students.

Further research is indicated to extend this evaluation to multiple classes. Some difficulties in doing further research have been identified through this study, and methods to manage those problems are being designed.

11. References

[1] Frye, L. M. (2007). Wireless sensor networks: learning and teaching. Proceedings of the 8th ACM SIGITE Conference on Information Technology Education.

[2] Gaspar, A., & Langevin, S. (2007). Restoring “coding with intention” in introductory programming courses. Proceedings of the 8th ACM SIGITE Conference on Information Technology Education.

[3] Gorka, S., Miller, J. R., & Howe, B. J. (2007). Developing realistic capstone projects in conjunction with industry. Proceedings of the 8th ACM SIGITE Conference on Information Technology Education.

[4] Kane, M. D., & Springer, J. A. (2007). Integrating bioinformatics, distributed data management, and distributed computing for applied training in high performance computing. Proceedings of the 8th ACM SIGITE Conference on Information Technology Education.

[5] Malik, R. A., Hansen, R. A., Goldman, J. E., & Smith, A. H. (2007). Laboratory modules for conducting comparative analysis of 802.11 frames. Proceedings of the 8th ACM SIGITE Conference on Information Technology Education.

[6] Wagner, B., Renshaw, S., & Broadbent, K. (2007). A multipart lab exercise for analyzing the effect of peer-to-peer software on a university network. Proceedings of the 8th ACM SIGITE Conference on Information Technology Education.

[7] Webster, L. D., & Mirielli, E. J. (2007). Student reflections on an academic service learning experience in a computer science classroom. Proceedings of the 8th ACM SIGITE Conference on Information Technology Education.

[8] Zilora, S. J., & Hermsen, L. M. (2007). Take a WAC at writing in your course. Proceedings of the 8th ACM SIGITE Conference on Information Technology Education.

[9] Courte, J. E. (2007). Comparing student acceptance and performance of online activities to classroom activities. Proceedings of the 8th ACM SIGITE Conference on Information Technology Education.

[10] Strooper, P., & Meinicke, L. (2007). Evaluation of a new assessment scheme for a third-year concurrency course. Proceedings of the Ninth Australasian Conference on Computing Education, Volume 66.

[11] Bryant, K. C. (2004). The evaluation of courses in Information Systems. Proceedings of the Sixth Conference on Australasian Computing Education, Volume 30.

[12] Information Technology - Open Systems Interconnection - Basic Reference Model: The Basic Model. (1994). Retrieved from http://standards.iso.org/ittf/licence.html

[13] Computing Curricula: Information Technology Volume. (2005; May 2008 update). Retrieved from http://campus.acm.org/public/comments/it-curriculum-draft-may-2008.pdf

[14] Moore, G. E. (1965). Cramming more components onto integrated circuits. Electronics, 38(8).

[15] Schaller, R. R. (1997). Moore’s Law: past, present and future. IEEE Spectrum, 34(6), 52-59.

[16] Helps, R. G., & Renshaw, S. A. (2004). Design of a flexible case-study instructional module for operating systems for information technology. Proceedings of the 5th Conference on Information Technology Education, 56-59.

[17] Helps, R. G., & Renshaw, S. A. (2005). UML and design layers provide a course design paradigm and notation to create robust technology courses in rapidly changing environments. 2005 ASEE Annual Conference & Exposition, Portland, OR.

[18] Kolb, D. A. (1984). Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice-Hall.

[19] Johnson, D. W., Johnson, R. T., & Smith, K. A. (1998). Active Learning: Cooperation in the College Classroom. Interaction Book Company.

[20] Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive apprenticeship: making thinking visible. American Educator (Winter 1991).

[21] Collins, A., Brown, J. S., & Newman, S. E. (1989). Cognitive apprenticeship: teaching the craft of reading, writing, and mathematics. In L. B. Resnick (Ed.), Knowing, Learning, and Instruction: Essays in Honor of Robert Glaser. Hillsdale, NJ: Erlbaum.

[22] Mayer, R. E. (1999). Designing instruction for constructivist learning. In C. M. Reigeluth (Ed.), Instructional-Design Theories and Models (Vol. II, pp. 141-159). London: Lawrence Erlbaum Associates.

[23] Slavin, R. (1994). Cooperative Learning: Theory, Research and Practice (2nd ed.). New Jersey: Prentice Hall.

[24] Baker, A. C., Jensen, P. J., & Kolb, D. A. (2002). Conversational Learning: An Experiential Approach to Knowledge Creation. Greenwood Publishing Group.