Redesigning Core Programming Courses Through A Direct Instruction Approach

Lethia Jackson, Velma Latson and Monika Gross
Department of Computer Science, Bowie State University, 14000 Jericho Park Road, Bowie, MD 20715, USA

Abstract - Learning from examples and repetition has been used in K-12 instructional strategies to teach complex skills such as mathematics and skills used in the various science disciplines. The use of examples in teaching creates the kind of learning experiences that lead to the successful transfer of what is learned in one distinct problem to a different problem. Student success in the gatekeeper courses in the Computer Science Department is heavily dependent on the students' ability to transfer knowledge from one course to the subsequent course. A Direct Instruction (DI) approach to teaching is used to address cognitive learning and information transfer in the computer science gatekeeper courses.

Keywords: programming, online tutoring, direct instruction, learning environment

1. Introduction

The Computer Science faculty at Bowie State University has continually reviewed the course content delivered in two gatekeeper core programming courses: Computer Science I (COSC 112) and Computer Science II (COSC 113). Students taking these courses continue to struggle with the cognitive domain of learning as well as transfer. The key characteristics of cognitive learning and transfer [1] include three elements:

I. Initial learning is necessary for transfer;
II. Knowledge that is overly contextualized can reduce transfer; abstract representation of knowledge can help promote transfer; and
III. All new learning involves transfer based on previous learning.

COSC 112 is the first programming course in the course sequence, and students may or may not be familiar with the process of computational thinking [8], [7]. The introduction to the material may cause students to experience cognitive overload, and cognitive overload can have a negative impact on the students' ability to transfer knowledge. Cognitive load theory [3] therefore assesses the memory load a task requires of a learner by examining the limits of the verbal and nonverbal memory workspace. Cognitive load theory rests on three assumptions [3]:

1. Temporary memory used for reasoning, comprehension, and learning has limited capacity;
2. Relatively permanent memory is unlimited in capacity; and
3. The mechanics of cognitive processes decrease the temporary memory load.

Consequently, if temporary memory is limited and overloaded, knowledge, comprehension, an approach to an outcome, retention, and transfer of learning will suffer. Cognitive load theory poses three independent sources of cognitive load [3]:

1. Temporary memory is required to complete a task;
2. Irrelevant information that may or may not contribute to learning how to complete a task is presented to the learner; and
3. Useful information is used to improve learning on how to complete a task.

To address the struggles of the students, the faculty modified the present learning environment for both programming courses (COSC 112 and COSC 113) to a Direct Instruction [6], [9] teaching approach which incorporates the following aspects: a) mandatory tutoring, b) repetition, and c) selected programming concepts based on "key characteristics of learning and transfer" [1]. Direct Instruction, as defined by [6], is "an instructional model that focuses on the interaction between teachers and students that is comprised of modeling, reinforcement and successive approximation." The components of DI [6] include the following:

- The lesson is delivered in small parts.
- Student outcomes and objectives must be stated clearly.
- Students have opportunities to connect their new knowledge of concepts and terms with previous knowledge.
- Students practice concepts as they are introduced.
- Students continue to practice concepts, through group work or independently, in ways that promote transfer from one distinct problem to a different problem.
- Instructors provide feedback to students at every practice opportunity.

2. Project Description

The Computer Science Department redesigned both COSC 112 and COSC 113 to a direct instruction learning environment in an effort to increase students' cognitive domain of learning as well as transfer. Two sections of each course participated in the course redesign. The course redesign included three objectives:

I. Prevent cognitive overload.
II. Give content questions repeatedly to meet all levels of thinking complexity (knowledge, comprehension, analysis, application, and synthesis).
III. Increase transfer of information from one distinct problem to a different problem.

For objectives I and II, students taking COSC 112 and COSC 113 in spring 2010 were given a pre-test, three tests, weekly quizzes, and a final exam. Each programming course had designated topics that had to be comprehended before taking the next course in the sequence. The pre-test and the final exam were the same. The pre-test was administered on the first day of class; its purpose was to assess whether students had any prior knowledge of the course content. The answers on the pre-test were measured against the answers on the final exam to assess information learned during the semester. The final exam was comprehensive, and its questions addressed all of the designated topics and content discussed in the course. Three questions from the final exam were dispersed among the tests and quizzes. These were labeled structured question 1, structured question 2, and structured question 3. Faculty teaching COSC 112 and COSC 113 met to decide which three questions would be selected as the structured questions. Each of the three structured questions met the following criteria:

- The content of the question is necessary to prepare students to comprehend material covered in the next course in the sequence.
- The content of the question is known to be difficult for students to grasp.
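The pre-test/final-exam comparison described above can be made concrete with a short sketch. The topic names, scores, and helper function below are hypothetical illustrations, not data or code from the study; they only show one way the per-topic comparison could be computed.

```python
# Hypothetical illustration of comparing pre-test answers to final-exam answers.
# Topic names and scores are invented; the study's actual designated topics and
# grading data are not reproduced here.

def topic_gains(pretest, final):
    """Return the per-topic change in the share of students answering correctly."""
    return {topic: final[topic] - pretest[topic] for topic in pretest}

# Fraction of students answering each designated topic correctly (made-up values).
pretest_scores = {"selection": 0.05, "loops": 0.00, "arrays": 0.02}
final_scores = {"selection": 0.70, "loops": 0.55, "arrays": 0.40}

for topic, gain in topic_gains(pretest_scores, final_scores).items():
    print(f"{topic}: {gain:+.0%} change from pre-test to final exam")
```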

Each of the three tests covered the content taught within the period prior to the test. Each test also contained one of the three structured questions. For example, Test 1 contained several questions that pertained to the first four weeks of material taught during the course. One of the questions on Test 1 was structured question 1. Test 2 contained several questions on material taught in weeks 5 through 8. One of the questions on Test 2 was structured question 2. This process continued for Test 3. Each structured question was assessed to measure whether students had improved their knowledge of the content after reviewing the question a number of times.

To assess the lesson segments, weekly quizzes were given. The quizzes were announced and covered the material taught prior to administering the quiz. Quizzes were not given during the week of a test or the final exam. The faculty teaching the core programming courses agreed to nine quizzes for the semester. The quizzes contained no more than two questions. A quiz that contained a structured question was considered to be a structured quiz. The purpose of the structured quiz was to assess that particular question. The results of the quiz revealed whether students needed more instruction for that particular topic. There were four weeks of instruction before each test. The structured quiz was given anytime within that four-week period; it was not necessarily given the week prior to the test.

Table 1: Course Schedule

Week   Tutoring     Assessment           Structured Question
1                   Pre-Test
2      Standard     Quiz 1
3      Structured   Structured Quiz 2    #1
4      Standard     Quiz 3
5                   Test 1               #1
6      Standard     Quiz 4
7      Standard     Quiz 5
8      Structured   Structured Quiz 6    #2
9                   Test 2               #2
10     Structured   Quiz 7               #3
11     Standard     Quiz 8
12     Standard     Quiz 9
13                  Test 3               #3
14
15     Standard     Final Exam           #'s 1, 2, and 3

Table 1 gives an example of the course schedule during the semester.
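To make the scheduling rules concrete, the sketch below encodes the schedule from Table 1 as plain data and checks the constraints stated above: nine quizzes per semester, no quiz during a test or final-exam week, and each structured question appearing on one quiz, one test, and the comprehensive final. The encoding and helper code are illustrative assumptions, not an artifact of the course redesign.

```python
# Hypothetical encoding of the Table 1 schedule: week -> (assessment, structured question(s)).
# Week numbering follows the reconstruction in Table 1; week 14 has no assessment and is omitted.
schedule = {
    1: ("Pre-Test", None),        2: ("Quiz 1", None),
    3: ("Structured Quiz 2", 1),  4: ("Quiz 3", None),
    5: ("Test 1", 1),             6: ("Quiz 4", None),
    7: ("Quiz 5", None),          8: ("Structured Quiz 6", 2),
    9: ("Test 2", 2),             10: ("Quiz 7", 3),
    11: ("Quiz 8", None),         12: ("Quiz 9", None),
    13: ("Test 3", 3),            15: ("Final Exam", (1, 2, 3)),
}

quiz_weeks = [w for w, (name, _) in schedule.items() if "Quiz" in name]
test_weeks = [w for w, (name, _) in schedule.items()
              if name.startswith("Test") or name == "Final Exam"]

# Rules stated in the text: nine quizzes, and no quiz during a test or final-exam week.
assert len(quiz_weeks) == 9
assert not set(quiz_weeks) & set(test_weeks)

# Each structured question appears on one quiz, one test, and the comprehensive final exam.
for sq in (1, 2, 3):
    appearances = [name for name, q in schedule.values()
                   if q == sq or (isinstance(q, tuple) and sq in q)]
    print(f"Structured question #{sq}: {appearances}")
```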

For objective III, weekly mandatory tutoring worth at least 10% of the students' grade was required. Tutoring was delivered either face-to-face or online. Students received online tutoring via Elluminate, conferencing software available through Angel. Elluminate is a Web 2.0 conferencing application used to promote virtual collaboration and e-learning environments. Learning is enhanced when communication tools are used to initiate and sustain exchanges among participants [5], [4]. Some structured tutoring occurred using Elluminate to demonstrate the skills necessary to solve the problems. Students were able to attend each virtual session, and each session was recorded, so students who could not attend a virtual session could access the recording whenever they needed to review the material. The technology was interactive, which allowed students to become actively involved in the learning process [2], and it gave the students flexibility since tutoring was mandatory. Tutoring was categorized as either standard tutoring or structured tutoring. Standard tutoring allowed the student to lead the tutoring session with questions concerning any topic of concern to them. Structured tutoring allowed the instructor to augment standard tutoring by focusing on the topic of the structured question. Before a structured quiz, students received structured tutoring.

3. Pre-Test Results

The assessment of the pre-test revealed that none of the students in COSC 112 had prior knowledge of the content of the course. The pre-test assessment for COSC 113 showed that a few students had prior knowledge. Comparing the pre-test answers to the final exam answers revealed that students did comprehend most of the content delivered in both courses.

4. Results of the Structured Questions

A structured question earned a grade of 100 if the question was completely correct. Structured questions that received partial credit were not counted. If students received a 100 on a structured question, it indicated that the students comprehended the content of the question. Based on the structured quiz assessments, the results revealed that structured tutoring, regardless of delivery, did assist the students in comprehending the content material. The test results showed that a repeat of instruction after the quiz improved test results as compared to the quiz results for the same structured question. The results of the final exam fluctuated compared to the test and quiz results for any particular structured question; this is assumed to be due to the overwhelming nature of a final exam.

Table 2 gives the percentage of students that received a grade of 100 on the structured question per quiz, test, and final.

Table 2: Student outcomes for Quiz/Test/Final

Tutoring   Course Section   Quiz/Test/Final   SQ #1   SQ #2   SQ #3
Online     COSC 112 Eve     Quiz              0       5%      0
                            Test              63%     33%     5%
                            Final             80%     37%     5%
Online     COSC 112 Day     Quiz              33%     52%     7%
                            Test              66%     88%     71%
                            Final             50%     93%     42%
F2F        COSC 113 Eve     Quiz              25%     62%     0
                            Test              31%     36%     31%
                            Final             38%     50%     43%
Online     COSC 113 Day     Quiz              5%      5%      0
                            Test              24%     33%     5%
                            Final             32%     37%     5%
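The outcome measure reported in Table 2 (the share of students earning a grade of 100 on a structured question, with partial credit excluded) can be expressed as a short sketch. The scores below are invented; only the all-or-nothing counting rule comes from the description above.

```python
# Hypothetical scores (0-100) on one structured question for one course section.
scores = [100, 85, 100, 40, 0, 100, 100, 72]

def percent_fully_correct(scores):
    """Share of students who earned exactly 100; partial credit does not count."""
    return 100 * sum(1 for s in scores if s == 100) / len(scores)

print(f"{percent_fully_correct(scores):.0f}% of students answered the question completely correctly")
```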

5. Summary

A Direct Instruction teaching approach proved to be beneficial in addressing cognitive learning and information transfer in COSC 112 and COSC 113. Students who consistently participated in the tutoring performed better than those who did not. All weekly quizzes were announced and given after the students had received tutoring. Students who consistently came to class and took the quizzes also performed better in the class than those who did not. The weekly quizzes helped the students to comprehend information in chunks and to focus on that particular content, and administering weekly quizzes prompted the students to study throughout the week. The quiz scores show that some students did comprehend the material after a tutoring session. After the quiz, the content material was discussed and reviewed as part of the class lecture. By the time of the test, the students had already seen the structured question a number of times via tutoring, the quiz, and additional class lecture on that particular content. Each test was given after three quizzes, one of which was a structured quiz. The test scores confirm an improvement in comprehension of the content as compared to the quiz scores.

Students had been tested on the structured question a number of times before the final exam. Based on the results of the final exam, the percentage of students passing the structured question varied. All of the final exam scores improved over the quiz scores, and for the most part the final exam scores demonstrated improvement or remained the same as compared to the test scores.

6. Conclusion

The results of the Direct Instruction approach to teaching indicate that the students retained information and performed well in the courses. According to PeopleSoft data, the number of sections of COSC 113 offered in fall 2010 increased by one. In addition, the number of sections offered for the next programming course after COSC 113, COSC 214, also increased by one. An increase in the number of sections offered indicates retention of the students majoring in computer science or computer technology.

7. References

[1] Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How children learn. In How people learn: Brain, mind, experience, and school (expanded ed., pp. 79-113). Washington, DC: National Academy Press.

[2] Brown, J. S. (2000). Growing up digital: How the web changes work, education and the ways people learn. [Electronic version] Change, Mar/Apr.

[3] Doolittle, P. E., McNeill, A. L., Terry, K. P., & Scheer, S. B. (2005). Multimedia, cognitive load and pedagogy. In S. Mishra & R. C. Sharma (Eds.), Interactive multimedia in education and training (pp. 184-212). Hershey: Idea Group Publishing.

[4] Hannafin, M., Land, S., & Oliver, K. (1999). Open learning environments: Foundations, methods, and models. In C. M. Reigeluth (Ed.), Instructional-design theories and models (Volume II) (pp. 115-140). [Electronic version] Mahwah, NJ: Erlbaum.

[5] Jonassen, D. (1999). Designing constructivist learning environments. In C. M. Reigeluth (Ed.), Instructional-design theories and models (Volume II) (pp. 115-140). [Electronic version] Mahwah, NJ: Erlbaum.

[6] Magliaro, S., Lockee, B., & Burton, J. (2005). Direct instruction revisited: A key model for instructional technology. Educational Technology Research and Development, 53(4), 41-56.

[7] National Research Council. (2010). Report of a workshop on the scope and nature of computational thinking. Washington, DC.

[8] Papert, S. (1993). Mindstorms: Children, computers and powerful ideas. New York: Basic Books.

[9] Wilson, B. G., & Myers, K. M. (2000). Situated cognition in theoretical and practical context. In D. H. Jonassen & S. M. Land (Eds.), Theoretical foundations of learning environments (pp. 57-88). Mahwah, NJ: Erlbaum.
