Closed Laboratories with Embedded Instructional Research Design for CS1

Leen-Kiat Soh, Ashok Samal, Suzette Person
Department of Computer Science and Engineering
University of Nebraska
256 Avery Hall, Lincoln, NE 68588-0115 USA
E-mail: {lksoh, samal, sperson}@cse.unl.edu

Gwen Nugent, Jeff Lang
National Center of Information Technology in Education (NCITE)
University of Nebraska
269 Mabel Lee Hall, Lincoln, NE 68588-0230 USA
E-mail: [email protected], [email protected]

ABSTRACT
Closed laboratories are becoming an increasingly popular approach to teaching introductory computer science courses. However, as observed in [1], “Considering the prevalence of closed labs and the fact that they have been in place in CS curricula for more than a decade, there is little published evidence assessing their effectiveness.” In this paper we report on an integrated approach to designing and implementing laboratories with an embedded instructional research design. The activities reported here are part of a department-wide effort not only to improve student learning in Computer Science and Computer Engineering, but also to improve the agility of our Computer Science and Engineering Department in adapting our curriculum to changing technologies, incorporating research, and validating the instructional strategies used. This paper presents the design and implementation of the labs, along with the results and analysis of student performance. We also describe how we have employed cooperative learning in our labs and how it impacts student learning.
Categories and Subject Descriptors: Course Related, CS Ed Research, CS1/2, Curriculum Issues

General Terms: Design, Experimentation

Keywords: Closed Laboratories, Cooperative Learning, Instructional Design
1. INTRODUCTION
Rapid and continuous changes in the areas of software development and information technology place significant pressure on educational institutions in terms of educating and training the next generation of professionals. In particular, maintaining the curriculum in a computer science degree program is a challenge that requires constant attention. The Association for Computing Machinery (ACM) and the IEEE Computer Society, the two leading professional bodies in the field of computer science, have recently
released guidelines outlining core topics for a computer science degree program [2]. Subsequently, the Department of Computer Science and Engineering at the University of Nebraska-Lincoln reviewed its own undergraduate program in Computer Science with the long-term goal of redesigning and reorganizing the CS curriculum to improve the quality of instruction and student learning. Following this review, the Department approved an innovative curriculum that has the potential to significantly improve the quality of undergraduate CS education. One of the key innovations is the application of a traditional, science-based (e.g., Physics, Chemistry, Biology) approach to CS laboratories, supported by research in educational psychology and instructional design. The scope of this project includes our introductory CS courses (CS0, CS1, and CS2).

Closed laboratories are becoming an increasingly popular approach to teaching introductory CS courses [1], per the recommendations of Denning et al. [3] and ACM’s Computing Curricula 1991 [4]. Closed labs have several advantages. Students learn at the beginning of their majors to be active learners through goal-oriented problem solving in a laboratory setting [5]. Doran and Langan [6] demonstrated that labs promote students’ cognitive activities in comprehension and application, in terms of Bloom’s taxonomy [7]. One study, in fact, reported that even though its closed labs did not help improve retention or project completion rates in the CS1 course, there was a qualitative improvement in student learning in the closed lab sections [8]. Thweatt reported a statistical design with an experimental group (closed labs) and a control group (open labs) for a CS1 course and found that students in closed labs consistently performed significantly better on comprehensive CS1 exams than those in open labs [9]. Furthermore, exploration opportunities help first-time programmers overcome common hurdles, such as misconceptions about the nature of computers and programs [10]. Parker et al. found that closed laboratories demonstrate the scientific method of inquiry and teach skills in data collection and analysis [11]. The laboratory environment also facilitates cooperative learning among students, e.g., [12]. Finally, laboratories tend to provide a more flexible environment that can cater to students of different backgrounds and learning styles.

However, as observed in [1], “Considering the prevalence of closed labs and the fact that they have been in place in CS curricula for more than a decade, there is little published evidence assessing their effectiveness.” In this paper, we present a systematic approach to design, implement, assess, and evaluate closed labs. To demonstrate the effectiveness of the approach, we focus on measuring the impact of
cooperative learning in laboratory settings and analyzing the results pedagogically.
2. COOPERATIVE LEARNING
Our approach to incorporating labs in introductory CS courses, as reported in this paper, is based on embedding instructional research design and assessment components into each laboratory. This focus guided and motivated the design and development of each laboratory, including the way we developed the pre- and post-tests and the activities. In this paper, we report on the incorporation of cooperative learning into our labs and studies.

While direct instruction has been shown to be effective in certain domains, studies have shown cooperative learning to be an effective pedagogy for CS, producing significant gains in student achievement [13-15]. Other advantages of cooperative learning are the development of communication and problem-solving skills [16]. Most of our students intend to join private industry, where collaboration and teamwork are the norm, so collaborative learning in college settings better prepares students for what they will most likely encounter after graduation [17]. Direct instruction at the college level tends to emphasize individual skills and is often removed from the environments encountered in industry [18]. Cooperative learning can help students “become aware of the significance of small group dynamics as a tool for task achievement and success in a team environment” [17].

We relied primarily on the work of Johnson and Johnson [19] to model the implementation of cooperative learning in our CS1 laboratories. For cooperative learning to be superior to individualistic, competitive approaches, five elements are necessary: positive interdependence, face-to-face promotive interaction, individual and personal accountability, interpersonal skills, and group processing [19]. Positive interdependence requires that group members "encourage and assist each other to do well” [20]. The students should feel that they will succeed or fail together. Face-to-face promotive interaction can be defined as individuals encouraging and facilitating each other's efforts to achieve, complete tasks, and produce in order to reach the group's goals [20]. Individual and personal accountability involve each group member providing his or her "fair share" of work and feedback. Interpersonal and small group skills address the group members' ability to interact positively with and support one another. Group processing refers to the group reflecting on how well its members are working together and how the group's effectiveness can be improved. All five essential "elements" are included in our laboratory design.
3. DESIGN & IMPLEMENTATION
The process of redesigning the CS curriculum was preceded by extensive interactions among researchers from four academic departments: Computer Science and Engineering, Educational Psychology, Curriculum & Instruction, and Instructional Design. Much of the design was formalized through a joint seminar course organized in Spring 2003. The emphasis was not only on developing novel approaches to deliver and assess course materials that promote “deep” learning, but also on developing a framework in which we could conduct systematic evaluation of the approaches and their short- and long-term effectiveness. It was therefore important that cognitive and experimental psychologists and instructional designers be an integral part of this effort from the beginning. Due to the vast scope of this project, we decided to focus first on our CS1 course.

In addition to lectures, students in our CS1 course attend a programming laboratory that meets for two hours each week. Approximately 25-30 students attend each lab section. The labs provide students with structured, hands-on
activities intended to reinforce and supplement the material covered in the course lectures. Although brief instruction may be provided, the majority of the lab period is allocated to student activities.
3.1 Laboratory Design
The first step in developing the labs was to create a base document for each lab that included: the lab’s objectives, prerequisite knowledge, tools required, instruction topics, activities and exercises, supplemental resources, follow-on assignments, relevance to course goals, Curriculum 2001 core topics addressed, and ideas for pre- and post-test questions. After a review of the base documents individually and collectively by CS faculty, each lab was developed by creating a series of five documents in parallel: (1) a student handout, (2) a laboratory worksheet, (3) an instructional script, (4) a pre-test, and (5) a post-test.

The student handout serves several purposes. It is both the preparation guide and the laboratory script. Each handout includes the lab objectives, a description of the activities that will be performed during the lab (including source code where appropriate), a list of references to supplemental materials that should be studied prior to the lab, and a list of supplemental references that can be reviewed after the student has completed the lab. The student handout also provides optional activities that can be completed during or following the lab to give students an opportunity for extra practice.

During each lab, students are expected to answer a series of questions for each activity and record their answers on a paper worksheet. Worksheets contain questions specifically related to the lab activities and are intended to provide the students with an opportunity to find the answers through programming-based exploration. These worksheets also serve as an assessment tool to gauge the students’ comprehension of topics learned and practiced in the lab.

In addition to the student handout, the lab instructor is provided with an instructional script that contains supplemental material that may not be covered during lecture, special instructions for the lab activities, hints, resource links, and useful insights. Additional space is provided at the end of the instructions for each activity to allow the instructor to record his or her comments regarding the activity and suggestions for improving the lab.

The lab pre-tests are on-line, and students are required to pass them prior to coming to lab; however, students may take each pre-test as many times as necessary to achieve a passing score (80%). The pre-test is open-book and open-note, and includes multiple-choice, short-answer, and true/false questions. The goals of the lab pre-test are to encourage students to prepare for the lab and to allow them to test their understanding of the lab objectives and concepts prior to attending the lab. Questions for the pre-test are taken from a variety of sources, including the course textbook, other textbooks, and questions found on the web. Questions are categorized according to Bloom’s Taxonomy [7].

During the last ten minutes of each lab, students take an on-line post-test as another measure of their comprehension of lab topics. Like the pre-test, questions are taken from a variety of sources and are also categorized according to Bloom’s Taxonomy [7]. It should be noted that the post-test is designed to assess
how well students have learned the concepts after they have performed the activities specifically designed to reinforce those concepts.

Table 1 lists the CS1 laboratories and their topics. We incorporated two event-driven programming labs: the first lab introduces students to the differences between event-driven programming and traditional sequential programming; the second lab addresses the capabilities and features of event-driven programming. We designed three testing and debugging labs. The first lab introduces the idea of debugging to students and teaches them how to debug using simple print statements and built-in features of the IDE, and how to identify different kinds of bugs. The second lab describes a more systematic, holistic approach to debugging, with strategies such as a debug flag (a minimal sketch of this strategy is given after Table 1). The third lab introduces testing concepts from the viewpoint of software engineering, such as test cases and drivers. Overall, the three labs progress from simple, reactive debugging, to more goal-directed debugging, to standardized testing.

Table 1. Laboratory Topics of Our CS1 Course.
No.  Lab
1    Introduction to IDE & the Computing Environment
2    Simple Class
3    Documentation
4    Testing and Debugging 1
5    File I/O
6    Applets and Applications
7    Event Driven Programming 1
8    Exceptions
9    GUI/Swing
10   Event Driven Programming 2
11   Testing and Debugging 2
12   Inheritance
13   Testing and Debugging 3
14   Recursion
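To make the debug-flag strategy concrete, the sketch below shows one minimal way it could look in a student program. Java is assumed here, consistent with the applet and Swing topics in Table 1; the class, method, and variable names are illustrative only and are not taken from the actual lab handouts.

// DebugFlagDemo.java -- illustrative sketch only; not taken from the lab handouts.
// A single boolean flag gates all diagnostic output, so tracing can be turned on
// while debugging and off again without deleting or commenting out print statements.
public class DebugFlagDemo {

    // Set to true while debugging; set to false before submitting.
    private static final boolean DEBUG = true;

    public static void main(String[] args) {
        int[] scores = {72, 85, 90, 64};
        debug("computing the average of " + scores.length + " scores");
        double avg = average(scores);
        debug("computed average = " + avg);
        System.out.println("Average score: " + avg);
    }

    private static double average(int[] values) {
        int sum = 0;
        for (int value : values) {
            sum += value;
            debug("running sum = " + sum);
        }
        return (double) sum / values.length;
    }

    // Prints a trace message only when the debug flag is enabled.
    private static void debug(String message) {
        if (DEBUG) {
            System.err.println("[DEBUG] " + message);
        }
    }
}

Routing every trace message through one flag-guarded helper is the kind of goal-directed habit the second debugging lab targets, in contrast to the ad hoc print statements introduced in the first.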
3.2 Instructional Research Design
As mentioned in the previous sections, our approach to implementing CS1 laboratories includes an embedded instructional research design to systematically study the effect of our design on student learning. We modeled our design on the work of Johnson and Johnson [19] to include the five essential elements discussed in Section 2. Here, we focus on an investigation of cooperative learning in our labs.
3.2.1 Study 1: Effective Pedagogy for CS1 Labs (Part I)
The purpose of Study 1 was to determine the most effective pedagogy for CS1 laboratory achievement. According to the social constructivist view, the cooperative groups should perform better than the direct instruction group.

Participants: The participants were 68 traditional undergraduate students from the University of Nebraska-Lincoln. Of the 68 students, 5 were female. The study was conducted during the Fall semester of 2003.

Procedures: The three laboratory structures used were: cooperative group with structure, cooperative group without structure, and direct instruction. The difference between the two cooperative groups was the structure of the group: formal versus informal.
The cooperative structured group (formal) had defined roles that alternated each week. The laboratory instructor was responsible for monitoring which student “drives” and which students review. The goal of this format was to develop interdependence among the group members, based on the shared environment (a single computer) and on breaking the tasks into smaller parts with each member responsible for one part. In this way, the group could reach its goal only if each individual contributed his or her part.

The cooperative unstructured group (informal) was similar to the cooperative structured group in that we wanted to create interdependence among the group members. The difference was that for this group format, we did not control the roles of the group members; the members themselves were responsible for assigning roles and completing tasks. In both cooperative formats, the lab instructor served as a facilitator, giving the groups the freedom to solve problems themselves.

The last group format used in our study was direct instruction. This is the classical format in which students work individually and competitively against other class members. This group served as the control group. The role of the instructor was to answer individual questions and discourage cooperation while students completed laboratory exercises.

We randomly assigned the pedagogy of each laboratory section (cooperative structured, cooperative unstructured, or direct instruction). For the students enrolled in a section employing a cooperative approach, we used stratified random assignment to place students in their cooperative groups. This was accomplished by ranking the placement test scores used for this course and grouping the scores into three categories: high, middle, and low. Students were then selected from each category at random and placed in a cooperative group, where they remained for the entire semester. This was to ensure heterogeneous grouping, which has been shown to be the most effective approach [19].

Dependent Measures: We used total laboratory grades and a pre/post-test measuring self-efficacy and motivation as our outcome measures. The combined outcome measures provide evidence of the effectiveness of laboratory pedagogy and achievement. Total laboratory grades were computed by combining post-test grades and worksheet scores for each lab. Although some students worked in cooperative groups, all students were required to complete the post-test individually in the laboratory. The second outcome measured students’ self-efficacy and motivation during the first and last weeks of the semester. The instrument used to measure these constructs adapted eight questions from the Motivated Strategies for Learning Questionnaire (MSLQ) developed by Pintrich and De Groot [21]. Students responded on a five-point Likert scale (1 = strongly disagree to 5 = strongly agree), providing their estimated self-efficacy and motivation for CS. The questionnaire returned a reliability measure (Cronbach’s alpha) of .90, with a mean of 3.45 and a standard deviation of .09; a Cronbach’s alpha of at least 0.8 is generally considered to indicate adequate reliability.

Results: The first research question examined was student achievement in the laboratory. Analysis of variance (ANOVA) was used to determine significant differences among the sample means of the cooperative groups with structure, the cooperative groups without structure, and the direct instruction group.
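For reference, the one-way F statistic behind this comparison can be sketched in standard textbook notation (this is the general form, not reproduced from our analysis):

F = \frac{\mathrm{MS}_{\mathrm{between}}}{\mathrm{MS}_{\mathrm{within}}}
  = \frac{\left( \sum_{j=1}^{k} n_j (\bar{x}_j - \bar{x})^2 \right) / (k - 1)}
         {\left( \sum_{j=1}^{k} \sum_{i=1}^{n_j} (x_{ij} - \bar{x}_j)^2 \right) / (N - k)}

where x_{ij} is the total laboratory grade of student i in group j, \bar{x}_j is the mean of group j, \bar{x} is the grand mean, n_j is the size of group j, k is the number of pedagogies (three here), and N is the total number of students. The observed F value is then referred to an F distribution to obtain the p-value reported below.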
Intuitively, an ANOVA compares the variance (differences) between the three sample means to the variance (differences) within the groups, which accounts for sampling error. A significant result (p < .05) indicates that the differences among the group means are unlikely to be due to chance alone. Both cooperative learning groups performed significantly better than the direct instruction group (F(2,66)=6.325, p