Session F2C
Preliminary Experiences with “Flipping” a Freshman Engineering Programming Course

Renee M. Clark, Dan Budny, Karen M. Bursic, and Mary E. Besterfield-Sacre
University of Pittsburgh
[email protected], [email protected], [email protected], [email protected]

Abstract - We employed the “flipped classroom” in a freshman engineering programming course taken by nearly 700 first-time freshmen and transfer students during the 2013-2014 academic year. In the “flipped classroom,” students are encouraged to watch video lectures outside of class. This enables more class time for active learning, practice and demonstration of programming skills, and one-on-one assistance, with the instructor serving more as an advisor than a transmitter of information. Although over 50% of freshman respondents preferred using class time for problem solving and active learning with the instructor present, we found that the vast majority did not use the videos for first-time instruction as intended with the flipped classroom. Benefits of the flipped classroom frequently stated by students included access to multiple sources for explanation or clarification, reinforcement of understanding, flexibility and convenience, in-class application of knowledge, and the ability to re-watch videos and self-pace their learning. Based on the College and University Classroom Environment Inventory, freshmen scored this flipped course highest on the personalization dimension, which assesses student-to-teacher interaction. We further evaluated this flipped classroom for the degree of instructor-supported active learning and problem solving using a structured observation protocol known as the Teaching Dimensions Observation Protocol (TDOP). Based on the TDOP, 40% or more of the class segments observed over the two semesters involved instructor-led demonstration of programming skills, with students actively following along on their computers, as well as active problem solving by students while the instructor or TA circulated throughout the classroom to provide assistance. The experiences and reflections of multiple instructors in teaching this flipped course will be discussed.

Index Terms - Flip, Freshman, Programming, Computing.

1. INTRODUCTION AND LITERATURE REVIEW
The flipped classroom is a teaching and learning approach that offers the opportunity to incorporate active learning into the classroom while still covering necessary content. With an active learning approach in the classroom, students engage in problem solving and other activities, thereby promoting student involvement and motivating them to
physically do something rather than attend class passively [1,2]. Leading educators have called for active learning in the classroom, stating that true learning occurs when students talk, write, analyze, create, apply, or otherwise become involved in their instruction [3]. In the classroom flip, students watch online videos containing course content outside of class, thereby freeing class time for instructors to mentor and support students in problem solving or application of their skills, which is particularly important in an introductory programming course [4,5,6]. In addition, course flipping leads to benefits such as increased interaction between teachers and students as well as teamwork among students. There is also the opportunity to accommodate various skill levels and learning tendencies by providing support “upon demand,” instruction with self-paced “pause and rewind” capability, and flexibility for students who are unable to attend class for legitimate reasons [7,8]. In addition, flipping provides students with access to multiple sources for mastery, clarification, or reinforcement of content. Studies also suggest that active or interactive learners show significantly higher learning outcomes and gains than passive learners in terms of problem solving, time to mastery, and understanding, including conceptual understanding [2,9].

The flipped classroom was formally promoted in the Swanson School of Engineering at the University of Pittsburgh starting in fall 2013, with the assistance of the school’s Engineering Education Research Center (EERC). The school’s objectives were as follows: 1) enhancement of in-depth learning and achievement of higher-order skills in Bloom’s taxonomy, 2) enhancement of student engagement, and 3) utilization of the school’s state-of-the-art instructional technology to support active learning. One of the first courses in which the flipped classroom was implemented was the required freshman course in introductory computing and programming. This course was taken by nearly 700 first-time freshmen and engineering transfer students in the 2013-2014 academic year. There are two versions of the course for first-time freshmen – honors and non-honors – with the honors version offered in the fall and the non-honors version offered in the spring. There is also an offering in the fall for transfer students. In this course, students learn to program a computer using both MATLAB and C so that these skills can be applied to solve engineering problems. One of the goals in flipping this course was to enhance freshman programming skills by
using class time for application and hands-on support for the more difficult topics.

To determine the degree to which the objectives were met, the EERC developed a comprehensive evaluation and assessment plan consisting of both direct and indirect assessment. The assessments include course-specific embedded assessments, instructor interviews, student surveys, classroom observation, and web analytics data on video usage. The student surveys included classroom environment and formative evaluation surveys. The classroom environment survey, formally known as the College and University Classroom Environment Inventory (CUCEI), was selected to assess student climate perceptions and engagement [10]. This reliable inventory evaluates student perceptions regarding seven psychosocial dimensions of the classroom and has been used previously in inverted classroom research [5]. This instrument was chosen because several of the dimensions are particularly relevant to the flipped classroom, including involvement, student cohesiveness, satisfaction, personalization, and innovation. Our initial student formative evaluation survey was inspired by the work of Leicht, Zappe, and colleagues in their flipped classroom research and was expanded using ideas from our faculty [4,6].

We also conducted classroom observation using the validated Teaching Dimensions Observation Protocol (TDOP). This structured classroom observation protocol uses a series of small observation windows and a set of codes to identify teaching and learning practices. These teaching and learning dimensions include 1) teaching methods, or how information is dispersed and understanding is generated in class; 2) pedagogical moves, pertaining in part to teaching style and strategy; 3) questioning between instructors and students; 4) cognitive engagement of students; and 5) instructional technology usage [11].
2. METHODS
In this section, we discuss both the methods used to develop the flipped version of the course and the techniques used to assess the outcomes. With the assistance of the EERC, two of the instructors began preparations for this flipped classroom approximately six months before implementation. Given the large number of students, five instructors teach the various sections of the course. The school-wide preparations included the formation of an internal professional learning community in the spring 2013 semester, led by the EERC [12]. In addition to these freshman instructors, other engineering faculty members who planned to flip courses joined this community. The evaluator and the IT staff members involved in the video creation and editing were also part of the learning community. During the periodic meetings, various topics were discussed within the group, including challenges related to the video recording, assessment goals and activities, classroom and content logistics, active learning approaches, challenges related to students, and the instructors’ goals.
The instructors began creating the video lectures in the summer of 2013, prior to the targeted fall 2013 semester. To modularize the course content and provide an informal script, one of the instructors created slides of the content, and the other used them to record the lectures. The lectures were recorded in small modules of different lengths using the Camtasia software with the assistance of the IT staff. The instructors recorded 79 modules with an average length of between eight and nine minutes. The video lectures included presentation of concepts and syntax as well as coding demonstrations within the programming software. Example module titles included the following:

Branching Module – If Basics
Branching Module – If Example
Branching Module – Switch Case
Looping Module – While Loops
Looping Module – For Loops
Looping Module – Avoiding Infinite Loops
Application – Solving Linear Equations – Basics
Arrays in C – Basics
Arrays in C – Examples
File IO in C – Example of Reading from a File
Function Calls in C – Passing Arrays

We administered the College and University Classroom Environment Inventory (CUCEI) in the flipped classroom at approximately week eight in both the fall 2013 and spring 2014 semesters. Given the lack of a pre-flip measurement, we compared these results to the results of other flipped classrooms in the Swanson school as a whole. We administered the formative evaluation survey closer to the end of the term. Also, using the Teaching Dimensions Observation Protocol (TDOP), we observed two sessions of each of the three variants of the course – honors, non-honors, and transfers – across the two semesters to evaluate classroom practices in the flipped mode of instruction.
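As an illustration of the scope of the video modules listed above, the following is a minimal, hypothetical C example of the kind of material a module such as “Function Calls in C – Passing Arrays” might demonstrate. It is a sketch for the reader and is not taken from the actual course videos.

#include <stdio.h>

/* Passing an array to a function in C: the array argument decays to a
   pointer, so its length must be passed separately. */
double average(const double values[], int n)
{
    double sum = 0.0;
    for (int i = 0; i < n; i++)      /* for loop over the array elements */
        sum += values[i];
    return (n > 0) ? sum / n : 0.0;
}

int main(void)
{
    double readings[5] = {2.0, 4.0, 6.0, 8.0, 10.0};
    printf("average = %.2f\n", average(readings, 5));
    return 0;
}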
3. RESULTS
In this section, we present the students’ evaluation of this flipped course in freshman computing. We also provide the results of the classroom environment survey and the observation of the flipped classroom, based upon two semesters of assessment. We also discuss web analytics data on the videos accessed by the students and their relationship to final course grades.

3.1 STUDENT EVALUATION OF FLIPPED CLASSROOM

The students evaluated the flipped course via a formative assessment survey, with approximately 19% of the students providing feedback. This survey was modeled upon the work of Leicht et al. and Zappe et al., who used student perception surveys to obtain formative feedback in a flipped undergraduate course taken by students with sixth-semester standing in architectural engineering [4,6].
Approximately half (51%) of responding freshmen preferred using class time for problem solving with the instructor present for assistance versus listening to lecture, as shown in Figure 1. A comparison of this question by course variant (i.e., honors, non-honors, and transfer students) is provided in Figure 1, with a larger percentage of non-honors students (compared to honors) strongly agreeing that they preferred problem solving in class. However, based on Fisher’s exact test, this relationship was not significant (p=0.14). In comparison, Zappe et al. found similar agreement to this question, with 48% agreeing or strongly agreeing that they preferred problem solving versus lecture during class time [6].
FIGURE 1
“PREFER USING CLASS TIME FOR PROBLEM SOLVING VS. LECTURE?”
(Percentage of respondents by course variant: Honors, Non-Honors, Transfers.)

In an open ended question, we asked the students what they liked about the flipped classroom and the benefits they perceived. The top 10 themes present in their responses are shown in Table 1. The most frequently mentioned benefit was having access to more than one source for understanding or explanation of the material (MULT SOURCES). In addition, there were nine responses specific to having an alternative to using the textbook (ALT TO TEXTBOOK). The videos seemed to provide a bridge for the students by containing relevant information that they used for clarification, learning, or task completion (VIDEOS BRIDGED RELEVANT). The students liked the reinforcement that the flipped classroom afforded, including review of material by the instructor or the students themselves, often for study purposes (REINFORCE). These open ended responses coincided with their responses to the closed ended questions about how they primarily used the videos (i.e., for review and reinforcement) and when they watched them (i.e., after class). The students also liked the demonstrations and examples shown in the videos (VIDEO DEMO EG). Other frequently stated benefits included flexibility and accommodation of preferences, the ability to re-watch lectures and work at one’s own pace, in-class active learning, and the modularization of the lectures with distinct videos for each topic.

TABLE 1
TOP OPEN ENDED RESPONSES ON BENEFITS AND LIKES
Code | Description | Responses (n=130) | % of Respondents
MULT SOURCES | Access to more than one source for understanding or explanation, such as class, video, etc. | 17 | 13%
VIDEOS BRIDGED RELEVANT | The videos were relevant or contained essential information for clarification, learning, or task completion. | 15 | 12%
REINFORCE | Understanding reinforced; material reviewed by instructor or student. | 14 | 11%
VIDEO DEMO EG | There was demonstration, examples, derivations, etc. in the videos. | 11 | 8%
FLEX PREF | Flexibility or according to one’s preferences; convenience. | 9 | 7%
ALT TO TEXTBOOK | There was an alternative to reading the textbook. | 9 | 7%
REWATCH | Can re-watch videos. | 7 | 5%
OWN PACE | Ability to work at one’s own pace; may include pausing the video. | 6 | 5%
IN CLASS AL | In-class problem solving, activities, active learning, examples, & applications. | 5 | 4%
MODULES | There was a different video for each topic. | 4 | 3%
Although over 50% of responding freshmen preferred the in-class problem solving aspect of the flipped classroom, the vast majority indicated they had not watched the videos as intended with the flipped classroom. As shown in Figure 2, 91% of responding freshmen watched the videos after the class period for which they were assigned rather than as part of first-time instruction to learn the material. This was despite periodic in-class quizzes. In comparison, 86% of respondents from other courses (sophomore through senior level) flipped in our school watched the videos before the class period for which they were assigned. As shown in Figure 3, only 14% of the responding freshmen used the videos for first-time instruction to learn the material, compared to 86% of respondents from other flipped courses in the school. To provide an indication of the incoming knowledge base of the students, 73% of the spring 2014 freshman respondents were only slightly familiar or not at all familiar with computer programming prior to taking this course.
FIGURE 2
“WHEN DID YOU PRIMARILY VIEW THE VIDEOS?”
(Percentage of respondents viewing the videos before vs. after the assigned class period, by group: Fr (Honors), Fr (Non-Honors), Fr (Transfer), Other Flipped.)

FIGURE 3
“HOW DID YOU PRIMARILY USE THE VIDEOS?”
(Percentage of respondents using the videos to learn new material vs. to review or reinforce material, by group: Fr (Honors), Fr (Non-Honors), Fr (Transfer), Other Flipped.)

TABLE 2
SELF-REPORTED PERCENTAGE OF VIDEOS WATCHED
Group | Average % | n
Fresh (Honors) | 55% | 37
Fresh (Non-Honors) | 34% | 42
Fresh (Transfer) | 55% | 38
Other Flipped | 86% | 141
We also asked respondents to report the percentage of videos they watched. This data is limited in that it is based only on the segment of the population that responded to the survey (approximately 19% of students). The average percentage was between 34% and 55%, depending on the course variant, as shown in Table 2. This was in sharp contrast to the average percentage (86%) reported in other (non-freshman) flipped courses in our school. Zappe et al. found a similar percentage for non-freshman engineers, with 92% reporting having watched each video [6]. Taken together, these various pieces of data suggest that many freshmen expect to “be taught” rather than take an active role in their acquisition of information. Upon further reflection, however, we feel that not all five of the instructors emphasized the use of the videos, contributing to the lower percentages. The average percentages among the freshman groups in Table 2 differed. The difference between honors and non-honors students was significant (p=0.004), as was the difference between transfer and non-honors students (p=0.004), based on a Tukey test. The difference between all freshmen and the other flipped students was also significant (p<0.001) based on a t-test.
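The text does not state which two-sample t-test variant was used for the freshmen-versus-other comparison. As a hedged illustration only, the C sketch below computes Welch’s unequal-variance t statistic and its approximate degrees of freedom; the function name and the example inputs are hypothetical and are not the study data.

#include <math.h>
#include <stdio.h>

/* Welch's two-sample t statistic and approximate degrees of freedom
   (Welch-Satterthwaite). Illustrative sketch only; the test variant and
   data actually used in the study are not specified here. */
static void welch_t(const double *a, int na, const double *b, int nb,
                    double *t, double *df)
{
    double ma = 0.0, mb = 0.0;
    for (int i = 0; i < na; i++) ma += a[i];
    for (int i = 0; i < nb; i++) mb += b[i];
    ma /= na;
    mb /= nb;

    double va = 0.0, vb = 0.0;                 /* sample variances */
    for (int i = 0; i < na; i++) va += (a[i] - ma) * (a[i] - ma);
    for (int i = 0; i < nb; i++) vb += (b[i] - mb) * (b[i] - mb);
    va /= (na - 1);
    vb /= (nb - 1);

    double sa = va / na, sb = vb / nb;         /* squared standard errors */
    *t = (ma - mb) / sqrt(sa + sb);
    *df = (sa + sb) * (sa + sb) /
          (sa * sa / (na - 1) + sb * sb / (nb - 1));
    /* A p-value would then come from the t distribution with *df degrees
       of freedom (e.g., via a statistics library); that step is omitted. */
}

int main(void)
{
    /* Hypothetical self-reported percentages; NOT the study data. */
    double freshmen[] = {40, 55, 30, 60, 25, 45};
    double others[]   = {90, 85, 80, 95, 88};
    double t, df;
    welch_t(freshmen, 6, others, 5, &t, &df);
    printf("t = %.3f, df = %.1f\n", t, df);
    return 0;
}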
3.2 WEB ANALYTICS DATA

In addition, we utilized web analytics data to determine the videos accessed by each student. However, this data only indicates that a video was launched or loaded by a student and not necessarily that the video was watched in whole or even in part. In addition, the instructor noted that students may watch the videos in groups; therefore, not all students might officially log in to watch a particular video. Thus, we present both the self-reported as well as the web analytics data regarding access and viewing of the videos. Compared to the self-reported data, the web analytics data actually indicates that a lower percentage of the 79 available videos were accessed by the students, as shown in Table 3. Based on Fisher’s exact test, the relationship between being an honors vs. non-honors student and the number of videos accessed was significant (p=0.0003). This was also true for transfer vs. non-honors students (p=0.017) but not for honors vs. transfer students (p=0.29). Based on these various forms of data, it appears likely that many freshmen in this course did not come to class prepared by completing the self-directed portion of the flipped classroom as intended.

TABLE 3
WEB ANALYTICS PERCENTAGE OF VIDEOS ACCESSED
Group | Average Number of Videos Accessed | Videos Available | Average % Accessed | n
Fresh (Honors) | 26 | 79 | 33% | 69
Fresh (Non-Honors) | 7 | 79 | 9% | 454
Fresh (Transfers) | 19 | 79 | 24% | 162
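As a point of reference for the Fisher’s exact tests reported above, the following is a minimal C sketch of a two-sided Fisher’s exact test for a 2x2 contingency table (for example, course variant crossed with a binary split on videos accessed). The binary split and the counts in the example are assumptions for illustration; the actual contingency tables used in the study are not given here.

#include <math.h>
#include <stdio.h>

/* log(n!) via the log-gamma function */
static double logfact(int n)
{
    return lgamma((double)n + 1.0);
}

/* Hypergeometric probability of a 2x2 table with cells a, b, c, d
   and fixed margins. */
static double table_prob(int a, int b, int c, int d)
{
    return exp(logfact(a + b) + logfact(c + d) + logfact(a + c) + logfact(b + d)
               - logfact(a) - logfact(b) - logfact(c) - logfact(d)
               - logfact(a + b + c + d));
}

/* Two-sided Fisher's exact test: sum the probabilities of all tables with
   the same margins that are no more probable than the observed table. */
static double fisher_exact_2x2(int a, int b, int c, int d)
{
    double p_obs = table_prob(a, b, c, d);
    int row1 = a + b, col1 = a + c, n = a + b + c + d;
    int lo = (row1 + col1 > n) ? row1 + col1 - n : 0;
    int hi = (row1 < col1) ? row1 : col1;
    double p = 0.0;
    for (int x = lo; x <= hi; x++) {
        double px = table_prob(x, row1 - x, col1 - x, n - row1 - col1 + x);
        if (px <= p_obs * (1.0 + 1e-7))
            p += px;
    }
    return p;
}

int main(void)
{
    /* Hypothetical counts (e.g., honors vs. non-honors students who accessed
       at least half of the videos vs. fewer); NOT the study data. */
    printf("p = %.4f\n", fisher_exact_2x2(30, 39, 80, 374));
    return 0;
}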
However, upon further analysis, we found that not watching the videos was not necessarily associated with a lower grade in the course. We actually found a very small correlation between the number of unique videos accessed and the final course grade. Each letter grade was assigned its specific point value; for example, an A was assigned a value of 4.00. The Pearson correlation values, which were not statistically different from zero, were -0.054, 0.043, and 0.062 for the non-honors, transfer, and honors students, respectively. It should be kept in mind that not all five of the instructors placed the same emphasis on the viewing of the videos.
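As a hedged illustration of this analysis, the C sketch below maps letter grades to grade points and computes the Pearson correlation coefficient between videos accessed and grade points. Only A = 4.00 is stated in the text; the rest of the mapping (including any handling of +/- grades) and the sample arrays are assumptions for the example.

#include <math.h>
#include <stdio.h>

/* Map a plain letter grade to grade points. Only A = 4.00 is given in the
   text; the remaining values are assumed here for illustration. */
static double grade_points(char letter)
{
    switch (letter) {
        case 'A': return 4.0;
        case 'B': return 3.0;
        case 'C': return 2.0;
        case 'D': return 1.0;
        default:  return 0.0;   /* F or unrecognized */
    }
}

/* Pearson correlation coefficient between two equal-length samples. */
static double pearson_r(const double *x, const double *y, int n)
{
    double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
    for (int i = 0; i < n; i++) {
        sx += x[i];
        sy += y[i];
        sxx += x[i] * x[i];
        syy += y[i] * y[i];
        sxy += x[i] * y[i];
    }
    double cxy = sxy - sx * sy / n;   /* sum of centered cross-products */
    double vx  = sxx - sx * sx / n;   /* sum of squared deviations of x */
    double vy  = syy - sy * sy / n;   /* sum of squared deviations of y */
    return cxy / sqrt(vx * vy);
}

int main(void)
{
    /* Hypothetical records (videos accessed, final letter grade); NOT the study data. */
    double videos[] = {5, 40, 12, 70, 25};
    char   grades[] = {'A', 'B', 'A', 'C', 'B'};
    double points[5];
    for (int i = 0; i < 5; i++)
        points[i] = grade_points(grades[i]);
    printf("r = %.3f\n", pearson_r(videos, points, 5));
    return 0;
}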
The scatter plot of the number of unique videos watched versus the final course grade (i.e., grade points) for all freshmen is shown in Figure 4. For the combined freshmen data, the correlation coefficient was 0.011. This plot shows that some students did in fact “fit” the expected relationship between videos watched and the final course grade. However, many students who performed well watched only a portion of the available videos, as indicated by the large number of points near the upper left portion of the plot. The plots for transfer and non-honors students were similar to Figure 4. However, the plot for honors students was different in that nearly all students achieved an A or B regardless of the number of videos watched.
FIGURE 4
UNIQUE VIDEOS WATCHED VS. FINAL COURSE GRADE FOR ALL FRESHMEN
(Scatter plot for the freshman programming course: unique videos viewed, 0 to 79, on the horizontal axis versus final grade points, 0 to 4, on the vertical axis.)
This raises several questions for future research and investigation. For those students who were not watching the videos and still performing well, how did they acquire their knowledge? Was the level of the instructor’s emphasis on the videos a large contributor? More generally, what are the primary methods, behaviors, or characteristics of freshmen who do not fit the expected relationship between video usage and achievement?

We also analyzed the total number of times the videos were accessed versus the final grade and again found the correlation to be very small. This analysis accounted for the case in which a video was accessed multiple times by a student, for example for review or study purposes. These Pearson values were also not statistically different from zero and were -0.048, 0.051, and 0.012 for the non-honors, transfer, and honors students, respectively. We defined a “distinct” access of a video by a student as one that occurred at least ten minutes after the student’s last access of that video. For example, if a student accessed a particular video at both 2:01 PM and 2:03 PM on a given day, these would not be counted as two distinct accesses. We considered ten minutes to be reasonable given the average video length of eight to nine minutes. As stated previously, there are certain limitations with the web analytics data that could potentially bias these outcomes.
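A minimal C sketch of this counting rule is given below. It assumes a sorted list of access times (in minutes) for a single student-video pair; the surrounding analytics pipeline is not described here, and the function is illustrative only.

#include <stdio.h>

/* Count "distinct" accesses of one video by one student, following the rule
   described above: an access is distinct if it occurred at least window_min
   minutes after the previous access of that video by that student.
   times_min must be sorted in ascending order (minutes since some epoch). */
static int count_distinct_accesses(const double *times_min, int n, double window_min)
{
    if (n <= 0)
        return 0;
    int distinct = 1;                      /* the first access always counts */
    for (int i = 1; i < n; i++) {
        if (times_min[i] - times_min[i - 1] >= window_min)
            distinct++;
    }
    return distinct;
}

int main(void)
{
    /* Accesses at 2:01 PM and 2:03 PM (two minutes apart) count as one
       distinct access, matching the example in the text. */
    double times[] = {841.0, 843.0};       /* minutes after midnight */
    printf("distinct = %d\n", count_distinct_accesses(times, 2, 10.0));
    return 0;
}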
3.3 CLASSROOM ENVIRONMENT SURVEY

Using the College and University Classroom Environment Inventory (CUCEI), we assessed the seven psychosocial dimensions of our flipped classroom shown in Table 4. There are seven questions per dimension, with each question having a scale of 1 to 5, with 5 being most desirable. The CUCEI was administered at approximately the midpoint of the course. We received 192 responses and benchmarked them against responses received in other flipped courses within our school. We used other flipped classes as the benchmark given a lack of pre-flip data for the freshman course, which would have been the preferred comparison.

TABLE 4
COLLEGE & UNIVERSITY CLASSROOM ENVIRONMENT INVENTORY (CUCEI) RESULTS
Dimension | Definition | Fresh Flipped AVG | Other Flipped AVG | p | Difference d
Student Cohesiveness | Students know & help one another | 3.01 | 3.31 | 0.00 | 0.36
Individualization | Students can make decisions; treated individually or differentially | 2.53 | 2.64 | 0.02 | 0.22
Innovation | New or unusual class activities or techniques | 2.76 | 3.02 | 0.00 | 0.46
Involvement | Students participate actively in class | 3.01 | 3.49 | 0.00 | 0.81
Personalization | Student interaction w/ instructor | 3.71 | 4.12 | 0.00 | 0.61
Satisfaction | Enjoyment of classes | 3.50 | 3.37 | 0.12 | –
Task Orientation | Organization of class activities | 3.60 | 3.86 | 0.00 | 0.41
Sample size (n): Fresh Flipped = 192, Other Flipped = 323
As shown in Table 4, the flipped classroom environment scored highest on the personalization dimension, with a dimension mean of 3.71 on the 5-point scale. This dimension assesses the interaction between the students and instructor. The individualization and
innovation dimensions scored lowest and below the scale midpoint of 3, with dimension means of 2.53 and 2.76, respectively. Thus, the freshman respondents did not view the flipped classroom as particularly innovative and did not perceive notable individual or differential treatment with this mode of instruction, which are two key characteristics or objectives of the flipped classroom. However, these two dimensions were also lowest for the other flipped classrooms that were assessed at the school. In comparing the freshmen to students in other flipped courses, the freshmen had a lower perception of their classroom environment for all dimensions except the satisfaction dimension. Further, there was a significant difference at α=0.05 between these two groups for all dimensions except satisfaction, as shown in Table 4 (p