Session F3G

Comparing the Collaborative and Independent Viewing of Program Visualizations

Teemu Rajala, Erkki Kaila, Johannes Holvitie, Riku Haavisto, Mikko-Jussi Laakso and Tapio Salakoski
University of Turku, Department of Information Technology, Doctoral Programme for Multidisciplinary Research on Learning Environments (OPMON) and Turku Centre for Computer Science (TUCS), [email protected], [email protected], [email protected], [email protected], [email protected], [email protected]

Abstract - In this paper, we report a study on the differences between using a program visualization tool collaboratively and independently. Students were divided randomly into two groups: the treatment group used a visualization tool called ViLLE in collaboration with another student, while the control group used the tool alone. During the study, we recorded screen captures and students' conversations. Our previous results showed that the treatment group outperformed the control group in the post-test, both in questions related to functions and in total score. We have now annotated and tagged the students' actions while answering the exercises, in order to find an explanation for this difference in learning results. The results show that the students working in collaboration spent more time answering the difficult exercises than the students working alone and, moreover, spent more time at the higher level of engagement, both relatively and absolutely. Furthermore, we found that the students working in pairs discussed the most when at the higher level of engagement and that almost all discussion was related to the exercise at hand.

Index Terms - Collaboration, engagement taxonomy, programming learning, program visualization

INTRODUCTION

Several multinational studies conducted in recent years show that learning to program is very difficult. One method that has been tested as a means of helping novice students learn basic programming concepts is program visualization.
There are very few studies on the effectiveness of such tools, and their results are mixed. Thus, it is important to study how visualizations can be utilized as effectively as possible. The effect of collaboration on learning has been studied in various fields. Collaboration in itself is not always beneficial to learning; its effectiveness depends greatly on the setting and subject in which it is utilized. In programming, the effect of collaboration has been studied, for example, in pair programming [1]. However, these studies focus on how to produce good programs more effectively and rarely discuss the use of collaboration in learning programming basics. There are a few studies in which collaboration has been utilized in algorithm visualization, but its effect has rarely been studied in program visualization.

ViLLE is a program visualization tool designed for teaching basic programming to novice programmers. Previously conducted studies show that ViLLE is beneficial in learning basic programming and that collaborative use of the tool enhances learning even further. This paper is an extension of our previous study on the effects of collaboration in using a program visualization tool. In the previous study, students utilized ViLLE in a two-hour programming session. The students were randomly divided into two groups: the students in the solo (control) group used ViLLE by themselves, while in the pair (treatment) group the students worked in pairs. Students' actions were recorded with screen capture software, and the discussions of the pair group were also recorded. The results of the previous study showed that students who used ViLLE in collaboration learned significantly better than students working alone. In this paper, our goal is to find the reasons for the better results of the pair group by analyzing the recordings made during the previous study. The paper has the following structure: in the next section, work related to the paper is presented; the following two sections describe the research setup and present the results of the study; and finally, in the two last sections, the results are discussed and conclusions presented.

RELATED WORK

Various studies conducted in recent years have shown that the first steps in learning to program can be extremely difficult for students. McCracken et al. [2] conducted a multi-national study on the programming skills of first-year CS students and found, alarmingly, that most of the students lack even basic programming skills after the introductory courses. In fact, according to Bergin and Reilly [3], only one fifth of all students in the introductory courses are even interested in learning to program. Lister et al. [4] concluded that students lack the prerequisite skills needed to solve programming problems. Moreover, Lahtinen et al. [5] found that the area students have the most problems with is understanding the basic structure of programs. Lopez et al. [6] stated that the ability to trace program code is directly related to the ability to write program code. Program visualization (PV) is a method that can be used effectively to clarify the structures and illustrate the execution flow of basic programs. Laakso [7] states that utilizing automatic assessment with visualization exercises

978-1-61284-469-5/11/$26.00 ©2011 IEEE October 12 - 15, 2011, Rapid City, SD 41st ASEE/IEEE Frontiers in Education Conference F3G-1

allows us to tackle the problem of overcrowded programming courses and limited teaching resources. Various PV systems have been developed in recent years, including Jeliot 3 [8], BlueJ [9] and JavaVis [10]. However, there seems to be rather little research on the effects of PV on student learning, especially when the potential benefits of collaboration are included in the study. The engagement taxonomy was presented by Naps et al. [11]. Their hypothesis is that visualizations must be used at the higher levels (above viewing) of engagement for students to actually benefit from them. The taxonomy consists of six levels of engagement:

- No viewing: there is no visualization to be viewed. However, there may be other materials, such as textual learning tutorials.
- Viewing: the students can only view the visualizations; no other interaction is involved.
- Responding: the students answer questions about the visualization.
- Changing: the students can modify the visualizations, e.g. by changing the parameters or input sets of the algorithms.
- Constructing: the students can construct their own visualizations of programs or algorithms.
- Presenting: the students present their visualizations to peers for review and discussion.
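The six levels form an ordered scale, and the taxonomy's central hypothesis is a comparison against the viewing level. As an illustrative sketch (the names and the helper function are our own, not part of any published API), the ordering can be encoded so that levels are directly comparable:

```python
from enum import IntEnum

class Engagement(IntEnum):
    # Ordered levels of the engagement taxonomy of Naps et al.
    NO_VIEWING = 0
    VIEWING = 1
    RESPONDING = 2
    CHANGING = 3
    CONSTRUCTING = 4
    PRESENTING = 5

def benefits_expected(level: Engagement) -> bool:
    """Hypothesis: learning benefits require engagement above viewing."""
    return level > Engagement.VIEWING

# Responding (answering questions about a visualization) is above viewing:
assert benefits_expected(Engagement.RESPONDING)
assert not benefits_expected(Engagement.VIEWING)
```

Encoding the levels as an ordered type makes the "higher than viewing" condition a direct comparison, which is how the hypothesis is phrased in the taxonomy.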

Laakso et al. [12] compared the learning performance of students using the algorithm visualization system TRAKLA2 collaboratively on different engagement levels. They found no statistically significant differences between the groups in the post-test. However, when the groups were re-arranged according to the engagement level they actually performed at, differences were found in the total and pair averages. According to the authors, this confirms some of the hypotheses presented in the engagement taxonomy. Korhonen et al. [13] further showed that the amount of collaboration and discussion during learning sessions increases as the level of engagement increases. The groups that used visualizations at the higher levels of engagement discussed the topics on different levels of abstraction, while the groups at lower levels focused only on a single aspect of the topic. Rajala et al. [14] studied the effect of collaboration in algorithm visualization: students did algorithm visualization exercises with TRAKLA2 during a short learning session. They found that although collaboration did not have any additional effect on the learning results, all students learned significantly during the session. Myller [15] presented the hypothesis that an increase in the engagement level results in more collaboration and a better collaboration process (including more interaction and concentration on the learning activities), and confirmed the hypothesis with the studies presented.

Jehng and Chan [16] found that students who used their visualization system collaboratively in programming tasks got better results than students who worked individually. We have previously studied the effectiveness of ViLLE in several studies and found that the tool can enhance novices' learning results significantly [17]. Moreover, we found that the tool should be used at the higher levels of engagement to support active learning, and that it is highly important to familiarize students with the visualization tool before researching its usefulness, a fact worth bearing in mind whenever conducting such research, in order to reduce the cognitive load of learning to use the tool. Other research on ViLLE has shown, for example, that students find the tool beneficial to their learning, and that some students even considered it more important to their learning than the traditional lecture setup.

ViLLE

ViLLE is a program visualization tool (see Figure 1), developed at the University of Turku, Finland. ViLLE can be used to visualize program code execution step by step. The visualization presents several aspects of the program state, including:

- variable values and scopes,
- objects presented in a shared memory area,
- subprograms presented in their own frames, including all local variables,
- textual explanation of the currently executed program line, and
- program output.

The execution controls are flexible: it is possible to execute the program one step at a time, both forwards and backwards, or to let ViLLE execute the program automatically at an adjustable speed. The execution slider can be used to move to any state of the execution. The students can also attach breakpoints to any steps of the execution and move between these points using the animation controls. ViLLE supports a variety of programming languages, and it handles the translation to all other languages automatically from program code written in Java. Language support can easily be extended using the built-in syntax editor. To engage students in learning, ViLLE makes it possible to attach automatically assessed questions (multiple choice questions and graphical array questions) to selected steps of the program execution. The new version of ViLLE supports a variety of other exercise types as well (see the Conclusions and future work section); however, these new exercises were not used in this study. The version used in this study required the TRAKLA2 web environment [18] to make the exercises available on the web and to keep track of submissions and students' scores. For more information on ViLLE, see [17].
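Mechanically, this style of navigation can be modeled as an index into a sequence of precomputed execution states. The sketch below is our own simplification for illustration, not ViLLE's actual implementation; it shows stepping forwards and backwards, the slider-style jump, and moving to the next breakpoint:

```python
class ExecutionPlayer:
    """Navigates a precomputed sequence of program states (a simplified model)."""

    def __init__(self, states, breakpoints=()):
        self.states = states               # one snapshot per execution step
        self.breakpoints = sorted(breakpoints)
        self.pos = 0                       # current step index

    def step_forward(self):
        self.pos = min(self.pos + 1, len(self.states) - 1)

    def step_backward(self):
        self.pos = max(self.pos - 1, 0)

    def jump_to(self, step):               # the "execution slider"
        self.pos = max(0, min(step, len(self.states) - 1))

    def next_breakpoint(self):
        later = [b for b in self.breakpoints if b > self.pos]
        if later:
            self.pos = later[0]

player = ExecutionPlayer(states=list(range(20)), breakpoints=[5, 12])
player.next_breakpoint()   # moves from step 0 to step 5
player.step_backward()     # back to step 4
```

Precomputing the full state sequence is what makes backward stepping and arbitrary slider jumps cheap: navigation is just index arithmetic.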



FIGURE 1
THE VISUALIZATION VIEW IN VILLE

RESEARCH SETUP

The study was conducted in the first-year introductory CS course 'Introduction to Information Technology'. 112 students participated in the study, divided randomly into two groups: in the treatment group (N=62) the students used the tutorial with ViLLE exercises in collaboration with another student, and in the control group (N=50) the students used the same material alone. Most of the students were first-year CS majors. The study was conducted in the fourth week of the eight-week course. All students participated in a two-hour lab session consisting of a pre-test, a tutorial session and a post-test.

Quantitative setup and results

We have previously reported the effects of collaboration in program visualization [19]. At the beginning of the session, all students took a pre-test individually. The test consisted of four program tracing exercises covering basic programming concepts (such as conditional statements, loop structures and methods). The students had 15 minutes to complete the test. Each question was scored on a scale of 0...10. The results of the pre-test are shown in Table 1.

TABLE 1
PRE-TEST RESULTS

         Control (N=50)   Treatment (N=62)   p-value
Q1       7.60 (3.05)      8.19 (2.69)        0.277
Q2       3.96 (3.52)      4.03 (3.47)        0.914
Q3       4.30 (3.99)      5.60 (3.97)        0.089
Q4       2.72 (4.19)      3.32 (4.62)        0.476
Total    18.58 (11.25)    21.15 (8.95)       0.193
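The group comparisons above can be approximately reproduced from the reported summary statistics alone. The sketch below computes a Welch-style two-sample t statistic for Q1; the paper does not state which t-test variant produced its p-values, so this is an assumption, and small differences from the reported values are to be expected:

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Two-sample t statistic with unequal variances (Welch's test)."""
    se = math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)
    return (m2 - m1) / se

# Q1 of the pre-test: control 7.60 (SD 3.05, N=50), treatment 8.19 (SD 2.69, N=62)
t = welch_t(7.60, 3.05, 50, 8.19, 2.69, 62)
# t is roughly 1.07, consistent with the non-significant p-value (0.277) in Table 1
```

With roughly 100 degrees of freedom, a |t| around 1.07 corresponds to a two-sided p in the 0.28 range, which agrees in magnitude with the reported value.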

After taking the pre-test, the students used a programming tutorial with attached ViLLE exercises for 45 minutes. The treatment group used the tutorial in collaboration with another student, while the control group used the same tutorial alone. In the tutorial, various basic programming concepts were covered with brief textual descriptions. Moreover, each concept was accompanied by ViLLE exercises about the concept. There were 15 exercises in total attached to the tutorial. After going through the tutorial, each student answered a post-test. The post-test contained all four questions from the pre-test, accompanied by two additional, more advanced questions, including one question in which the students were asked to write program code instead of tracing the execution of existing code. The time reserved for answering the post-test was 30 minutes. The results of the post-test are shown in Table 2.

TABLE 2
POST-TEST RESULTS

                    Control (N=50)   Treatment (N=62)   p-value
PQ1 (Q1)            8.22 (2.80)      8.44 (2.26)        0.654
PQ2 (Q2)            6.18 (4.35)      6.52 (3.77)        0.667
PQ3                 5.96 (3.84)      6.84 (3.67)        0.220
PQ4 (Q3)            7.72 (2.96)      8.55 (2.43)        0.107
PQ5 (Q4)            5.08 (4.75)      6.84 (4.38)        0.046
PQ6                 4.78 (4.62)      6.92 (4.00)        0.011
Total (shared)      27.20 (10.80)    30.34 (9.34)       0.108
Total (all)         37.94 (17.40)    44.10 (14.78)      0.045
Total (relational)  0.51 (0.34)      0.63 (0.30)        0.042

When comparing the results of the pre- and post-tests with a pairwise t-test, we found that both groups improved their performance on the shared questions statistically significantly. Moreover, while there was no statistically significant difference between the groups in the pre-test, there was a difference in post-test questions PQ5 and PQ6 (questions about functions) and in the total scores. The differences in the total score and the increase in the absolute difference across all questions support the findings of Laakso et al. [12] that collaboration is highly beneficial when using visualizations to support learning.

Observational research setup

In this paper, we try to find the reasons behind the differences in learning results. To accomplish this, we analyzed the tutorial usage meticulously. Screen capture software was used during the tutorial sessions to record all student on-screen activity. Moreover, we used microphones to capture the conversations between the students in the pair group. We randomly picked 5 single users from the control group (N=5) and 5 pairs from the treatment group (N=5) for closer inspection. The observed students were all aware of the monitoring, and permission for observation was obtained before the session. We picked two ViLLE exercises (see Appendix A) from the tutorial session for analysis: E1 was a basic if-then-else structure with several variable value changes, and E2 was a more advanced exercise consisting of a function call with two numeric parameters and a return value assignment. The exercises were selected to see whether there were any differences in understanding these two fundamental programming concepts. Moreover, parameter passing is usually considered a very difficult topic for novices, and notably, there was a difference in performance between the pair and solo groups in the function questions of the post-test (PQ5 and PQ6). There were four multiple choice questions in E1 and three in E2 (see Appendix A).
All actions in both exercises were analyzed, including possible re-starts. The screen capture videos were analyzed in 5-second steps. For each step, we determined the engagement level the students were at. In practice, the students doing the exercise were either at the viewing level (passive) or at the responding level (active) of engagement. We also recorded the time each group spent answering the individual questions inside the exercises. Moreover, we recorded the time the students in the treatment group spent discussing during the exercises, and the topic of conversation for each step. The topics were divided into the following categories:

- Program behavior: all discussion concerning the program execution, including the rows executed, variable values, program output etc.
- Tool and its features: discussion concerning ViLLE's features, such as finding the variable values displayed, using the controls, locating the subprograms and their return values etc.
- Exercise: discussion about the multiple choice questions; this usually consisted of evaluating the choices. If the discussion was related to the events and values of the executed program, it was categorized as 'Program behavior'.
- Referring to learning materials: all discussion about the tutorial or the pre-test.
- Other on-topic: discussion about programming not directly related to the previous categories.
- Off-topic: all other discussion.
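The annotation procedure amounts to labeling each 5-second step with an engagement level and, for the pairs, a discussion topic, and then aggregating counts per category. A minimal sketch of such an aggregation, our own illustration of the procedure rather than the actual analysis scripts:

```python
from collections import Counter

# Each annotated 5-second step: (engagement level, topic); topic is None when silent.
steps = [
    ("responding", "exercise"),
    ("responding", "program behavior"),
    ("viewing", None),
    ("responding", "program behavior"),
]

engagement_counts = Counter(eng for eng, _ in steps)
topic_counts = Counter(topic for _, topic in steps if topic is not None)

def percentage(counter, key):
    """Share of a category among all counted steps, in percent."""
    total = sum(counter.values())
    return 100.0 * counter[key] / total

# In this toy transcript, 3 of 4 steps are at the responding (active) level:
assert percentage(engagement_counts, "responding") == 75.0
```

Tables 3 and 4 and Figures 2 through 6 are all different aggregations of exactly this kind of per-step record.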

The taxonomy of topics was chosen to be similar to that used by Korhonen et al. [13], which makes it possible to compare the results.

RESULTS

In this section we present the results of the screen recording analysis. First, Table 3 displays the total time in seconds spent on each exercise for both groups. There were five students in the solo group and five pairs in the pair group.

TABLE 3
TOTAL TIME IN SECONDS SPENT ON EACH EXERCISE FOR BOTH GROUPS

          Solo group:   Solo group:   Pair group:   Pair group:
          Exercise 1    Exercise 2    Exercise 1    Exercise 2
          160           85            210           315
          145           80            340           470
          330           190           170           260
          110           140           145           155
          160           70            90            95
Average   181           113           191           259
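The per-exercise averages in Table 3 follow directly from the five observed times; recomputing them from the table's values confirms the reported row of averages:

```python
# Observation times in seconds, as listed in Table 3 (one value per student/pair).
times = {
    ("solo", "E1"): [160, 145, 330, 110, 160],
    ("solo", "E2"): [85, 80, 190, 140, 70],
    ("pair", "E1"): [210, 340, 170, 145, 90],
    ("pair", "E2"): [315, 470, 260, 155, 95],
}

averages = {group: sum(t) / len(t) for group, t in times.items()}
# Matches the table: solo E1 = 181 s, solo E2 = 113 s, pair E1 = 191 s, pair E2 = 259 s
assert averages[("pair", "E2")] == 259
```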

As seen in Table 3, there was practically no difference in the times for E1. For E2, however, the pair group spent more than double the time compared to the solo group. It is important to remember that the students spent the whole 45-minute tutorial session completing as many as possible of the 15 exercises in total. In Table 4, we present the total number of 5-second steps at the different levels of engagement for both groups in both exercises. The engagement level was analyzed for each step. The last row of the table gives the chi-square test value for each exercise.


TABLE 4
5-SECOND UNITS SPENT AT DIFFERENT ENGAGEMENT LEVELS

                        Exercise 1        Exercise 2
Engagement level        Solo     Pair     Solo     Pair
                        group    group    group    group
Passive                 33       30       29       33
Active                  148      161      84       226
Chi-square test value   0.245             2.76 ∙ 10^-21

Our hypothesis was that the ratio between the passive and active levels of engagement would be the same in both groups. Indeed, this is the case in E1. However, as seen in Table 4, the users in the pair group spent considerably more time at the active level in E2 (the function exercise). The percentages of active engagement were 82% for the solo group and 84% for the pair group in E1, and 74% for the solo group and 87% for the pair group in E2. Hence, in the function exercise E2 the pair group spent much more time at the active level, both absolutely and relatively. The latter was confirmed with the chi-square test (p < 0.001). Figures 2 and 3 display the percentages of 5-second steps belonging to the passive level of engagement for both exercises. The columns are shown in increasing order for illustrative purposes.
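The group-difference test on the engagement counts is a 2x2 chi-square test of independence. The sketch below recomputes the statistic from Table 4's counts using only the standard library; the one-degree-of-freedom p-value uses the identity p = erfc(sqrt(x/2)). The paper does not specify its exact test variant or granularity, so the values computed here need not match the reported test values exactly:

```python
import math

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

def p_value_1df(x):
    # Survival function of the chi-square distribution with 1 degree of freedom
    return math.erfc(math.sqrt(x / 2))

# Counts from Table 4: (passive solo, passive pair, active solo, active pair)
chi2_e1 = chi_square_2x2(33, 30, 148, 161)   # Exercise 1
chi2_e2 = chi_square_2x2(29, 33, 84, 226)    # Exercise 2
# The E2 group difference is far larger than the E1 difference
```

On these counts, the E1 statistic is small (no significant difference), while the E2 statistic is large enough that its p-value falls well below conventional thresholds, matching the qualitative conclusion in the text.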

As seen in Figures 2 and 3, there was not much difference in the percentages of passive engagement in E1. In E2, however, there is a clear difference between the groups: students working in collaboration spent relatively much less time at the passive level of engagement.

FIGURE 2
PERCENTAGE OF PASSIVE ENGAGEMENT IN EXERCISE 1

FIGURE 3
PERCENTAGE OF PASSIVE ENGAGEMENT IN EXERCISE 2

In addition, while tagging the engagement level in 5-second steps, we also tagged the topic of conversation (if any) for each step. Figures 4 and 5 display the percentages of the different discussion topics.

FIGURE 4
DISCUSSION TOPICS IN EXERCISE 1

FIGURE 5
DISCUSSION TOPICS IN EXERCISE 2

As seen in the figures, practically all discussion was related to the task at hand; the students were very focused on doing the exercises. Figure 6 shows the percentage of silent periods while the pairs were doing the exercises either passively (viewing) or actively (responding). Each pair of columns in the figure represents one pair in one exercise.


FIGURE 6
PERCENTAGES OF SILENT PERIODS IN EXERCISES

It seems that the pairs discussed more at the active engagement level. Only on two occasions was there relatively less discussion at the active engagement level. This supports the findings of Korhonen et al. [13].

DISCUSSION

According to the results presented in the previous section, it seems that pairs spend more time doing the exercises than students working alone. This is especially true when the subject is more difficult, and it should lead to a deeper understanding of the subject. When we compare this to the results of our previous study, it seems that the positive effect of collaboration is enhanced in difficult exercises. Another interesting result is the difference between the percentages of silent periods at the passive and active levels of engagement. When students were responding to the questions, they also discussed more. This is in line with the results of Korhonen et al. [13]: in their paper they report that the changing level of engagement increased the amount of discussion, and the same seems to apply to the responding level. In general, students worked hard during the tutorial session. In the pair group, almost all discussion was related to the exercise at hand. One factor that might have contributed to this activity was that students received points towards their final exam if they did well enough in the programming session. This can also be seen in our previous results: both groups learned significantly during the two-hour programming session. Brief and controlled tutorial sessions seem to be an effective way of learning programming.

CONCLUSIONS AND FUTURE WORK

Based on the results, we can conclude that the students who worked collaboratively spent more time on individual exercises, which might be one of the reasons for the better results reported in our previous work. The discussions with a partner may help the weaker students understand what is going on in the program. In the easier exercise there was practically no difference in the time the collaborative and individual groups spent at the passive and active levels of engagement. In the more difficult exercise, however, students working in pairs spent much more time at the active levels of engagement than the students working alone. Collaboration thus seems to be most effective in more difficult exercises, so it might be sensible to do the easier exercises individually and utilize collaboration in more demanding tasks. Another important result from our study is how the level of engagement with the visualization affects the amount of discussion: active levels of engagement encourage students to discuss together. In the future, we plan to analyze the videos more extensively by tagging other types of exercises as well. The development of ViLLE is also an ongoing project. In addition to program visualization exercises, the new ViLLE platform supports various kinds of exercise types, some of which are designed especially for collaborative work.

REFERENCES

[1] Williams, L.A. & Kessler, R.R. (2000). All I Really Need to Know About Pair Programming I Learned in Kindergarten. Communications of the ACM, 43(5), 108-114.
[2] McCracken, M., Almstrum, V., Diaz, D., Guzdial, M., Hagan, D., et al. (2001). A Multi-National, Multi-Institutional Study of Assessment of Programming Skills of First-year CS Students. ACM SIGCSE Bulletin, 33(4), 125-140.
[3] Bergin, S. & Reilly, R. (2005). The Influence of Motivation and Comfort-Level on Learning To Program. In Proceedings of the 17th Workshop on Psychology of Programming, PPIG'05.
[4] Lister, R., Adams, S., Fitzgerald, S., Fone, W., Hamer, J., et al. (2004). A Multi-National Study of Reading and Tracing Skills in Novice Programmers. SIGCSE Bulletin, 36(4), 119-150.
[5] Lahtinen, E., Ala-Mutka, K. & Järvinen, H.-M. (2005). A Study of the Difficulties of Novice Programmers. ITiCSE'05, Caparica, Portugal, 14-18.
[6] Lopez, M., Whalley, J., Robbins, P. & Lister, R. (2008). Relationships between reading, tracing and writing skills in introductory programming. In Proceedings of the Fourth International Workshop on Computing Education Research, September 6-7, 2008, Sydney, Australia, 101-112.
[7] Laakso, M.-J. (2010). Promoting Programming Learning. Engagement, Automatic Assessment with Immediate Feedback in Visualizations. TUCS Dissertations no 131.
[8] Moreno, A., Myller, N., Sutinen, E. & Ben-Ari, M. (2004). Visualizing Programs with Jeliot 3. In Proceedings of the Working Conference on Advanced Visual Interfaces (AVI 2004), Gallipoli (Lecce), Italy. ACM Press, New York, 373-380.
[9] Kölling, M., Quig, B., Patterson, A. & Rosenberg, J. (2003). The BlueJ system and its pedagogy. Journal of Computer Science Education, Special issue on Learning and Teaching Object Technology, 13(4).
[10] Oechsle, R. & Schmitt, T. (2001). JAVAVIS: Automatic Program Visualization with Object and Sequence Diagrams Using the Java Debug Interface (JDI). Revised Lectures on Software Visualization, International Seminar, May 20-25, 76-190.
[11] Naps, T.L., Rößling, G., Almstrum, V., Dann, W., Fleischer, R., et al. (2002). Exploring the Role of Visualization and Engagement in Computer Science Education. In Working Group Reports from ITiCSE on Innovation and Technology in Computer Science Education, 35(2), 131-152.


[12] Laakso, M.-J., Myller, N. & Korhonen, A. (2009). Comparing learning performance of students using algorithm visualizations collaboratively on different engagement levels. Journal of Educational Technology & Society, 12(2), 267-282.
[13] Korhonen, A., Laakso, M.-J. & Myller, N. (2009). How Does Algorithm Visualization Affect Collaboration? Video Analysis of Engagement and Discussions. In Filipe, J. & Cordeiro, J. (eds.), Proceedings of the 5th International Conference on Web Information Systems and Technologies (WEBIST 2009), 23-26 March, Lisboa, Portugal. INSTICC, 479-488.
[14] Rajala, T., Kaila, E., Salakoski, T. & Laakso, M.-J. (2010). How Does Collaboration Affect Algorithm Learning? A Case Study Using TRAKLA2. In 2010 International Conference on Education Technology and Computer (ICETC 2010), 22-24 June 2010, Shanghai, China.
[15] Myller, N. (2009). Collaborative Software Visualization for Learning: Theory and Applications. Academic dissertation, University of Joensuu.
[16] Jehng, J.-C. J. & Chan, T.-W. (1998). Designing computer support for collaborative visual learning in the domain of computer programming. Computers in Human Behavior, 14(3), 429-448.
[17] Kaila, E., Rajala, T., Laakso, M.-J. & Salakoski, T. (2009). Effects, Experiences and Feedback from Studies of a Program Visualization Tool. Informatics in Education, 8(1), 17-34.
[18] Malmi, L., Karavirta, V., Korhonen, A., Nikander, J., Seppälä, O. & Silvasti, P. (2004). Visual Algorithm Simulation Exercise System with Automatic Assessment: TRAKLA2. Informatics in Education, 3(2), 267-288.
[19] Rajala, T., Kaila, E., Laakso, M.-J. & Salakoski, T. (2009). Effects of Collaboration in Program Visualization. Technology Enhanced Learning Conference 2009 (TELearn 2009), October 6-8, 2009, Academia Sinica, Taipei, Taiwan.

AUTHOR INFORMATION

Teemu Rajala, PhD student, University of Turku, Finland, Doctoral Programme for Multidisciplinary Research on Learning Environments (OPMON), [email protected]
Erkki Kaila, PhD student, University of Turku, Finland, Turku Centre for Computer Science (TUCS), [email protected]
Johannes Holvitie, Master's student, University of Turku, Finland, [email protected]
Riku Haavisto, Master's student, University of Turku, Finland, [email protected]
Mikko-Jussi Laakso, PhD (Tech), University of Turku, Finland, [email protected]
Tapio Salakoski, Professor, University of Turku, Finland, [email protected]

APPENDIX A

E1:

    def main():
        a = 3
        b = 4
        c = 5
        d = a + b + c #Q1
        if d < (a+b+c): #Q2
            a = a + 1
            b = b * 2
            d = d - 5
        else:
            d = d - a - b
        a = 2
        c = a * b #Q3
        if c > (d + a): #Q4
            c = c - d - a
            b = 3 + a
            a = a - 1
        else:
            a = a + b
            b = b + c
            c = c + d
            d = d * 2

Q1. Which value will be assigned to variable d in this line?
Q2. Which line is executed after the current line?
Q3. Which value will be assigned to variable c?
Q4. What would this line look like if the variable values were written in the place of the variables?

E2:

    def main():
        a = calculate(1,2) #Q3
        print a

    def calculate(a,b): #Q1
        c = a + b
        return c #Q2

Q1. Which value is assigned to variable b?
Q2. Which line will be executed after the current line?
Q3. Which value will be assigned to variable a? (The question is displayed after returning from the function 'calculate'.)

