Helping Students Track Learning Progress Using Burn Down Charts

Clinton J Woodward, Andrew Cain, Shannon Pace, Allan Jones, Joost Funke Kupper
Swinburne University of Technology, Faculty of Information and Communication Technologies
John Street, Hawthorn, Victoria 3122, Australia
Email: {cwoodward,acain,space,ajones,jfunkekupper}@swin.edu.au

Abstract—Agile software development methods, such as Scrum, have adopted the use of burn down charts to help track progress by development teams. We considered whether this same technique could be applied to help students track their progress in programming units. Tools that help students visually track their progress may address issues with time management, particularly in units that make use of frequent formative feedback. In this work we describe such a tool, named Doubtfire, and explain its use in helping students keep track of their progress across a number of undergraduate programming units.

Doubtfire allows teaching staff to outline the tasks students need to complete during the semester. Students are then able to monitor their progress against these tasks using burn down charts. The charts show the backlog of work remaining week by week, which decreases as work is completed. An example is shown in Fig. 1.

Index Terms—introductory programming; constructive alignment; portfolio assessment; agile software development; project management

I. INTRODUCTION

A key principle of Agile software development methods [1] is that they embrace change [2] by allowing for adaptive, periodic adjustment of activities, resulting in robust and effective outcomes. The basis for adaptation in Scrum, a popular Agile method [3] for development teams, is empirical information: a consistent measure of the work remaining ("backlog"), and a measure of the rate at which work is being completed by the team ("velocity"). The purpose of a Scrum "burn chart" is to be an "information radiator" [4] for the team. It allows stakeholders to consider, in an empirical manner, the velocity of the team with respect to the current backlog. Burn charts can be used for either release or "Sprint" iteration schedules. Since the quality of work should not be compromised, the requirements (backlog) of work can be adjusted in order to meet the required schedule and cost [5].

Our previous work applied constructive alignment [6], with portfolio assessment, to develop a model for teaching introductory programming [7], and examined issues students raised in units delivered using this approach [8]. Of the issues faced by students in these portfolio-based units, time management was raised most frequently. One of the reasons presented for this was the model's reliance on frequent formative feedback: as no marks were attached to these tasks, students could be tempted to focus on more pressing requirements from other units. It was proposed that this may be addressed by providing students with a system where they could track their progress during the semester.

In this paper we present Doubtfire, a tool used to help students track their progress in portfolio-based units.

Fig. 1. Burn down chart showing student progress against weekly tasks

To evaluate the effectiveness of this idea, Doubtfire was used in the teaching of two undergraduate programming units covering introductory programming and object orientation. Each of these units used the online task tracking tool to help students monitor their progress on the weekly formative assessment tasks. The development of this tool, and analysis of its use, is an important part of our ongoing initiative to improve constructively aligned portfolio assessed units. The work presented in this paper outlines the principles behind the use of burn down charts for task tracking, and presents a tool implemented to provide these facilities. It is expected that the tool will provide both staff and students with valuable information on how students progress, and help students keep track of their progress without having to assign marks to these formative tasks. In Section II we present the requirements expected for an effective tool for student progress management. This is

followed by a description of the Doubtfire tool in Section III, and details of its use in Section IV. The paper then concludes with discussion in Section V, and conclusions in Section VI.

II. REQUIREMENTS

As already noted, the central requirement for the Doubtfire tool was that it provide burn down charts. These charts give students a visual way to know how many tasks they need to complete over the semester, and to estimate the relative complexity of those tasks – a skill which many seem to lack. Students should also be able to use the tool to determine whether they need to increase their rate of progress (velocity) and, if so, commit more time to the subject or take greater advantage of the resources available to them. In addition to the simple Scrum-style marking of tasks as completed, it was also envisaged that students could use the system to indicate if they were working on, or having trouble with, particular tasks.

To account for task heterogeneity, staff need to be able to weight tasks based on predicted size and complexity. A student's projected completion should be recalculated as tasks and weeks progress. For example, if six tasks were completed in one week then, based on that velocity, a thirty-six task project is expected to be completed in six weeks.

As Doubtfire was to be an interactive system, a number of requirements were needed to ensure that the tool could be best utilised by all targeted user groups. The following were identified:

• Online: making the tool available online allows students to access it from virtually anywhere. It also avoids the need for client software, which makes the development process simpler (no platforms need to be supported) and means that students do not have to install clients.
• Easy to use: if students struggle to interact with the tool, they may choose not to use it – regardless of the advantages it may provide them. The usability of the tool must present as small a barrier as possible to student adoption.
• Mobile friendly: with the devices now available to students, we cannot assume that they are working on a desktop PC behind a desk. To ensure that they can check and report their status whenever they need to, the tool must provide these features on mobile devices.
• Aesthetically pleasing user interface: to encourage adoption of the tool among students, a visually appealing user interface is ideal.

Put simply, we want a tool that is simple and appealing for students to use and that can be accessed almost anywhere. We do not want the students to feel that interaction with the tool is work.

Students, however, were not the only user group targeted by Doubtfire. Tutors need to be able to respond to students' actions in the tool, and convenors need to be able to observe the performance of the student cohort and perform simple administrative actions. Both groups benefit from the requirements already listed. Most importantly, the mobile nature of the tool allows tutors to easily check and update students' progress during labs.

In terms of the development and deployment of the tool, a number of software qualities are desirable. These include:

• Quick to develop and extend: it is valuable to be able to produce features as soon as possible so that students may benefit from them early in the semester.
• Supports adaptive teaching environments: the tool should fit inside the teaching environment; it should not be necessary to fit the teaching environment around the tool. It must remain a supportive technology.
• Controllable: the schema that defines the way tutors and students interact over tasks must be easy to modify as the teaching requirements change in response to changing assessment criteria.
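The velocity-based projection described earlier in this section can be sketched as follows. This is a hypothetical illustration only; the function and variable names are not from Doubtfire, and tasks are assumed to carry staff-assigned weights as described above.

```python
# Minimal sketch of a burn down projection (hypothetical, not Doubtfire's code).
# Each task has a weight reflecting its predicted size and complexity.

def projected_weeks_remaining(task_weights, completed, weeks_elapsed):
    """Estimate the weeks needed to finish the remaining weighted backlog.

    task_weights:  dict mapping task id -> weight
    completed:     set of task ids already signed off
    weeks_elapsed: number of weeks of work so far (> 0)
    """
    total = sum(task_weights.values())
    done = sum(w for t, w in task_weights.items() if t in completed)
    velocity = done / weeks_elapsed          # weighted work completed per week
    if velocity == 0:
        return float("inf")                  # no progress yet: no projection
    return (total - done) / velocity

# Thirty-six equally weighted tasks, six completed in the first week:
# velocity is 6 tasks/week, so 30 remaining tasks take 5 more weeks
# (6 weeks in total, matching the example in the text).
print(projected_weeks_remaining({i: 1 for i in range(36)}, set(range(6)), 1))
```

Recomputing this projection each week, as the text requires, simply means re-evaluating the function with the updated completed set and elapsed weeks.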

It is desirable that the tool conforms to standard quality requirements appropriate to online and mobile-accessible task tracking software: that it is easy to change, maintain and administrate. It is also valuable to add features that track student behaviour in the tool, in order to:

• Determine whether the expected use of the software matches the actual use.
• Identify possible issues in the unit curriculum, such as inconsistent assignment weighting.
• Identify possible flaws in the rule system governing student and tutor interaction in the software.
• Exploit the information to gain insights into general teaching issues.
• Optimise the experience based on how students use the software.
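One straightforward way to store such actions is an append-only event log that can later be queried quantitatively. The sketch below is hypothetical; the class and field names are illustrative and not part of Doubtfire.

```python
# Hypothetical sketch of an append-only action log (not Doubtfire's code).
import time

class ActionLog:
    """Records user actions for later quantitative analysis."""

    def __init__(self):
        self.events = []

    def record(self, user, action, task_id):
        # Store who did what, to which task, and when.
        self.events.append({"time": time.time(), "user": user,
                            "action": action, "task": task_id})

    def count(self, action):
        # e.g. how often students flagged a task as needing help
        return sum(1 for e in self.events if e["action"] == action)

log = ActionLog()
log.record("student42", "set_status:Need Help", task_id=7)
log.record("student42", "set_status:Ready to Mark", task_id=7)
print(log.count("set_status:Need Help"))
```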

As this information is constantly generated through use of the tool, it is simply a matter of storing which actions have been performed. This provides unambiguous information that can then be analysed through quantitative methods.

This section has described the requirements we deemed non-negotiable in the production of an effective progress management tool for the users identified. There are a number of minor requirements that were not considered significant enough to describe here. From these requirements a number of features were produced, accessible to the convenor, tutor and student user groups, which are discussed in the following section.

III. SOLUTION

A. User Roles

The Doubtfire tool provides features for the user groups of Convenor, Tutor and Student. The features available are described in Table I. As Convenors manage the unit, Doubtfire provides them with the tools to set up the unit's tasks and enrol students. The Convenor dashboard shows the overall progress of students by unit, and enables quick access to views showing progress by task and student. Fig. 2 shows an example chart that visualises

the distribution of student status for each task. Convenors also had the ability to perform the same actions as Tutors.

In managing their classes, Tutors were able to view class lists showing student progress. These lists could be used to drill down to view individual students and their burn down charts, and provided a convenient means of updating task status.

Students were provided with a dashboard that listed their progress in the units they were enrolled in, shown in Fig. 4. Drilling down into a unit provided them with the relevant burn down chart and the status of the individual tasks. The Tasks list enabled Students to update the status of their tasks, as shown in Fig. 5.

Across all views the burn down charts showed three lines, as shown in Fig. 1. These included lines for:

• Target Completion: representing the target due dates as specified by the Convenor.
• Actual Completion: showing weekly student progress.
• Projected Completion: indicating the projected end date based on the current velocity.

TABLE I
FEATURES EXPOSED TO EACH USER GROUP IN DOUBTFIRE

Convenor:
- Administer units by enrolling students and creating tasks
- View breakdown of student progress on projects
- View breakdown of task statuses (i.e. the progress distribution for each of the subject's tasks)
- View list of tasks and their associated status per student
- Assess tasks completed by the students

Tutor:
- View list of tasks and their associated status per student
- Assess tasks completed by the students

Student:
- Change the state of progress in completing a task
- View a progress burn down chart showing Target, Actual and Projected completion

Fig. 2. Convenor view showing distribution of student status by task

Fig. 3. Tutors viewed class groups and could adjust task states

Fig. 4. Student home page in Doubtfire, showing progress on all enrolled projects

Fig. 5. The Tasks list enabled students to view and change task status

B. Task Processes

Fig. 6 shows a UML state chart representing the main transitions between task statuses. The process is designed so that initially tasks are set to a Not Started status. Students then transition tasks to a Progressing status, and when ready for assessment to a Ready to Mark status. At this stage Tutors assess the work and either return it to a Progressing status if the task needs to be fixed, or sign the task off as Complete.

Fig. 6. UML state chart showing task states and transitions.

To provide finer-grained feedback, the Progressing status was divided into a number of sub-states. Students were able to transition the task to a Working on It or Need Help status.
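The transitions described above can be sketched as a simple state table. This is an illustrative sketch only, not Doubtfire's actual implementation; the Progressing status is represented here by its Working on It and Need Help sub-states.

```python
# Illustrative sketch of the task status transitions described in the text
# (not Doubtfire's code). Progressing is modelled via its two sub-states.

ALLOWED = {
    "Not Started":   {"Working on It"},                # student indicates started
    "Working on It": {"Need Help", "Ready to Mark"},
    "Need Help":     {"Working on It", "Ready to Mark"},
    "Ready to Mark": {"Working on It", "Complete"},    # tutor feedback or sign-off
    "Complete":      set(),                            # terminal state
}

def transition(task, new_status):
    """Move a task to new_status if the state chart permits it."""
    if new_status not in ALLOWED[task["status"]]:
        raise ValueError(f"cannot move from {task['status']!r} to {new_status!r}")
    task["status"] = new_status
    return task

task = {"id": 1, "status": "Not Started"}
transition(task, "Working on It")
transition(task, "Ready to Mark")
transition(task, "Complete")
print(task["status"])  # Complete
```

Encoding the transitions as data makes the schema easy to modify, in line with the "Controllable" requirement of Section II.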

When giving feedback, Tutors were able to indicate that the task needed some aspects adjusted (Fix status) or that it should be redone (Redo status). These statuses were designed to provide more accurate details of progress for both staff and students. Students indicated how they were progressing with the tasks, and staff could provide feedback on the outcomes students had achieved.

IV. USE OF DOUBTFIRE

A. Introduction to Programming

1) Capable students: Capable students used the tool to validate that their performance matched their expectations. They generally completed the work promptly and had little difficulty understanding the underlying programming concepts. Thus

their work was typically set to Ready to Mark within the week. In almost all cases this work was marked as Complete during the assignment feedback session. Capable students were the most active in following up perceived issues in assignment status due to actions by the tutor. For example, if an assignment was accidentally set to Fix (a state not justified by the level of feedback) or the tutor simply forgot to update the status, such students contacted their tutor about the issue in the same lab. If those students correctly received a Fix, they usually confirmed with the tutor the reason why their work was marked so (either in the lab or via an email to the tutor). Generally, the student's misunderstanding of the concept involved was immediately resolved.

2) Struggling students: A number of students struggled with the weekly assignments, but only seemed to realise the benefit of Doubtfire later in the semester. Such students typically updated the status of multiple assignments simultaneously, which was only possible because their submissions were late. The quality of work was generally mediocre, and such work was marked as Fix. However, later in the semester they started to use Doubtfire to reflect their understanding of the concepts covered, and they did so by setting the relevant assignment status to Need Help. Interactions with this group of students consumed the majority of the time allocated to assignment feedback during each lab. By indicating their level of understanding, students enabled tutors to prioritise feedback on assignments containing the concepts most troublesome for the student.

As many of these students failed to indicate progress via Doubtfire in the first half of semester, and also frequently failed to submit, it is possible that they were avoiding submissions that they knew to be in the Need Help state. After seeing the value of tutor feedback they chose to engage with the tool, and that allowed the tutors to address gaps in the students' understanding.

3) Disinterested students: Some students chose not to engage in the feedback process and attempted to establish progress on their own terms. These students usually had a very poor understanding of the programming concepts, and their attendance in labs and participation in discussion forums were inconsistent. Based on the quality of their submitted work, this group of students should have made frequent use of the Need Help state. However, their engagement with Doubtfire was poor. Because these students did not indicate their progress in Doubtfire, tutors were unable to estimate their progress on an assignment until it was submitted. As they tended to submit multiple assignments simultaneously, estimating their overall progress was time consuming.
The problem was compounded by such students failing to provide an accurate picture of their level of understanding during discussions with tutors. Ultimately, these students repeatedly submitted each assignment, either making very slow or no progress towards a satisfactory outcome in each iteration. When severe problems were identified, such students typically responded in the vein of “Isn’t it close enough?” Due to the time cost that this behaviour accrued for tutors, we began to ignore simpler assignment tasks and ask those students to concentrate their efforts on critical tasks that we felt best demonstrated student ability.

B. Object Oriented Programming

1) Capable students: Capable students had the highest rate of engagement with the tool. Their use case typically involved setting their assignment status to Working on It after the delivery of lab content, and updating the status to Ready to Mark before submission in the next week. The generally high quality of the work produced indicates that these students were punctilious, a trait they applied both in the creation of their assignments and in their interaction with Doubtfire. When students in this cohort required help they usually sought it from tutors during the lab sessions. The problems they encountered were usually not so serious as to require a Need Help state.

2) Struggling and disinterested students: Other students tended to use Doubtfire to indicate that their work was Ready to Mark, but did not use the Working on It state to indicate that they were working on an assignment. This was a problem for the tutors because they were unable to estimate the progress of students in this cohort until those students were satisfied with their assignments and submitted them (as indicated by the Ready to Mark state). As such, tutors were forced to appraise student progress during the lab before they were able to help the student, which was a time consuming process. As shown in Introduction to Programming, if students use the tool to indicate their progress on assignments, and which ones they are struggling with, tutors can provide comprehensive, prioritised feedback.

C. Differences in usage of Doubtfire between units

A key difference between Introduction to Programming and Object Oriented Programming is that the former required students to submit hard copies of weekly assignments in weekly lectures and the latter did not. Students of the latter unit received weekly instruction on concepts and were allocated tasks to complete relating to those concepts. However, they were not required to submit physical copies of their assignment work. This relieved the pressure on students to produce coherent documents, and made it more difficult for tutors to appraise the quality of their work and track how far each student had progressed. Possibly because students were under less pressure to accurately document their progress in the assignments, they felt less pressure to indicate their progress in Doubtfire. Generally, Object Oriented Programming students participated less in the feedback process and in using the tool than students in Introduction to Programming.

There were more similarities among the cohorts in Object Oriented Programming than in Introduction to Programming. For example, students in the former unit only updated their assignment status in the labs (despite being able to reach Doubtfire online), and use of the Need Help state was almost non-existent. Tutor observations suggest that, by not requiring physical submissions of the assignments, students felt less pressure to accurately represent their progress, in terms of both their assignment content and their self-reporting in Doubtfire.

V. DISCUSSION

A. Limitations of Doubtfire

Students identified some limitations with Doubtfire and indicated functionality that could improve their use of the tool. Students in Object Oriented Programming requested additional features to enable them to attach commentary to each task. They felt this would better enable communication between students and tutors. Attaching a discussion thread to each task would have allowed tasks to be annotated with details on what needed to be fixed, or on aspects the students wanted feedback on.

In Introduction to Programming, the submit-mark-fix steps in the process were abused by a small, but persistent, group of disinterested students. These students submitted work each week with little, if any, change from previous submissions. These students were adopting a surface [9] approach to learning and just wanted the work to be 'signed off' without having to achieve the required understanding. Unfortunately the process embedded in the tool, shown in Fig. 6, did not provide a graceful means of handling this situation. As a result, tutors felt the students were trying to wear them down through attrition.

Convenors and Tutors could also have benefited from additional diagnostic views of student progress. The Convenor dashboard provided an overview of the unit as a whole, and similar views could have been beneficial at a class level.

B. Addressing Process Abuse

To help address the issue caused by the repeated submissions from disinterested students, a new process has been implemented. This process is illustrated in Fig. 7 and provides an option for tutors to ask students to fix the task and include it in their portfolio for final assessment. The student is then empowered to close these tasks off as complete, and their progress will be updated in their burn down chart. The tool records that these tasks were not signed off by staff, ensuring that the tasks can be checked in the final portfolio submission for the relevant students.
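The revised process can be sketched as follows. This is a hypothetical illustration of the idea, not Doubtfire's implementation; the function and field names are invented for the example.

```python
# Hypothetical sketch of the "Fix and Include" process (not Doubtfire's code):
# the tutor flags the task, the student closes it, and the tool remembers
# that staff never signed it off so it can be re-checked at portfolio time.

def fix_and_include(task):
    """Tutor asks the student to fix the task and include it in the portfolio."""
    task["status"] = "Fix and Include"
    task["signed_off_by_staff"] = False
    return task

def student_closes(task):
    """Student closes a Fix and Include task; it counts on their burn down."""
    if task["status"] != "Fix and Include":
        raise ValueError("only Fix and Include tasks can be closed by the student")
    task["status"] = "Complete"
    return task

def needs_portfolio_check(task):
    """Completed tasks not signed off by staff are flagged for final review."""
    return task["status"] == "Complete" and not task.get("signed_off_by_staff", True)

task = {"id": 3, "status": "Ready to Mark"}
student_closes(fix_and_include(task))
print(needs_portfolio_check(task))  # True
```

This keeps the burn down chart honest for the student while deferring the contested assessment to the final portfolio review.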

Fig. 7. UML state chart showing the states and transitions aimed at addressing abuse of the formative process.

C. Benefits of the Doubtfire tool

Doubtfire provides a much greater understanding of actual progress to all targeted user groups. Students benefit because they can see the tasks completed and remaining, how complex each task is considered to be (by the task weight) and when they can expect to complete the unit work.

Convenors and tutors were previously only able to understand student progress based on the quality of submitted assignments. In summative assessment, the student does not revisit the assignment and so any further progress regarding the relevant concepts is invisible to the tutor. By allowing students to report on their progress and confidence before assignments are submitted, tutors are able to address students' concerns and help them improve first submissions. In conjunction with a formative assessment style that permits – indeed, encourages, if necessary – multiple submissions, students that engage with the tool are able to very quickly refine their understanding of the covered concepts.

Tutor observations of how students used the tool have also identified that students interact differently with assignments based on their confidence in their understanding of the concepts. Students with a strong understanding of the concepts covered use the tool to validate their progress. Students who struggle with the concepts use the varied feedback options to indicate where they are least confident, and thus tutors are able to focus on those areas with the student. The progress of students that actively avoid meeting the stated requirements of the unit is reflected accurately by the tool: they do not advance. Because they do not indicate their status through the tool, it is difficult for tutors to provide specific guidance to them. A feedback loop amplifies the value of the activities undertaken by the participants; disinterested students undermine the effectiveness of Doubtfire by choosing not to engage with the tool.

VI. CONCLUSION

In this paper we presented our application of burn down charts as an online tool to support student learning. We described the different ways in which students utilised the tool, and discussed its current limitations. Overall we have seen positive outcomes for each of the intended user groups, in particular in the time management skills needed by students.

REFERENCES

[1] K. Beck, M. Beedle, A. van Bennekum, A. Cockburn, W. Cunningham, M. Fowler, J. Grenning, J. Highsmith, A. Hunt, R. Jeffries et al., "The agile manifesto," http://www.agilemanifesto.org/principles.html, 2001.
[2] K. Beck, Extreme Programming Explained: Embrace Change. Reading, MA: Addison-Wesley, 2000.
[3] K. Schwaber and M. Beedle, Agile Software Development with Scrum. Upper Saddle River, NJ: Prentice Hall PTR, 2002.
[4] A. Cockburn, Agile Software Development. Addison-Wesley Professional, 2002.
[5] J. Sutherland and K. Schwaber, The Scrum Papers: Nuts, Bolts, and Origins of an Agile Process. Citeseer, 2007. [Online]. Available: http://scrumfoundation.com/library
[6] J. Biggs, "Enhancing teaching through constructive alignment," Higher Education, vol. 32, pp. 347–364, 1996.
[7] A. Cain and C. J. Woodward, "Toward constructive alignment with portfolio assessment for introductory programming," in Proceedings of the First IEEE International Conference on Teaching, Assessment and Learning for Engineering. IEEE, 2012, pp. 345–350.
[8] A. Cain and C. J. Woodward, "Examining student reflections from a constructively aligned introductory programming unit," in Proceedings of the 15th Australasian Computing Education Conference, vol. 136, 2013.
[9] F. Marton and R. Säljö, Approaches to Learning. Scottish Academic Press, 1984, pp. 39–58.