Assessment & Evaluation in Higher Education Vol. 29, No. 6, December 2004
Students as partners in evaluation: student and teacher perspectives
Anna Giles, Sylvia C. Martin, Deborah Bryce and Graham D. Hendry*
University of Sydney, Australia

*Corresponding author. Office of Teaching and Learning in Medicine A27, Faculty of Medicine, University of Sydney, Sydney, NSW 2006, Australia. Email: [email protected]
Most course evaluation in higher education is designed and conducted by university staff, and students are rarely given central responsibility for planning and implementing an evaluation. Involving students as partners in educational evaluation may offer them authentic ways to develop professional skills. We describe an ‘education option’ that involved five students in designing and conducting an evaluation of the Virtual Anatomy Tutor (VAT), an online learning resource for medical students, and report both student and staff reflections on the process. The experience gained by the five students can be transferred to other settings that provide opportunities for student-designed and student-conducted evaluation of learning resources.
Introduction
Most course evaluation in higher education is designed and conducted by university staff. Rarely are students given central responsibility for the entire process of planning and implementing an evaluation. In one of the few reported studies of student-led evaluation, medical students in their first clinical year conducted a survey of their peers to evaluate or ‘audit’ variation in surgical teaching across five hospitals (Lockwood et al., 1986). In this article we report student and teacher perspectives on a student-led evaluation of an online learning resource.

Context
Students in the 4 year graduate entry University of Sydney Medical Program (USydMP) come from a variety of degree backgrounds, with the majority of entrants holding science degrees, while some have studied arts and humanities exclusively. The curriculum in Years 1 and 2 of the USydMP is structured around a pattern of three problem-based learning (PBL) tutorials per week, with a new clinical problem introduced weekly. Problem ‘triggers’ (brief audiovisual summaries of patients’ presenting problems) are delivered to PBL groups on a Faculty intranet prior to any teaching. Additionally, students have access to web-based learning resources that support their self-regulated learning, including text summaries, images and interactive modules on specific topics, e.g. the electrocardiograph (ECG).

Method: student-led evaluation
In 2001 teachers in the departments of Anatomy and Medical Education invited Year 1 students to help them design and evaluate an online module called the ‘Virtual Anatomy Tutor’ (VAT). Volunteers were called for via a bulletin on the medical program website and five students were selected on a ‘first come, first served’ basis. Students conducted the evaluation as their course ‘education option’ with guidance from staff. An education option, offered in the first 2 years of the USydMP, is intended to involve students in their own and/or others’ professional development as future educators. The overall process is summarized in Figure 1.

Figure 1. The overall design and student-led evaluation process

Students met with teachers in late 2001 to share ideas on the content, layout and self-assessment features of the VAT (see Step 1). The students (now in their second year) again consulted with staff in early 2002 to review the layout and plan evaluation of the VAT (see Step 2).
One student (AG) with survey experience led the development of a questionnaire designed to assess first year students’ satisfaction with the VAT as a learning resource. The student evaluators also convened two focus groups with first year students (see Step 3). The students submitted individual option reports, and AG and SM prepared an overall analysis of the evaluation data. The results indicated that 83% of respondents were highly satisfied with the VAT; 80% said they would recommend it to other students. Students were enthusiastic about the development of further modules: ‘Refreshingly different way of learning—the interactive style, the use of specimens and mock exam/consolidation questions were great’. The majority of negative comments concerned the layout and navigation of the module.

Teacher perspectives
The student evaluators provided invaluable input into the design of the VAT. They were creative, thoughtful and competent in designing a questionnaire and conducting focus groups. Staff were impressed with the quality of the evaluation report and encouraged by Year 1 students’ satisfaction with the VAT as a useful learning resource. They decided to use the template to develop similar modules, taking into account the suggestions made.

Student perspectives
The student evaluators found the project to be a useful and satisfying learning experience. Of particular benefit was the knowledge gained about the process of evaluation. It became clear to both teachers and students that a formative, ‘stakeholder-oriented’ evaluation (Wilkes & Bligh, 1999) was needed, with a focus mainly on feedback for improvement (Maudsley, 2001). Students initially found working as a team difficult, differed in their enthusiasm for the project, and expected more input from staff than they received. However, overall they enjoyed the process of evaluating the VAT, valued the opportunity to work in a project team environment and developed skills in teamwork and project management. As a result of their experience they have several suggestions for how students in other degree programs can engage successfully in student-led evaluation (see Figure 2).

• Assume the initiative early, even if the project is a faculty-initiated one.
• Ensure a specific learning contract, outlining aims and roles, and clarify it if necessary.
• Assign tasks to each other early, ensuring they are specific and have a precise deadline.
• Assume meetings will be problematic to arrange. Agree upon less frequent meetings, take minutes and reiterate decisions made via email communication (if possible).
• Ensure any learning resource to be evaluated is introduced in a timely manner, allowing learners the opportunity to use it as designed.
• Acknowledge that students do not have an advantage over other evaluators in obtaining an adequate response rate.
• Consider requiring one or two students on your team to have experience in evaluation and/or statistical analysis, or to develop a basic understanding of these areas, before embarking on the project.

Figure 2. Suggestions for students to conduct an effective educational evaluation

Conclusion
This study reports the experiences of teachers and students as partners in educational evaluation. Students successfully conducted an evaluation of an educational resource and gained valuable experience in evaluation and project development processes. In a recent review of evidence-based guidelines for achieving quality in online education, Greenhalgh et al. (2003) concluded that decisions about the design of web resources should be based on research and/or evaluation as well as ‘practical experience’ (p. 144). Qualitative student feedback is an accepted form of ‘primary research’ (Greenhalgh et al., 2003). Involving students as authentic evaluators satisfies evidence-based guidelines, with the important advantage of helping students to develop greater autonomy and professional skills.

Acknowledgements
We gratefully acknowledge the contribution and effort of all students who participated in the evaluation project team: Anna Giles, Maria Martin, Sylvia Martin, Katrina McEwin and Shannon Reid.

Notes on contributors
Anna Giles is a student in Year 4 of the University of Sydney Medical Program. She has a Bachelor of Science with a major in Psychology and a Master of Psychology (Clinical).
Sylvia Martin is a student in Year 4 of the University of Sydney Medical Program. She has a Bachelor of Science with a major in Molecular Biology and a Bachelor of Arts with a major in Psychology.
Deborah Bryce is a Lecturer in the Department of Anatomy, University of Sydney. Her research interests include student learning and online education.
Graham Hendry is a Senior Lecturer in the Office of Teaching and Learning in Medicine, University of Sydney, with responsibility for process and program evaluation. His research interests include student learning, academic development and quality improvement.
References
Greenhalgh, T., Toon, P., Russell, J., Wong, G., Plumb, L. & Macfarlane, F. (2003) Transferability of principles of evidence based medicine to improve educational quality: systematic review and case study of an online course in primary health care, British Medical Journal, 326, 142–145.
Lockwood, D. N. J., Goldman, L. H. & McManus, I. C. (1986) Surgical dressers: the theatre experience of junior clinical students, Medical Education, 20, 216–221.
Maudsley, G. (2001) What issues are raised by evaluating problem-based undergraduate medical curricula? Making healthy connections across the literature, Journal of Evaluation in Clinical Practice, 7(3), 311–324.
Wilkes, M. & Bligh, J. (1999) Evaluating educational interventions, British Medical Journal, 318, 1269–1272.