© 2003 by The International Union of Biochemistry and Molecular Biology. Printed in U.S.A.
BIOCHEMISTRY AND MOLECULAR BIOLOGY EDUCATION, Vol. 31, No. 1, pp. 24-27, 2003
Articles

Assessment of Verbal Communication in Science Education
A COMPARISON OF SMALL AND LARGE CLASSES*

Received for publication, August 13, 2002

Ian S. Haworth‡§ and Ashley Garrill¶‖

From the ‡Department of Pharmaceutical Sciences, University of Southern California, Los Angeles, California 90089-9121 and the ¶Department of Plant and Microbial Sciences, University of Canterbury, Private Bag 4800, Christchurch, New Zealand
Clear verbal communication is a key skill for the modern scientist. In this article we describe two model science courses, one given to a small class and the other to a large class, in which verbal communication is emphasized as a key learning objective. Both courses use a form of problem-based learning (PBL) with an end point that involves students communicating their results and ideas verbally. Hence, assessment of verbal communication is necessary. We describe several different approaches to this challenge and highlight their applicability to small and large class sizes. We then show that, with careful application of these methods, a good relationship can be obtained between scores on verbal assessment exercises and scores on written examinations. On this basis, we argue that the verbal assessment can give an accurate reflection of the student's ability and can therefore be used as a method of formative evaluation in addition to being a component of the overall grade.

Keywords: Assessment, verbal communication, science education.
Effective verbal communication is an essential skill for modern scientists, who are required to present their research at conferences and to teach their disciplines in lectures. Despite this, little attention is paid to this skill in the typical science curriculum, which is usually delivered in the traditional lecture/examination format. Students are presented with and then tested on material that is pertinent to their discipline. Both students and faculty may have become comfortable with this method of teaching and assessment, which does little to foster verbal communication skills.

The problem-based learning (PBL)1 approach [1] goes some way toward addressing the development of verbal communication skills. PBL has been used in a variety of disciplines, including biochemistry [2-4] and pharmaceutics [5-7]. One of the main difficulties with incorporating verbal communication as a learning objective in any course is its mode of assessment. Assessment methods in PBL courses have been described [8-10], and their effectiveness has been discussed [11-13]. Furthermore, facilitator [12] and peer [13] assessment scores have been compared with written examination scores. None of these studies has explicitly addressed the assessment of verbal communication skills, although it has long been recognized that students may respond differently to verbal and written forms of assessment [14].

Here we describe two courses, one with a small enrollment and the other with a large one, in which verbal communication skills are assessed by faculty during class sessions. We present data that show a correlation between verbal assessment scores and written examination scores over a 4-year period in each course. These data suggest that verbal assessment methods can be used as an effective means of formative assessment (for a discussion of formative assessment, see Ref. 15). We also suggest reasons why the teaching of verbal communication skills may not be occurring in contemporary science courses.
COURSE STRUCTURE
We have chosen to address issues associated with verbal communication by focusing on two model science courses. Both are taught as graduate classes to students who have already met the requirements for undergraduate degrees, and the academic requirements for entry into the two courses are broadly similar. One class, in Medical Biochemistry, henceforth referred to by its course code of BCHM401, is taught at the University of Canterbury, Christchurch, New Zealand. The other, in Pharmaceutics, henceforth referred to as Pharm. I, is taught as part of the Pharm.D. curriculum at the University of Southern California, Los Angeles. The data presented for BCHM401 cover the years 1998, 1999, 2000, and 2001, in which the class enrollments were 9, 11, 25, and 14, respectively. For Pharm. I we have used data from 1996, 1997, 1998, and 1999, in which class enrollments (those students completing the class) were 175, 176, 175, and 182, respectively.
* This manuscript was written during a visit made by I. S. H. to the University of Canterbury as an Erskine Fellow. The Erskine Bequest supported this visit.
§ To whom correspondence may be addressed. E-mail: [email protected].
‖ To whom correspondence may be addressed. E-mail: [email protected].
1 The abbreviation used is: PBL, problem-based learning.
TABLE I
Quantification of time spent with each student in the assessment of verbal communication skills

Form of verbal assessment   Groups of students   Students in each group   Sessions for each group   Duration of each session (h)   Time for each student (min)
Seminar^a                   3                    5                        1                         3                              30
Discussion^b                26                   ~7                       4                         1                              35
Interview^b                 26                   ~7                       2                         0.67                           12

^a Used in BCHM401; numbers based on an enrollment of 15 students.
^b Used in Pharm. I; numbers based on an enrollment of 175 students.
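The per-student times in the last column follow from simple arithmetic: the faculty contact time for a group (sessions multiplied by session duration) divided by the number of students sharing that time. A minimal sketch of this calculation is given below, using the Table I figures; it is illustrative only, and the seminar value reported in the table (30 min) corresponds to the 20- to 30-min presentation slot described in the text rather than to this simple division.

```python
# Approximate faculty assessment time per student, using the figures in Table I.
# Illustrative sketch only; the Pharm. I group size is an approximate average.

assessments = {
    # form: (students_per_group, sessions_per_group, hours_per_session)
    "Seminar (BCHM401)":     (5, 1, 3.0),
    "Discussion (Pharm. I)": (7, 4, 1.0),
    "Interview (Pharm. I)":  (7, 2, 0.67),
}

for form, (per_group, sessions, hours) in assessments.items():
    minutes_per_group = sessions * hours * 60            # total contact time per group
    minutes_per_student = minutes_per_group / per_group  # shared among the group
    print(f"{form}: ~{minutes_per_student:.0f} min per student")

# Prints ~36, ~34, and ~11 min, close to the 30, 35, and 12 min reported in Table I.
```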
This difference in class size allows us to compare the effectiveness of verbal assessment methods in large and small classes. For purposes of discussion, we will use an enrollment of 15 for BCHM401 and 175 for Pharm. I.

The principal aim of the BCHM401 course is the development of skills that are essential for research scientists. These include the ability to identify key research papers, to critically evaluate scientific data, to write a grant proposal, and, most significantly in the context of verbal communication, to give an oral presentation of a research paper and to engage in scientific debate. The course is taught in eight sessions of 3 h each. Verbal communication skills are first developed through an initial session in which an introduction to scientific communication is given. The instructor then follows with an example verbal presentation in the second session, in which the students are encouraged to be critical of both the scientific content and the presentation. Following this, two or three class debates are held on topical areas of medical biochemistry. In the most recent offering of the course, debates have been held in the areas of xenoestrogens, pharmacogenetics, and neurodegeneration. The class debate format was chosen to allow students to be challenged by their peers; it is our belief that most students feel more comfortable speaking in such a situation than when facing a similar challenge from a faculty member. The remaining sessions are student-led and involve a seminar presentation by each student in an area that he or she, as part of a small group of about five students, has researched thoroughly. Assessment of the seminars is used to evaluate the students' verbal communication ability. The final element of the course is a comprehensive examination based on the areas covered in the debates and seminars.

The Pharm. I class is taught using a problem-based learning, student-centered approach. The course addresses the physical chemistry of pharmaceutical formulation and drug delivery. A description of the course has been published previously [6], and here we give a brief outline. Twenty-six groups of about seven students each are presented with a case study for which they are required to give a written report. Two case studies are assigned over the semester, with the work and associated lectures for each covering about 7 weeks (or about half of the semester). Discussions with faculty (four faculty are involved in the course each year) are held as the case study is being answered. These discussions (two per case study) are an integral part of the learning experience, and they are also evaluated by the faculty, with each student
receiving a score. As such, they form a part of the verbal assessment of each individual student (see below). Following completion of each case study, each group is interviewed by a faculty member, and, through this interview, individual students are assessed for their contribution to and understanding of the case study material and for their ability to communicate these ideas verbally. Hence, overall, four discussions and two interviews are used to assess verbal communication skills. The final element of the course is a comprehensive examination.

VERBAL ASSESSMENT METHODS
Three different methods of verbal assessment are used across the two courses. In BCHM401, an assessment of student performance is made through the student-led seminars. In Pharm. I, the assessment is made during group discussions and through an interview. In Table I, we have calculated the approximate time per student for these assessments based on the average class enrollments. In interpreting these data, it is important to recognize that, for example, the relatively short interview time reflects an intense period in which a student is asked many questions, whereas the other methods are rather more relaxed forms of assessment.

For their seminars in BCHM401, students are placed into groups of five, on average, and each group is expected to lead a 3-hour class on a particular subject area. This will typically comprise a 20- to 30-min seminar by each student, followed by a general class discussion that the students in the presenting group are expected to lead. The faculty member plays only a minimal role in the discussion and is mainly present to assess verbal communication skills. The students are assessed using specific grading criteria, with an individual score given for each criterion (see below).

The large class size in Pharm. I makes it difficult to assess verbal skills based on specific criteria that can be evaluated and recorded in detail. Further, in a large class the limited time makes it necessary to combine "teaching" with "assessment"; sessions cannot be reserved for assessment alone. To facilitate this, we conduct group discussions in which a group of about seven students meets with a faculty member. These discussions, which are designed to allow the group to make progress on their case study, are free-flowing. The students are assessed in a general way based on their contribution to the discussion and on the general accuracy and insight of their comments. Following completion of the case study, a more direct interview is held, again with the whole group present, but with questions directed to a given student at any one time. Criteria similar to those for the discussion sessions are used to evaluate the responses.
FIG. 1. Relationship of verbal assessment scores with final examination scores in the BCHM401 course. The scores on each axis are shown as a percentage of the available total score. Data are shown for 1998 (closed circles), 1999 (open circles), 2000 (closed triangles), and 2001 (open triangles). The bold line represents a linear regression of the data.
FIG. 2. Relationship of verbal assessment scores with final examination scores in the Pharm. I course. The scores on each axis are shown as a percentage of the available total score. Data are shown for 1996 (closed circles), 1997 (open circles), 1998 (closed triangles), and 1999 (open triangles). The examination scores are normalized to the average percentage score for the 4 years, where the data on the graph = awarded score + (4-year average − annual average). The verbal assessment scores are shown as awarded. A linear regression of the data was also performed for each year (broken lines labeled with the corresponding years; the regression line for 1999 is almost identical to that for 1998). The bold line represents a linear regression of all the data.
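The normalization in the Fig. 2 legend simply shifts each year's examination scores so that all four years share the same mean before the data are pooled. A minimal sketch is given below; the score lists are hypothetical, since the underlying yearly averages are not reported, and only the arithmetic is taken from the legend.

```python
# Sketch of the examination-score normalization used in Fig. 2:
#   plotted score = awarded score + (4-year average - annual average)
# The scores below are hypothetical illustrations.

exam_scores = {
    1996: [52, 61, 48, 70],
    1997: [58, 66, 55, 73],
    1998: [50, 59, 47, 68],
    1999: [55, 64, 51, 71],
}

annual_avg = {year: sum(s) / len(s) for year, s in exam_scores.items()}
overall_avg = sum(annual_avg.values()) / len(annual_avg)

normalized = {
    year: [score + (overall_avg - annual_avg[year]) for score in scores]
    for year, scores in exam_scores.items()
}

# After the shift every year has the same mean, so scores from different
# years can be plotted on a common axis against the verbal assessment scores.
```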
Four faculty are involved in the assessment each year, and a general grading scheme is agreed upon in advance (see below). However, the nature of the assessment is such that no formal evaluation criteria are used, in contrast to the assessment made in the smaller BCHM401 class.

RELIABILITY OF VERBAL ASSESSMENT SCORES
In the BCHM401 course, verbal assessment in the years 1998-2001 accounted for 10-20% of the final course mark. The final examination was worth 60-70%, and the remainder of the marks were given for written assignments, including a grant proposal and a short essay. Assessment of the seminars was based on four specific criteria: (i) actual presentation skills, including the use of visual aids such as an overhead projector; (ii) skills in handling questions and facilitating debate; (iii) evidence of adequate preparation; and (iv) group interaction. For each criterion, the students were ranked exceptional (>85%), excellent (80%), very good (75%), good (70%), average (65%), below average (60%), poor (55%), very poor (50%), or fail (<50%). The criterion scores were summed for each student and averaged. These rankings were made during the seminars.
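As an illustration of this scheme, the sketch below converts the four criterion rankings into a single seminar score. The rank-to-percentage mapping follows the bands listed above, with representative single values assumed for the open-ended "exceptional" and "fail" bands, and the criterion names are shortened for brevity.

```python
# Illustrative sketch of the BCHM401 seminar scoring described above.
# Each of the four criteria receives a rank; ranks map to percentages, and
# the four percentages are averaged to give the student's seminar score.

RANK_TO_SCORE = {
    "exceptional": 87.5,   # >85%; a representative value is assumed here
    "excellent": 80, "very good": 75, "good": 70, "average": 65,
    "below average": 60, "poor": 55, "very poor": 50,
    "fail": 45,            # <50%; a representative value is assumed here
}

def seminar_score(rankings):
    """rankings: mapping of the four criteria to a rank label."""
    scores = [RANK_TO_SCORE[rank] for rank in rankings.values()]
    return sum(scores) / len(scores)

example = {
    "presentation skills": "excellent",
    "handling questions and debate": "very good",
    "preparation": "good",
    "group interaction": "excellent",
}
print(seminar_score(example))  # 76.25
```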
In the Pharm. I course, the combined discussion and interview scores made up about 15% of the total course score for the years 1996-1999. The written final examination represented 40% of the course grade, with the remainder being derived from case study reports and other assessments [6]. Our general approach in assigning verbal assessment scores in the discussion and interview was to define four categories: (i) leadership in the group, the ability to communicate sophisticated concepts effectively, and a good general understanding of the material (15% of the course grade awarded); (ii) the ability to communicate ideas well and a good understanding of specific areas (about 13% of the course grade awarded); (iii) average communication but unwillingness to talk unless specifically called upon (11%); and (iv) less than average communication ability (9% or less).

To determine whether the methodology described above can be used in the quantification of student performance and ability, we have compared the results from the verbal assessment exercises with those from the final examinations. For both courses, these examinations are essay-based and are given at the end of the course.
In making this comparison, we are assuming that the final examination result is independent of the verbal assessment score. This may not actually be true: students who score well in the verbal elements of the course may be encouraged by this and may, as a consequence, perform better in the written examination, and vice versa. However, with this caveat, we suggest that a correlation between verbal assessment scores and final examination scores would provide evidence that verbal assessment methods are effective and may be used as formative evaluations.

Comparisons of verbal assessment and final examination scores in the BCHM401 and Pharm. I courses are shown in Figs. 1 and 2, respectively. For data compiled over the 4 years of the BCHM401 course, there is a correlation (r² = 0.31) between these scores (Fig. 1). For each year of the Pharm. I course, the verbal assessment score is also predictive of examination performance (Fig. 2). Based on a linear regression of the combined data for Pharm. I over 4 years (Fig. 2), a difference of 30% in the verbal assessment score is predictive of a 20% difference in examination performance. That is, those students obtaining 70% of the verbal assessment points scored 45% on the final examination on average, whereas those students receiving 100% on the verbal assessment scored about 65% on the examination. A similar analysis of the data for the BCHM401 course (Fig. 1) leads to a very similar conclusion: students receiving 60% of the verbal assessment score obtained about 65% on the final examination, whereas those receiving 90% of the verbal assessment score received about 83% on the final examination. Again, a difference of about 30% in the verbal assessment score is predictive of a difference of about 20% on the final examination.
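Reading the quoted points off the two combined regression lines gives the approximate slopes implied by these statements; the fitted equations themselves are not reported, so the values below are back-of-the-envelope estimates:

$$\text{Pharm. I: } \frac{65\% - 45\%}{100\% - 70\%} \approx 0.67, \qquad \text{BCHM401: } \frac{83\% - 65\%}{90\% - 60\%} = 0.6$$

that is, in both courses a 30-point spread in the verbal assessment score corresponds to roughly a 20-point spread in the final examination score.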
DISCUSSION

The results above suggest that careful, criteria-based verbal assessment in a small class can be predictive of written examination results. Furthermore, in a much larger class this relationship between verbal assessment and examination scores could be reproduced despite using a much more general approach to verbal assessment. Although there is considerable spread in the data and the correlation is only weak, we consider these data to provide some validation of the verbal assessment methods. The correlation with final examination performance also encourages us to view verbal assessment as a formative assessment method [15].

One criticism that is often made of verbal assessment and, indeed, of all forms of "alternative" assessment methods that may be used to replace midterm examinations is that the students do not receive sufficient "feedback" regarding their "standing" in the course and their understanding of the material. That is, the results of such assessments do not allow students to "predict" how they will perform on future examinations and in the course as a whole. However, the data we present suggest that verbal assessment can provide a reasonably reliable indicator of future examination performance, even in large classes. For Pharm. I, while the correlation is weak (regression coefficients of about 0.1) and the data are diffuse, this is perhaps to be expected as the assessments
are done by several faculty members. Nonetheless, the predictive nature of the verbal assessment scores is apparent in both the small and large classes.

Finally, we make some comments on the limited amount of teaching of verbal communication skills and on the general problems of pursuing PBL in contemporary basic science courses. Effective scientific presentation skills are best developed through practice, which takes class time, and there may be time constraints and faculty or administration pressure to make teaching more "efficient." Verbal assessment may also often be viewed negatively by the student, who may prefer the relative "anonymity" of the written examination. This concern may be the fault of the academic for not establishing appropriate faculty-student relationships (although larger class sizes make this increasingly difficult). Out of context, isolated verbal assessment may not work very well. Albanese and Xakellis [16] have written recently that the true value of PBL is the establishment of collegiality. Although in a different context, we believe that a similar collegiality developed between faculty and students through PBL approaches (a "horizontal" peer relationship rather than a "vertical" hierarchical one) is essential in breaking down the barriers to the development of verbal communication skills and to their effective assessment.

Acknowledgments—We thank the faculty who contributed to the teaching of the courses: Dr. Sandra Jackson at the University of Canterbury and Drs. Stuart Eriksen, Curtis Okamoto, Suman Mukherjee, and Rebecca Romero at the University of Southern California.

REFERENCES

[1] H. E. Khoo, R. K. Chhem (2001) Problem-based learning: issues and challenges, Ann. Acad. Med. Singap. 30, 338-339.
[2] J. Rosing (1997) Teaching biochemistry at a medical faculty with a problem-based learning system, Biochem. Educ. 25, 71-74.
[3] A. Jaleel, M. A. Rahman, N. Huda (2001) Problem-based learning in biochemistry at Ziauddin Medical University, Karachi, Pakistan, Biochem. Mol. Biol. Educ. 29, 80-84.
[4] E. Johnson, S. Herd, K. Andrewartha, S. Jones, S. Malcolm (2002) Introducing problem-based learning into a traditional lecture course, Biochem. Mol. Biol. Educ. 30, 121-124.
[5] W. C. Duncan-Hewitt (1996) A focus on process improves problem-based learning outcomes in large classes, Am. J. Pharm. Educ. 60, 408-416.
[6] I. S. Haworth, S. P. Eriksen, S. H. Chmait, L. S. Matsuda, P. A. McMillan, E. A. King, J. Letourneau-Wagner, K. Shapiro (1998) A problem-based learning, case study approach to pharmaceutics: faculty and student perspectives, Am. J. Pharm. Educ. 62, 398-405.
[7] G. A. Brazeau, J. A. Hughes, L. Prokai (1999) Use of problem-based discussion sessions in a first year pharmaceutical dosage forms course, Am. J. Pharm. Educ. 63, 85-97.
[8] H. B. White (2002) Problem-based testing, Biochem. Mol. Biol. Educ. 30, 56.
[9] P. K. Rangachari (2002) The TRIPSE: a process-oriented evaluation for problem-based learning courses in basic sciences, Biochem. Mol. Biol. Educ. 30, 57-60.
[10] J. Szeberényi (2002) Problem-solving tests for problem-based learning, Biochem. Mol. Biol. Educ. 30, 61.
[11] S. J. van Luijk, C. P. M. van der Vleuten (2001) Assessment in problem-based learning (PBL), Ann. Acad. Med. Singap. 30, 347-352.
[12] C. F. Whitfield, S. X. Xie (2002) Correlation of problem-based learning facilitators' scores with student performance on written exams, Adv. Health Sci. Educ. 7, 41-51.
[13] M. Segers, F. Dochy (2001) New assessment forms in problem-based learning: the value-added of the students' perspective, Stud. Higher Educ. 26, 327-343.
[14] G. M. Seddon, V. G. Papaioannou (1990) A comparison of written and oral methods of testing in science, Res. Sci. Tech. Educ. 8, 155-162.
[15] B. Bell, B. Cowie (2001) The characteristics of formative assessment in science education, Sci. Educ. 85, 536-553.
[16] M. A. Albanese, G. C. Xakellis (2001) Building collegiality: the real value of problem-based learning, Med. Educ. 35, 1143.