Declaration

I declare that this research report titled "Science practical work and its assessment system at IER: Nature, impact and improvement" is entirely my own work. This thesis contains no material which has been accepted for the award of any other degree in any university or other institution. To the best of my knowledge, the thesis contains no material previously published or written by any other person, except where due reference is made in the text of the thesis.
…………………………… DEYA CHAKRABORTY Date: September 18, 2016
Dedication
“To My Parents”
Acknowledgement

On the accomplishment of the present study, I would like to take this opportunity to extend my deepest sense of gratitude and words of appreciation towards those who helped me during the pursuit of this study. First of all, I am extremely grateful to my supervisor Dr. S M Hafizur Rahman, Professor, Institute of Education and Research, University of Dhaka. His valuable guidance, scholarly inputs, critical comments and advice have been precious for the development of the thesis content. I am grateful to Mr. Nure Alam Siddique, Associate Professor, Institute of Education and Research, University of Dhaka, for his inputs, which have been invaluable for the research work. The teachers of the Science, Mathematics and Technology Education department have been kind enough to extend their support at various phases of this research whenever I approached them, and I hereby acknowledge all of them. I thank Dr. Sirajul Islam, Assistant Professor, for his concern and good wishes. I acknowledge with a deep sense of gratitude the contributions made by Mrs. Zeba Farhana, Mrs. Tahmina Awal and Mrs. Rezina Ahmed of the Institute of Education and Research. I sincerely thank all the students from the science stream who participated voluntarily in this study. I sincerely thank Dr. Farhan Azim for his unconditional support through this entire time. I remember with great pleasure the contributions of Shreyashi Haldar, Roti Chakraborty and Taukir Ahmed Khan; their advice proved most helpful. I am equally grateful to Kalyani, Nishi and Labonno for our memorable time together, and I thank them for their constant inspiration and unswerving cooperation. My parents, especially my father Ranjit Kumar Chakraborty, deserve special credit for not giving up on me and allowing me to be as ambitious as I wanted. I am indebted to Santonu Kumar Sanyal, my husband, for believing in me.
Abstract

For several years, science practical work and its assessment system have been focal points of research for scholars. Numerous studies have been conducted in the field of secondary education; nevertheless, few studies address these issues at the tertiary level. The purpose of this study is to explore the existing nature of practical work and its assessment system in the context of the Institute of Education and Research (IER), University of Dhaka. This study is qualitative in design. The data sources of my study were relevant documents, students, teachers and externals of the practical examination. I collected data from these sources using a semi-structured interview schedule and focus group discussion guidelines. My samples were 7 teachers and 30 students of the science stream; the teachers were chosen through purposive sampling and the students through convenience sampling. I used thematic analysis to analyze the qualitative data. The major findings of this study reveal the nature of science practical work and its assessment system at IER. The study shows that the existing nature of science practical work at IER is recipe style. It also finds that the paper-pencil test is emphasized the most in IER science practical assessment; however, the criteria of the paper-pencil test are not explicit. Furthermore, the study reveals that there is no formative assessment in IER science practical classes. The study therefore has implications for curriculum developers of IER and teachers of the SMTE department. Further research is also suggested in the implications chapter.
Table of Contents

Declaration
Dedication
Acknowledgement
Abstract
List of Tables
List of Figures
Chapter One: Introduction
1.1 Background of the study
Context of IER
1.2 Statement of the Problem
1.3 Purpose and Research Questions of the study
1.4 Audience
1.5 Outline of the Study
Chapter Two: Literature Review
2.1 Practical work
2.1.1 Aims of laboratory/practical work in science
2.1.2 Different types of laboratory work
2.2 Open investigation in science practical class at tertiary level
2.2.1 What is Open Investigation?
2.2.2 Model of science investigation process
2.2.3 Emergence of introducing open investigation
2.2.4 Present scenario of secondary science education in Bangladesh
2.2.5 Scale of Openness to Inquiry
2.3 Assessment of practical work
2.3.1 Traditional ways of assessing practical work and its impact
2.3.2 Using formative assessment to inform teaching-learning
2.3.3 Using rubrics for assessment of open investigation work
Chapter Three: Methodology
3.1 Strategy of Inquiry
3.2 Outline of Research Design
3.3 Data Sources
3.3.1 Document analysis
3.3.2 Teachers of SMTE at IER
3.3.3 Students of science stream at IER
3.3.4 External of Practical Examination
3.4 Sample and Sampling
3.5 Data collection
3.6 Instruments
3.6.1 Semi-structured interview schedule
3.6.2 Focus Group Discussion guidelines
3.7 Data Analysis
3.8 Ethical Consideration
Chapter Four: Results
4.1 Document Analysis
4.2 Nature of science practical work in the classroom
4.3 Nature of formative assessment system in science practical classroom
4.4 Nature of summative assessment in science practical classroom
4.5 Impact of existing nature of science practical work on students
4.6 Impact of the existing nature of assessment system on students
4.7 Ways to improve the current nature of practical work and assessment system at IER
Chapter Five: Discussion
5.1 Existing nature of science practical work and its impact on students
5.1.1 Lack of clarity regarding learning outcome
5.1.2 Incapability of relating practical work with theory
5.1.3 Lack of attention to the development of investigation skill
5.1.4 Failure to achieve affective outcomes
5.2 Ways to improve the existing nature of science practical work at IER
5.3 Nature and impact of summative assessment of science practical work at IER
5.3.1 Paper and pencil test
5.3.2 Practical lab reports
5.3.3 Viva-voce
5.4 Nature and impact of formative assessment in science practical class of IER
A. Criteria for formative assessment are not explicit
B. No effective feedback culture
C. No scope for self-assessment
5.5 Ways to improve the existing situation of science practical assessment system at IER
5.5.1 Formative assessment to be included
Chapter Six: Implications and Conclusions
6.1 Implications for the curriculum reform
6.2 Implications for teachers' practice
6.3 Implications for future research
6.3.1 Action research
6.3.2 Setting assessment criteria for assignments, in course and presentation
6.4 Personal reflection
References
Appendices
Appendix 1
Appendix 2
List of Tables

Table 1: Level of Inquiry/Openness
Table 2: Table of data sources
Table 3: Sample and sampling technique
Table 4: The relation between research questions and instruments in this study
Table 5: Problem identification
Table 6: Procedure
Table 7: Apparatus
Table 8: Results
Table 9: Summary of teachers' interview responses on the nature of practical work
Table 10: Summary of students' FGD responses on the nature of science practical work
Table 11: Is formative assessment present in science practical?
Table 12: Are the criteria for formative assessment clearly conveyed?
Table 13: What is teachers' feedback culture in science practical classes?
Table 14: Is there a scope for self-assessment in science practical classroom?
Table 15: Summary of teachers' interview responses on formative assessment in practical class
Tables 16-17: Summary of students' FGD responses on formative assessment in science practical class
Table 18: Teachers' follow-up FGD responses on formative assessment in science practical class
Table 19: Assessment criteria while observing during paper-pencil test
Table 20: How are the students informed about assessment criteria of paper-pencil test?
Table 21: Summary of teachers' interview responses on paper-pencil test
Table 22: a) Summary of students' FGD responses on paper-pencil test
Table 23: b) Summary of students' FGD responses on paper-pencil test
Table 24: Teachers' follow-up FGD responses on paper-pencil test
Table 25: What are the criteria of assessing lab report?
Table 26: How do you assign number to the lab report?
Table 27: How do the teachers let the students know about the assessment criteria?
Table 28: Summary of teachers' interview responses on practical lab report
Table 29: Physics lab report
Table 30: Chemistry lab report
Table 31: Zoology lab report
Table 32: Botany lab report
Table 33: Summary of students' FGD responses on practical lab report
Table 34: Summary of teachers' follow-up FGD responses on practical lab report
Table 35: What do you assess in a viva?
Table 36: How do you ascertain a number to a student?
Table 37: How do you let the students know about viva's assessment criteria?
Table 38: Do you think students should know about the assessment criteria? Give justification of your answer
Table 39: Teachers' follow-up FGD responses on the impact of existing nature of practical work
Table 40: What is the impact of not knowing the criteria?
Table 41: Students' FGD responses on the impact of existing nature of assessment system on students
Table 42: Teachers' follow-up FGD on the impact of existing nature of assessment system on students
Table 43: Teachers' interview responses on the ways to improve the nature of practical work at IER
Table 44: Teachers' interview responses on the ways to improve the nature of practical assessment system
Table 45: Students' FGD responses on the ways to improve the nature of practical work and assessment system
Table 46: Teachers' follow-up FGD responses on the ways to improve the existing nature of practical work and assessment system
List of Figures

Figure 1: A model of science investigation processes
Figure 2: Outline of research design
Chapter One: Introduction

1.1 Background of the study

Practical work is an essential part of science education. Different forms of practical work develop different learning outcomes, so it is important to use a range of practical experiences to develop the full range of outcomes (Hackling, 2007). Traditional recipe-style laboratory exercises, in which students follow the teacher's instructions to answer the teacher's question, provide little opportunity for students' ownership and agency, intellectual challenge or engagement. Investigations, on the other hand, allow students to plan and conduct their own experiments within a context and boundaries set by the teacher. Investigations provide a more authentic experience of the nature of science and require students to be both hands-on and minds-on (Hackling, 2007). Thus, the trend of science practical work has shifted from recipe to investigation around the world. In these circumstances, it is essential to explore what is happening in the laboratories of Bangladesh.
Context of IER

Institute of Education and Research (IER), University of Dhaka, established in 1959, has been playing an unparalleled role in bringing reform to the education system of Bangladesh. It offers teaching programs leading to higher degrees, conducts research studies and provides extension services in education. It prepares and supplies manpower for major specialized areas of education and renders services for the development of education as a whole. The institute offers various courses ranging from a four-year (8-semester) Bachelor of Education (Honors), a one-year Master of Education and a one-year Master of Education (Evening) to a two-year Master of Philosophy and Doctor of Philosophy. The Bachelor of Education course has an extensive curriculum of its own, divided into four sections: Science (Physical Science and Biological Science), Language, Social Science and Special Education. The students of the science stream have the scope to study Physics, Mathematics, Chemistry, Botany and Zoology up to the 4th semester (2nd year). Those who choose the Physical Science group study Physics and Mathematics, and those who choose Biological Science study Botany and Zoology. Both groups belong to the science stream and have to study Chemistry as a compulsory subject throughout these four semesters. The purpose of teaching these academic elective subjects extensively is to construct students' content knowledge in the various areas of science. Each science course is divided into two parts, theory and practical. Science practical work in the context of IER is the topic of my research.
1.2 Statement of the Problem

The IER curriculum places a heavy emphasis on practical work: it allocates 25 marks for practical work in science subjects up to the 4th semester (the Curriculum Committee, 2006). One of the course objectives of the IER science curriculum clearly states that students will be able to develop skills in practical aspects such as observation, performance, recording and setting while performing the practical experiments of the science subjects. However, in practical classes I had problems understanding how these skills are to be achieved. Apart from this, I encountered further problems regarding practical work: as a student, my role in practical work was severely limited, I had problems in understanding the learning outcome of the practical work, and my interest in practical work was decreasing. According to the assessment policy of IER, the science subjects (Physics, Chemistry, Zoology and Botany) have 25 marks for practical out of 100. But the criteria for the assessment of practical work, on the basis of which marks/grades are assigned to the students, are not clearly stated in the curriculum. When assessment criteria for an assessment task are not explicitly stated, students find it difficult to identify their level of learning and to select their learning approach for achieving their learning goals (Suskie, 2002). The Bachelor of Education curriculum of IER has no specific instruction or strategy by which a teacher can evaluate the above-mentioned skills in an individual learner during a practical examination and grade him/her according to that assessment. The assessment guidelines given in the curriculum documents are not as explicit and comprehensive as necessary (Rahman & Siddique, 2007). I have had difficulty finding research that addresses the nature of practical work in the IER context, or that discusses the impact of the existing nature of practical work on students of IER. There is, therefore, a need to investigate this matter further. Regarding the science practical assessment system at IER, past research recommended further investigation: an explicit guiding principle is essential for an assessment process that covers all aspects of assessment (Rahman & Siddique, 2007). However, that research does not address how to make the assessment guideline explicit for the evaluation of the practical components of science subjects. It is, therefore, necessary to address this issue as well in this study.
1.3 Purpose and Research Questions of the study

This study has two major purposes: to explore the existing nature of practical work and the assessment system at IER, and to find out the impact of the existing practical work and assessment system on students. I would also like to recommend some suggestions to improve the existing situation of science practical work at IER. The following research questions address these purposes and also show the reader how this study was organized:
1. What is the existing nature of science practical work at IER?
2. What is the existing nature of assessment in science practical work at IER?
3. How does the current nature of practical work impact students' learning?
4. How does the nature of assessment in science practical work impact students' learning?
5. How can the existing situation of practical work at IER be improved?
1.4 Audience

By exploring these issues, the research can inform the curriculum developers of IER, as well as teachers and policy makers who run similar kinds of programs, about two things. First, they will be informed about the existing nature of practical work and the assessment system at IER; second, they will understand the impact of this existing nature of science practical work and assessment system on students. As the study also discusses possible ways to overcome the current situation of practical work and its assessment system, curriculum developers and teachers can use this information to improve the situation of science practical work at IER.
1.5 Outline of the Study

This thesis consists of six chapters. The first chapter presents the introduction, statement of the problem, purpose, research questions and significance of the study. The second chapter reviews relevant literature on practical work and its assessment system, including the emerging trend in science practical work and its assessment at the tertiary level. The third chapter covers the methodology of the study; in it, I discuss the strategy of inquiry, research design, data sources, sample and sampling, data collection, instruments, data analysis and ethical considerations. The fourth chapter presents the results of this qualitative study. Data have been analyzed using thematic analysis, with themes predetermined based on the focuses of my research questions. In chapter five, I discuss the findings of this research in relation to my review of literature; this discussion is organized in the order of the research questions. Chapter six presents the conclusion, in which I discuss the implications of my study for curriculum reform/development, teachers' practice and future research; this chapter also includes my personal reflection on the study. Finally, references are provided for the literature and other information sources, and the instruments used in this study are included in the appendices.
Chapter Two: Literature Review

2.1 Practical work

2.1.1 Aims of laboratory/practical work in science

Science practical work can be used to help students achieve a number of learning outcomes (Hackling, 2004). Aims of laboratory or practical work can be referred to as "general statements of what the teacher intends to achieve" (Sutton, 1985, cited in Johnstone & Al-Shuaili, 2001). Numerous attempts have been made to articulate the aims of laboratory work, and there is substantial agreement among them (Garnett, Garnett & Hackling, 1995; Johnstone & Al-Shuaili, 2001). According to Garnett, Garnett and Hackling (1995), the aims of laboratory work can be classified into four major categories:

1. Conceptual learning
Practical work is used to develop students' conceptual learning and understanding of science. Laboratory activities are used to introduce, illustrate and verify scientific theories and concepts and to provide concrete experiences of scientific phenomena. More recently, from a constructivist view of science teaching and learning, laboratory work is seen as a tool for reconstructing students' personal ideas or theories.

2. Techniques and manipulative skills
Laboratory work helps students to develop manipulative skills and techniques, such as handling apparatus like the burette and pipette, measuring length using slide calipers and a meter scale, and measuring weight precisely using a balance. These skills are valuable in terms of general utility and vocational value and are also important for future practice as a scientist.

3. Investigation skills
Laboratory work aims to develop students' investigation skills, which include planning an investigation, conducting the investigation, processing and interpreting data and evaluating the findings.
4. Affective outcomes
Laboratory work aims to develop students' interest, motivation, enjoyment, satisfaction and confidence towards science. It also aims to develop scientific attitudes and values, such as objectivity, critical-mindedness and rationalism, which help students to think and work scientifically.

Understanding the aims of practical work helps to clarify the role of practical work in science education, which kinds of practical work are best suited to achieving these aims, and consequently how practical work should be designed in classroom practice for better student learning.
2.1.2 Different types of laboratory work

Johnstone and Al-Shuaili (2001) used the work of Domin (1999) to classify laboratory work into four groups based on the nature of outcome, approach and procedure. These four types of laboratory work are: Expository, Discovery, Inquiry and Problem-Based.
1. The Expository laboratory (Verification)
Expository instruction is the most common type in use. The role of the learner in an expository lab is only to follow the teacher's instructions or the procedure (from the manual) that is stated in detail. The outcome is predetermined by the teacher and may also be already known to the learner. Expository instruction has been criticized for placing little emphasis on thinking. There is no opportunity for students to develop skills associated with formulating a research question, identifying and manipulating variables or planning how to control variables. When placed beside the aims of laboratory work already discussed, the expository laboratory seems incapable of helping students to achieve many of them. It may be a place for exercising manipulative and data-gathering skills, but may offer little motivation or stimulus. However, small modifications of expository laboratories can offer the possibility of introducing some of those desirable experiences.
2. Discovery Laboratory (Guided Inquiry)
The heuristic method taught by Armstrong in the early 20th century can be regarded as the origin of discovery laboratory teaching. In this type of laboratory, students were required to generate their own questions for investigation; no laboratory manual was used and the teacher provided minimal guidance. The student was placed in the role of discoverer. Similar to the inquiry method, the discovery approach is inductive, but it differs with respect to the outcome of the instruction and the procedure followed. In discovery laboratory activities, the teacher guides learners toward discovering a desired outcome. The criticisms of discovery learning are that it is more time-consuming and potentially more costly than expository learning. Hodson described discovery instruction as not only philosophically unsound, but also pedagogically unworkable. He asserts that learners cannot discover something they are conceptually unprepared for: they do not know where to look, how to look, or how to recognize it when they have found it.

3. Open Inquiry
When learners generate their own procedure to obtain an undetermined result, the laboratory work is known as open inquiry. In an open inquiry, students formulate the problem, relate the investigation to previous work, predict the result, identify the procedure and perform the investigation. This kind of laboratory work is also known as open investigation or students' investigation. Open inquiry or investigation is gaining acceptance worldwide but is not yet common.

4. Problem-Based Instruction
In problem-based instruction, the teacher adopts an active and stimulating role and presents students with a problem statement, often lacking crucial information. Students redefine the problem, devise a procedure and combine the necessary experiments to find the predetermined solution. Like discovery and inquiry instruction, it is a time-consuming process. The difference between open inquiry and problem-based learning is that in the former the outcome is unknown to the teacher and learners, while in the latter the outcome is predetermined.
2.2 Open investigation in science practical class at tertiary level

Open investigation is the new emerging trend in science practical work. Hackling and Garnett (1995) assert the need to introduce open investigation in secondary science education to develop students' science investigation and problem-solving skills. However, the time has come for the integration of open investigation at the tertiary level to be given serious consideration.
2.2.1 What is Open Investigation?

Open investigations are activities in which students take the initiative in finding answers to problems (Jones, Simon, Fairbrother, Watson & Black, 1992). The problems require some kind of investigation in order to generate information that will give answers. Garnett, Garnett and Hackling (1995) describe a science investigation as 'a scientific problem which requires the student to plan a course of action, carry out the activity and collect the necessary data, organize and interpret the data, and reach a conclusion which is communicated in some form'.
2.2.2 Model of science investigation process

Hackling and Fairbrother (1996) gave a model of the investigation process (Figure 1). The model comprises four phases of investigation: planning, conducting, processing and evaluating. They assert that, in practice, these processes may not take place in the strict order planning-conducting-processing-evaluating; for example, part way through conducting, students may realize that further planning is required to improve the measurement technique or the design of the experiment. A more recursive model therefore represents the investigation process more accurately.
[Figure 1 here. The figure depicts the model as four recursive phases. Planning: problem identification and analysis; identification of variables; formulation of a research question or a hypothesis and prediction; operationalizing variables and planning the design and procedure. Conducting: conducting preliminary trials; carrying out the experiment, observing, measuring and recording data; revising methods and techniques as needed. Processing: organizing data, calculating and constructing graphs; analyzing data to identify patterns or trends and relationships between variables; using science knowledge to develop explanations for patterns, trends or relationships in data. Evaluating: evaluating the design of the experiment and the techniques or methods used; evaluating findings in relation to the problem, question or hypothesis; revising methods and techniques or reformulating the problem or hypothesis.]
Figure 1: A model of science investigation processes (Hackling & Fairbrother, 1996)
2.2.3 Emergence of introducing open investigation

Open investigation has emerged because of three main factors. Garnett, Garnett and Hackling (1995) identify these as criticisms of current practice, research findings about laboratory investigation skills, and national curriculum trends. These factors are discussed below:

(a) Criticisms of current practice
Criticisms of the current nature of laboratory work include an over-emphasis on conceptual learning, the predominance of recipe-style laboratory work, a lack of attention to the development of investigation skills and subjecting students to information overload (Hackling, 1994). Fensham (1988) believes that modern science courses have reduced the role of laboratory work to the enhancement of conceptual learning and have neglected opportunities for students to develop confidence and skills in applying scientific knowledge to solve problems. Criticism has also been leveled at the predominance of recipe-style laboratory work. Lloyd (1992) indicates that a structured or cookbook approach is still the overwhelming choice in laboratory manuals. These activities provide few opportunities to identify problems, formulate hypotheses or design experiments and procedures; insufficient discussion of limitations and underlying assumptions; and inadequate provision for discussion, analysis and consolidation. Merritt, Schneider and Darlington (1993) proposed a change in emphasis, with greater involvement of students in planning their experiments.

(b) Research findings about laboratory investigation skills
Research indicates that many school students perform poorly in laboratory investigations. A study by Hackling and Garnett (1993) showed that school students generally demonstrated a poor level of performance of skills relating to problem analysis and planning, carrying out controlled experiments, basing conclusions only on obtained data and recognizing limitations in their methodology. Roth and Roychoudhury (1993) assert that open-ended inquiry laboratories resulted in considerably improved skills in identifying variables, hypothesizing, planning and carrying out experiments and interpreting data.
(c) National curriculum trends
Science curricula in different countries included or restructured laboratory work during the 1990s. 'Inquiry' has long been a central term in the theories of past science education reforms in the United States of America (USA), and recent reforms (AAAS, 1999; NRC, 1996, cited in Abd-El-Khalick, 2004, p. 398) have placed extensive emphasis on scientific inquiry and the nature of science as cognitive outcomes. Students are expected to master a set of inquiry-related skills, such as devising valid research designs, as well as to develop understandings about inquiry (Abd-El-Khalick et al., 2004). The England National Curriculum introduced open investigation in 1989, which included planning, experimental procedure, obtaining evidence, analyzing evidence, drawing conclusions and evaluating evidence (Pekmez, 2005). In Australia, the Australian Education Council developed a set of national profiles and statements in 1994 that served as a common framework for curriculum development in eight learning areas, including science, in terms of content, concepts and processes (Abd-El-Khalick et al., 2004). The science curriculum has been organized in five strands: four conceptual strands and one process strand. The process strand 'working scientifically' focuses mainly on investigation skills and includes four organizers associated with investigations: planning investigations, conducting investigations, processing data and evaluating findings (Australian Education Council, 1994). It also includes two more organizers: using science and acting responsibly. Some states in Australia adopted the set of profiles and other states made significant changes to them (Goodrum, Hackling & Rennie, 2000, cited in Abd-El-Khalick et al., 2004). Open investigation has also been introduced in the science curricula of Lebanon, Israel, Venezuela and Taiwan (Abd-El-Khalick et al., 2004, cited in Rahman & Siddique, 2007).
2.2.4 Present scenario of secondary science education in Bangladesh

Bangladesh has introduced open investigation in its secondary science curriculum. Siddique and Rahman (2007) have supported this integration of open investigation in science practical work in secondary education in Bangladesh for three reasons: a) the criticism of the cookbook-
style approach, b) positive research findings about laboratory investigation skills and c) national curriculum trends. These trends indicate that open investigation has been introduced in the science curricula of Australia, England, Lebanon, Israel, Venezuela and Taiwan (Abd-El-Khalick et al., 2004).
2.2.5 Scale of Openness to Inquiry

When discussing formats used for laboratory activities, a scale that classifies activities according to their openness is helpful because it aids communication. Such a scale has been used in research (Hegarty-Hazel, 1986; Tamir, 1989) to classify laboratory formats. The scale was first devised by Schwab in 1962 and elaborated by Herron in 1971 to include level zero, the lowest level of inquiry (Tamir, 1989). Hegarty-Hazel (1986) further elaborated the scale by dividing level 2 into levels 2a and 2b to increase discrimination between levels of openness. The scale is used to determine the level of openness of a practical class. Answers to four basic questions determine this information:
a. How is the problem of an experiment decided?
b. How is the procedure of an experiment determined?
c. Who decides what apparatus needs to be used in the experiment?
d. How is the result of an experiment determined?
The scale of openness is presented below:

Table 1: Scale of openness

Level | Problem | Procedure | Apparatus | Result | Common Name
0  | Given | Given | Given | Given | Verification
1  | Given | Given | Given | Open  | Guided Inquiry
2a | Given | Open  | Given | Open  | Open Guided Inquiry
2b | Given | Open  | Open  | Open  | Open Guided Inquiry
3  | Open  | Open  | Open  | Open  | Open Inquiry
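To illustrate how the scale is read (a hypothetical example, not drawn from this study's data): a titration exercise in which the manual supplies the problem, the step-by-step procedure, the apparatus list and the expected result sits at level 0 (verification), whereas the same exercise with only the problem given, leaving students to devise the procedure, select the apparatus and arrive at their own result, sits at level 2b.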
2.3 Assessment of practical work

Since the central role of practical work in science education has been acknowledged, the assessment of students' performances in the laboratory has become a focal point of discussion. In the literature, researchers have certainly argued that practical science should be assessed (Abousief & Lee, 1965; Kelly & Lister, 1969; Tamir, 1972; Whittaker, 1974; Gunning & Johnstone, 1976; Stannard, 1982; Lynch, 1984; DES/APU, 1984); however, Hofstein and Lunetta (1982) considered the area of practical skills assessment 'a neglected aspect of research'. An important query has emerged from this neglect: what is the appropriate way to assess a student in open investigation? Traditionally, there are two common ways in which students are assessed in practical classes (Ganiel & Hofstein, 1982; Bryce & Robertson, 1985; Giddings & Hofstein, 1990; Tamir, Doran & Chye, 1992; Lazarowitz & Tamir, 1994; Lunetta, 1998; Hofstein, Kipnis & Shore, 2004). One of them is written evidence, meaning traditional written reports or paper-pencil tests. The other is one or more practical examinations.
2.3.1 Traditional ways of assessing practical work and its impact

Detailed descriptions of the assessment strategies that have been used to assess students' performances are given below:

Written evidence
The most traditional style of assessing practical work is based on written evidence, namely written reports and items on paper-and-pencil tests (Kruglak, 1958; Grobman, 1970; Doran, 1978; Giddings & Hofstein, 1980). However, written reports provide only limited information regarding students' behavior and performance during the practical exercise. The paper-pencil test is designed to assess students' knowledge and understanding of the use of experimental techniques and the principles underlying laboratory work and procedures. Unfortunately, this strategy is limited to the more theoretical components of laboratory work and therefore does not provide evidence for the more performance-type activities (Hofstein, 2004). It has been shown that the correlation between students' achievement in
practical tests and their achievement based on written evidence is rather low (Robinson, 1969; Buckley, 1970; Tamir, 1972; Ben Zvi, 1977). There is, therefore, a need to develop special measures to assess what a student actually does in the practical class.

Practical examinations
The practical examination is a common approach for assessing students. Practical examinations are used by several boards of examination and in final examinations in several countries. In these examinations, students are usually examined by external examiners rather than by their own teachers. According to Ganiel and Hofstein (1982), such practical examinations suffer from several drawbacks:
- In many cases, different examiners use different criteria to assess student performance. Practical examinations in which the assessment is not based on clearly defined criteria are not very useful, since their outcome will be greatly influenced by the personal preferences and biases of the particular examiner.
- Because of administrative constraints, practical examinations are administered to a large group of students simultaneously. Consequently, the examiner cannot concentrate on observing each student systematically, and has to rely in his assessment on the results of the experiment and on the written reports.
- Examinations are limited to those experiments that can be readily administered to students during a limited time period. This obviously restricts both the scope and validity of the assessment.
- Since such examinations are difficult to implement, they cannot be conducted very often. Consequently, the element of chance is rather dominant, which obviously increases the anxiety of students.
- The implementation of a practical examination is very time-consuming and difficult.
The drawbacks of the above-mentioned traditional assessment strategies indicate that a shift is necessary. There has been a shift towards formative assessment of practical abilities
conducted and monitored by teachers in an attempt to overcome these limitations and obstacles (Hofstein, 2004).
2.3.2 Using formative assessment to inform teaching-learning

In recent years there has been a movement towards the implementation of formative assessment, formalized to some degree in the United Kingdom (University of London, 1977). Formative assessment refers to assessment that is specifically intended to generate feedback on performance to improve and accelerate learning (Sadler, 1998). For formative assessment to be present in a classroom, three criteria must be met, based on the framework for formative assessment given by Black and Wiliam (1998).

1. The criteria for formative assessment must be clearly conveyed
Assessment criteria refer to what a student needs to demonstrate to achieve the learning outcomes; they state the minimum requirement expected of students. It is good practice to provide marking or grade criteria in a chart, where the expectations are set out at each level (University of Plymouth).

2. There must be an effective feedback culture in the classroom
The effectiveness of formative assessment depends on the quality of the feedback given by teachers to students about their learning (Hackling, 2004). Research by Butler (1988) demonstrates that feedback must be task-involving rather than ego-involving; that is, it should direct attention to the task rather than to the self or ego. According to Sadler's framework (1983, 1989, 2010), there are three important conditions for achieving effective feedback: (1) the student needs to have an idea of the objectives he or she has to achieve and the standard that is expected; (2) the feedback has to include a comparison between the actual level of the product/behavior and the expected level; and (3) the feedback must provide instructions or suggest actions that will enable the student to close the gap between their current performance and where they need to get to in order to meet the requirements of the task.
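To make these three conditions concrete, consider a hypothetical feedback comment (constructed for illustration, not taken from the study data): "Your aim was to find how temperature affects the rate of this reaction (the objective and expected standard); your report gives readings at only two temperatures, while a defensible trend needs several (the comparison between actual and expected level); take readings at three or four more temperatures and plot rate against temperature before drawing your conclusion (the action that closes the gap)."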
3. Self-assessment in the classroom
Self-assessment is more accurately defined as a process by which students 1) monitor and evaluate the quality of their thinking and behavior when learning and 2) identify strategies that improve their understanding and skills. That is, self-assessment occurs when students judge their own work in order to improve performance, identifying discrepancies between current and desired performance. This aspect of self-assessment aligns closely with standards-based education, which provides clear targets and criteria that can facilitate student self-assessment. Finally, self-assessment identifies further learning targets and instructional strategies students can apply to improve achievement.
2.3.3 Using rubrics for assessment of open investigation work

Rubrics provide a useful approach to involving students in collecting evidence to demonstrate their learning in open investigation work. Rubrics are devised by teachers, sometimes with students' involvement, and are used by students and teachers working within standards-based assessment schemes. Rubrics help students understand the learning outcome they are striving to achieve and the standards or steps along that learning journey. The provision of clearly expressed outcomes may encourage dialogue that helps students to understand the requirements for improved performance (Hackling, 2005). Students can use rubrics for self-assessment: self-evaluation exercises can help students reflect on their learning and develop an understanding of what they need to do to make progress. Rubrics also help teachers to be clear about standards or levels of achievement and can provide a framework to guide both formative and summative assessment (Hackling, 2004).
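For illustration only, a hypothetical excerpt from such a rubric (not taken from IER documents), covering the planning phase of an investigation, might look like this:

Criterion: Formulating a research question
Level 1 | States a topic but not a testable question
Level 2 | States a testable question but does not identify variables
Level 3 | States a testable question and identifies the independent and dependent variables
Level 4 | States a testable question, operationalizes the variables and predicts an outcome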
Chapter Three: Methodology

The study has been conducted to explore the current approach to science practical work and its assessment system at IER. It also explores the impact of the current nature of practical work and the assessment system on students, and the possible ways to improve the situation. The methods and techniques followed in this study are presented in this chapter, which covers the strategy of inquiry, research design, data sources, sample and sampling, data collection, instruments, data analysis and ethical considerations.
3.1 Strategy of Inquiry

The research questions of this study are:
1. What is the existing nature of science practical work at IER?
2. What is the existing nature of assessment in science practical work at IER?
3. How does the current nature of practical work impact students' learning?
4. How does the nature of assessment in science practical work impact students' learning?
5. How can the existing situation of practical work at IER be improved?

A strategy of inquiry provides specific direction for the procedures in a research design (Creswell, 2009). All five research questions are exploratory. The central phenomena of RQ 1 and RQ 2 are the nature of science practical work and the science practical assessment system. The central phenomena of RQ 3 and RQ 4 are the impact of the current nature of practical work and the assessment system on students. RQ 5 focuses on a central phenomenon of how to improve the existing situation of the science practical class and assessment system. Creswell (2012) states that one of the major characteristics of qualitative research is exploring a problem and developing a detailed understanding of a central phenomenon. I therefore chose a qualitative design for the present study, to explore these central phenomena and develop a thorough understanding of them.
3.2 Outline of Research Design

Figure 2: Outline of research design
Phase I: Interview with teachers
Phase II: Focus group discussion with students
Phase III: Development of preconceived codes and themes based on the available data, and preparation of key points to discuss during the teachers' follow-up FGD
Phase IV: Follow-up FGD with teachers
Phase I
During the first phase of my research, I developed a semi-structured interview schedule. Using this schedule, I gathered qualitative data from the SMTE teachers of IER. The schedule addressed all the research questions of my thesis, and the teachers' interviews were a rich source of information about the existing nature of practical work and its assessment system.

Phase II
The second phase of my research was to conduct focus group discussions with the students of the science stream of IER. I developed topics to be discussed during the focus group discussions based on the semi-structured interview schedule. These topics served as my focus group discussion guideline during the second phase. The guideline addressed the main focuses of my research questions.

Phase III
The third phase of my research was to identify preconceived codes in the available data and to develop key points to be discussed and confirmed. During this phase, I transcribed most of my qualitative data to identify the codes and possible themes. This thematic analysis provided the key points I had been looking for to discuss in the follow-up FGD with teachers.

Phase IV
The follow-up FGD with teachers was the fourth and last phase of my research design. I gathered all the teachers whom I had interviewed in the first phase and engaged them in a focus group discussion. The teachers clarified the responses they had given in their interviews and, together, reached a common understanding of the nature of practical work and the assessment system.
3.3 Data Sources

The detailed description of the data sources is given in the following table for a clear understanding:

Table 2: Data sources

Research Question | Document analysis | Semi-structured interview | FGD | Follow-up FGD
What is the existing nature of science practical work at IER? | Curriculum of Bachelor of Education | 7 SMTE teachers | 30 science stream students | 7 SMTE teachers
What is the existing nature of assessment in science practical work at IER? | Curriculum of Bachelor of Education | 7 SMTE teachers | 30 science stream students | 7 SMTE teachers
How does the current nature of practical work impact students' learning? | | 7 SMTE teachers | 30 science stream students | 7 SMTE teachers
How does the nature of assessment in science practical work impact students' learning? | | 7 SMTE teachers | 30 science stream students | 7 SMTE teachers
How can the existing situation of practical work at IER be improved? | | 7 SMTE teachers | 30 science stream students | 7 SMTE teachers
3.3.1 Document analysis
The existing curriculum of the Bachelor of Education (B.Ed.) has been analyzed primarily to get a clear view of the existing nature of the assessment system of science practical work at IER.
3.3.2 Teachers of SMTE at IER
The central phenomena of my research questions concern the nature and impact of the existing science practical work and assessment system of IER. Therefore, the teachers of SMTE who conduct science practical work were one of my data sources, as they can provide a distinct picture of the present situation.
3.3.3 Students of science stream at IER
Another significant data source for my study was the students of the science stream. I chose them as a data source because they have experienced the science practical work and assessment system at IER, and their perspectives help in understanding the situation from the students' side as well.
3.3.4 External of Practical Examination
Teachers from other departments are invited as externals during the viva-voce of the science practical assessment. The viva-voce is an element of the summative examination and thus carries a portion of the marks in science practical. It is essential to know the externals' perspectives and beliefs while they mark students in the viva, which is why I chose them as a data source for this study.
3.4 Sample and Sampling

For my research study, I chose 7 teachers from SMTE who conduct or have conducted practical classes. I used purposive sampling to select them for the interviews, as Johnson (1995) asserted that in purposive sampling (sometimes referred to as judgmental sampling) the researcher specifies the characteristics of a population of interest and then tries to locate individuals who have those characteristics.

Table 3: Sample and sampling of the study

Number of sample | Sampling technique
7 teachers from SMTE | Purposive sampling
5 to 6 students of science stream from each of the 17th, 18th, 19th and 20th batches | Convenience sampling
Another data source of my research was the students of the science stream. For this, I chose 5 to 6 students from each of the 17th, 18th, 19th and 20th batches. The students were selected by convenience sampling to participate in the focus group discussions, as Johnson and Christensen (2012) state that researchers use convenience sampling when they include in their sample people who are available, volunteer, or can be easily recruited and are willing to participate in the research study. That is why I selected individuals who were willing to participate in this study.
3.5 Data collection

Interviews and focus group discussions have been used as the data collection processes for this research study. First, interviews were conducted with 7 teachers of the SMTE department to obtain answers for research questions 2 to 5. According to Patton (1987), the qualitative interview allows a researcher to enter into the inner world of another person and gain an understanding of that person's perspective. Second, students from 4 different batches participated in 5 separate FGDs to provide information on research questions 2 to 5 as well. Lastly, the teachers of the SMTE department who participated in the interviews were gathered once again in a follow-up FGD to come to a common understanding on research questions 1 to 5. According to Johnson and Christensen (2008), the focus group discussion (FGD) is useful for providing in-depth information in a relatively short period of time. Externals of the practical examination were interviewed to provide data about their assessment during the viva-voce. A semi-structured interview schedule and focus group discussion guidelines were used as instruments to collect data on the five research questions. Detailed descriptions of these instruments are given below:
3.6 Instruments
3.6.1 Semi-structured interview schedule
To answer research questions 1 to 5, I conducted semi-structured interviews with the SMTE teachers who conduct science practical classes. I used a semi-structured interview schedule to obtain in-depth data from the teachers. The first part of the schedule was developed based on the Planning and Report Sheet for Science Investigations developed by Mark W. Hackling. This helped me explore the nature of science practical work at IER while interviewing. The second part was based on the assessment of practical work and explored the nature of formative and summative assessment in science practical classes. Black and Wiliam's (1998) framework for formative assessment was used to explore the nature of formative assessment, while the components of summative assessment were identified by Mark W. Hackling. The third part focused on exploring the impact of the current nature of practical work and the assessment system of IER. The fourth part focused on teachers' reflections, prompting their ideas about what could be done to improve the situation of science practical at IER. The questions of the semi-structured interview were both closed and open ended, which allowed me to explore the situation thoroughly. According to Creswell (2008), the interview is suited to exploring in-depth views.
3.6.2 Focus Group Discussion guidelines
To answer research questions 1 to 5, I conducted 5 separate focus group discussions with 5 to 6 students from each of the 17th, 18th, 19th and 20th batches; the 18th batch was divided into two groups: a non-SMTE group and an SMTE group. The research questions demanded eliciting students' experiences of the science practical work and assessment system and their ideas on how to improve the existing situation. According to Laws (2013), the focus group discussion is undoubtedly valuable when in-depth information is needed about how people think about an issue, their reasoning about why things are as they are, and why they hold the views they do.
Table 4: The relation between research questions and instruments in this study

Research Questions:
1. What is the existing nature of the science practical work class at IER?
2. What is the existing nature of assessment in science practical work at IER?
3. How does the existing nature of practical work impact students' learning?
4. How does the existing nature of assessment in science practical work impact students' learning?
5. How can the existing situation of practical work at IER be improved?

Instruments: semi-structured interview of teachers; focus group discussion of students; follow-up FGD of teachers.
3.7 Data Analysis
The data analysis technique for my research study was thematic. Although there is no definite set of steps for this process, I used the steps recommended by Tesch (1990) and Creswell (2007):
1. Firstly, I listened to all the recordings to get a sense of the whole.
2. Secondly, I transcribed the recordings one by one as accurately as I could, omitting and editing portions of the text where necessary during transcription.
3. Thirdly, I read through the transcriptions quite a few times to divide the texts into segments of information.
4. Fourthly, I labeled the segments of information with codes. In this way, I arrived at 10-12 codes.
5. Lastly, the codes were collapsed into preconceived themes. The themes were determined based on my research questions.
These are the steps I followed in the data analysis phase.
3.8 Ethical Consideration
Permission was taken from each teacher and student before conducting the interviews and focus group discussions. Teachers and students were informed about the purpose of my study before being involved as samples. According to Creswell (2012), to gain support from participants, you need to convey to them that they are participating in a study and inform them of its purpose. Total confidentiality and anonymity were ensured in presenting data in the data analysis section. The criteria of the American Anthropological Association (see Glesne & Peshkin, 1992) reflect appropriate standards; for example, researchers need to protect the anonymity of the participants by assigning numbers or aliases to them in the process of analyzing and reporting data. As my supervisor was one of my samples, I took care to ensure that his comments were not biased during the interview and the follow-up FGD.
Chapter Four: Results
This chapter presents data from different sources collected during the data collection phase to answer my five research questions. The foci of these five research questions are as follows: the nature of science practical work, the nature of the science practical assessment system, the impact of the existing nature of practical work and assessment practice on students, and possible ways to improve the existing situation of science practical work and its assessment system at IER. Document analysis was done first to get acquainted with the existing situation of the science practical assessment system. After that, I collected data from teachers of SMTE, students of the science stream and externals of the practical examination. Teachers from the Science, Mathematics & Technology Education (SMTE) department at the Institute of Education and Research (IER) were interviewed using a semi-structured interview schedule; 7 teachers who conduct (or have conducted) practical classes at IER participated willingly in the study. The semi-structured interview schedule had five major components: the nature of science practical work at IER, the nature of its assessment system, incorporating formative assessment into summative assessment, the impact of the current system, and the views of teachers about the current trend of the science practical assessment system. The results are organized according to the major parts of that semi-structured interview schedule. The first component, the nature of science practical work, has four major subsections: planning, experimenting, data analysis and evaluating.
4.1 Document Analysis
For the purpose of exploring the nature and assessment of science practical at IER, I analyzed the curriculum document. The experiments for all the science subjects are prescribed in the curriculum, and the marks distribution of the science courses with practical is as follows:

In-course examination (best one out of two) = 10
Tutorial/Assignment/Project = 05
Practical = 25
Course final examination = 60
Total = 100
(the Curriculum Committee, 2006)
4.2 Nature of science practical work in the classroom
This section is organized according to the research design. At the beginning, I present findings from the teachers' interviews, based on four themes: problem, procedure, apparatus and results. I made the decision about the nature of the practical work based on these four elements. Secondly, this section shows the findings from the students' FGD responses, organized around the same themes. Thirdly, findings from the follow-up FGD are presented. A table is inserted after each section to summarize the responses from my data sources.

a. Teachers' interview responses on the nature of science practical work
The first research question of the study is to explore the nature of science practical classes at IER. To determine the nature of the practical classes, I marked four basic elements which could determine the level of openness in the practical work: problem identification, procedure, apparatus and result. I used two instruments to collect data about these elements: the semi-structured interview schedule (for teachers) and the FGD guidelines (for students and teachers both). The following section presents data about these major components of practical work.

i. Problem Identification
The teachers of the SMTE department were asked: "Who chooses/decides the problem/question of the investigation?" Their replies and the justifications of their answers are presented in the table below.
Table 5: Who chooses the problem of the investigation?

Teacher | Response | Remarks
A | They are already given in the curriculum. | Set by curriculum
B | "The problems are given in the curriculum. But we add to the list of problems." | Set by curriculum; sometimes the teacher adds something
C | "There was no chance for students to choose the problem." | Set by curriculum
D | "The problems of the investigation is decided by the curriculum." | Set by curriculum
E | "Curriculum provides us (teachers) with a number of experiments. I select a few among them." | Set by curriculum
F | "Problems are provided by the curriculum." | Set by curriculum
G | "Topics are fixed in the curriculum / topics are given in the curriculum." | Set by curriculum
ii. Procedure
Teachers of the SMTE department were asked: "Who decides the procedure of the investigation, the teacher or the students?" Their responses are presented in the table below with justification.

Table 6: Who chooses the procedure of the investigation?

Teacher | Response | Remarks
A | Students are given photocopies from the book after being briefed about the practical class at the very beginning. Then they are asked to read that material before the practical. | Given
B | "I try to keep it open sometimes if the students are interested." | Given/open (sometimes)
C | "Procedure was given to the students." | Given
D | "I briefed the matter in the classroom before their field work." | Given
E | "I provide them (students) with written document before the class." | Given
F | "I try to involve the students in the experiment but if I see they are having problems, I do it myself." | Given
G | "We have spotting and work-out; in that case, I demonstrate the work to them and instruct them." | Given
iii. Apparatus
Teachers were also asked about the apparatus: "Who decides the materials and equipment needed for the experiment?" Their answers are presented in the table below.

Table 7: Who decides the materials and equipment needed for the experiment?

Teacher | Response | Remarks
A | "Students are given photocopies from the book after briefing them about practical class at the very beginning. Then they are asked to read that material before the practical." | Given
B | "The equipment are set by us (teacher and lab assistant) before the practical class." | Given
C | "List of materials and equipment needed for the experiment was included in the procedure." | Given
D | "Apparatus is given." | Given
E | "They have a box of instruments. It's called 'dissection box'. They collect the appropriate instrument from the box." | Given
F | "I told my students what they might need in my first class." | Given
G | "I provide the students with the list of materials/equipment. The whole process is teacher guided. There is no chance to do something differently on our own." | Given
iv. Results
As with the problem, procedure and apparatus, the decisions about which are mostly taken by the course teacher at IER, it was found that the results of the experiments are given in all cases.

Table 8: How is the result determined?

Teacher | Response | Remarks
A | "The main job of the students is to verify. We have no chance to be open here." | Given
B | "I don't think open investigation approach is applicable to tertiary level. I try to keep that option open, if students are interested." | Given
C | "Most of the time, students knew what they are doing. Suppose they are identifying the functional group in organic chemistry. So they already know which functional group they are looking for." | Given
D | "They collect sample and show me. If they can't identify the sample, I help them or ask them to take help from their Botany teacher." | Given
E | "I told them (students) what to observe." | Given
F | "They have read the theory so they can easily identify. If they can't, I gave them a clue. After that they can easily identify the specimen." | Given
G | "I ask them to observe, then if they don't find the desired observable data, I tell them what to look for." | Given
Table 9: Summary of teachers' interview responses on the nature of science practical work

Teacher | Problem | Procedure | Apparatus | Result | Decision about the Mode/Nature
A | Given | Given | Given | Given | Recipe
B | Given | Given | Given | Given | Recipe
C | Given | Given | Given | Given | Recipe
D | Given | Given (not for field work) | Given | Given/predetermined | Recipe
E | Given | Given | Given | Given/predetermined | Recipe
F | Given | Given | Given | Given/predetermined | Recipe
G | Given | Given | Given | Given | Recipe
b. Students' FGD responses on the nature of science practical work
Students of IER were engaged in focus group discussions. For this study, five FGDs were conducted with four ongoing batches of IER: the 17th, 18th, 19th and 20th, with the 18th batch divided into two groups of students. They were asked a few questions, based on the semi-structured interview schedule, which address the main concerns of the research questions. Students were also asked about the approach of the science practical classes at IER in terms of the same four elements (problem, procedure, apparatus and results). Their responses are summarized in Table 10:
Table 10: Summary of students' FGD responses on the nature of science practical work

Team | Problem | Procedure | Apparatus | Result | Decision about the Mode/Nature
1 | Given | Given | Given | Given | Recipe
2 | Given | Given | Given | Given | Recipe
3 | Given | Given | Given | Given | Recipe
4 | Given | Given (not for field work) | Given | Given/predetermined | Recipe
5 | Given | Given | Given | Given | Recipe
c. Teachers' follow-up FGD responses on the approach of science practical work
After interviewing the teachers of the Science, Mathematics & Technology Education (SMTE) department, a follow-up FGD was conducted in the third phase of the research study so that the teachers could reach a common understanding among themselves. The second question of the follow-up FGD was: "What is the current mode of science practical classes at IER?" All the teachers agreed that the ongoing trend of the science practical class follows the recipe style. Teachers A, C, D, E, F and G agreed that the experiments are mostly at the verification level. However, Teacher B disagreed. He concluded: "If possible, if students are interested, I keep that option open. I ask my students to try to do an experiment differently."

d. Findings about the nature of science practical work from the data stated above
The findings of this section answer the first research question.

Findings from teachers' interviews
By interviewing the SMTE teachers following this detailed interview schedule, it was found that the current trend of science practical classes at IER is mostly recipe style.
Findings from teachers' follow-up FGD
In the teachers' follow-up FGD, the teachers agreed on the point that IER's current approach to practical work is basically the recipe mode.

Findings from students' FGDs
Students from the 5 FGDs agreed that the current approach to science practical work is at the verification level.
4.3 Nature of formative assessment system in science practical classroom

a. Teachers' interview responses on formative assessment in science practical classroom
In this section, teachers' interview responses regarding formative assessment in the classroom are presented in the following table.

1. Is formative assessment present in science practical classroom?
4 teachers out of 7 answered positively that they assess the students formatively in the classroom. The remaining 3 teachers reported that they do not assess formatively because there are no marks assigned for formative assessment in the curriculum.

Table 11: Is formative assessment present in science practical classroom?

Teacher | Response | Justification for the answer
A | YES | "I have set aside 5 marks for performance."
B | YES | "If they (students) repeat the mistakes at the end of the class of the semester, I mark them negatively."
C | YES | "It was a big issue in my class. But unfortunately, we have no marks set aside for classroom performance. But, still, I used to observe them closely."
D | NO | —
E | YES | "I mark them as I have seen them in my practical classes. I do not keep notes."
F | NO | —
G | NO | —
The teachers who answered YES to the question above were then asked questions i, ii and iii.

Table 12: i. Are the criteria of the formative assessment clearly conveyed?

Teacher | Response | Remarks | Decision based on the remarks
A | "There is no formal way for this. We impart this information orally." | Criteria for formative assessment are conveyed orally | Assessment criteria for formative assessment are not explicit
B | "I tell the students in the classroom about the formative assessment. They know about this." | Criteria for formative assessment are conveyed orally | Assessment criteria for formative assessment are not explicit
C | "There was no written form of the criteria. The students were informed orally." | Criteria for formative assessment are conveyed orally | Assessment criteria for formative assessment are not explicit
E | "Students know the criteria. I inform them about it orally." | Criteria for formative assessment are conveyed orally | Assessment criteria for formative assessment are not explicit
ii. What is teachers' feedback culture in science practical classes?
Condition 1: The student needs to be informed about the criteria.
Condition 2: The feedback has to include a comparison between the actual level of the product and the expected level.
Condition 3: The feedback must provide instructions that will enable the student to close the gap between the actual and expected levels.
Table 13: What is teachers' feedback culture in science practical classes?

Teacher | Condition 1 | Condition 2 | Condition 3 | Remarks
A | × | × | ✓ | Not an effective feedback culture
B | × | × | ✓ | Not an effective feedback culture
C | × | × | ✓ | Not an effective feedback culture
D | × | × | ✓ | Not an effective feedback culture
E | × | × | ✓ | Not an effective feedback culture
F | × | × | ✓ | Not an effective feedback culture
G | × | × | ✓ | Not an effective feedback culture
iii. Is there a scope for self-assessment in science practical classroom?

Table 14: Is there a scope for self-assessment in science practical classroom?

Teacher | Remarks
A-G (all seven) | There is no scope for self-assessment in the science practical classroom
Table 15: Summary of teachers' interview responses on formative assessment in practical class
The questions asked to the teachers during the interview on the issue of formative assessment are listed below:
1: Is formative assessment present in practical classroom?
i: Are the assessment criteria for formative assessment clearly conveyed?
ii: What is the feedback culture in science practical classes at IER?
iii: Is there a scope for self-assessment in science practical classes?

Teacher | 1 | i | ii | iii
A | YES | Criteria for formative assessment are conveyed orally | Not an effective feedback culture | No scope for self-assessment in science practical classroom
B | YES | Criteria for formative assessment are conveyed orally | Not an effective feedback culture | No scope for self-assessment in science practical classroom
C | YES | Criteria for formative assessment are conveyed orally | Not an effective feedback culture | No scope for self-assessment in science practical classroom
D | NO | — | Not an effective feedback culture | —
E | YES | Criteria for formative assessment are conveyed orally | Not an effective feedback culture | No scope for self-assessment in science practical classroom
F | NO | — | Not an effective feedback culture | —
G | NO | — | Not an effective feedback culture | —
b. Students' FGD responses on formative assessment in science practical class
Students of IER from 4 different batches participated in 5 FGDs. They were asked the same questions about formative assessment in the science practical classroom that the researcher had asked the teachers during the interviews:
1: Is formative assessment present in practical classroom?
i: Are the assessment criteria for formative assessment clearly conveyed?
ii: What is the feedback culture in science practical classes at IER?
iii: Is there a scope for self-assessment in science practical classes?
Two tables describe the findings of the students' FGD responses. Table 16 presents the findings for question 1 (by subject) and the related questions i and iii; Table 17 presents the findings for question ii for each science subject.

Table 16: Is formative assessment present in practical classroom? And is there a scope for self-assessment in science practical classes?

Team | 1 (Phy) | 1 (Chem) | 1 (Zoo) | 1 (Bot) | i | iii
1 | YES | NO | NO | NO | No | No
2 | NO | NO | NO | NO | No | No
3 | NO | NO | NO | NO | No | No
4 | NO | NO | NO | NO | No | No
5 | NO | NO | NO | NO | No | No
Table 17: What is the feedback culture in science practical classes at IER?

Team | Physics (C1 C2 C3) | Chemistry (C1 C2 C3) | Zoology (C1 C2 C3) | Botany (C1 C2 C3)
1 | × × ✓ | × × ✓ | × × ✓ | × × ✓
2 | × × ✓ | × × ✓ | × × ✓ | × × ✓
3 | × × ✓ | × × ✓ | × × ✓ | × × ✓
4 | × × ✓ | × × ✓ | × × ✓ | × × ✓
5 | × × ✓ | × × ✓ | × × ✓ | × × ✓
c. Teachers' follow-up FGD on formative assessment in science practical class
Teachers participated in a follow-up FGD in which formative assessment in the practical classroom was addressed. The issues discussed and the uniform decisions that emerged from the follow-up FGD are summarized in the table below:

Table 18: Summary of teachers' follow-up FGD responses on formative assessment
Participants: 7 SMTE teachers

Issue addressed | Uniform decision
Is formative assessment present in practical classroom? | Yes, but informally
Is formative assessment needed in practical classroom? | All the teachers agreed that formative assessment is necessary
What should be the criteria for formative assessment? | Criteria may vary according to the nature of the subject; the criteria will be decided among teachers before the exam, and students will be notified about them before the exam
Marks assigned for formative assessment | 5 marks can be set aside for classroom performance/formative assessment
What is the framework for incorporating formative assessment into summative assessment? | This decision has not been made, but the teachers agreed that they can reach it by mutual understanding among themselves
How to practice an effective feedback culture in science practical classrooms at IER? | Rubrics can be used for students' self-assessment so that they can monitor their own progress
d. Findings about the formative assessment in science practical class

Findings from teachers' interviews
By interviewing the SMTE teachers following this detailed interview schedule, it was found that:
- 4 out of 7 teachers (A, B, C & E) said that formative assessment is present in science practical classes;
- These teachers let the students know about the assessment criteria orally; there is no written form of the assessment criteria;
- The feedback culture in the IER science practical class is not effective;
- There is no scope for self-assessment in the science practical class.

Findings from students' FGDs
Students from the 5 FGDs agreed that:
- There is no formative assessment in the science practical classroom at IER;
- Students are not formally informed about the criteria of formative assessment;
- Teachers do instruct them on how to close the gap between the actual and expected levels;
- They have no scope for self-assessment.

Findings from teachers' follow-up FGD
In the teachers' follow-up FGD, the teachers agreed on the following points:
- Formative assessment is informally present in science practical classes at IER;
- Formative assessment is important, and IER's next curriculum should introduce formative assessment in science practical classes;
- 5 marks can be set aside for formative assessment in science practical classes;
- The criteria of formative assessment may vary according to the nature of the subject; these criteria can be decided by course teachers before exams and should be imparted to students;
- Rubrics could be a good tool for students' self-assessment and would also help teachers convey the assessment criteria clearly.
4.4 Nature of summative assessment in science practical classroom
I identified three sources of evidence for summative assessment: the paper-pencil test, the laboratory report and the viva.

Paper-pencil test
During the paper-pencil test of the science practical examination, teachers observe students' performances as well as evaluate the written documents. My target was to determine the criteria applied during this paper-pencil test, since it accounts for 15 marks out of 25.

a. Teachers' interview responses on observation during practical examination
The data obtained from the teachers' interviews are presented below:

i. What are the assessment criteria while observing students in the practical exam?
Teachers were asked this question during the individual interviews. They specified different criteria that they intend to observe during the science practical exam.

Table 19: What are the assessment criteria while observing students in the practical exam?

Teacher | Responses
A | Reliability of data; did the students collect the data themselves?
B | Science process skills
C | Group collaboration, neatness, how the procedure is being followed
D | Can the students dissect the specimen properly and show the parts? Can the students identify the specimen and write the identifying characteristics?
E | Sincerity, neatness and smartness of presentation, individuality
F | Safety caution, individuality
G | Individuality
ii. a) Are the students aware of the assessment criteria? b) If yes, how do you let them know?
All the teachers were asked to specify two things: whether students know about the assessment criteria of the science practical exam, and if so, how the teachers inform them about the criteria.

Table 20: Are the students aware of the assessment criteria? And if yes, how do you let them know?

Teacher | ii. a | ii. b | Remarks
A | YES | There is no formal way of letting them know | There is no explicit way
B | YES | Students already know these | There is no explicit way
C | Has not specified | — | There is no explicit way
D | YES | Orally | There is no explicit way
E | YES | Orally | There is no explicit way
F | YES | There is no formal way | There is no explicit way
G | YES | There is no formal way | There is no explicit way
Table 21: Summary of teachers' interview responses on paper-pencil test
The questions asked to the teachers during the interview on the issue of the paper-pencil test are listed below:
i. What are the assessment criteria while observing students in the practical exam?
ii. a. Are the students aware of the assessment criteria? b. If yes, how do you let them know?

Teacher | i | ii. a | ii. b | Remarks
A | Reliability of data; did the students collect the data themselves? | YES | There is no formal way of letting them know | There is no explicit way to let the students know about the assessment criteria of the paper-pencil test
B | Science process skills | YES | Students already know these | There is no explicit way to let the students know about the assessment criteria of the paper-pencil test
C | Group collaboration, neatness, how the procedure is being followed | Has not specified | — | There is no explicit way to let the students know about the assessment criteria of the paper-pencil test
D | Can the students dissect the specimen properly and show the parts? Can the students identify the specimen and write the identifying characteristics? | YES | Orally | There is no explicit way to let the students know about the assessment criteria of the paper-pencil test
E | Sincerity, neatness and smartness of presentation, individuality | YES | Orally | There is no explicit way to let the students know about the assessment criteria of the paper-pencil test
F | Safety caution, individuality | YES | There is no formal way | There is no explicit way to let the students know about the assessment criteria of the paper-pencil test
G | Individuality | YES | There is no formal way | There is no explicit way to let the students know about the assessment criteria of the paper-pencil test
b. Students' FGD responses on paper-pencil test
Students of the science stream from four different batches participated in five FGDs. They were asked the same questions about observation during the practical examination that the researcher had asked the teachers during the interviews. Their responses are presented in the table below:

Table 22: Summary of students' FGD responses on paper-pencil test
Q. 1: What does the teacher assess in the paper-pencil test?
Team | Physics | Chemistry | Zoology | Botany
1 | Whether we can do the experiment by ourselves, whether we took the readings ourselves, whether we wrote the report following the given framework, etc. | Result of the experiment | Whether we can dissect the specimen and show the different parts to the teacher | Whether we can identify the plants, and whether we can dissect the plant/flower and show the different parts
2 | Whether we can do the experiment by ourselves and whether we took the readings ourselves | Not sure | Whether we can dissect the specimen and show the different parts to the teacher | Whether we can identify the plants, and whether we can dissect the plant/flower and show the different parts
3 | Skill of handling a machine, connecting the circuits properly, etc. (assumed) | Teacher used to get annoyed if equipment got broken | Dissection skill/other skills; whether the students can perform an experiment and present it in the exam copy accordingly | How perfectly a student can set a slide
4 | 1. Can operate the machines properly (if any machines are involved); 2. Knows how to collect data; 3. Can tell the reasons for the errors (the criteria are assumed by the students) | Not sure | Need to clarify | —
5 | Not sure | Not sure | They have not specified the criteria | Individual performance is assessed, but what is assessed by the teacher is not clear
Table 23: Q. 2: Are you notified about the assessment criteria before the test? Q. 3: How does the teacher let you know them?

Team | Physics Q.2 | Physics Q.3 | Chemistry Q.2 | Chemistry Q.3 | Zoology Q.2 | Zoology Q.3 | Botany Q.2 | Botany Q.3
1 | YES | W+O | NO | — | NO | — | NO | —
2 | NO | — | NO | — | YES | O | YES | O+W
3 | NF | O | NO | — | YES | O | YES | O+W
4 | NO | — | NO | — | YES | O | YES | O+W
5 | NO | — | NO | — | YES | O | YES | O+W
* W means written; O means orally; NF means not formally

c. Teachers' follow-up FGD responses on paper-pencil test
Teachers were asked to clarify the issues related to observation during practical exams. The main issues were: what should the assessment criteria be during the practical examination, and what could be the way to make the assessment criteria explicit. The teachers' decisions about these issues are presented in the table below:

Table 24: Summary of teachers' follow-up FGD responses on paper-pencil test
Participants: 7 SMTE teachers

Issue addressed | Uniform decision
What are the assessment criteria to be assessed during the paper-pencil test of a practical examination? | This may vary from teacher to teacher according to the nature of the subject, but the students must know the criteria beforehand; a uniform framework has to be established.
What could be the way to make the assessment criteria explicit? | Rubrics can be a way to make the criteria explicit; making the criteria explicit will decrease the level of bias in the practical examination. However, one teacher disagreed, saying that this is a closed method and that we need to be open about some things.
d. Findings on paper-pencil test

Findings from teachers' interviews
By interviewing the SMTE teachers following this detailed interview schedule, it was found that:
- Teachers have different criteria which they observe during the practical examination;
- There is no formal way of letting the students know about the criteria of the practical examination.

Findings from students' FGDs
Students from the 5 FGDs agreed that:
- During a paper-pencil test, a teacher mainly observes whether or not a student can carry out an experiment;
- Botany's assessment criteria for the paper-pencil test are explicit, whereas in the cases of Physics, Chemistry and Zoology, students are not provided with explicit guidelines for the paper-pencil test.

Findings from teachers' follow-up FGD
In the teachers' follow-up FGD, the teachers agreed that:
- The criteria of the practical examination may vary from teacher to teacher, but the students must know about the criteria beforehand;
- Rubrics can be helpful in bringing explicitness to the paper-pencil test of the practical examination.

Practical report writing
a. Teachers' interview responses on practical report writing
Firstly, the teachers were asked about the marks assigned for report writing in practical; 6 out of 7 teachers answered in the interview that they assign 5 marks for practical report writing. Secondly, the teachers were asked about the criteria of the lab report. Thirdly, they were asked about the ways in which they let the students know about the assessment criteria.

i. What are the criteria of assessing practical lab report?
Teachers mentioned different frameworks for their practical lab reports. Their responses are presented below:
Table 25: What are the criteria of assessing practical lab report?

Teacher | Responses
A | Name of the experiment, summary of the theory, apparatus, calculation with error calculation, result & discussion of the result
B | Name of the experiment, theory, calculation, interpretation & safety cautions
C | No final report, but a daily report on what they did in the lab that has to be signed by the teacher at the end of every class; that would be counted as formative assessment of the students
D | Referencing; other components depend on the topic
E | Proper identifying characteristics and figure/picture quality
F | Figure/picture quality, labeling & scientific name
G | Figure/picture quality, labeling & scientific name
ii. How do you assign a number to the lab report?
Teachers were asked this question to clarify the framework for dividing the 5 marks of the lab report.

iii. a) Are the students aware of the assessment criteria? b) How do the teachers let the students know about the assessment criteria?
All the teachers were asked these two questions during the interview. a) The teachers answered positively to the first question. b) 5 teachers out of 7 specified that they let the students know about the lab report assessment criteria orally; Teacher A said that the criteria are written on the board in the practical classroom, while Teacher G's answer is not clear on this issue. The responses are presented in the table below:

Table 26: How do you assign a number to the lab report? And how are the students notified?

Teacher | Response | Remarks | Decision based on remarks
A | "It's written on the board of the classroom." | Written | Explicit guideline
B | "I discuss it at the beginning of practical class of each semester." | Orally | No explicit guideline
C | Has not specified clearly | Orally | No explicit guideline
D | "5 marks is not divided in that way." | No fixed framework | No explicit guideline
E | "It's hard to break this 5. I give them 3 for writing identifying characteristics and 2 for drawing a perfect figure/picture. They know the framework." | Orally | No explicit guideline
F | "I repeat the criteria in the practical classes frequently." | Orally | No explicit guideline
G | "They know the criteria from their Intermediate level." | Not clear | No explicit guideline
Table 27: Summary of teachers' responses on practical report writing issue
i. What are the criteria for assessing the lab report?
ii. How do you assign a number to the lab report?
iii. a. Are the students aware of the marking scheme? b. If YES, how are they notified about the assessment criteria?

Teacher | i | ii | iii
A | Name of the experiment, summary of the theory, apparatus, calculation with error calculation, result & discussion of the result | Fixed framework | Written
B | Name of the experiment, theory, calculation, interpretation & safety cautions | No fixed framework | Orally
C | No final report, but a daily report signed by the teacher at the end of every class, counted as formative assessment | No fixed framework | Orally
D | Referencing; other components depend on the topic | No fixed framework | No framework
E | Proper identifying characteristics and figure/picture quality | Fixed framework | Orally
F | Figure/picture quality, labeling & scientific name | No fixed framework | Orally
G | Figure/picture quality, labeling & scientific name | No fixed framework | Orally
b. Students' FGD responses on practical report writing
Students of the science stream from four different batches participated in five FGDs. They were asked the same questions about the practical lab report that the researcher had asked the teachers during the interviews.

Table 28: Physics lab report

Team | What is assessed through lab report? | How does teacher assign the numbers? | Are the students aware of the marking scheme? | If YES, how are they notified about the assessment criteria?
1 | No idea | No idea | No | —
2 | Not sure | Not sure | No | —
3 | Know, but not explicitly; mentioned the importance of the report | No idea | No | —
4 | No idea | Based on result | No | —
5 | Not sure | Probably based on the mark in the paper-pencil test | No | —
Table 29: Chemistry lab report

Team | What is assessed through lab report? | How does teacher assign the numbers? | Are the students aware of the marking scheme? | If YES, how are they notified about the assessment criteria?
1 | Not sure (they had to write how they conducted the experiment and what changes they made) | Not sure; the teacher did not specify | No (they knew that the lab report carries 5 marks out of 25) | —
2 | They copied the procedure from the book | Have no idea | No (they knew that the lab report carries 5 marks out of 25) | —
3 | Not sure | Have no idea | No (they knew that the lab report carries 5 marks out of 25) | —
4 | Not sure | Have no idea | No (they knew that the lab report carries 5 marks out of 25) | —
5 | Not sure | Have no idea | No (they knew that the lab report carries 5 marks out of 25) | —
Table 30: Zoology lab report

Team | What is assessed through lab report? | How does teacher assign the numbers? | Are the students aware of the marking scheme? | If YES, how are they notified about the assessment criteria?
1 | Not sure, but the teacher emphasized being neat and clean | They did not get the split marking, so they have no idea about this | No | —
2 | Not sure | They did not get the split marking, so they have no idea about this | No | —
3 | Not sure | Marks distribution is not clear/not given | No | —
4 | Not sure | They did not get the split marking, so they have no idea about this | No | —
5 | Group-based work; depends on the quality of the work; no fixed criteria | Marks distribution is not given | No | —
Table 31: Botany lab report

Team | What is assessed through lab report? | How does teacher assign the numbers? | Are the students aware of the marking scheme? | If YES, how are they notified about the assessment criteria?
1 | Not sure, but the teacher emphasized being neat and clean | Have no idea | No | —
2 | Figure/picture quality was emphasized | Have no idea | No | —
3 | Not sure, but the teacher emphasized the quality of the figure/picture | Have no idea | No | —
4 | Figure/picture quality was emphasized | Have no idea | No | —
5 | Figure/picture quality was emphasized | They have not mentioned a framework but reported that they face no problem regarding the Botany practical copy | Yes | Orally; the teacher instructs them frequently and gives them qualitative feedback
Table 32: Summary of students' FGD responses on practical report writing issue
i. What are the criteria for assessing the lab report?
ii. How does the teacher assign a number to the lab report?
iii. a. Are the students aware of the marking scheme? b. If YES, how are they notified about the assessment criteria?

Team | Physics (i / ii / iii) | Chemistry (i / ii / iii) | Zoology (i / ii / iii) | Botany (i / ii / iii)
1 | No idea / No idea / No | Not sure / Not sure / No | Neatness / No idea / No | Neatness / No idea / No
2 | No idea / No idea / No | Not sure / No idea / No | Not sure / No idea / No | Not sure / No idea / No
3 | Know, but not explicitly / No idea / No | Not sure / No idea / No | Not sure / Not given / No | Quality of figure/picture / No idea / No
4 | No idea / No idea / No | Not sure / No idea / No | Not sure / Not given / No | Quality of figure/picture / No idea / No
5 | Not sure / No idea / No | Not sure / No idea / No | Depends on the quality, no fixed criteria / Not given / No | Quality of figure/picture / They face no problem / Yes
The table above summarizes the students' FGD responses on the practical lab report topic. According to the students, they are not explicitly informed about the criteria of a practical lab report. In Physics and Chemistry, they copy the procedure of an experiment from a textbook provided by the teacher. In Zoology and Botany, teachers emphasize the quality of the figures drawn by the students.
c. Teachers' follow-up FGD responses on practical report writing

Table 33: Teachers' follow-up FGD responses on practical report writing
Participants: 7 SMTE teachers

Issue addressed | Uniform decision
What are the criteria that need to be assessed in the lab report? | This may vary from teacher to teacher; teachers can come to a uniform decision on this issue through a meeting. Report writing can be considered a part of formative assessment. However, Teacher C said that he does not like the concept of a final report; he prefers students to write reports regularly and submit them, without needing to write the report in an organized fashion.
Can rubrics be used to let the students know about the criteria? | The teachers agreed on this point; they believe the assessment will then be easy and fair.
d. Findings about practical lab report

Findings from teachers' interviews
By interviewing the SMTE teachers following this detailed interview schedule, it was found that:
- In all cases, teachers have different criteria for assessing the practical lab report;
- In one case, Teacher A, there is a fixed framework for assessing laboratory reports; the other teachers have no explicit guideline for this.

Findings from students' FGDs
Students from the 5 FGDs agreed that:
- In the cases of the Physics and Chemistry lab reports, students are less informed about the assessment criteria and marking scheme than for Zoology and Botany;
- In all cases, students have no idea about how teachers assign a number to the practical lab report;
- In all cases, students are not provided with explicit criteria for assessing the practical lab report.
Findings from teachers' follow-up FGD
In the teachers' follow-up FGD, the teachers agreed on the following points:
- The criteria for assessing the practical lab report may vary from teacher to teacher; teachers can come to a uniform decision on this issue through a meeting;
- Report writing can be considered a part of formative assessment;
- Rubrics can be used to let students know about the criteria and marking scheme of the lab report.
Viva:
a. External teachers' interview responses on the issue of viva
The marks of the viva-voce are given by the external during the practical examination. Four externals were interviewed individually to obtain data on this issue. Four basic questions were addressed during the external interviews:
A. What do you assess in a viva?
B. How do you assign a number to the students?
C. Do you let the students know about the viva's assessment criteria?
D. How do you let the students know about the viva's assessment criteria?
The responses of the externals of the four science subjects (Physics, Chemistry, Zoology and Botany) are given below:

Table 34: What do you assess in a viva?

External | A
Physics | Fact-based questions, conceptual understanding
Chemistry | Sees whether students can relate practical work to real life
Zoology | Fact-based questions, conceptual understanding; sees whether students can explain the topic simply
Botany | Fact-based questions / checks knowledge, attitude, appearance, smartness
Table 35: B. How do you assign a number to the students?

External | Response | Remarks
1 | "Answer to every question = 100% marks; if one can answer 2/3 questions, I give 70/80% marks; if one couldn't answer properly to any question, I give those students 40% marks." | Fixed framework
2 | "It depends on the level of students. If I see that the subjective number of a student is good, I understand his/her level. So I mark accordingly." | No fixed framework
3 | "If I cannot stop a student with a question, I give him/her 5/5. But if I have to push a student, then I give 3 or 3.5; if one answers freely, I usually give 4 out of 5." | Fixed framework
4 | "I ask questions from three levels: I ask 1/2 easy questions at first, and then I move on to ask a bit more difficult questions." | Fixed, but the marking scheme is not clear
Table 36: C. Do you let the students know about the viva's assessment criteria? D. How do you let the students know about the viva's assessment criteria?

External | C and D
1 | "I think they know about it. But they do not study well. They want to get the highest marks but they don't study well."
2 | "There's nothing new about viva. They know about it from intermediate level."
3 | "We need to think about the viva's assessment criteria."
4 | "I think they know about it."
Remark: The externals of the practical examination perceive that students already know the criteria for the viva, because students have been facing vivas since the higher secondary level.
b. Students' FGD responses on viva
Students of the science stream from four different batches participated in five FGDs. They were asked the same questions that I asked the externals of the practical examination. Their responses are presented in Table 37:
Table 37: Summary of students' FGD responses on viva

What is assessed in a viva?
Team | Physics | Chemistry | Zoology | Botany
1 | Related to the practical work (the experiment the student gets during the practical exam) | Questions from the whole syllabus, fact based | Content based | Varies from person to person
2 | Procedure of the practical, theoretical questions | Content based | Concept based | Totally content based
3 | Related to the practical work (the experiment the student gets during the practical exam); asks the procedure | Not sure | Content based | Totally content based
4 | Related to the practical work (the experiment the student gets during the practical exam) | Not sure | Content based | Totally content based
5 | Not sure | Not sure | Content based | Totally content based

How does the teacher assign a number to the students?
Team | Physics | Chemistry | Zoology | Botany
1-5 (all teams) | Do not know | Do not know | Do not know | Do not know
Findings from the externals' interview responses:
From the externals' interviews, I found the following:
- The criteria of the viva-voce vary from teacher to teacher;
- Teachers have a fixed framework of their own while marking in the viva;
- Externals have a perception that students already know about the criteria of the viva-voce.
Findings from students' FGD responses:
Students agreed on the following points during the FGDs:
- The Physics external asks questions about the particular experiment students get during the paper-pencil test;
- Students are unsure about the Chemistry external's pattern of asking questions;
- The Zoology and Botany externals ask content-based questions;
- Students are not sure about the ways in which teachers assign a number to the students during the viva-voce.
4.5 Impact of existing nature of science practical work on students
The impact of the existing nature of science practical work was discussed during the teachers' follow-up FGD. Their responses are shown in the table below:

Table 38: Teachers' follow-up FGD responses on the impact of the existing nature of science practical work
Participants: 7 SMTE teachers

Uniform decisions:
- Students cannot relate practical with theory;
- Lack of investigation skills among students.
4.6 Impact of the existing nature of assessment system on students
a. Teachers' interview responses on the impact of the existing assessment system on students
Teachers were asked the following questions about the impact of the existing assessment system on students:
i. Do you think the students should know about the assessment criteria? Please give justification for your answer.
ii. What is the impact on students of not knowing the assessment criteria?
Table 39: Do you think the students should know about the assessment criteria? Please give justification for your answer.

Teacher | i | Justification
A | YES | "I think students should know how they are being assessed. It helps them to learn better."
B | Not sure | —
C | YES | "It is absolutely necessary for the students to know the criteria for their better learning. Also, they need to know what they should give importance to."
D | Not sure | "I haven't thought about this issue like this before."
E | YES | "I think students should know about the criteria. Otherwise they won't be able to learn properly."
F | YES | "If the students know about the assessment criteria, they will be more attentive."
G | YES | "It will help them to learn better."
Table 40: What is the impact on students of not knowing the assessment criteria?

Teacher | ii
A | "Without knowing the assessment criteria, they won't be able to learn."
C | "If they don't know the criteria, students won't be able to relate theory with practical."
E | "They won't be able to align the practical with theory."
F | "If students don't know about the assessment criteria, they won't give importance to the small parts."
G | "They won't understand what to learn."
b. Students' FGD responses on the impact of the existing assessment system on students

Table 41: Do you think students should know about the assessment criteria? Why do you think so?

Team | Uniform decision
1 | YES; to understand what we need to achieve
2 | Yes and no; 5 out of 6 students agreed that it is important, while the 6th student said it won't make a difference
3 | YES; we will be able to understand what we need to achieve
4 | YES; to help us achieve the learning outcomes
5 | YES; we need to understand what we need to achieve
c. Teachers' follow-up FGD on the impact of the existing assessment system on students

Table 42: Teachers' follow-up FGD responses on whether students should know the assessment criteria
Participants: 7 SMTE teachers

Issue addressed | Uniform decision
Do you think students should know about the assessment criteria? | The teachers agreed that students must know about the assessment criteria
Why do you think so? | To ensure learning; it will bring accountability to the assessment system; and students will be able to relate theory with practical
d. Findings about the impact of the existing nature of the assessment system on students in IER science practical class

Findings from teachers' interviews
Teachers mentioned the following impacts which might result from the existing nature of the assessment system:
- Without knowing the assessment criteria explicitly, students would not be able to learn properly;
- Students would not be able to relate practical with theory without knowing the assessment criteria.
Findings from students' FGD
Students agreed on the following impact which might result from the existing nature of the assessment system:
- Without knowing the assessment criteria, they would not be able to achieve what they need to achieve.

Findings from teachers' follow-up FGD
Teachers in the follow-up FGD agreed that students must know the assessment criteria in order to:
- Bring accountability to the assessment system;
- Relate practical with theory.
4.7 Ways to improve the current nature of practical work and assessment system at IER
a. Teachers' interview responses on the ways to improve the existing nature of practical work
The teachers were asked about the ways to improve the current approach of the practical class at IER. Their responses are given in the table below:

Table 43: Teachers' interview responses on the ways to improve the existing nature of practical work

Teacher | Response | Remarks
A | "We need to change the recipe mode of practical classes. We need to change our mind set up as well." | Change of current mode of practical class: recipe to open investigation
B | "The current approach is appropriate for tertiary level." | In support of recipe/verification mode
C | "We need to change the mode of our practical class. Recipe mode is not at all acceptable at this tertiary level." | Change of current mode of practical class: recipe to open investigation
D | "We need to include investigation approach at this level." | Change of current mode of practical class: recipe to open investigation
E | "I am not sure about designing the experiment because most of the time students have no prior knowledge about the topic in practical." | In support of recipe/verification mode
F | "I don't follow teacher-centric method in my classroom. I try to involve my students in the practical class." | Supports investigation mode in practical classes
G | "We are used to this system." | In support of recipe/verification mode
a.2 Teachers' interview responses about the ways to improve the existing nature of assessment system
The teachers were asked about the ways to improve the existing assessment system at IER. Their responses are given in the table below:

Table 44: Teachers' interview responses about the ways to improve the existing nature of assessment system

Teacher | Response | Remarks
A | "We need to come up with a guideline that will describe explicitly the division of the 25 marks, like what we need to assess in performance and how the 5 marks for performance will be distributed." | An explicit guideline to indicate assessment criteria
B | "Assessment system is okay. I don't think we have anything that we need to change." | —
C | "We need to make the assessment criteria clear. And we need to include formative assessment as well." | Explicit assessment criteria; inclusion of formative assessment
D | "I think assessment system is okay, more or less." | —
E | "Assessment system is okay." | —
F | "I follow the assessment system I have faced in my department." | —
G | "Assessment system is fine, I think." | —
b. Students' FGD responses on the ways to improve the current approach of practical class and assessment system

Table 45: Students' FGD responses on the ways to improve the current approach of practical class and assessment system

Team | Issue addressed | Uniform decisions
1, 2, 3, 4, 5 | What are the possible ways to improve the current approach of practical class? | The assessment criteria should be communicated; students should get a chance to design the experiments; the topic can be selected by consulting the students; there should be a fixed date for the practical class
c. Teachers' follow-up FGD on the ways to improve the current approach of practical class and assessment system

Table 46: Teachers' follow-up FGD on the ways to improve the current approach of practical class and assessment system
Participants: 7 SMTE teachers

Issue addressed | Uniform decisions
How to improve the current approach of practical class and assessment system? | The teachers agreed that the approach of practical work should be changed from verification to open investigation; the students must be informed about the criteria of assessment; a particular time should be allocated for the practical class; and a good demonstrator is needed.
d. Findings about ways to improve the existing situation of science practical class and assessment system

Findings from teachers' interviews
- 5 out of 7 teachers agreed that the existing recipe approach should be changed and open investigation should be introduced in the IER science practical class;
- 2 out of 7 teachers said they would prefer the recipe approach;
- 3 out of 7 teachers said the criteria of assessment should be communicated to students.

Findings from students' FGD
Students agreed on the points mentioned below:
- Students want a chance to design the experiments;
- They need to know the assessment criteria for a fair assessment;
- The topic can be selected by the teacher in consultation with the students;
- There should be a fixed date for practical classes.

Findings from teachers' follow-up FGD
The teachers agreed on the following points:
- Recipe mode is not acceptable at this level of education; it has to be changed;
- Criteria of assessment should be fixed and need to be imparted to students;
- A particular time should be allocated for the practical class;
- Teachers need a good demonstrator.
Chapter Five: Discussion

5.1 Existing nature of science practical work and its impact on students

The study shows that, in most science practical classes at IER, the problem, procedure, apparatus and result of the experiment are given to the students by the teacher. According to Hackling (2004), in recipe style laboratory exercises, students follow a procedure prescribed by the teacher. Hackling (2004) also stated that students follow the teacher's instructions to set up the prescribed equipment and make measurements and observations. It can therefore be said that the existing nature of science practical work at IER follows recipe style. A recipe style practical class is criticized for the various impacts that it has on students. The following sections of this chapter address the impacts that result from following a recipe style approach in science practical work.
5.1.1 Lack of clarity regarding learning outcomes

Participant students reported during the data collection phase that they are not clear about the learning outcomes of practical classes. This is similar to the findings of Chang and Lederman (1994) and others (e.g. Wilkinson & Ward, 1997), who found that students often do not have clear ideas about the learning outcomes of their work in science laboratory activities. Reid and Shah (2006) pinpointed this problem as well, noting that there is too much emphasis on the experiments to be performed and not enough emphasis on what the students should be gaining. Several consequences of this lack of clarity regarding learning outcomes were evident in the data collected from students. Firstly, students cannot relate practical work to theory because of the lack of clarity regarding learning outcomes. Secondly, mismatches occur between students' understanding of the concepts and their peers' understanding, a consequence supported by Hofstein and Lunetta (2004). Thirdly, inconsistencies often occur between teachers' perceived goals for practical work and students' perceptions of such activities (Hodson, 1993, 2001; Wilkinson & Ward, 1997). Kirschner (1993) reported a similar finding, stating that the objectives set by experts were not those that the students expected. It is therefore possible that, because of the lack of clarity regarding learning outcomes among students of the science stream at IER, these consequences can happen to them.
5.1.2 Incapability of relating practical work with theory

Incapability of relating practical work with theory is one of the major impacts of the lack of clarity regarding learning outcomes in practical work, and it is considered one of the serious impacts of recipe style practical work at IER as well. From the study I have found that students cannot relate practical work with theory. This is similar to the finding of Moreira (1980), where students were unable to identify the conceptual underpinnings of a laboratory activity. This incapability of relating practical work with theory means that no meaningful learning has taken place for the students, a consequence similar to the finding of Hofstein and Lunetta (1982). According to Novak (1988), meaningful learning takes place when students are able to integrate new experiences with prior knowledge, establish a context for the purpose of the laboratory activity, and determine the activity's relevance to them. Since students of IER cannot relate practical with theory, it is possible that no meaningful learning takes place for them in the practical classes.
5.1.3 Lack of attention to the development of investigation skills

The participant teachers reported that students of the science stream at IER do not have the investigation skills necessary for tertiary level. This is in line with Hackling's (2004) result, where Hackling mentioned that the impact of a recipe style approach is that there is no opportunity for students to develop skills associated with formulating a research question, identifying and manipulating variables or planning how to control variables. Garnett, Garnett and Hackling (1995) criticized recipe style practical work because it cannot help students to develop investigation skills. Johnstone (1994) voiced the same concern, asserting that traditional university laboratories often do not give opportunities for the development of skills such as formulating hypotheses and designing experiments. Lloyd (1992) indicated that this kind of approach provides few opportunities to identify problems, formulate hypotheses and design experiments or procedures; insufficient discussion of limitations and underlying assumptions; and inadequate provision for discussion, analysis and consolidation. Therefore, it might be that students of the science stream at IER have no scope to develop these investigation skills, which are necessary for tertiary level. Since students fail to achieve the skill to investigate, they eventually cannot achieve one of the important aims of practical work, which is to develop investigation skills.
5.1.4 Failure to achieve affective outcomes

It has been revealed in the present study that students do not feel interested in their practical work. The present nature of practical work fails to motivate students, because motivation is directly related to students' curiosity and interest (Hodson, 1993). Mitu (2011) voiced the same concern, stating that the present cook-book, recipe style practical work seems boring to students. Since the existing nature of science practical work at IER follows recipe style, students spend more time determining whether they obtained the correct results than they spend thinking about planning and organizing the experiment (Stewart, 1988). This means that there is very little emphasis on their thinking, which is similar to the findings of Hofstein and Lunetta (1982), who stated that recipe style practical classes have been criticized for placing very little emphasis on students' thinking. It can therefore be said that the existing nature of science practical work cannot help students to achieve the affective outcomes of practical work.
5.2 Ways to improve the existing nature of science practical work at IER

This section deals with the suggestions made by participant teachers and students for improving the existing nature of science practical work at IER.
Most of the participant teachers agreed that the recipe approach should not be followed in the science practical class at all. They also argued that it should be replaced with an open investigation approach. Baird (1990) voiced the same view, observing that the laboratory learning environment warrants a radical shift from teacher-directed learning to "purposeful enquiry" that is more student-directed. IER students complained about not being able to connect theory with practical work. This situation may improve with open investigation: according to Hofstein and Lunetta (2004), investigations offer opportunities to connect science concepts and theories discussed in the classroom and in textbooks with observations of phenomena and systems. Science stream students of IER do not have the investigation skills necessary for tertiary level, and this lack of skills is an impact of the recipe style approach in science practical work. Integrating open investigation in IER science practical work may improve the situation, as Goodrum and Hackling (1995) stated that students can develop science investigation and problem-solving skills by working on open investigation tasks. Studies have shown that inquiry laboratories are effective in promoting students' ability to formulate and implement their own investigations (Deckert, Nestor, & DiLullo, 1998; Sundberg & Moncada, 1984). These investigation skills are important for ensuring a student's scientific literacy (Organization for Economic Co-operation and Development [OECD], 2006). One of the impacts of the recipe style approach is that it fails to motivate students (Hodson, 1993). This impact can be overcome by introducing open investigation in science practical work, since it increases students' motivation and interest (Hodson, 1990; Skinner, 1993; Woolnough & Allsop, 1985). Moreover, increased motivation has great implications for increasing the quality of work and learning achieved by the student (Woolnough & Allsop, 1985). It is therefore possible that introducing open investigation at IER may increase the quality of students' work and ensure their learning.

However, another thought emerges, as Teacher B urged continuing with the recipe style approach in science practical. According to Teacher B, it is not possible to shift from recipe to open investigation in science practical classes at tertiary level; the recipe style approach is essential for students to develop manipulative skills, which are important to gain at this level. Johnstone and Al-Shuaili (2001) voiced a similar thought, stating that totally inquiry-based laboratories are probably impractical in the present situation in universities. It is possible that introducing inquiry laboratories in the science practical class might create problems in the IER context. This creates a dilemma, as both sets of thoughts of the participant teachers are valid and considerable. Johnstone and Al-Shuaili (2001) have provided an answer to this dilemma. They suggested that a short inquiry, with no fixed instructions, should be attached to the end of an expository laboratory, using the skills and knowledge gained in the laboratory. According to them, in this way it is possible to exercise the skills and knowledge gained in the laboratory, and so it will reinforce the learning. There is also an opportunity for planning and interpreting, with the bonus of ownership and enthusiasm.
Scaffolding science practical work

The study presents a suggestion for integrating open investigation in the science practical work of IER. This has been suggested by participant teachers to ensure the four basic aims of laboratory work, which are to develop students' conceptual learning, techniques and manipulative skills, investigation skills and affective outcomes. This suggestion is in line with the current world trend, as open investigation has been introduced in the science curricula of Western Australia, Lebanon, Israel, Venezuela and Taiwan (Abd-El-Khalick et al., 2004).
It should also be noted that Bangladesh has introduced open investigation in the secondary science curriculum as well. Siddique and Rahman (2007) have supported this introduction of open investigation in science practical work in secondary education in Bangladesh. It is therefore reasonable to consider introducing open investigation at the tertiary level as well. The main reason is that students who are used to doing their practical activities through open investigation at secondary level may become demotivated if they face recipe style practical work.
However, introducing open investigation in science practical work at IER may not, by itself, improve the situation. Scaffolding students' activities in the next step is just as important. This is necessary in the initial stage so that students become acquainted with the process of open investigation. Hackling's scaffolded Investigation Planning and Report Sheet (IPRS) was implemented with a group of students in a research study done by Garnett (1998) to check its effect on students. The findings revealed that the scaffolded IPRS provides a focus for students' learning and for organizing instruction. The study also revealed that the scaffolded IPRS is useful in assisting students' learning and in ensuring that they have met the requirements of the task. Students also perceived that they improved social and workplace skills, including working cooperatively, attending to detail, managing time and being organized.
I would therefore suggest incorporating Hackling's Investigation Planning and Report Sheet in the IER context. This scaffolding leads students through the investigation process and elicits information from them about their thinking and what they are doing at each stage of the investigation (Hackling, 2004). Following Vygotsky (1978, cited in Hackling, 2004), students who have been used to traditional laboratory work dominated by recipe style routine exercises, and who have become passive followers of the teacher's instructions, will need a framework or scaffold. Hackling (1996) stated that the written record of students' work becomes a valuable source of data for the teacher to use for formative and summative assessment purposes. A scaffolded IPRS is illustrated below to guide students through an investigation:
Question or prompt | Instructional purpose of question or prompt
"What question or hypothesis are you going to investigate?" | Students focus on the problem and formulate a question or hypothesis for investigation.
"What do you think will happen? Explain why." | Students make a prediction and justify their prediction; this activates pre-instructional knowledge for the investigation.
"Which variables are you going to change, measure and keep the same?" | Students identify and operationalize the key variables.
"How will you make it a fair test?" | Students reflect on their plan and ensure that variables are controlled.
"What equipment will you need?" | Students think about the apparatus they will require to conduct the investigation.
"Describe your experimental set-up using a labeled diagram and explain how you will collect your data." | The plan is set out in detail using a diagram and the steps in the procedure.
"Did you carry out any preliminary trials of your procedure to see whether your planned method of data collection would work? Were there any problems?" | These questions provide an opportunity for students to demonstrate that they conducted preliminary trials and have refined their procedure based on what they learned from the trials.
"What happened? Describe your observations and record your results." | This prompts students to record their measurements and/or observations.
"Can your results be presented as a graph?" | This prompts students to decide whether it would be worth graphing their data.
"What do your results tell you? Are there any relationships, patterns or trends in your results?" | These questions prompt students to search for patterns in the data.
"Can you explain the relationships, patterns or trends in your results? Use some science ideas to help explain what happened." | These questions prompt students to explain the patterns in their data using science concepts.
"What did you find out about the question you investigated? Write your conclusion. Was the outcome different from your prediction? Explain." | These questions prompt students to summarize their findings as a conclusion and to compare their finding with their prediction. Discrepancies often occur between predictions and findings due to students' pre-instructional knowledge; such discrepancies may cause students to reflect on their beliefs.
"What difficulties did you experience in doing this investigation?" | This question prompts students to reflect on the processes used in the investigation and identify difficulties experienced.
"How could you improve this investigation, for example its fairness and accuracy?" | This question helps students focus on what they have learned about improving their investigation processes.
This scaffold may provide support to students, and the written record of the investigation provides the information needed by teachers to assess students' investigation work.
5.3 Nature and impact of summative assessment of science practical work at IER

5.3.1 Paper and pencil test

It has been seen from the study that teachers at IER mostly use paper and pencil tests to assess students in science practical. Yung (2001, cited in Johnstone & Al-Shuaili, 2001) stated that teachers continue to assess their students using paper and pencil tests, and that using paper-pencil tests means neglecting many of the most important components of students' performance in the science laboratory. Hofstein (1982) clarified this point, asserting that this method is limited to the more theoretical component of the lab work and therefore does not provide evidence for the more performance-type activities. It can, therefore, be said that students' performance-type activities are neglected in the IER science practical.
Furthermore, the criteria for assessing students in the paper-pencil test are not explicit. When assessment criteria for an assessment task are not explicitly stated, students find it difficult to identify their level of learning and to select a learning approach for achieving their learning goals (Suskie, 2002). It is therefore possible that students get confused about the intention of their teachers and cannot perform well during the test. The paper-pencil or written test also has limited implications for students' learning. Existing research shows that written tests promote low levels of comprehension (Dochy, Segers, Gijbels & Struyven, 2007) and reproduction of information under pressure and surface approaches to learning (Brown, 1997). Most of the marks (60%) are allocated to the paper-pencil test, in which students have little opportunity to explore their real-life experiences; such experiences are important aspects of adult learning and cannot be ensured in this assessment system (Rahman & Siddique, 2007). Since the IER science practical emphasizes the paper-pencil test, students' learning is hindered.
5.3.2 Practical lab reports

In my study, I have found that teachers of IER place a lot of importance on laboratory reports to assess students at the end of the semester. The framework of the practical lab report is a typical one: it includes the name of the experiment, a summary of the theory, apparatus, calculation, result, discussion and safety cautions, and the report is prepared after each lab session and submitted at the end of the semester for summative assessment. This is similar to the study done by Hofstein and Lunetta (1982), in which they stated that traditionally science teachers have assessed their students' performance in the laboratory on the basis of written reports produced during or after the laboratory exercise. Hunt, Koenders and Gynnild (2012) also found that, traditionally, students of the Biological Sciences at Edith Cowan University, Western Australia, were assessed on their laboratory work by reports, write-ups, assignments and examinations which were mostly prepared outside of laboratory sessions. Hofstein and Lunetta (1982) asserted that this method of assessment provides only limited information regarding students' behavior and performance during the practical exercise. It is evident from the study that the practical lab reports submitted by students of IER at the end of the semester provide very little information about students' behavior and performance.
Moreover, the assessment criteria for the practical lab report vary from teacher to teacher and are not explicit enough. The situation is similar to that observed in the paper-pencil test: according to Suskie (2002), when assessment criteria for an assessment task are not explicitly stated, students find it difficult to identify their level of learning and to select a learning approach for achieving their learning goals. Furthermore, each student has to submit 4 different lab reports for 4 different science subjects at the end of the semester, each carrying 5 marks. This pattern of assessment creates a work overload for students in a semester, although it is argued that a well-designed assessment allots a reasonable workload to students (James & McInnis, 2001; University of Technology Sydney, 2006). This excessive workload pushes students to take an achieving approach together with a surface approach, because they do not get much time to examine the logic of the arguments in their learning (Rahman & Siddique, 2007).
5.3.3 Viva voce

Viva voce, or oral assessment, constitutes a part of the practical examination. It has been found from the study that the viva is used as an important tool to assess students' understanding of science concepts. However, the questions asked in the viva at IER are often closed and content-based. According to Hackling (2005), it is important to use open questions that require students to explain using their own words. This ensures that the student is required to apply the learning from the old context to explain in a new context, using their own words, rather than simply recalling the words used by the teacher without any real understanding. It is therefore possible that, since the questions are of the closed type, students' understanding of science concepts is not assessed properly in the viva of a science practical at IER.
5.4 Nature and impact of formative assessment in the science practical class of IER

It has been found from the study that formative assessment is very rare in the science practical classes of IER. Therefore, at IER, students are unable to know their progress because of the lack of a feedback process, although getting feedback or explicit information about their progress is considered a right of students (Clarke, 1992). I have reached this conclusion based on Black and Wiliam's (1998) framework of formative assessment, in which they stated that for assessment to qualify as formative there have to be three basic criteria: clearly conveyed criteria, an effective feedback culture and scope for self-assessment by students. The following sections of this chapter discuss these criteria in detail and present the impacts related to the absence of each criterion.
A. Criteria for formative assessment are not explicit

It has been found from the study that the criteria for formative assessment are not explicit. When assessment criteria for an assessment task are not explicitly stated, students find it difficult to identify their level of learning and to select a learning approach for achieving their learning goals (Suskie, 2002). My experience while conducting the students' FGDs was that, in these cases, students found it very difficult to identify their level of learning and to select a learning approach for achieving their learning goals. This leads students to take a surface approach. An effective feedback culture could address this problem of identifying their level of learning and selecting their learning approach.
B. No effective feedback culture

There is no effective feedback culture in the science practical class at IER. Teachers do not inform students about the criteria, so students cannot identify their level of learning or compare their actual level of learning with the expected level. Because of the lack of an effective feedback culture in the science practical classroom at IER, students are unable to know their progress, although getting feedback or explicit information about their progress is considered a right of students (Clarke, 1992). Students do not understand the weak and strong sides of their learning strategies due to the lack of feedback. Rahman and Siddique (2007) stated that students with learning difficulties have trouble finding the right track, and this indicates a lack of equal access, because students are deprived of their right to get feedback. Furthermore, it is also necessary for students to use the feedback properly. Both Sadler (1989) and Black and Wiliam (1998) assert that self-assessment is essential to using feedback appropriately. It is therefore clear that self-assessment in the classroom is absolutely necessary if students are to use feedback appropriately.
C. No scope for self-assessment

This study revealed, however, that there is no scope for self-assessment in the science practical class at IER. McMillan and Hearn (2008) asserted that self-assessment is the key to stronger student motivation and higher achievement. This implies that students in the IER science practical class are less motivated and that their achievement in science practical work is lower. It can therefore be said that students' learning is hampered by the absence of self-assessment in science practical work.
5.5 Ways to improve the existing situation of the science practical assessment system at IER

While discussing the ways to improve the existing situation of the science practical assessment system at IER, teachers agreed on two distinctive points: a) formative assessment has to be included, and b) rubrics need to be introduced in science practical classes. These two points are elaborated in the following sub-sections.
5.5.1 Formative assessment to be included

The nature of summative assessment in the science practical assessment system of IER has been explored in this study. The exploration reveals that the paper-pencil or written test gets too much emphasis in the assessment system of science practical work. However, Grobman (1970) stated that the paper-pencil test is not at all appropriate for assessing laboratory experiments or fundamental laboratory tasks. Bryce and Robertson (1985) sought a solution to this problem, stating that potentially greater gains may be achieved if the assessment is continuous; by continuous assessment, they refer to formative or diagnostic assessment. Hunt (2011) emphasized formative assessment in practical classes by stating that if the aim is to teach practical laboratory skills, this may be achieved by assessing those skills in the laboratory rather than assessing lab reports or answers to examination questions. This shift towards formative assessment was preferred by Tamir (1972) as well. This literature is in line with the suggestion that participant teachers made during the follow-up FGD: they suggested including formative assessment in the science practical assessment system at IER.
The nature of formative assessment in the science practical class of IER has been explored in this study. The findings revealed that formative assessment does not exist in the IER science practical assessment system. This decision has been made based on three points: the criteria for formative assessment are not explicit, there is no effective feedback culture and there is no scope for self-assessment. There is, therefore, a need to introduce an assessment strategy that may address all these issues. Criteria need to be clearly conveyed in formative assessment. Rahman and Siddique (2007) voiced the same, stating that it is necessary to make the assessment guidelines more flexible and explicit for students. Allen and Tanner (2006) stated that rubrics can be used to make standards of accomplishment clear and explicit. This feature of rubrics addresses the first requirement for a formative assessment system in the IER science practical class. Clearly conveyed criteria may help teachers to provide effective feedback to students. Rahman and Siddique (2007) stated in their study that formative feedback is essential in the assessment process for promoting students' learning. Rubrics can be an effective way to provide formative feedback in the science practical classroom; Stevens and Levi (2004) asserted that rubrics help teachers to provide timely feedback to students. This resolves the second issue of the formative assessment system in the IER science practical class. However, providing feedback is not enough. Students need to use this feedback appropriately through self-assessment, as both Sadler (1989) and Black and Wiliam (1998) asserted that self-assessment is essential to using feedback appropriately. Self-assessment is an important component of formative assessment, and rubrics can play a very significant role here as a valuable tool for self-assessment. Because rubrics not only list the success criteria but also provide descriptions of levels of performance, students are able to use them to monitor and evaluate their progress during an assessment task or activity. Hackling (1998) also stated that student outcome statements can be written in the form of a checklist for students to use in self-evaluation of their investigation work. Such a checklist is a starting point for constructing a rubric (Allen & Tanner, 2006).
In a research study done by Andrade and Du (2005), participants agreed in a focus group discussion that they use rubrics to plan an approach to an assignment, check their work, and guide or reflect on feedback from others. Rahman and Siddique (2007) assert that an explicit guiding principle is essential for an assessment process; rubrics can be that explicit guiding principle in these circumstances. Participant teachers of this study also suggested introducing a rubric as an assessment tool in the science practical assessment system.
Rubric for assessing practical work

Each criterion is rated on a 5-point scale: Excellent = 5, Good = 4, Average = 3, Weak = 2, Poor = 1.

Phase | Criteria
Planning | Formulating question; Prediction; Justification of prediction; Designing the test; Considering limitation
Conducting | Using apparatus; Observing; Comparison with prediction
Processing | Data processing; Evaluating findings; Relating with real life
Evaluating | Students' self-assessment; Personal reflection
The rubric described above is based on the model of the science investigation process given by Hackling and Fairbrother (1996) and on Bryce and Robertson's (1985) 5-point scale. However, this rubric needs further revision and hence is a matter for future research.
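To give a sense of how such a rubric could feed into the existing 25-mark practical allocation, a worked example follows. The scaling rule here is my own hypothetical illustration; it is not part of Hackling and Fairbrother's (1996) model or of the current IER marks distribution, and any actual conversion rule would need to be agreed by the SMTE teachers and made explicit to students in advance. The rubric above contains 13 criteria, each scored from 1 to 5, so a student's raw total lies between 13 and 65. A simple proportional conversion would be:

practical marks = (raw rubric total / 65) × 25

Under this rule, a student rated "Good" (4) on every criterion has a raw total of 13 × 4 = 52 and would receive (52 / 65) × 25 = 20 marks out of 25.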
Chapter Six: Implications and Conclusions

The study presents the nature and impact of the existing practical work and assessment system at IER, and also presents possible ways to improve the existing situation. The findings have several implications for curriculum reform, teachers' practices and future research. Detailed descriptions of the implications of the study are given below:
6.1 Implications for curriculum reform

This study can inform curriculum reform at IER. Curriculum developers can use the study for two main purposes: it will guide them in understanding the existing situation of practical work and the assessment system of the IER science practical, and it will inform them about the current trend of practical work and practical assessment systems around the world and in Bangladesh. This may lead the curriculum developers of IER to rethink the ways practical work and its assessment are conducted. They can implement the proposed recommendations of this study to improve the situation of the science practical.
6.2 Implications for teachers' practice

The study has implications for teachers' practice. This study reveals that the nature of science practical work at IER is recipe style, which results in several impacts on students: for example, students remain unclear about the learning outcomes and fail to relate practical work with theory. The study also reveals that the assessment criteria of IER science practical work are not clear. These findings will inform teachers of SMTE at IER about the impacts of the current nature of practical work and assessment system and may urge them to reconsider the existing nature of practical work and assessment at IER.
6.3 Implications for future research

The study has implications for future research as well.
6.3.1 Action research

The study suggests that an inquiry/open investigation approach be introduced in science practical work and that rubrics be introduced as an effective tool for providing clearly stated assessment criteria to students. However, action research needs to be conducted in the IER science practical classes to examine the effects of such a new practical teaching-learning approach and assessment system.
6.3.2 Setting assessment criteria for assignments, in-course tests and presentations

In my study, I have kept my focus on practical work while exploring the existing assessment system. However, the explicitness of assessment criteria in assignments, in-course tests and presentations needs to be explored to ensure better teaching and learning.
6.4 Personal reflection

This study has been a learning journey for me. Though I commenced this study to explore the science practical assessment system at IER, I ended up exploring both the nature and the impact of practical work and its assessment system. It is my understanding that teachers are unwilling to conduct science practicals using a science investigation model. Their belief system is a major obstacle, and they are reluctant to change it. Therefore, it was difficult for me to get them to reach a common understanding. I believe that I have barely scratched the surface of the issues I have been exploring, and more rigorous studies are needed to gain a complete understanding.
References

Abd-El-Khalick, F., Boujaoude, S., Duschl, R., Lederman, N. G., Mamlok-Naaman, R., Hofstein, A., ... & Tuan, H. L. (2004). Inquiry in science education: International perspectives. Science Education, 88(3), 397-419.

Abouseif, A. A., & Lee, D. M. (1965). The evaluation of certain science practical tests at the secondary school level. British Journal of Educational Psychology, 35(1), 41-49.

Abrahams, I., & Millar, R. (2008). Does practical work really work? A study of the effectiveness of practical work as a teaching and learning method in school science. International Journal of Science Education, 30(14), 1945-1969.

Allen, D., & Tanner, K. (2006). Rubrics: Tools for making learning goals and evaluation criteria explicit for both teachers and learners. CBE-Life Sciences Education, 5(3), 197-203.

Andrade, H. L., & Du, Y. (2005). Student perspectives on rubric-referenced assessment.

Baird, J. R. (1990). Metacognition, purposeful enquiry and conceptual change. In The student laboratory and the science curriculum (pp. 183-200).

Bell, J. (2005). Doing your research project. Berkshire, England: Open University Press.

Ben-Zvi, R., Hofstein, A., Samuel, D., & Kempa, R. F. (1977). Modes of instruction in high school chemistry. Journal of Research in Science Teaching, 14(5), 433-439.

Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7-74.

Bryce, T. G., & Robertson, I. J. (1985). What can they do? A review of practical assessment in science.

Bybee, R., Minstrell, J., & van Zee, E. H. (2000). Inquiring into inquiry learning and teaching in science. Washington, DC: American Association for the Advancement of Science (AAAS).

Creswell, J. W. (2012). Educational research: Planning, conducting and evaluating quantitative and qualitative research. Boston, MA: Pearson Education, Inc.

Deckert, A. A., Nestor, L. P., & DiLullo, D. (1998). An example of a guided-inquiry, collaborative physical chemistry laboratory course. Journal of Chemical Education, 75(7), 860.

Dochy, F., Segers, M., Gijbels, D., & Struyven, K. (2007). Breaking down barriers between teaching, learning and assessment: Assessment engineering.

Ganiel, U., & Hofstein, A. (1982). Objective and continuous assessment of student performance in the physics laboratory. Science Education, 66(4), 581-591.

Garnett, P. J., Garnett, P. J., & Hackling, M. W. (1995). Refocusing the chemistry lab: A case for laboratory-based investigations. Australian Science Teachers Journal, 41, 26-32.

Giddings, G. J., Hofstein, A., & Lunetta, V. N. (1991). Assessment and evaluation in the science laboratory. In Practical science (pp. 167-178).

Gunning, D. J., & Johnstone, A. H. (1976). Practical work in the Scottish O-Grade. Education in Chemistry.

Gunstone, R. F., Loughran, J. J., Berry, A., & Mulhall, P. (1999). Inquiry in science classes: Do we know "how, when and why"? Paper presented at the annual meeting of the American Educational Research Association, Montreal.

Grobman, H. G. (1970). Developmental curriculum projects: Decision points and processes: A study of similarities and differences in methods of producing developmental curricula. F. E. Peacock.

Hackling, M. W. (2004). Inquiry and investigation in science. In The art of teaching science for middle and secondary school (pp. 104-121). Australia: Allen & Unwin.

Hackling, M. W., & Fairbrother, R. W. (1996). Helping students to do open investigations in science. Australian Science Teachers Journal, 42(4), 26-33.

Hegarty-Hazel, E. (1986). Lab work. SET: Research Information for Teachers, number one. Canberra: Australian Council for Educational Research.

Hodson, D. (1990). A critical look at practical work in school science. School Science Review, 71(256), 33-40. Retrieved from http://eric.ed.gov/?id=EJ413966

Hodson, D. (1993). Re-thinking old ways: Towards a more critical approach to practical work in school science. Studies in Science Education, 22, 85-142.

Hofstein, A., & Lunetta, V. N. (1982). The role of the laboratory in science teaching: Neglected aspects of research. Review of Educational Research, 52(2), 201-217.

Hofstein, A., & Lunetta, V. N. (2004). The laboratory in science education: Foundation for the 21st century. Science Education, 88, 28-54.

Hofstein, A., Shore, R., & Kipnis, M. (2004). Providing high school chemistry students with opportunities to develop learning skills in an inquiry-type laboratory: A case study. International Journal of Science Education, 26, 47-62.

Hunt, L., Koenders, A., & Gynnild, V. (2012). Assessing practical laboratory skills in undergraduate molecular biology courses. Assessment & Evaluation in Higher Education, 37(7), 861-874.

Johnson, B., & Christensen, L. (2008). Educational research: Quantitative, qualitative and mixed approaches. California, US: Sage Publications, Inc.

Johnstone, A. H., & Al-Shuaili, A. (2001). Learning in the laboratory: Some thoughts from the literature. University Chemistry Education, 5(2), 42-51.

Kelly, P. J., & Lister, R. E. (1969). Assessing practical ability in Nuffield A-level biology. In Studies in assessment (pp. 121-152).

Lazarowitz, R., & Tamir, P. (1994). Research on using laboratory instruction in science. In Handbook of research on science teaching and learning (pp. 94-130).

Lloyd, B. W. (1992). The 20th century general chemistry laboratory: Its various faces. Journal of Chemical Education, 69(11), 866.

Lunetta, V. N. (1998). The school science laboratory: Historical perspectives and contexts for contemporary teaching. In International handbook of science education (Vol. 1, pp. 249-262).

McMillan, J. H., & Hearn, J. (2008). Student self-assessment: The key to stronger student motivation and higher achievement. Educational Horizons, 87(1), 40-49.

Merritt, M. V., Schneider, M. J., & Darlington, J. A. (1993). Experimental design in the general chemistry laboratory. Journal of Chemical Education, 70(8), 660.

Moreira, M. A. (1980). A non-traditional approach to the evaluation of laboratory instruction in general physics courses. European Journal of Science Education, 2(4), 441-448.

Novak, J. D. (1988). Learning science and the science of learning.

OECD. (2006). Assessing scientific, reading and mathematical literacy: A framework for PISA 2006.

Pekmez, E. S., Johnson, P., & Gott, R. (2005). Teachers' understanding of the nature and purpose of practical work. Research in Science & Technological Education, 23(1), 3-23.

Rahman, S. M., & Siddique, N. A. (2007). Assessment policy and practice at the Institute of Education and Research: A reflective study. Teacher's World, 30-31.

Shah, I., Riffat, Q., & Reid, N. (2007). Students' perceptions of laboratory work in chemistry at school and college in Pakistan. Journal of Science Education, 8(2), 75.

Roth, W. M., & Roychoudhury, A. (1993). The development of science process skills in authentic contexts. Journal of Research in Science Teaching, 30(2), 127-152.

Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119-144.

Staer, H., Goodrum, D., & Hackling, M. (1998). High school laboratory work in Western Australia: Openness to inquiry. Research in Science Education, 28(2), 219-228.

Tamir, P. (1972). The practical mode: A distinct mode of performance in biology. Journal of Biological Education, 6(3), 175-182.

Tamir, P., Doran, R. L., & Chye, Y. O. (1992). Practical skills testing in science. Studies in Educational Evaluation, 18(3), 263-275.

The Curriculum Committee. (2006). Curriculum: Four-year Bachelor of Education (Honors): An integrated program of general and professional education. Dhaka, Bangladesh: Institute of Education and Research (IER), University of Dhaka.

Vygotsky, L. S. (1980). Mind in society: The development of higher psychological processes. Harvard University Press.

Wilkinson, J. W., & Ward, M. (1997). The purpose and perceived effectiveness of laboratory work in secondary schools. Australian Science Teachers Journal, 43(2), 49-55.

Woolnough, B. E., & Allsop, T. (1985). Practical work in science. Cambridge University Press.

Yung, B. H. W. (2001). Three views of fairness in a school-based assessment scheme of practical work in biology. International Journal of Science Education, 23(10), 985-1005.
Appendices
Appendix 1
Semi-structured interview schedule

Phase One: Planning
1. Question or problem of the investigation (what are you going to investigate?)
   - Chosen by the teacher
   - Chosen by the students
2. Prediction (of what will happen)
   - Made by the teacher
   - Made by the students
3. Variables
   a. Identification of the independent and dependent variables
      i. Made by the teacher
      ii. Made by the students
   b. Manipulation of variables
      i. Which variable are you going to change?
      ii. Which variable are you going to measure?
      iii. Which variable are you going to keep the same?
4. Justification of prediction
   - Explained by the teacher
   - Explained by the students
5. Safety measures
   - Safety precautions maintained by the teacher
   - Safety precautions maintained by the students
6. Procedure of the investigation
   - Decided by the teacher
   - Decided by the students
7. Materials and equipment
   - Identified by the teacher
   - Identified by the students
Phase Two: Experimenting
8. Preliminary trials
   - There is a chance to give a preliminary trial
   - Such chances are not available
9. Observation
   - Teacher describes what is happening
   - Students record their observations and measurements
10. Comparison with prediction
   - Teacher compares prediction with observation
   - Students compare prediction with observation
11. Justification of the observable data
   - Teacher explains the phenomena
   - Students construct an explanation for their data

Phase Three: Data Analysis
12. Identifying patterns in data
   - Teacher explains the data
   - Students try to identify patterns or trends in data
13. Presenting data using graphs
   - Teacher decides the best way to present data
   - Students decide themselves the suitable way to present the data

Phase Four: Evaluating
14. Difficulties (reflection on students' work)
   - Teacher points out the areas of difficulty to the students
   - Students reflect on their own work
15. Suggestions for improvement
   - Teacher identifies the weaknesses of the investigation and suggests improvements
   - Students identify their weaknesses in the investigation and outline improvements
Practical Examination
1. Marks distribution
   a. Please explain the marks distribution of the practical examination in detail.
   b. Are the students aware of the marks distribution?
2. Examination process (test items etc.)
   a. Please explain the examination process in detail.
3. Observation during practical exams
   a. What do you observe during the practical examination?
   b. Are the students notified of the assessment criteria before the test?
   c. If YES, how do you let the students know on what basis they are being assessed?
4. Report writing
   a. You assign marks to the reports. How do you assign the marks? / What are the criteria for assigning a particular mark?
   b. Are the students aware of the marking scheme?
   c. If YES, how are they notified about the assessment criteria?
5. Viva
   a. What do you assess in a viva?
   b. How do you assign a mark to a student?
Incorporating Formative and Summative Assessment
1. Daily class performance carries 10 marks in the practical marks distribution.
   a. How do you transform descriptive feedback into a particular mark?
   b. Is there a framework for that transformation?
   c. If YES, please explain the framework in detail.
2. How do you incorporate formative assessment into summative assessment?
Impact
1. Do you think students should know about the assessment criteria? Why or why not?
2. What do you feel about students' not knowing the assessment criteria?
Appendix 2
Follow-up FGD of teachers

Question list:

Approach

Phase One: Planning
1. First, let us talk about the purpose of conducting science practicals at our IER. In your opinion, what is the purpose of conducting science practicals in an education discipline?
2. How are the practical classes at IER going?
   Follow-up question after the teachers' answers: After interviewing the teachers, I found that the mode of practicals at IER is entirely recipe style. Do you think there is any scope for changing this mode, and how?

Phase Two: Experimenting
3. In the actual experimenting part, students have very little to do; their work is limited to following the procedure. Do you think there is any scope for changing this? How? Why?

Phase Three: Data analysis
4. Coming to data analysis: students have little opportunity to interpret data in their own way. Do you think this should be changed? Why and how?

Phase Four: Evaluating
5. In most cases, it is the teacher who points out the students' mistakes. How can an arrangement be made so that students can monitor their own progress?

Assessment in Practical Context: IER

Formative Assessment in Classroom:
6. How important do you think formative assessment is in a practical class?
   a. How important do you think it is to keep separate marks for formative assessment?
   b. Is there any guideline on how the feedback that the teacher gives in class will be converted into marks? If there is, please explain it. If there is not, how do you think such a framework could be brought in?
7. It is seen that different teachers assess on the basis of different criteria. How important do you think it is to find out whether students are aware of these criteria?
8. How transparent are the criteria by which students are observed during the experiments in practical examinations? Do you think more transparency is needed here? How can that transparency be achieved?
9. How do you think students can come to know these assessment criteria?
10. Students often remain confused about the assessment criteria. What kind of impact do you think this lack of knowledge can have on them?
11. How necessary do you think it is to make the marking of students' practical work explicit?
12. Different teachers divide the 25 marks of the practical in different ways. Would it be better to bring this under a specific framework? What could that framework look like? How could students be informed about it?
    - First, reaching a consensus on how the 25 marks will be divided.
    - Then: how will the 5 marks for the viva be divided? What will the criteria be?
    - How will the 5/10 marks for class performance be divided? What will the criteria be?
    - How will the 5 marks for the report/notebook be divided? What will the criteria be?
13. If a rubric is used to assess this, how fruitful do you think it would be?