
Irish Educational Studies, 2016 Vol. 35, No. 3, 249–267, http://dx.doi.org/10.1080/03323315.2016.1192480

Review of the Leaving Certificate biology examination papers (1999–2008) using Bloom’s taxonomy – an investigation of the cognitive demands of the examination

Alison Cullinane^a* and Maeve Liston^b

^a EPI·STEM – National Centre for STEM Education, Department of Education and Professional Studies, University of Limerick, Limerick, Republic of Ireland; ^b Department of Learning, Society, and Religious Education, Mary Immaculate College, Limerick, Republic of Ireland

*Corresponding author. Email: [email protected]
© 2016 Educational Studies Association of Ireland

(Received 30 March 2015; accepted 4 April 2016)

It is widely recognised that high-stakes assessment can significantly influence what is taught in the classroom. Many argue that high-stakes assessment results in a narrowed curriculum where students learn by rote rather than developing higher cognitive skills. This paper describes a study investigating which cognitive objectives from Bloom’s Taxonomy of Educational Objectives are present in the Leaving Certificate biology examination. The study analysed examination papers from the past and current biology syllabuses, and analysis was also carried out to determine the marks awarded to the different cognitive objectives. The findings show that the examination predominantly includes questions that do not promote higher levels of thinking. The majority of the marks on the paper were allocated to the lower objectives of the taxonomy, suggesting that students can rely on rote learning to succeed in the biology examination. The study highlights how high-stakes examinations have a narrow scope in terms of student achievement and shows how current biology examination procedures promote low-level learning. This low level of thinking promotes rote learning and regurgitation of facts, requiring little to no understanding of the topics. To prepare students for the working world, there needs to be a shift from terminal examinations alone to a mixed approach.

Keywords: Bloom’s taxonomy; Leaving Certificate; examination; high-stakes assessment; biology

The influence of high-stakes assessment on teaching and learning

It is widely recognised internationally that high-stakes assessment methods can significantly influence what is taught and how it is taught in the classroom. In turn, this affects how students acquire and learn new information (Baird et al. 2014; Black and Wiliam 1998; Crooks 1988; Gardner et al. 2010; Supovitz 2009). Many argue that high-stakes assessment results in a narrowed or ‘a mile wide and an inch deep’ syllabus, where students learn by rote rather than developing a deep level of understanding (Baird et al. 2014; Supovitz 2009). High-stakes assessment is often associated with teachers focusing on the content of the tests, administering repeated practice tests and often facilitating a transmission style of teaching in which gains in skills are limited (Baird et al. 2014; Gardner et al. 2010).
This restricts deeper levels of thinking, and the overarching learning objectives of a syllabus can often be ignored (Harlen 2005; Supovitz 2009). Harlen and James (1997) argue that rote learning and the ability to reproduce content on demand lead to a passive acceptance of ideas and information. Rote learners often fail to recognise the main principles and tend to focus their learning on assessment needs rather than on understanding and applying information (Harlen and James 1997). High-stakes assessment is often associated with test-based accountability systems, in which it is believed that public education can be improved by requiring all students to take the same test, where the stakes are high and progression or non-progression for students and schools depends on the results (Black and Wiliam 1998; Hamilton, Stecher, and Klein 2002). There is empirical evidence to suggest that the type of test item influences both the kinds of study activities students engage in and their levels of achievement (Baird et al. 2014; Crooks 1988; Thomas et al. 1993). Therefore, high-stakes assessments can impede the development of teaching and learning strategies in a classroom (Harlen 2005; Stecher and Barron 1999; Supovitz 2009).

High-stakes assessment in Ireland

For over a century, Ireland has relied on high-stakes assessment, with external examinations for assessing achievement and for selection to higher education playing a prominent role in shaping the Irish education system. The Intermediate Education Bill, or the ‘payment-by-results’ principle, was introduced in 1878 in both primary schools (until 1900) and second-level schools (until 1924). This paved the way for the reliance on test results from external examinations. Examinations and results became the primary concern of both teacher and student until 1924, when the bill was dropped from legislation (Ó’ Buachalla 1988). However, the indelible effect of this bill is still seen in the Irish education system today in the use of terminal high-stakes examinations at second level. This long-standing reliance on external high-stakes assessment is currently having a major influence on the discourse about assessment in the Irish education system. This discourse has been fraught with controversy since the then Minister for Education, Ruairí Quinn, announced proposed changes to Junior Cycle assessment (Department of Education and Skills 2010b), which aim to move from high-stakes external assessment to school-based assessment. This has led to substantial media coverage of the outraged responses from teacher unions and teachers alike (Donnelly 2015; Erduran and Dagher 2014). This outrage is based on feelings that standards will drop, that discipline problems will increase and that training and knowledge on how to apply such changes are insufficient at present (Donnelly 2015). The outcome of these proposed changes may also affect the Leaving Certificate examinations and procedures in the coming years.

The Leaving Certificate is the terminal examination completed at the end of second level in Ireland. Results are used to determine entry and to allocate pathways in higher education. The Leaving Certificate is well established in Irish society, with teachers, parents and students placing much trust in its efficacy (Baird et al. 2014; Looney 2006). Currently there is a choice of over 40 subjects at two levels: higher and ordinary level. Ordinary level is designed for weaker candidates. English, Irish and Mathematics can also be taken at an additional level, foundation level, which contains basic questioning and is designed for students with very weak abilities or learning difficulties.
The last number of years have been characterised by revisions and updates to syllabuses, including those of the science subjects. A key objective of such revisions was to ensure that the broad range of skills, interests, learning styles and special needs of students were well catered for (Department of Education and Science 2004; NCCA 2011). Unlike at Junior Cycle, assessment at Leaving Certificate level is not set to change in the immediate future. The Leaving Certificate examination takes place at the end of a two-year cycle. Candidates sit the exams in their school, but the scripts are corrected anonymously by external graders. Desmond and Desmond (2008) discuss the correcting and grading process of the state assessment in Ireland.

The need for reforms to Leaving Certificate science education in Ireland

Studies have shown that performance in second-level education is the best predictor of performance in third-level education (Geiser and Santelices 2007). The question then arises whether the assessment procedures at second level are preparing students for third level. Higher education institutions promote and strive for ‘graduates with essential generic foundation skills as adaptive, creative, well-rounded thinkers and citizens’ (Department of Education and Skills 2010b). The Irish government has recognised the need to change how science is taught in the Irish secondary school system, owing to the washback effect of high-stakes tests upon learning and to grade inflation (Government of Ireland 2006, 2008). Hyland (2011) reports that despite an increase in the higher grades achieved by Leaving Certificate students, these tests rely on simple recall, suggesting an over-reliance by Irish students on the regurgitation of facts. In light of these observations, the National Council for Curriculum and Assessment (NCCA) developed new draft curricula for biology, chemistry and physics in 2011, which are not yet in practice. These curricula have signalled a change to the assessment procedure. A written examination (worth 80%) will remain a feature, with a new second component being introduced. This component will comprise two parts: a laboratory notebook reporting on mandatory activities, authenticated by teachers (worth 5%), and a practical examination marked externally (worth 15%) (NCCA 2011). The introduction of a practical component aims to ensure that students can demonstrate basic competencies in laboratory skills prior to entering third level. How the 80% written exam will change has yet to be reported.

Before the new syllabus assessment procedures are implemented, this study aims to provide a picture of the current situation through historical research. The study aims to identify the cognitive demands required by current assessment practices. Ten years of past papers, from both the previous and the current syllabus, along with their corresponding scoring guidelines (marking schemes), were analysed. The results of this analysis provide recommendations for future written examination questions. Both syllabuses are assessed by a 100% summative written examination taking place at the end of a two-year programme. The previous syllabus was introduced in 1969 and first examined in 1971. There were some minor adaptations to the syllabus in 1975, but it remained largely unchanged until 2002 (NCCA 2002; O’Sullivan 1985). This syllabus document consisted of 10 units of study but did not contain set aims or objectives; it was last examined in 2003 (NCCA 2008). The current biology syllabus was introduced in 2002 with the aim of updating the syllabus and incorporating the advances in biological knowledge made in the intervening years (NCCA 2002, 2–4). It was examined for the first time in 2004 and consists of three units of study.

The need for biology education and skills reform

It has been reported that Ireland’s future economic recovery hinges on the development of the biosciences and agriculture-based industries, and that utilising the knowledge, skills and creativity of people is central to developing innovation and ideas (Government of Ireland 2008; Task Force on the Physical Sciences 2002). The biological sciences encompass one of Ireland’s largest and most important industries, and there has been a resurgence of interest in biology and agricultural science (Forfás 2009; Teagasc 2011). The life science sector accounted for almost 30% of total exports in Ireland in 2008, employing in excess of 52,000 people in over 350 enterprises (Forfás 2009). It has been reported that the future of the agri-food industry is favourable, with projections indicating that by 2050 the world will need to increase food production by over 70%, leading to a demand for agri-scientists (Department of Education and Skills 2010a). There is a need to develop a highly skilled workforce in this expanding sector (IBM 2008). The preparation of a biologically literate society with distinct skills and an understanding of biological knowledge is recognised as an essential factor for socio-economic growth and stability (Engineers Ireland 2010; Government of Ireland 2008; Teagasc 2011). The National Research Council (NRC) recommends the development of a new biology syllabus comprising an integrated, problem-focused approach to science, consistent with research on how students learn best and on the skills needed for further education (NRC 2009).

Bloom’s taxonomy of educational objectives – cognitive domain

Bloom’s Taxonomy of educational objectives – cognitive domain (1956) was used as the theoretical framework for this analysis. It is arguably one of the most influential educational monographs of the past half-century, and that it has stood the test of time in the educational setting is testament to its contribution to the education community (Marzano and Kendall 2007). However, as influential as Bloom’s Taxonomy has been on educational practice, it is not devoid of criticism. One of the most common criticisms is that the taxonomy oversimplified the nature of thought and its relationship to learning (Marzano and Kendall 2007). In the case of this research, it was this simplified view that attracted the authors to the original taxonomy rather than to any revised or more modern frameworks. Other frameworks were examined and considered prior to starting the analysis. Most closely related to Bloom’s original taxonomy is the revised taxonomy produced by Anderson et al. (2001), a more comprehensive two-dimensional framework; however, it was deemed overly complex for the needs of this study, which involved classifying the questions. Another framework, developed by Pollitt et al. (1985) to determine how easy or difficult a question was to answer, was also considered, but it requires a sample of student answers. As we did not have access to candidates’ completed exam scripts, this framework was deemed inappropriate for this study. Therefore, Bloom’s Taxonomy was considered the most effective for the needs of the analysis being undertaken. The simplicity of Bloom’s Taxonomy allowed higher order and lower order questions to be distinguished on the examination papers easily and effectively.
The various levels of the taxonomy will be referred to as objectives to prevent confusion with the two examination paper levels: higher level and ordinary level.

Cognitive domain

The cognitive domain is divided into the following objectives: knowledge, comprehension, application, analysis, synthesis and evaluation, in which knowledge is the lowest form of cognitive thinking and evaluation is the highest (see Figure 1). Similar to other taxonomies, Bloom’s is hierarchical, meaning that learning at the higher objectives is dependent on attaining prerequisite familiarity and skills from the lower objectives of the taxonomy (Orlich et al. 2004). A student functioning at the application objective has usually mastered the material from the knowledge and comprehension objectives.
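As a concrete illustration of this ordering (a minimal sketch in Python, not part of the original study), the six objectives can be encoded as an ordered hierarchy, together with the two-way split into the lower cognitive domain (knowledge, comprehension) and the higher cognitive domain (application and above) that is used later in Table 12.

```python
from enum import IntEnum

class BloomObjective(IntEnum):
    """Bloom's (1956) cognitive objectives, ordered from lowest to highest."""
    KNOWLEDGE = 1
    COMPREHENSION = 2
    APPLICATION = 3
    ANALYSIS = 4
    SYNTHESIS = 5
    EVALUATION = 6

def is_higher_order(objective: BloomObjective) -> bool:
    """Application and above form the 'higher cognitive domain' (cf. Table 12)."""
    return objective >= BloomObjective.APPLICATION

# The hierarchy is cumulative: a question at the application objective
# presupposes mastery of the knowledge and comprehension objectives.
assert is_higher_order(BloomObjective.ANALYSIS)
assert not is_higher_order(BloomObjective.COMPREHENSION)
```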

Figure 1. Bloom’s taxonomy of educational objectives: cognitive domain (1956) with associated verbs.


Research questions

Having examined the literature, the following research questions were designed and used to frame the methodology of the study.

(1) What skills and cognitive levels are demanded of students to pass the Leaving Certificate Biology examination?
(2) Is there a difference in the percentage cognitive demands required to pass the examinations of the current and previous Biology syllabi?
(3) Are there any differences in the cognitive levels required of higher level and ordinary-level candidates taking the Leaving Certificate Biology examination?
(4) What percentages of marks are allocated to the different cognitive objectives of Bloom’s Taxonomy in the Biology examination papers?

Research methodology: analysis of the examination papers

This study used historical research, in which past events are used to study or predict situations and which provides an authentic representation of a previous age (Cohen, Manion, and Morrison 2007; Swart 2010). Using this approach, 10 years of examination papers were chosen for analysis (Department of Education and Science 2001). The years 1999–2008 were selected because 2002 marked the introduction of the current syllabus, which was first examined in 2004. Selecting these years allowed five years from each syllabus to be analysed and would therefore indicate current trends. The examination papers for each syllabus were available at two levels: higher level and ordinary level. Both levels were analysed, with 20 examination papers examined in total. This sampling strategy implemented a non-probability method involving homogeneous sampling, a purposive sampling technique in which cases are purposely selected so that the sample’s ‘units’ share similar traits (Creswell 1994). Further quantitative analysis of the examination papers after 2008 would have had no impact on this study, as there were no significant changes to the question style, format of the papers or marking scheme in the years after 2008. This is supported by the 2013 State Examinations Commission (SEC) Chief Examiner’s Report for Biology (SEC 2013).

The Leaving Certificate examination paper layout

The examination papers from the two syllabuses differed in their layout. The previous syllabus (1969–2003) examination papers were divided into two sections: part one and part two. Part one was worth 120 marks and consisted of seven questions, of which six had to be answered; each question was worth 20 marks. The answers to part one were written on the examination paper in the spaces provided. Part two was worth 280 marks and consisted of eight questions. Candidates were required to answer four questions from this section, each worth 70 marks. Part two required longer, essay-style answers, which were written in a separate answer book. Each question had parts a, b and c.

The current examination paper is divided into three sections: section A, section B and section C. Section A contains six questions, of which five have to be answered, each carrying 20 marks (100 marks for the section); answers are written on the examination paper in the spaces provided. Section B contains three questions, of which two have to be answered; this section is worth 60 marks (each question carries 30 marks).
The answers are written in the spaces provided on the examination paper. Section C contains six questions, of which four need to be answered. Each question carries 60 marks (240 marks for the section) and is answered in a separate answer book. The examination papers for the current syllabus and their corresponding scoring guidelines, or marking schemes, are available from the State Examinations Commission website, www.examinations.ie.

Analysis of the Leaving Certificate Biology papers was performed on the following areas:

(1) Classification of question types using Bloom’s Taxonomy of educational objectives – cognitive domain.
(2) Comparison of the percentage of marks allocated to each objective with the percentage of questions in each objective, to investigate whether questions were awarded more marks as they progressed up the hierarchy.
(3) Comparison of the questions asked in the examination papers from the two syllabuses, to investigate whether there was a difference in questions from the various objectives.

(1) Classifying the question types using Bloom’s Taxonomy cognitive domain

The examination papers were analysed systematically. The method implemented was adapted from Bloom’s work. The questions were examined using instruments that contained tables of verbs associated with each objective of the cognitive domain (see Figure 1). These verbs (verbal cues) describe the complexity of behaviour required to answer the question (Bloom et al. 1956; Dalton and Smith 1986). Each question in the examination papers was analysed and counted individually. If a question contained more than one part, each part was classed as a single question. For example, ‘Name an animal in the ecosystem and outline a role it plays in that ecosystem’ would have been classified as two questions across two objectives: knowledge (name) and comprehension (outline), respectively. If difficulty arose in classifying a question into a particular objective, the marking scheme for the paper was used to determine the complexity of the answer required. This process provided more objectivity to the analysis. Implementing these methods, it was determined that the number of questions per paper ranged from 109 in the 2002 ordinary-level paper to 156 in the 2000 higher level paper (see Table 1).

Table 1. Total number of questions in the 1999–2008 Leaving Certificate higher level (HL) and ordinary-level (OL) biology examination papers. Includes all questions.

              Previous curriculum               New curriculum
Year    1999  2000  2001  2002  2003     2004  2005  2006  2007  2008
HL       120   156   145   113   113      122   137   135   126   133
OL       110   135   132   109   111      131   123   120   139   115

Examples of classification

The following sections provide examples of how questions were classified for the objectives knowledge, application, analysis and evaluation.


Knowledge

The questions in Figure 2 were assigned to the knowledge objective, as these asked the candidate to simply ‘name’ or ‘state’ answers that only required simple recall.

Application

The following question (Figure 3) was classed in the application objective, as the candidate not only had to show knowledge and comprehension of the heart but also had to apply this to the dissection of the heart by drawing its components. These actions can be seen in the verbs associated with the application objective.

Analysis

The verb ‘distinguish’ in the question below (Figure 4) illustrates that this is an analysis question: the candidate has to break down knowledge into parts and show the relationships among the parts.

Evaluation

A question asking the candidate to ‘suggest’ (Figure 5) required the students to evaluate and argue.

Once it was determined which objective each question related to, the percentage of questions per objective was obtained. For example, if there were 83 knowledge questions out of a possible 137 questions, the paper comprised 60.58% questions from the knowledge objective.
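By way of illustration only (this is not the authors’ instrument), the sketch below encodes a small verb-cue table of the kind described above, using only the verbs that appear in the worked examples, classifies each question part by its verb cue, and computes the percentage of questions per objective, reproducing the worked example of 83 knowledge questions out of 137.

```python
from collections import Counter

# Abbreviated, hypothetical verb-cue table; the study used the fuller set of
# verbs associated with each objective (see Figure 1).
VERB_CUES = {
    "name": "knowledge", "state": "knowledge",
    "outline": "comprehension",
    "draw": "application",
    "distinguish": "analysis",
    "suggest": "evaluation",
}

def classify(question_part: str) -> str:
    """Assign a single question part to an objective via its verb cue."""
    for verb, objective in VERB_CUES.items():
        if verb in question_part.lower():
            return objective
    return "unclassified"  # resolved in the study by consulting the marking scheme

def objective_percentages(question_parts: list[str]) -> dict[str, float]:
    """Percentage of question parts falling into each objective."""
    counts = Counter(classify(part) for part in question_parts)
    return {obj: 100 * n / len(question_parts) for obj, n in counts.items()}

print(classify("Name an animal in the ecosystem"))            # knowledge
print(classify("Outline a role it plays in that ecosystem"))  # comprehension
print(round(100 * 83 / 137, 2))  # 60.58, as in the worked example above
```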

Figure 2. Example of a knowledge-type question from the 2004 Leaving Certificate biology honours level, Q8a.

Figure 3. Example of an application question from the 2004 honours-level biology paper, Q9bi.

Figure 4. Example of an analysis-type question from the 2004 honours-level paper, Q9bii.

Figure 5. Example of an evaluation-type question from the 2006 ordinary-level paper, Q10ci.


(2) Comparing the percentage of marks allocated to each objective and the percentage of questions in each objective

The data gathered from assigning the questions to the different objectives were used to identify how many marks each objective received. The marking scheme for each paper, set by the State Examinations Commission (SEC), was consulted to identify the depth of treatment needed (i.e. the amount of information and skills required to answer the question). As mentioned previously, where a question contained more than one part, each part was classified as a single question; in this situation the marks were divided evenly across the parts unless otherwise stated in the marking scheme. The examination papers from both curricula were marked out of 400 marks; however, the analysis included all questions, including the optional sections of the paper. When all questions were assigned marks as prescribed by the marking scheme, the total marks per paper ranged from 632 in the 2007 higher level paper to 814 in the 2001 higher level paper (see Table 2).
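A minimal sketch of how this marks analysis could be implemented (an illustration under the stated rule, not the authors’ actual tooling): explicit per-part marks from the marking scheme take precedence, otherwise a question’s marks are divided evenly across its parts, and the share of the total marks per objective is then computed.

```python
from collections import defaultdict

def marks_per_objective(questions):
    """questions: list of (total_marks, parts), where parts is a list of
    (objective, explicit_marks_or_None) tuples for each question part."""
    totals = defaultdict(float)
    for total_marks, parts in questions:
        explicit = sum(m for _, m in parts if m is not None)
        unspecified = [obj for obj, m in parts if m is None]
        for obj, m in parts:
            # Marking-scheme marks take precedence; otherwise split evenly.
            totals[obj] += m if m is not None else (total_marks - explicit) / len(unspecified)
    grand_total = sum(totals.values())
    return {obj: 100 * marks / grand_total for obj, marks in totals.items()}

# Hypothetical two-part, 20-mark question with no explicit per-part marks:
print(marks_per_objective([(20, [("knowledge", None), ("comprehension", None)])]))
# {'knowledge': 50.0, 'comprehension': 50.0}
```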

Inter-rater reliability

In order to ensure the validity of the analysis and results, two other individuals were asked to use the instruments to analyse the same examination paper, allowing similarities between raters to be compared. These individuals were biology teachers and researchers conducting research in the area of Bloom’s Taxonomy; they were familiar with the syllabus, the information required to answer the questions and the application of the taxonomy. The comparison of results is illustrated in Table 3. Despite classification being a subjective process, the low standard deviation (σ) values (close to 0) show strong inter-rater reliability across the three analyses carried out.
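As a sketch of this reliability check (assuming the three raters’ per-objective percentages are available as lists), the sample standard deviation for each question type closely reproduces the σ values reported in Table 3; small differences in the second decimal reflect rounding of the reported percentages.

```python
import statistics

# Percentage of questions assigned to each objective by the three raters,
# as reported in Table 3.
ratings = {
    "Knowledge":     [62.2, 59.0, 59.6],
    "Comprehension": [25.2, 23.0, 24.3],
    "Application":   [3.7, 9.0, 7.8],
    "Analysis":      [7.4, 5.0, 5.9],
    "Synthesis":     [0.7, 3.0, 1.6],
    "Evaluation":    [0.7, 1.0, 0.8],
}

for objective, values in ratings.items():
    sigma = statistics.stdev(values)  # sample standard deviation across raters
    print(f"{objective}: sigma = {sigma:.2f}")
# Knowledge 1.70, Comprehension 1.11, Application 2.78, Analysis 1.21,
# Synthesis 1.16, Evaluation 0.15 -- matching Table 3 to within rounding.
```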

Results

Analysis of the examination papers using Bloom’s taxonomy cognitive domain

Tables 4–7 present the quantitative analysis of the 1999–2008 examination papers. The analysis showed that the papers were dominated by questions from the knowledge and comprehension objectives: between 82% and 95% of the questions were from these two objectives.

Table 2. Total number of marks (as prescribed by the marking scheme) in each higher level and ordinary-level examination paper (1999–2008). Includes all optional questions.

              Previous curriculum               New curriculum
Year    1999  2000  2001  2002  2003     2004  2005  2006  2007  2008
HL       787   779   814   772   800      634   634   644   632   634
OL       770   770   770   775   801      632   633   633   635   633


Table 3. Illustration of inter-rater reliability and the standard deviation (σ) between analysers.

Question type    Author   Researcher 2   Researcher 3      σ
Knowledge        62.2%    59%            59.6%           1.7
Comprehension    25.2%    23%            24.3%           1.1
Application       3.7%     9%             7.8%           2.78
Analysis          7.4%     5%             5.9%           1.21
Synthesis         0.7%     3%             1.6%           1.15
Evaluation        0.7%     1%             0.8%           0.15

Table 4. Frequency of cognitive objectives in the previous curriculum on the higher level Leaving Certificate biology paper.

                 1999      2000      2001      2002      2003      Average
Knowledge       43.33%    67.95%    55.86%    47.79%    45.13%     52.01%
Comprehension   35.83%    23.72%    24.83%    37.17%    46.90%     33.69%
Application      7.50%     3.85%    14.48%     7.96%     7.96%      8.35%
Analysis        10.83%     3.21%     3.45%     6.19%     0.00%      4.74%
Synthesis        1.67%     0.64%     0.69%     0.88%     0.00%      0.78%
Evaluation       0.83%     0.64%     0.69%     0.00%     0.00%      0.43%

Table 5. Frequency of cognitive objectives in the previous curriculum on the ordinary-level Leaving Certificate biology paper.

                 1999      2000      2001      2002      2003      Average
Knowledge       74.55%    77.04%    69.70%    68.81%    54.05%     68.83%
Comprehension   18.18%    17.78%    25.76%    28.44%    44.14%     26.86%
Application      3.64%     2.22%     4.55%     1.83%     0.90%      2.63%
Analysis         1.82%     1.48%     0.00%     0.92%     0.90%      1.02%
Synthesis        0.91%     1.48%     0.00%     0.00%     0.00%      0.48%
Evaluation       0.91%     0.00%     0.00%     0.00%     0.00%      0.18%

The higher objectives of the taxonomy are represented in only a limited capacity in the examination papers. The papers often contained many closed questions requiring one-word or one-sentence answers. These often required specific information, which favoured rote learning and memorisation of facts. There were few open questions that allow students to develop their answers in more detail and demonstrate their understanding of a topic. Inspection of the suggested answers provided in the marking scheme showed the level of restriction on student answering: only specific key words and terminology were accepted in order to obtain full marks for a question. This practice further facilitated rote learning and teaching towards the test, making the attainment of higher order thinking very difficult (Harlen 2005).


Table 6. Frequency of cognitive objectives in the current curriculum on the higher level Leaving Certificate biology paper.

                 2004      2005      2006      2007      2008      Average
Knowledge       57.38%    56.57%    62.22%    65.87%    54.14%     59.24%
Comprehension   21.31%    27.27%    25.19%    25.40%    27.07%     25.25%
Application      9.84%     9.09%     3.70%     3.97%     9.02%      7.12%
Analysis         8.20%     7.07%     7.41%     4.76%     9.77%      7.44%
Synthesis        2.46%     0.00%     0.74%     0.00%     0.00%      0.64%
Evaluation       0.82%     0.00%     0.74%     0.00%     0.00%      0.31%

Table 7. Frequency of cognitive objectives in the current curriculum on the ordinary-level Leaving Certificate biology paper.

                 2004      2005      2006      2007      2008      Average
Knowledge       74.05%    80.49%    78.33%    88.49%    77.39%     79.75%
Comprehension   22.90%    13.82%    13.33%     9.35%    17.39%     15.36%
Application      1.53%     4.88%     5.00%     2.16%     2.61%      3.24%
Analysis         1.53%     0.00%     2.50%     0.00%     2.61%      1.33%
Synthesis        0.00%     0.81%     0.00%     0.00%     0.00%      0.16%
Evaluation       0.00%     0.00%     0.83%     0.00%     0.00%      0.17%

Frequency of marks for each objective in the cognitive domain present on the examination papers

The frequency of marks for each objective in the cognitive domain was examined in order to investigate what types of questions were rewarded in the exam (i.e. whether higher order questions received higher marks). Tables 8–11 show some variation in the percentages for the same objective. The lower objectives of the cognitive domain made up between 78.69% on the 2004 higher level paper and 98.19% on the 2003 ordinary-level paper. Table 12 highlights the imbalance between the percentage of questions and the percentage of marks for the lower and the higher cognitive domains. In certain instances the higher cognitive domain does receive a greater percentage of marks than its percentage of questions, showing that candidates were rewarded somewhat more for higher order thinking; however, this difference was too small to have any substantial impact on a candidate’s grade.
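As an illustration of how the Table 12 aggregates relate to Tables 4–11 (a sketch, not the authors’ code), the six per-objective averages can be collapsed into the lower and higher cognitive domains; the previous-curriculum higher level figures are used as the example.

```python
LOWER = {"knowledge", "comprehension"}

def collapse(averages: dict[str, float]) -> dict[str, float]:
    """Sum per-objective averages into the lower vs higher cognitive domains."""
    lower = sum(v for k, v in averages.items() if k in LOWER)
    higher = sum(v for k, v in averages.items() if k not in LOWER)
    return {"lower": round(lower, 2), "higher": round(higher, 2)}

# Previous curriculum, higher level: %Q averages (Table 4) and %M averages (Table 8).
q_avg = {"knowledge": 52.01, "comprehension": 33.69, "application": 8.35,
         "analysis": 4.74, "synthesis": 0.78, "evaluation": 0.43}
m_avg = {"knowledge": 41.90, "comprehension": 39.95, "application": 12.58,
         "analysis": 4.32, "synthesis": 0.73, "evaluation": 0.50}

print(collapse(q_avg))  # {'lower': 85.7, 'higher': 14.3}  -> %Q row of Table 12
print(collapse(m_avg))  # {'lower': 81.85, 'higher': 18.13} -> %M row (Table 12 reports
                        #    18.15; the small gap reflects rounding in the averages)
```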

Addressing the research questions

The study highlighted that not all objectives were present on every examination paper. Some examination papers catered for only the first three levels; this was particularly evident in the 2001 and 2007 ordinary-level papers.


Table 8. Frequency of marks per cognitive objective in the previous curriculum for the higher level Leaving Certificate biology paper.

                 1999      2000      2001      2002      2003      Average
Knowledge       31.30%    55.19%    39.50%    56.24%    27.27%     41.90%
Comprehension   38.83%    30.52%    25.95%    41.34%    63.12%     39.95%
Application     13.12%    10.52%    28.03%     1.61%     9.61%     12.58%
Analysis        13.25%     2.60%     4.95%     0.81%     0.00%      4.32%
Synthesis        2.73%     0.52%     0.39%     0.00%     0.00%      0.73%
Evaluation       0.78%     0.56%     1.17%     0.00%     0.00%      0.50%

Table 9. Frequency of marks per cognitive objective in the previous curriculum for the ordinary-level Leaving Certificate biology paper.

                 1999      2000      2001      2002      2003      Average
Knowledge       52.21%    71.43%    53.25%    68.81%    41.25%     57.39%
Comprehension   41.17%    20.00%    42.60%    28.44%    57.16%     37.87%
Application      3.12%     2.86%     4.16%     1.83%     0.80%      2.55%
Analysis         1.17%     1.04%     0.00%     0.92%     0.80%      0.79%
Synthesis        1.56%     4.68%     0.00%     0.00%     0.00%      1.25%
Evaluation       0.78%     0.00%     0.00%     0.00%     0.00%      0.16%

Table 10. Frequency of marks per cognitive objective in the current curriculum for the higher level Leaving Certificate biology paper.

                 2004      2005      2006      2007      2008      Average
Knowledge       51.74%    58.81%    62.20%    55.05%    49.05%     55.37%
Comprehension   28.80%    29.52%    25.19%    30.13%    23.66%     27.46%
Application     12.66%     8.25%     3.70%     7.10%    13.88%      9.12%
Analysis         5.06%     8.41%     7.41%     7.73%    13.41%      8.40%
Synthesis        1.42%     0.00%     0.74%     0.00%     0.00%      0.43%
Evaluation       0.32%     0.00%     0.74%     0.00%     0.00%      0.21%

The cognitive objectives present were mostly from the lower levels: knowledge, comprehension and application.


Table 11. Frequency of marks per cognitive objective in the current curriculum for the ordinary-level Leaving Certificate biology paper.

                 2004      2005      2006      2007      2008      Average
Knowledge       66.70%    74.88%    70.30%    82.99%    72.39%     73.45%
Comprehension   29.91%    16.59%    16.75%    12.76%    19.08%     19.02%
Application      1.90%     7.11%     5.06%     4.25%     6.16%      4.90%
Analysis         1.42%     0.00%     6.48%     0.00%     2.37%      2.05%
Synthesis        0.00%     1.42%     0.00%     0.00%     0.00%      0.28%
Evaluation       0.00%     0.00%     1.42%     0.00%     0.00%      0.28%

Table 12. Average percentage of cognitive objectives (%Q) and of marks per cognitive objective (%M) for both the previous and current curriculum.

                                     Previous curriculum (1999–2003)     Current curriculum (2004–2008)
                                     Higher level     Ordinary level     Higher level     Ordinary level
                                     %Q      %M       %Q      %M         %Q      %M       %Q      %M
Lower cognitive domain
  (knowledge, comprehension)         85.70   81.85    95.69   95.25      84.49   81.85    82.80   92.47
Higher cognitive domain
  (application, analysis,
  synthesis, evaluation)             14.30   18.15     4.31    4.75      15.51   18.15    18.16    7.53

This study highlights that, despite the syllabus reform in 2002, the percentage of cognitive objectives present in the current examination corresponds closely to that of its predecessor; there is little variation in the percentage of cognitive objectives present in the examination questions of the previous and current syllabus. The previous syllabus did, however, contain slightly more questions falling into the higher objectives, but only at a minimal level. The results also indicated a difference in the composition of cognitive objectives on the higher level and ordinary-level examination papers across both syllabuses. The higher level papers contained more questions from the analysis, synthesis and evaluation objectives than the ordinary-level papers. This was expected; however, the variation is marginal, with only about a 10% difference between higher level and ordinary-level examination papers. The final research question aimed to quantify the percentages of marks allocated to the different cognitive objectives of Bloom’s Taxonomy in the examination papers. The results showed that the current syllabus contained only a minor variation between the percentage of marks allocated to each cognitive objective and the percentage of objectives present on the papers.
The results did show some variation in the previous syllabus, most notably between the knowledge and comprehension objectives. Table 4 shows that the average percentage of knowledge objectives in those examination papers was 52.01%, while Table 8 shows that the average percentage of marks for the knowledge objective was 41.90%. The same tables show that the average percentage of comprehension objectives was 33.69%, while the average percentage of marks for the comprehension objective was 39.95% (Table 8). The comprehension objective therefore received a higher percentage of marks than its percentage of questions. The other objectives received percentages of marks roughly equal to their percentages of questions (see Tables 8–11), indicating that the previous syllabus rewarded higher levels of thinking somewhat more than the current syllabus does.

Conclusion and recommendation

This study strongly highlights how high-stakes examinations in Ireland have a narrow scope in terms of student achievement in cognitive thinking. It shows how the structure of the current examination procedures in biology (e.g. 100% paper-and-pencil tests) promotes shallow learning among second-level students. If we are to prepare second-level students for a wider working world, there needs to be an increase not only in the cognitive demands of the examination but also in the methods of assessment used to create a profile of achievement. There needs to be a move away from paper-and-pencil examinations as the sole means of accrediting students in the sciences. Useful learning does not arise from biology knowledge alone: students must be shown how knowledge relates to their everyday lives so that their interest can be maintained, yet teachers are stifled by the procedures of the high-stakes examinations. Although there is no doubt that teachers strive to help their students achieve their full potential, current procedures hinder inquiry-based and self-directed learning. This study serves to illustrate that the present ‘exam-oriented’ practices narrow scientific thinking skills, diminish the promotion of understanding of science concepts that reflect the authenticity of science and do little to promote inquiry-based learning (Harlen 2005; Hyland 2011; Orpwood 2001; Stecher and Barron 1999; Supovitz 2009; Wideen et al. 1997). Many reports and articles have called for changes to assessment procedures in the Irish education system (Bennett and Kennedy 2004; Government of Ireland 2006; Looney 2006; McCoy et al. 2014; NCCA 2003). If changes are to occur in the pedagogy of biology and in student learning, changes in assessment methods and assessment questions are needed, as there is a very strong relationship between assessment design and the effectiveness of student learning (Boud 1988). Despite the introduction of the new syllabus in 2002 (first examined in 2004), there is comparatively no difference in the percentages of questions asked at the higher and lower cognitive levels. This study highlights the imbalance between the higher and lower objectives of questioning in the examination papers and the skills that are valued by the Biology assessment developers. This low level of thinking promotes rote learning and regurgitation of facts, requiring little to no understanding of the topics (Harlen and James 1997).
The Chief Examiner’s report for Leaving Certificate biology (SEC 2013) confirms this, reporting that a large percentage of students did not answer the questions pertaining to the higher objectives. The small percentage of candidates answering these types of questions is an indication that many candidates were not equipped with the skills to answer questions they had not encountered before.
The report stated that even those who did answer these questions demonstrated very little ‘middle ground’: some answered them at a very thoughtful level, and usually correctly; others failed to answer the question actually being asked of them, or answered in such a way as to suggest that little thought had been given to the answer (SEC 2013, 15). This is evidence that few students are developing their higher order thinking skills during their study of biology. Assessment of the proposed new syllabus should move towards skills required for more modern practices, reflecting the skills needed for higher education and future employment. As part of the requirements for entry to third-level science courses, students need to achieve a minimum of a C3 grade in a science subject. Biology is often used as a gateway subject by many students venturing on to study science at third level. If second-level biology education is to adequately prepare students to participate in science courses at third level, they must leave school with the skills to think critically and analyse problems not only in biology but across all Science, Technology, Engineering and Maths subjects (Allen and Tanner 2005). It is increasingly important to ensure that students at all levels gain a deep understanding of biological systems and develop problem-solving skills that are adaptable and transferable across various disciplines (NRC 2009; Page and Reiss 2010; Teagasc 2011; Scott 2007; Zhang et al. 2010). A better balance must be sought between lower and higher cognitive objectives in the examination papers. Achieving this balance will contribute to the effective evaluation of students who are to become the skilled scientists and citizens of tomorrow. There has been widespread consensus that science teaching strategies should incorporate critical thinking and problem-solving skills through constructivist approaches and inquiry-based learning (Bybee 2000; Linn, Davis, and Bell 2004; Linn, Davis, and Eylon 2004; Linn et al. 2006; Task Force on the Physical Sciences 2002; Tobias and Duffy 2009). New directions are emerging in science curricula in Ireland. Teaching needs to concentrate more on scientific concepts than on the retention of information, de-emphasising the memorising of facts and placing further emphasis on the process of inquiry-based learning (OECD 2006). Considering that assessment often drives the syllabus and what is taught in the classroom, such activities can be fostered by good-quality assessment procedures at second level (Kellaghan 2000; Williams 1992). The assessments used to determine student achievement have been slow to change; the results of this analysis are testament to this, considering that analysis of examination papers covering two syllabuses, spanning six decades, showed little to no variance in the cognitive structure of the questions. The new draft syllabus released has indicated that there will be a change in the assessment procedures (NCCA 2011). However, there has been little mention of any move to introduce assessment-led reform as has happened in the UK, with the development of tests and league tables, and in the US, with the federal No Child Left Behind Act of 2001.
Nor is there any mention of changes to the written assessment procedures or of the development of assessment frameworks to guide the development of the examination. Evidence from this review shows that the questions predominantly classed as higher order were graph-type (analysis), debate- or argumentation-type (evaluation) and opinion-type questions (analysis, synthesis and evaluation). The cognitive demands of the written examination could be increased by incorporating the Nature of Science through a variety of assessment formats.
These could include concept cartoons, political and science-in-society debates, and case studies on the history of the biological sciences and scientists. This would provide depth and richness to what students learn (Erduran and Dagher 2014).

Final comments

The methodology of this study should be applied to other examination papers, not only in the science subjects but also in other subject areas. It is not sufficient for this to occur in one area; if the cognitive abilities of students are to be increased, it needs to happen across various subjects. This will aid in tackling legitimate concerns about high-stakes testing: that standards are lowered, that problem-solving skills are not fostered, and that students lack the basic skills needed for employment and are unable to compete internationally in the jobs market. There is a need for academic leadership and cross-collaboration between the syllabus and assessment research communities, as those knowledgeable in their respective areas are often not up to date with developments in the other community (Orpwood 2001). Dialogue across this boundary is required for real change to occur in both areas, particularly in assessment.

Notes on contributors

Ms Alison Cullinane is a Ph.D. candidate in EPI∙STEM – National Centre for STEM Education, based in the Department of Education and Professional Studies, University of Limerick, Republic of Ireland.

Dr Maeve Liston is currently Director of Enterprise and Community Engagement and a lecturer in Science Education in the Department of Learning, Society, and Religious Education at Mary Immaculate College, Limerick, Republic of Ireland.

References Allen, K., and K. Tanner. 2005. “Approaches to Biology Teaching and Learning: Understanding the Wrong Answers – Teaching Towards Conceptual Change.” Life Science Education 4 (2): 112–117. Anderson, L. W., D. R. Krathwohl, P. W. Airasian, K. A. Cruikshank, R. E. Mayer, P. R. Pintrich, J. Raths, and M. C. Wittrock. 2001. A Taxonomy for learning, teaching, and assessing: a revision of Bloom’s Taxonomy of Educational Objectives. Abridged Edition. New York: Longman. Baird, J., T. N. Hopfenbeck, J. Elwood, D. Caro, and A. Ahmed. 2014. “Predictability in the Irish Leaving Certificate.” OUCEA Report 14/1 commissioned by the State Examinations Commission, Dublin, Ireland. Bennett, J., and D. Kennedy. 2004. “Practical Work at the Upper High School Level: The Evaluation of a New Model of Assessment.” International Journal of Science Education 23 (1): 97–110. Black, P., and D. Wiliam. 1998. “Assessment and classroom learning.” Assessment in Education. Principles, Policy & Practice 5 (1): 7–74. Bloom, B., M. Englehart, E. Furst, W. Hill, and D. Krathwohl. 1956. Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York, Toronto: Longmans, Green. Boud, D. Ed. 1988. Developing Student Autonomy in Learning. London: Kogan Page, Second Edition.


Bybee, R. W. 2000. “Teaching Science as Inquiry.” In Inquiring into Inquiry Learning and Teaching Science, edited by E. H. van Zee, 20–46. Washington, DC: AAAS. Cohen, L., L. Manion, and K. Morrison. 2007. Research Methods in Education, 6th ed. New York: Routledge. Creswell, J. 1994. Research Design: Qualitative and Quantitative Approaches. London: Sage. Crooks, T. J. 1988. “The Impact of Classroom Evaluation Practices on Students.” Review of Educational Research 58: 438–481. Dalton, J., and D. Smith. 1986. Extending Children’s Special Abilities - Strategies for Primary Classrooms. http://www.teachers.ash.org.au/researchskills/dalton.htm Department of Education and Science. 2001. Leaving Certificate Biology Syllabus. Dublin: The Stationary Office. Department of Education and Science. 2004. A Brief Description of the Irish Education System. Accessed October 10, 2013. http://www.most.ie/webreports/Fatima%20reports/School/dept_ education_system04.pdf Department of Education and Skills. 2010a. ‘Speech by the Minister for Education and Science, Batt O’Keeffe TD, at the launch of the 2010 Agricultural Science Farm Talk ’n Walk Series’, Accessed March 10, 2011. http://www.education.ie/en/Press-Events/Speeches/2010-Speeches/ SP10-02-22.html Department of Education and Skills. 2010b. The National Strategy for Higher Education to 2030. Report of the Strategy Group. Dublin: Government Publication Office. Desmond, T., and M. Desmond. 2008. “Large scale Assessment – Maintaining Public Confidence in High Stakes state examinations.” In Proceedings 34th Annual Conference of the International Association for Educational Assessment, Cambridge, September. Donnelly, K. 2015. “Teachers to picket every second-level school again.” The Irish Independent, 23 April, Accessed July 8, 2015. http://www.independent.ie/irish-news/education/teachers-topicket-every-secondlevel-school-again-31165491.html. Engineers Ireland. 2010. Report of Taskforce on Education of Mathematics and Science at Second Level. Dublin: Department of Education and Science. Erduran, S., and Z. R. Dagher. 2014. “Regaining focus in Irish Junior Cycle Science: Potential New Directions for Curriculum and Assessment on Nature of Science.” Irish Educational Studies 33 (4): 335–350. Forfás. 2009. Health Life Sciences in Ireland – An Enterprise Outlook. Dublin: Forfás. Gardner, J., Harlen, W., Hayward, L., Stobart, G., and M. Montgomery. 2010. Developing Teacher Assessment. Berkshire: Open University Press. Geiser, S., and M. V. Santelices. 2007. “Validity Of High-School Grades In Predicting Student Success Beyond The Freshman Year: High-School Record vs. Standardized Tests as Indicators of Four-Year College Outcomes.” Center for Studies in Higher Education: Research & Occasional Paper Series 6: 1–35. Government of Ireland. 2006. Strategy for Science, Technology & Innovation, 2006–2013. Dublin: Government of Ireland. Government of Ireland. 2008. Building Ireland’s Smart Economy: A Framework for Sustainable, Economic Renewal. Dublin: Government of Ireland. Hamilton, L., B. M. Stecher, and S. P. Klein. 2002. Making Sense of Test-based Accountability in Education. Santa Monica, CA: RAND. Harlen, W. 2005. “‘Teachers’ Summative Practices and Assessment for Learning – Tensions and Synergies.” The Curriculum Journal 16 (2): 207–223. Harlen, W., and M. James. 1997. “Assessment and Learning: Differences and Relationships Between Formative and Summative Assessment.” Assessment in Education: Principles, Policy & Practice 4 (3): 365–379. Hyland, A. 2011. 
“Entry to Higher Education in Ireland in the 21st Century.” NCCA/HEA Seminar, September 21, 1–24. Dublin: Higher Education Authority. IBM. 2008. Global CEO Survey: The Enterprise of the Future, Life Sciences Industry Edition, August. Kellaghan, T. 2000. “Using Assessment to Improve the Quality of Education.” In International Working Group on Education, Florence, June 14–16, 2000, 1–34. Linn, M. C., E. A. Davis, and P. Bell, Eds. 2004. Internet Environments for Science Education. Mahwah, NJ: Lawrence Erlbaum Associates.


Linn, M. C., E. A. Davis, and B. S. Eylon. 2004. “The Scaffolded Knowledge Integration Framework for Instruction.” In Internet Environments for Science Education, edited by M. C. Linn, E. A. Davis, and P. Bell, 47–72. Mahwah, NJ: Lawrence Erlbaum Associates. Linn, M. C., H. S. Lee, R. Tinker, F. Husic, and J. L. Chiu. 2006. “Inquiry Learning: Teaching and Assessing Knowledge Integration in Science.” Science 313 (5790): 1049–1050. Looney, A. 2006. “Assessment in the Republic of Ireland.” Assessment in Education: Principles, Policy and Practice 13 (3): 345–353. Marzano, R. J., and Kendall, J. S. 2007. The New Taxonomy of Educational Objectives. Thousand Oaks, CA: Corwin. McCoy, S., Smyth, E., Watson, D., and Darmody, M. 2014. Leaving School in Ireland: A Longitudinal Study of Post-School Transitions, 36. Dublin: The Economic and Social Research Institute. NCCA. 2002. Biology Leaving Certificate Ordinary Level and Higher Level Guidelines for Teachers. Dublin: Government Publications. NCCA. 2003. Junior Certificate Curriculum, Department of Science and Education. Dublin: The Stationery Office. NCCA. 2008. “National Council for Curriculum and Assessment.” Accessed December 2008. www.ncca.ie. NCCA. 2011. Leaving Certificate Biology: Draft Curriculum for Consultation. Dublin: Government Publications. NRC. 2009. A New Biology for the 21st Century: Ensuring the United States Leads the Coming Biology Revolution. Washington, DC: National Academies Press. Accessed December 5, 2009. www.nap.edu/catalog.php?record_id_12764. Ó’ Buachalla, S. 1988. Education Policy in the Twenty Century Ireland. Dublin: Wolfhound Press. OECD. 2006. Evolution of Student Interest in Science and Technology Studies. May 2006. Orlich, C., R. Harder, R. Callahan, M. Trevisian, and A. Brown. 2004. Teaching Strategies: A Guide to Effective Instruction. 7th ed. Boston: Houghton Mifflin Company. Orpwood, G. 2001. “The Role of Assessment in Science Curriculum Reform.” Assessment in Education: Principles, Policy & Practice 8 (2): 135–151. O’Sullivan, D. 1985. “Science Teaching in Ireland Today.” Unpublished final year dissertation thesis Thomond College, Limerick. Page, G., and M. Reiss. 2010. “Biology Education Research.” Journal of Biological Education 44 (2): 51–52. Pollitt, A., C. Huchinson, N. Entwistle, and C. De Luca. 1985. What Makes Exam Questions Difficult? Edinburgh: Scottish Academic Press. Scott, N. R. 2007. “NanoScience in Veterinary Medicine.” Veterinary Research Communications 31 (Suppl. 1): 139–144. State Examination Commission (SEC). 2013. “Leaving Certificate Examination 2013 Biology Chief Examiner’s Report.” State Examination Commission. https://www.examinations.ie/ archive/examiners_reports/Chief_Examiners_Report_Biology_2013.pdf. Stecher, B., and S. Barron. 1999. “Test Based Accountability: The Perverse Consequences of Milepost Testing.” Paper presented at the annual meeting of the American Educational Research Association, Montreal, Canada. Supovitz, J. 2009. “Can High-Stakes Testing Leverage Educational Improvement? Prospects from the Last Decade of Testing and Accountability Reform.” Journal of Educational Change 10 (Anniversary Issue): 211–227. Swart, A. J. 2010. “Evaluation of Final Examination Papers in Engineering: A Case Study Using Bloom’s Taxonomy.” IEEE Transactions on Education 53 (2): 257–264. Task Force on the Physical Sciences. 2002. Report and Recommendations of the Task Force on the Physical Sciences. Dublin: Department of Education and Science. Teagasc. 2011. 
“Agricultural Education: supporting Economic Recovery.” Accessed October 12, 2014. http://www.teagasc.ie/news/2011/201101–26.asp. Thomas, J. W., L. Bol, R. W. Warkentin, M. Wilson, A. Strage, and W.D. Rohwer. 1993. “Interrelationships Among Students’ Study Activities, Self-Concept of Academic Ability, and Achievement as a Function of Characteristics of High-School Biology Courses.” Applied Cognitive Psychology 7: 499–532. Tobias, S., and T. M. Duffy. 2009. Constructivist Instruction: Success or Failure. London: Routledge.


Wideen, M. F., T. O’Shea, I. Pye, and G. Ivany. 1997. “High-Stakes Testing and the Teaching of Science.” Canadian Journal of Education 22 (4): 428–444. Williams, K. 1992. “Assessment: A Discussion Paper, Dublin Association of Secondary Teachers.” Ireland. Zhang, L., D. Pornpattananangkul, C.-M. J. Hu, and C.-M. Huang. 2010. “Development of Nanoparticles for Antimicrobial Drug Delivery.” Current Medicinal Chemistry 17: 585–594.