
Area (2012) 44.2, 178–185

doi: 10.1111/j.1475-4762.2011.01077.x

External examiners and the continuing inflation of UK undergraduate geography degree results

John E Thornes
School of Geography, Earth and Environmental Sciences, University of Birmingham, Birmingham B15 2TT
Email: [email protected]

Revised manuscript received 1 November 2011

The percentage of 'good' (first or 2.1) geography degrees in the UK has risen from around 40 per cent in the early 1970s to 71 per cent in 2010. The likely reasons for this are discussed in detail. Each university has its own weighting and variant scheme to calculate final degree results, but there is little evidence of consistency within or among universities. The role of undergraduate external examiners is discussed, including their need to create a 'level playing field' at a time when their role is rapidly changing and their influence is declining. Borderline cases between degree classes are increasingly being pre-classified automatically, and the days of vivas and adjusted marks are long gone in most universities. Nevertheless, undergraduate external examiners should continue to be concerned at the continuing rise in the percentage of 'good' degrees, and they need to be aware of the comparative 'good' degree statistics, over a number of years, in the departments they examine for. A new scheme is proposed to circulate to all undergraduate external examiners in geography an annual table of the percentage of 'good' degrees across the country. There will be an increasing shift in emphasis from external examiners moderating the results of individual students to moderating the marks of individual courses. The likely impact on external examiners of both higher education achievement reports (HEARs) and the likely introduction of grade point averages (GPAs) to replace traditional degree classifications is also discussed, and a new hybrid degree classification system that is easy for employers to understand is presented.

Key words: external examining, degree results, 'good' degrees, new degree classification

Introduction

When I graduated with a geography BSc degree in 1972 from the University of Manchester, I had no idea how my degree result was calculated. I do remember having a very tense 'viva' in which I had to revisit a selection of my weaker exam answers, hopefully to the satisfaction of the ring of internal and external examiners. Apart from the dissertation, all of the courses I took were assessed by exams at the end of the third year (finals), and half of those third-year courses were compulsory. Today exam systems are modular, many, if not most, courses are optional in the second and third years, and continuous assessment means that the final exam period is no longer to be feared. Does this now make it easier to obtain a 'good' (first or 2.1) degree? Today students still seem to know very little about how their final degree classification is calculated, and even less about the role of the external examiners, despite the fact that details are usually published in their student handbook. Students, however, do know that they want to achieve a 'good' degree, and hopefully they are prepared to work hard to achieve one.

The inspiration for this paper has arisen from my experiences as an undergraduate external examiner, but it is also based on my own experiences as a geography student and as an internal examiner for the last 38 years (30 years at the University of Birmingham and 8 years at University College London). I have always been curious to discover what proportion of 'good' geography degrees are awarded in different universities across the UK, and why that proportion has increased so markedly over time. I had seen the number of 'good' degrees rise steadily in my own department and university, but I had not checked to see if this rise was happening simultaneously elsewhere. For example, in 1972 there were 55 BA/BSc graduates in geography single honours from the University of Birmingham, from which two firsts and nineteen 2.1s were awarded, giving a proportion of 38 per cent 'good' degrees.


Fast forward to 2010: there were 123 BA/BSc graduates in geography single honours, of which 13 firsts and 92 2.1s were awarded, giving a proportion of 85 per cent 'good' degrees. In my first year as an undergraduate external examiner, at another Russell Group¹ university, I calculated that the proportion of 'good' degrees was lower than at Birmingham, and I felt that it was part of my duties to find out how significant this was. I soon discovered that there are no detailed published statistics on this metric, despite it being an obvious theme for the attention of internal and external examiners, students, parents and of course the editors of 'university guides and league tables'.

A number of questions immediately sprang to mind. How typical are these statistics across the UK? Are external examiners aware of the increasing proportion of 'good' degrees? Should external examiners ask for and be given these statistics each year? What are the reasons for this inflation of success at a time when student numbers have more than doubled? If the vast majority of students are now being given 'good' degrees, has the degree classification system become worthless, especially to employers? What, if anything, can be done to ensure a level playing field? Or should we just ignore this grade inflation – or indeed should we be proud of our achievements, having raised standards to such an extent over the years? I always tell my first year tutorial students that they are all capable of achieving at least a 2.1 if they work hard and read widely!

In a recent review of university assessment Peter Williams, the chief executive of the Quality Assurance Agency (QAA), which regulates quality in higher education, said there was currently no way of ensuring equal standards between universities. He said there were 118 awarding bodies which had approval to award their own degrees, and that there was 'no evidence of consistency between subjects in institutions and between institutions' (Woolcock 2008). Is it therefore impossible, or pointless, for external examiners to try to compare standards across universities? Will the proposed shift towards higher education achievement reports (HEARs; Burgess 2009) and/or grade point averages (GPAs; Times Higher Education 2011) make the external examiner's job easier or more difficult?

The role of the undergraduate external examiner

The joys of being an undergraduate external examiner! Having survived weeks of marking, mitigations and examiners' meetings in your own institution, you then volunteer to pontificate over illegible scripts and indecipherable red ink in another institution. In most universities there is no training in how to operate as an external examiner; you just have to rely on experience (France and Fletcher 2004, 8).

The system normally works like this: when internal examiners have produced a final mark (after internal moderation) for each student, for each course/module, the marks are compiled to calculate the student's overall average mark, taking into account weightings for first, second and third year marks. For example, at Birmingham the first year marks are not included in the final weighting, the second year marks count for 25 per cent and the third year marks for 75 per cent. Different universities have different weightings. However, it is nearly always the case that a weighted average mark of between 60 per cent (normally 59.5 per cent rounded up to 60 per cent) and 69.4 per cent will constitute a 2.1, whereas a mean mark of 69.5 per cent or greater will constitute a first-class degree.

It has long been recognised that an average mark tells only part of the story, and a variety of methods have been used to take into account the full mark profile of a student. For example, at Birmingham we use what is called a profiling variant that lowers the class thresholds if there is a preponderance of marks in the class above (Table 1).

Table 1 The profiling variant – classification scheme, University of Birmingham School of Geography, 2010

First: either
  weighted average 70 or above; or
  weighted average of 67.0–69.4 and more than 240 (i.e. more than half) units at 70 or above, and no fails or fails compensated by an equal number of additional passes at 70 or above, to a maximum of 60 units (e.g. if 20 units are failed, then more than 260 units must be 70 or over); or
  weighted average of 67.0–69.4 and exactly 240 units at 70 or above, with not less than 80 units between 60 and 69, and no fails.

Upper second: either
  weighted average 60–69; or
  weighted average of 57.0–59.4 and more than 240 units at 60 or above, and no fails or fails compensated by an equal number of additional passes at 60 or above, to a maximum of 60 units (e.g. if 20 units are failed, then more than 260 units must be 60 or over); or
  weighted average of 57.0–59.4 and exactly 240 units at 60 or above, with not less than 40 units at 70 or above, and no fails.

In this scheme, final year credit values are multiplied by three to ensure the ratio of final year to second year weighting is 75:25. Thus there are 3 × 120 = 360 units from the final year and 120 units from the second year, giving a total of 480 units. Classification begins by computing a weighted average of all second- and third-year module results.
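To make these mechanics concrete, the following is a minimal sketch of the weighting and banding just described. The module marks and credit values, the pass mark of 40 and the function names are assumptions for illustration; only the 25:75 weighting, the 59.5/69.5 rounding convention and the first (no-fails) branch of the Table 1 profiling test are taken from the text.

```python
# A minimal sketch of the classification mechanics described above and in
# Table 1; compensated-fail branches are omitted for brevity.

def weighted_average(second_year, third_year):
    """Credit-weighted mean with final-year credits tripled (25:75 split).

    Each argument is a list of (mark, credits) pairs; with 120 credits per
    year this gives 120 + 360 = 480 units, as in the note to Table 1.
    """
    units = list(second_year) + [(m, c * 3) for m, c in third_year]
    return sum(m * c for m, c in units) / sum(c for _, c in units)

def classify(avg):
    """Band the weighted average; 59.5 rounds up to a 2.1, 69.5 to a first."""
    if avg >= 69.5:
        return "First"
    if avg >= 59.5:
        return "2.1"
    if avg >= 49.5:
        return "2.2"
    if avg >= 39.5:
        return "Third"
    return "Fail"

def profiling_first(second_year, third_year):
    """First branch of the Table 1 profiling variant only: an average of
    67.0-69.4 qualifies for a first if more than 240 of the 480 units are
    at 70 or above and there are no fails."""
    avg = weighted_average(second_year, third_year)
    units = list(second_year) + [(m, c * 3) for m, c in third_year]
    units_at_70 = sum(c for m, c in units if m >= 70)
    no_fails = all(m >= 40 for m, _ in units)  # pass mark of 40 assumed
    return avg >= 69.5 or (67.0 <= avg <= 69.4
                           and units_at_70 > 240 and no_fails)

# Hypothetical borderline candidate: all 360 final-year units at 70+.
second = [(65, 60), (62, 60)]           # year 2: two 60-credit halves
third = [(72, 40), (71, 40), (70, 40)]  # year 3: three 40-credit modules
avg = weighted_average(second, third)
print(round(avg, 2), classify(avg))     # 69.12 2.1 on the plain average
print(profiling_first(second, third))   # True: profiling lifts it to a first
```

The worked example shows exactly the situation the profiling variant is designed for: a plain average just below the first-class boundary, rescued by a preponderance of marks in the class above.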


Although the weighted average degree class bands are the same for all UK universities, most universities have their own weighting and variant that help to push some borderline students up into the next class. There is a huge range of schemes, and Curran and Volpe (2003, 15) show that in some universities it is possible to be considered for a first even with an overall average as low as 51 per cent! One of the roles of the external examiner is to ensure that the weighting and variant schemes are being applied properly and fairly, although they have no say in their original derivation or apparent defects (criticisms can, however, be aired in the external examiners' reports).

The role of the undergraduate external examiner has changed greatly in recent years. In 2004 the QAA's Code of practice for external examining suggested that an institution should ask its undergraduate external examiners, in their expert judgement, to report on:

1 Whether the academic standards set for its awards, or part thereof, are appropriate.
2 The extent to which its assessment processes are rigorous, ensure equity of treatment for students and have been fairly conducted within institutional regulations and guidance.
3 The standards of student performance in the programmes or parts of programmes which they have been appointed to examine.
4 Where appropriate, the comparability of the standards and student achievements with those in some other higher education institutions.
5 Good practice they have identified.

The specific roles of external examiners vary across subjects and institutions, but typically include the following (quoted from Higher Education Funding Council for England 2009, 15):

1 Approval of examination questions (external examiners can and do ask for changes).
2 Advice on continuous assessment or coursework.
3 Moderation of assessment results, following an internal double-marking and/or moderation process, through sampling student assessment and/or examination scripts and by looking at the overall spread, breakdown and comparability of marks.

Not surprisingly, the need for undergraduate external examiners and the quality of their verification role has been continually questioned over the years (Williams 1979; Warren Piper 1985; Williams 1986; Chapman 1994; Dearing Report 1997; Cuthbert 2003; Quality Assurance Agency for Higher Education 2009; Universities UK 2011). The vast majority of countries around the world do not see the need for undergraduate external examiners and only use external examiners for postgraduate degrees:

    In the 'real world' the outcome is that we have retained [undergraduate] external examiners, but do they fulfil any function other than ceremonial? (Cuthbert 2003, 6)

In the last few years most institutions have considerably changed their internal procedures for awarding undergraduate degrees. For example, at Birmingham, we have modularised our degree programme, introduced anonymous marking, eliminated the 'viva' for marginal candidates, introduced computerised ranking of both BA and BSc candidates, and standardised and broadened mitigation procedures. Table 1 shows that to be considered for a first at Birmingham an overall average of just 67 per cent is required (and for a 2.1 just 57 per cent), compared with 68 per cent and 58 per cent in the recent past. In our school, until this year, internal examiners were always asked to relook at calculated individual course marks of 39, 49, 59 and 69 per cent (the mean of several exam/coursework questions) with a view to pushing the mark up into the next class, or pushing it down to 38, 48, 58 or 68 per cent; but from this year calculated course marks of 39, 49, 59 or 69 per cent have been frozen. The degree result will be automatically calculated using the Birmingham weighting and variant scheme. As a result, from this year, our external examiners will no longer be asked to look at marginal cases except those involving serious mitigation issues. Thus an amount of tinkering with the system is always going on.

The role of the external examiner has undoubtedly become less onerous. The days of a continuous stream of viva candidates are long gone. It is much more a 'rubber stamping' exercise today as far as marking consistency and actual marks are concerned. Indeed, increasingly, marks cannot be changed by the external examiner. The vast majority of degrees are now pre-classified anonymously and the number of borderline candidates has been reduced considerably.

One of the most important remaining duties of an undergraduate external examiner is to certify that the boundaries between degree classes are consistent, within their bounds of experience: to ensure that a 'good' degree (first or 2.1) at one institution meets the same standard as at their home university and, hopefully, at any other institution in the UK. Most external examiners will only have experience of a handful of departments. What evidence is available to external examiners and students to ensure that this duty is accomplished? And is a 'good' degree awarded today the same as a 'good' degree awarded in the past?
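Before turning to the statistics, the borderline-marks change described above can be made concrete with a small sketch. The function names and the push-up/push-down decision are invented for illustration; the borderline values and the freezing rule come from the text.

```python
# Hypothetical illustration of the borderline convention at Birmingham.
# Old practice: a computed module mark of 39, 49, 59 or 69 was revisited
# by the internal examiners and nudged into the band above or below.
# New rule: the calculated mark is frozen and feeds the variant scheme.

BORDERLINE = {39, 49, 59, 69}

def old_practice(mark, push_up):
    """Old convention: an examiner decision moved 69 to 70, or down to 68."""
    if mark in BORDERLINE:
        return mark + 1 if push_up else mark - 1
    return mark

def new_rule(mark):
    """New rule: borderline marks stand exactly as calculated."""
    return mark

print(old_practice(69, push_up=True))   # 70
print(old_practice(69, push_up=False))  # 68
print(new_rule(69))                     # 69
```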

The proportion of 'good' degrees

As the number of students admitted to read geography, and other subjects, has expanded considerably over the last 20 years, one would expect, all else being equal, that the proportion of 'good' degrees might decrease as the number of less qualified undergraduates has increased. The facts show the opposite. In 1973 the proportion of UK geography students graduating with a 'good' degree was about 40 per cent, rising to 52 per cent by 1990 (Chapman 1994) and to 71 per cent by 2010 (Higher Education Statistics Agency – HESA 2010).

However, there are many other factors at play that might account for why students are getting better results today than in the past. For example: due to modularisation, students take fewer compulsory courses than in the past; e-libraries mean that many textbooks and journal articles are available online; intranets (e.g. Blackboard) give students copies of lecture notes and PowerPoint presentations; widespread access to the web makes gathering up-to-date information easier; desk-top publishing has improved the presentation of coursework; and students are more motivated to do well now that they are effectively paying towards the cost of their degrees. Hence it is not surprising that the percentage of 'good' degrees is continuing to rise and the number of 2.2s is declining. Bate (2008), in his article 'Scandal of the missing 2.2s', suggests that several institutions have also put pressure on academics to give more 'good' degrees, and the growth of league tables in the media has added to the mix. Geography and other departments are now ranked according to 'value added', staff/student ratios, library facilities, equipment budgets, percentage of 'good' degrees and so on. Therefore the route from entry qualifications (A-level scores) to the percentage of 'good' degrees is influenced by a variety of factors that the external examiner has to try to digest.

Normally an external examiner will be in place for 3 years, and in geography it is commonplace to have two external examiners (one human and one physical) who start in different years to give some continuity. It is not, in my experience, standard practice to provide statistics on the percentage of degree classes over the years. One would expect that the percentage of 'good' degrees would not fluctuate much from year to year, given the large class sizes. In effect, the percentage of 'good' degrees in geography departments is being kept a closely guarded secret!

Not surprisingly, there is also pressure to get rid of degree classification altogether and replace it with an American-style GPA or transcript (Times Higher Education 2011). Many institutions are also piloting the HEAR (http://www.hefce.ac.uk/learning/diversity/achieve/), which contains much more information about a student's achievements throughout their degree programme, such as individual module marks, extracurricular activities, prizes, work experience, charity work and 'ambassador' activities at open days (Burgess 2007, 2009; Higher Education Funding Council for England 2009; Woolcock 2009). For example, the University of Manchester HEAR contains several pages of useful information about the student, built up over all undergraduate years, available to potential employers through a secure link (Agnew 2011). Unfortunately, such a set of reports would make the between-university verification function of the external examiner even more difficult. For comparison purposes it would be much more realistic to retain degree results as part of the achievement reports.

As an external examiner it should be an imperative to compare the percentage of 'good' degrees achieved in the department you are examining with the achievements of other geography departments. The information presented here looks at 'good' degree data for 39 universities with more than 400 graduates over the 6-year period 2002/2003 to 2007/2008. The data are readily available, for a fee, from HESA, although certain changes mean that the comparison with Chapman (1994) has to be treated with some caution, as the number of universities offering geography degrees has increased to provide almost 6000 graduates per year from nearly 120 institutions. It must also be noted that the HESA data discussed here include all L7 and F8 courses, of which 81 per cent are single honours L700 (human geography) and F800 (physical geography) graduates.

The geography national average of 'good' degrees in 2009/2010 was 71 per cent (5805 graduates), and Figure 1 shows how the percentage of 'good' degrees has increased over time. The national average for all subjects in 2009/2010 was 63 per cent (350 860 graduates). Figure 2 compares the percentage of 'good' geography degrees from 39 universities (with more than 400 geography graduates; see Table 2) between 2002/2003 and 2007/2008 with the corresponding graduate mean total A-level and A-level-equivalent entry scores (r = 0.85; R² = 72%). It is not really surprising that there is such a high correlation between qualifications obtained before university and the percentage of 'good' degrees awarded after 3 years of university education: a good student will nearly always remain a good student. As can be seen in Figure 2, nine universities are averaging 75 per cent or more 'good' degrees, and two are averaging greater than 90 per cent! One could argue that those universities above the regression line are 'adding value' whereas those below the line are not. This is essential information that should be provided to external examiners to help assess 'the comparability of the standards and student achievements with those in some other higher education institutions' (Quality Assurance Agency for Higher Education 2004). Indeed, one could argue that in the era of the Freedom of Information Act such relevant information might also be made available to schools, prospective students and parents at open days. Table 2 shows the total number of geography graduates (L7 + F8) in each department between 2003 and 2008 and their percentage of 'good' degrees.
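As an aside, the kind of regression and 'value added' reading behind Figure 2 takes only a few lines to reproduce. The sketch below assumes NumPy is available; the entry-score/degree pairs are illustrative placeholders, not the HESA figures behind the published figure.

```python
# Sketch of the Figure 2 comparison: regress percentage of 'good' degrees
# on mean A-level entry points and treat the residual as 'value added'.
import numpy as np

# (mean A-level points, % good degrees) -- invented illustrative values
data = np.array([
    [420, 93], [415, 92], [400, 89], [430, 88], [360, 79],
    [355, 77], [340, 75], [310, 66], [280, 58], [240, 41],
])
points, good = data[:, 0], data[:, 1]

r = np.corrcoef(points, good)[0, 1]
slope, intercept = np.polyfit(points, good, 1)
residuals = good - (slope * points + intercept)  # +ve -> above the line

print(f"r = {r:.2f}, R^2 = {r**2:.0%}")
for (p, g), res in zip(data, residuals):
    print(f"entry {p}: {g}% good degrees, value-added {res:+.1f}")
```

Departments with positive residuals sit above the fitted line, i.e. they award more 'good' degrees than their entry qualifications alone would predict.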


Figure 1 Average percentage of 'good' geography degrees, 1973–2010. Source: Chapman (1994) and Higher Education Statistics Agency (2010)

Figure 2 Average percentage of firsts and 2.1s versus mean A-level scores, 2002/2003–2007/2008 (R² = 72%); 39 universities with more than 400 geography graduates over the 6 years. Note that the A-level scores include all possible points, e.g. from AS levels and other qualifications. Source: Higher Education Statistics Agency (2010)


Table 2 Total number of geography graduates 2003–2008 and percentage with 'good' degrees (from 39 universities that had more than 400 graduates taking L7 and F8 courses)

University                                        No. of graduates   'Good' degrees (%)
0112 The University of Bristol                                 566                   93
0156 The University of Oxford                                  551                   92
0155 The University of Nottingham                             1081                   89
0114 The University of Cambridge                               603                   88
0124 The University of Leeds                                  1823                   79
0160 The University of Southampton                            1139                   79
0157 The University of Reading                                 874                   77
0119 The University of Exeter                                  973                   76
0116 University of Durham                                     1127                   75
0167 The University of Edinburgh                               807                   75
0204 The University of Manchester                             1154                   74
0149 University College London                                 605                   74
0179 Cardiff University                                        450                   74
0126 The University of Liverpool                               834                   73
0162 The University of Sussex                                  463                   72
0110 The University of Birmingham                             1467                   70
0154 The University of Newcastle-upon-Tyne                     961                   67
0159 The University of Sheffield                              1186                   66
0177 Aberystwyth University                                    893                   66
0051 The University of Brighton                                464                   66
0152 Loughborough University                                   737                   65
0168 The University of Glasgow                                 550                   61
0073 The University of Plymouth                               1528                   60
0134 King's College London                                     605                   60
0125 The University of Leicester                               550                   59
0180 Swansea University                                        756                   58
0081 University of the West of England, Bristol                656                   58
0069 The University of Northumbria at Newcastle                548                   55
0123 The University of Lancaster                               995                   54
0120 The University of Hull                                    537                   54
0141 Royal Holloway and Bedford New College                    438                   54
0066 The Manchester Metropolitan University                   1035                   53
0063 Kingston University                                       420                   52
0184 The Queen's University of Belfast                         861                   51
0074 The University of Portsmouth                              836                   51
0139 Queen Mary and Westfield College                          515                   51
0170 The University of Aberdeen                                466                   45
0056 Coventry University                                       414                   41
0038 University of Cumbria                                     426                   37

The University of Leeds had the most graduates, with 1823, and the University of Bristol awarded the highest proportion of 'good' degrees, at 93 per cent.

Conclusion

We cannot ignore the statistics that tell us that there has been a huge rise in the proportion of 'good' degrees awarded in geography, from 40 per cent on average in 1973 to 71 per cent in 2010, although undoubtedly we have to be aware that statistics do not tell us everything. This change has taken place in the UK despite the presence of undergraduate external examiners, who appear to have been largely powerless to avert this huge increase. The influence of undergraduate external examiners has been reduced over the last few decades, and it is very unlikely that they could have prevented this rise. Certainly some universities have been putting pressure on internal examiners to raise the proportion of 'good' degrees in an attempt to climb up the league tables and thereby attract better students. The complexity of our degree classification weightings and variants means that it is more than likely that students with identical mark profiles are awarded different degrees depending upon which university they attend.


What should employers make of this? At a time when graduate employment is scarce, perhaps a first will become the normal requirement for recruitment? The number of firsts in geography has risen from 8 per cent in 2003 to 12 per cent in 2010, but is more than 20 per cent in several universities. It is my contention that monitoring the proportion of firsts and 'good' degrees across UK universities is very useful in helping to judge the performance of individual departments as well as the performance of individual students. Rather than each external examiner having to produce this kind of information for themselves, it would be much better if an updated 'spreadsheet' was supplied to external examiners each year. The cost of this exercise could be split between geography departments and administered by the geography Heads of Departments Committee that meets annually under the auspices of the Royal Geographical Society (with the Institute of British Geographers – IBG). Of course this would require the coordination of undergraduate external examiners, but surely this would be a good development. Indeed, the Royal Geographical Society (with IBG) might be persuaded to host an annual meeting for new and old geography undergraduate external examiners to try to ensure a more 'level playing field' for the future.

In summary, the role of the undergraduate external examiner in pursuing 'a level playing field' is still very important, and any additional information that can make their verification role easier is to be welcomed. HESA already stores all the necessary information to provide a simple annual update on the percentage of 'good' degrees from each geography department – so let the statistics be used in a positive way. If this is not done officially then inevitably the statistics will emerge in a piecemeal fashion, and the reputation of geography graduates, among employers, will suffer as a consequence.

Finally, how will the role of the external examiner change when and if the new HEARs are offered to new university entrants from October 2012? This has yet to be thought through. Presumably these comprehensive reports will initially contain a degree classification, soon to be replaced by a form of GPA and an academic transcript of individual course marks. The external examiner cannot possibly moderate every student achievement report. Inexorably the externals will be asked to concentrate on moderating module marks rather than individual students. However, the same problem of comparing GPAs across universities will emerge, and it will be just as important to publish comparison figures. For example, a 'good' degree may be defined as having an overall GPA of ≥3.0 out of 4.0, or ≥4.0 out of 5.0, whatever scheme is introduced. What percentage of students achieve this goal? The form and scale of GPA to be used has yet to be agreed, and every country around the world has a different system.

Table 3 Proposed new scheme (16 bands) expanding degree class for employers

Existing degree class    New degree class    Mark range
First                    High first (1)      90–100
First                    Mid first (2)       80–89
First                    Low first (3)       70–79
2.1                      High 2.1 (4)        67–69
2.1                      Mid 2.1 (5)         63–66
2.1                      Low 2.1 (6)         60–62
2.2                      High 2.2 (7)        57–59
2.2                      Mid 2.2 (8)         53–56
2.2                      Low 2.2 (9)         50–52
3                        High 3 (10)         47–49
3                        Mid 3 (11)          43–46
3                        Low 3 (12)          40–42
Fail                     High fail (13)      30–39
Fail                     Mid fail (14)       20–29
Fail                     Low fail (15)       10–19
Fail                     Bad fail (16)       0–9

Soh (2011, 127) demonstrates that globally the variety of different GPA schemes creates 'a lot of confusion, frustration and anxiety'. Employers would have an even harder task distinguishing between applicants' qualifications. It would be much simpler for employers if a simple additional indicator were added to the existing system: for example, a high 2.1 (4) or a low 2.1 (6), taken from the 16-point scale shown in Table 3. This kind of scale can be used for individual modules as well as to produce an overall weighted average for the final assessment. It combines the existing degree classification scheme with an equivalent GPA-type scheme that everyone could readily understand. Whatever information is finally contained in the HEARs, external examiners clearly need to be consulted, and ideally offered training, before any new scheme is implemented.
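To illustrate, here is a minimal sketch of the Table 3 mapping; the band boundaries are exactly those tabulated, while the function name is hypothetical.

```python
# Map an overall weighted-average mark (0-100) to the hybrid 16-band
# class proposed in Table 3, e.g. 68 -> 'High 2.1 (4)'.

BANDS = [  # (lower bound, label, index), highest band first
    (90, "High first", 1), (80, "Mid first", 2), (70, "Low first", 3),
    (67, "High 2.1", 4), (63, "Mid 2.1", 5), (60, "Low 2.1", 6),
    (57, "High 2.2", 7), (53, "Mid 2.2", 8), (50, "Low 2.2", 9),
    (47, "High 3", 10), (43, "Mid 3", 11), (40, "Low 3", 12),
    (30, "High fail", 13), (20, "Mid fail", 14), (10, "Low fail", 15),
    (0, "Bad fail", 16),
]

def hybrid_class(mark):
    """Return the expanded class label with its numeric indicator."""
    for lower, label, index in BANDS:
        if mark >= lower:
            return f"{label} ({index})"
    raise ValueError("mark must be between 0 and 100")

print(hybrid_class(68))  # High 2.1 (4)
print(hybrid_class(91))  # High first (1)
```

Because each band carries both a familiar class label and a numeric index, the same lookup serves the traditional classification and a GPA-style average, which is the combination the proposal aims at.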

Note
1 The Russell Group represents 20 leading universities in the UK.

References

Agnew C 2011 Personal communication July 2011
Bate J 2008 Scandal of the missing 2.2s The Sunday Times 8 June
Burgess B 2007 Beyond the honours degree classification Burgess Group final report, Universities UK (http://www.universitiesuk.ac.uk/Publications/Documents/Burgess_final.pdf) Accessed 14 December 2011
Burgess B 2009 I hope student records make degree classes obsolete The Independent 5 November
Chapman K 1994 Variability of degree results in geography in United Kingdom universities 1973–90: preliminary results and policy implications Studies in Higher Education 19 89–102
Curran J and Volpe G 2003 Degrees of freedom: an analysis of degree classification regulations (http://www.londonmet.ac.uk/library/x75868_3.pdf) Accessed 10 March 2011
Cuthbert M 2003 The external examiner: how did we get here? UK Centre for Legal Education (http://www.ukcle.ac.uk/resources/examiners/cuthbert.html) Accessed 28 July 2010
Dearing Report 1997 Higher education in the learning society HMSO, London
France D and Fletcher S 2004 The motivations and professional development needs of aspiring and serving external examiners in the GEES disciplines (http://www.gees.ac.uk/projtheme/extexam/extexam1.pdf) Accessed 25 March 2011
Higher Education Funding Council for England 2009 Report of the sub-committee for Teaching, Quality, and the Student Experience Ref. 2009/40 (www.hefce.ac.uk/pubs/hefce/2009/09_40/) Accessed 9 January 2012
Higher Education Statistics Agency 2010 (http://www.hesa.ac.uk/) Accessed 1 March
Quality Assurance Agency for Higher Education 2004 Code of practice for the assurance of academic quality and standards in higher education Section 4: External examining 2nd edn Quality Assurance Agency (http://www.qaa.ac.uk/academicinfrastructure/codeOfPractice/section4/default.asp) Accessed 8 October 2010
Quality Assurance Agency for Higher Education 2009 Thematic enquiries into concerns about academic quality and standards in higher education in England Final report Quality Assurance Agency April 2009, 45 (http://www.qaa.ac.uk/Publications/InformationAndGuidance/Pages/The-nature-of-doctorateness-Notes-from-a-discussion-meeting-hosted-by-QAA-and-the-University-of-Reading-Graduate-School-26.aspx) Accessed 9 January 2012
Soh K C 2011 Grade point average: what's wrong and what's the alternative? Journal of Higher Education Policy and Management 33 27–36
Times Higher Education 2011 'Two tribes' to the wall? Elite set may adopt GPA Times Higher Education 23 June
Universities UK 2011 Review of external examining arrangements in universities and colleges in the UK Universities UK, London
Warren Piper D 1985 Enquiry into the role of external examiners Studies in Higher Education 10 331–42
Williams G 1986 The missing bottom line in Moodie G C ed Standards and criteria in higher education Society for Research into Higher Education, Guildford 31–45
Williams W F 1979 The role of the external examiner in first degrees Studies in Higher Education 4 161–8
Woolcock N 2008 Degree classifications not fit for purpose says watchdog Times Online 18 July
Woolcock N 2009 'Too vague' degree classification system may be replaced by a points system The Times 1 October
