EVALUATING THE RESEARCH OUTPUT OF AUSTRALIAN UNIVERSITIES’ ECONOMICS DEPARTMENTS*
Richard Pomfret♣ and Liang Choon Wang, University of Adelaide
ABSTRACT

This paper presents measures of the research output of Australian economics departments. Our study covers the 640 academic staff at rank Lecturer and above in the 27 Australian universities with economics departments containing eight or more staff in April 2002. We construct unweighted publication measures based on journal articles, which can be compared with weighted publication measures and with citation measures. Our aim is to identify the robustness of rankings to the choice of method, as well as to highlight differences in the focus of departments' research output. A striking feature of our measures is that the majority of economists in Australian university departments have done no research that has been published in a fairly long list of refereed journals over the last dozen years. They may publish in other outlets, but in any event their work is rarely cited. Thus, average research output is low because many academic economists in Australia do not view research as part of their job or, at least, suffer no penalty for failing to produce substantive evidence of research activity.
Keywords: Rankings; Publications; Citations; Economics Departments.
JEL Classification: A11, L31.
* We are grateful to Ian McLean, John Siegfried, and John Wilson for helpful comments on earlier drafts. An earlier version of this paper was presented at the 2002 Australian Conference of Economists held in Glenelg, South Australia.
♣ Correspondence: Richard Pomfret, School of Economics, University of Adelaide, South Australia 5005, Australia. Telephone: +61-8-8303-4751. Facsimile: +61-8-8223-1460. Email: [email protected]
I. INTRODUCTION

This paper presents estimates of the output of Australian economics departments in 2002 on the basis of published research. A large literature on the ranking of university economics departments has developed over the last half century.1 Some of this literature continues a longer tradition of rankings based on the perceptions of senior academics or university leaders, but most of the recent literature has aimed at more objective measures of research output.2 In Australia such exercises assumed added importance in the 1990s as the government adopted several vintages of measures of research output, which were used to allocate research-associated funds. Our study covers the research output of the 640 academic staff at rank Lecturer and above in the 27 Australian universities with economics departments containing eight or more staff in April 2002 (Table 1). There is no best way of ascribing cardinal measures to individuals' research output, and the approach taken here is to provide rankings based on several methods. Some methods are inferior to others, but all methods involve biases. In the fundamental divide between counting publications and counting citations, the former approach favours more recent work and the latter favours more established scholars. One goal of the paper is to identify how and to what extent the choice of measure affects the ranking of departments. Beyond this, we aim to analyse the research output of Australian-based university economists and to establish proximate explanations for their low output relative to economists in other countries' universities.

The first section of this paper discusses the methodological alternatives, drawing on the international literature. The Australian literature is reviewed in section 2. Our data are presented in section 3, and the results in section 4, where we focus on the sensitivity of departmental rankings to the choice of method and on the distribution of individual research output. Our main conclusions are that, although rankings are sensitive to method and time period, the groups of high and low research-output departments are fairly stable over both method and time period, and that the distribution of research output is extremely skewed. In most departments a few high producers account for much of the research output. The majority of academic economists in Australia do not publish and their work is rarely cited.

II. METHODS

Perception-based rankings of universities have existed for centuries. Recent examples include the National Research Council Report on US university departments and the survey of 81 full professors in Australian universities by Anderson and Blandy (1992). The results of such surveys provide a guide to reputation. Some of these exercises are held in high esteem when the peer reviewers are recognized as valid judges (eg. the US NRC Report or the UK research evaluations), but they are not readily adapted to cross-country comparisons. In some countries, including Australia, perception-based rankings are not held in high esteem due to their subjectivity. The surveyed academics or administrators do not know all departments well, and their responses are likely to reflect prejudgments as much as assessments of current standing. Perception-based rankings may cover the range of university departments'
1 The ranking by Fusfeld (1956), based on presentations at American Economic Association conferences in 1950-54, is often identified as the forerunner.
2 No measure is, of course, completely objective. As will be clear from the following literature survey, the choice of the "objective" measure is subjective.
activities, but these are often conflated so that it is unclear how, say, teaching and research are weighted. The approaches described in the remainder of this section focus specifically on scholarship contained in published research.

The starting point for measuring research output is publications. In days of yore ideas spread by word of mouth, and a lecture or seminar or even informal discussions could have a large impact on the circle of academic economists, but today unpublished research is unlikely to reach a sufficiently wide audience to be influential. The immediate issue is how to define a publication. Practically all publications-based rankings go no further than journals and books, with US studies being more likely to focus only on refereed journals.3 In all cases a key issue is quality. The quality issue can be resolved by only counting articles in the leading journals, but that involves a decision about which are the leading journals and whether they should receive equal weight. One cut-off is to include only core journals.4 The weights can be impact-based (derived from citations to articles in each journal) or perception-based.5 The fundamental assumption of this approach is that within-journal quality variation is less than across-journal quality variation. This is plausible if the gap between two journals' rankings is large, but less plausible when the gap is small, and there will be similar-quality journals above and below any cut-off line. Major articles have appeared in field journals rather than "core" general journals, and any top journal includes some below-par accepted papers.6 Some studies weight by page numbers, adjusted to account for the larger page sizes in some journals than in others, in order to separate major contributions from comments or minor model refinements. In practice, length is a crude and faulty weighting procedure. Some of the major contributions to economics have been brief
3 The focus on publications alone dates back to, at least, Lovell (1973), who treated journal publications as representative of all publications. Stigler (1982, 177) provides evidence against this assumption's validity at the time of Lovell's study, and it seems unlikely to be true today. This is a different argument to the one that significant economic research contributions only appear in refereed journals. In 1991, journal publications accounted for 33 percent of the research output, measured by items, of economists in Australian universities, unpublished monographs and reports 20 percent, book chapters 14 percent, unpublished conference papers 8 percent, books 6 percent, published conference papers 6 percent, published monographs and reports 5 percent, edited books 2 percent, and other categories 7 percent (Hill and Murphy, 1994, 42).
4 Conroy et al. (1995) define the core journals as the American Economic Review, Econometrica, International Economic Review, Journal of Economic Theory, Journal of Political Economy, Quarterly Journal of Economics, Review of Economics and Statistics, and Review of Economic Studies. Inclusion of JET, the IER and REStuds means that this list favours pure theory; JET in particular is a journal which splits economists in their opinions as to its value.
5 Laband and Piette (1994a) rank journals by impact weights and Masson, Steagall and Fabritius (1997) provide a comparative perception-based ranking. Any attempt to rank journals by quality faces the problem that journal quality may change over time, and the problem is more severe for studies covering longer periods. Laband and Piette (1994a, Table 2) rank the Review of Economics and Statistics as the fifth most influential journal in 1970 but twenty-ninth in 1990, and the Economic Journal as twelfth in 1970 and twenty-eighth in 1990, implying that publishing in these journals was a greater achievement in 1970 than twenty years later. In the more commercial journal-publishing environment of recent years, a high reputation has sometimes been followed by increases in size (pages per year) or in submission fees, which may lead to reduced average quality. Nevertheless, Coupé (2002, Appendix A4) finds high rank correlation coefficients for a number of alternative periods and ranking criteria, including the Laband-Piette ranking which we use.
6 Laband and Piette (1994b) study citations to articles published in 1984 in 28 leading economics journals, and find large standard deviations, which they explain by editors' active search for good articles (so that some top-quality articles are acquired by second-tier or specialized journals) and by either favouritism or chance (so that some articles appear in journals above their merit).
and succinct, including Nobel-worthy work (eg. by John Nash, or Paul Samuelson's "Pure Theory of Public Expenditure" in REStats 1954). One of the most influential Australian-content papers of the last three decades was the Kemp-Wan proposition, which occupied three pages in a field journal. Finally, a major problem in ranking by publications is how to deal with anything other than refereed journals. Most studies ignore all other categories, and yet significant contributions have appeared in conference volumes and other collections. Even more obviously, books are still significant vehicles for disseminating research output, especially empirical studies requiring extensive documentation and even some important sustained theoretical arguments (eg. Nobel-winning work by Amartya Sen). If books are included, the problem is quality control because, even more than with top journals, the top academic book publishers have great quality variance within their lists.

The impact of a researcher's output can be captured by citation counts. Laband and Piette (1994a, 641) describe citations as "the scientific community's version of dollar voting by consumers for goods and services". One difference is that, whereas consumers will not willingly buy a dud product, an author may refer to a poor paper as an example of how not to address an economics problem. There are also different categories of positive citations, and some writers have suggested excluding categories such as self-citations, citations from articles published in minor journals, and citations from other fields, or weighting citations according to whether the cited article is referenced several times or is a single irrelevant reference in an obscure footnote (Diamond, 1986, 207).7 Some survey articles are frequently cited, especially when they attain locus classicus status and a bandwagon effect ensues, but many economists would consider survey articles not to be real research unless they contain some novel results.8 Addressing these issues is, of course, operationally difficult and involves problems of defining "minor" journals or "obscure" footnotes or "original" surveys.9 Nevertheless, citing is non-trivial because there are costs to citing (eg. locating the reference, and the risk of being accused of misrepresentation) and it carries information.10 The construction of citation counts has been facilitated by the existence of the Social Science Citation Index (SSCI) maintained by the Institute of Scientific
7 These issues are discussed in the essays in Part IV of Stigler (1982). Stigler illustrates the difficulty of distinguishing between favourable and unfavourable references by quoting John Hicks's review of a book by Don Patinkin, "The main things I have learned [from the book] are not what the author meant to teach me"; Stigler (1982, 203) comments that "It is not complimentary to be told that one did not understand his own message; it is complimentary for an economist to be able to teach Sir John anything". Citation counts have a longer history in the natural sciences, but Stigler's essays, coauthored with Claire Friedland and originally published in the 1970s, and Eagly (1975) were influential early uses in economics. Posner (1999, 4-9) discusses pure and impure motives for citing.
8 Citations of edited volumes may also be misleading if the editors contributed only a small part of the research contained in the book. Although some credit should go to editors who assemble a collection worth citing, an ideal citation count would give credit to the other contributors too.
9 There may also be a field bias; citation counts and impact weights tend to favour econometrics and finance journals because of their interlinked and frequent citations, whereas economic history articles cite historical sources relatively often and the work of other economic historians relatively infrequently. Blockbuster citees usually involve an econometric method; of the twenty most-cited articles published between 1975 and 2000, ten were in Econometrica, two in JASA and one in OBES (Coupé, 2002, Appendix A10).
10 The seriousness with which citation analysis is taken in the USA is reflected in its use in legal cases. Posner (1999, 19n) lists four cases in which academics have claimed to have been discriminated against by their university, and citation analysis was used in the court proceedings to help determine whether the alleged discrimination was invidious or based on lack of scholarly distinction.
Information, although reliance on a single dominant source has operational drawbacks.11 In particular, early versions of the SSCI listed citations only by first author, which involved an alphabetic bias unless researchers were willing to carefully follow up citations to non-first authors of co-authored publications. There were also inconsistencies in listing authors by initial or first name, some misspellings, and the problem of distinguishing between citations to identically named authors. The most recent electronic versions list only by initials, but allow for relatively easy cross-checking, and identify all authors, not just the first-named.12 The current electronic version lists only citations in journals published since 1995, although these include citations to earlier journal articles and to books and other publications.

In choosing between citations and publications as the basis for ranking methods, we should consider both the practical difficulties, which tend to be greater with citations, and the conceptual difficulties, which tend to be greater with publications. There is also a time dimension insofar as citation counts will favour departments stacked with established scholars, while publications capture more recent research.13

Every ranking method has its strengths and weaknesses and sources of bias. In a meta-analysis of five US studies,14 Feinberg (1998) found a clear pattern of the authors' institutions always ranking much higher in their own study than in any other. While this bias may be driven by self-interest, it may be more insidious in that different departments place different weight on theory versus applied work, core economics versus fields, books versus articles, policy-relevant versus esoteric research, and so forth, and the preferred ranking method may reflect the ethos of a researcher who chooses to work in a particular style of department.15 Whichever explanation one accepts of home-institution bias, the implication is that any single ranking method is unlikely to produce a generally acceptable ranking of departments. Finally, it should be noted that all ranking systems are sensitive to decisions about whom to include in each department, the time period covered, and other such choices. Thursby (2000) has emphasised the dangers of placing too much weight on minor ranking differences, and he uses the National Research Council measures to develop a test of distance between departments in the USA. Significance tests are, however, conceptually difficult when we are dealing with the population of defined publications or citations, rather than a sample. Despite all their flaws, in practice published rankings continue to be treated seriously. A short illustrative sketch of these rival measures follows the footnotes below.
11 The SSCI was developed in 1969. Smyth (1999, 122-3), who works with citations in five Australian journals rather than with the SSCI, discusses previous sources of citation counts.
12 The entries are, however, only as good as the original reference source, which might be to Abbot et al. rather than to named co-authors. It is difficult to evaluate the completeness or accuracy of the ISI website, but a check by a well-published but alphabetically disadvantaged Adjunct Professor at Adelaide University found many cases of his work being cited under co-authors' names but not his own.
13 This difference will be diminished as the number of years covered by publications increases, and it is also reduced by the methodology of the current SSCI, which only reports citations in journals published since 1995. Another time dimension is that journal articles have become longer (Ellison, 2002), so that any page-weighted measure will favour more recent publications.
14 Berger and Scott (1990) from the University of Kentucky, Conroy et al. (1995) from the University of Texas, Scott and Mitias (1996) from Louisiana State University, Tremblay, Tremblay and Lee (1990) from Kansas State University, and Tschirhart (1989) from the University of Wyoming. Dusansky and Vernon (1998) is an update of the Conroy et al. study.
15 Stigler (1982, 209-12) observed a similar home-institution bias in the citation practices of PhDs from five leading US graduate schools. He constructed a parochialism index which showed disproportionate citing of faculty from their own school, especially by Harvard PhDs.
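The sketch below (in Python) makes the rival measures discussed in this section concrete. The journal weights, department names and article records are invented for illustration only; they are not the Laband-Piette weights or the data used in this paper.

# Illustrative sketch only: the journal weights, departments, and article
# records below are invented; they are not the Laband-Piette weights or the
# data used in this paper.
JOURNAL_WEIGHT = {"Journal A": 100.0, "Journal B": 12.8, "Journal C": 1.8}

# Each record: (department, journal, AER-equivalent pages, citations to date).
ARTICLES = [
    ("Dept X", "Journal A", 14, 40),
    ("Dept Y", "Journal B", 20, 15),
    ("Dept Y", "Journal C", 14, 60),
    ("Dept Y", "Journal C", 12, 25),
]

def department_scores(articles):
    """Return three rival output measures per department: raw article
    counts, journal-and-length weighted points, and citation counts."""
    counts, weighted, cited = {}, {}, {}
    for dept, journal, pages, citations in articles:
        counts[dept] = counts.get(dept, 0) + 1
        weighted[dept] = weighted.get(dept, 0.0) + pages * JOURNAL_WEIGHT[journal]
        cited[dept] = cited.get(dept, 0) + citations
    return {"count": counts, "weighted": weighted, "cited": cited}

for name, measure in department_scores(ARTICLES).items():
    ranking = sorted(measure, key=measure.get, reverse=True)
    print(name, ranking)  # the three rankings need not agree

On these invented numbers the weighted measure ranks Dept X first on the strength of one heavily weighted article, while the count and citation measures rank Dept Y first, anticipating the sensitivity to method documented in section 4.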
III. AUSTRALIAN LITERATURE

Economists at Australian universities as a whole perform rather poorly in publishing. Kocher et al. (2001) report that Australian-affiliated authors published a total of 66 articles in the top ten impact-weighted journals in even years from 1980 to 1998, while researchers based in other English-speaking university systems produced 5963 (USA), 379 (UK), 364 (Canada), 185 (Israel), 21 (New Zealand) and 9 (Ireland).16 When adjusted for population or number of universities, Australia lags far behind all of these countries except Ireland, and even behind non-English-speaking countries such as Belgium, Sweden and Switzerland.17 In a large ranking exercise funded by the European Economics Association, only five Australian universities ranked in the top 200 economics departments, whether by publications in economics journals or by citation of articles in economics journals (Coupé, 2002).18

Jonson and Brodie (1980) analysed the publication record of Australian economists on the basis of the bibliographies of the five surveys in Fred Gruen (ed.) Surveys of Australian Economics vol. 1 (George Allen & Unwin, Sydney, 1978). The criterion for inclusion is subjective: expert evaluation of whether an item is worthy of inclusion or not, with no weighting of publications by quality or length. Of the 664 items in the bibliographies, 289.5 were by academics at Australian universities. The references are concentrated; the seven universities which accrue more than ten items apiece (ANU 65, Melbourne 37.5, Monash 37, Sydney 34, Adelaide 25.5, UNSW 22.5 and New England 21.5) account for over four-fifths of the Australian university entries. The Jonson-Brodie ranking suffers from serious selection bias. The top seven
16 The impact factor is determined by citations, and, excluding The Economist, the top ten are the Journal of Economic Literature, Journal of Financial Economics, Brookings Papers on Economic Activity, Journal of Political Economy, Econometrica, Quarterly Journal of Economics, Journal of Law and Economics, American Economic Review, Journal of Monetary Economics, and Review of Economic Studies. The list is open to criticism; for example, the Journal of Financial Economics was the most cited journal in the world over the period 1977-82, but over a fifth of total citations to the JFE were to a single article published in 1976 (Smyth, 1999, 122), and over a different time period the rankings would have been different. Note the significant differences in inclusion and ranking between this list and the Conroy et al. definition of core journals.
17 The reasons for Australian-based economists' relatively poor research performance are unclear. Ireland appears to have suffered from mobility, with some of the most productive researchers having emigrated. Australia also suffered from the emigration of productive researchers such as Max Corden and Stephen Turnovsky to the USA in the 1980s, but emigration alone cannot explain Australia lagging Canada and New Zealand. Fox and Milbourne (1999) identify time as a crucial determinant of research productivity in Australian economics departments. Bhattacharya and Smyth (2002), based on a survey of full professors in Australian economics departments, find that output measured by either publications or citations is positively correlated with time devoted to research and negatively correlated with time spent on other duties. McCormick and Meiners (1998) show that research output of economics departments is lower in universities which give a greater role to academic staff in university management, suggesting that the heavy administrative and committee burdens in the Australian system may be distracting academics from research.
18 By 1990-2000 publications ANU ranked 53rd, UNSW 79th, Melbourne 109th, Monash 125th, and Sydney 145th, while measured by citations the rankings were lower, with Sydney dropping out of the top 200 and UWA appearing. For comparison, Canada, with a fifty percent larger population than Australia, has 15 universities in the top 200 by publications and 14 by citations. Only four Australian-based economists made the top 1000 individual publishers (John Quiggin, Simon Grant, Yew Kwang Ng and Paul Miller) and only two made the top 1000 citees (John Quiggin and Steve Dowrick). Coupé's sources, which rely heavily on Econlit, and method, which only counts citations to and in articles listed in both Econlit and the SSCI, may be criticized, but it is not obvious that any net bias works against Australian-based economists.
departments excelled in different fields: Adelaide in monetary economics, Melbourne in wages policy, UNSW in inflation, UNE and Sydney in agricultural policy, and ANU and Monash in protection policy. Failure to count the second and third volumes of the Surveys discriminated against departments with strengths in industrial organization, income distribution and so forth. Even allowing for field bias there is a suspicion of parochial bias: the monetary policy survey was by Adelaide economists, the wages policy survey by Melbourne economists, and the protection policy survey by an ANU economist.

Harris (1988) undertook the first major attempt to quantify research output as the basis for ranking Australian universities' economics departments. Harris included staff of lecturer and above in eighteen teaching departments.19 He does not use citations because "With a small number of exceptions, Australian academic economists, and therefore their departments, are quite inadequately represented in the SSCI, mainly because Australian academics tend to publish in journals which are not widely cited in other journals" (Harris, 1988, 103-4). Instead he counts publications over the period 1974-83, which are divided into eight categories of journal articles and books determined by (fairly subjectively evaluated) quality, for which the weights range from 35 points for a research book to one point for articles not included in other categories (eg. articles written for secondary school students).20 The results identify "three outstanding departments in terms of research output per head" – ANU, ADFA and Newcastle – and otherwise "not much variation between departments" (Harris, 1988, 107).21 This conclusion illustrates the distinction between basing conclusions on per capita or total output, because the Australian Defence Force Academy, with fewer than five staff members, ranked sixteenth out of eighteen by total points (Table 2). By total points, Harris's rankings were ANU, Newcastle, Queensland, La Trobe, Sydney, UNSW, Monash, Macquarie, Melbourne, Adelaide and UWA, followed by a substantial point gap before UNE and Flinders.

Harris (1990a) updated his results to cover publications in 1984-8. For this period "In terms of total points earned, Newcastle, Melbourne, Adelaide and the ANU were the largest producers" and "The leading five departments in terms of publication points per head were Newcastle, ANU, Tasmania, Adelaide and University College (ADFA)" (Harris, 1990a, 251).22 Harris also provided data on the top twelve publishers in 1984-8, which showed Clem Tisdell, then at Newcastle and now at Queensland, with 99 publications worth 274 points, to be far ahead of anybody else. In two departments a single academic accounted for over half of the publication points (John McDonald at Flinders and Raymond Markey at Wollongong).

Harris (1990a; 1990b) provides the only published citation-count analysis of Australian-based economists, although his discussion is preceded by a list of caveats and he only covers two years of citations (1986 and 1987). The rankings suggest that ANU economists' research had the most impact, with UWA, Macquarie, Melbourne,
19 Unlike most other studies, Harris assigns a publication to the institution with which the author was affiliated at the time the publication was written. Most studies, including ours, assign publications to the institution with which the author is affiliated at the end of the period studied.
20 Category one journals (counting ten points each) include "12 first-rank general journals, plus the Economic Record and Australian Economic Papers, together with 25 first rank specialist journals", whereas category two (counting six points) "includes some 50 second rank journals", but the criteria for distinguishing between first and second rank are not set out.
21 Macquarie, La Trobe, UNSW and Sydney were all bunched on 47-48 points per head, eleven points behind Newcastle and four ahead of eighth-place Melbourne.
22 In Harris's table Macquarie is second in the total points ranking, and appears to have been omitted from this sentence by error.
Adelaide and Monash closely grouped in second to sixth places (Table 3). Although the small number of citations appears to reinforce Harris's earlier scepticism about the degree to which Australian economists' work achieves international recognition, it also reflects the limitation of covering only two years. His listing of the twelve leading researchers by citations indicates substantial potential differences between prolixity and impact (Table 4).23

Harris's work was pioneering, and his weighting system attempts to account for quality and covers a wide range of publications, rather than just journal articles. His weighting system does, however, have a large subjective element. He observes that the composition of points totals varies substantially among the top departments, with some winning points for first-rank journal articles and others gathering points mainly for books. It follows that the rankings are likely to be sensitive to the precise weighting system, and Harris's has no clear justification; is a research book worth three and a half articles in first-rank journals and almost six second-rank journal articles, and should the weights be the same for all "research" books?

Anderson and Blandy (1992) identified 81 professors of economics, econometrics and economic history in Australia, of whom 65 percent responded to their survey conducted by mail in 1992. The questions mainly concerned economists' ideas, but they also included a section on what the professors thought of Australian economics departments other than their own (which they tended to rate very highly).24 Departments were ranked on six criteria (undergraduate, honours and postgraduate education, research, contribution to public policy, and quality of faculty), as well as overall. ANU, Melbourne, UNSW and Monash led on all criteria, in that order overall, with Adelaide and Sydney following fifth and sixth overall. Apart from these six, only Macquarie, Flinders, La Trobe and UWA received mention.25

The study by Towe and Wright (1995) has been influential as the only publication-based ranking published in the decade after Harris's work. Their rankings are based on the number of pages published in journals during the 1988-93 period. They divide the 332 journals listed in the Journal of Economic Literature into four tiers, and for the 71 journals in the top three tiers they weight articles by their AER-equivalent length (Table 5).26 They cover the academic staff of twenty-three economics and five econometrics departments as of 1 March 1994, and find that "economics departmental rankings are similar over a broad range of journal groupings with Melbourne, Monash, Sydney, Tasmania, and ANU consistently ranked in the top
23 It also illustrated the importance of self-citations and of "blockbuster" articles. David Hensher ranks first by total citations, but third if self-citations are excluded (he also illustrates the problem of affiliation, because he works at the Institute of Transport Studies at Sydney University rather than in an economics department as defined in many studies). The most cited article in 1986-7 was by Ian McDonald of Melbourne (55 citations over the two years), which alone would put him in the top five, whereas Clem Tisdell's more numerous articles, which placed him far ahead on publication measures, garnered only enough citations to make the top dozen.
24 Home-town bias was especially strong in Melbourne, in economics as in AFL. Clements and Wang (2001, 18) found a similar bias in the citation patterns of Australian PhD students.
25 This ranking is similar to the peer ranking from a less well-documented May/June 1987 survey of university economists reported in Harris (1990b), which ranked ANU first by a large margin, followed by Monash, Melbourne, UNSW and, after a gap, Sydney and Adelaide, and then another gap before Macquarie and Flinders.
26 Fox and Milbourne (1999) use a similar method to calculate the research output of Australian academic economists, although they are concerned with explaining research productivity rather than ranking performance. Harris and Kaine (1994) tackle similar issues using the Harris weighting.
third. The econometrics departments that consistently ranked highly under the same journal groupings were Monash and Sydney" (Towe and Wright, 1995, 9).

Sinha and Macri (2002) examine the 1988-2000 research output of academic staff at 27 teaching departments with at least eight staff at rank of lecturer and above at the beginning of 2001 and in 1994. They include publications in some 400 journals listed in EconLit, weighting journals in two ways (perception-based and citation-based impact measures) and weighting articles by standardized page length.27 The authors suggest that the impact-based weighting is more commonly used, and by that measure ANU, UNSW, Melbourne, Sydney, UWA and Monash lead, with a big gap before next-placed Queensland (Table 6). Adjusting for department size, the ranking is ANU, Sydney, UWA, UNSW, Melbourne, Monash and Griffith. With perception-based weighting of journals the ranking is Melbourne, ANU, Queensland, UNSW, La Trobe, Monash, UNE, UWA and Sydney, with a gap before next-ranked Adelaide. On a per capita basis, the ranking is Melbourne, UWA, ANU, La Trobe, Sydney, Queensland, Tasmania, Monash, UNE, UNSW, Western Sydney, Adelaide and Murdoch. The Sinha-Macri rankings suggest a leading group of ANU, Melbourne, and UNSW, followed by Sydney, UWA and Monash, but the rankings are affected by the choice of weighting (with Melbourne and Queensland ranking higher with perception weights)28 and rankings per capita obviously improve the position of smaller departments such as UWA, La Trobe, Griffith or Tasmania.

Sinha and Macri also report results for 1994-2000. The LP rankings produce the same top six departments based on total publications and one change when adjusted for size (Tasmania displacing Monash at number six), with some reordering in positions 2-6. With MSF weights the top six are also fairly stable, although UWA displaces Monash at number six on a total publications basis, and Tasmania displaces Sydney in the top six on a per capita basis. Thus, at least at the top end, these rankings are robust to the method of weighting journals and to the choice of time period. The approach is, however, an unconvincing one, because length is a poor guide to an article's quality and because the weights place a very high value on publishing in the leading journals.29 A single fourteen-page AER article between 1988 and 2000 by a member of the last-ranked department would have raised that department by twenty places to seventh overall in the impact-based rankings (the first column of Table 6).

A feature of the Australian literature, apart from its paucity, is the limited range of methods used. Leaving aside Anderson and Blandy's perceptions-based study, the other studies focus on weighted publication counts. Harris tried to cover a wide range of publications, but his weighting system was idiosyncratic and the results apply to a quarter of a century ago. Both Towe and Wright and Sinha and Macri restrict their analysis to articles in refereed journals, and they adjust for quality by weighting the journals and by measuring each paper's length.

It is surprising that only one Australian study has used citations as a basis for ranking individuals' research performance. The sole exception (Harris, 1990a; 1990b) clearly
27 The quality weightings are based on Masson, Steagall and Fabritius (1997) and Laband and Piette (1994a) respectively.
28 The LP impact weights decline much more rapidly than the MSF perception weights, so that the former give a greater weight to publications in the top journals while the latter place relatively more weight on lesser journals.
29 The general approach of Sinha and Macri follows the method of Conroy et al. (1995) and Dusansky and Vernon (1998), which has also been adopted by Coupé (2002). "Pages just don't do it" is the comment by Griliches and Einav (1998, 211) on the pages-weighted approach of Dusansky and Vernon (1998), which ranked Pittsburgh above Chicago in the USA.
devoted little energy to compiling this measure, which only involved two years' worth of data. Yet Harris's claim that Australian economists are not cited in major journals is simply not true, at least at the professorial level. The list of the 1082 most frequently cited economists in Blaug (1999), based on references in about 200 SSCI-listed economics journals during 1984-96, includes 26 Australian-resident economists, although not all are still actively associated with university departments (Table 7).30 The seventeen active ones are concentrated in the sandstone universities – Melbourne three; Adelaide, ANU, Monash, Sydney and UWA two each; Bond, Macquarie, Queensland and the AGSM one each – and the youngest was born in 1952.31

IV. DATA

Our study covers the research activity of academic staff at rank lecturer and above in Australian economics departments with eight or more staff in the first semester of 2002 (Table 1).32 Emeritus and adjunct staff are not included, and staff on leave are not excluded. Leave status poses a problem insofar as academics often continue to be listed in a department when they have not resigned but are in another position from which they may never return, and it is difficult to distinguish these situations from sabbatical or other short-term leave from which the academic will soon return.33 For most universities the economics department is well-defined,34 but some decisions have to be made about disciplinary boundaries. Where econometrics or economic history are in separate departments, we amalgamate them with economics. Finance and industrial relations are more difficult because they are sometimes in the economics department, sometimes free-standing departments, and sometimes integrated into other departments; where finance and industrial relations are separate departments, we do not amalgamate them with economics, and the general rule of thumb was not to attempt to identify individual staff in such departments and count them as economists. Research institutes and staff primarily affiliated with them were not included; for ANU the economics department is that in the teaching faculty, and economists in the research schools are not included.35
That Blaug’s list contains only half of Harris’s top twelve citees reflects sensitivity to the time period (too short in Table 4, and perhaps too long in Table 7) and to emigration. 31 This is an extreme example of the bias of citation counts in favour of older economists. Blaug’s count covers a long time period (1984-96) followed by a time lag before publication. An economist born after 1952 would normally be hitting their publications stride by the mid or late 1980s and not attracting citations until towards the end of the 1984-96 period. 32 The basic source for staff lists is departmental web pages as they stood in April 2002. Care was taken to remove the names of people who had recently left departments but were still listed on their websites. No adjustment was made for less than full-time status of people listed as regular academic staff. 33 This matters because some of the cases are prolific publishers, such as Ron Bewley (UNSW) who is now at the Commonwealth Bank or Stephen King (Melbourne) who is now at Melbourne Business School or Simon Grant (ANU) who was at Tilburg and then moved to Rice University. Some departments do not list staff in such positions, eg. Sue Richardson is not on the Adelaide website since she went to the National Institute of Labour Studies, even though she is technically on leave. 34 For Newcastle, the School of Policy is treated as the economics department. For Sydney we include three economics disciplines from the School of Economics and Political Science, but exclude the Politics and Political Economy disciplines and economists in the School of Business. For Melbourne we exclude the Household Research Unit and actuarial studies staff. For Curtin we exclude Property Studies. 35 This excludes some of the long-term most productive publishers in Australia such as Bob Gregory and Adrian Pagan at ANU or Peter Dixon at Monash. Note that staff on “research only” positions within teaching departments, eg. ARC research fellows such as Steve Dowrick and John Quiggin at
10 While our methods are those which have been applied to other countries’ academic economists, there are pitfalls in applying the methods to Australia. Especially in a smaller university system, such as Australia’s, a department’s ranking can be hugely influenced by a single star. When the median academic produces zero publications in most years, the choice of dates can be critical in including or excluding a good year. Especially with measures such as publications in impact-weighted core journals, a single American Economic Review article can raise a department’s ranking by many places. 36 Moreover, some Australian economists work on domestic issues for which appropriate outlets are Australian journals (and book publishers). Should the major Australian journals such as the Economic Record, Australian Economic Papers or the Australian Economic Review receive special weights (and what about policy journals such as Agenda, or field journals such as the Australian Economic History Review, the Australian Journal of Agricultural Economics or the Australian Journal of Agricultural and Resource Economics?), or does this penalize economists working on research projects without specific Australian content who publish in international journals of similar standing to the EcoReco or AEP?37 In counting publications, we focus on journal articles, and our rule of thumb was to exclude comments, replies, obituaries and book reviews. Our base publication list is all articles published since 1990 in the top 88 journals as listed by Laband and Piette (1994a, Table A2 final column).38 To test for whether there is an “Australiantopic” bias, we report publication in the six academic journals listed in the previous paragraph (not Agenda). Our preferred source for publication lists is an individual’s curriculum vitae downloaded from the university website. Sometimes these are out of date and for many staff they are absent, although there is a (favourable) selection bias ANU, are included. Harris (1989) assessed the 1974-86 research output of six research centres (the Centre for Economic Policy Research and the economics departments in the Research Schools of Pacific Studies and of Social Sciences (RSPS and RSSS) at the ANU, the Centre for Policy Studies at Monash, the Institute of Applied Economic and Social Research at Melbourne, and the National Institute of Labour Studies at Flinders) using th e same weighting system as in Harris (1988), and concluded that although the research centres on average published more per staff member than teaching departments, this was less than proportionate to the extra time available for research; although he also adds that funding of the research schools differs so that some (eg. the Centre for Policy Studies) are constrained by the need to raise consulting money, which draws time away from academic research. 36 Advocates of the use of citation counts in the USA often argue that it is “important to realize that the existence of ‘noise’ in data does not invalidate quantitative analysis. Critics of citations analysis often fail to note that if errors in data are randomly distributed with respect to the variable of impact . . . they are unlikely to invalidate the conclusion of the study, provided that the sample is large” (Posner, 1999,12-3). In ranking Australian departments, errors, even if randomly distributed, could make a big difference in the relative positions of the few leading departments – and even larger difference in rankings of the non-leading departments. 
37 A general point is that rankings are relative, so that any special weights on certain publications discriminate against other publications. There is also a specificity problem given the small number of Australian journals and, in some cases, their association with particular universities; eg. including the Australian Economic Review may favour Melbourne, and including Australian Economic Papers may favour Adelaide and Flinders (although the AEP editor claims this is untrue).
38 The 88th-ranked journal was the top-ranked Australian journal, the Economic Record. The list is similar to Towe and Wright's top three tiers of journals, which appears to be their preferred measure of "quality journals". The LP ranking is based on citations to articles published in 1985-89, and thus the list excludes new journals (eg. Health Economics, which began publication in 1991) and may include some whose quality has declined over the last decade. Any ranking is sensitive to method and dates, but Pieters and Baumgartner (2002) found a high correlation between their ranking of forty-two leading economics journals based on citations in 1995-7 and other citation-based rankings of journals, including the Laband-Piette ranking.
in that more active publishers are more likely to have their curricula vitae posted. In some cases of absent or incomplete vitae, we followed up directly with the academic or department. Our alternative source for publications was the Econlit database, but as far as possible we kept this as a last resort or used it as a check against the primary source.39

For citation counts there is little choice other than to use the Social Science Citation Index on the ISI website. The raw numbers are even more unreliable than the publication counts, for reasons outlined in section 2. The citation counts are as they existed in November-December 2002, but because the website is continuously updated there are biases in terms of how early in the period the count was made. Care was taken, especially with common names, to avoid false ascription of citations, but undoubtedly errors have been made. It was also too difficult to weight for joint publications or to exclude self-citations, so each citation is worth one point. In sum, the numerical citation counts should be treated with caution, but the orders of magnitude are useful.

V. RESULTS

Our publication and citation counts are presented in Table 8 by department. We are painfully aware of the many opportunities for error and the need for judgment calls. Both Econlit and the SSCI contain omissions and errors, but it is impossible to assess the extent of the problem. Despite the weaknesses of the cardinal measures, the general patterns contain useful information.

(a) Ranking departments

Whether to rank on a total or per capita basis has obvious implications. Total publications probably capture reputation better, whereas per capita publications indicate productivity.40 In Australia the big established departments (Melbourne, Monash, Sydney and UNSW) always rank highly by total publications, whereas Adelaide, Tasmania and UWA do better on a per capita basis. For smaller departments the per capita measure is sensitive to exactly how many people are counted, as well as to the choice of dates and publications to be counted. Only nine departments produced more than one "top 88" journal article per capita between 1990 and 2001, and for the other eighteen the differences in the per capita measure are small.

Focusing on Australian publications is problematic. Publications in the six Australian journals are heavily skewed, with Melbourne (139), ANU (31), La Trobe (28), Monash (20) and New England (20) at the top of the list.41 This may reflect the five departments' strength in research on domestic issues, but it is impossible to
39 Econlit is easy to work with but full of pitfalls. Listings under initials and full names are not consistent for the same person, and publications from similarly named people are commingled. Transcription errors, eg. one of Olan Henry's papers is listed under Ulan Henry and one of Bharat Hazari's under Bharathazari, mean that a search does not capture misfiled entries. It is impossible to assess the number of such errors, but they appear to be many. Finally, the publication selection process is incomplete as soon as one moves beyond publications in the leading journals.
40 There may be better measures of productivity. If Melbourne spends more on support staff and UNSW more on academic staff out of similar total budgets, then the per capita measure will make Melbourne look better than if a measure of all inputs were in the denominator.
41 Of the top thirteen individuals publishing in the six Australian journals, ten are from Melbourne University.
separate this explanation from a location bias (editors in Melbourne and Canberra).42 The Sydney universities in particular fare much worse on this measure.

Rankings are obviously susceptible to the method used. Among journal publication measures, extreme values might be expected to lie between the double-weighted (by journal quality and by article length) approach of Sinha and Macri and our unweighted count. By the Sinha and Macri measure ANU, UWA, Griffith and La Trobe are boosted by articles in peak journals, while the more numerous field-journal publications by economists at Melbourne, Monash and Adelaide boost their relative importance by our unweighted measure (Table 9).43

We have already mentioned the drawback of using length as a quality indicator, but even more troublesome in the Australian context is the impact-weighting of journals. The Laband-Piette (LP) weights are based on an index with the AER equal to 100, and only seven other journals have weights over 50. The leading European journals score 12.8 (EJ) and 3.6 (EER), and non-econometric field journals even less (eg. JEH 4.8 and JDE 1.8). In a setting like Australia's, where few researchers publish in the top 30 journals, rankings based on LP weights are hugely influenced by a publication in a top-ranked journal. In Sinha and Macri's ranking based on LP weights (Table 6, column 1) the metric is AER pages; a single fourteen-page AER article would have raised the bottom-ranked department to seventh place. Fifty articles in the Journal of Development Economics would have less impact on a department's ranking than a single AER paper of equal length, as the sketch below illustrates. A method which is so much influenced by one or two papers in a handful of journals is a poor basis for ranking departments which publish few such articles, because a paper could be a poor article in a good journal or the author may move. We would suggest that the Sinha and Macri measure based on LP weights gives some information about the six leading departments, but is of limited value in ranking the other twenty-one departments.

Output measures based on articles in unranked journals are also flawed. Although it is not obvious that an AER article is over fifty times better quality than a JDE article, it is clear that the average quality of the former is higher, ie. the "true" weight is somewhere between the two extremes. As with the Sinha-Macri numbers, those in Table 8 must be treated with caution. Western Sydney has the tenth most articles in top 88 journals, but the 18.33 articles are mostly in Economics Letters, a highly ranked journal which contains short articles, some of which verge on the status of comments, and in Applied Economics, a frequently published journal which contains enough good articles to justify its top 88 status but also has perhaps the lowest quality cut-off among the top 88.44 Caution is especially necessary once we leave the safety of large numbers. Only twelve departments contain staff who, combined, published more than one article a year in "top 88" journals between 1990 and 2001. In only nine departments did the academic staff publish more than one "top 88" journal article per capita between 1990 and 2001.
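The arithmetic behind the AER-versus-JDE comparison can be checked directly; the sketch below uses only the two LP weights quoted above (AER = 100, JDE = 1.8) and fourteen-page articles, and is illustrative only:

# LP-style points: journal weight times AER-equivalent pages, using only the
# two weights quoted in the text (AER = 100, JDE = 1.8).
AER_WEIGHT, JDE_WEIGHT = 100.0, 1.8

one_aer_article = 1 * 14 * AER_WEIGHT      # one fourteen-page AER article
fifty_jde_articles = 50 * 14 * JDE_WEIGHT  # fifty JDE articles of equal length

print(one_aer_article)     # 1400.0 points
print(fifty_jde_articles)  # 1260.0 points: fifty JDE papers still count for less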
42 There may also be a field bias, insofar as UNE's higher ranking on the Aus6 criterion than on the top 88 criterion may reflect the department's strength in agricultural and resource economics.
43 Sinha and Macri consider the same 27 departments that we do and use the same criteria (level B and above) for including staff, but their census date is 2001. Thus, their ranking may differ from ours due to changing departmental composition, as well as due to differences in method.
44 These comments say nothing about the quality of the articles published by the UWS staff, but are intended to make the general point that fewer articles in higher-ranked journals may reflect higher quality research output.
In the remainder, which constitute the majority of Australian economics departments, our output measures are especially sensitive to staff changes, choice of terminal dates, and the cut-off point for the journal list.45 In sum, both weighting methods are imperfect, and can best be thought of as extreme-bounds measures of research output based on journal article publication.46 Our conclusion is that, rather than proposing a single best criterion, rankings should be obtained on several plausible criteria and then analysed for patterns. Moreover, all measures of research output by economists in Australian universities encounter small-number problems. Even among the top-ranked departments, little should be read into differences in rankings based on small variations in the cardinal numbers. Below the top eight or a dozen departments the ranking can be heavily influenced by a single publication, depending on the weighting.

The numbers of citations are reported in Table 8. As mentioned above, the raw numbers for individuals' citations are likely to be flawed, but the orders of magnitude can still provide a guide to the influence of economic research conducted in each department. In particular, if we have missed some references to unpublished work by people with common names, this would have little impact on the departmental aggregates. The bigger omissions are likely to be to co-authored work; eg. Peter Lloyd's work is cited 251 times according to the SSCI, but that includes no references to his Intra-Industry Trade book, for which 257 citations are listed under the co-author's name, while Bill Griffiths' econometrics textbooks are partially cited under his name but receive over 3,000 citations under GG Judge.47 The data in Table 8 reflect the general characteristics of citation counts: blockbuster articles, especially in econometrics,48 or editorship of influential collections earn many citations,49 while recent articles, even in top journals, have had insufficient time to garner citations. A field bias towards econometrics boosts the citation counts for, inter alia, Monash and New England, and a bias towards finance boosts UTS relative to departments with other strengths, although the fact that citation counts pick up non-journal output provides more recognition to areas such as economic history. The citation counts also give recognition to the work of economists who cross disciplinary boundaries and publish in non-economics journals read by economists.

The citation counts in Table 8 suggest a hierarchy. The work of economists in departments at ANU, Melbourne, and Monash was cited over 2000 times. Exactly
45 Our decision to use the Economic Record as the cut-off highlights the problem, because of the skewed distribution of contributors' departmental affiliations. A "top 87" list reduces Melbourne's output from 138 to 102 articles and La Trobe's from 28 to 20, while affecting other departments much less. These big changes have, however, no impact on Melbourne's ranking, and La Trobe drops only one place in the ranking by total publications and two places in the ranking by per capita publications.
46 The perception-based weighting of journals used by Sinha and Macri (MSF weights) yields rankings of departments in between the LP-weighted and unweighted rankings (Table 9). The MSF weights are less extreme than the LP weights, but there is no "true" quality indicator.
47 There is a temptation to include the references to the books by Lloyd and by Griffiths, which would substantially increase Melbourne's citation count, but we stuck to the principle of including only what was listed under the academic's own name in the source, on the grounds that exceptions would be biased in favour of departments and fields better known to ourselves. At UWA, Darrell Turkington's work is cited 118 times under Bowden, but not under his own name.
48 Trevor Breusch is the only individual with over a thousand citations; Breusch's 1168 citations are dominated by three articles, in Econometrica 1979 (502 references), the Review of Economic Studies 1980 (367) and Australian Economic Papers 1978 (118). The second most cited article is Ian McDonald's 1981 AER article, with 385 citations.
49 Over half of the 493 citations to Kym Anderson, for example, are to two volumes that he co-edited while seconded to the GATT in 1991-2.
how these departments should be ranked depends upon whether one agrees with our treatment of staff on leave (which favours ANU), known joint publications (which penalizes Melbourne) and econometrics (which favours Monash). UNSW, with 1942 citations, could easily be in this top group if the inclusion criteria were changed. Queensland and New England rank fifth and sixth, followed by Sydney, Adelaide and UWA. These nine departments are the only ones whose work is cited over a thousand times. There is a substantial gap between them and next-ranked La Trobe and UTS, and another gap before Macquarie and Newcastle. If the citations are weighted by department size, then the same nine departments dominate, but UWA moves from ninth to second, UNE from sixth to third and Adelaide from eighth to fourth. The various considerations mentioned in the previous paragraph should temper placing too much store in the exact rankings, but there appears to be a significant gap between the one-third of Australian economics departments doing substantial quality research and the two-thirds whose research output is less cited.

Table 10 lists the top ten Australian departments by publications and by citations, together with the previously published rankings covering the last quarter century. Rankings are obviously susceptible to the method used, but apart from a few obvious biases this is not a major issue with respect to rankings based on journal articles. Although economists at some newer universities, notably Newcastle and Macquarie in the 1980s and La Trobe and Griffith in the 1990s, have published well at times, the list is dominated by the established universities. As mentioned earlier, over the last decade ANU and UNSW rank highest when the core journals are weighted heavily, and Melbourne ranks highest when greater weight is given to second-tier or field journals (or Australian journals). On a per capita basis Tasmania and La Trobe always rank more highly than on a total publications basis, and Sydney, UWA and Griffith rank highly when the core journals are weighted heavily (the sketch at the end of this subsection illustrates how total and per capita orderings can reverse).

The rankings based on citations could easily be manipulated. By including co-authored textbooks Melbourne would move to top place, and excluding editorship of collections would move Adelaide down the list. Nevertheless, there is a substantial gap between the nine research-active departments whose outputs garner over a thousand citations and other departments, and the distribution is more stretched out than that based on publications. Economists in the leading research departments not only publish more but also publish better (in the sense of more often cited) research.

The general impression in Table 10 is of stability. By the twelve reported measures, ANU and Melbourne are always in the top ten departments. La Trobe, Monash and UNSW are among the top ten by eleven measures, Sydney by ten measures, Adelaide, Queensland and UWA by eight, and UNE by seven. In this respect "objective" numerical rankings track perceptions of the leading Australian departments. The Group of Eight plus La Trobe and New England dominate. Figure 1 illustrates the extent of this domination in the list of unweighted publications in the "top 88" journals from 1990 to 2001; one-third of Australian university departments of economics, with about two-fifths of the academic staff, account for over three-quarters of the publications.
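A minimal sketch of the total versus per capita distinction, using two invented departments rather than figures from Table 8:

# Hypothetical figures: a large department out-produces a small one in total,
# but not per staff member, so the two orderings reverse.
departments = {"Big Dept": (60, 40), "Small Dept": (25, 10)}  # (articles, staff)

by_total = sorted(departments, key=lambda d: departments[d][0], reverse=True)
by_per_capita = sorted(departments,
                       key=lambda d: departments[d][0] / departments[d][1],
                       reverse=True)
print(by_total)       # ['Big Dept', 'Small Dept']
print(by_per_capita)  # ['Small Dept', 'Big Dept'] (2.5 vs 1.5 articles per head)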
(b) Skewness
It is commonplace in this literature to note that rankings are heavily skewed, so that a few superstars play a big part in their department's ranking. This phenomenon appears to be especially strong in Australian university departments.
Towe and Wright emphasised the highly skewed nature of the distribution of publications within departments, with the top quarter of the academic staff accounting for most of the publications. The median number of pages published in the seventy-one tier 1-3 journals during 1988-93 was zero for all departments except Tasmania and Griffith. The majority of Australian academic economists at lecturer level and above had no publications in these journals during the six-year period.50
The same phenomenon is apparent in our data. Among the leading departments, it is especially true of ANU, whose ranking is heavily dependent on the research output of John Quiggin, Simon Grant and Steve Dowrick. The most obvious exception in our data is Melbourne University, whose top ranking by total publications has a broad base, even though John Creedy's twenty publications are the second-highest of any individual in the database. The importance of one or two successful publishers is also apparent for some of the lower-ranked departments - and especially so with the Sinha/Macri weights.51
Similar skewness is apparent at the national level. John Quiggin with 20.83 and John Creedy with 20.0 articles in the "top 88" journals are far ahead of the next most prolific, Yew-Kwang Ng, with 10.83.52 The 640 economists in our population produced a total of 630 "top 88" journal articles over the twelve-year period, ie. less than one per person. The top thirty individual publishers account for over three-eighths of the total articles published by the 640 economists, and 386 of these academic economists published nothing in the "top 88" journals between 1990 and 2001 (Figure 2). The median, as well as the mode, number of publications over the twelve-year period was zero.
The citation counts reveal a similar pattern (Figure 3). Of the 2898 references to the work of members of ANU's economics department, 1168 were to Trevor Breusch, 902 to John Quiggin and 405 to Steve Dowrick, while the least productive half of the department received only 31 citations; if the latter ten economists were in a separate department, it would be the second-lowest ranked in the country. Adelaide's three economics professors account for 944 of the department's 1265 citations, and Michael McAleer, Paul Miller and Ken Clements for 937 of UWA's 1121 citations. Over a third of Queensland's citations are to Clem Tisdell, and over half of UNSW's are to Nanak Kakwani or Ron Bewley. Harry Clarke and John Kennedy accrued half of La Trobe's citations, and George Battese and Tim Coelli half of UNE's. Further down the list, three of Flinders' academic staff received 248 out of the department's 265 citations, and the median number for the department was zero. In only ten of the twenty-seven departments was the median citation count above seven, ie. one reference per year over the 1995-2002 period. Given the degree of skewness and
50 The median publications in the 71 tier 1-3 journals by all associate professors and by all senior lecturers was zero, and "lecturers in general produced zero research output" (Towe and Wright, 1995, 15). For the sixty-one professors, the median number of pages published in the top 71 journals in 1988-93 was 4.3, or less than a page per year. Brennan and Teal (1991), in an unpublished paper, reported similarly high levels of concentration.
51 Even with our unweighted measure, 10.5 out of the 18.3 UWS publications came from Satya Paul and Russel Cooper.
52 The numbers for individual staff are in an Appendix, available from the authors upon request. Although all of our results build up from individuals' performance measures, we hesitate to give lists of the top publishers, because the sensitivity to the selection of journals and their weightings, and to other problems such as the omission of books, is most acute at the individual economist level. The list of individual citation counts is even more dubious for reasons given in the text, even though we believe that, at the department level, the orders of magnitude provided by citation counts are the best single guide to research output.
the dominance of a handful of frequently cited researchers, errors and subjective criteria (such as the treatment of staff on leave or of known joint publications) could substantially affect departments' rankings, although they would be unlikely to change the top-third/bottom-two-thirds division referred to above.
In sum, although some Australian academic economists publish internationally and their work is cited in respectable journals, the majority do not publish and their work is rarely cited. This is consistent with the Kocher et al. (2001) finding that academic staff in Australian economics departments are relatively inactive publishers in refereed journals. The explanation for Australia's dismal performance in international rankings lies partly in the fact that there are not enough stars (perhaps due to emigration or non-return from overseas), but also in the fact that Australian economics departments contain many people who are not even in the publishing game. Despite concerns about the deleterious consequences of a publish-or-perish ethos, the Australian norm is that most academic economists do neither.
VI. CONCLUSIONS
Do a range of measures yield the same rankings? In the USA, where rankings are taken very seriously, the methodological debate can be heated. For example, Griliches and Einav (1998, 235) dismiss the Dusansky/Vernon approach, used by Sinha and Macri and by Towe and Wright for Australia, as "the largely irrelevant metric of total pages in a limited set of journals and a narrow time frame. The result is very misleading. One should be able to do better than that". On the other hand, Thursby (2000) found that the correlation between rankings by different methods was high, and differences between departments ranked several places apart were of little significance. Our interpretation of the Australian data takes both points on board. The DV approach can be misleading, but overall there is not a huge difference in the outcomes of the various ranking methods. Of course, Griffith may worry that Sinha and Macri ranked it eighth by the DV measure, while it is not in the top ten by our simple publication count, and Adelaide may congratulate itself that it is ninth by our measure and eighth when we use the "top 87" journals, even though it was not in Sinha and Macri's top ten. The overall Australian picture is, however, of fairly stable rankings with the established universities dominating by any measure and any time period, although there is some decade-to-decade movement in the ranking of these departments and some entry and exit of other departments. We have also emphasized the small-numbers problem, and conclude that any of these methods is inappropriate for ranking the majority of Australian economics departments, which publish so little that changing the metric will have a large influence on their ranking but the change in ranking will have little meaning.
Rather than poring over precise rankings, we prefer to concentrate on asking what this literature tells us about Australian academic economists. Assuming that the story would be broadly similar with any reasonable measure of research output, we have a stark picture. A few dozen active researchers publish their research in leading journals and their work is cited. Exactly how important that work is, and which individuals and departments produce the best output, depends on the precise metric used.
More strikingly, and less debatably, the majority of academic economists in Australian universities have done no research that has been published in a fairly long list of refereed journals over the last dozen years. In two-thirds of Australian economics departments, the median academic staff member's total research has
received less than one reference per year since 1995 in an even longer list of academic journals, i.e. their research has virtually no impact on the community of academic economists and makes no addition to scholarly knowledge. Of course, there are reasons for individual non-performance by these measures: young researchers have not yet published, some important research takes time before its significance is recognized, and some researchers are working on a magnum opus that will appear in the future. Such extenuating circumstances, however, appear unlikely to account for many of the zero entries. The inescapable conclusion is that, contrary to stated mission goals, Australian universities do not value the same research output as other countries do, or they do not provide sufficient support for academic research, or their academic staff are subject to different incentives and sanctions with respect to producing publishable research than staff elsewhere.
REFERENCES
Anderson, M. and Blandy, R. 1992, 'What Australian Economics Professors Think', Australian Economic Review, vol. 100, pp. 17-40.
Berger, M. and Scott, F. 1990, 'Changes in U.S. and Southern Economics Departments Rankings over Time', Growth and Change, vol. 21, pp. 21-31.
Bhattacharya, M. and Smyth, R. 2002, 'The Life Cycle Research Output of Professors in Australian Economics Departments', paper presented at the Australian Conference of Economists, Glenelg SA, September 30 to October 3.
Blaug, M. 1999, Who's Who in Economics, Third Edition, Edward Elgar, Cheltenham UK.
Brennan, G. and Teal, F. 1991, 'Academic Research Productivity: Are Economists Paid their Marginal Product?', Discussion Paper No. 254, Centre for Economic Policy Research, Australian National University, Canberra.
Clements, K. and Wang, P. 2001, 'Who Cites What?', ms., Economic Research Centre, Department of Economics, University of Western Australia.
Conroy, M., Dusansky, R. and Kildegaard, A. 1995, 'The Productivity of Economics Departments in the U.S.: Publications in the Core Journals', Journal of Economic Literature, vol. 33, pp. 1966-71.
Coupé, T. 2002, 'Revealed Performances: Worldwide Rankings of Economists and Economics Departments', ms., ECARES, Université Libre de Bruxelles, available at http://student.ulb.ac.be/~tcoupe/ranking.html; references to the paper are to the version posted on 20 June 2002.
Davis, P. and Papanek, G. 1984, 'Faculty Ratings of Major Economics Departments by Citations', American Economic Review, vol. 74, pp. 225-30.
Diamond, A. 1986, 'What is a Citation Worth?', Journal of Human Resources, vol. 21, pp. 200-15.
Dusansky, R. and Vernon, C. 1998, 'Rankings of U.S. Economics Departments', Journal of Economic Perspectives, vol. 12, pp. 157-70.
Eagly, R.V. 1975, 'Economics Journals as a Communications Network', Journal of Economic Literature, vol. 13, pp. 878-88.
Ellison, G. 2002, 'The Slowdown of the Economics Publishing Process', Journal of Political Economy, vol. 110, pp. 947-93.
Feinberg, R. 1998, 'Correspondence: Ranking Economics Departments', Journal of Economic Perspectives, vol. 12, pp. 231-3.
Fox, K. and Milbourne, R. 1999, 'What Determines Research Output of Academic Economists?', Economic Record, vol. 75, pp. 256-67.
Fusfeld, D. 1956, 'The Program of the American Economics Association', American Economic Review, vol. 46, pp. 642-4.
Gans, J. (ed.) 2000, Publishing Economics: Analyses of the Academic Journal Market in Economics, Edward Elgar, Cheltenham UK.
Griliches, Z. and Einav, L. 1998, 'Correspondence: Ranking Economics Departments', Journal of Economic Perspectives, vol. 12, pp. 233-5.
Harris, G. 1988, 'Research Output in Australian University Economics Departments 1974-1983', Australian Economic Papers, vol. 27, pp. 102-10.
Harris, G. 1989, 'Research Output in Australian University Research Centres in Economics', Higher Education, vol. 18, pp. 397-409.
Harris, G. 1990a, 'Research Output in Australian University Economics Departments: An Update for 1984-88', Australian Economic Papers, vol. 29, pp. 249-59.
Harris, G. 1990b, 'Research Performance Indicators in Australian University Economics Departments, 1986-7', Economic Analysis and Policy, vol. 20, pp. 73-82.
Harris, G. and Kaine, G. 1994, 'The Determinants of Research Performance: A Study of Australian University Economists', Higher Education, vol. 27, pp. 191-201.
Hill, S. and Murphy, P. 1994, Quantitative Indicators of Australian Academic Research (National Board of Employment, Education and Training Commissioned Report No. 27), Australian Government Printing Service, Canberra.
Jonson, P.D. and Brodie, M. 1980, 'Who Publishes What?', Australian Economic Papers, vol. 19, pp. 224-6.
Kocher, M., Luptacik, M. and Sutter, M. 2001, 'Measuring Productivity of Research in Economics', ms., Institute of Public Economics, University of Innsbruck, Austria.
Laband, D. and Piette, M. 1994a, 'The Relative Impact of Economics Journals', Journal of Economic Literature, vol. 32, pp. 640-66.
Laband, D. and Piette, M. 1994b, 'Favoritism versus Search for Good Papers: Empirical Evidence regarding the Behavior of Journal Editors', Journal of Political Economy, vol. 102, pp. 194-203.
Lovell, M. 1973, 'The Production of Economic Literature: An Interpretation', Journal of Economic Literature, vol. 11, pp. 27-55.
McCormick, R. and Meiners, R. 1988, 'University Governance: A Property Rights Perspective', Journal of Law and Economics, vol. 31, pp. 423-42.
Masson, P., Steagall, J. and Fabritius, M. 1997, 'Economics Journal Rankings by Type of School: Perceptions versus Citations', Quarterly Journal of Business and Economics, vol. 36, pp. 69-79.
Pieters, R. and Baumgartner, H. 2002, 'Who Talks to Whom? Intra- and Interdisciplinary Communication of Economics Journals', Journal of Economic Literature, vol. 40, pp. 483-509.
Posner, R. 1999, 'The Theory and Practice of Citations Analysis, with Special Reference to Law and Economics', John M. Olin Law and Economics Working Paper No. 83, The Law School of the University of Chicago, Chicago IL.
Scott, L. and Mitias, P. 1996, 'Trends in Rankings of Economics Departments in the U.S.: An Update', Economic Inquiry, vol. 32, pp. 378-400.
Siegfried, J. 1972, 'The Publishing of Economic Papers and Its Impact on Graduate Faculty Ratings, 1960-1969', Journal of Economic Literature, vol. 10, pp. 31-49.
Sinha, D. and Macri, J. 2002, 'Rankings of Australian Economics Departments, 1988-2000', Economic Record, vol. 78, pp. 136-46.
Smyth, R. 1999, 'A Citation Analysis of Australian Economic Journals', Australian Academic and Research Libraries, vol. 30, pp. 119-33.
Stigler, G. 1982, 'The Pattern of Citation Practices in Economics', in The Economist as Preacher and Other Essays, University of Chicago Press, Chicago IL, pp. 173-91.
Thursby, J. 2000, 'What Do We Say about Ourselves and What Does it Mean? Yet Another Look at Economics Department Research', Journal of Economic Literature, vol. 38, pp. 383-404.
Towe, J. and Wright, D. 1995, 'Research Published by Australian Economics and Econometrics Departments: 1988-93', Economic Record, vol. 71, pp. 8-17.
Tremblay, C., Tremblay, V. and Lee, B. 1990, 'Field Publishing Performance of U.S. Economics Departments', Atlantic Economic Journal, vol. 18, pp. 37-48.
Tschirhart, J. 1989, 'Ranking Economics Departments in Areas of Expertise', Journal of Economic Education, vol. 20, pp. 199-222.
Table 1: Departments included in the Sample
University        E    D    C    B   Total   Name of department(s)
Adelaide          3    3    7    6    19     School of Economics
ADFA              1    1    5    8    15     Economics & Management
ANU               4    5    6    5    20     School of Economics
Canberra          1    0    5    7    13     Economics & Marketing
Curtin            3    8    5   11    27     Economics & Finance
Deakin            2    0    5    7    14     School of Economics
Edith Cowan       1    1    1   13    16     Finance & Business Economics
Flinders          2    2    4    6    14     School of Economics
Griffith          1    3    3    3    10     School of Economics
LaTrobe           2    4    7    6    19     Economics & Finance
Macquarie         4    1    9    8    22     Economics
Melbourne        11    7   21    0    39     Economics
Monash            6   13   35    0    54     Economics & Econometrics
Murdoch           1    2    5    2    10     Economics
Newcastle         1    4    3    3    11     School of Policy
New England       6    7    7    4    24     Economics
Queensland        4   11   14    9    38     Economics
QUT               2    2    3   14    21     Economics & Finance
RMIT              2    3    8   20    33     Economics & Finance
Sydney            4   10   10    7    31     Econ, Econometr & Econ Hist
Tasmania          1    1    4    2     8     Economics
UNSW              5   13   12    8    38     Economics
UTS               4    6   12   12    34     Finance & Economics
UWA               3    1    8    2    14     Economics
Victoria          1    2   14   26    43     Applied Economics
Western Sydney    4    3   10   18    35     Economics & Finance
Wollongong        2    6    5    5    18     Economics
Total                               640
Source: University websites as of April 2002. Note: level E is Professor, D is Associate Professor or Reader, C is Senior Lecturer and B is Lecturer.
Table 2: Harris Rankings for 1974-83 and 1984-8
              Publication points 1974-83        Publication points 1984-8
              Total        Per staff member     Total        Per staff member
Adelaide       841 (10)    37.2 (16)             519 (4)     23.6 (4)
ADFA           287 (16)    62.4 (2)              138 b       23.0 b
ANU           1404 (1)     88.3 (1)              513 (5)     30.1 (2)
Flinders       470 (13)    41.6 (10)             149 b        9.9 b
James Cook     316 (15)    40.5 (11)              60 (14)     7.4 (14)
LaTrobe       1040 (4)     47.1 (5)              463 (9)     17.8 (11)
Macquarie      959 (8)     48.0 (4)              576 (2)     19.2 (9)
Melbourne      910 (9)     42.7 (8)              551 (3)     19.7 (7)
Monash         996 (7)     39.1 (13)             488 (7)     21.1 (5)
Murdoch         71 a       19.2 a                 52 c       12.9 c
Newcastle     1221 (2)     58.7 (3)              646 (1)     35.5 (1)
New England    530 (12)    42.4 (9)              276 (12)    20.9 (6)
Queensland    1101 (3)     37.4 (15)             490 (6)     18.0 (10)
Sydney        1003 (5)     46.7 (7)              386 (10)    15.3 (13)
Tasmania       365 (14)    36.1 (17)             272 (13)    25.6 (3)
UNSW          1002 (6)     46.8 (6)              483 (8)     19.5 (8)
UWA            772 (11)    39.4 (12)             286 (11)    17.2 (12)
Wollongong     278 (17)    38.6 (14)             173 d       14.0 d
Source: Harris (1988, 109) and Harris (1990a, 251). Notes: Numbers in parentheses are rankings. a 1977-83, b 1984-6, c 1984-7, d 1986-8.
Table 3: Citation Measures for 1986-7
              Proportion of staff cited (%)   Citations per staff member
Adelaide       59                              3.1 (5)
ADFA           50                              1.1 (15)
ANU            82                              6.0 (1)
Flinders       44                              2.3 (7=)
James Cook     12                              0.2 (17)
LaTrobe        47                              2.0 (9=)
Macquarie      48                              3.5 (3)
Melbourne      47                              3.4 (4)
Monash         43                              2.9 (6)
Murdoch         7                              0.1 (18)
Newcastle      30                              2.0 (9=)
New England    59                              1.4 (13)
Queensland     43                              2.3 (7=)
Sydney         54                              1.6 (12)
Tasmania       46                              1.8 (11)
UNSW           41                              1.3 (14)
UWA            33                              3.6 (2)
Wollongong     27                              0.8 (16)
Source: Harris (1990a, 255). Notes: Numbers in parentheses are rankings.
Table 4: The Twelve Leading Australian-based Economists by Publication Points (1984-8) and by Citations (1986-7)

By publications           Number   Points
Clem Tisdell                99      274
David Hensher               38      168
Yew-Kwang Ng                25      159
Peter Groenewegen           21      115
N. Long                     22       92
Richard Snape               15       92
Raymond Markey a            10       89
Murray Kemp b               21       86
G. Mehta                    17       77
John McDonald c             11       77
C. Stahl                    11       77
Eric Jones                  13       75

By citations              Number per year   Of which self-cites
David Hensher               57.5              22.5
Geoff Brennan               54.5               3.5
Yew-Kwang Ng                40.0              13.5
Ian McDonald                36.5               3.0
T. Takayama                 30.0               0
A. Evans d                  25.0               1.0
Kym Anderson                24.5               4.0
Eric Jones                  18.0               0
Ken Clements                17.5               1.0
Peter Lloyd                 16.5               0.5
Clem Tisdell                16.0               1.5
Richard Cornes              15.5               0.5

Source: Harris (1990a, 253 & 256). Notes: a 1986-8 only, b 1984-7 only, c 1984-6 only, d 1987 only.
Table 5: Towe and Wright Rankings for 1988-93

                      Number of pages published per staff member
               Staff   Top tier only   Tiers 1&2     Tiers 1-4
Adelaide        21     0.59 (9)        0.92 (12)     14.74 (15)
ANU             19     1.95 (1)        6.58 (1)      22.46 (9)
Bond             6     0.67 (7)        1.34 (10)     26.14 (5)
Curtin          19     0.23 (11)       0.39 (16)     12.92 (16)
Deakin          18     0               0.59 (13)      7.44 (20)
Flinders        14     0.21 (12)       0.21 (17)      9.47 (19)
Griffith         8     0               0             12.05 (17)
James Cook      10     0               0              0 (23)
LaTrobe         24     0.96 (6)        1.71 (9)      24.76 (8)
Macquarie       25     0               0.47 (14)     10.71 (18)
Melbourne       25     1.03 (4)        4.08 (2)      31.88 (2)
Monash          31     1.25 (2)        3.39 (3)      33.16 (1)
Murdoch          9     0               0             21.41 (11)
Newcastle       27     0               0             15.48 (14)
New England     13     0               0             28.71 (3)
Queensland      31     0.48 (10)       1.16 (11)     26.18 (4)
RMIT            20     0               0.44 (15)      2.74 (21)
Sydney          26     0.64 (8)        2.41 (5)      25.40 (7)
Tasmania        12     0               2.09 (6)      25.75 (6)
UNSW            30     1.10 (5)        2.47 (4)      22.36 (10)
UTS             27     0               0.06 (18)      1.04 (22)
UWA             17     1.06 (3)        1.93 (7)      20.49 (12)
Wollongong      16     0               1.85 (8)      17.48 (13)

Econometrics
ANU              5     1.71 (2)        1.71 (5)       8.97 (5)
Monash          11     1.43 (3)        9.23 (2)      19.53 (3)
New England      8     0.95 (4)        5.39 (3)      22.84 (1)
Sydney           8     3.81 (1)       10.44 (1)      22.24 (2)
UNSW             9     0.54 (5)        3.71 (4)      18.19 (4)
Source: Towe and Wright (1995). Notes: Numbers in parentheses are rankings. The top tier consists of 12 journals, tier 2 of 23 and tiers 1-4 include all 332 journals in the printed version of the Journal of Economic Literature.
Table 6: Sinha and Macri Rankings for 1988-2000

                      Total                         Per capita
                LP weights     MSF weights     LP weights   MSF weights
Adelaide         7.51 (13)     445.66 (10)     0.34 (13)    20.26 (12)
ADFA             0.25 (27)      85.42 (26)     0.02 (27)     5.69 (25)
ANU            144.77 (1)      959.78 (2)      6.03 (1)     39.99 (3)
Canberra         0.57 (25)      80.95 (27)     0.06 (24)     8.99 (20)
Curtin           3.92 (18)     342.67 (12)     0.14 (22)    12.24 (18)
Deakin           5.32 (17)     185.04 (18)     0.33 (15)    11.56 (19)
Edith Cowan      0.37 (26)      95.58 (25)     0.02 (26)     5.97 (24)
Flinders         3.03 (19)     111.94 (24)     0.20 (18)     7.46 (21)
Griffith        12.23 (8)      143.06 (22)     1.11 (7)     13.01 (17)
LaTrobe         10.52 (9)      670.88 (5)      0.53 (9)     33.54 (4)
Macquarie        8.20 (12)     368.34 (11)     0.37 (12)    16.74 (14)
Melbourne       77.24 (3)     1851.24 (1)      2.03 (5)     48.72 (1)
Monash          34.15 (6)      577.36 (6)      1.42 (6)     24.06 (8)
Murdoch          2.87 (20)     230.22 (17)     0.24 (17)    19.18 (13)
Newcastle        1.62 (24)     163.82 (19)     0.16 (21)    16.38 (15)
New England     10.30 (10)     575.24 (7)      0.41 (10)    23.01 (9)
Queensland      13.87 (7)      949.54 (3)      0.41 (11)    27.93 (6)
QUT              2.67 (21)     156.12 (21)     0.13 (23)     7.43 (22)
RMIT             7.05 (15)     232.15 (16)     0.20 (19)     6.45 (23)
Sydney          58.89 (4)      499.93 (9)      3.46 (2)     29.41 (5)
Tasmania         7.29 (14)     240.69 (15)     0.81 (8)     26.74 (7)
UNSW            85.05 (2)      828.69 (4)      2.24 (4)     21.81 (10)
UTS             10.02 (11)     158.00 (20)     0.29 (16)     4.51 (26)
UWA             38.79 (5)      564.67 (8)      2.77 (3)     40.33 (2)
Victoria         2.39 (22)     139.82 (23)     0.06 (25)     3.68 (27)
Western Sydney   2.04 (23)     254.55 (14)     0.17 (20)    21.21 (11)
Wollongong       6.53 (16)     259.77 (13)     0.34 (13)    13.67 (16)

Source: Sinha and Macri (2002, 142-3). Notes: LP refers to the impact-based journal weights and MSF to the perception-based journal weights. Numbers in parentheses are rankings.
Table 7: Australian-based Economists among the 1082 Most-cited, 1984-96

                    Year of birth   Affiliation
Heinz Arndt          1915           emeritus -- ANU
Robert Bartels       1947           emeritus -- Sydney
Ray Byron            1941           Bond
Ken Clements         1950           UWA
John Creedy          1949           Melbourne
Peter Dixon          1946           Monash (COPS)
Simon Domberger      1949           AGSM (Sydney & UNSW)
John Freebairn       1944           Melbourne
Robert Gregory       1939           ANU (RSSS)
David Hensher        --             Sydney
Rodney Jensen        1934           emeritus -- Queensland
Murray Kemp          1926           emeritus -- UNSW
Peter Lloyd          1937           Melbourne
Harold Lydall        1916           emeritus -- Adelaide
Michael McAleer      1952           UWA
Ian McDonald         1947           Melbourne
Yew-Kwang Ng         1942           Monash
Adrian Pagan         1947           ANU
Jonathan Pincus      1939           Adelaide
Richard Pomfret      1948           Adelaide
Alan Powell          1937           emeritus -- Monash
Peter Swan           1944           Sydney
David Throsby        1939           Macquarie
Clem Tisdell         1939           Queensland
Peter Warr           --             ANU
Alan Woodland        1943           Sydney

Source: Blaug (1999), page 1225 plus individual entries (except for Hensher and Warr, who according to Appendix 4 qualified for inclusion but did not provide personal data). Affiliations are as given in the source.
Table 8: Citations 1995-2002 and Publications 1990-2001 by Department

                 Staff   Citations   Publications
                                     Top 88     Aus
Adelaide          19      1,265       24.17    14.75
ADFA              15         68        0.00     2.00
ANU               20      2,898       56.91    31.17
Canberra          13        174        2.33     8.17
Curtin            27        257       11.00    12.42
Deakin            14         98        7.42     5.75
Edith Cowan       16          9        0.00     0.00
Flinders          14        265        4.33     3.00
Griffith          10        113        4.83     3.83
LaTrobe           19        640       27.58    28.25
Macquarie         22        431       13.17     5.33
Melbourne         39      2,287      137.92   139.25
Monash            54      2,229       62.50    20.33
Murdoch           10        146        5.50     8.58
New England       24      1,623       12.83    19.50
Newcastle         11        367        9.17     3.50
Queensland        38      1,792       29.83    18.25
QUT               21        112       14.17     5.75
RMIT              33         83        4.42     6.92
Sydney            31      1,332       43.17    11.33
Tasmania           8        239       14.33    10.33
UNSW              38      1,942       68.92    19.67
UTS               34        634        6.50     3.00
UWA               14      1,121       37.50    18.33
Victoria          43        108        4.50     6.67
Western Sydney    35        223       18.33    13.00
Wollongong        18        225        9.00    10.50
Total            640     20,681      630.33   429.58
Notes: Citations are references in journal articles published since 1995 to all publications by academic staff in each department.
Table 9: Ranking of Australian University Departments by Total Publications

                 S&M LP wts   S&M MSF wts   New
Adelaide             13           10          9
ADFA                 27           26         26=
ANU                   1            2          4
Canberra             25           27         25
Curtin               18           12         15
Deakin               17           18         18
Edith Cowan          26           25         26=
Flinders             19           24         24
Griffith              8           22         21
LaTrobe               9            5          8
Macquarie            12           11         13
Melbourne             3            1          1
Monash                6            6          3
Murdoch              20           17         20
Newcastle            24           19         16
New England          10            7         14
Queensland            7            3          7
QUT                  21           21         12
RMIT                 15           16         23
Sydney                4            9          5
Tasmania             14           15         11
UNSW                  2            4          2
UTS                  11           20         19
UWA                   5            8          6
Victoria             22           23         22
Western Sydney       23           14         10
Wollongong           16           13         17

Source: Tables 6 and 8.
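The stability claim made in the text can be quantified with a rank correlation between columns of this table. The sketch below computes Spearman's rho between the S&M LP-weighted ranks and our new ranks, in the department order listed above; it treats the tied 26= entries as 26, so the coefficient is approximate, and it is offered only as an illustration of the kind of check Thursby (2000) performs, not as part of the original analysis.

```python
# Department ranks copied from Table 9, in the order Adelaide ... Wollongong.
lp_ranks = [13, 27, 1, 25, 18, 17, 26, 19, 8, 9, 12, 3, 6, 20, 24, 10,
            7, 21, 15, 4, 14, 2, 11, 5, 22, 23, 16]   # S&M, LP weights
new_ranks = [9, 26, 4, 25, 15, 18, 26, 24, 21, 8, 13, 1, 3, 20, 16, 14,
             7, 12, 23, 5, 11, 2, 19, 6, 22, 10, 17]  # our "top 88" count

# Spearman's rho via the classical formula 1 - 6*sum(d^2) / (n*(n^2 - 1));
# exact only without ties, hence approximate for the 26= pair.
n = len(lp_ranks)
d2 = sum((a - b) ** 2 for a, b in zip(lp_ranks, new_ranks))
rho = 1 - 6 * d2 / (n * (n * n - 1))
print(f"Spearman rho = {rho:.3f}")  # a value near 1 means broad agreement
```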
Table 10: Top Ten Ranked Departments, various periods and measures.
Total publications
  1974-83   1984-88   1988-2000   1990-2001
  Harris    Harris    S&M         New
  ANU       Newc      ANU         Melb
  Newc      Macq      UNSW        UNSW
  Qld       Melb      Melb        Monash
  LaTrobe   Adel      Sydney      ANU
  Sydney    ANU       UWA         Sydney
  UNSW      Qld       Monash      UWA
  Monash    Monash    Qld         Qld
  Macq      UNSW      Griffith    LaTrobe
  Melb      LaTrobe   LaTrobe     Adel
  Adel      Sydney    UNE         UWS

Publications per capita
  1974-83   1984-88   1988-93   1988-2000   1990-2001
  Harris    Harris    T&W       S&M         New
  ANU       Newc      Melb      ANU         Melb
  ADFA      ANU       UWA       Sydney      ANU
  Newc      Tas       Tas       UWA         UWA
  Macq      Adel      Sydney    UNSW        UNSW
  LaTrobe   Monash    ANU       Melb        Tas
  UNSW      UNE       Monash    Monash      LaTrobe
  Sydney    Melb      UNSW      Griffith    Sydney
  Melb      UNSW      UNE       Tas         Adel
  UNE       Macq      Griffith  LaTrobe     Monash
  Flinders  Qld       LaTrobe   UNE         Newc

Citations per capita                 Total citations
  1986-87        1995-2002           1995-2002
  Harris         New                 New
  ANU            ANU                 ANU
  UWA            UWA                 Melb
  Macq           UNE                 Monash
  Melb           Adel                UNSW
  Adel           Melb                Qld
  Monash         UNSW                UNE
  Flinders (7=)  Qld                 Sydney
  Qld (7=)       Sydney              Adel
  LaTrobe (9=)   Monash              UWA
  Newc (9=)      LaTrobe             LaTrobe
Notes: Harris is based on his weighting of many types of publications, the other three rankings consider only journal articles. T&W is based on number of pages published in top three tier journals (71 journals), S&M on DV weightings of page length and journal quality, and the new rankings on number of publications in top 88 journals.
Figure 1: Share of Total Publications in Top 88 Journals of Economics Departments (1990-2001)
[Pie chart: Melbourne 22%, UNSW 11%, Monash 10%, ANU 9%, Sydney 7%, UWA 6%, Queensland 5%, Adelaide 4%, LaTrobe 4%, Others 22%.]
Figure 2: Cumulative Total Publications in Top 88 Journals of Australian Academic Economists (1990-2001)
[Line chart: cumulative publications (0 to 672) plotted against the cumulative number of academic staff in all departments (1 to 640).]
Figure 3: Cumulative Total Citations of Australian Academic Economists' Publications by Journal Articles (1995-2002)
[Line chart: cumulative citations (0 to 22,500) plotted against the cumulative number of academic staff in all departments (1 to 640).]
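Figures 2 and 3 are constructed the same way: individuals are sorted from most to least productive and their output is accumulated. A minimal sketch of the construction follows; the per-person counts in it are hypothetical placeholders, whereas the paper's actual series runs over all 640 economists, 386 of whom have no "top 88" publications.

```python
# Hypothetical per-person publication counts, standing in for the 640
# individual totals that underlie Figure 2 (zeros included deliberately).
pubs = [21, 20, 11, 8, 5, 3, 2, 1, 1, 0, 0, 0]

pubs.sort(reverse=True)          # most productive first
total = sum(pubs)
running = 0
for i, n in enumerate(pubs, start=1):
    running += n
    print(f"top {i:2d} staff -> {running / total:6.1%} of all publications")
```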
APPENDIX 1: Publications 1990-2001 and Citations 1995-2002

Louise Kym Michelle L. Raul A. Brian L. Richard John H. Anthony W. Ian W. Margaret Duc Tin Jonathan J. Richard W. T. Ramkishen S. Colin Tom Randy John E. Jenny
Allsopp Anderson Barnes Barreto Bentick Damania Hatch Hughes Mclean Meyler Nguyen Pincus Pomfret Rajan Rogers Sheridan Stringer Whitley Williams Adelaide (19)
Top 88 0.00 9.50 0.33 1.00 0.00 2.50 0.00 0.00 1.50 0.00 1.33 0.50 4.50 2.00 0.00 0.00 0.00 0.50 0.50 24.17
Aus. Citations 0.00 0 4.25 493 0.00 6 0.00 1 1.00 49 2.00 18 0.00 10 0.00 1 1.00 78 0.00 0 1.50 53 0.00 218 2.00 233 0.00 31 0.50 41 2.00 32 0.00 1 0.00 0 0.50 0 14.75 1265
Elizabeth Hock-Beng Iain Peter Twan Sharon Sid Gary Stefan R. Ian Paul Jung-Soo Massimilano Garrett James
Barber Cheah Densten Hall Huybers Jackson Knell Manger Markowski Mcewin Oslington Seo Tani Upstill Warn ADFA (15)
Top 88 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Aus. Citations 0.00 0 0.00 13 0.00 4 0.00 27 0.00 0 0.00 3 0.00 0 0.00 0 0.00 8 1.00 6 1.00 6 0.00 0 0.00 0 0.00 1 0.00 0 2.00 68.00
Matt H. Mac Robert V. Trevor Paul Selwyn Steve John Simon Chris Alan Flavio M.
Benge Boot Breunig Breusch Chen Cornish Dowrick Gage Grant Jones Martina Menezes
Top 88 2.00 0.00 1.00 0.25 2.50 0.00 7.83 0.00 8.83 1.00 0.00 3.83
Aus. Citations 3.00 5 3.00 25 0.00 0 0.00 1168 0.00 2 1.00 1 3.00 405 0.00 0 1.00 77 1.00 14 0.00 11 0.00 16
Don S. John Alex Guillaume Matthew J. Ben Rodney Graeme
Poskitt Quiggin Robson Rocheteau Ryan Smith Tyers Wells ANU (20)
Craig Petra Tania Anne E. Michael Desh James Philip E. T. Greg Muniappan Heather S. Nicholas Margaret
Applegate Bouvain Crosbie Daly Francis Gupta Hanratty Lewis Mahony Perumal Prior Samuel Wallace Canberra (13)
Top 88 0.00 0.00 0.00 0.50 0.00 0.00 0.00 1.50 0.00 0.00 0.00 0.33 0.00 2.33
Aus. Citations 0.00 1 0.00 0 0.00 0 4.50 95 0.00 0 0.00 7 0.00 0 3.67 58 0.00 6 0.00 2 0.00 0 0.00 5 0.00 0 8.17 174.00
Alles Austen Bloch Chotikapanich Crockett Evans Hopkins How Kemp Kerr MacDonald Madden Manzur Mayall Oborn O'Hara Pang Parasuraman Pereira Pope Rath Rowland Simpson Stromback Thorpe Verhoeven Western Curtin (27)
Top 88 0.50 0.00 2.75 1.00 0.00 0.00 0.50 0.50 0.00 0.00 1.50 2.08 0.67 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 1.00 0.50 0.00 0.00 11.00
Aus. Citations 1.50 2 0.00 1 3.25 75 1.00 14 0.83 18 0.00 0 0.00 3 0.00 2 0.00 1 0.25 0 0.75 5 2.67 40 1.00 19 0.00 0 0.00 0 0.00 28 0.00 0 0.00 0 0.00 0 0.33 10 0.00 0 0.00 0 0.00 0 0.33 34 0.50 2 0.00 0 0.00 3 12.42 257.00
Lakshman A. Siobhan Harry Duangkamon Geoffrey V. John P. Sandra Janice C. Y. Steven Ian Alexander Garry A. Gary G. Meher Peter Michael Phillip Anthony Johnney Aloysius David Jeff Subhrendu Patrick John C. Thorsten Michael W. Peter David
3.50 20.83 0.00 1.33 0.00 0.00 4.00 0.00 56.91
0.00 15.67 0.00 0.00 0.00 0.00 2.50 1.00 31.17
93 902 0 8 0 15 152 4 2898.00
Bantow Doucouliagos Geller Graham Hazari Hellier Hone Keneley Neath Racionero Scarborough Sgro Thomson Torre Deakin (14)
Top 88 0.00 3.50 0.00 0.00 1.33 0.00 0.75 0.00 0.00 1.00 0.00 0.83 0.00 0.00 7.42
Aus. Citations 0.00 0 3.00 41 0.00 0 0.00 1 0.00 25 0.00 0 1.75 8 1.00 1 0.00 0 0.00 0 0.00 1 0.00 20 0.00 1 0.00 0 5.75 98.00
David E. Ray Mahendra Yun Hsing Marilyn Paul Steven Lee Kian Edward Chien-Tin Stuart Ron Greg Roy Christopher Clive Mark
Allen Boffey Chandra Cheung Clark-Murphy Gerrans Li Lim Lin McKay Moore Parry Pearce Reynolds Reynoldson Waring Edith Cowan (16)
Top 88 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
Aus. Citations 0.00 9 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 9.00
Owen Antonio Donald E. Susan John Kevin Philip A. Daniel John Penny Graham Ralph Peter Joe
Covick Dottore Fuller Gunner Hayles Kirchner Lawn Leonard McDonald Neal Scott Schlomowitz Wagstaff Williams Flinders (14)
Top 88 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 2.67 0.00 0.00 1.67 0.00 0.00 4.33
Aus. Citations 0.00 2 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 9 0.00 72 1.00 38 0.00 5 0.00 0 2.00 138 0.00 1 0.00 0 3.00 265.00
Ray Hristos (Chris) Chris Mary Bharat R. Phillip Phillip Monica J. David Maria del Mar Helen Pasquale M. Dianne Andrew
Bandaralage Brown Cybinski Fitzgibbons Forster Nguyen Riley Selvanathan Smith Swift Griffith (10)
Top 88 0.50 0.50 0.00 0.00 0.00 0.00 0.00 2.83 0.00 1.00 4.83
Aus. Citations 0.00 19 0.50 12 0.00 0 0.00 56 0.00 0 0.50 4 0.00 0 1.83 21 0.00 0 1.00 1 3.83 113.00
Harry T. Buly A. Harry R. Robert Geoff W. Iain M. Lionel E. Michael Darren Gillian J. Greg John O. S. Jae (Paul) H. John E. Imad A. David Michael P. Robert George Xiangkang
Burley Cardak Clarke Dumsday Edwards Fraser Frost Harris Henry Hewitson Jamieson Kennedy Kim King Moosa Prentice Schneider Waschik Yin La Trobe (19)
Top 88 0.00 1.00 4.00 0.00 1.25 1.00 0.00 0.00 0.00 1.00 0.00 0.00 1.00 2.83 7.00 1.00 0.00 2.50 5.00 27.58
Aus. Citations 0.00 1 1.00 2 10.00 156 0.00 2 0.75 75 1.00 23 4.50 50 2.00 0 0.00 0 2.00 16 0.00 0 0.00 161 0.00 7 0.00 71 2.00 64 1.50 0 0.00 4 0.00 2 3.50 6 28.25 640.00
Peter W. Melanie Wylie D. William D. Anthony Craig Kim M. Christopher Jocelyn Pui C. Glenn S. Roselyne C. William Marc P. Graham M. Allan A. Daehoon Roderick M. Dipendra N. C. David
Abelson Beresford Bradford Bryant Freedman Hawtrey Heaton Horne Ip Jones Joyeux Junor Lombard Madden McHarg Nahm O'Donnell Sinha Throsby
Top 88 0.00 0.00 0.00 3.00 1.50 2.00 0.00 2.00 0.00 0.50 0.00 0.33 1.00 0.00 0.00 0.33 1.00 0.50 1.00
Aus. Citations 0.00 64 0.00 45 0.00 0 0.00 3 0.50 10 2.00 6 0.00 1 1.00 46 0.00 1 0.50 0 0.00 1 0.00 0 0.00 2 0.00 0 0.00 0 0.33 0 0.00 84 0.00 12 0.00 156
Jayatilleke S. Allan L. Patti Athol John Duc Tho Jen Saroja Christine Robyn
Roger S. Sean R. A. Trevor
Tonkin Turnell Whitehead Macquarie (22)
0.00 0.00 0.00 13.17
0.00 1.00 0.00 5.33
0 0 0 431.00
Mary Akihito Peter Suren Jeffrey I. Lisa A. Hsiao-Chuan Yuan John Mark Robert John Malcolm Lisa John W. Lata William E. David C. Olan T. Joseph G. Sisira Carol G. Stephen P. Guay Cheng Peter J. Jenny N. Donald Gary B. Vance L. Ian M. Neville R. Nilss Katerina Kalvinder Michael A. Christopher L. Rhonda L. Peter J. Rabee Ross A.
Amiti Asano Bardsley Basov Borland Cameron Chang Chou Creedy Crosby Dixon Dowling Farrell Freebairn Gangadharan Griffiths Harris Henry Hirschberg Jayasuriya Johnston King Lim Lloyd Lye MacLaren Magee Martin McDonald Norman Olekalns Sherstyuk Shields Shields Skeels Smith Stemp Tourky Williams Melbourne (39)
Top 88 3.00 0.00 7.00 1.00 8.33 6.33 0.00 0.00 20.00 5.50 1.00 0.83 1.08 3.33 1.50 0.50 1.00 2.83 3.17 2.67 1.08 7.50 5.25 10.50 2.67 0.50 1.00 4.25 6.92 0.00 7.83 4.50 2.50 3.50 3.00 0.00 1.33 3.83 2.67 137.92
Aus. Citations 1.50 27 0.00 0 3.00 91 0.00 4 14.83 110 1.50 37 0.00 0 0.00 0 24.00 403 7.33 18 2.83 39 3.50 22 0.00 16 12.00 154 0.00 9 1.58 193 0.00 11 3.83 10 0.33 43 0.00 26 1.50 4 13.33 39 5.50 13 6.00 251 1.50 24 1.00 17 2.00 21 2.00 16 9.00 491 1.00 23 8.33 31 0.00 10 0.00 4 0.00 71 0.00 13 3.00 1 6.33 11 0.00 30 2.50 4 139.25 2287.00
Heather M. Mita Ross Larry Tony Anita Phillip
Anderson Bhattacharya Booth Cook Dingle Doraisami Edwards
Top 88 2.33 1.00 0.00 0.00 0.00 0.00 0.00
Aus. Citations 0.50 34 0.50 11 0.00 0 0.00 5 0.00 77 0.00 8 0.00 0
Merran A. Dietrich K. Cathy Catherine S. Peter J. Lee Barry A. Rob J. Brett A. Carol Gennadi Maxwell L. Ian Elizabeth Pushkar Gael M. Keith R. Alan Kathy Harmindar Siang Yew Kwang Ram Robert C. Judith Shirley George Geraldine He-Ling Paramsothy Russell Ralph D. Geoff Bruce Judy Christis G. Keith Farshid Maria Rebecca J. Marika Ian Michael V. Ian R. Koi Nyen Jill Xiaokai Xueyan
Evans Fausten Fletcher Forbes Forsyth Gordon-Brown Goss Hyndman Inder Jeffs Kazakevitch King Kirkwood Maharaj Maitra Martin McLaren McLean Meikle Nath Ng Ng Pillarisette Rice Rich Richardson Rivers Roberts Shi Silvapulle Smyth Snyder Spenceley Stephens Tennant Tombazos Trace Vahid Valenzuela Vicziany Ward White Wills Wong Wright Yang Zhao Monash (54)
Simon Murray Paul R. Anne Margaret Frank
Avenell Brennan Flatau Garnett Harman
1.33 1.50 0.00 0.33 0.00 0.00 1.83 0.00 3.83 0.00 0.00 6.33 0.00 0.00 1.50 0.25 3.58 0.00 0.00 0.00 0.50 10.83 0.00 0.50 2.00 0.00 0.00 0.00 3.33 3.33 0.50 0.67 0.00 0.00 0.00 3.00 0.00 2.00 0.33 0.00 0.00 5.00 0.50 0.00 0.00 5.83 0.33 62.50 Top 88 0.00 0.00 0.50 0.00 0.00
0.33 1.50 0.00 0.00 0.50 0.00 3.17 0.00 1.17 0.00 0.00 0.50 0.00 0.00 0.00 0.00 1.58 0.00 0.00 0.00 0.00 4.50 0.00 0.00 0.50 0.00 0.00 0.00 0.00 0.00 0.50 0.00 0.00 0.00 0.00 0.00 1.00 0.50 1.00 0.00 0.00 1.00 1.00 0.00 0.00 0.33 0.25 20.33
203 10 0 4 95 0 44 59 90 0 0 409 0 5 20 3 47 0 0 0 0 651 0 4 0 0 0 0 5 41 27 63 10 0 0 4 4 93 9 28 1 54 5 0 0 97 9 2229.00
Aus. Citations 0.00 0 0.00 0 2.67 13 0.00 0 0.00 3
Robert Rajasundrum Herb Malcolm T. Gavin A.
Leeson Sathiendrakumar Thompson Tull Wood Murdoch (10)
5.00 0.00 0.00 0.00 0.00 5.50
1.00 0.00 0.00 1.50 3.42 8.58
61 7 17 16 29 146.00
Jonathan C. George E. Oscar J. Hui-Shung Tim J. Brian E. Euan M. Pauline Jeff Graydon R. Amarjit Alfons Christopher John Christopher J. Roley R. D. S. John M. Phillip R. John A. A. Mahindapala Greg O. Malcolm L. Paul
Baldry Battese Cacho Chang Coelli Dollery Fleming Fleming Gow Henning Kaur van der Kraan Lloyd Nightingale O'Donnell Piggott Prasada Rao Pullen Simmons Sinden Siriwardana Smith Treadgold Winters New England (24)
Top 88 1.00 0.00 0.33 0.50 1.50 2.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.83 0.33 2.67 0.00 0.50 0.50 2.33 0.00 0.00 0.33 12.83
Aus. Citations 2.00 52 0.50 519 0.83 45 1.00 53 1.75 341 3.33 69 0.00 28 0.00 0 0.00 3 1.00 5 0.00 10 0.00 0 2.00 110 0.25 24 0.33 22 2.17 39 0.00 26 0.00 48 2.00 8 0.33 186 2.00 18 0.00 3 0.00 7 0.00 7 19.50 1623.00
John John R. Akhtar Jim Paul William F. Allen Sudha Charles W. John Martin J.
Burgess Fisher Hossain Jose Kniest Mitchell Oakley Shenoy Stahl Tate Watts Newcastle (11)
Top 88 0.83 0.00 1.00 0.00 0.83 2.00 0.00 0.00 0.00 0.00 4.50 9.17
Aus. Citations 1.00 40 0.00 18 0.00 33 0.00 5 0.00 2 1.50 23 0.00 55 0.00 0 0.00 109 0.00 4 1.00 78 3.50 367.00
Mohammad John Rodney M. Philip M. Richard P. C. Harry F.
Alauddin Asafu-Adjaye Beard Bodman Brown Campbell
Top 88 0.33 0.00 0.00 3.50 0.00 2.50
Aus. Citations 0.00 94 0.50 2 0.00 0 3.00 17 0.00 102 1.50 87
Joseph C. H. Averil George E. Darrel P. L. Alan Maurice Peter E. John Stephen R. Robert V. Sukham Neil Dias Gareth Bruce Renuka Anthony J. Thomas D. John E. Ghanshyam B. Colin Jason Alicia N. Paul C. Jacqueline Kartik Chandra Jon D. Sam M. Kam-Ki Clement A. Guy R. Leopoldo Jie
Chai Cook Docwra Doessel Duhs Dwyer Earl Foster Harrison Jackson Jackson Karunaratne Leeves Littleboy Mahadevan Makin Mandeville Mangan Mehta Nicholson Potts Rambaldi Riethmullen Robinson Roy Stanford Strong Tang Tisdell West Yanes Zhang Queensland (38)
John Ralf Mark Michael Glyn Renee Helen Stan A. Radhika Allan P. Boon Eugene Andrew Vlad John Marc Tim Tommy Abbas Peter
Anderson Becker Christensen Drew Edwards Fry Higgs Hurn Lahiri Layton Lee McCann Paltridge Pavlov Polichronis Robinson Robinson Tang Valadkhani Whelan
0.00 0.00 0.00 0.00 0.00 0.00 1.00 3.83 0.00 0.00 0.00 0.00 0.00 0.00 1.50 1.00 0.00 0.00 3.83 0.00 0.00 1.50 0.00 0.00 0.00 0.00 1.00 0.50 0.33 0.00 0.00 9.00 29.83 Top 88 0.00 0.00 0.00 0.00 0.00 0.00 0.00 3.67 0.00 7.50 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
0.00 0.25 0.00 0.00 0.00 0.00 0.00 0.25 0.00 2.50 0.00 0.00 2.00 0.00 0.00 4.00 0.00 1.00 0.00 0.00 0.00 0.50 0.00 0.00 0.00 0.00 0.50 0.00 1.00 1.25 0.00 0.00 18.25
46 0 4 29 2 0 164 103 77 83 29 19 2 5 1 20 9 8 119 0 5 21 26 0 13 3 4 3 592 72 0 31 1792.00
Aus. Citations 0.00 0 0.00 0 0.00 0 0.00 0 0.33 1 0.00 0 0.33 0 0.00 3 0.00 2 3.00 64 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 14 0.00 6 0.00 0 0.00 1 0.00 0
Andrew C.
Worthington QUT (21)
3.00 14.17
2.08 5.75
21 112.00
John Colin Robert D. Susan Stewart Bruce Rod Sinclair Richard Jenny Amalia Robert Robert W. Peter Harry Terrence A. Thomas Ed Paula Tony Greg Michael D. Warren Heather Vanitha John H. Magdy Mark F. Malick Ousmane Stuart Maira Miriam Cynthia Ching Lun
Anderson Bent Brooks Campbell Carter Cowling Crane Davidson Diggle Di Iorio Evans Faff Fuller Goldberg Hallahan Josev Koken Loveday Martin Maynes McKenzie McKeown Mitchell Ragunathan Shannon Stephan Stewart Sy Thomas Vitols Weisz Wilson Wu RMIT (33)
Top 88 0.00 0.00 2.08 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 1.33 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 1.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 4.42
Aus. Citations 0.00 0 0.00 0 3.08 30 0.00 0 0.00 0 0.00 0 0.00 0 0.00 3 0.00 0 0.00 0 0.00 0 1.33 25 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 0.00 0 1.00 20 0.00 0 0.00 0 0.00 2 0.00 1 0.00 0 1.00 0 0.00 2 0.00 0 0.00 0 0.00 0 0.50 0 0.00 0 6.92 83.00
Murali Robert Elie Tony Robert Diane Dilip K. John Moshe Richard Erne Diane Tig Surinder Anthony J.
Agastya Aldrich Appelbaum Aspromourgos Bartels Dancer Dutta Goodhew Haviv Holden Houghton Hutchinson Ihnatko Joson Phipps
Top 88 2.00 0.00 6.00 0.50 2.50 0.00 0.50 0.00 1.00 0.00 0.00 0.00 0.00 0.00 0.00
Aus. Citations 0.00 4 0.00 118 0.50 358 1.50 15 0.33 89 0.00 0 0.00 12 0.00 0 0.00 89 0.00 0 0.00 10 1.00 2 0.00 0 0.00 2 3.00 10
Lily Anu Russell T. Abhijit Kunal Jeffrey R. Michael S. Murray D. Frank Benjamin Yanis Andrew Graham Alan D. Donald J. Judy Steffen
Rahim Rammohan Ross Sengupta Sengupta Sheen Smith Smith Tipton Varoufakis Wait White Woodland Wright Yates Ziss Sydney (31)
0.00 0.00 1.00 0.75 5.25 2.00 2.83 1.00 0.00 1.00 0.00 0.50 3.33 7.00 0.00 6.00 43.17
0.00 0.00 1.50 0.00 0.00 0.50 0.00 0.00 0.00 0.00 0.00 0.50 0.00 2.50 0.00 0.00 11.33
14 0 19 9 18 1 68 17 84 27 0 2 272 12 55 25 1332.00
Michael A. William O. Jerry Bruce S. Sarah M. Ranjan Hugh Steve
Brooks Coleman Courvisanos Felmingham Jennings Ray Sibly Thollar Tasmania (8)
Top 88 2.00 4.00 0.00 1.83 0.00 1.67 4.83 0.00 14.33
Aus. Citations 0.00 18 2.00 15 0.00 5 2.33 16 0.50 8 0.83 169 4.67 8 0.00 0 10.33 239.00
Chris M. Garry F. Hazel Ronald A. Gautam David Robert M. Denise J. Diane Denzil G. Geoffrey Lance A. Kevin J. Jack Arghya Robert J. Nanak C. Geoffrey H. Peter John Elisabetta Kieron J. David Mehdi S. Hodaka
Alaouze Barrett Bateman Bewley Bose Clark Conlon Doiron Enahoro Fiebig Fishburn Fisher Fox Frisch Ghosh Hill Kakwani Kingston Kriesler Lodewijks Magnani Meagher Meredith Monadjemi Morita
Top 88 0.00 3.33 0.33 3.75 3.00 0.00 0.00 4.50 0.00 5.83 0.00 3.58 4.83 0.00 0.33 5.00 6.33 2.33 0.50 0.00 1.00 1.00 0.00 1.50 2.00
Aus. Citations 4.00 60 0.33 33 0.33 13 2.00 253 0.00 11 0.00 4 1.00 23 0.00 47 0.00 0 0.33 48 0.00 13 1.50 14 0.50 29 0.00 0 0.00 1 0.00 12 0.00 931 3.67 15 0.00 34 0.00 14 0.00 11 0.00 15 0.00 28 1.00 10 0.00 0
Glenn David Anthony D. John John Nripesh Peter E. William Eric R. Trevor R. Truong P. Geoffrey Minxian Louis
Otto Owen Perkins Piggott Podder Robertson Schworm Sowey Stegman Truong Waugh Yang Yeung UNSW (38)
4.00 0.33 0.00 2.83 2.50 2.00 1.50 0.00 0.33 0.00 0.00 6.25 0.00 68.92
1.83 0.00 0.00 1.17 1.00 1.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 19.67
64 26 26 52 25 8 14 40 4 15 30 19 0 1942.00
Christopher Helen B. Ron G. Wing Keith K. W. Carl Jock Carolyn Rosalie Peter Nadima Mark Anthony D. Xue Zhong Tiffany Oh Kang Joelle Hugh Bill Sian Graham Maurice Len J. Eckhard Mike Erik Andrew Maxwell Guy Christopher Satish Rowan Scott Patrick James
Bajada Bendall Bird Bui Chan Chiarella Collins Currie De Gabriele Docherty El Hassan Freeman Hall He Hutcheson Kwon Miffre Morris O'Connor Owen Partington Peat Perry Platen Poe Schlogl Simos Stevenson Ta Terry Thosar Trayler Walker Wilson UTS (34)
Top 88 1.00 0.00 0.00 0.00 0.00 1.33 0.00 0.00 0.00 0.00 0.33 0.00 1.33 0.00 0.00 0.00 1.00 0.00 0.00 0.00 0.00 0.50 0.00 0.00 0.00 0.00 0.00 0.50 0.00 0.00 0.50 0.00 0.00 0.00 6.50
Aus. Citations 1.00 0 0.00 0 0.00 11 0.00 0 0.00 1 0.00 201 0.00 57 0.00 0 0.00 0 0.00 0 0.00 0 0.00 30 1.00 117 0.00 78 0.00 0 0.00 2 0.00 2 0.00 0 0.00 0 0.00 0 0.00 2 0.00 2 1.00 0 0.00 110 0.00 0 0.00 2 0.00 0 0.00 2 0.00 0 0.00 1 0.00 3 0.00 0 0.00 0 0.00 13 3.00 634.00
David J.
Butler
Top 88 3.00
Aus. Citations 0.00 4
Kenneth W. Paul Mel Nicolaas Michael Paul B. Paul W. M. Abu B. Pamela MoonJoong Darrell A. Ernst Juerg Yanrui
Clements Crompton Davies Groenewold McAleer McLeod Miller Siddique Statham Tcha Turkington Weber Wu UWA (14)
Steve Jim Roberto Marian Trevor Sarath Graham Donald Hurbert Michelle Terdich Yashar Alan Pemasiri Mohammed Ohidul Inka Robert Kandiah Malay Vanaja Laszlo Nadara Sydney Muhammad George Mario Alan G. Theo John Jadunath Andrew Jordan Z. Shashi Jesse Lindsay John Lindsay W. Jo Steven
Bakalis Bates Bergami Burford Coombes Divisekera Dunkley Feaver Fernando Fong Garry Gedik Golden Gunawardana Haque Havrila Howard Jegasothy Joshi Karagiannidis Konya Kulendran Lambrick Mahmood Messinis Miranda Morris Papadopoulos Pettitt Pradhan Rogers Shan Sharma Singh Smyrk Tippett Turner Vu Wdowik
2.00 0.00 0.00 3.33 9.67 0.00 9.83 0.00 0.00 3.33 2.00 1.00 3.33 37.50 Top 88 0.00 0.00 0.00 0.00 0.00 0.00 0.00 1.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.50 0.00 0.00 0.00 0.00 0.50 0.00 0.00 0.00 0.00 0.50 0.00 0.00 0.00 0.00 0.00 0.00 0.00
1.00 0.00 0.00 2.83 1.33 0.00 11.83 0.00 1.00 0.33 0.00 0.00 0.00 18.33
187 7 13 39 382 32 368 9 11 20 3 14 32 1121.00
Aus. Citations 0.00 0 0.00 0 0.00 0 0.00 0 0.00 1 1.00 7 0.00 7 0.83 9 0.00 0 0.00 0 0.00 4 0.00 0 0.00 0 0.33 7 0.00 12 0.00 0 0.00 0 0.00 2 0.00 0 0.00 0 0.00 1 0.33 17 0.00 0 0.33 0 0.00 0 0.00 0 0.83 8 1.00 0 0.00 0 0.00 2 0.00 0 1.00 13 0.00 0 0.00 0 0.00 0 0.00 0 0.00 13 0.00 0 0.00 0
Ruth Kenneth W. Christabel Segu
Williams Wilson Zhang Zuhair Victoria (43)
0.00 2.00 0.00 0.00 4.50
0.00 1.00 0.00 0.00 6.67
0 5 0 0 108.00
John R. Haydir Dale Anis Mamta B. Russel J. Kevin James Wayne Craig James Partha Verghis Roger Neil Samanthala P. N. (Raja) Steve Harbhaja Michael Uwe Zelko Bill Girija Andrew Satya Brian Michael Anne Maureen Ingrid Gary Gang Sean Cong Nghe Thomas J. Maria Estela William E.
Ablett Alhashimi Boccabella Chowdhury Chowdhury Cooper Daly Dwyer Ellis Farrell Gangopadhyay George Ham Hart Hettihewa Junankar Keen Kehal Kelly Lilje Livaic Lucarelli Mallik Marks Paul Pinkstone Rafferty Scarff Schraner Tian Toohey Truong Valentine Varua Worner Western Sydney (35)
Top 88 1.50 0.00 0.00 0.00 0.00 4.67 0.00 0.00 0.50 0.00 2.33 0.00 0.00 1.00 0.00 1.00 0.00 0.00 0.50 0.00 0.00 0.00 0.00 0.00 5.83 0.00 0.00 0.00 0.00 0.00 0.00 0.50 0.50 0.00 0.00 18.33
Aus. Citations 2.50 8 0.00 0 0.00 1 0.00 38 0.00 0 1.17 42 0.50 1 0.00 2 0.00 4 0.00 0 0.00 1 0.00 0 0.00 0 1.50 0 0.00 4 0.83 54 0.00 11 0.00 0 0.00 8 0.00 0 0.00 0 0.00 0 0.00 0 0.50 0 1.50 20 0.00 6 0.00 0 0.00 0 0.00 0 1.00 7 0.00 0 0.00 1 3.50 15 0.00 0 0.00 0 13.00 223.00
Khorshed Charles Tran Van Ann Kankesu Boon Chye Amnon Donald E. Raymond Maree
Chowdhury Harvie Hoa Hodgkinson Jayanthakumaran Lee Levy Lewis Markey Murray
Top 88 1.00 0.00 1.00 0.00 0.00 0.00 0.00 1.50 0.00 0.00
Aus. Citations 0.00 0 1.00 12 0.00 16 0.00 0 0.00 2 0.00 0 2.00 9 1.00 36 0.00 37 0.00 0
Frank Nelson Eduardo Joan R. John L. Nadia Simon P. Edgar
Neri Perera Pol Rodgers Rodgers Verrucci Ville Wilson Wollongong (18)
1.00 0.00 0.00 1.50 1.50 0.00 1.00 0.50 9.00
1.00 0.00 1.00 0.50 0.50 0.00 2.00 1.50 10.50
5 2 0 21 10 0 75 0 225.00

APPENDIX 2: Laband and Piette Measure of Journal Impact

Rank   Journal                                             LP weight
1      American Economic Review                            100.0
2      Journal of Financial Economics                       90.7
3      Econometrica                                         89.0
4      Journal of Political Economy                         79.1
5      Quarterly Journal of Economics                       64.5
6      Journal of Monetary Economics                        59.3
7      Journal of Economic Theory                           51.1
8      Journal of Finance                                   51.0
9      Review of Economic Studies                           47.6
10     Bell Journal of Economics                            46.5
10     Rand Journal of Economics                            46.5
11     Journal of Economic Perspectives                     37.0
12     Journal of Mathematical Economics                    32.4
13     J. Acc. Econ.                                        28.7
14     Journal of Business                                  27.1
15     Journal of Econometrics                              26.8
16     Amer. Econ. Rev. Pap. & Proc.                        26.7
17     J. Finan. Quant. Anal.                               21.3
18     Journal of Economic Literature                       18.6
19     Journal of Money, Credit and Banking                 17.9
20     J. Lab. Econ.                                        17.1
21     International Economic Review                        16.7
22     Brookings Pap. Econ. Act.                            14.7
23     Review of Economics and Statistics                   14.0
24     Journal of Law and Economics                         12.8
25     Economic Journal                                     12.8
26     Journal of Business and Economic Statistics          12.4
27     Journal of Economic Education                        12.2
28     Industrial and Labor Relations Review                11.7
29     Journal of Public Economics                          10.8
30     Journal of International Economics                   10.8
31     Economics Letters                                     9.6
32     Journal of Industrial Economics                       9.5
33     International Journal of Industrial Organization      8.5
34     Journal of Economic Dynamics and Control              8.1
35     Social Choice and Welfare                             7.3
36     Journal of Human Resources                            7.3
37     Journal of Banking and Finance                        6.7
38     Economic Inquiry                                      5.9
38     Western Economic Journal                              5.9
39     Journal of the American Statistical Association       5.5
40     Journal of Urban Economics                            5.3
41     Demography                                            5.1
42     J. Roy. Statist. Soc. Ser. B -- Meth.                 4.8
43     Journal of Economic History                           4.8
44     Oxford Bulletin of Economics and Statistics           4.4
45     J. Finan. Res.                                        4.2
46     Economica                                             4.1
47     Public Choice                                         3.9
48     J. Risk Ins.                                          3.8
49     J. Acc. Res.                                          3.7
50     European Economic Review                              3.6
51     Scandinavian Journal of Economics                     3.6
52     Econometric Theory                                    3.3
53     Journal of Comparative Economics                      3.1
54     Journal of Economic Behavior and Organization         2.7
55     Journal of Labor Research                             2.5
56     Explorations in Economic History                      2.4
57     Public Finance                                        2.2
58     Cato Journal                                          1.8
59     Journal of Development Economics                      1.8
60     American Journal of Agricultural Economics            1.8
61     Southern Economic Journal                             1.7
62     J. Legal Stud.                                        1.6
63     Journal of Macroeconomics                             1.4
64     Industrial Relations                                  1.1
65     Kyklos                                                1.1
66     Journal of Health Economics                           1.1
67     Canadian Journal of Economics                         1.1
68     Oxford Economic Papers                                1.0
69     Public Finance Quart.                                 1.0
70     British Jnl. of Industrial Relations                  0.8
71     Manchester School of Economic and Social Studies      0.8
71     Manchester School                                     0.8
72     J. Royal Stat. Soc. Ser. A -- Gen.                    0.7
73     World Economy                                         0.6
74     Population and Development Review                     0.6
75     National Tax Journal                                  0.6
76     Applied Economics                                     0.5
77     Quart. Rev. Econ. Bus.                                0.4
78     J. Int. Bus. Stud.                                    0.4
79     Journal of Forecasting                                0.4
80     Scottish Journal of Political Economy                 0.4
81     Reg. Sci. Urban Econ.                                 0.3
82     Mon. Lab. Rev.                                        0.3
83     Economic Modeling                                     0.3
84     Cambridge Journal of Economics                        0.3
85     Economic Development and Cultural Change              0.2
86     Land Economics                                        0.2
87     Weltwirtschaftliches Archiv                           0.2
88     Economic Record                                       0.2
Source: Laband and Piette (1994a) Table A2 final column. Equal ranking indicates a change of journal name.
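To make the use of these weights concrete, the sketch below contrasts an unweighted article count with an LP-weighted score built from four of the weights listed above. The perception-style weights and the departmental article counts in it are hypothetical placeholders, included only to illustrate the point made in footnote 46 that flatter weights yield less extreme rankings.

```python
# Four LP weights copied from the table above; the perception-style weights
# and the departmental article counts are hypothetical illustrations.
LP = {"American Economic Review": 100.0, "Econometrica": 89.0,
      "Economic Journal": 12.4, "Economic Record": 0.2}
PERCEPTION = {"American Economic Review": 100.0, "Econometrica": 95.0,
              "Economic Journal": 70.0, "Economic Record": 40.0}

articles = {  # hypothetical counts of articles by journal
    "Dept A": {"American Economic Review": 1, "Economic Record": 10},
    "Dept B": {"Econometrica": 1, "Economic Journal": 2},
}

def score(arts, weights=None):
    """Unweighted count if weights is None, otherwise a weight-sum."""
    return sum(n * (1 if weights is None else weights[j])
               for j, n in arts.items())

for dept, arts in articles.items():
    print(dept, "count:", score(arts),
          "LP:", round(score(arts, LP), 1),
          "perception:", round(score(arts, PERCEPTION), 1))
# Dept A leads on the raw count, Dept B leads under the steep LP weights,
# and Dept A leads again under the flatter perception-style weights.
```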