Australasian Plant Pathol. (2012) 41:179–187
DOI 10.1007/s13313-011-0106-2
Australasian Plant Pathology: an analysis of authorship and citations in the 21st century

M. C. Calver · P. A. O'Brien · M. Lilith
Received: 4 July 2011 / Accepted: 2 November 2011 / Published online: 16 November 2011
© Australasian Plant Pathology Society Inc. 2011
Abstract  To better inform editorial planning, we analysed Australasian Plant Pathology's (APP) authorship and readership 2001–2010. Authors came from Australia (57%), the Americas (Canada, USA and South American countries) (11%), New Zealand (7%), other Pacific and Asian countries (9%), Europe (5%) and other nations (11%), with the Australian contribution declining over the decade. Most authors were government employees (55% overall), but this category declined from 58% in 2001 to 46% in 2010. Academic authors (40% in 2001 to 49% in 2010) and other authors (2% in 2001 to 5% in 2010) increased. Using Scopus (December 2010), ≥73% of papers published between 2001 and 2007 were cited at least once, declining to 19% in 2010. Authors citing APP came from 114 countries (ISI Web of Science's cited reference feature, December 2010). Compared with 23 plant pathology journals over 2002–2007, APP ranked 15th across 10 journal usage statistics. In cluster analysis APP was closest to Journal of Phytopathology, Forest Pathology and Canadian Journal of Plant Pathology. Given its increasing proportion of authors from outside Australia, the many countries citing it and its usage relative to similar journals, APP makes a broad regional contribution with global recognition. The editorial challenge is to identify and solicit the 'new and significant work' that the journal web site claims to prioritise for publication.

Keywords  Australasian Plant Pathology · Citation · Bibliometrics · SNIP · SJR

M. C. Calver (*) · P. A. O'Brien · M. Lilith
School of Biological Sciences and Biotechnology, Murdoch University, Murdoch, Western Australia 6150, Australia
e-mail: [email protected]
Introduction

From modest beginnings as a newsletter for the Australasian Plant Pathology Society in 1989, Australasian Plant Pathology has developed into a significant regional journal publishing over 70 papers annually. Its aims have also broadened from an initial desire '…to help overcome the problems of communication which arise because of the vast distances which separate groups of Plant Pathologists in Australasia' (Taylor 1972). The journal now seeks to present '…new and significant research in all facets of the field of plant pathology. Dedicated to a worldwide readership, the journal focuses on research in the Australasian region, including Australia, New Zealand and Papua New Guinea, as well as the Indian, Pacific regions' (Springer 2011). As part of that shift to a worldwide readership, publication of Australasian Plant Pathology moved from CSIRO Publishing to Springer at the end of 2010.

With the shift in publisher, we deemed it appropriate to profile the journal's authorship and citations over the previous decade to determine whether or not it was achieving its aims of a global readership while giving particular prominence to plant pathology research in Australasia. We addressed these questions:

• considering country of origin and institutional affiliation, who publishes in Australasian Plant Pathology?
• are there trends over time in authorship, such as the country of origin of authors, the number of authors/paper or the affiliations of authors?
• how many papers are cited and how often?
• what is the country of origin of authors citing papers in Australasian Plant Pathology?
• is Australasian Plant Pathology's pattern of use changing over time?
• how does Australasian Plant Pathology's pattern of use compare to other journals, both international and regional, publishing similar content?
Methods

Authorship analysis

We assessed trends in Australasian Plant Pathology's authorship over the period 2001–2010, drawing data from Scopus (Elsevier 2010). We classified all authors on a paper, not just the first author, into one of the following countries/groupings: Australia, New Zealand, the Americas (including Canada, the USA and South American countries), Europe, Pacific/Asian countries other than Australia and New Zealand, and 'other' (including African nations and the Middle East). If an author gave two addresses, each from a different country, we recorded both countries, but two or more addresses from the same country were entered only once. We also noted whether an author's address indicated an academic affiliation (university, school or college), a government agency or 'other' (including private street addresses, non-government organisations and private businesses). Multiple addresses within an affiliation category for an author were entered only once, but where addresses in different affiliation categories were given, each address was entered. We also recorded the number of authors for each paper and used this to calculate the median authors/paper and the mean authors/paper in each year to describe authorship trends over time (Harrison 2006; Calver and Bryant 2008).
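The per-year tallies described above lend themselves to a short script. The following sketch, in Python, shows one way the country, affiliation and authors-per-paper summaries could be assembled from an exported record set; the file layout and field names (year, author_countries, author_affiliations, n_authors) are illustrative assumptions only, not the actual Scopus export used for this study.

import csv
from collections import defaultdict
from statistics import mean, median

def summarise_authorship(path):
    # Hypothetical export: one row per paper, with a semicolon-separated list of
    # author country groups and affiliation categories (one entry per author).
    countries = defaultdict(lambda: defaultdict(int))     # year -> country group -> author count
    affiliations = defaultdict(lambda: defaultdict(int))  # year -> affiliation category -> author count
    authors_per_paper = defaultdict(list)                 # year -> list of author counts per paper

    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            year = int(row["year"])
            authors_per_paper[year].append(int(row["n_authors"]))
            for c in row["author_countries"].split(";"):
                if c.strip():
                    countries[year][c.strip()] += 1
            for a in row["author_affiliations"].split(";"):
                if a.strip():
                    affiliations[year][a.strip()] += 1

    # Mean and median authors/paper in each year, used to describe authorship trends.
    trend = {y: (mean(v), median(v)) for y, v in sorted(authors_per_paper.items())}
    return countries, affiliations, trend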
Assessing journal pattern of use

Use of journals is assessed most commonly using citation data from one or more databases (e.g. Harzing and van der Wal 2008; Calver et al. 2010; Sainte-Marie 2010), although citations need not imply quality (Bloch and Walter 2001; Whitehouse 2002; Cameron 2005) and may underestimate the real usage of journal articles that are read and applied, but not cited (Bollen et al. 2009; see also Vanclay 2008 for an attempt to use Google Scholar to distinguish between citations in the journal literature and non-academic citations). We chose Scopus (Elsevier 2010) as our primary source because of its breadth of coverage (Pauly and Stergiou 2005; Meho and Yang 2007; Harzing and van der Wal 2008) and its inclusion of regional journals that are important outlets for locally applicable research in plant pathology.

In December 2010 we downloaded Scopus data for Australasian Plant Pathology between 2001 and 2010, including all citations for each paper from publication until that time. From these records we identified Australasian Plant Pathology's 15 most highly cited papers over the decade and the institutional affiliation and country of origin of their authors. We then calculated for each year the mean citations per paper (mean CPP), the median citations per paper (median CPP) and the proportion of papers in that year cited since publication as measures of journal use. We also took directly from Scopus two further annual measures of journal usage, SCImago journal rank (SJR) and Source Normalized Impact per Paper (SNIP). SJR and SNIP are recent measures, so for comparison we also present both the two-year and the five-year versions of the better known Journal Impact Factor (JIF) (Garfield 2006), available from Thomson Reuters' Journal Citation Reports (Thomson Reuters 2011) (see Table 1 for a fuller description of these journal evaluation metrics).

We assessed the geographic distribution of citations to Australasian Plant Pathology papers published between 2001 and 2010 using the 'cited reference search' and 'analyze results' functions in Web of Science (WOS). These functions allow one to specify the journal to be searched and the time period, before identifying all citations to papers from that journal in the given time range. The citations can then be scored for several criteria, including the country of origin of the citing authors. This avoids trawling the more comprehensive Scopus, but citations are only from journals included in WOS, possibly underestimating total citations. Furthermore, when considering a paper with multiple authors, WOS includes each country only once, irrespective of the number of authors from that country on the paper. For example, if a paper had two Australian authors, two New Zealand authors and one Fijian author, WOS would tally only one record for each country. Thus the final results probably deflate the relative contribution of the larger countries, while inflating that of the smaller ones (see the fuller discussion in Appendix 3 of Calver and Bryant 2008). We classified authors citing Australasian Plant Pathology as coming from: Australia, New Zealand, other Asia and Pacific, the Americas (including Canada, South America and the USA), Europe and other (including African nations and the Middle East).

Comparisons with other journals

We used online searches with the terms plant pathology, forest pathology, plant disease, phytopathology and plant protection to locate 35 journals other than Australasian Plant Pathology with a major focus on plant pathology. Twelve of these journals (Australasian Plant Disease Notes, African Journal of Plant Pathology, Asian Journal of Plant Pathology, Canadian Plant Disease Survey, European Journal of Forest Pathology, Hellenic Plant Protection Journal, Iranian Journal of Plant Pathology, Journal of Plant Protection Research, New Disease Reports, New Zealand Plant Protection, Plant Health Progress and Plant Protection Science) were either not listed in Scopus or had only very limited data (e.g. new journals), leaving 23 for use in a comparative analysis with Australasian Plant Pathology. We examined the period 2002–2007 because extending the analysis to more recent years would not allow sufficient time for many citations to accumulate, and Scopus did not include earlier data for all journals.
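As a concrete illustration of the three per-year usage measures defined above (mean CPP, median CPP and the proportion of papers cited), the following Python sketch computes them from a mapping of publication year to per-paper citation counts. The input structure is an assumption made for illustration, not the working dataset used in the study.

from statistics import mean, median

def yearly_usage(citations_by_year):
    # citations_by_year: mapping from publication year to a list of citation counts,
    # one entry per paper published in that year, e.g. {2001: [12, 0, 5, ...], ...}.
    summary = {}
    for year, cites in sorted(citations_by_year.items()):
        summary[year] = {
            "mean_cpp": mean(cites),
            "median_cpp": median(cites),
            "proportion_cited": sum(1 for c in cites if c > 0) / len(cites),
        }
    return summary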
Table 1 Metrics for assessing the usage of journals assessed in this paper. Each entry gives the metric, its description and the source of the data used in this paper.

SCImago journal rank (SJR). Source of data in this paper: Scopus (Elsevier 2010). The SJR (Jacsó 2010; Colledge et al. 2010) uses an algorithm similar to the 'page rank' employed by the Google internet search engine to prioritise internet sites corresponding to a search term. Its critical distinguishing features are assessment of the quality of the citing sources (a citation from a respected source carries more weight than a citation from a less respected source), partial exclusion of self-citations and use of a three-year time window in assessing a journal in a given year. Jacsó (2010) considers these points, plus the basing of SJR on the broad Scopus database, to be significant advantages.

Source Normalized Impact per Paper (SNIP). Source of data in this paper: Scopus (Elsevier 2010). SNIP (Moed 2010) corrects for variation in citation potential across research fields, providing a metric that offers the best opportunity to compare journals in different disciplines.

Journal Impact Factor (JIF). Source of data in this paper: Journal Citation Reports (Thomson Reuters 2011). The JIF (Garfield 2006) for a journal in a given year (x) is the ratio of citations to the journal in the 2 years prior to x, divided by the number of papers published by the journal in those 2 years. It is published by Thomson Reuters in their Journal Citation Reports. Critics argue that the two-year time interval in the JIF is too short for many disciplines where citations accumulate slowly and that Thomson Reuters' Web of Science database (WOS), from which the citation data are drawn, is too narrow and therefore unrepresentative (Nisonger 2004; Cameron 2005; Harzing and van der Wal 2008). In response, since 2007 Thomson Reuters have included a five-year impact factor calculation in their Journal Citation Reports.

h-index. Source of data in this paper: Scopus (Elsevier 2010). The h-index is a measure for assessing individual research performances (Hirsch 2005) and adapted for comparative assessment of journals (Braun et al. 2005; Calver and Bryant 2008). Papers are ranked from the most highly cited to the least highly cited. h is the highest ranked paper where the number of citations equals or exceeds that paper's rank.

g-index. Source of data in this paper: Scopus (Elsevier 2010). Egghe (2006) developed the g-index to assess individual research performances in which more weight is given to highly cited papers. We follow Harzing and van der Wal (2008) and Calver and Bryant (2008) in applying it to comparisons amongst journals. Papers are ranked in order of citations as for the h-index, before the cumulative sum of the citations is compared against the rank squared. The highest rank whose square is equalled or exceeded by the cumulative number of citations of all papers to that rank determines g.

Mean citations/paper (mean CPP). Source of data in this paper: Scopus (Elsevier 2010). The mean CPP (Harzing and van der Wal 2008) is simple to calculate for any time period, but it may be biased by a small number of papers with very high citations (Calver and Bradley 2009).

Median citations/paper (median CPP). Source of data in this paper: Scopus (Elsevier 2010). The median CPP was suggested by Calver and Bryant (2008) as an alternative to the mean CPP that is not biased by extreme values. Calver and Bradley (2009) showed that it correlates strongly with mean CPP but that it reduces the magnitude of differences between journals.

% International collaboration. Source of data in this paper: SCImago (http://www.scimago.es/). The SCImago is a collaborative group of Spanish researchers focused on information analysis (http://www.scimago.es/). Working from Scopus data, they indicate annually the percentage of papers published by individual journals with authors from multiple countries ('% international collaboration'). The values are often volatile, but this can be dampened by averaging them over several years.

Proportion of papers cited. Source of data in this paper: Scopus (Elsevier 2010). This is the proportion of all papers published over the period evaluated that have been cited at least once.
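To make the h-index and g-index definitions in Table 1 concrete, the following Python sketch computes both from a list of per-paper citation counts for a single journal. It follows the ranking rules given in the table and is illustrative only.

def h_index(citations):
    # Rank papers from most to least cited; h is the highest rank at which the
    # paper's citations equal or exceed its rank.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    # g is the highest rank whose square is equalled or exceeded by the cumulative
    # citations of all papers up to that rank.
    ranked = sorted(citations, reverse=True)
    g, cumulative = 0, 0
    for rank, cites in enumerate(ranked, start=1):
        cumulative += cites
        if cumulative >= rank ** 2:
            g = rank
    return g

For example, a journal whose five most cited papers have 10, 8, 5, 4 and 1 citations has h = 4 and g = 5 under these definitions.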
We collected data in December 2010 and calculated 10 measures of journal usage: h-index for the entire period, g-index for the entire period, SJR for 2007, mean SJR 2002–2007, SNIP for 2007, mean SNIP 2002–2007, mean % international collaboration 2002–2007, mean CPP, median CPP and the proportion of papers cited (see Table 1 for details). These variables were measured on different scales, so we range-standardised them between 0 and 1, allowing each an equal impact. Values were then summed to calculate a single composite index of usage (a similar approach to that used
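The range-standardisation and summation described above can be sketched as follows. The Python sketch assumes scores maps each journal to its 10 raw metric values in a fixed order; this input format is an assumption for illustration rather than the exact worksheet used in the study.

def composite_index(scores):
    # scores: {journal name: [raw value for each of the 10 usage measures]}.
    journals = list(scores)
    n_metrics = len(next(iter(scores.values())))
    standardised = {j: [0.0] * n_metrics for j in journals}

    for i in range(n_metrics):
        values = [scores[j][i] for j in journals]
        lo, hi = min(values), max(values)
        for j in journals:
            # Range-standardise each measure to [0, 1] so that every measure has
            # equal impact; if all journals tie on a measure it contributes 0.
            standardised[j][i] = 0.0 if hi == lo else (scores[j][i] - lo) / (hi - lo)

    # Sum the standardised values to give a single composite index per journal.
    return {j: sum(vals) for j, vals in standardised.items()}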
Results

Authorship analysis

Over the entire period 2001–2010 authors came from Australia (57%), the Americas (Canada, USA and South American countries) (11%), New Zealand (7%), other Pacific and Asian countries (9%), Europe (5%) and other nations (11%) (Table 2). There was a significant association between country of authorship and year (χ² = 327.6, d.f. = 45, p < 0.001).

Table 2 Number (%) of authors of papers published in Australasian Plant Pathology each year, 2001–2010, by country/region, with the significance of the trend in each proportion over time (S10, p)

Region              2001      2002      2003      2004      2005      2006      2007      2008      2009      2010      Total      Trend
Australia           128 (78)  195 (76)  167 (60)  228 (66)  237 (55)  225 (63)  138 (44)  148 (46)  133 (45)  164 (50)  1763 (57)  S10 = −29, p = 0.0046
New Zealand         16 (10)   17 (7)    19 (7)    14 (4)    27 (6)    15 (4)    16 (5)    28 (9)    47 (16)   32 (10)   231 (7)    S10 = 11, p = 0.190
Americas            14 (8)    8 (3)     27 (10)   39 (11)   40 (9)    26 (7)    26 (8)    48 (15)   51 (17)   52 (16)   331 (11)   S10 = 25, p = 0.014
Other Asia/Pacific  2 (1)     25 (10)   13 (5)    16 (5)    62 (14)   22 (6)    37 (12)   49 (15)   32 (11)   28 (9)    286 (9)    S10 = 18, p = 0.054
Europe              3 (2)     3 (1)     19 (7)    15 (4)    19 (4)    22 (6)    34 (11)   15 (5)    12 (4)    3 (1)     145 (5)    S10 = 6, p = 0.300
Other               2 (1)     9 (4)     34 (12)   31 (9)    44 (10)   48 (13)   63 (20)   34 (11)   19 (6)    46 (14)   330 (11)   S10 = 21, p = 0.036
Total               165       257       279       343       429       358       314       322       294       325       3086

Assessing journal pattern of use

Using the Scopus database, the percentage of papers cited at least once was 86% or higher each year for 2007 and earlier, declining to 19% in 2010 (Table 4). Median CPP declined from a high of five in 2001 to 0 in 2010, while mean CPP declined from 7.7 to 0.2 over the same period. Of course, papers published in recent years have had less time to accumulate citations.
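For readers wishing to reproduce the test of association between country of authorship and year reported above, the following Python sketch applies a standard chi-square contingency test (scipy.stats.chi2_contingency) to the 6 × 10 table of author counts in Table 2. Whether this matches the exact procedure used by the authors is an assumption.

from scipy.stats import chi2_contingency

def country_by_year_test(counts):
    # counts: 6 x 10 nested list of author counts (rows: country/region groups,
    # columns: years 2001-2010), e.g. the count columns of Table 2.
    chi2, p, dof, expected = chi2_contingency(counts)
    # With 6 groups and 10 years, dof = (6 - 1) * (10 - 1) = 45.
    return chi2, p, dof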