Academic productivity in the field of ecology

Megan P. Keville,1 Cara R. Nelson,1 and F. Richard Hauer2

1Department of Ecosystem and Conservation Sciences, University of Montana, 32 Campus Drive, Missoula, Montana 59812 USA
2Center for Integrated Research on the Environment and Flathead Lake Biological Station, University of Montana, 32 Campus Drive, Missoula, Montana 59812 USA

Citation: Keville, M. P., C. R. Nelson, and F. R. Hauer. 2017. Academic productivity in the field of ecology. Ecosphere 8(1):e01620. 10.1002/ecs2.1620

Abstract. Although the broad field of ecology plays a central role in the natural sciences through its contribution to understanding the distribution and diversity of life on Earth, there is currently no comprehensive ranking of academic institutions for this discipline, which has quadrupled in research volume and visibility over the past three decades. We assessed scholarly productivity in the field of ecology for 316 North American academic institutions between the years 2000 and 2014. We present institutional rankings of productivity in terms of number of publications, number of citations, and Hirsch's h index, a measure that integrates productivity and impact. For the top-ranked institutions, we also calculated hm, an h-based metric used to compare productivity across institutions of different sizes. In addition, we analyzed the effect of institution size on h index, publication rate, and number of citations. We found that scholarly productivity in ecology does not differ significantly between public and private universities, or among institutions ranging in size from extremely large to relatively small. Many of the institutions in the "ecology top 20" are widely considered to be among North America's elite schools; however, several smaller institutions also have very high research productivity and impact in ecology, reflecting apparently very strong ecology faculty. Administrators, faculty, prospective faculty, and students use institutional rankings, along with other data, to make decisions about programs in which to invest or participate. Relative institutional strength in the field of ecology, however, has not been previously measured, perhaps because faculty with expertise in ecology are often dispersed across multiple academic units of a university. Thus, our findings fill an important gap for ecology as a discipline.

Key words: academic productivity; citation rate; ecology; h index; publication rate; scholarship.

Received 22 September 2016; accepted 24 October 2016.
Corresponding Editor: Michael F. Allen. Copyright: © 2017 Keville et al. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.   E-mail: [email protected]

INTRODUCTION

Understanding the relative academic strengths of educational institutions has become increasingly important as competition for research funding increases and recruitment of high-quality faculty and students becomes more intense. The use of quantitative indices based on faculty research performance is a common method by which both individuals and institutions are judged. Institutional comparisons can play a critical role in decision-making processes for a diversity of stakeholders. For instance, as university budgets shrink and as administrators work to allocate resources to programs that demonstrate high potential and leadership in their field, rankings of research productivity and impact can aid in identifying competitive and emerging programs and can assist faculty in understanding and communicating the value of their programs (Hicks 2009). In addition, indices of scholarly productivity could assist funders of research, such as government agencies and foundations, with strategic allocation of funds. These funders of ecological research stand to benefit greatly from a ranking that points them toward the academic and research institutions best equipped to meet their priorities. Likewise, prospective undergraduate and graduate students, as well as potential faculty members, need information to evaluate future educational and career options and to make educational choices with lifelong implications. Although institutional rankings are available for many disciplines, to date there is no comprehensive ranking of academic institutions in North America for the field of ecology, a scientific discipline whose global impact is increasing rapidly due to mounting interest in addressing the effects of land-use change, loss of biodiversity, climate change, and other anthropogenic impacts that place enormous stress on the world's ecosystems. Here, we provide the first ranking of North American academic institutions based on scholarly productivity in the field of ecology.

Journal publications are the primary means of communicating research results, and the number of journal publications and their citations are commonly the basis for indices of research productivity and impact. Fields such as medicine and economics have utilized standardized methods of ranking institutions by research productivity for decades (Laband 1985, Dusansky and Vernon 1998). The increased accessibility and utility of electronic databases, such as Thomson Reuters' Journal Citation Reports, which allow for tailored querying of specific journals, time periods, and institutions, have increased the potential for bibliometric analyses of other fields. For example, investigators have ranked institutions by their research output in the fields of conservation biology and forestry (Laband and Zhang 2006, Grant et al. 2007). The U.S. News & World Report and the National Research Council both utilize bibliometric analyses for their rankings of graduate schools by subject category (U.S. News & World Report 2014).

Despite the increased use of bibliometric analyses for the development of institutional rankings, the field of ecology has yet to be the subject of its own ranking, and instead has been included with other disciplines such as environmental science and evolution (U.S. News & World Report 2014). Given that the field has an over 100-year history (2015 marked the centennial anniversary of The Ecological Society of America, a major milestone in the advancement of the discipline), an assessment of institutional scholarly productivity in this field is overdue. Ecology, defined as the study of the processes that influence the abundance and distribution of organisms, the interactions among organisms and their environment, and energy transformations and fluxes (Cary Institute of Ecosystem Studies, http://www.caryinstitute.org/discover-ecology/definition-ecology), has experienced significant growth and diversification since the term "oekologie" was first proposed by German scientist Ernst Haeckel in 1869. The field has now expanded to address interactions between humans and their environment. For example, the Dynamics of Coupled Natural and Human Systems program at NSF is specifically designed to address complex human–natural system interactions at diverse scales. Recognition of the important contributions of ecology to increased understanding of human impacts on global systems, and to developing potential mitigation strategies, has increased the visibility of the field. In the United States and Canada, academic institutions produce the majority of the research and journal publications that advance the state of science and knowledge in the field. The importance of academic institutions to the field of ecology, coupled with the growing influence of, and demand for, rigorous ecology research, underscores the importance of ranking academic institutions in terms of ecological research productivity and impact.

Ecology also encompasses a broad range of organizational scales, from molecular to community to ecosystem levels, and uses a wide array of methodological approaches, including recent advances in genomics, spatial ecology, and restoration ecology. The diversity of scales and approaches means that researchers within the broad discipline of ecology are often found across a wide array of departments and colleges within academic institutions. For example, faculty and students engaged in ecological research can be found in departments including Agriculture, Biology, Biotechnology, Chemistry, Engineering, Ecology & Evolution, Geography, Geology, Mathematics, Natural Resources, Statistics, and many others. Even in institutions with a dedicated "Ecology Department," there are likely to be ecologists who are housed within other units on campus. Given both the breadth and dispersion of ecology research in academic institutions, it is generally not possible to assess an institution's strengths in ecology by examining any single department. Therefore, a synthetic approach, one that considers the contributions and strengths of ecologists across an institution as a whole, must be taken for comparisons among institutions. We reviewed the published literature in the top-ranked journals in ecology in order to present a comprehensive ranking of scholarly productivity among North American academic institutions and to determine the institutional characteristics, if any, that influence the rankings.

MATERIALS AND METHODS

We assessed scholarly activity in ecology over a 15-year period, from 2000 through 2014, for the 316 North American academic institutions most likely to have productive researchers in this field (sensu Grant et al. 2007). Institutional ranks were derived from publications and citations in the most influential journals in ecology. We first identified journals in the field of ecology using the Web of Science (Thomson Reuters 2014a) Ecology subject category. We then determined the most influential journals using Web of Science Impact Factor scores at two different time points (2005 and 2013), defining "influential journals" as the 40 journals with the highest impact factors (Appendix S1: Table S1). Next, we searched for articles published in these influential journals by institution. We used the 2014 Journal Citation Reports (Thomson Reuters 2014b) to generate three measures of scholarly productivity: (1) total number of publications for the 15-year period 2000–2014, (2) total number of citations by institution for the same period, and (3) Hirsch's h index (Hirsch 2005) for the period 2000–2009. The h index integrates research productivity and impact: it is the largest number h such that h of an institution's publications have each been cited at least h times. While a large number of citation-based indices for research productivity have been developed in the past few years, use of the h index has received support because of the ease with which it can be calculated as well as its relative fairness (Chapron and Huste 2006). Recent rankings in other science fields, including conservation biology (Grant et al. 2007) and physics and materials science (Lazaridis 2010), have been based on the h index. As with all widely used indices, the h index has been subject to critique and has inspired a number of "h-based indices" that aim to address some of its specific limitations (Kelly and Jennions 2006, Van Raan 2006, Schreiber et al. 2012, Swihart et al. 2016). Because the h index is a function of the number of publications, larger institutions that produce high numbers of publications may have a larger h index than smaller institutions regardless of the quality of the publications. To address this issue, Molinari and Molinari (2008) identified a universal growth rate for the h index and, using that growth rate, devised an equation that allows comparison of academic productivity across institutions of different sizes.

We calculated both h and hm indices for each institution by ranking the institution's publications by the number of citations to each publication. We chose to calculate the h index for the period 2000–2009 because more recent publications have not been in the literature long enough to accumulate a representative citation record. To determine publication rate (number/yr) for each institution, we divided the total number of publications by the total number of years in our sampling period. To control for differences in university size, we calculated hm = h/N^0.4, where h is the h index for an institution from 2000 to 2009, N is the number of ecology publications from the institution during that period, and the exponent 0.4 is the universal growth rate identified by Molinari and Molinari (2008).

Finally, we calculated Pearson's correlation coefficient to assess the relationship between overall university size, based on total number of full-time faculty, and three scholarship metrics: h index, publication rate, and number of citations. All analyses were conducted in R (R Core Team 2015).
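The two indices described above can be illustrated with a short sketch. This is not the authors' code (the paper's analyses were conducted in R), and the citation counts below are hypothetical; they are chosen only to show how h is computed and how the size correction hm = h/N^0.4 can let a smaller institution outrank a larger one.

```python
# Minimal sketch (hypothetical data, not the study's): Hirsch's h index
# and the size-corrected hm index of Molinari and Molinari (2008).

def h_index(citations):
    """h = largest rank r such that the r-th most-cited paper has >= r citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(ranked, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def hm_index(h, n_pubs, exponent=0.4):
    """hm = h / N**0.4, where N is the institution's publication count and
    0.4 is the universal h-index growth rate of Molinari and Molinari (2008)."""
    return h / n_pubs ** exponent

# Hypothetical citation counts for a larger and a smaller institution.
big = [40, 22, 18, 9, 7, 5, 5, 3, 2, 1, 1, 0]   # 12 publications
small = [30, 12, 8, 6, 2]                        # 5 publications

h_big, h_small = h_index(big), h_index(small)
print(h_big, h_small)  # → 5 4: the larger institution has the higher h
print(round(hm_index(h_big, len(big)), 2),
      round(hm_index(h_small, len(small)), 2))  # → 1.85 2.1: hm reverses the order
```

This mirrors the turnover reported in the Results: rankings can change substantially once h is corrected for institution size.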

RESULTS AND DISCUSSION

The ranking of North American academic institutions by scholarly productivity in the field of ecology reveals interesting, and sometimes unexpected, results (Table 1).

Table 1. Institutional bibliometric rankings in the field of ecology. The hm and h indices are based on 2000–2009 data; publication rate (pubs/yr) and citations are based on 2000–2014 data. Rows are ordered by hm index rank.

Institution | hm rank | h rank | h value | Pub. rate (pubs/yr) | Pub. rate rank | Citations | Citations rank
Univ. Calif Santa Barbara | 1 | 2 | 110 | 60.5 | 12 | 46,030 | 3
Univ. New Hampshire | 2 | 32 | 66 | 18.8 | 66 | 13,960 | 47
Univ. Maryland | 3 | 15 | 82 | 33.5 | 37 | 21,558 | 27
Univ. Minnesota | 4 | 5 | 96 | 61.9 | 9 | 38,838 | 8
Univ. Montana | 5 | 16 | 81 | 44.6 | 21 | 27,571 | 18
Univ. Alaska | 6 | 24 | 73 | 33.1 | 40 | 24,144 | 23
Duke Univ. | 7 | 4 | 100 | 71.4 | 6 | 43,218 | 5
Harvard Univ. | 8 | 8 | 89 | 60.3 | 13 | 37,124 | 9
Univ. Calif Irvine | 9 | 26 | 71 | 32.7 | 41 | 21,291 | 29
Brown Univ. | 10 | 44 | 59 | 18.7 | 67 | 11,261 | 58
Princeton Univ. | 11 | 19 | 76 | 39.0 | 28 | 25,652 | 20
Colorado State Univ. | 12 | 7 | 91 | 61.3 | 11 | 36,716 | 10
Stanford Univ. | 13 | 8 | 89 | 55.6 | 16 | 32,530 | 14
Yale Univ. | 14 | 27 | 70 | 43.1 | 25 | 20,490 | 31
Washington State Univ. | 15 | 48 | 58 | 19.8 | 63 | 13,292 | 49
Univ. Washington | 16 | 11 | 88 | 58.6 | 15 | 35,032 | 12
Univ. Connecticut | 17 | 44 | 59 | 20.1 | 62 | 16,191 | 41
Univ. Calif Berkeley | 18 | 3 | 103 | 95.4 | 2 | 55,473 | 2
Univ. Calif Santa Cruz | 19 | 19 | 76 | 43.1 | 25 | 23,166 | 24
Univ. Michigan | 20 | 18 | 77 | 45.7 | 19 | 26,355 | 19
Univ. Tennessee | 21 | 31 | 68 | 35.3 | 32 | 20,512 | 30
Univ. Colorado | 22 | 27 | 70 | 43.3 | 24 | 23,011 | 26
Utah State Univ. | 23 | 44 | 59 | 24.0 | 50 | 12,948 | 51
Univ. Calif San Diego | 24 | 32 | 66 | 32.3 | 42 | 17,255 | 37
Univ. Wisconsin | 25 | 6 | 93 | 72.2 | 5 | 45,437 | 4
Arizona State Univ. | 26 | 19 | 76 | 43.4 | 23 | 24,853 | 21
Univ. Calif Davis | 27 | 1 | 114 | 123.7 | 1 | 72,894 | 1
Oregon State Univ. | 28 | 11 | 88 | 64.5 | 8 | 35,764 | 11
Univ. British Columbia | 29 | 8 | 89 | 79.4 | 4 | 41,171 | 6
Univ. Arizona | 30 | 14 | 86 | 61.5 | 10 | 32,853 | 13
No. Arizona Univ. | 31 | 48 | 58 | 23.9 | 52 | 13,104 | 50
Univ. Chicago | 32 | 37 | 63 | 29.4 | 45 | 17,011 | 38
McGill Univ. | 33 | 29 | 69 | 44.4 | 22 | 23,067 | 25
Univ. Calif Los Angeles | 34 | 39 | 61 | 35.5 | 31 | 16,860 | 39
Univ. Laval | 35 | 39 | 61 | 33.5 | 37 | 14,644 | 43
Michigan State Univ. | 36 | 24 | 73 | 49.0 | 18 | 24,581 | 22
Univ. Georgia | 37 | 17 | 80 | 55.0 | 17 | 29,549 | 16
Penn State Univ. | 38 | 34 | 64 | 36.4 | 30 | 17,735 | 36
Univ. N Carolina | 39 | 34 | 64 | 37.8 | 29 | 20,386 | 32
Univ. Alberta | 40 | 29 | 69 | 45.7 | 19 | 21,466 | 28
N Carolina State Univ. | 41 | 39 | 61 | 34.2 | 35 | 16,215 | 40
Cornell Univ. | 42 | 11 | 88 | 83.4 | 3 | 39,550 | 7
Univ. Toronto | 43 | 23 | 75 | 59.0 | 14 | 28,356 | 17
Univ. Texas | 44 | 42 | 60 | 33.4 | 39 | 17,877 | 35
Univ. Calif Riverside | 45 | 34 | 64 | 35.3 | 32 | 18,154 | 34
Queens Univ. | 46 | 44 | 59 | 33.6 | 36 | 14,633 | 44
Univ. Florida | 47 | 19 | 76 | 66.4 | 7 | 30,753 | 15
Univ. Illinois | 48 | 42 | 60 | 34.8 | 34 | 15,141 | 42
Indiana Univ. | 49 | 37 | 63 | 40.0 | 27 | 18,688 | 33
Simon Fraser Univ. | 50 | 48 | 58 | 32.0 | 43 | 14,293 | 45


Among the "top 20" institutions, based on h index scores, there is considerable diversity in geographic location and university type: 41% are located in the eastern states vs. 59% in the west, and 18% are private universities vs. 82% publicly supported institutions (Table 1). When we evaluated institutions using the hm metric, some universities, both public and private, changed significantly in rank. For example, the University of California, Davis drops from a first-place ranking by h index to a 27th-place ranking by hm, while Cornell University drops from 11th to 42nd. We also observed that some smaller institutions moved dramatically in the hm rankings: the University of Montana moved from 16th place by h index to fifth place by hm, and Brown University moved from 44th place to 10th. Overall, there is turnover in 35% of the top 20 ranked institutions when moving between the two rankings.

We did not find significant relationships between any of the scholarship metrics and overall institution size (h index, r = 0.084, p = 0.562; publication rate, r = 0.156, p = 0.287; number of citations, r = 0.097, p = 0.562). This was unexpected, as we had assumed that larger universities, which support more faculty and graduate students, would have higher values for all metrics. There are several possible explanations for these observations in terms of faculty size. First, it is possible that, regardless of the size of the institution, a small subset of the faculty is responsible for the bulk of the ecological contributions in terms of publications and citations. Alternatively, the per capita ecological research output at select small universities may rival or even exceed that of larger universities, especially at smaller institutions with a history of ecological research as a strategic focus.

The important role played by academic institutions in the advancement of ecological research will only continue to grow with increasing recognition of the need to repair past damage to the Earth's ecosystems and to prevent further degradation and loss. As such, an assessment that reliably ranks the most productive institutions in terms of both research output volume and impact in the broad field of ecology can serve a variety of purposes. Knowledge of an institution's ranking within a field can assist administrators in understanding their institution's research strengths relative to peer institutions. This knowledge in turn can inform decisions about future program emphasis, allocation of academic resources, marketing of university programs, and fundraising. Data on rankings can also assist prospective faculty with decisions about job applications and offers, as the rankings reveal information about the strength of potential collaborators and the research culture of the institution. Graduate students may similarly find rankings useful as indicators of academic opportunities and program strengths. Although graduate students often apply to graduate programs to work with a specific faculty member in a specific research area, the academic culture of an institution plays a large role in a graduate student's experience, and research productivity is one important aspect of academic culture, although there are of course other key indicators. Because research enhances teaching, knowledge of an institution's relative ranking in scholarly contributions may also indicate the extent to which emerging concepts in the field are filtering into the classroom. In addition, federal agencies, nonprofits, and other groups that regularly collaborate with academic institutions for their ecological expertise may find value in rankings that help identify the institutions most deserving of their support.

Reliance on citation-based rankings, or on any single index, as a measure of institutional quality is not recommended; rankings provide valuable information, but they capture only one of the multiple dimensions that factor into academic excellence. Thus, while we do not advocate the use of these rankings by administrators, faculty, or students as a sole source of information, we do believe that institutional rankings can play an important and useful role in the suite of information that assists decision-making by any stakeholder. For example, graduate students interested in earning a degree from an institution that is highly productive in the field of ecology could use these rankings as a starting point in their graduate school search, but should also research individual advisors, programs, and other institutional characteristics as they make their decisions.
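The size-versus-productivity test reported in this section can be sketched as follows. This is an illustrative Python sketch with made-up numbers, not the study's analysis (which was conducted in R, where a function such as cor.test also supplies the p-value from a t distribution with n − 2 degrees of freedom).

```python
# Sketch of the size-vs-productivity test (hypothetical data, not the
# study's): Pearson's correlation between faculty counts and h index values.
import math

def pearson_r(x, y):
    """Pearson's r = cov(x, y) / (sd(x) * sd(y))."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical institutions: full-time faculty count vs. h index.
faculty = [2100, 950, 1600, 700, 1250, 400]
h_vals = [110, 96, 58, 100, 66, 88]

r = pearson_r(faculty, h_vals)
n = len(faculty)
# t statistic for H0: r = 0; its p-value would come from a t distribution
# with n - 2 degrees of freedom.
t = r * math.sqrt((n - 2) / (1 - r ** 2))
print(round(r, 3), round(t, 3))
```

As in the paper, a small |r| with a large p-value would indicate no detectable relationship between institution size and scholarly productivity.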


ACKNOWLEDGMENTS

Additional information about materials and methods is available in Appendices in Ecological Archives. Data are archived by the University of Montana Institute on Ecosystems, which provided funding. We thank Dr. Viktoria Wagner and two anonymous reviewers for edits on a previous version of this manuscript.

LITERATURE CITED

Chapron, G., and A. Huste. 2006. Open, fair and free journal ranking for researchers. BioScience 56:558–559.
Dusansky, R., and C. J. Vernon. 1998. Rankings of U.S. economics departments. Journal of Economic Perspectives 12:157–170.
Grant, J. B., J. D. Olden, J. J. Lawler, C. R. Nelson, and B. R. Silliman. 2007. Academic institutions in the United States and Canada ranked according to research productivity in the field of conservation biology. Conservation Biology 21:1139–1144.
Hicks, D. 2009. Evolving regimes of multi-university research evaluation. Higher Education 57:393–404.
Hirsch, J. E. 2005. An index to quantify an individual's scientific research output. Proceedings of the National Academy of Sciences of the United States of America 102:16569–16572.
Kelly, C. D., and M. D. Jennions. 2006. The h index and career assessments by numbers. Trends in Ecology and Evolution 21:167–170.
Laband, D. N. 1985. An evaluation of 50 ranked economics departments—by quantity and quality of faculty publications and graduate student placement and research success. Southern Economic Journal 52:216–240.
Laband, D. N., and D. Zhang. 2006. Citations, publications, and perceptions-based rankings of the research impacts of North American forestry programs. Journal of Forestry 104:254–261.
Lazaridis, T. 2010. Ranking university departments using the mean h-index. Scientometrics 82:211–216.
Molinari, J., and A. Molinari. 2008. A new methodology for ranking scientific institutions. Scientometrics 75:163–174.
R Core Team. 2015. R: a language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. https://www.R-project.org/
Schreiber, M., C. C. Malesios, and S. Psarakis. 2012. Exploratory factor analysis for the Hirsch index, 17 h-type variants, and some traditional bibliometric indicators. Journal of Informetrics 6:347–358.
Swihart, R. K., M. Sundaram, T. O. Höök, J. A. DeWoody, and K. F. Kellner. 2016. Performance benchmarks for scholarly metrics associated with fisheries and wildlife faculty. PLoS ONE 11:e0155097. https://doi.org/10.1371/journal.pone.0155097
Thomson Reuters. 2014a. Web of Science. http://wokinfo.com/
Thomson Reuters. 2014b. Journal Citation Reports. http://thomsonreuters.com/en/products-services/scholarly-scientific-research/research-management-and-evaluation/journal-citation-reports.html
U.S. News & World Report, L.P. 2014. Best graduate schools rankings. U.S. News & World Report. http://grad-schools.usnews.rankingsandreviews.com/best-graduate-schools
Van Raan, A. F. J. 2006. Comparison of the Hirsch h index with standard bibliometric indicators and with peer judgment for 147 chemistry research groups. Scientometrics 67:491–502.

SUPPORTING INFORMATION

Additional Supporting Information may be found online at: http://onlinelibrary.wiley.com/doi/10.1002/ecs2.1620/full
