CHAPTER 34
The University Research Rankings' Landscape through the Eyes of a Library Professional
Ruth A. Pagell, University of Hawaii
[email protected]
For Collnet, November 2015
Abstract
The globalization of the education industry has been accompanied by a growing number of league tables designed to evaluate the performance of research institutions. A country's, institution's or researcher's desire to be in the top tier of the rankings has had an impact on national R&D and educational policies and on institutional hiring practices, and has put pressure on faculty to publish in "A" journals and even to promote themselves on social media. This article is a compilation from a monthly column for e-Access Newsletter, an Asian information industry newsletter. It focuses on explaining the rankings and the rankers to information and library professionals. Since most of the underlying data come from Thomson Reuters' Web of Science or Elsevier's Scopus, and librarians are often the holders of the subscriptions, they need to understand bibliometrics and their derivatives to support their decision makers. The paper looks at the intersection of policies, bibliometrics and rankings to provide the reader with the tools needed to interpret any ranking or new metric that is being devised by the bibliometricians.
Keywords: Bibliometrics, Rankings, Universities, Research
INTRODUCTION
Rankings of scholarly output have gone from counting pages in journals in specific subject areas to becoming a worldwide phenomenon. The globalization of the higher education industry has fueled the growth in the number of rankings, articles about rankings, and metrics, which in turn affect policy, funding, researchers, and library and information professionals. The intersection of these rankings with information professionals and library and information science research lies in the use of bibliometrics as a major rankings component. Librarians, as the keepers of the subscriptions to underlying sources such as Thomson Reuters' Web of Science and InCites and Elsevier's Scopus, need to have some understanding of the strengths and weaknesses of the different rankings and their research and scholarly components.
Over the past year, I have been writing columns for e-Access Newsletter which go into detail about individual rankings. For the purpose of these proceedings, I synthesize those columns, focusing on the broader picture of the rankings' landscape and incorporating information from the individual rankings articles that will help readers interpret any ranking they may encounter. Scholars in the U.S. have been publishing research rankings since the first half of the 20th century. In 1925, Raymond Hughes published "A Study of Graduate Schools of America" on behalf of the American Council on Education (ACE); the Council is still interested in rankings today. Hughes' study rated 19 graduate departments in the U.S., primarily Ivy League private universities and the major mid-western state universities. Fourteen of his top 15 universities appear today on some top-15 list (Hughes, 1925). Studies carried out through the mid-20th century used metrics such as publications and page counts in addition to qualitative peer review. The American Educational Research Association sponsored research rankings of American professional schools in the 1970s based on the judgment of deans in each field, and today some rankings still incorporate peer review (Blau & Margulies, 1974). By the 1960s Eugene Garfield recognized that his Science Citation Index could be used as an evaluation tool, and articles using citation metrics began to appear in the 1970s for a limited number of universities and disciplines (Garfield & Sher, 1963; Garfield, 1979). These studies were by scholars and for scholars.
POLICIES AND ACCOUNTABILITY
The earlier rankings from U.S. News and World Report and other individual country-level rankings were designed to support student choice. Today's rankings are used by policy makers and funding agencies to inform national educational policy and decision making. Policies are designed to enhance and evaluate the quality of higher education and to distribute R&D expenditures. Bibliographic metrics are an integral part of the new global rankings because of the growing availability of tools to measure scholarly output. Policy makers use the metrics from the global rankings as a way to improve a nation's scientific and technical position on the world stage. At a country level, governments are interested in creating and evaluating world-class universities. The Higher Education Funding Council for England (HEFCE) is one of the better known assessment bodies, as were its Research Assessment Exercises from the 1990s, which were replaced in 2014 by a new Research Excellence Framework (REF). "HEFCE distributes public money for higher education to universities and colleges in England, and ensures that this money is used to deliver the greatest benefit to students and the wider public" (HEFCE). Japan, China, Taiwan and South Korea have developed excellence programs to enhance research and attract top students. Since the end of the 20th century, these countries have all targeted a limited number of universities to receive special funding. For example, in the late 1990s China introduced Projects 211 and 985, mainly designed to develop Chinese universities into top global ranking positions in the 21st century (Chinese Education Center). According to Nature (May & Brody, 2015), more than a quarter of the publications in the Nature Index 2014 come from the Asia-Pacific region, with China, Japan, South Korea, Australia, India, Singapore and Taiwan among the world's top 20. An Australian policy note, "Government Research Funding in 2014 in Selected Countries", noted that four fifths of the world's research and development (R&D) expenditure is now found in the USA, China, Japan and the European Union. According to this report, in 2014 the USA spent the most of any country on R&D,
followed by China, Japan, Germany, South Korea, France, the UK, India, Russia and Brazil. Spending on R&D in China is 60% of that in the USA, and it is predicted that China will pull ahead by 2022 (Group of Eight, 2014). India lags behind in both policy and expenditure, having abandoned the Universities for Research and Innovation Bill proposed in 2012. Deloitte released the 2014 ASHE, Annual Status of Higher Education of States and UTs in India, printed in conjunction with the Ministry of Human Resource Development and the Confederation of Indian Industry, in October 2014. It provides an in-depth analysis of the status, needs and initiatives required to "catapult this sector to the next level" (Deloitte, 2014). Nature (2015) also includes an analysis of the problems with India's funding model. Table 34.1 shows gross domestic expenditures on R&D (GERD) and GERD as a percentage of GDP, with world ranks on both measures, for the top ten spenders and selected other countries, using the most recent data from the National Science Foundation.

Table 34.1: Science and Engineering Indicators (NSF, Feb 2014). International comparisons of gross domestic expenditures on R&D and R&D share of gross domestic product, by region/country/economy (NSF Table 4.4)

| Region/country/economy | GERD (PPP $ millions) | GERD/GDP % | Rank $ | Rank % |
|---|---|---|---|---|
| United States (2011) | 429,143.0 | 2.85 | 1 | 10 |
| China (2011) | 208,171.8 | 1.84 | 2 | 19 |
| Japan (2011) | 146,537.3 | 3.39 | 3 | 4 |
| Germany (2011) | 93,055.5 | 2.88 | 4 | 8 |
| South Korea (2011) | 59,890.0 | 4.03 | 5 | 2 |
| France (2011) | 51,891.0 | 2.24 | 6 | 13 |
| United Kingdom (2011) | 39,627.1 | 1.77 | 7 | 20 |
| Russian Federation (2011) | 35,045.1 | 1.09 | 8 | 32 |
| Taiwan (2011) | 26,493.1 | 3.02 | 9 | 7 |
| Brazil (2010) | 25,340.2 | 1.16 | 10 | 30 |
| India (2007) | 24,305.9 | 0.76 | 12 | 28 |
| Sweden (2011) | 13,216.2 | 3.37 | 15 | 5 |
| Switzerland (2008) | 10,525.2 | 2.87 | 16 | 9 |
| Israel (2011) | 9,822.7 | 4.38 | 17 | 1 |
| Finland (2011) | 7,634.8 | 3.78 | 20 | 3 |
| Singapore (2011) | 7,060.2 | 2.23 | 21 | 13 |
| Denmark (2011) | 7,052.4 | 3.09 | 22 | 6 |

Table notes: GERD = gross domestic expenditures on R&D; GDP = gross domestic product. Data are from the OECD, converted to US dollars using PPP (purchasing power parity).

Even before the release of the first global rankings in 2003, multi-national groups recognized a need to impose standards of accountability on the rankings themselves.
A group of experienced rankers and ranking analysts from Europe and the U.S. met in 2002 to create the International Ranking Expert Group, now the IREG Observatory on Academic Rankings and Excellence (IREG, 2015). IREG, the UNESCO European Centre for Higher Education (UNESCO-CEPES) and the U.S.-based Institute for Higher Education Policy (IHEP) met in 2006 and developed the Berlin Principles for rankings and league tables, which cover:
A) Purposes and Goals of Rankings
B) Designing and Weighting Indicators
C) Collection and Processing of Data
D) Presentation of Ranking Results
The guidelines aim to ensure that "those producing rankings and league tables hold themselves accountable for quality in their own data collection, methodology, and dissemination" (Berlin, 2006). In 2008, the OECD launched the Feasibility Study for the International Assessment of Higher Education Learning Outcomes (AHELO), which was designed to gauge "whether an international assessment of higher education learning outcomes that would allow comparisons among HEIs across countries is scientifically and practically feasible." The final results are presented in several publications (OECD, 2013). Seventeen countries, representing five continents, are included in the study. Incorporating both the Berlin Principles and the AHELO learning outcomes, the European Commission, Directorate General for Education and Culture, funded the development of a new system, U-Multirank, launched in 2014 (European Commission, 2014). This controversial new player is neither a ranking nor a measurement of world-class universities; in its 2015 iteration, it is primarily useful for a European market (Pagell, 11 June 2015). Evidence-based bibliometrics used in global research rankings appeal to policy makers and are often questioned by academics. The next step is to learn more about the most commonly used bibliometrics, the ranking organizations, and what to look for when interpreting rankers' methodologies.
RANKING METRICS: USING BIBLIOMETRIC METHODOLOGY
Early rankings were criticized for being primarily output measures where bigger is better, for favoring scientific fields, for including only English-language publications, and for ranking a small number of universities. Research centers such as the Centre for Science and Technology Studies (CWTS) and the Spanish National Research Council (Consejo Superior de Investigaciones Cientificas, CSIC) have been experimenting with a wide range of new measures, as has Thomson Reuters. In some rankings, bibliometrics make up only a small part of the weightings. Many other metrics have been added, and the basic counting has morphed into sophisticated algorithms. Although most bibliometric rankings start from the same datasets of Thomson Reuters' Web of Science or Elsevier's Scopus, their results differ. In order to understand the rankings it is important to understand what is measured and how it is measured: "Ranking positions depend more on methodology than performance" (Schmoch, 2015).
The basic metric for all bibliometrics is publications, identified for individual authors and their institutions. Answering the simple question "Which university had the most publications between 2010 and 2013?" requires specifying the types of publications, disambiguating author names, and setting rules for which institutions are included and how they are named.
Publications: Counting publications looks straightforward and leads to the following common metrics:
- the number of publications attributed to an institution;
- the number of times other publications cited those articles; and
- the number of these in the top 1% or 10% of their fields.
Variables for publications: Publications are the underlying metric, even in those rankings that do not use them directly. To understand the differences among the rankings, we need to determine the definition of a qualifying publication. Among the document types in Web of Science and Scopus are scholarly articles, reviews, proceedings or conference papers, books or book chapters, editorials and letters. Not all types are included in most rankings.
Variables for institutions: How are institutions chosen for inclusion? Are only higher education institutions included? What constitutes a higher education institution? Some rankings include the Chinese Academy of Sciences; most do not. Are institutions included based on the entire system, a single location or an individual publishing unit? For example, some rankings aggregate the Indian Institutes of Technology. How many institutions are ranked? Does an institution have to have a minimum number of publications to be included?
Variables for authors: Once the rankings have standardized definitions for publications and institutions, how do the rankers match authors' publications with institutions? Disambiguation of author names is still a problem, especially with common names.
Variables for counting: A concept that has become more prevalent is that of size dependency. Is the number of publications size dependent or size independent? After assigning qualifying papers to the correct institution, how are the articles counted? In the United States, the University of Michigan has over 4,000 faculty and the Massachusetts Institute of Technology has fewer than 1,000. Not surprisingly, over the past five years Michigan published more scholarly articles than MIT; the total is dependent on the size of the faculty. Calculated as articles per faculty member, a size-independent measure, MIT has over three times as many articles per faculty as Michigan.
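To make the distinction concrete, the following is a minimal sketch in Python using invented institution names and publication and faculty figures; none of the numbers come from any ranking or database.

```python
# Illustrative comparison of size-dependent and size-independent output measures.
# The institutions and figures below are hypothetical.

institutions = {
    # name: (publications over five years, faculty headcount)
    "Large University": (25_000, 4_000),
    "Small Institute": (12_000, 900),
}

for name, (pubs, faculty) in institutions.items():
    per_faculty = pubs / faculty  # size-independent measure
    print(f"{name}: total publications = {pubs} (size-dependent); "
          f"publications per faculty = {per_faculty:.1f} (size-independent)")

# The larger institution leads on the raw total, while the smaller one
# leads once output is scaled by the size of its faculty.
```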
Another counting issue is how to count articles with multiple authors. The globalization of higher education over the past twenty years has changed the trends in academic publishing. Early versions of the citation indexes only presented the first author. Today multiple authors, from multiple institutions in multiple countries, write one article and list authors' names alphabetically. For example, an article on body-mass index for Asian populations published in The Lancet ten years ago lists 28 authors from four western countries and 13 Asia-Pacific countries, representing over 20 institutions. Is this article counted 28 times, or is it "fractionalized" across all authors and institutions? Some rankings give full credit to each institution; others fractionalize the credit; some give you a choice. As an example, if being in the top 100 in Asia were important to Thailand's Mahidol University, it would use a "total publications" ranking, which places it in one top-100 list; it falls out of the top 100 using a fractionalized count.
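The difference between the two counting schemes can be shown with a minimal sketch. The single paper and its institutional affiliations below are invented for illustration, and real rankers may split credit by institution or by country rather than by author as done here.

```python
from collections import Counter

# One hypothetical multi-authored paper; each author is tagged with an
# institution (names invented for illustration).
paper_author_institutions = ["Univ A", "Univ A", "Univ B", "Univ C", "Univ C", "Univ C"]

# Whole (full) counting: every institution appearing on the paper receives credit 1.
whole = {inst: 1.0 for inst in set(paper_author_institutions)}

# Fractional counting: the single paper is split equally among its authors,
# and each institution receives the sum of its authors' shares.
share = 1.0 / len(paper_author_institutions)
fractional = Counter()
for inst in paper_author_institutions:
    fractional[inst] += share

print("whole counting:     ", whole)             # each institution gets 1.0
print("fractional counting:", dict(fractional))  # shares sum to 1.0 across institutions
```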
Citations: While there are now many open access sources for citations, such as Google Scholar, the rankers depend on either Web of Science or Scopus.
Variables for citations: The body of work considered for citations depends on the definitions above of qualifying articles and their distribution among multiple institutions. The most important variable in evaluating citation counts is the subject area; a criticism of the earliest global rankings was that they did not factor in these differences. For example, out of 251 WOS fields, since 2004 Biochemistry and Molecular Biology has over thirteen million citations, with over 70% of documents cited, while Information Science & Library Science has 272,000, with 23% cited (InCites, 22 June 2015). Instead of measuring total citations or average citations per article, QS measures citations per faculty. To account for these differences, rankers now apply normalization algorithms that take into account subject, year and country; a minimal sketch of this kind of normalization appears at the end of this section. Another way that rankers handle these differences is to include separate rankings by broad subject areas.
Highly cited papers: The number of times scholarly publications cite an article is not enough for some rankers. In order to clearly identify excellence, "highly cited papers" may be a separate category. The criteria for being highly cited depend on both the broad topic and the publication date. The most highly cited Indian paper in chemistry, published in 2009, had 1,400 citations, while the most highly cited Indian paper in agricultural science, published in 2005, had 259 citations (from Essential Science Indicators, searched 22 June 2015). Finally, the output measures are affected by the time periods used by the rankers, which can vary from eleven years to the current year.
Other metrics: Some rankings use only bibliometric indicators. Some still incorporate peer review, based on survey data. Others include indicators such as Nobel Prize winners, percent of international faculty and students, and metrics designed to measure teaching. Even where similar bibliometrics are used, each indicator is weighted differently, which produces different results.
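To make the idea of normalization concrete, here is a minimal sketch in the spirit of a mean normalized citation score, dividing each paper's citations by an expected value for its field and publication year. The field baselines and papers are invented and do not reproduce Web of Science, Scopus or any ranker's actual algorithm.

```python
# Minimal sketch of field- and year-normalized citation impact: each paper's
# citation count is divided by an expected (world average) citation rate for
# the same field and publication year. All values below are hypothetical.

field_year_baseline = {
    ("Molecular Biology", 2012): 18.0,  # invented world-average citations
    ("Library Science", 2012): 2.5,
}

papers = [
    {"field": "Molecular Biology", "year": 2012, "citations": 36},
    {"field": "Library Science", "year": 2012, "citations": 5},
]

normalized = [
    p["citations"] / field_year_baseline[(p["field"], p["year"])] for p in papers
]

# Both papers score 2.0: each was cited twice as often as an average paper in
# its own field and year, even though the raw counts differ widely.
print(normalized)                          # [2.0, 2.0]
print(sum(normalized) / len(normalized))   # mean normalized score for the set
```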
THE RANKERS
Shanghai Jiao Tong University produced the first global ranking of universities' scholarly output in 2003. Those 2003 rankings were based on Nobel Prize and Fields Medal winners, publications in Nature and Science, highly cited researchers, and the universities with the most papers
in Web of Science. Since then, I have identified nine other open access sources of global rankings, plus the new Web of Science InCites platform, which allows you to do your own rankings, for a price (Pagell, 2015). Each ranking has a unique web interface, and some are more flexible than others. While the policy makers and funders are analyzing the results, information professionals should be looking at the rankings' interfaces to see what they are actually providing and what options they give your institution. In an article of this length, it is only possible to give you some tips on what to look for in each ranking. Before you use any ranking, check the methodology; some are more transparent than others. Table 34.2 lists the different rankers, with number one in the world, in Asia and in India in the most recent ranking (as of 22 June 2015), and the number of the individual overview article for each. Tables 34.3 and 34.4, in the Appendix following the References, list the website for each ranking and each article about the ranking. Use the following checklist when you look at each interface. Check the rankings to see if they:
- provide just a rank, a composite score, individual scores per rank, or both. It is important to see the differences between rankings: in QS the top 10 universities differ by only 3.5 points, but in Shanghai Jiao Tong the difference is over 40 points;
- provide individual ranks to all of the institutions listed. Shanghai Jiao Tong only provides individual ranks for the top 100 worldwide and the top fifty by field;
- make the raw data underlying the rankings available;
- let you easily identify the weightings of each metric.
Check the interface:
- Does it have separate rankings by region or country, and can you customize the countries for comparison? Ranking Web of Universities has a grouping just for East and Southern Asia.
- Does it have rankings by fields, and are those fields relevant to your institution?
- Can you select the type of institution when more than universities are included?
- What, if anything, can you download, and in what format? Leiden provides its entire dataset.
- How many years of rankings are available?
- Does it have visualization tools?

Table 34.2: Comparing World Rankings (accessed 22 June 2015)
| Name of Ranking | Ranking Date | Top World | Top Asia | Top Indian in World Rankings | Ruth's Article |
|---|---|---|---|---|---|
| Composite scores | | | | | |
| Times Higher Education World's Best Universities 2014-2015 | 10/14 | California Institute of Technology | University of Tokyo (2015) | Indian Institute of Science | 5 |
| QS World University Rankings – Top Universities 2014-2015 | 9/14 | Massachusetts Institute of Technology | National University of Singapore (2015) | Indian Institute of Technology Bombay | 5 |
| ARWU – World Top 500 Universities | 8/14 | Harvard University | University of Tokyo | Indian Institute of Science | 6 |
| National Taiwan University | 10/14 | Harvard University | University of Tokyo | Indian Institute of Science | 6 |
| Ranking Web of Universities (1) | 1/15 | Harvard University | National Taiwan University | Indian Institute of Technology Bombay | 9 |
| Best Global Universities Rankings (US News) | 10/14 | Harvard University | University of Tokyo | University of Delhi | 11 |
| U-Multirank Readymade Rankings: Research | 3/15 | Rockefeller University | Weizmann Institute | Indian Institute of Technology Roorkee | 12 |
| Individual indicators | | | | | |
| Nature Index Global: Weighted Fractional Count (2) | 6/15 | Chinese Academy of Sciences | Chinese Academy of Sciences | Indian Institute of Science | 7 |
| Nature Index Asia Pacific (rolling year): Papers Published (3) | 6/15 | | | Indian Institute of Science | 7 |
| Scimago Institutions Rankings: Excellence Rate, Higher Education | 07/2014 | | Kyoto University of Education | Gandhigram Rural Institute (India) | 8 |
| Leiden Rankings: PP Top 10% | 5/15 | Massachusetts Institute of Technology | Weizmann Institute of Science | Indian Institute of Technology Roorkee | 8 |
| Thomson Reuters InCites: Total Publications | 22/6/15 | University of London | Chinese Academy of Sciences | Indian Institute of Science | 10 |

Table notes:
1. Issued two times a year.
2. Updated regularly throughout the year.
3. Monthly updates throughout the year; based on individual academic institutions; subscription service.

If you search long and hard enough, almost anyone can be number one at something. India's Aligarh Muslim University has the most InCites publications in mathematics. According to U-Multirank, the Asian Institute of Technology was number one in international orientation. Using Ranking Web of Universities, Bogor Agricultural University (Indonesia) was number one in "Openness" in Asia.
Conclusion
On one hand, authors and analysts criticize the rankings for the impact they are having on higher education. On the other hand, bibliometricians continue to come up with new metrics that address some of the criticisms and create more options. Not every institution can or should aspire to be a top research university in the world. There is unrealized potential in something like U-Multirank, if more institutions participate, or Ranking Web, or even altmetrics, if we expand the concept of impact, to offer options for universities of all sizes and goals. It is our role as information and library professionals, at a minimum, to know about the various rankings, to know where to find their methodologies and to understand the functionalities of the various websites. At best, we should know the mission and vision of our institutions and which, if any, of these rankings are useful for their benchmarking and planning.
REFERENCES
American Council on Education, accessed 20 June 2015 at http://www.acenet.edu/Search/Pages/results.aspx?k=rankings.
ASHE - Annual Status of Higher Education of States and UTs in India 2014 (October 2014). Deloitte, accessed 23 June 2015 at http://www2.deloitte.com/content/dam/Deloitte/in/Documents/IMO/in-imo-annual-status-of-higher-education-2014-noexp.pdf.
Berlin Principles on Ranking of Higher Education Institutions (2006), accessed 20 June 2015 at http://www.che.de/downloads/Berlin_Principles_IREG_534.pdf.
Blau, P. M. & Margulies, R. Z. (1974). A research replication: The reputations of American professional schools. Change, Winter (74-75), 42-47.
Centre for Science and Technology Studies, Leiden, updated 22 June 2015; accessed 24 June 2015 at http://www.cwts.nl/Home.
Chinese Education Center, Project 211 and 985, accessed 20 June 2015 at http://www.chinaeducenter.com/en/cedu/ceduproject211.php.
Consejo Superior de Investigaciones Cientificas (CSIC – Spanish National Research Council), accessed 24 June 2015 at http://www.csic.es/home.
Council of Europe (2014). Higher education and research, accessed 20 June 2015 at http://www.coe.int/t/dg4/highereducation/default_en.asp.
European Commission (13 May 2014). New international university ranking: Commission welcomes launch of U-Multirank. Press release, accessed 24 June 2015 at http://europa.eu/rapid/press-release_IP-14-548_en.htm.
Garfield, E. & Sher, I. H. (1963). New factors in the evaluation of scientific literature through citation indexing. American Documentation, 14(3), 195-201.
Garfield, E. (1979). Is citation analysis a legitimate evaluation tool? Scientometrics, 1(4), 359-375. DOI:10.1007/BF02019306.
Group of Eight, Australia (2014). Policy note: Government research funding in 2014 in selected countries, April 2014, accessed 20 June 2015 at https://go8.edu.au/sites/default/files/docs/publications/policy_note__government_research_funding_in_2014_in_selected_countries_final.pdf.
Higher Education Funding Council for England (HEFCE), accessed 20 June 2015 at http://www.hefce.ac.uk/about/.
India (26 March 2015). Nature, 519, S86-S87, accessed 28 June 2015 at http://www.nature.com/nature/journal/v519/n7544_supp_ni/full/519S66a.html.
IREG Observatory on Academic Rankings and Excellence, accessed 20 June 2015 at http://ireg-observatory.org/en/.
Institute for Higher Education Policy (IHEP), accessed 20 June 2015 at http://www.ihep.org/.
May, M. & Brody, H. (26 March 2015). Nature Index 2015 Asia-Pacific. Nature, 519, S49, at http://www.nature.com/nature/journal/v519/n7544_supp_ni/full/519S49a.html.
National Science Foundation (February 2014). Science and engineering indicators 2014, accessed 24 June 2015 at http://www.nsf.gov/statistics/seind14/index.cfm/etc/tables.htm.
OECD (March 2013). Background documents for the AHELO Feasibility Study Conference, accessed 24 June 2015 at http://www.oecd.org/site/ahelo/backgrounddocumentsfortheahelofeasibilitystudyconference.htm.
Pagell, Ruth A. (2015). InCites' benchmarking and analytics capabilities. Online Searcher, 39(1), 16-21.
Pagell, Ruth A. (11 June 2015). Ruth's rankings 12: U-Multirank: Is it for "U". e-Access Newsletter, accessed 20 June 2015 at http://librarylearningspace.com/ruths-rankings-12-u-multirank-u/.
Research Excellence Framework (15 December 2014), accessed 24 June 2015 at http://www.ref.ac.uk/about/.
Schmoch, Ulrich (2015). The information value of international university rankings: Some methodological remarks. In Welpe, I. M., Wollersheim, J., Ringelhan, S. & Osterloh, M. (Eds.), Incentives and performance: Governance of research organizations (pp. 141-154). Switzerland: Springer International Publishing. DOI: 10.1007/978-3-319-09785-5.
For those library and information professionals interested in more information about policy and the history of rankings, check the bibliographies in:
European Journal of Education (March 2014). Special issue: Global university rankings. A critical assessment. http://onlinelibrary.wiley.com/doi/10.1111/ejed.2014.49.issue-1/issuetoc
Pagell, R. A. (2014). Bibliometrics and university research rankings demystified for librarians. In Chen, C. & Larsen, R. (Eds.), Library and information sciences: Trends and research (pp. 137-160). Berlin-Heidelberg: Springer. DOI 10.1007/978-3-642-54812-3.
APPENDIX
Table 34.3: URLs for Rankings

| Ranking | URL |
|---|---|
| Academic Ranking of World Universities (ARWU) | http://www.shanghairanking.com/ |
| Times Higher Education (THE) | https://www.timeshighereducation.co.uk/world-university-rankings/ |
| QS (Quacquarelli Symonds) | http://www.topuniversities.com/qs-world-university-rankings |
| National Taiwan University (formerly HEEACT) | http://nturanking.lis.ntu.edu.tw/Default.aspx |
| Ranking Web of Universities | http://www.webometrics.info/en |
| U.S. News & World Report Best Global Universities | http://www.usnews.com/education/best-global-universities/rankings?int=9cf408&int=9cf408 |
| U-Multirank | http://umultirank.org/#!/home?trackType=home |
| Leiden Rankings | http://www.leidenranking.com/ranking/2015 |
| Nature Index Global | http://www.nature.com/nature/supplements/nature-index-2015-global/ |
| Nature Index Asia-Pacific | http://www.nature.com/nature/supplements/nature-index-2015-asia-pacific/?utm_source=NPI&utm_medium=email&content=EN&utm_campaign=NatureIndex |
| Scimago Institutions Rankings | http://www.scimagoir.com/ |
Table 34.4: URLs for rankings’ articles Article Rankings Number 5 THE and QS 6
ARWU and NTU
7
Nature Indexes
8
Scimago and Leiden
9
Ranking Web of Universities
10
Thomson-Reuters Incites U.S. News & World Report U-Multirank
11 12
URL http://librarylearningspace.com/ruths-rankings5-comparing-times-higher-education-qsrankings/ http://librarylearningspace.com/ruths-rankings6-scholarly-rankings-asian-perspective/ http://librarylearningspace.com/ruths-rankings7-asian-institutions-grow-nature/ http://librarylearningspace.com/ruths-rankings8-something-everyone/ http://librarylearningspace.com/ruths-rankings9-expanding-measurement-science-citationsweb-visibility-tweets/ http://librarylearningspace.com/ruths-rankings10-rankings-incites/ http://librarylearningspace.com/ruths-rankings11-u-s-news-world-report-goes-global/ http://librarylearningspace.com/ruths-rankings12-u-multirank-u/
From Jain, P. K., Kar, D. C., Kretschmer, H. & Babbar, Parveen (2015). Emerging Trends and Issues in Scientometrics, Informetrics and Webometrics. New Delhi: Ane Books Pvt. Ltd., pp. 315-322. http://anebooks.com/bookdetails.asp?id=1989. See also the PDF of the expanded PowerPoint of this paper/talk presented to the Library Association of Singapore, 24 November 2015.