THE unveils broad, rigorous new rankings methodology

Tally of key indicators jumps from six to 13 in 2010 world league tables, reports Phil Baty

Details of the proposed new methodology for the Times Higher Education World University Rankings have been unveiled.

THE confirmed this week that it plans to use 13 separate performance indicators to compile the league tables for 2010 and beyond - up from just six under the methodology employed between 2004 and 2009.

The wide range of individual indicators will be grouped to create four broad overall indicators to produce the final ranking score.

The core aspects of a university's activities that will be assessed are research, economic activity and innovation, international diversity, and a broad "institutional indicator" including data on teaching reputation, institutional income and student and staff numbers.

"The general approach is to decrease reliance on individual indicators, and to have a basket of indicators grouped across broad categories related to the function and mission of higher education institutions," said Thomson Reuters, the rankings data provider. "The advantage of multiple indicators is that overall accuracy is improved."

THE announced last November that it had ended its arrangement with the company QS, which supplied ranking data between 2004 and 2009. It said it would develop a new methodology, in consultation with Thomson Reuters, and with advisers and readers, to make the rankings "more rigorous, balanced, sophisticated and transparent".

The first detailed draft of the methodology was this week sent out for consultation with THE's editorial board of international higher education experts.

They include: Steve Smith, president of Universities UK; Ian Diamond, the former chief executive of the Economic and Social Research Council and now vice-chancellor of the University of Aberdeen; Simon Marginson, professor of higher education at the University of Melbourne; and Philip Altbach, director of the Center for International Higher Education at Boston College. A wider "platform group" of about 40 university heads is also being consulted.

The feedback will inform the final methodology, to be announced before the publication of the 2010 world rankings in the autumn.

Indicators in detail

While the old THE-QS methodology used six indicators - with a 40 per cent weighting for the subjective results of a reputation survey and a 20 per cent weighting for a staff-student ratio measure - the new methodology will employ up to 13 indicators, which may later rise to 16.

For "research", likely to be the most heavily weighted of the four broad indicators, five indicators are suggested, drawing on Thomson Reuters' research paper databases.

This category would include citation impact, looking at the number of citations for each paper produced at an institution to indicate the influence of its research output.

It would also include a lower-weighted measure of the volume of research from each institution, counting the number of papers produced per member of research staff.

The category would also look at an institution's research income, scaled against research staff numbers, and the results of a global survey asking academics to rate universities in their field, based on their reputation for research excellence.
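As a rough illustration of the scaled measures described above - not THE's or Thomson Reuters' actual formula, and with all figures hypothetical - the three data-driven research indicators amount to simple per-paper and per-researcher ratios:

```python
# Hypothetical sketch of the scaled research indicators described in the
# article: citations per paper, papers per researcher, and research income
# per researcher. All input figures below are illustrative only.

def research_indicators(papers, citations, research_income, research_staff):
    """Return the three scaled research measures for one institution."""
    return {
        "citation_impact": citations / papers,            # citations per paper
        "productivity": papers / research_staff,          # papers per researcher
        "income_per_researcher": research_income / research_staff,
    }

# An invented example institution:
example = research_indicators(papers=4000, citations=52000,
                              research_income=180e6, research_staff=1500)
```

Scaling in this way stops large institutions dominating purely on volume, which is the stated rationale for dividing by staff numbers rather than using raw totals.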

"Institutional indicators" would include the results of the reputation survey on teaching excellence and would look at an institution's overall income scaled against staff numbers, as well as data on undergraduate numbers and the proportion of PhDs awarded against undergraduate degrees awarded.

For 2010, the "economic/innovation" indicator would use data on research income from industry, scaled against research staff numbers. In future years, it is likely to include data on the volume of papers co-authored with industrial partners and a subjective examination of employers' perceptions of graduates.

Institutional diversity would be examined by looking at the ratio of international to domestic students, and the ratio of international to domestic staff. It may also include a measure of research papers co-authored with international partners.

Ann Mroz, editor of Times Higher Education, said: "Because global rankings have become so extraordinarily influential, I felt I had a responsibility to respond to criticisms of our rankings and to improve them so they can serve as the serious evaluation tool that universities and governments want them to be.

"This draft methodology shows that we are delivering on our promise to produce a more rigorous, sophisticated set of rankings. We have opened the methodology up to wide consultation with world experts, and we will respond to their advice in developing a new system that we believe will make sense to the sector, and will be much more valuable to them as a result."

phil.baty@tsleducation.com

THE PROPOSED NEW RANKINGS METHODOLOGY

10% Economic activity/Innovation

Research income from industry (scaled against research staff numbers)

10% International diversity

Ratio of international to domestic students

Ratio of international to domestic staff

25% Institutional indicators

Undergraduate entrants (scaled against academic staff numbers)

PhDs/undergraduate degrees awarded

PhDs awarded (scaled)

Reputation survey (teaching)

Institutional income (scaled)

55% Research indicators

Academic papers (scaled)

Citation impact (normalised by subject)

Research income (scaled)

Research income from public sources/industry

Reputation survey (research).
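Taking the four category weightings from the table above, the overall ranking score would presumably be a weighted sum of the broad indicator scores. The sketch below assumes, purely for illustration, that each category score has already been normalised to a 0-100 scale; the combination method is not specified in the draft:

```python
# Illustrative combination of the four broad indicator scores under the
# proposed weightings. Assumes each category score is pre-normalised to
# 0-100; the normalisation step itself is not described in the article.

WEIGHTS = {
    "economic_innovation": 0.10,
    "international_diversity": 0.10,
    "institutional": 0.25,
    "research": 0.55,
}

def overall_score(category_scores):
    """Weighted sum of the four broad indicator scores (each 0-100)."""
    return sum(WEIGHTS[cat] * score for cat, score in category_scores.items())

# A hypothetical university strong in research but average elsewhere:
score = overall_score({
    "economic_innovation": 50,
    "international_diversity": 50,
    "institutional": 50,
    "research": 80,
})
```

With research carrying 55 per cent of the weight, a strong research performance moves the overall score far more than an equally strong showing in the 10 per cent categories.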

Readers' comments (2)

  • The Times Higher Education should drop its reputation surveys completely. In 'Anchoring effects in world university rankings: exploring biases in reputation scores', published in 'Higher Education' preprints, Nicholas A Bowman and Michael N Bastedo show that respondents to the Times Higher Education's reputation surveys were affected by THE's world university ranking. Furthermore, Bastedo and Bowman found that academics were influenced not by a university's ranking in their broad field, but by its overall rank. They conclude: 'Because reputational assessments are quite susceptible to anchoring effects, and because peer assessments of reputation are strongly correlated with other rankings indicators (Volkwein and Sweitzer 2006), reputation scores may add relatively little value to university rankings systems. So what, then, is the purpose of including reputation surveys in rankings formulas? From our perspective, the inclusion of reputation largely serves to maintain the status quo, establishing the credibility of the rankings and ensuring stability in results over time.' Bowman, Nicholas A and Bastedo, Michael N (2010) 'Anchoring effects in world university rankings: exploring biases in reputation scores', Higher Education, DOI 10.1007/s10734-010-9339-1.


  • Thanks Gavin,
    It is worth noting that the Bastedo study was based on the old reputation surveys carried out between 2004 and 2009 by THE's former rankings data provider, QS. We have openly acknowledged some serious weaknesses with the old reputation survey. See for example: http://www.timeshighereducation.co.uk/story.asp?storycode=410709
    THE is no longer using QS for its rankings data. From 2010 onwards, all the data will be provided by Thomson Reuters. Thomson Reuters has made some major improvements to the reputation survey, with a larger and much better targeted sample. Despite this, we acknowledge concerns about the use of a subjective measure like reputation, and we will reduce the weighting given to the survey in the 2010 rankings. The current plan is to reduce the weighting to 20 per cent.
