The ‘damaging culture of league tables’
A damning report on the “damaging culture of league tables” in the UK was this week launched by the British Academy.
So it may come as a surprise that, as editor of the most influential and most widely cited of all university league tables, the Times Higher Education World University Rankings, I was happy to speak at the launch and to heartily endorse many of its recommendations.
The report from the Academy’s Policy Centre, Measuring Success: League Tables in the Public Sector, covers rankings of schools, police forces and universities, and it concludes that many “have all the appearance of being useful”.
“But they are often one-dimensional and can have perverse side effects that cause more harm than good,” it says.
Although the report is focused on the public sector, of which universities are not part, it nevertheless has valuable lessons for everyone involved in league tables, and it makes some very sensible recommendations.
A key recommendation of the report, written by Harvey Goldstein and Beth Foley, is for better evaluation of the league tables and for more discussion of their drawbacks “to ensure that they meet their aims and best serve policymakers, professionals and the general public”.
In my presentation at the launch, I explained how Times Higher Education took the decision to scrap the influential but flawed global rankings it published between 2004 and 2009, in order to develop a new and improved system with a new data provider, Thomson Reuters, which we have published since 2010.
We worked hard to listen to critics of rankings and to generate a public discussion of their weaknesses. We convened a special meeting of our editorial board to discuss concerns about rankings, and reported the outcome in our magazine. We hired a polling company to help us carry out a worldwide survey of university staff, asking them what they wanted and needed from rankers, and published the results. We used those results to help develop our plans, and submitted detailed proposals to a specially convened group of more than 50 leading figures from 15 countries. We ran a weekly column in the magazine for several months, explaining step by step how our thinking was developing. We wrote in other newspapers around the world, opened up uncensored discussion threads on our website and attended dozens of conferences to engage in open public debate about rankings.
Chairing a panel debate at the launch of this week’s British Academy report, Ian Diamond, vice-chancellor of the University of Aberdeen, was kind enough to acknowledge the improvements Times Higher Education made to its rankings for 2010 and beyond.
Professor Diamond, former chief executive of the Economic and Social Research Council, based in Swindon, said: “For many years I was ‘Outraged of Swindon’ every time that the rankings came out, on the grounds that in the initial years they did not benefit the arts and humanities and social sciences.
“I would like to say that Phil and his colleagues didn’t just file the ‘Outraged of Swindon’ letter, they actually invited me up and we had some very sensible conversations which in my view…certainly changed positively the rankings.”
Another core recommendation of the report is that rankings should be published with appropriate “health warnings”, making clear any limitations to the data or indicators.
Again, this is an approach I am delighted to endorse. When Times Higher Education unveils its rankings results each year, we publish the tables with reams of methodological information, explaining the logic of each indicator, making it clear to users where compromises have been made, where proxies are employed and what the tables – any tables – simply cannot capture.
Also in line with the report’s recommendations, Times Higher Education is committed to opening up our data to allow the user greater insight into the different indicators and how they combine into a single composite number.
The rankings are published on an interactive website which allows users to disaggregate our 13 indicators and look at the performance of institutions in five broad areas of performance. We also publish six subject-level tables as well as the results of our annual Academic Reputation Survey in isolation.
One of the things I am most proud of is that we have created a free world university rankings application for the iPhone and iPad, which I believe represents a major step forward.
Of course, we choose our indicators and weightings very carefully and only after lengthy consultation. But with the app, the weightings can be changed by the user to suit their individual needs. If you don’t agree with our weightings, you can set your own.
This not only makes the data more helpful to the user, but it also exposes how much individual ranking positions can change depending on the weights allocated to the indicators by the compilers.
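The point about weightings can be illustrated with a small worked example. The sketch below uses invented institutions, indicators and weights (not THE’s actual data or methodology): a composite ranking score is simply a weighted sum of indicator scores, so changing the weights can reverse the order of two institutions.

```python
# Minimal sketch of a composite ranking score as a weighted sum.
# All institutions, scores and weights here are hypothetical.

# Hypothetical indicator scores out of 100 for two made-up universities.
scores = {
    "University A": {"teaching": 90.0, "research": 60.0, "citations": 70.0},
    "University B": {"teaching": 65.0, "research": 85.0, "citations": 80.0},
}

def composite(indicators, weights):
    """Weighted sum of indicator scores; the weights should sum to 1."""
    return sum(indicators[name] * w for name, w in weights.items())

# Compiler-style weights favouring teaching...
teaching_heavy = {"teaching": 0.5, "research": 0.3, "citations": 0.2}
# ...versus user-chosen weights favouring research output.
research_heavy = {"teaching": 0.2, "research": 0.5, "citations": 0.3}

for weights in (teaching_heavy, research_heavy):
    ranked = sorted(scores, key=lambda u: composite(scores[u], weights),
                    reverse=True)
    print(ranked)
# Under the teaching-heavy weights University A comes first (77 vs 74);
# under the research-heavy weights the order flips (69 vs 79.5).
```

The same data, reweighted, produces a different table, which is precisely what letting users set their own weights makes visible.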
We publish the Times Higher Education World University Rankings to provide meaningful and useful information, in the right context, for everyone in the global academy. I am very proud of what we have achieved, but we will keep working and, most importantly, keep talking.
Phil Baty is editor, Times Higher Education World University Rankings