RAE results may not reflect true quality of UK research, warns chair

Concern that some disciplines will suffer as a result of tactical submissions. Zoe Corbyn reports

December 4, 2008

Academics should be "very careful" in the conclusions they draw from the research assessment exercise about the quality of research in their disciplines, the chair of an assessment panel has warned.

The results of RAE 2008 will be published in full in Times Higher Education on 18 December. They will determine the allocation of more than £1 billion a year in research funding and establish a new pecking order for research excellence.

But a chair of one of the 15 main RAE panels this week told Times Higher Education that academics could expect some "amazingly poor" showings in subject areas in which the UK was actually very good. Impressive showings are also likely in areas where the UK is relatively weak.

The discrepancies will be the result of tactical decisions taken by institutions to submit their best researchers into specific units of assessment, to maximise their chances of success, combined with a lack of data available on the proportion of a department's researchers submitted in each unit.

The panel chair said: "The fear for a discipline is that the results will be taken as, prima facie, that 'this subject in Britain is not very good', which is actually complete rubbish. It is actually quite good, only the good people have been submitted to another panel."

He added that the "biggest mistake" in the procedures for the RAE had been the failure to create submission rate data, which he said would have helped elucidate the problem.

Meanwhile, there is also concern about just how comparable results will be between the 15 main RAE panels. RAE 2008 has used a two-tier panel structure in which 67 subpanels have worked under the guidance of 15 main panels to ensure common quality levels and standards, but academics are now asking how consistent the approach has been across the main panels.

Dame Nancy Rothwell, who holds a Medical Research Council chair at the University of Manchester and sat on the pre-clinical and human biological sciences subpanel, said she had "no evidence" of bad practice but "hoped" main panel chairs had together been able to look across panels and moderate.

"It is difficult task. How do you compare sciences with social sciences?" she asked.

The Higher Education Funding Council for England said a series of main panel meetings was held throughout the assessment phase to discuss consistency.

But the main panel chair spoke to Times Higher Education about the difficulties involved.

"In theory, results are supposed to (be comparable) but between main panels (the comparison) is weaker. Conceptually, there is a huge problem. Is a piece of medical research better than a piece of economics research, better a piece of physics research? I can't construct a pair of scales to weigh that very easily."

He stressed his confidence that the research submitted to his subpanels was fairly evaluated.

zoe.corbyn@tsleducation.com

REF DELAYED OWING TO INCOMPLETE INSTITUTIONAL DATA

Plans to begin phasing in a system to replace the research assessment exercise from 2011 have been abandoned, after a pilot exercise found that the sector was not adequately prepared.

The Higher Education Funding Council for England said that the first sector-wide implementation of citation assessment within the research excellence framework, planned for 2010 to inform funding from 2011, is now "likely" to be developmental. It will inform only a "small element of funding", and only if found to be "sufficiently robust".

The REF will use a "family" of numerical indicators, such as the number of times an academic's research publications are cited by others, to measure quality, instead of relying solely on peer review.

The delay was announced last week at a conference on the REF at King's College London.

The pilot exercise asked 22 institutions to provide data on their academics' citations to help Hefce address many unanswered questions related to the design of the citations component of the system.

Graeme Rosenberg, who manages the pilot, said: "Data that institutions have about their publications are just not going to be complete enough across the sector to run a full round (of bibliometrics assessment) straight away."

He said Hefce was still committed to full implementation of the REF in 2013, to inform funding from 2014.
