Post-92s leap up the TQA table as 'bias' is removed

November 17, 2006

Old universities are the big losers after teaching quality inspection data are adjusted to remove the link with research. Phil Baty reports

Cambridge, Warwick, York and Essex universities are among the institutions that have lost their places at the top of a teaching quality league table under a dramatic reworking of data for an academic journal.

Sheffield Hallam, Middlesex, West of England, Central Lancashire and Nottingham Trent universities all find themselves promoted to the top ten in an "adjusted" table of official teaching quality assessment scores.

The recalibrated table adjusts Quality Assurance Agency teaching quality assessment (TQA) scores to remove a general "bias" towards pre-1992 universities.

Adjustments also take account of grade inflation over time, the size of the department inspected and variations in the performance of different subject groups.

Sheffield Hallam University takes first place for teaching quality, up from 48th. Middlesex University leaps to second from 54th.

York, Cambridge, Warwick, Essex and Lancaster universities, as well as Imperial College London and the London School of Economics, lose their places in the top ten, which is now occupied largely by new universities.

The findings, which cover more than 1,000 inspection results published between 1995 and 2001, appear in the most recent edition of Quality in Higher Education.

Philip Garrahan, pro vice-chancellor for academic development at the newly elevated Sheffield Hallam, said: "The university welcomes and is delighted by the results of this research.

"Bearing in mind the dangers of deriving a single score from a complex review methodology, we nevertheless feel that the research validates our experience of providing high-quality learning to students from diverse backgrounds."

Geoffrey Alderman, who was pro vice-chancellor responsible for quality and standards at second-placed Middlesex University from 1994 to 1999, said: "At last, the true strength of Middlesex's teaching is being recognised - albeit in an obscure journal.

"A lot of us said at the time that the whole exercise was skewed - biased towards pre-1992 universities. It also put great emphasis on research as the essential foundation for teaching, when it does not have to be."

The paper, Recounting the Scores: An Analysis of the QAA Subject Review Grades 1995-2001, was written by a Scotland-based team of academics led by Robert Raeside, director of research at Napier University. It avoids analysis of Scottish institutions and focuses on the results for 83 institutions in England.

Under the old quality assurance system, all departments were inspected in what became known as "subject review" after 1995.

Departments were given marks out of four in six "aspects of provision" - including curriculum design and content, student achievement and learning resources - giving a total score out of 24.

In the paper, the researchers say that previous studies had revealed concerns that "subject review had been successful at little more than measuring institutional wealth". They argue that clear links between research performance and performance in teaching inspections had been established.

"This study has taken a relatively straightforward approach to account for four of the main biases to the published scores - departmental size, date of review, pre or post-1992 university and subject," the paper says. "The findings suggest that the official tables are misleading and that controlling for size and subject group significantly alters the ranking of the universities... effectively there was no control of the grading process over the six years this approach was used and the consequences were either ignored, or not detected, by the QAA."

An Essex University spokeswoman said: "The variability of subject review scores between disciplines and over time was a known factor of the previous QAA audit system.

"But Essex departments achieved high scores throughout that period, and the National Student Survey confirms that our students continue to be very satisfied."

Cambridge University declined to comment.

Roger Cook, a research team member from Paisley University, said: "What we did do was to take the QAA at its word - that is, that subject review is performance against your own stated objectives.

"On that basis, we then controlled for issues that had nothing to do with the objectives such as when you were visited, subject content, size of department, type of institution."

Dr Raeside said: "The paper is, in essence, a plea never to repeat such an exercise, especially with uncalibrated and unchecked data, either in the UK or elsewhere in Europe.

"We have added to the weight of evidence that £30 million of public money was wasted on this process up to 2001 - and that ignores stress and misdirected effort."

Peter Williams, QAA chief executive, said the paper would be useful to those researching subject review scores prior to 2001.

phil.baty@thes.co.uk

The old top ten

DOWN

York (down to 57th)

Cambridge (down to 13th)

Oxford (stays in 3rd)

Warwick (down to 15th)

Loughborough (down to 6th)

Essex (down to 55th)

Lancaster (down to 41st)

Imperial College (down to 12th)

LSE (down to 72nd)

Sheffield (down to 19th)

The new top ten

UP

Sheffield Hallam (previously 48th)

Middlesex (previously 54th)

Oxford (3rd)

Leeds (previously 43rd)

University of the West of England (previously 24th)

Loughborough (6th)

Central Lancashire (previously 55th)

Nottingham Trent (previously 46th)

Salford (previously 66th)

Southampton (10th)
