Time to test private colleges' efficiency

Andrew McGettigan considers the findings of the National Audit Office’s investigation into alternative providers

December 11, 2014

Margaret Hodge will grill officials next week about what she has described as the “extraordinary rate of expansion and high dropout rates” at “alternative” higher education providers. On 15 December, representatives from Pearson, the Department for Business, Innovation and Skills, the Higher Education Funding Council for England and the Student Loans Company are expected to appear before MPs on the Public Accounts Committee, chaired by Hodge.

The obvious conclusion for the committee to draw from last week’s National Audit Office report, Investigation into Financial Support for Students at Alternative Higher Education Providers, is that BIS lost control of its “designation” scheme for alternative providers in 2012-13.

In that one year, the number of funded students at private colleges leaped from 13,000 to 33,000, with the associated loan and grant outlay climbing from £123 million to £421 million. It is clear that a slew of colleges with unfamiliar names and limited track records adopted Higher National qualifications and cashed in. And although BIS took emergency measures to halt the recruitment of funded students at the fastest-growing providers, growth continued: SLC provisional figures indicate that 53,000 private students benefited from £675 million in 2013-14.

It is the NAO’s new findings on dropout rates that have attracted the most attention in the pages of Times Higher Education (“Report fuels concerns over private colleges”, 4 December). The independent parliamentary body used SLC figures to establish that, on average, students at private colleges are three times more likely to quit their studies. Its report highlights the case of the London School of Science and Technology, where 999 students received £9.9 million in 2011-12, including £2.5 million in tuition fee loans.

Almost half its students (49 per cent) dropped out that year. That rate fell to 18 per cent in 2012-13, for 1,675 students. Another college, London Churchill, is revealed to have experienced dropout rates of 59 per cent and 30 per cent over the same period. Yet BIS reapproved both colleges for student support this autumn, raising questions about its “designation” process. How can we have confidence in quality assurance if there is no benchmark for retention rates and no sector body has responsibility for monitoring funded student progression?

More confusing and concerning, however, is the NAO’s revelation that in 2012-13 the SLC may have funded 3,000 students for Higher Nationals – 20 per cent of the total – who had never registered with Pearson. That would suggest that these students have received money but never submitted any coursework. What were they doing? Or is this a data management failure on the part of the SLC? The NAO notes drily that “no work has been undertaken by the oversight bodies into why there is this apparent discrepancy”.

That could be the report’s leitmotif. It is almost as if BIS fears to lift any more stones in case it might have to act on what it finds and deal with colleges that may be “too big to fail”.

BIS was advised in 2011 to bring in legislation so as to create the powers necessary to regulate alternative providers, but the department pressed on with its reforms even after the planned higher education bill was pulled in 2012. Its Heath Robinson jumbling of sector bodies, responsibilities and existing powers has left a gaping hole where oversight ought to be.

Elsewhere, the NAO’s investigation shines a light on new forms of agent-led recruitment. There has been no inquiry into the use of agents or recruitment practices more generally, despite BIS identifying “unusual patterns in applications for student support”. And although it withdrew funding from more than 5,000 students from the European Union during the course of 2013-14, BIS appears to have shrugged off the idea that this might inform designation decisions. The NAO quotes the party line: “Without evidence to the contrary, BIS could not hold the providers responsible for claims made by their students.” How hard has BIS looked for that evidence? One could begin with the online advertisements listed in an annex of the NAO report and then ask which recruitment agents were used by those 16 colleges that were home to 83 per cent of ineligible maintenance applicants.

On the question of academic standards, the NAO surveyed 23 alternative providers and ascertained that 11 set only “modest” language requirements, judged against IELTS standards for entry. Such speakers might only have “partial command of the language and are likely to make many mistakes”. A further three colleges set their entry bar lower, at “limited”, potentially enrolling students with “frequent problems in understanding and expression”. An established university would normally require an applicant to be at least “competent”. Has the SLC been funding overpriced language classes? Or is something else going on?

So what steps should MPs on the PAC take next? They should ask the NAO to conduct a “value for money” review aimed at “success rates” (how many qualifications are awarded), which no regulatory body monitors at present. Higher National expansion began in late 2012; those who received two years of full-time funding from then ought to start completing in significant numbers soon. The first test of private-sector efficiency will be how many qualifications result from the millions spent so far.
