Research Intelligence - Measured weights and measures
Australia's new funding chief aims for open access and a pragmatic, flexible ERA. Paul Jump reports
The Australian Research Council's recent indication that it will become the latest funder to institute an open-access mandate is one of the first acts of its new chief executive, Aidan Byrne.
Professor Byrne, a physicist and former dean of science and director of the College of Physical and Mathematical Sciences at the Australian National University, took up the ARC post in July. He has already begun consulting the country's universities on bringing the ARC's open-access policy into line with that of the country's other public research funder, the National Health and Medical Research Council. The latter announced in February that it would require papers arising from its funding to be made available in an open-access repository within 12 months of publication.
Professor Byrne welcomed the fact that the two organisations were collaborating on such "mutual concerns", but he rejected the suggestion that the ARC should absorb its cousin and bring clinical medicine and dentistry within its otherwise comprehensive funding remit.
"The two organisations are faced with some distinct issues and challenges, and the separate arrangements generally work well," he said.
The ARC is a statutory authority reporting to Australia's minister for tertiary education, skills, science and research. However, Professor Byrne said it had sufficient independence to advise the government and was guided in most matters primarily by its own advisory council of senior academics and business people, which he chairs.
But he endorsed the government's expressed view that "a more significant weighting should be placed on measuring the impact and application of research". He suggested that even though Australia's national research evaluation exercise, the ARC-administered Excellence in Research for Australia (ERA), already uses some measures of "research application", such as patents, industry funding and successful commercialisation, it might in future encompass "additional measures of research application, knowledge exchange and collaboration".
Such an evolution, he said, would be in keeping with the "pragmatism" that was "characteristic of Australian public policy".
Another example, he added, was ERA's embrace of both metrics and peer review. Each is drawn on differently by the various assessment panels, with the "robustness and comparability" of ratings "assured through moderation rather than any formulaic weighting of metrics".
"We simply refused to get too caught up in doctrinaire debates between bibliometricians and peer-review purists. Instead, the ARC engaged the broader research community to find an administratively efficient approach that would work for all disciplines," he said.
Despite this, the first ERA exercise, run in 2010 under Professor Byrne's predecessor, Margaret Sheil, was far from uncontroversial. Its use of journal rankings in particular was heavily criticised, both because of their alleged lack of accuracy in some disciplines and for the perverse incentive they presented to research managers to require academics to publish only in top-ranked journals.
Professor Byrne said that those "most engaged" with ERA had always been clear that it employed a wide set of indicators - including citation analysis, peer review, research income, and esteem and applied measures - which were "designed in consultation with the sector to drive positive behaviours and minimise...perverse incentives".
But he agreed that "a focus on journal rankings in some parts of the sector did lead to instances of their misuse". He was confident that their replacement in this year's ERA by lists of each unit of evaluation's most frequent publishing venues would allow "journal quality to continue to play a role while ensuring that ERA-related assessments of journal quality are not used inappropriately in other contexts".
Stocktaking on a national scale
ERA assesses all of a university's research produced in each disciplinary category during the previous six years, provided it passes a volume threshold. This is because the exercise is intended to be more of a stocktake of the nation's research - assessed against global standards - than a means of devising a pecking order for funding purposes.
Nevertheless, Australian newspapers were quick to produce rankings based on the 2010 results. At the time, Professor Sheil lamented the "crudeness" of such efforts, and Professor Byrne said that the ARC would do what it could in 2012 to ensure "the results are presented in a way that encourages others to use the data in ways that are meaningful and don't distort outcomes".
ERA results are also being used by the Australian government to inform the allocation of a small but growing proportion of universities' block grants for research. Professor Byrne said this was a marked improvement on the previous use of a formula that merely counted papers, making no assessment of quality, impact or disciplinary differences.
He said that 2010 ERA data were also being used by universities for strategic planning and by the government for its formulation of a national research investment plan. The assessment exercise's ability to "locate specific areas of research strength, identify opportunities to develop research capacity and allow for comparisons of research effort over time" made it an "ideal tool for aligning research strengths with institutional, regional and national priorities to maximise the benefits of public investment in research", he said.
Newspaper analysis of the 2010 results suggested that Australian work in social science and (to a lesser extent) humanities subjects was of a lower standard than that in the sciences. However, Professor Byrne said that the ARC had neither compiled aggregate scores for entire disciplines nor altered disciplinary funding levels based on ERA results.
Many hands make lighter work
Professor Byrne admitted that although the ARC was "confident" about the 2010 results in the humanities and social sciences, feedback suggested that some of the peer reviewers had been overloaded. This had prompted the council to recruit "hundreds" of additional evaluators for the 2012 exercise, the results of which are due to be published in December.
Changes have also been made to better accommodate multidisciplinary research, while the volume threshold for assessment has been raised to 50 outputs in light of strong concerns among many 2010 committee members about anomalous results arising from some small units.
As for the ARC's core grant-making activity, Professor Byrne admitted that it was a "challenge" to fund the "brightest researchers and research projects in the country at an acceptable level" in the face of a 42 per cent increase in applications over the past five years.
"The question of whether to fund more grants or provide more funding for fewer grants is a continuing debate," he said. But he added that success rates had remained at "comparable levels" during that time (the current success rate for the ARC's fundamental research stream is just under 22 per cent).
Professor Byrne would not be inclined to put further pressure on the system by submitting his own application, even if he were permitted to do so. He has parked his research into the use of gamma rays to probe atomic structure, at least for the duration of his initial five-year term.
"I am fully focused on and dedicated to the task at hand as chief executive of the ARC, which does not leave much time for research," he said.