Research paper ‘sloppiness’ on the increase, warns publisher

Editors have noted a “certain level of sloppiness” creeping into research papers, the executive editor of the Nature Publishing Group has said.

May 12, 2013

Speaking at the 3rd World Conference on Research Integrity, held in Montreal, Canada, from 5 to 8 May, Véronique Kiermer said a lot of errors that needed correction were “actually avoidable errors…and I think that is a very troubling trend”.

Although the publishing group’s 18 journals had seen no increase in the number of retractions per year, in contrast to academic publishing as a whole, the number of corrections issued had risen, said Dr Kiermer.

Directing her concerns mainly at the biomedical sciences, she listed problems with papers that included missing control tests, inappropriate and poor image manipulation, issues in experimental design and reporting, and problems with statistics.

“It’s not always that the information is wrong, it’s that it’s not described properly…So it’s both an issue of rigour and the design and execution of these experiments, but also precision in reporting these experiments so they can be interpreted properly,” she told the biennial conference. 

In response to the problem, Nature has announced efforts to raise the standard of reporting and transparency in the papers it publishes, she added.

This would include commissioning statistical reviews of some papers, making authors fill out a checklist relating to common problems and removing the limit on the length of articles’ methods sections.

Dr Kiermer acknowledged that in doing so the journal was just “scratching the surface” of a much bigger problem, which included issues of insufficient training, for example in the quantitative aspects of biology, as well as in mentoring.

She also flagged up other contributing factors, such as publication bias, the phenomenon whereby statistically significant, positive results are more likely to be published.

“It’s a very large problem and it needs a joint approach from all the members of the community,” she concluded.

Dr Kiermer also placed the falling standards NPG had observed in the context of concerns raised by recent industry studies about replicating academic data, although she said the two issues were “coincidental” rather than causally related.

A 2011 study carried out on behalf of the biotech firm Bayer found that results were fully or partially replicable in only 32 per cent of the academic studies its researchers tried to repeat. In a further study by the biotechnology company Amgen, industrial labs were able to fully confirm the original scientific findings in only 6 of 53 papers.

However, Dr Kiermer stressed that there were many good reasons why results might not be transferable from an academic lab to the scrutiny of drug development programmes.

elizabeth.gibney@tsleducation.com
