Publish or be damned

Pressure to have research accepted by the 'right' journals in order to get on damages scholarship, warns Dennis Tourish

December 16, 2010

A spectre is haunting business schools everywhere - that of journal rankings. More than most disciplines, we have embraced the notion that the quality of scholarship can be measured by where it is published rather than its content. The Financial Times has a favoured list of 45 journals and uses faculty publication in them as one measure of the worth of MBA programmes. The Association of Business Schools (ABS) in the UK publishes an evaluation of journals, and grades them using the scoring criteria of the research assessment exercise. Australia and others have followed suit.

Increasingly, these lists are used to micromanage the academic research effort, with scholars steered to publication in preferred journals - and lambasted for their inadequacy if they do not achieve it. When advertising jobs, many business schools require publication within the 94 journals that have been given a quality rating of 4 in the ABS' Academic Journal Quality Guide. Others specify publication within a "world elite" group of 22 journals (a 4* rating). One business school insists that probationary staff must publish at least one paper in an ABS 4 journal to have their appointment confirmed.

This distorts scholarship by devaluing work in journals that are not ranked or those lower down the list.

The most lauded journals are based in the US, and reflect the positivist and functionalist orthodoxy that dominates the discipline there. They pay relatively little attention to such problems in management theory and practice as exploitative working conditions, race or ethics. UK academics who wish to publish in such outlets - and few succeed - have to adapt to their priorities.

Journals themselves are competitive and emulate the habits of those already successful. More now insist that each paper must "make a distinctive theoretical contribution", rather than creatively employ existing theory, prioritise replication in order to eliminate rogue findings, or address the needs of practitioners. Particular forms of theorising and research thus become privileged over others.

The elite journals have a rejection rate, typically, of more than 90 per cent. For no good reason, this rate is used to justify the quality of a journal and becomes a target for others to emulate. Editors have become judge, jury and, mostly, executioner.

But even if you dodge that bullet, an arduous obstacle course remains: increasingly critical reviewer comments, a long process of revisions before resubmission, then another barrage of criticism. The publications that emerge from this are not always improved.

Where will this end? Perhaps a journal will announce that it has achieved a rejection rate of 100 per cent - nothing by anybody is considered good enough to grace its pages.

Here is another endpoint. Australia's University of Queensland has just announced its "Q-Index". This measures an individual's research income, research publication (weighted by reference to journal lists), higher-degree completions and research degree advisory loads. A Q-Index - to two decimal places - is produced, and compared with average scores at university, faculty and school levels.

It is also compared against all staff in an academic's faculty at the same appointment level and is open to inspection by managers. Essentially, people become numbers: I am a 9.22, you are a 10.33, she is a 12.34.

I can imagine few better methods of reducing intrinsic motivation, extending managerial oversight and demoralising people. It is part of the commercialisation of the academy, often justified in terms of "accountability". Such is the trajectory on which journal rankings, whatever their original intent, have placed us.

On this issue, business schools are at the cutting edge of bad ideas. It is time for a rethink. Above all, I urge a moratorium on journal-ranking lists. Meekly accepting constraints on where we publish limits academic freedom. It means that what we research, how we do it and what methods we employ are driven by criteria at odds with what is important or interesting. Our careers are dehumanised, our scholarship damaged. This is a spectre whose haunting days should most definitely be terminated.
