Academics should stop fixating on the editorial power fallacy

Complaints about journal editors’ decisions ignore the root cause of the research assessment problem: career structure, says Richard Sever   

October 24, 2019

One of the things life scientists learn very quickly is that they must publish or perish. In particular, if they want to land a faculty position, it is often claimed that they must secure a publication in Cell, Science or Nature – or, at the very least, a spin-off such as Molecular Cell or Nature Genetics.

This situation leads to frequent complaints about the immense amount of power it places in the hands of the editors of this relatively small group of “high-impact” journals, allowing them to determine what is the best science and, by extension, who are the best scientists – with little accountability for their judgements.

Some critics see selectivity itself as the problem, but others focus on the fact that high-impact journals are typically edited by professional editors who left academia early on, rather than by the active scientists who oversee most academic titles. The extraordinary demands these supposedly “failed” scientists place on authors (albeit at the behest of the academics who serve as referees) are resented, and their selection criteria for “novelty” are often dismissed as being informed more by hype or hipness than by academic rigour or reproducibility.

These editors are highly trained professionals, however, and it is debatable whether the judgements of academic editors are any less subjective or biased. Professional editors also insist that their role is simply to identify articles of broad general interest; they have never set themselves up as arbiters of who should get a job, nor claimed a monopoly on judgements of scientific excellence.


Nor, they add, are they responsible for the scientific community’s obsession with the much-maligned journal impact factor. That metric was designed by bibliometricians to help librarians decide which journals to buy, yet it is so frequently used by academics as a proxy for article quality that “impact factor” is the most common co-search term in Google for any journal, and “what is your impact factor?” is the question academics most frequently ask editors.

The editors are right in this respect (although they may want to chat with their colleagues in marketing, who have fuelled the obsession, and with their publishers, who have launched endless spin-offs to cash in on journal brands). And while the publishing process undoubtedly can and should be improved, griping about high-impact journals fails to recognise the root cause of the problem.


Because there is another thing life scientists learn early on: there are very few faculty jobs. Permanent academic positions await fewer than 5 per cent of doctoral graduates in the UK, for instance, so the vast majority of “trainees” will never do the job they are supposedly being trained for. This was not the case in the early days of molecular biology, when 61 per cent of US PhDs could get tenure-track positions. Why? Because the number of graduate students and postdocs churned out each year has exploded since then, with no commensurate increase in faculty positions. The average US faculty member now trains 13.6 PhDs over their academic lifetime, and there will be space for only one of them when they retire – which they are doing later and later.

The system is perpetuated in part because PhD students and postdocs are cheap labour, easily turned over every few years without the long-term burden of employment contracts, benefits and pay increases that comes with typical public-sector work.

Meanwhile, scientific output has increased significantly, generating more and more papers from increasingly fragmented subdisciplines. People don’t have the time or expertise to read and assess all the papers themselves – and they can’t wait years to see how important the work proves to be. When deciding who gets hired, promoted or funded, they must resort to leading indicators and signals that can be quickly and easily compared across fields. 

So when scientific grandees decry the modern obsession with high-impact journals (despite publishing in the very same high-impact titles themselves), they are failing to grasp the realities of the hypercompetitive environment that has supplanted the rosy past they recall, in which journal-agnostic quality judgements were made and science was more gentlemanly (in every sense of the word). Everyone was less concerned about editorial decisions and impact factors then because everyone had a good chance of getting a faculty job and research funding regardless of where they published.


This hypercompetition is also evident in ever-decreasing grant paylines and the flawed incentives that have led to recent reproducibility concerns. And unless we acknowledge it as the root cause of the editorial power problem, any reform to hiring practices risks simply replacing one flawed proxy with another.

It is right to lament the fact that where someone publishes has become more important than what they publish. But judging people on where or with whom they trained would arguably be worse. Article-level metrics are little better: they can be gamed and inevitably become targets. And anyone who thinks the assessment system should work more like Yelp or eBay ratings probably needs to use Yelp or eBay a bit more.

Real change will come only with a restructuring of academic careers. Forward-thinking PhD programmes that prepare candidates for a variety of different potential careers are a good start. A bolder and perhaps more necessary step would be to acknowledge that the postdoc as a training position for group leadership is a fiction, and to create more permanent staff positions below principal investigator level.

That would reward people’s scientific expertise honestly – rather than leaving them insecure well into their forties, worrying that their whole future depends on the whims of a journal editor.


Richard Sever is assistant director at Cold Spring Harbor Laboratory Press and co-founder of bioRxiv and medRxiv.
