Sexing up the science

Journals want 'no ifs and buts' research, but Tim Birkhead wants honesty

December 18, 2008

I was asked recently to take part in a radio programme. The producer had found a science story that she wanted to discuss. I hadn't heard of the study, but it sounded interesting and I was sufficiently intrigued to agree to take part.

I tracked down the original paper, which had been published a couple of years earlier in a reputable but fairly low-impact journal. When I read it, it didn't seem all that original.

Searching the journal's online contents pages, I noticed that a critique of the paper had been published, highlighting its lack of originality, its overblown results and a number of inconsistencies, much the same points I had noticed on first reading the paper. I then saw that the author of the original paper had published a response.

To my amazement, the author openly admitted sexing up his study: first, by making it sound as though he had started with a hypothesis that he had set out to test, and second, by overstating the results so that they appeared rather better than they really were.

I couldn't decide whether this was commendable (if somewhat delayed) honesty or sheer stupidity. Either way it was very odd, but at the same time it confirmed a suspicion held by many researchers of my generation: that hyping up results is on the increase and seems to pay off.

A hierarchy of scientific journals exists, reflected in their impact factors, and researchers scrabble to get their work published in the highest-ranked titles because of the weight this carries in our assessment-based academic culture.

Typically, these journals have a rejection rate of well over 90 per cent, and to be published one has to have discovered something of general importance and to have written it up in a particularly convincing manner. It is also well known that these journals favour clean results with no ifs, buts or provisos. Is it any wonder then that some researchers are tempted to emphasise rhetoric over results and tell an overly slick story?

But it is not only in high-ranking journals that this occurs. Sexing up now pervades the whole range of journals, as exemplified by the paper I had been invited to discuss on radio. It is also driven by another phenomenon: the methods-free journal. Recent years have seen a plethora of new journals that publish bite-sized bits of research, uncluttered by the details of just how the study was conducted. To be honest, methods sections are not totally absent, but they are typically much reduced, downgraded (often in microscopic fonts) or relegated to online supplementary material.

A year or so ago at a conference, I heard a biologist describe a study in which the results were interesting but unconvincing because the methodology was sloppy and the sample sizes ridiculously small. The inadequate methodology meant that the study was inconclusive and open to multiple interpretations. Later, by sheer chance, I was sent the same paper to referee for one of these "methodless" journals. The manuscript was based on exactly the same information as the talk, but skilfully dressed up so that all possible wrinkles had been smoothed out.

It was a referee's dilemma: I told the editor what I knew and left them to decide whether to accept the paper. Methods-free journals increase the chances for opportunistic dishonesty: it is rather like a department store advertising that its CCTV security cameras will be switched off between certain hours to save electricity.

Increased competition in academia is creating a misleading aura of apparently clear-cut results. In truth, science is rarely cut and dried. Most studies, especially in biology, contain a few inexplicable glitches. I try to encourage my students and postdocs to publish their results warts and all, but referees and editors tend not to like manuscripts with any hint of potential error or inconsistency.

As competition increases, there seems to be an upward spiral of intolerance. Authors who have had their papers treated harshly are then overly critical of those they are given to referee. Under these circumstances, it seems that the only way to succeed is to submit the "perfect" study.

Sadly, many researchers regard publishing as a game where the main objective is to outsmart the referees. But wouldn't it be great to read something really honest in a paper: "We have studied this fascinating phenomenon; we've done several experiments. The results aren't clear-cut, but we have done the best we can using the best methodology and we do have some interesting results. We're submitting it in the hope that it might inspire new research."

Just occasionally, when we have done this, we are lucky and the referees are of a similar mind. It is rare, but when it happens it is deeply reassuring.
