‘CV-blind’ grant review divides Dutch scientists

Preventing expert reviewers from reading applicants’ CVs could damage the Netherlands’ hard-won research excellence, warns senior researcher

February 17, 2023

New procedures that prevent reviewers in the Netherlands from seeing a grant applicant’s CV have been accused of “blindfolding” selection panels amid fears that the change might harm the quality of Dutch science.

In January, the Dutch Research Council (NWO) began a new two-stage assessment of applications to its main competitive research funding scheme, known as Veni, Vidi, Vici, which is designed to cut the amount of time that scientists spend preparing grant proposals and to eliminate bias in decision-making.

Under the new rules, applicants to the country’s talent scheme submit an “evidence-based CV” along with the outline of a research idea, limited to 150 words, for review by a “broad scientific committee”. Those whose applications are selected will be invited to write a full proposal, but expert reviewers at this stage will not see CVs so the focus of attention is on the proposal rather than a scientist’s reputation or track record.

Preventing reviewers from scrutinising CVs is, however, a recipe for bad decisions because it deprives reviewers of vital information, said Raymond Poot, associate professor of cell biology at Erasmus University Rotterdam, who argued that “the CV is often the most important indicator of future success, much more than the proposal”.


“Successful projects depend on the talent and effort of the researcher, and an idea is hard to judge anyway,” Dr Poot added.

The new system is also frustrating for external referees, who are “having to make judgements based on the proposal alone, which is ultimately a sales pitch”, he said. “This makes it much more difficult and laborious to quickly make a good and objective judgement.”


Dr Poot said that it was starting to prove difficult to find reviewers willing to take on this task and that Dutch funders regularly had to send out 30 requests to would-be referees to find one recruit. For this reason, some programmes had abandoned external peer review altogether and let a broad scientific committee of non-experts run the entire selection process, he said.

New restrictions on what applicants can mention in their CVs will also make it harder for reviewers to make informed decisions, insisted Dr Poot. In updated advice for senior researchers applying for the Vici scheme, which offers up to €1.5 million (£1.3 million) over five years, the NWO forbids the inclusion of journal impact factors or h-indexes, while applicants “may not mention lists or total numbers of publications, grants or prizes, nor the total acquired sum” in their 1,200-word narrative section.

Applicants are allowed 700 words to describe up to 10 outputs, including their citation counts, but are banned from using any “descriptions of reputation”, including terms such as “good” or “leading journal”.

“The only measurable criteria left are the citations (or other indicators) of these 10 output items, but these numbers only become meaningful after several years, by which time the publication is already getting old,” said Dr Poot.

“Personally, I’d prefer just having five pages where you can put anything you want – the new reality is making it very tricky for scientists and reviewers,” said Dr Poot, who, along with many other leading Dutch scientists, has recently flagged concerns about the shift away from assessing researchers on measurable output parameters – a change that is becoming increasingly common across Europe, including in the UK and Switzerland.

The new policies risk harming Dutch universities’ research performance, he said. “In the Netherlands, we spend relatively little money but get good results. Thirty years ago, that wasn’t the case, but we started requiring selection using qualitative and quantitative data. Now we’re rapidly moving away again from measuring outputs, risking throwing away the Netherlands’ prominent world position,” said Dr Poot.

His appeals to reconsider the changes appear to be doomed, however. Last month, the Dutch parliament’s science committee issued a letter following a review of the Netherlands’ Recognition and Rewards change agenda, which has championed the introduction of narrative CVs and an end to the use of journal metrics. The letter states that it is too early to tell whether “another form of scientific evaluation will improve the international position or threatens the reputation of Dutch science”.


On preventing reviewers from seeing CVs, Robbert Hoogstraat, project leader (rewards and recognition) at the NWO, said experts would be told that the “committee has already judged these CVs to be in the top 30 to 40 per cent” and had a “broad view on all the candidates that applied, which external reviewers don’t have”.

“The problem with letting external reviewers judge the CVs is that these external reviewers only see one CV – the committee, however, sees all candidates and is therefore better suited to judge a candidate’s position within the applications that were sent in,” he said.

“We therefore feel confident to let [reviewers] solely judge the research proposal, which, as opposed to CVs, relies less on the comparison with other candidates and more on expertise of the subject,” he added.

For Dr Poot, the early signs are not good. “Hundreds of young or experienced scientists, including prominent ones, have complained about the new science policies, but there is still no meaningful discussion, even when, to me, it is obvious we are moving away from an imperfect scientific meritocracy towards somewhere between a lottery and a fashion show to adhere to the tastes of science administrators.”


jack.grove@timeshighereducation.com



Reader's comments (1)

I find it hard to understand why you would argue for reinstating "prestige" when researchers should clearly be judged on the merits of the application. The alternative compounds the problems of inequality, making it very hard for early career researchers with talent to come to the fore. It also adds to the problem of journal prestige, which drives up subscription costs for research organisations and transfers money that should be spent on research into shareholders' pockets.
