Will Covid-19 change the scientific method long term?

Scholars say the pandemic has drawn attention to the uncertain and adaptive nature of research and called for a broader range of evidence to be valued

September 2, 2020
Boris Johnson visits the Mologic Laboratory in March 2020
Source: Getty

Throughout the Covid-19 pandemic, many Western governments have repeatedly defended their actions (or inaction) on the grounds that they are “following the science”. The mantra is a prime example of the way science is often viewed as the painstaking search for a singular truth, while policymaking is seen as the process of acting on definitive findings.

Experts have criticised the use of the phrase because of its assumption that there is consensus among scientists and because of fears that it is being used to abdicate responsibility for political decisions.

But while scientists have called for a broader range of disciplines to be involved in and to help direct Covid-19 research, scientific advice and government policy, are academics and policymakers still relying too heavily on the traditional scientific method of forming hypotheses and conducting randomised controlled trials and too readily dismissing “real-life” evidence? And could the value of practice-based evidence during the crisis lead to a shift in the way scientific research is conducted in future?

Tim Rhodes, professor of public health sociology at the London School of Hygiene and Tropical Medicine, said that the Covid-19 pandemic has drawn attention to “evidence as an uncertain, emergent and adaptive thing” that is “always produced in a particular context”.


“The ‘business as usual’ [approach] of evidence-based medicine – the idea that certainties can be progressed towards in a linear fashion over time by making science more and more accurate and precise – becomes more obviously not the case,” said Professor Rhodes, who co-authored a recent paper titled “Making evidence and policy in public health emergencies: lessons from Covid-19 for adaptive evidence-making and intervention”.

The research argues that the challenges of Covid-19 “do not simply require us to speed up existing evidence-based approaches, but necessitate new ways of thinking about how a more emergent and adaptive evidence-making might be done”.


Trisha Greenhalgh, professor of primary care health sciences at the University of Oxford, said “one of the great success stories” of academia’s response to the Covid-19 crisis had been the speed at which traditional research approaches, such as clinical trials, were established.

But she worried that “conventional science” was revered too highly and had overshadowed broader, interdisciplinary approaches to research.

“In a situation that is so complex, that is so fast-moving, that is so shot through with uncertainty, some clinical scientists are still seeing the randomised controlled trial as the pinnacle of scientific evidence. And it’s not,” she said.

“We have seen a catastrophic failure of 20th-century evidence-based medicine in giving us the answers we need in this pandemic. The old tools and techniques for doing science weren’t enough – they weren’t broad enough, they weren’t diverse enough.”

Professor Greenhalgh, who in June published an article in Plos Medicine titled “Will Covid-19 be evidence-based medicine’s nemesis?” and co-authored a recent BMJ blog on “managing uncertainty in the Covid-19 era”, said that while trials were essential to make informed decisions about drugs, in some cases scholars and governments needed to follow a practice-based evidence approach of “just do it and see what happens”.

“Because of the high levels of uncertainty, we’ve got to do science in a different way,” she said. “That’s not to say that it’s bad science – it’s a different kind of science. We don’t know whether the intervention that we put in place is going to be effective, so we have to use research techniques – such as data gathering and data analysis – to get real-time feedback on the impact of whatever policy it is. And what that means in effect is fewer controlled experiments and more natural case studies or experiments.”

Professor Greenhalgh cited the use of face masks as a key example of a policy that was initially rejected because of a supposed lack of research: at the beginning of the pandemic the World Health Organisation argued that there was not enough evidence to say that healthy people should wear masks, but it changed its advice in early June to recommend that they be worn in public where social distancing was not possible.

However, Professor Greenhalgh said, “real-life evidence” had long shown that countries that adopted masks very quickly, such as nations in East Asia, had very low numbers of deaths.


“Why on earth is the West still screaming for its randomised controlled trials as we bury more bodies on a daily basis?” she asked.

“We will never get the randomised controlled trials that they’re asking for; partly on a practical level but also at a philosophical level there are too many variables to control … Look at Donald Trump and the way he’s influencing the pharmaceutical industry. Can you do an experiment that’s going to take that into account? No, you can’t.”

Professor Greenhalgh said the “hierarchy of evidence” in the field of evidence-based medicine, which places randomised controlled trials at the top and case studies at the bottom, had also led academics and policymakers to discount other “stories”, such as 52 out of 60 singers in a choir in Washington becoming ill in March.

Scientific advisers to the UK government “would say things like ‘the evidence isn’t there’ and ‘there’s not any evidence’. What they meant was, there wasn’t any evidence of the shape they were expecting,” she said.

Professor Rhodes agreed that there needed to be “much more open definitions of evidence beyond the randomised controlled trial”, adding that he would “treat equally evidence drawn from experience or from qualitative or ethnographic accounts of experience”.


Mathematical models on the effect of social distancing had ignored “evidence that points to the complexity of how social distancing is actually done in real life”, he said.

“Without looking at that more anecdotal, qualitative, experiential evidence, the models are arguably so detached from their context that they are almost meaningless,” Professor Rhodes said.

“I would argue that modellers need to spend more time speaking to these producers of other kinds of evidence, including lay people and sociologists and anthropologists and ethnographers, in a more deliberative approach to try to make models a little more real.”

Marcus Munafò, professor of biological psychology at the University of Bristol and chair of the UK Reproducibility Network, said there had been a “speed/accuracy trade-off” in some Covid-19-related research and suggested that there might need to be “two systems of research” – a “conventional mode” and an “accelerated mode” for delivering evidence very rapidly when it is needed.

He said the accelerated mode would not necessarily be a matter of “revving up the way we ordinarily do things” but might require “a fundamentally different approach” to science and a more top-down and directive allocation of funding.

But Hans IJzerman, associate professor in social psychology at Grenoble Alpes University and author of a recent study that argued that psychology research was “not ready” for use in pandemic policy, said it was the job of scientists to provide an accurate appraisal of their level of certainty in the effectiveness of particular interventions.

While he acknowledged that “hypothesis-generative” research approaches, such as exploratory research or discussing research with community experts, were useful and should be used in a crisis, he said there were “limits to the level of certainty they can establish and whether there is causality between one variable and another”.

“In order to establish that the relationship is true and that there is a causal relationship between one variable and the other, a randomised controlled trial is thus still the gold standard,” he said.

Eric Rubin, editor-in-chief of the New England Journal of Medicine, said “unsupported theories are not a substitute for evidence” when asked whether practice-based evidence could be used in the absence of randomised trial evidence.

“Physicians often must make choices before definitive information exists. We must recognise that these are just guesses until questions are adequately studied,” he said.

Professor Rhodes said academia was moving towards “more complex, adaptive thinking, which accepts that randomised controlled trials alone are probably not enough. But I think that’s a long way from suggesting that we need a different way of doing science.”

He added that scientists in public health and medicine generally “still hold on to the idea that science is moving towards certitude, to a singular truth of how things actually are”, and that only a small minority of more critical academics saw evidence as emergent and contingent on context.

Professor Greenhalgh added that “the dominance of clinical academics and their privileged ways of thinking and the dismissal of the social sciences and humanities is a problem that academia has been struggling with for years” and predicted that it would take “a bit more than Covid to change the culture in academia”.

However, Professor Rhodes said the uncertain and fast-moving nature of Covid-19, combined with the public nature of scientific evidence during the pandemic, did “create a momentum for thinking how science is done differently”.

“The very public nature of how research is being done in relation to Covid-19 – which domesticates it in a way and makes it more open to dialogue and controversy and contestation – invites more of a debate about what constitutes evidence and more of a potential undermining of a singular way of doing science, as has been the mainstream,” he said.


ellie.bothwell@timeshighereducation.com

POSTSCRIPT:

Print headline: Pandemic puts the standard of evidence to stern test

Reader's comments (1)

The confusion of the “scientific method” with evidence-based practice (EBP) in this article is jaw-dropping, to say the least. I can assure you that the current pandemic responses will have no effect whatsoever on the basic assumptions and methods of science, which have been in place since the Enlightenment. I can't believe that EBP gurus like Trish Greenhalgh (quoted here) believe that either. It's the application of certain thoughtlessly applied principles of EBP that must change, not the basic ways of doing science.
