Infected blood: how academics both caused and exposed tragedy

Report into scandal highlights danger of relying on singular sources of knowledge, in contrast to the inquiry itself, which drew on a vast range of expertise to get to the bottom of what happened

May 28, 2024
London, UK, 19 May 2024: The Hepatitis C Trust assembles at Parliament Square to demand action ahead of the final report on the infected blood scandal. The demonstration highlights the plight of tens of thousands affected by contaminated blood products
Source: Atlantico Press/Alamy

Concluding his lengthy seven-volume report on the UK’s infected blood scandal, former High Court judge Sir Brian Langstaff said that what was the “worst treatment disaster” in the history of the NHS “could largely, though not entirely, have been avoided”.

In the 1970s and 1980s, more than 30,000 people were infected with diseases such as HIV and hepatitis C after being treated with contaminated blood products – leading to nearly 3,000 deaths – and the experts who advised the government at the time should shoulder some of the blame for what happened, the inquiry made clear.

It also shone a light on the research practices of the past, highlighting that patients were unaware that they were participating in medical trials and had not consented to be the subjects of research, which often did not lead to safer treatments.

But while the mammoth six-year process – the biggest inquiry ever held in the UK – exposed the academy's failure to prevent the tragedy, it also demonstrated its power to obtain justice for those involved, with dozens of academics working for years to get to the bottom of what happened.

Given the complexities of the issues at hand, the unusual model Sir Brian developed – setting up seven expert groups to investigate aspects of the events under instruction from those affected – meant that a vast array of multidisciplinary expertise fed into the investigation, said Stephen Evans, emeritus professor of pharmacoepidemiology at the London School of Hygiene and Tropical Medicine. Professor Evans convened the inquiry's statistics expert group, which modelled the impact of the crisis.

This stood in contrast to the practices of the period under investigation, with the report highlighting how singular scholarly voices had been able to dominate the debate.

Particularly criticised was Arthur Bloom, a professor of haematology at what was then the University of Wales College of Medicine, now part of Cardiff University, whose views were said to have been “overly influential”.

Professor Bloom, who died in 1992, was a leading expert in his field. The inquiry showed that he had consistently downplayed the risks despite mounting evidence and that there had been “an uncritical acceptance of his line of thinking and a failure to probe his advice”.

The inquiry report found that had Professor Bloom, who was often the only academic at key meetings, “been faithful to the facts” then “it is not difficult to see that the events that followed might have taken a different turn”.

Government advisory committees today draw expertise from a much wider range of sources, but there is still a danger that a “very strong personality” could exert undue influence and prevent dissenting voices from emerging, said Professor Evans.

Given the inquiry’s focus on the impact of lone voices, it was “pertinent” that it was run in a way that pulled together groups of experts, said Emma Cave, professor of healthcare law at Durham University, who co-chaired the medical ethics group.

“I think it shows it is a useful way forward because it is so transparent and enables there to be other perspectives and viewpoints that come in. Perhaps that would have been something that would have been useful then. But comparing the two is like chalk and cheese; they are entirely different situations,” she said.

Bobbie Farsides, professor of clinical and biomedical ethics at the Brighton and Sussex Medical School and another co-chair of the ethics group, said she saw part of their role as helping to explain to those affected the medical context of the time. It was important, she continued, not to “demonise” people while not excusing their mistakes.

In medicine, “very often some of the best advances have come through passionate individuals building up teams or expert centres, and pioneering work that in the early stages is risky, is difficult”. However, the researchers working with the inquiry had “felt increasingly uncomfortable with how things had been done”, she said.

Modern-day medicine had become much more aware that doctors cannot be left alone to make decisions that are important to society as a whole, Professor Farsides added, and multidisciplinary input was now regularly sought.

“If you go to a modern medical school, the people working there are medically qualified, but they are also sociologists, ethicists, anthropologists, geographers, economists,” she said.

“There is much more discussion between people who work in medical schools and broader academic communities. They are not as insular as they used to be.

“The medical profession has also had to become much more used to criticism and judgement of what they do, both at a societal level but also academically. They aren’t shut away doing their own thing; they are subject to the gaze of a number of different disciplines.”

tom.williams@timeshighereducation.com

