Smoke and mirrors

‘Agnotology’, the art of spreading doubt (as pioneered by Big Tobacco), distorts the scepticism of research to obscure the truth. Areas of academic life have been tainted by the practice, but some scholars are fighting back by showing the public how to spot such sleight of hand, reports Matthew Reisz

August 16, 2012

Doubt is the lifeblood of the academy. Historians and political scientists try never to take on trust any public statement that cannot be independently verified. Scientists look for every possible alternative factor and explanation before claiming that there is a causal link between A and B. Philosophers have even been known not to take their own existence for granted. An attitude of radical scepticism is essential for most serious research.

Yet there is also a point at which such scepticism becomes pathological and irresponsible. Whole industries have an interest in casting doubt on the overwhelming evidence that smoking damages health, that nuclear energy imposes substantial risks, that climate change is taking place and that the pre-credit crunch banking system was a house of cards. Academics who cultivate the art of spreading doubt - what one scholar calls “agnotology” - are often de facto protecting corporate profits and discouraging governments and individuals from taking action. They also give authority to views that would be taken with a large pinch of salt if put forward by journalists, lawyers or public relations firms.

Many people writing about academic integrity focus on clear conflicts of interest that can lead to the distortion of research agendas and the risk of corruption.

In Inside Job: The Financiers Who Pulled Off the Heist of the Century (2012), a book based on his Academy Award-winning documentary of the same name, Charles Ferguson returns to the case of Frederic Mishkin, Alfred Lerner professor of banking and financial institutions at Columbia Business School, who was paid $120,000 (£77,000) by the Iceland Chamber of Commerce to co-author a 2006 report titled Financial Stability in Iceland. This not only painted a glowing picture of the country’s banks, but later turned up on Mishkin’s CV under the title Financial Instability in Iceland. Defending himself after the documentary was released, Mishkin said that his report had identified “several risks” to Iceland’s economy and he also dismissed the inaccuracy in his CV as a “typo” that he had corrected before the interview took place.

Ferguson also cites an example of an academic acting as an expert witness for Microsoft in an antitrust (anti-monopoly) trial where his testimony directly contradicted his own published research. Along with such striking individual cases, he points to the growing number of US university presidents serving on the boards of financial services companies. Ferguson fears that this leaves entire disciplines “severely distorted by the conflicts of interest now endemic to them”, with prudent young researchers shying away from questions such as “Why did deregulation and economic theory fail so spectacularly and completely?”

Although happy for academics to provide “openly disclosed expert advice”, Ferguson objects to them acting as “covert, highly paid” lobbyists. All in all, he suggests, scholars “have proved stunningly easy to buy; for very small sums, considering the stakes involved, the financial sector has hired the best propagandists in the world”.

All this clearly raises important questions. Yet a number of powerful recent books, most of them written by academics, put the emphasis elsewhere. These argue that significant areas of academic life have been tainted by a malign or exaggerated version of the scepticism at the heart of science and scholarship. Each is adamant that the public has been misled in damaging ways, and each suggests how ordinary citizens can see through the abuses.

Robert Proctor, professor of the history of science at Stanford University and the academic who coined the term “agnotology”, is not a man to mince words. “We face a calamity of global proportions,” he writes in Golden Holocaust: Origins of the Cigarette Catastrophe and the Case for Abolition (2012), “with too many willing to turn a blind eye, too many willing to let the horror unfold without intervention.”

The 6 trillion cigarettes smoked each year are quite simply “the deadliest artifacts in the history of human civilization”. While the manufacturers count as “racketeers”, their academic accomplices are guilty of “a breach of academic integrity more serious - and deadly - than anything since the horrors of the Nazi era” because “a hundred million people died of smoking in the 20th century”.

In developing this point, Proctor points to the “thousands of scholars [who] have worked as consultants for Big Tobacco”: experts in everything from addiction to advertising, cardiology to computer science, psychology to pulmonology - not to mention, and perhaps more surprisingly, historians. When tobacco companies began to be sued for causing cancer, they had to walk a narrow line in response: they argued both that everybody had always known that smoking was dangerous (so it was their own fault if they knowingly took the risk) and that the evidence linking tobacco to cancer was never totally watertight (so manufacturers could not be blamed for not being more upfront about the hazards).

Historians, as Proctor tells it, duly produced evidence from novels, newspapers and magazines about an “information environment” in which “anyone with even half a brain would have been ‘aware’ that smoking was bad for you”, while skating over the industry’s role in creating a parallel “disinformation environment”. And when appearing in court, historians had the further advantage that, lacking medical training, they could plausibly claim to have no view at all on whether cigarettes caused cancer.

Proctor suggests that one can even calculate the human costs of this process of manipulation. Expert witnesses save legal costs by helping to defeat or minimise liability claims, which keeps prices low, so more cigarettes are smoked - and more people die. Reasonable assumptions then indicate that “the industry’s deployment of historical expertise in litigation causes about 160 deaths per year”.
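The arithmetic behind such a figure is not spelled out here, but the shape of the estimate is easy to sketch. Below is a minimal back-of-envelope version of that chain of reasoning; every input is an illustrative placeholder rather than a number from Golden Holocaust, and only the final order of magnitude is meant to echo Proctor’s roughly 160 deaths a year.

```python
# Back-of-envelope sketch of the causal chain Proctor describes:
# litigation savings -> lower prices -> more cigarettes smoked -> more deaths.
# ALL inputs below are hypothetical placeholders, not figures from the book.

packs_sold_per_year = 15e9            # packs sold annually in one market (placeholder)
price_per_pack = 6.00                 # average retail price in dollars (placeholder)
litigation_savings = 1e8              # annual legal costs avoided via expert testimony (placeholder)
price_elasticity = -0.4               # demand response to a 1% price change (common estimate; assumption)
deaths_per_cigarette = 1 / 1_000_000  # order-of-magnitude rule of thumb (assumption)

# If the savings are passed through to consumers, prices fall by this fraction:
price_drop = litigation_savings / (packs_sold_per_year * price_per_pack)

# Cheaper cigarettes mean more are smoked (elasticity is negative):
extra_packs = packs_sold_per_year * (-price_elasticity) * price_drop
extra_cigarettes = extra_packs * 20

print(round(extra_cigarettes * deaths_per_cigarette), "additional deaths per year")
# ~133 with these placeholders; Proctor's own assumptions yield roughly 160.
```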

It is worth spelling out exactly what Proctor is and is not claiming. Tobacco firms, he writes, do not “ask researchers to falsify data or to come up with some preordained conclusion, and they don’t usually interfere with a scholar’s freedom to publish…There are of course such abuses, but that is not how the industry ordinarily exercises its influence. The pattern has been to finance large numbers of scholars, with additional awards then going to those who come up with results the industry can live with…Scholars who pass such tests are then further cultivated and if all goes well will be invited to testify at hearings or in court as expert witnesses.”

It is here that they prove their true worth by producing “a string of denials, qualifications, rationales, diversions or whatever else might be needed to exculpate the industry’s conduct, past or present”.

In itself, Proctor adds, research funded by the tobacco industry into, for example, “alternative causation” of cancer may be totally legitimate science - but it can also be “used to distract attention from the ‘main issue’ - the deadly harms of smoking”.

Asked about researchers’ motivations, Proctor responds that “it is always hard to allocate blame between gullibility and greed or malpractice. Part of the problem seems to be that when the industry hires experts, it does not always tell them what is really going on, the ‘big picture’. They are sometimes led to believe they are not even working for Big Tobacco, but rather for ‘historical truth’, ‘the Court’, ‘the legal system’ or perhaps a particular law firm.

“Some seem to feel that since they are given a measure of freedom in researching a particular topic, they remain thereby untainted. But of course the rub is that they are only asked to investigate certain topics.”

A further key issue, continues Proctor, is the “clear and massive evidence that the tactics developed by the tobacco industry have been used in many other spheres. Several of the most influential global climate deniers got their denialist training working for the tobacco industry, and the denialist chain of argumentation is quite similar in both instances.”

It is, of course, hardly news that manufacturers fought strenuously to dispute the accumulating medical evidence about the dangers of smoking and to resist attempts at regulation. Proctor’s book is remarkable for charting the sheer scale of their efforts almost entirely through “the industry’s former secret archives” using “a new kind of historiography: history based on optical character recognition, allowing a rapid ‘combing’ of the archives for historical gems (and fleas)”.
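As a concrete illustration of what that “combing” can look like in practice (a generic sketch, not Proctor’s actual toolchain): once the archive documents have been run through optical character recognition into plain text, the searching itself is trivial.

```python
# Minimal sketch: keyword-search OCR'd archive documents for telling phrases.
# Assumes the scanned memos have already been converted to .txt files in an
# "archive/" directory; the search terms are illustrative, not Proctor's.
import glob
import re

TERMS = re.compile(r"doubt|controversy|causation", re.IGNORECASE)

for path in sorted(glob.glob("archive/*.txt")):
    with open(path, encoding="utf-8", errors="ignore") as fh:
        for lineno, line in enumerate(fh, start=1):
            if TERMS.search(line):
                print(f"{path}:{lineno}: {line.strip()}")
```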

The most notorious unintentional sound bite appears in a 1969 memo from John W. Burgard, vice-president of marketing for the Brown & Williamson tobacco company: “Doubt is our product since it is the best means of competing with the body of fact that exists in the mind of the general public. It is also the means of establishing that there is a controversy.”

This wonderfully incriminating quote proved too good to resist for the authors of two other books that take the story forward from the sins of Big Tobacco into other areas.

Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming (2011) is written by Naomi Oreskes, professor of history and science studies at the University of California, San Diego, and Erik Conway, a historian working at the California Institute of Technology.

“Doubt is crucial to science,” they explain. “In the version we call curiosity or healthy scepticism, it drives science forward - but it also makes science vulnerable to misrepresentation…This was the tobacco industry’s key insight: that you could use normal scientific uncertainty to undermine the status of actual scientific knowledge. As in jujitsu, you could use science against itself.” The book goes on to give examples of similar techniques “through Star Wars, nuclear winter, acid rain and the ozone hole, all the way to global warming”.

A parallel case is developed in the 2008 book Doubt Is Their Product: How Industry’s Assault on Science Threatens Your Health by David Michaels, currently on a leave of absence from his professorial post at the George Washington University School of Public Health and Health Services to serve as US assistant secretary of labor for occupational safety and health.

Pointing to “growing trends that disingenuously demand proof over precaution in the realm of public health”, Doubt Is Their Product documents how “in field after field, year after year, conclusions that might support regulation are always disputed…Whatever the story - global warming, sugar and obesity, secondhand smoke - scientists in what I call the ‘product defense industry’ prepare for the release of unfavorable studies even before the studies are published…Industry has learned that debating the science is much easier and more effective than debating the policy.”

Another piece in the puzzle is provided by PR guru James Hoggan, co-author, with journalist and corporate communications professional Richard Littlemore, of Climate Cover-Up: The Crusade to Deny Global Warming (2009). This features a rum cast of what Hoggan calls “fake scientists” - professional mavericks and contrarians, and those happy to comment on topics far beyond their core areas of expertise or long after they have done any significant original research.

“Certain people in the academic community have done great damage,” Hoggan says. “One of the problems for industry and industry associations is that polls show the oil, gas and fossil fuel industries are not trusted by the public. So they find academics to say things that the CEOs of oil companies could never say. If people see you’ve got a PhD, they don’t bother to check your credentials.

“Effective PR depends on compelling messages repeated frequently from credible sources. If you don’t have credible sources, you have to come up with them. The public trusts academics and scientists, as they should. What you’ve got here is a corruption of that trust.”

Anyone worried by the evidence presented in these books is bound to ask what can be done.

“A good way to promote scientific integrity is to require full disclosure,” says Proctor. “In my book I call for the publication of all consulting/financial relationships on university websites, since sunshine is a good disinfectant. At present, disclosures tend to be intrainstitutional and not public.”

Equally useful, in Proctor’s view, would be to ensure that meta-analyses “control for industry funding. There have been several instances where an exposure-disease signal has only become visible once you control for industry funding…it used to be said that smoking actually prevented (or slowed) Alzheimer’s and some studies supported this, but when you control for industry funding in the aggregate body of such studies, it turns out that smoking is positively correlated with the disease…You often cannot do objective science without controlling for industry influence.”
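To make that statistical point concrete: one way to “control for industry funding” is to treat funding source as a stratifying variable and pool industry-funded and independent studies separately. The sketch below does this with a simple fixed-effect inverse-variance pool; the study data are invented placeholders, not the Alzheimer’s literature Proctor refers to.

```python
import math

# Each entry: (log odds ratio, standard error, industry_funded) -- all hypothetical.
studies = [
    (-0.30, 0.15, True),    # industry-funded studies reporting a "protective" effect
    (-0.10, 0.20, True),
    ( 0.25, 0.12, False),   # independent studies reporting elevated risk
    ( 0.40, 0.18, False),
]

def pooled(rows):
    """Fixed-effect inverse-variance pooled estimate and its standard error."""
    weights = [1 / se ** 2 for _, se, _ in rows]
    estimate = sum(w * est for (est, _, _), w in zip(rows, weights)) / sum(weights)
    return estimate, math.sqrt(1 / sum(weights))

for label, funded in [("industry-funded", True), ("independent", False)]:
    est, se = pooled([s for s in studies if s[2] == funded])
    print(f"{label:>16}: pooled log OR = {est:+.2f} (SE {se:.2f})")
```

With placeholders like these, the two strata point in opposite directions - which is exactly the kind of signal that an aggregate analysis ignoring funding source would wash out.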

Such “full disclosure” is, of course, not yet a requirement. Although a number of campaigners devote considerable effort to “following the money” and revealing the industrial links (or free market, anti-regulatory ideologies) of particular funders or ostensible “grass-roots organisations”, most people have other things to do with their lives. In the meantime, however, we risk being hoodwinked by the one-sided research, statistical sleight of hand and other “tricks” set out so clearly in these books.

“There’s a gap left between the research frontier, where scientists are,” says Oreskes, “and the policy level, where action takes place. And the various forms of scepticism and denial, both unorganised and organised, passive and active, live in that space between.”

Many of these forms of scepticism and denial rely on a stock repertoire of techniques that can be spotted by those alerted to the dangers. So what are some of the notable warning signs?

Michaels, Oreskes and Conway offer many examples of how the “merchants of doubt” have long tried to deceive us - and so forewarn us about tactics that are bound to be used again.

For Michaels, we should beware any attempts to break down the essential distinction between academic and regulatory science (in terms of timescales, appropriate standards of proof and so on): “To wait for certainty is to wait forever. The fundamental paradigm of public health is and must be to protect people on the basis of the best evidence currently available…do not demand certainty where it does not and cannot exist.”

Skilful reanalysis of the figures, and suggestions for modifying the assumptions, can be used to cast doubt on the results of almost any piece of research. In the words of one expert cited by Michaels: “Risk assessment data can be like a captured spy; if you torture it long enough, it will tell you anything you want to know.” This is why interested parties have often lobbied for publicly funded research to be opened up to scrutiny, so that they get a chance to try to discredit it while at the same time keeping their own under wraps.
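One version of this torture can be shown with simulated data (nothing below comes from a real risk assessment): even when an exposure effect is genuinely present in the full sample, slicing the data into enough arbitrary subgroups will usually produce slices where the effect no longer looks statistically significant - and those slices can then be held up as evidence of “doubt”.

```python
import math
import random
import statistics

random.seed(1)

# Simulated data with a genuine exposure effect (+1.0 mean shift, noisy).
exposed   = [random.gauss(1.0, 3.0) for _ in range(400)]
unexposed = [random.gauss(0.0, 3.0) for _ in range(400)]

def t_stat(a, b):
    """Two-sample t statistic for equal-sized groups with pooled variance."""
    pooled_var = (statistics.variance(a) + statistics.variance(b)) / 2
    return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(2 * pooled_var / len(a))

print(f"full sample: t = {t_stat(exposed, unexposed):.2f}")   # comfortably significant

# "Reanalyse" by splitting into ten arbitrary subgroups of 40 apiece.
for i in range(10):
    t = t_stat(exposed[i * 40:(i + 1) * 40], unexposed[i * 40:(i + 1) * 40])
    note = "  <- 'no significant effect here'" if abs(t) < 2 else ""
    print(f"subgroup {i}: t = {t:+.2f}{note}")
```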

While these points are generally true, suggests Michaels, epidemiology presents a number of particular challenges (extrapolating from animal research, assessing the precise levels of exposure to dangerous substances, eliminating other possible factors) that make it “a sitting duck for uncertainty campaigns”. Even if we can seldom achieve “proof beyond reasonable doubt”, far less “absolute proof”, this hardly absolves us of the need to make our best assessments and take appropriate action. Short-term studies are usually worthless and designed solely to “misinform those not trained in the subtleties of epidemiology”, he adds. Equally suspect are very long reports that “do not impress scientists” but may seem significant to judges and jurors when waved about in court.

Oreskes and Conway agree that “any evidence can be denied by parties sufficiently determined”. We ought to be wary of those who claim to be “all-purpose experts” in areas as complex as climate science and those who “invoke the hobgoblin of absolute proof”. Other common tactics are calls for a focus on the “best evidence” rather than “exhaustive inclusion”, which can easily translate into “excluding studies you don’t like and including the ones you do”; and “paralysis by analysis…insisting on more and more and more data in order to avoid doing anything”.

Remarkably effective, according to Oreskes and Conway, is a simple trick long used at press conferences by the tobacco companies: create “the impression of controversy simply by asking questions, even if you actually knew the answers and they didn’t help your case”.
