Risk in the rules

March 22, 1996

A NASA work culture that 'normalised' the danger signs led to the Challenger disaster, sociologist Diane Vaughan tells John Davies.

Diane Vaughan is no rocket scientist; she is a sociologist. But when, at the start of the year, the United States media prepared to commemorate the tenth anniversary of the Challenger space shuttle disaster, it was to her that they turned.

She had just published a book - the fruit of some nine years' work on the "sociology of mistake" - examining the background to the disaster of January 28, 1986. Just 73 seconds after take-off, the space shuttle Challenger exploded, killing a crew of seven that included the New Hampshire teacher Christa McAuliffe. At 575 pages (over a quarter taken up by appendices, notes, bibliography and index), The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA, could turn out to be the definitive account of what she calls "the most well-recorded disaster ever".

Did it need to be so long? "The argument of the book is that the explanation lies in the complexity," says Vaughan, associate professor of sociology at Boston College. "Understanding the details of what happened [means] you have a long book."

She had not intended it that way. In December 1985, she had been asked to contribute a paper to a panel the following summer on secrecy in organisations. One month later, while she was still mulling over the topic, Challenger exploded, and at the subsequent public hearings held by the presidential commission into the causes of the disaster, evidence of wrongdoing, poor communication and mismanagement at NASA began to emerge.

When Challenger took off, one of the rubber-like O-rings between sections of a solid-propellant rocket booster failed to block the escape of hot gases created by ignition: hence the explosion. The shuttle was launched when the temperature at Cape Canaveral was just 36°F, but it appeared that warnings about the O-rings' unreliability at low temperatures had been ignored by NASA management. In other words, the objective of meeting the launch schedule had taken priority over safety considerations. Good material, it seemed, for an academic whose previous books included one on corporate crime.

"I was going to do a paper on how the structure of organisations keeps people from knowing and understanding," recalls Vaughan. She began studying the Challenger disaster "thinking it would be a short project. But by the time of the [panel] meeting I had more questions than answers."

Time to go into such questions in depth is not normally available - Vaughan teaches five courses a year in her department - but she had just received a fellowship at the Centre for Socio-legal Studies at Wolfson College, Oxford: "a free year in which to pursue some of these questions."

By the time she arrived in Oxford she was able to bring with her the presidential commission's full report. "I spent the year looking at the hearings testimonies, which were about 5,000 pages of transcript, plus the technical appendices," she recalls. And it was about halfway through her year, "in a lovely 15th-century cottage in Bladon [Oxfordshire] where I was ill at the time", that Vaughan realised she "had made a tremendous mistake" in her interpretation of the Challenger disaster.

"My whole attraction to the project was that I believed that safety rules had been violated," Vaughan explains. "I had begun with the idea that this was a typical example of organisational misconduct in which individuals or groups of individuals violate laws and rules in pursuit of an organisation goal - which for NASA was to stick to the launch decision. But I discovered that what I thought were violations were actions completely consistent with NASA rules, and that the presidential commission had made a mistake." (That mistake, "the idea of managers violating rules", had become the main point picked up in news stories after the report's publication, "making it look like wrongdoing".)

Or, as she puts it in the book: "The attention paid to managers and rule violations after the disaster deflected attention from the compelling fact that, in the years preceding the Challenger launch, engineers and managers together developed a definition of the situation that allowed them to carry on as if nothing was wrong when they continually faced evidence that something was wrong."

It took eight more years, however, before Vaughan was able to complete her research. To begin with, there was a lot to learn: "I came to [the project] not knowing anything about aerospace engineering. This sounds so stupid, but with a background of studying organisations and deviance in particular, I assumed I could learn about the technology." (In her book, Vaughan remarks wryly: "My two auto-repair courses, circa 1976, scarcely had trained me for the world I was entering.") Her immersion in the NASA culture led her to realise that "a lot of things that in retrospect looked suspicious and deviant to outsiders were normal practice in aerospace engineering. No test that you could do was really capable of simulating all the forces in the environment that the shuttle would experience . . . Retrospectively we said 'How could they possibly have continued launching when they knew the design was flawed?' But they really learn by doing.

"In the beginning they knew there were going to be problems and anomalies, but their job as engineers was to try to understand and correct those and make them better."

Nevertheless, some time passed before Vaughan actually spoke to NASA staff. "I felt that with such a public incident, talking to people afterwards was going to be distorted by retrospection," she explains. "My best way of figuring out what happened was to rely on documents created at the time and come to some understanding of my own about what happened - then go to people with interview questions that were informed by an understanding of [NASA] procedures . . . That meant years of reading flight readiness review documents, studying engineering drawings and consulting with a few engineers about how the technology worked". Notable among these few were two "very good tutors" in Roger Boisjoly, the Morton Thiokol engineer-turned-whistleblower who had been closely involved with the joints and seals of the solid rocket boosters (SRBs), and Leon Ray - "the person at NASA most closely connected with the SRB joints. The amazing thing is he was mostly ignored by the presidential commission and journalists." (In fact, she notes, "working engineers were under-represented" at the commission hearings.)

As she read the documentation, Vaughan found she needed to widen her focus - to examine more than what happened on the eve of the Challenger's launch. "I looked at other instances of what I had to call 'alleged' rule violations by managers [at NASA] - which took me back into the history of decision-making. That's when it really became an ethnography, because I was using documents and archival data to try to reconstruct the process of decision-making and the culture in which it occurred.

"What was exciting for me was that the data available was so extensive it gave me a first-time opportunity to look at how the culture of an organisation affects decision-making, and how external contingencies in the environment trickle down through the organisation in very subtle and insidious ways."

All of which led Vaughan to formulate a theory about the "normalisation of deviance" at NASA: "The decision to launch [Challenger] resulted from NASA's institutionalised process of normalising signals of danger," she declares.

"Behaviour that outsiders see as objectively deviant can become normal and acceptable to people inside the organisation." There was, she writes in the Challenger book, "a work group culture in which the managers and engineers working most closely on the SRB problem constructed beliefs and procedural responses that became routinised. The dominant belief . . . was that the SRB joints were an acceptable risk: therefore it was safe to fly."

This normalisation of deviance principle is one that she thinks "applies across the board" - from organisations as big as NASA to those as small as the family. (An earlier book of Vaughan's, Uncoupling, looked at "turning points in relationships".) In both large structures and one-to-one relationships, she says, "we become beguiled by our own styles, language, strategies and culture. Our behaviour can incrementally change. We get on a slippery slope as NASA did."

But even if we recognise the slippery slope at our feet, Vaughan cannot guarantee we will always stay safely at the top of it. "We live in a time when technology is to a great extent taken for granted. We lose sight of its riskiness, and of the fact that not only can the technology fail but the organisations that produce and manufacture and use it are also subject to failure . . . There are a lot of things that can be done to make organisations more safe, but you can never have a fail-safe organisation, just as you can never have a fail-safe shuttle."

The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA is published by the University of Chicago Press, price £19.99.
