In search of singular insight

Social sciences have advanced little because inquiry and discovery are stifled by 'theory' and 'the search for order' in the academy. Gary Thomas says we need to break away from the pattern

July 9, 2009

The Nobel prizewinning economists with their clever formulae proved to know nothing: they failed to see anything coming. The sun was out, the sky was blue. And with their mind-boggling formulae they furnished reasons why the party would continue. They were wrong. Not only were they wrong, but we believed them, and it is a disgrace that they should have been lauded by universities and Nobel prize committees alike for talking nonsense.

This at least is the mantra of Nassim Nicholas Taleb, author of The Black Swan: The Impact of the Highly Improbable and a fellow at the Courant Institute of Mathematical Sciences at New York University. And, as he is one of the few people to have predicted the financial crisis, it's worth taking a look at what he says. His thesis is that randomness will always spike attempts to establish coherence and pattern - and his bile is reserved especially for economists. But the economists should not be singled out for opprobrium. If we are looking for reasons for the failure to predict what happened, the malaise is with universities in general and with the predilections of social scientists in particular.

Our universities lumber through the world of discovery. Driven by risk-averse funding programmes, the clever people who work in universities are lobotomised. They are, philosopher A.C. Grayling has suggested, squeezed into the form of latter-day Edward Casaubons - "scholarly and uninspired ... scrupulous and dim-sighted".

The machinery of the university nourishes the conservative. Universities are wonderful institutions, and I love them, but they are renowned neither for their quick reflexes nor for their critical self-reflection. If they took a magnifying glass to the greatest advances in thinking of the 20th century, they would notice that while universities have been connected to those advances, the advances themselves have tended to happen on the borderlands of their terrain, not at the centre. And the advances have usually happened in spite of the university system, not because of it. They have happened from people drawing on the resources of the university, certainly, but often subverting it, and working outside its systems.


In science and technology and even in business, the great advances - from powered flight to relativity, penicillin, superconductivity and the DNA double helix - have in the main had the help of university structures in some way or other. But they were brought to life by people who were marginalised or even snubbed by universities: people who were being told by their universities to get on with something else (James Watson and Francis Crick with the double helix), or who were rejected by the university system completely (Albert Einstein and his annus mirabilis papers including special relativity and mass-energy equivalence). Or they happened out of intelligent noticing, separate from a programme of research (Alexander Fleming and penicillin), or out of quick-wittedness, curiosity and doggedness in a minor investigation (J. Robin Warren and Barry J. Marshall with Helicobacter pylori). Or they were done by people who used the university as a platform (Larry Page and Sergey Brin with Google), and then jumped off, realising that university proceduralism would suffocate their energy. Or they just worked completely on their own (Orville and Wilbur Wright with powered flight). Even in well-documented examples of institutionally planned (rather than individual and serendipitous) advance, such as Bell Laboratories' invention of the transistor, scientists such as John Bardeen were bought out of universities to do the job.

Part of the problem is in the ways those who work in universities are encouraged to think about and discuss discovery and knowledge and make it a special process involving something going under the name of "theory". There is a relentless narrative of "theoretical advance". Advance in thinking has to be made to look like a specialised business, accessible only with arcane epistemological tools. It all goes back - as much that is misleading does - to Aristotle.


In his Nicomachean Ethics, he asserted that theoria is the highest form of activity. Of course Aristotle is a pretty big cheese to gainsay, epistemologically speaking, and sadly his pronouncements often reverberate as "The Truth" through the history of thought. If Aristotle says that theoria is the highest form of activity, it's a strong endorsement, and by converting thinking into the more mysterious "theory" or "theorising", universities can claim some ownership over the specialised tools of discovery, explanation and prediction. Rather in the way that English ballerinas early in the 20th century took on Russian names to make themselves sound more impressive, academic disciplines talked increasingly of theorisation rather than thinking. If Aristotle is associated with theoria, the roots of poor "think" are traceable only to the dull Old Saxon thenkian and Norse thekkia. Aristotle versus Hengist and Horsa in the thinking stakes? No contest.

Yet the leaps of imagination of the 20th century happened not as the progeny of "theory", but out of thinking, or what the great mathematician George Polya called "having brains and good luck", from intelligent noticing, from serendipity, from the inspiration and creativity of individuals and from hard work. Theory - that distillation from inductive thinking - sometimes emerged from these advances, but its place in the riders and runners of discovery itself is not high, because it is really just a summary - a platform for plausible explanation and incremental accretion based on that explanation. Yet it is always up there as an impostor in the winners' circle of discovery and original thought. Promoted by the university establishment as the zenith of intellectual achievement, it always muscles its way on to the scene.

This reification of theory has had damaging consequences in many branches of inquiry and is part of the reason that discovery - that is to say, really important discovery - is made in spite of universities rather than because of them.

If this overstates the case a little, it overstates it only for the natural sciences. In the social sciences, the presumptions of inductivism, from which theory springs, are arguably largely responsible for the lack of conspicuous advance in the area. Can you think of any major social scientific achievements? I can't. We can't even predict with any degree of reliability. It's not because there aren't clever people in social science, or because the field is worthless. It's because we're barking up the wrong tree. Taleb's idiosyncratic promotion of a science of the singular, while not original, offers the chance to think again about life without theory.


Today's social science conforms to everything that Taleb abhors. His abhorrence is for the near-universal reverence paid to model-making and theory building in the social sciences. It is for the seeking of pattern where there are no patterns (or at least only perpetually shifting patterns). He teaches a course on the history of probabilistic thinking - or the failure of models. Model-making and theory building, tidying generalisations and a fondness for all things quantitative in the social sciences are chimeras, says Taleb. He has been proved right, at least in economics: there is very little useful prediction to be had from the social sciences, and the "theory" that does emerge can be profoundly misleading. He's not the first of course: Alasdair MacIntyre said much the same 25 years ago in his devastating critique of the social sciences, but no one really listened - except in a kind of laid-back academic way - and I guess no one will listen now. Taleb will be branded with that most terrible of put-downs of the academic community: maverick.

When I hear the word "maverick" used by academics, I think of Ted Hughes' The Iron Man. It's about a great iron behemoth that just marches on and on, unaware of anything around it, and it makes me think of social science, hypnotised by a shining vision - theory! Never mind that no one knows what theory should actually look like in the social sciences, or that when someone supposedly emerges with one it proves worse than useless at prediction or explanation. The sightless gargantuan trudges forward, arms outstretched towards the epistemological apogee.

I say worse than useless and I mean it. As MacIntyre pointed out, the theory proposed by social scientists is one of two things: it is either facile or misleading. The facile variety - often simply guesses about the meanings of relationships among variables - is so banal that anyone with a copy of Microsoft Excel could come up with it, while the complex variety is so complex that it is invariably highly misleading. Witness Piagetian theory, which led teacher-educators on four decades of wild goose chases, where daft notions such as "reading readiness" were propagated. And now witness the Nobel prizewinners' dazzling-but-potty theorising in economics.

I say no useful prediction, because anyone can come up with a degree of prediction about behaviour in a chaotic system, so one has to do better than this vernacular kind of prediction if one is claiming scientific credentials. Just by looking out of the window I can predict tomorrow's weather with a 70 per cent degree of accuracy from the weather today; likewise, I can predict the price of shares tomorrow with reasonable accuracy. It's going beyond these local and short-range predictions that is difficult. Meteorologists do it up to a point (but only a point), but social scientists' record is even less enviable.
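The "vernacular" prediction described above can be made concrete. A hedged sketch, with invented data: a persistence baseline simply predicts that tomorrow repeats today, then scores how often that naive rule is right. Any model claiming scientific credentials must beat this floor. The function name and the fortnight of weather are illustrative assumptions, not anything from the article.

```python
# Persistence baseline: predict that tomorrow looks like today, then
# score how often that naive rule turns out to be correct.

def persistence_accuracy(series):
    """Fraction of days whose value simply repeats the previous day's."""
    hits = sum(1 for prev, nxt in zip(series, series[1:]) if prev == nxt)
    return hits / (len(series) - 1)

# An invented fortnight of weather ("sun"/"rain"). Persistence scores
# well because weather is locally sticky, day to day.
weather = ["sun", "sun", "sun", "rain", "rain", "sun", "sun",
           "sun", "rain", "rain", "rain", "sun", "sun", "sun"]
print(round(persistence_accuracy(weather), 2))  # → 0.69
```

On this made-up series the look-out-of-the-window rule is right about 70 per cent of the time, which is exactly why beating it, rather than merely matching it, is the relevant test of a predictive social science.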


Why is this? It's partly down to our predilection for learning from patterns and generalisations. Perhaps we always want patterns because patterns are what the human brain evolved to see. It's what we are good at, and the atavistic pull to do it has served us well in the establishment of natural scientific laws and patterns. But this fondness for patterns radiates into all forms of informal and formal inquiry, and maybe that pull from the savannah goes too far.

Patterns are not so much use, though, when it comes to thinking about social behaviour. In the social sciences, there is no reason why our thinking tools should follow the same pattern-finding path that natural scientists follow. Taleb is not the first to say this - Michael Oakeshott talked of the "irritable search for order", and Ludwig Wittgenstein of the "craving for generality".


What's the alternative? For Taleb it is the "black swan", for Jerome Bruner the breach of "canonicity", for Thomas Kuhn the "awareness of anomaly", for Paul Feyerabend the need to proceed counter-intuitively. People have been saying it for ages: Taleb is by no means the first. The financial crisis and economists' failure to predict it surely should presage a renewed questioning of the shibboleths of the social science enterprise. Instead of the relentless homogenise-generalise-theorise enterprise beloved of social science, we can try giving a little more credence to analyses of the singular - to learning from what is different.
