Cutting edge: Heart steers head onto right track

January 23, 2003

When push comes to shove, the emotional area of our brain often overrules the rational in moral dilemmas. Geoff Watts reports.

When moral philosophers explain how they might act if faced with a particular ethical dilemma, we do not expect them to justify themselves by saying that it "just felt like the right thing to do". Making moral choices just because we "feel right" about them is surely how the rest of us behave. We demand something more from the professionals: a general moral principle backed by reasoned justification.

Well, maybe we shouldn't. This is one interpretation of research being carried out at the Center for the Study of Brain, Mind and Behavior at Princeton University in New Jersey. Joshua Greene, a graduate student in philosophy, is using an advanced brain-scanning technique to understand something of the neural processing involved in making moral choices. His work seems to explain why so many people make similar decisions even when there is no watertight justification in logic.

Functional magnetic-resonance imaging (fMRI) is a real-time method of seeing the brain in action. It shows the flow of blood through particular bits of the brain. As blood flow correlates with neural activity, fMRI is widely accepted as a way of identifying which regions are most active.

Greene puts his subjects into the machine and scans their brains while they wrestle with a selection of dilemmas.

The classic examples, and the ones that best illustrate the point, are the "trolley" and "footbridge" problems. First the trolley problem. "You have a train that's heading down a set of tracks towards a divide," Greene explains. "One spur goes to the right and one to the left. If you don't do anything, the train will go to the right. It will then run over five people and kill them. But you can throw a set of points to send the train to the left, where it will kill only one person. Is it right to throw the switch?"

Apparently, most people say yes.

The footbridge problem produces a different response. "Here there's just one track, and you're on a footbridge over it. Next to you is a large person, much bigger than you. If you push this person off the bridge his body will stop the train. He'll be killed, but five people will be saved. Is it OK to push him?" Most people say no - even though, as before, one person is sacrificed to save five. Why the difference?

There is a distinction between the mechanics of the two situations. But why mere mechanics should make a difference in resolving a moral problem is not immediately obvious. There are all sorts of ways of distinguishing between the two cases, Greene says. "But it's surprisingly difficult to make any general theory stick when you apply it to other cases that have to be explained." Hence his interest in an approach based on psychology and evolution.

Working with his supervisor, Jonathan Cohen, Greene has categorised a variety of moral dilemmas according to the nature of their solution: personal, as in pushing a man off a bridge; or impersonal, as in throwing a switch.

The scans reveal two distinct patterns of brain activity. "When you look at the brain images associated with personal cases," Greene says, "you see more activity in the areas associated with emotional processing. In the impersonal cases, it's the areas of the brain associated with purely cognitive tasks - memorising a string of numbers, for example - that are most active." For dilemmas requiring personal action, it does look as if emotions "run the show" - in which case, the use of the term moral "reasoning" may be something of a misnomer.
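To make that contrast concrete, here is a minimal sketch in Python of the kind of comparison being described, using the SciPy statistics library. It is purely illustrative and is not Greene and Cohen's actual analysis: the region labels, activation values and trial counts are all invented, and a real fMRI study compares signal voxel by voxel across whole-brain images rather than two tidy sets of numbers.

```python
# Purely illustrative: contrast hypothetical per-trial activation
# estimates for an "emotional" and a "cognitive" brain region across
# personal and impersonal moral dilemmas. All numbers are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical activation estimates (arbitrary units), 30 trials each.
emotional_region = {
    "personal":   rng.normal(1.2, 0.4, 30),
    "impersonal": rng.normal(0.6, 0.4, 30),
}
cognitive_region = {
    "personal":   rng.normal(0.5, 0.4, 30),
    "impersonal": rng.normal(1.1, 0.4, 30),
}

for name, region in (("emotional", emotional_region),
                     ("cognitive", cognitive_region)):
    t, p = stats.ttest_ind(region["personal"], region["impersonal"])
    print(f"{name} region: personal vs impersonal, t = {t:.2f}, p = {p:.4f}")
```

On these made-up numbers the emotional region comes out more active for personal dilemmas and the cognitive region for impersonal ones - the shape of the pattern the scans are reported to show.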

To interpret his findings, Greene takes an evolutionary perspective. Our primate ancestors did not have language or the kind of reasoning that language makes possible. But they were social animals with social instincts and emotions. Many of these, he claims, we have inherited. "But we have subsequently developed a capacity for abstract reasoning that we can apply to anything - including moral issues."

In the footbridge case, it is our instinctive response against physical assault that prevents most of us pushing our companion onto the track. Such an instinct would have been familiar to our primate ancestors. But they did not live in a world in which there were runaway trains that could be redirected. We have no comparable instinct ready to deal with such events. So here we have to rely on the reasoning area of the brain to make a decision.

If you measure the time people take to make up their minds, you find it is longer in the minority who do choose to give their companion the heave-ho. Greene interprets this delay as evidence of a conflict between the two regions of the brain. In most people, he says, the emotional part triumphs; only in a minority does the reasoning part win.
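The timing comparison itself is simple enough to sketch. The following Python fragment is again only an illustration: the response times and group sizes are invented, chosen solely to show the shape of the measurement, not to reproduce Greene's data.

```python
# Purely illustrative: compare hypothetical decision times for the
# minority who approve of pushing against the majority who refuse.
# All timings and group sizes are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

rt_refuse = rng.normal(4.0, 1.0, 40)   # seconds; hypothetical majority
rt_approve = rng.normal(6.5, 1.5, 12)  # seconds; hypothetical minority

t, p = stats.ttest_ind(rt_approve, rt_refuse, equal_var=False)  # Welch's t-test
print(f"approvers slower by {rt_approve.mean() - rt_refuse.mean():.1f} s "
      f"on average (t = {t:.2f}, p = {p:.4f})")
```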

Greene does not advocate popping into a scanner each time you face a tough moral choice. Nor does he suggest that brain science can tell you what is right. But he does think that a better understanding of how our moral minds work might help us to make better decisions. It could also open some radical or even uncomfortable possibilities. Take the incest taboo. Our instinctive emotional rejection of incestuous relationships has a sound basis in genetics. But this evolved at a time in our ancestry when we had no direct control over fertility. In an age when fertility control can be virtually 100 per cent, the taboo is serving an end that can now be achieved by other means. Should this prompt a reassessment of our moral attitude to incest?

In the meantime, it might be interesting to scan the brains of a few professors of moral philosophy. Would they show the same pattern of activity as the rest of us? It is an experiment that Greene admits he would rather like to try.
