Roz Picard tells Tony Durham why the search for ways to give computers emotionality is not just navel-gazing
Marvin the paranoid android had a brain as big as a planet and the lack of intelligent company got him down. Today's dim, uncooperative desktop computers might feel depressed for the opposite reason if they were sensitive to the waves of bile that wash over them from frustrated users. So maybe Roz Picard is right to concentrate on making computers able to detect human emotions without, for the time being, endowing the machines with too much emotional complexity of their own. We would hate them to suffer.
In the past two years her students at the MIT Media Lab have taught computers to recognise frowns and smiles. As facial expressions are movements, not poses, the computer needs to work with a video feed rather than a snapshot. And if digital eyes fall short of the human version in reading the play of facial muscles, then electrodes sensitive to sweat, breathing and muscle tension may help to corroborate the camera's story.
A student played the bloody computer game Doom while wired to the recording apparatus. "The biggest change we got was not actually when he was being brutally murdered by machine gun, but it was at this spot where the software didn't work properly," Picard recounts. This tale went down well with her audience of human-computer interaction experts at last August's HCI'97 conference in Bristol. They were quite ready to believe that on the scale of stressful events a software malfunction scores above a simulated machine-gunning, and probably above divorce or house purchase.
Picard's Affective Computing is one of the first books on emotional machines to emerge from the laboratory rather than the philosopher's study. Emotion is beginning to yield to the cognitive approach, just as reasoning did in the 1970s and 1980s.
Surprisingly, the emotion which leaps from Picard's first paragraph is one of embarrassment at getting embroiled in such a soft and "feminine" subject. "My education has been dominated by science and engineering . . . and a pride that shuns the 'touchy-feely'," she declares in her preface, just a little defensively.
She trained as an electrical engineer and has worked in digital signal processing and integrated circuit design. She did not set out to do research on emotion but was jogged into it while studying human perception. She came across Richard Cytowic's book The Man Who Tasted Shapes, which argued that tasting shapes, seeing sounds and other forms of synaesthesia are the result of crosstalk in the limbic system, an area of the brain where impulses from all the senses converge. Picard was astonished, and a little worried, that her search for the mechanisms of perception was leading straight into the emotional areas of the brain. What would hard-scientist colleagues think?
Emotion, she now believes, "is an integral part of human perception. It turns out it is also an integral part of human decision making." Encouragement came from Antonio Damasio's book Descartes' Error, which reasserted emotion's role in reasoning, and from Dan Goleman's Emotional Intelligence. "Goleman says emotional intelligence is a lot more important in many ways for your success in life than IQ, the traditional verbal analytical intelligence," Picard explains. If emotion is so crucial in human-human communication, humans could hardly be expected to interact with computers in an atmosphere of clinical calm. And of course they do not.
Thirty years ago, in the film 2001: A Space Odyssey, Arthur C. Clarke and Stanley Kubrick depicted the computer HAL, which not only murdered most of the spacecraft's crew but also deployed emotional intelligence when pleading with Dave not to switch it off. This scene embodies our fear of machines that become too human-like.
In the book HAL's Legacy, edited by David Stork and published last year by MIT Press (full text on the Web: mitpress.mit.edu/e-books/Hal), computer scientists measured their achievements against the fictional HAL 9000. While the other writers - all male - discussed HAL's intelligence or brute computing power, Picard argued that it is the machine's emotionality that makes its role in the film so memorable: "Many viewers feel a greater loss when HAL 'dies' than they do when Frank Poole floats away into space."
Clarke's message seems pretty clear: emotional computers are trouble. Picard's interpretation is subtler. "Emotion appears to be a necessary component of intelligent, friendly computers like HAL," she maintains.
She points out that HAL only misbehaves after emotional stress precipitates a malfunction. HAL snaps when a tougher computer might not. Emotionality is not itself a defect.
Though she finds it almost self-evident that machines ought to be able to recognise human emotional states, Picard is cautious about giving them emotions of their own. "That's a lot harder to swallow because we see negative influences as well as positive influences in those areas," she says. "Until you can promise people the positive without the negative, which I am not sure you will ever be able to do, there are going to be a lot of people who have scary thoughts about computers like HAL."
Enough of software. What physical form will these machines take? A computer that gets to know you will have to accompany you for much of your life, like a watch or a wedding ring. "The idea is, it is going to be almost always on and almost always with you, as much as you want, so it has the potential to get to know whatever you are willing to share with it. It could pick up on this whole conversation, it could remember your name, conceivably it could identify the part of the conversation that most interested you or me."
Wearable computers will come in small packages and soft shapes, nothing like the hefty and conspicuous "rigs" that Picard's student Steve Mann wears as he wanders the MIT campus (on dry days only).
Picard allows that she would leave her cyberjewellery with the towel when showering or swimming. There must be other activities she would wish to conduct off-camera, but another newspaper will have to ask her about that. The THES needs assurance on a far more pressing point. Surely, obviously, students cannot be allowed to wear computers for exams.
Picard disagrees. "We think of this wearable equipment as more a part of you, a 'third lobe' as (BT scientist) Peter Cochrane says, than as an assistant that you would bring into the classroom, which would not be allowed. You are not allowed to bring another human in, but you are allowed to bring in whatever technology supports you. So if you are 'cyborgian', if you have this third lobe or these other devices that you have come to rely upon, then it's fine to bring those in."
Some cyberculture figures believe our thrilling destiny is to evolve into cyborgs, but Picard seems untroubled by visions of flesh fused with metal. She presents herself as a down-to-earth scientist who may, if she succeeds, make all our lives a bit easier. The better the technology gets, the less we will notice it. She is now so used to seeing Steve Mann with his rig that she hardly recognises him without it. But she is bothered when the equipment interferes with eye contact: "Some of Steve's rigs drive me nuts, especially the one where he had to look at my navel." Clearly this is a field where technology can only get better.