Abu Dhabi, UAE | Wednesday 24 January 2018

Machines to read your heart

Consciously or unconsciously, we give away our feelings. Scientists are now trying to develop gadgets that can correctly interpret all the signs, but will we like them any better for it?

Body language betrays the emotions of Super Bowl fans. Now scientists are teaching machines to recognise the signals that reveal our true feelings.

Sunday, Feb 1, 2009, and 100 million Americans have got only one thing on their minds - the Super Bowl. The Pittsburgh Steelers are about to battle the Arizona Cardinals in the most popular televised sporting event in the US. In a hotel room in New York, 46 supporters gather to watch the game, munching burgers and downing beers. Nothing strange about that, of course, aside from the machines that are monitoring these sports fans' every move and every breath they take.

The viewers are wearing vests with sensors that monitor their heart rate, movement, breathing and sweat. A market research company has kitted out the party-goers with these sensors to measure their emotional engagement with the advertisements during commercial breaks. Advertisers pay US$3 million (Dh11m) for a 30-second slot during the Super Bowl, so they want to be as confident as they can be that their ads are hitting home.

"It's a rapidly growing market - our revenues this year are four times what they were last year," says Carl Marci, the chief executive officer and chief scientist for the company running the experiment, Innerscope Research, based in Boston, Massachusetts. Innerscope's approach is the latest in a wave of ever more sophisticated emotion-sensing technologies. The latest technologies could soon be built into everyday gadgets to smooth our interactions with them. In-car alarms that jolt sleepy drivers awake, satnavs that sense our frustration in a traffic jam and offer alternative routes and monitors that diagnose depression from body language are all in the pipeline. Prepare for the era of emotionally aware gadgets.

The most established way to analyse a person's feelings is through the tone of their voice. For several years, companies have been using "speech analytics" software. Computers flag up calls in which customers appear to get angry or stressed out, perhaps because they are making a fraudulent insurance claim, or simply receiving poor service. Voice works well when the person whose feelings you are trying to gauge is expressing themselves verbally, but that's not always the case, so several research teams are now figuring out ways of reading a person's feelings by analysing their posture and facial expressions alone.

Many groups have made impressive progress in the field, first by training computers to detect a face in an image. The computer can then track facial features as they move, often classifying the movements according to a widely used emotion coding system. Using these techniques, computer programs can correctly recognise six basic emotions - disgust, happiness, sadness, anger, fear and surprise - more than nine times out of 10, but only if the target face wears an exaggerated expression. Software can accurately judge more subtle, spontaneous facial expressions as "negative" or "positive" three-quarters of the time, but it cannot yet reliably spot spontaneous displays of the six specific emotions.
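To give a flavour of the final classification step, here is a purely illustrative Python sketch. It assumes a face tracker has already reported which "action units" (the building blocks of the Facial Action Coding System, the kind of coding scheme mentioned above) are active, and matches them against simplified prototype combinations for the six basic emotions. The unit numbers follow common FACS conventions, but real systems are far more elaborate:

```python
# Toy mapping from detected facial action units to the six basic emotions.
# The prototype combinations below are simplified for illustration; real
# recognisers track dozens of units from video and weight them statistically.
EMOTION_PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "fear":      {1, 2, 4, 5, 20, 26},
    "anger":     {4, 5, 7, 23},
    "disgust":   {9, 15, 16},    # nose wrinkler + lip corner depressor + lower lip depressor
}

def classify(detected_units: set[int]) -> str:
    """Return the emotion whose prototype best overlaps the detected units."""
    def score(emotion: str) -> float:
        proto = EMOTION_PROTOTYPES[emotion]
        return len(proto & detected_units) / len(proto)
    return max(EMOTION_PROTOTYPES, key=score)

print(classify({6, 12}))     # an exaggerated smile scores as happiness
print(classify({1, 4, 15}))  # lowered brows and mouth corners score as sadness
```

A lookup like this only works for the exaggerated, prototypical expressions the article describes; it is precisely the spontaneous, blended expressions that defeat it.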

That is because facial expressions alone are ambiguous. A smile on your face might actually signal embarrassment if it is also accompanied by a downward pitch of the head, for instance. A backward head motion is one part of an expression of disgust. But if someone combines that with a downward movement of the mouth and one raised shoulder, they're conveying indifference. "If I just looked at the face and saw the mouth going down, I would score it as sadness. But the combination with the shoulder and head motion is 'I don't care'," says Maja Pantic, who studies computer recognition of expressions at Imperial College London.

Dr Pantic's team eventually hopes to find ways of fusing information from body gestures and facial expressions together in real time to read emotions accurately, although she concedes it may be an impossibly complex challenge. In the meantime, they are studying the dynamics of how expressions change, to see if this can help computers identify emotions more accurately. Intuitively, most people know that a faked smile is more exaggerated than a real one, and switches on and off more abruptly. Facial-tracking technology has confirmed that and also revealed some more subtle differences.

These subtleties came to light in a 2004 study of 81 adults by Jeffrey Cohn and Karen Schmidt at the University of Pittsburgh in Pennsylvania. They used tracking technology to compare forced smiles with spontaneous smiles provoked by comedy videos. This showed that spontaneous smiles are surprisingly complex, with multiple rises of the mouth corners. Other teams have been highly successful at the opposite end of the emotional spectrum: pain detection. Computers are surprisingly good at distinguishing fake pain from the real thing, according to a study published this year by Gwen Littlewort of the University of California, San Diego, and colleagues.
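The Pittsburgh result suggests one simple cue a computer could check: how many distinct upward movements the mouth corners make during a smile. The Python sketch below is purely illustrative; it assumes a face tracker supplies mouth-corner heights over time, and both the example trajectories and the threshold are invented:

```python
# Count distinct upward movements (rises) of the mouth corners in a tracked
# smile. Spontaneous smiles, per the study described above, tend to show
# multiple rises; posed smiles switch on in one smooth movement.
def count_rises(corner_heights: list[float], min_rise: float = 0.5) -> int:
    """Count separate rises in a mouth-corner height trajectory."""
    rises = 0
    climbing = False
    for prev, cur in zip(corner_heights, corner_heights[1:]):
        if cur - prev > min_rise and not climbing:
            rises += 1        # a new upward movement has begun
            climbing = True
        elif cur <= prev:
            climbing = False  # the corner stopped rising
    return rises

# Invented example trajectories (arbitrary units of corner height over time).
posed = [0, 1, 2, 3, 3, 2, 1, 0]                    # one smooth rise and fall
spontaneous = [0, 1, 2, 1.2, 2.2, 3, 2, 2.8, 1, 0]  # several separate rises

print(count_rises(posed))        # 1
print(count_rises(spontaneous))  # 3
```

A single number like this would be only one feature among many in a real system, but it captures the dynamic difference the tracking technology revealed.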

One group of researchers has developed emotion-reading technology for a particularly vulnerable group of people. Rosalind Picard at the Massachusetts Institute of Technology and Rana el Kaliouby have built an "Interactive Social-Emotional Toolkit" designed to help children with disorders linked to sensory processing, such as autism, to understand emotions in other people. Not everyone welcomes these developments. William Gaver, a designer at Goldsmiths, University of London, concedes some of the applications may be beneficial, but fears emotion-sensing computers will be used in patronising ways. Who could forget Microsoft's cringe-making "paper clip" that offered help with writing letters? Microsoft wisely killed it off because people found it so irritating. But what if some emotion-triggered reincarnation of "Mr Clippy" started popping up everywhere?

Emotion sensors could undermine personal relationships, adds Prof Gaver. Monitors that track elderly people in their homes, for instance, could leave them isolated. "Imagine being in a hurry to get home and wondering whether to visit an older friend on the way," he says. "Wouldn't this be less likely if you had a device to reassure you not only that they were active and safe, but showing all the physiological and expressive signs of happiness as well?"

Prof Picard raises another concern - that emotion-sensing technologies might be used covertly. Security services could use face and posture-reading systems to sense stress in people from a distance (a common indicator a person may be lying), even when they are unaware of it. Imagine if an unsavoury regime got hold of such technology and used it to identify citizens who opposed it, she says. There has already been progress towards stress detectors. For instance, research by Ioannis Pavlidis at the University of Houston, Texas, has shown that thermal imaging of people's faces can sense stress-induced increases in blood flow around the eyes. Another fledgling technique, called laser Doppler vibrometry, measures tiny stress-related changes in respiration and heartbeat from afar.

Prof Picard says that anyone utilising emotion-sensing systems should be obliged to gain informed consent from the people they plan to "read". At least that way, whether you find it patronising, creepy or just plain annoying, you can hit the big "off" button and it will, or at least should, leave you and your emotions in peace.

www.newscientist.com