Abu Dhabi, UAE, Friday 22 September 2017

It may not be long before your car begins to act on your emotions

A spate of companies is coming up with devices that can respond to the way we're feeling.

Nuance is working on technology in cars that will recognise the driver's emotions and act accordingly. Getty Images

You can tell something of the story of information technology – more specifically, of its colonisation of our daily lives – by looking at the changing nature of our interaction with it. Rewind 25 years to the early days of home PCs, when we were faced with a blank screen, a blinking cursor, and typed-in commands such as "CD \documents". Then came Windows, and point-and-click. Now we interact with our devices by touching their screens or, increasingly, by talking to them via Siri, the iPhone's voice-activated personal assistant.

It's a story of interaction that has become ever more natural, ever more human. So what's next? A spate of companies coming to the fore this year think they have the answer: devices that can apprehend and respond to the way we are feeling. It's being called emotion recognition, and – if you believe the hype – it's going to be a big part of our near future.

Nuance is one such proponent. The multibillion-dollar king of voice recognition, the company provides the software that powers the iPhone's Siri, thanks to a deal with Apple. Last month, it announced a push into voice recognition in cars: its new Dragon Drive technology allows drivers to dictate messages, play music and access GPS simply by asking. Crucially, Nuance views emotion recognition as central to future iterations of Dragon Drive. Prepare for a new generation of cars, the company says, that will discern that you sound stressed and send a text message to let colleagues know you're running late. Or see that you're in a good mood, and play suitably upbeat music.
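In engineering terms, what Nuance describes boils down to a simple rule: map a detected emotional state to an action. The Python sketch below is purely illustrative, with invented function names and actions; it does not reflect Nuance's actual Dragon Drive interface, only the kind of decision logic such a system implies.

# Illustrative only: map a detected driver emotion to an in-car action.
# The emotion label is assumed to come from an upstream voice-analysis step.
def respond_to_driver(emotion: str, running_late: bool) -> str:
    """Return a hypothetical action for the car's assistant to take."""
    if emotion == "stressed" and running_late:
        return "send_text: 'Running late, see you soon.'"
    if emotion == "happy":
        return "play_music: upbeat playlist"
    return "no_action"

print(respond_to_driver("stressed", running_late=True))   # sends the text
print(respond_to_driver("happy", running_late=False))     # queues the music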

Meanwhile, emotion recognition is already a reality in other industries. The burgeoning technology is of huge interest to advertisers and brands, which constantly search for new ways to make us fall in love with their products. Now the tech companies Affectiva (www.affectiva.com) and nViso (www.nviso.ch) are offering a world of possibilities. Their facial imaging software analyses the emotions of a subject watching an advertisement, opening the door to commercials that are scientifically designed to produce the greatest impact. At Affectiva's website, you can watch advertisements from this year's Super Bowl broadcast and see how your emotional response compares with that of others – but you'll need a webcam to do it.
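Conceptually, that analysis amounts to scoring each frame of the viewer's webcam feed for emotion and aggregating the scores across the advertisement. The Python sketch below is a hypothetical simplification: classify_frame is a stand-in for whatever facial-coding model Affectiva or nViso actually run, which is not described here.

# Illustrative only: average per-frame emotion scores over an advertisement.
from collections import defaultdict

def classify_frame(frame) -> dict:
    """Stand-in for a facial-coding model; returns dummy emotion scores."""
    return {"joy": 0.6, "surprise": 0.3, "neutral": 0.1}

def score_advert(frames) -> dict:
    """Average each emotion's score across every frame of the viewing."""
    totals = defaultdict(float)
    for frame in frames:
        for emotion, score in classify_frame(frame).items():
            totals[emotion] += score
    return {emotion: total / len(frames) for emotion, total in totals.items()}

print(score_advert(frames=[None] * 10))   # ten placeholder frames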

The ability to measure the emotional response of an audience has exciting implications for the arts, too. Sensum (www.sensum.co) recently tested its technology at the SXSW film festival, where audiences gathered for a special horror movie and wore wristbands that measured and analysed their physical responses. When heart rates climbed – indicating high emotion – the system triggered scarier footage and music.
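The mechanism Sensum describes is a straightforward threshold trigger: when the measured heart rate rises far enough above baseline, the playback system switches to more intense material. A minimal Python sketch, with assumed threshold values and invented scene names (Sensum has not published its implementation):

# Illustrative only: escalate the footage when heart rate spikes.
BASELINE_BPM = 70      # assumed resting heart rate
TRIGGER_RISE = 25      # assumed rise, in beats per minute, taken to mean high emotion

def choose_next_scene(heart_rate_bpm: int) -> str:
    """Pick the next scene based on the audience's measured heart rate."""
    if heart_rate_bpm >= BASELINE_BPM + TRIGGER_RISE:
        return "scene_intense"    # heart rates climbing: bring on the scarier cut
    return "scene_standard"

for bpm in (72, 88, 101):
    print(bpm, choose_next_scene(bpm))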

Emotion recognition is all part of the drive to weave information technology ever more tightly into the fabric of our lives. The holy grail? Interactions with our machines that are so natural, they're like talking to a person.

It sounds like a worthy aim and may prove to be so. But might there be downsides? When our machines start to understand and respond to our emotions, might we yearn for the days when they were cold and logical? Isn't there something reassuring – and convenient – about a car or a computer that is predictable, that stays just the same whatever mood you're in?

Emotion recognition is coming. What's unknown is how often we'll be reaching for the off switch.