Abu Dhabi, UAE, Wednesday 24 January 2018

Never get lost in translation again, but is that a good thing?

With universal translators getting quicker, and a conversation between two speakers who share no common language fast becoming a possibility, Jonathan Gornall asks if this is actually a good thing.

Raytheon BBN Technologies displayed its TransTalk unit, which allows translation between two languages, at Idex. Delores Johnson / The National

It sounds like every lazy expat's dream - a portable device that will almost instantly translate anything you say into any one of a number of languages, and translate right back into English anything said to you.

True, Raytheon's BBN TransTalk wasn't the most eye-catching piece of equipment on show at the defence exhibition Idex 2013 in Abu Dhabi last month.

Among all the guns, ships and unmanned drones, the simple belt-worn device, about the size of an iPad mini and complete with a hand-held microphone, was always going to struggle to turn heads.

But what it did promise was to enable soldiers who could speak only English to understand, and make themselves understood in, Arabic, Pashto, Dari, Farsi, Malaysian and Indonesian.

TransTalk is, of course, attuned to military parlance, and the conversational gambits at which it excels reflect that.

"This is ammunition," the machine's robotic voice faithfully translates during a demonstration at an imaginary Iraqi checkpoint.

"It is illegal," it continues - a phrase likely to be of limited assistance to a newly arrived expat dealing with, say, the intricacies of securing a residence visa.

But there is no doubting the technological achievement.

The road from speech-recognition programs, which allow your computer or smartphone to convert what you say into the written word, to instantaneous multilingual verbal translation has proved a long and surprisingly difficult one.
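Devices of this kind chain together three distinct stages: speech recognition, machine translation and speech synthesis. The sketch below illustrates that chain in miniature; every function name and the tiny English-to-French phrase table are invented placeholders for illustration, not any real system's API.

```python
# A minimal sketch of the three-stage pipeline behind devices such as
# TransTalk: speech recognition, then machine translation, then speech
# synthesis. All names and the phrase table are illustrative stand-ins.

PHRASE_TABLE = {
    "hello": "bonjour",   # English -> French, as a stand-in language pair
    "thank you": "merci",
}

def recognise(utterance: str) -> str:
    """Stage 1, speech recognition: a real system turns audio into text.
    Here the 'audio' is already a transcript, so we only normalise it."""
    return utterance.lower().strip()

def translate(text: str) -> str:
    """Stage 2, machine translation: a toy phrase-table lookup."""
    return PHRASE_TABLE.get(text, "[untranslated] " + text)

def synthesise(text: str) -> str:
    """Stage 3, speech synthesis: a real system would emit audio."""
    return "<spoken> " + text

def universal_translator(utterance: str) -> str:
    """Run one utterance through all three stages in order."""
    return synthesise(translate(recognise(utterance)))

print(universal_translator("Hello"))         # <spoken> bonjour
print(universal_translator("Good morning"))  # falls through untranslated
```

The hard engineering problems all hide inside those three stand-in functions, which is why, as the article notes, the road has been long.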

The first step was to make computers understand what people were saying.

It began in 1952 with Audrey, an "Automatic Speech Recogniser" developed by Bell Laboratories. Audrey was not that smart. In fact, she could recognise only the spoken digits zero to nine, with accuracy ranging from 60 to 99 per cent.

Sixty years on from Audrey, voice recognition is much better - common, in fact, especially on annoying automated telephone switchboards. But it can still be cranky, as anyone who has been told "I'm sorry, I did not understand that request", while trying to book cinema tickets will know.

The consistency of all such systems is hampered by the fact that language comes out of any one person's mouth in many different ways, depending on whether we are tired, angry, in a hurry and so on.

Take Siri, a voice-recognition application bought by Apple in 2010 and now built into its iPads, iPhones and computers.

Depending on how its user speaks, it is just as capable of hearing the phrase "Drop the weapon and put your hands in the air" as "Doctor with your hands on here" or "Top the weather in Tanzania".

Even when computers are reading the typed word, and aren't being asked to hear and understand a language in an almost unlimited range of accents and dialects, they can still struggle to translate accurately.

Take this phrase, from the website VisitAbuDhabi.ae: "Cheap taxis, a well-planned road system and plenty of sidewalks make Abu Dhabi easy to navigate."

Google translates the original Arabic thus: "The low tariff taxis and good planning roads, are all factors that led to the ease of navigation in Abu Dhabi", while Bing offers "The low rate of taxis and good road layout are all factors that led to easy navigation in Abu Dhabi".
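Just how minor those differences are can be put on a rough numerical footing. The sketch below uses the `SequenceMatcher` class from Python's standard `difflib` module to score the similarity of the two machine translations quoted above; the scoring method is our illustration, not anything either service does.

```python
from difflib import SequenceMatcher

# The two machine translations of the same Arabic sentence, as quoted
# in the article.
google = ("The low tariff taxis and good planning roads, are all factors "
          "that led to the ease of navigation in Abu Dhabi")
bing = ("The low rate of taxis and good road layout are all factors "
        "that led to easy navigation in Abu Dhabi")

# ratio() returns 1.0 for identical strings and 0.0 for strings with
# nothing in common.
similarity = SequenceMatcher(None, google, bing).ratio()
print(f"similarity: {similarity:.2f}")
```

The two renderings score well above chance similarity but well short of identical, which is the point: close, but not the same.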

Minor differences, perhaps, but differences all the same - which makes applications such as Raytheon's TransTalk all the more impressive.

On-the-go, two-way instantaneous voice translation is, doubtless, the next big thing and it is only a matter of time before more civilian-friendly solutions are on the market. Apple has hinted it is in the game but so far it is Microsoft that is leading the pack.

Fans of Star Trek: Enterprise will recall the Universal Translator, as wielded by Ensign Hoshi Sato, the communications officer on board the starship.

"We may not have to wait until the 22nd century for a usable equivalent," blogged Rick Rashid, Microsoft's global head of research, last year.

To prove his point, in October he addressed an audience at the Computing in the 21st Century Conference in Tianjin, China, demonstrating an as-yet-unnamed Microsoft technology that allowed him to speak in English - and be heard almost simultaneously in Chinese.

"The results are not perfect. There are quite a few errors," he said (to wild applause when the translation followed in Chinese, with an electronic voice mimicking the speaker's own speech patterns for once, rather than those of Stephen Hawking).

The message was loud and clear: the Universal Translator will be a consumer reality in the very near future. Some say Microsoft's version could be on the market this year.

But is that really such a great idea?

Microsoft's Rashid obviously thinks so and doubtless legions of IT proselytisers share his view that "as barriers to understanding language are removed, barriers to understanding each other might also be removed".

For cynics, such Utopian talk might bring to mind the cautionary tale of the Babel Fish, encountered in Douglas Adams's book The Hitchhiker's Guide to the Galaxy.

By removing all barriers to communication between different races and cultures, the Babel Fish "caused more and bloodier wars than anything else in the history of creation".

Fanciful, perhaps. But some language and cultural observers see a more likely disaster looming.

They believe that relying on a machine to translate for us, rather than actually bothering to learn another language, is part of an insidious process by which computers are slowly but surely undermining the way our brains function.

Back in 2004, researchers at University College London scanned the brains of 105 people and found that learning another language developed grey matter in much the same way that exercise developed muscle.

In 2010 a study of Alzheimer's patients in Canada, published in the journal Neurology, showed that on average patients who spoke two languages were diagnosed with the condition more than four years later than their monolingual peers.

In other words, one does not have to be a fan of the Terminator film franchise to see that lazily ceding yet another human brain task to machines is not such a great idea.

In 2008, The Atlantic magazine carried an influential cover story by the author Nicholas Carr, under the headline "Is Google Making Us Stupid?".

On the surface, it sounds like a fairly counter-intuitive proposition. After all, thanks to the internet and Google, the web's most ubiquitous agent, more human knowledge is available to more people than ever before in history.

But what if such ease of access to wisdom was impeding our ability to think for ourselves?

"Over the past few years I've had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory," wrote Carr, who expanded his Atlantic article into a book that was shortlisted for a 2011 Pulitzer Prize.

"Immersing myself in a book or a lengthy article used to be easy … now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do."

It won't be long, perhaps, before the only reading of which humans are capable will be the instructions for uploading software - and, of course, anything written in 140 characters or fewer.

But what about speaking?

After Microsoft's Rashid gave his virtuoso performance in China last year, Time magazine's applause was muted by a sneaking concern that removing the need to learn another language might have a negative effect on our cognitive abilities.

"Doesn't learning another language actually increase our brainpower?" it fretted. "Would externalising that diminish us somehow?"

Language experts certainly think so.

Way back in October 2007, the Duke University Talent Identification Programme, a non-profit organisation dedicated to assisting academically talented children, responded with concern to budget-driven cuts in foreign-language teaching in American schools, declaring them a false economy.

Duke TIP drew support from a number of language experts, among them Therese Sullivan Caccavale, president of the National Network for Early Language Learning.

All of the research evidence, she said, showed learning a foreign language enhanced children's cognitive development, giving them advantages over peers who did not - including in apparently unrelated subjects such as maths.

Whether or not the internet, Google, Microsoft, Apple et al really are making us dumber, Google is certainly set to make us all at least look pretty stupid, with an imminent product that is designed to make taking time out from our computer-controlled lives harder than ever.

It's not every day that one sees a billionaire riding the subway, but New Yorkers (and a tipped-off photographer) did in January.

Google co-founder Sergey Brin was seen wearing a prototype of Google Glass - spectacles equipped with a miniature camera, computer and heads-up, "mid air" display screen.

Talk nicely to your glasses and, among many other things, they will take a picture, film some video, send a dictated email, show you the nearest restaurants, the temperature in the city you are planning to fly to and countless other things for which smartphones and, er, our brains used to be so essential.

Increasingly, it seems, we are to think less for ourselves and to store less information about the world around us in our brains.

Personal-technology enthusiasts celebrate the blurring of the distinction between our brains and the computers, in all their forms, that increasingly serve as their remote hard drives.

Dr Gerald Crabtree, a geneticist at Stanford University, California, is not one such enthusiast. In February he published a study in the journal Trends in Genetics claiming technological advances were dumbing down human intellect.

"I would wager," he said, "that if an average citizen from Athens of 1000BC were to appear suddenly among us, he or she would be among the brightest and most intellectually alive of our colleagues and companions, with a good memory, a broad range of ideas and a clear-sighted view of important issues."

The problem is, with lost languages unlikely to figure on Google's pull-down translation menu, it would all be so much Ancient Greek to us.