
Abu Dhabi, UAE | Sunday 24 March 2019

Driverless cars risk being racist, new research suggests

Study claims some detection systems are less likely to see pedestrians with dark skin

A driverless car seen during testing in Singapore. AFP

A new study that found self-driving cars could be more likely to crash into pedestrians with dark skin has raised concerns among experts in the UAE.

Academics said the research could indicate Gulf nationals would be at greater risk of being hit by autonomous vehicles unless the software is improved.

A recent paper, published by the Georgia Institute of Technology, found detection systems were better at recognising lighter skin tones than dark.

The intriguing issue has also raised questions over whether traditional dress, such as the black abaya or white kandura, may also confuse existing technology.

“I would exercise some caution until the study is peer-reviewed but it could have significant implications,” said Lakmal Seneviratne, Professor of Robotics at Khalifa University.

“For example, this study implies that what people wear could be important and in this country that can be all black or all white if the person is in traditional dress.

“It appears to imply a difference in detection rates between them. So it really shows the need for local testing.”

The UAE is a world-leading advocate of self-driving technology, with Dubai setting particularly ambitious targets.

City transport authorities, who have been quick to adopt artificial intelligence systems, have committed to ensuring a quarter of all journeys will be autonomous by 2030.

Some experts are adamant that self-driving cars will be a common feature of UAE transport within a few years.

But others remain much more sceptical, believing many technical, legislative and infrastructure barriers have yet to be overcome.

Prof Seneviratne said that robust testing of any new technology was the only sure way to ensure accidents were prevented.

He said it was particularly important that cars were tested in extreme conditions, such as unusually bad weather.

Prof Seneviratne said urban environments posed unique challenges to cars’ systems.

“If you take a fully autonomous driverless vehicle and put it on a motorway, it will work well most of the time,” he said.

“But if you put it in an urban environment it is a completely different proposition, much more complex, not least because of issues with other traffic, traffic lights, roundabouts, pedestrians, and so on.”

Researchers at the Georgia Institute of Technology tested advanced object detection systems against thousands of images of pedestrians.

Professor Lakmal Seneviratne, who said driverless cars throw up all manner of complex issues, in his laboratory. Victor Besa / The National

Each person's skin colour was graded on a six-point scale and results showed that those in the three groups with lighter skin were about five per cent more likely to be identified by the system as human than those with darker pigmentation.
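The comparison described above can be sketched in a few lines of Python. The per-group counts below are invented purely for illustration (the paper does not publish its data in this form), but the arithmetic mirrors the study's approach: grade each pedestrian on a six-point skin-tone scale, then compare the detection rate of the three lighter groups with that of the three darker groups.

```python
# Hypothetical illustration of the study's comparison. Groups follow a
# six-point skin-tone scale (1 = lightest, 6 = darkest). The counts are
# invented for illustration and are NOT the study's actual data.

detections = {
    1: (960, 1000),  # (pedestrians detected, total pedestrian images)
    2: (955, 1000),
    3: (950, 1000),
    4: (905, 1000),
    5: (900, 1000),
    6: (895, 1000),
}

def group_rate(groups):
    """Pooled detection rate across the given skin-tone groups."""
    detected = sum(detections[g][0] for g in groups)
    total = sum(detections[g][1] for g in groups)
    return detected / total

lighter = group_rate([1, 2, 3])   # 0.955 with the illustrative counts
darker = group_rate([4, 5, 6])    # 0.900 with the illustrative counts

print(f"lighter: {lighter:.1%}, darker: {darker:.1%}, "
      f"gap: {lighter - darker:.1%}")
```

With these made-up numbers the gap comes out at about five percentage points, in line with the disparity the researchers reported.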

Initial theories to explain the discrepancy included the time of day at which images were taken, or the extent to which some people's faces had been obstructed.

But subsequent testing dismissed these explanations, indicating existing systems had difficulties identifying darker skin tones.

“This behaviour suggests that future errors made by autonomous vehicles may not be evenly distributed across different demographic groups,” the authors of the study said.

Driverless cars have been involved in a handful of accidents, including one crash that led to the death of 49-year-old Elaine Herzberg in Arizona last year. AP

“We hope this study provides compelling evidence of the real problem that may arise if this source of capture bias is not considered before using these sort of recognition models.”

Critics of the study say the findings are based on publicly available detection systems, rather than the technology under development by companies such as Google and Uber.

Some leading self-driving cars also use Lidar technology, which detects shapes rather than colours, alongside cameras.

But Kate Crawford, a researcher at New York University who specialises in the social implications of AI, said the findings should be taken seriously.

“In an ideal world, academics would be testing the actual models and training sets used by autonomous car manufacturers,” Ms Crawford said.

“But given those are never made available, papers like these offer strong insights into very real risks.”

Last week, W Motors, a Dubai company developing autonomous vehicles, said fully self-driving cars could be operational on UAE roads by 2023.

But a spokeswoman for the company said it was unaware of the US research, and had yet to begin studies of its own on the subject.

“W Motors and partner Iconiq Motors, who have just launched the first UAE Autonomous Car, Muse L5, have not done yet any studies related to this topic and cannot draw any analysis or conclusions related to that matter,” the spokeswoman said.

Updated: March 12, 2019 09:56 AM
