Abu Dhabi, UAE | Sunday 24 September 2017

Tesla must bear some blame for self-driving car accident fatality, report says

America’s National Transportation Safety Board ruled that the Autopilot system in the Model S needs to be disabled in certain conditions

The Tesla was being driven by Joshua Brown when he was involved in a fatal crash in Florida in May 2016. The Associated Press

A report into the first fatality involving a self-driving car concluded on Tuesday that the vehicle’s maker bears some of the responsibility for the death of Joshua Brown, who was killed behind the wheel of his Tesla Model S on a Florida highway in May 2016.

America’s National Transportation Safety Board (NTSB) ruled that despite the possibility that Mr Brown had not paid attention to the car’s safety warnings while using Autopilot, Tesla needed to prevent autonomous driving systems from being used on roads for which they are not designed.

“The combined effects of human error and the lack of sufficient system controls resulted in a fatal collision that should not have happened,” said Robert Sumwalt, the chairman of the NTSB.

The accident that resulted in the death of the 40-year-old Mr Brown, a former Navy Seal, occurred on a divided road with occasional intersections, conditions in which Tesla had warned owners not to use Autopilot. Mr Brown crashed into an articulated lorry at 74mph.

In spite of such warnings, the car’s software allowed drivers to go as fast as 90 miles an hour under automated steering, the NTSB found.

“In this crash, Tesla’s system worked as designed,” Mr Sumwalt said. “But it was designed to perform limited tasks in a limited range of environments. The system gave far too much leeway to the driver to divert his attention to something other than driving.”

Earlier this year, in what was deemed a victory for Tesla, a National Highway Traffic Safety Administration report on the accident said that vehicles with the Autopilot system did not need to be recalled. The company yesterday expressed contrition.


“We appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology,” a statement from Tesla said. “We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.”

During tests after the fatal accident, Mr Brown’s car showed no signs that he had tried to brake or evade the truck, which was making a left turn. The truck driver’s failure to yield as he made the turn and Mr Brown’s over-reliance on Tesla’s automation were the primary causes of the accident, the NTSB found. It also concluded that the automation contributed because it permitted Mr Brown’s “prolonged disengagement from the driving task”.

Mr Brown’s Model S had warned him seven times during the 37 minutes before the crash that his hands were not on the steering wheel, but he was able to touch the wheel momentarily and the system continued driving itself. Newer versions of Autopilot stop the car after the third such infraction, but drivers can still go for minutes at a time without steering or can quickly dismiss the warning, according to the NTSB.