
Abu Dhabi, UAE | Saturday 23 June 2018

Tesla crash driver's hands were off the wheel while in Autopilot

Driver had “about five seconds and 150 metres of unobstructed view” of the concrete highway divider

Scene where a Tesla electric 4x4 crashed into a barrier on US Highway 101 in Mountain View, California. The driver was in Autopilot mode. AP

Tesla confirmed that the Model X driver who died in a crash a week ago was using Autopilot, and defended the safety record of its driver-assistance system, which is back under scrutiny following the fatality.

Computer logs recovered from the Tesla driven by Wei Huang, 38, show he did not have his hands on the steering wheel for six seconds before the 4x4 collided with a highway barrier in California and caught fire on March 23, according to a blog post the company published.

“The driver had received several visual and one audible hands-on warning earlier in the drive,” Tesla said in the post. The driver had “about five seconds and 150 metres of unobstructed view” of the concrete highway divider and an already-crushed crash cushion that his Model X collided with, according to the company. “But the vehicle logs show that no action was taken.”

The collision occurred days after an Uber Technologies self-driving test vehicle killed a pedestrian in Arizona, the most significant incident involving autonomous-driving technology since a Tesla driver’s death in May 2016 touched off months of finger-pointing and set back the company’s Autopilot program. A US transportation safety regulator said on Tuesday it would investigate the Model X crash, contributing to Tesla’s loss of more than $5 billion in market value last week.

“This is another potential illustration of the mushy middle of automation,” said Bryant Walker Smith, a University of South Carolina law professor who studies self-driving cars. Partial automation systems such as Tesla’s Autopilot “work unless and until they don’t” and there will be speculation and research about their safety, he said.


Tesla defended Autopilot in the blog post, saying a vehicle equipped with the system is 3.7 times less likely to be involved in a fatal accident. US statistics show one automotive fatality every 86 million miles driven across all vehicles, compared with one every 320 million miles driven in vehicles equipped with Autopilot hardware, according to the company.

“None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends,” Tesla wrote, pushing back against criticism that it lacked empathy for bringing up safety statistics to counter scrutiny of Autopilot. “We must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety.”

Tesla has introduced driver-assistance features through Autopilot that the company continuously improves by over-the-air software updates. While the company said as of October 2016 that it was building all of its cars with hardware needed for full self-driving capability, it has not said when its vehicles will clear testing and regulatory hurdles necessary to drive without human involvement.

The US National Transportation Safety Board sent investigators to look into the March crash. The agency and the National Highway Traffic Safety Administration also are examining a January 22 collision in Los Angeles involving a Tesla Model S using Autopilot that crashed into a fire engine parked on the motorway.

The NTSB concluded in September that Autopilot’s design was a contributing factor in the 2016 fatal crash in Florida involving a Model S driver who had been using the system and collided with an articulated lorry. The agency criticised Autopilot for giving “far too much leeway to the driver to divert his attention to something other than driving”.

In the wake of that crash, Tesla said it changed Autopilot so that it stops operating for drivers who ignore repeated warnings to keep their hands on the wheel.

While the NTSB was also critical of systems that only monitor steering wheel movement and do not measure whether drivers are looking at the road, Tesla has not adopted or enabled driver-facing cameras that can monitor whether a driver's eyes are on the road.