Self-driving Uber SUV kills pedestrian in Arizona

The fatal accident could have far-reaching consequences for new automated technology


A self-driving Uber SUV struck and killed a pedestrian in suburban Phoenix, Arizona, in what is the first death involving a fully autonomous test vehicle.

The accident could have far-reaching consequences for the new technology.

The fatal crash on Sunday night in Tempe was an incident many in the auto and technology industries were dreading but which some say was inevitable.

Uber immediately suspended all road-testing of such vehicles in the Phoenix area as well as Pittsburgh, San Francisco and Toronto. The testing has been going on for months as automakers and technology companies compete to be the first with cars that can operate on their own.

The Volvo SUV was in self-driving mode with a human back-up driver at the wheel when it hit 49-year-old Elaine Herzberg as she walked her bicycle outside the lines of a pedestrian crossing, police said. She died in hospital.

Uber chief executive Dara Khosrowshahi expressed condolences on Twitter and said the company is working with local law enforcement on the investigation.

The United States' National Transportation Safety Board (NTSB), which makes accident-prevention recommendations, and the National Highway Traffic Safety Administration, which enacts regulations, sent investigators to the scene.

The public's image of the vehicles will be defined by stories like the crash in Tempe, said Bryant Walker Smith, a University of South Carolina law professor who studies autonomous vehicles.

The Uber vehicle or its human back-up driver could be found at fault, but it may also turn out that there was nothing either could have done to stop the crash, he said.

Either way, the fatality could hurt the technology's image and lead to a push for more regulations at the state and federal levels, Prof Smith said.


Autonomous vehicles with laser, radar and camera sensors and sophisticated computers have been billed as a way to reduce the more than 40,000 traffic deaths a year in the US. Ninety-four per cent of crashes are caused by human error, the government says.

Autonomous vehicles don't drive drunk, don't get sleepy and aren't easily distracted. But they do have faults.

"We should be concerned about automated driving," Prof Smith said. "We should be terrified about human driving."

In 2016, the latest year available, more than 6,000 pedestrians were killed by vehicles in the US.

The federal government has voluntary guidelines for companies that want to test autonomous vehicles, leaving much of the regulation up to individual states.

Many states, including Michigan and Arizona, have taken a largely hands-off approach, hoping to gain jobs from the new technology, while California and others have taken a harder line.

California is among the states that require manufacturers to report any incidents during the testing phase. As of early March, the state's motor vehicle agency had received 59 such reports.

Arizona Governor Doug Ducey used light regulations to entice Uber to the state after the company had a shaky roll-out of test cars in San Francisco. Arizona has no reporting requirements.

Hundreds of vehicles with automated driving systems have been operating on Arizona's roads.

Mr Ducey's office expressed sympathy for Ms Herzberg's family and said safety was the top priority.

The crash in Arizona isn't the first involving an Uber autonomous test vehicle. In March 2017, an Uber SUV flipped onto its side, also in Tempe. No serious injuries were reported, and the driver of the other car was cited for a violation.

Ms Herzberg's death is the first involving an autonomous test vehicle, but not the first in a car with some self-driving features. The driver of a Tesla Model S was killed in 2016 when his car, operating on the Autopilot system, crashed into a tractor-trailer in Florida.

The NTSB said that driver inattention was to blame but that design limitations with the system played a major role in the crash.

The US Transportation Department is considering further voluntary guidelines that it says would help to foster innovation. Proposals also are pending in Congress, including one that would stop states from regulating autonomous vehicles, Prof Smith said.

Peter Kurdock, director of regulatory affairs for Advocates for Highway and Auto Safety, said the group sent a letter on Monday to US Transportation Secretary Elaine Chao expressing concern about a lack of action and oversight by the department while autonomous vehicles are developed. The letter was planned before the crash.

Mr Kurdock said the fatality should serve as a "startling reminder" to members of Congress that they need to "think through all the issues to put together the best bill they can to hopefully prevent more of these tragedies from occurring".