Uber Technologies Inc.’s self-driving vehicle wasn’t programmed to stop for obstructions in its path, including the pedestrian it struck and killed last March in Arizona, according to federal investigators.
Sensors on an Uber SUV being tested in Tempe detected the woman, who was crossing a street at night outside a crosswalk, eventually concluding “an emergency braking maneuver was needed to mitigate a collision,” the National Transportation Safety Board said in a preliminary report released Thursday.
But the system couldn’t activate the brakes, the NTSB said.
“According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior,” investigators said.
The responsibility for hitting the brakes was left with a safety driver, who sits behind the wheel monitoring the autonomous test vehicle. Uber’s computers concluded the car would hit the pedestrian 1.3 seconds before impact, at a distance of about 82 feet (25 meters), but the system didn’t alert the driver, who was looking away from the road at the time, according to the NTSB.
The NTSB’s preliminary report raises multiple questions about the company’s autonomous system as well as the actions of the safety driver and the pedestrian killed in the crash. The pedestrian had drugs in her system and didn’t look for traffic, according to the NTSB. The report does not establish what caused the collision.
The report comes as Uber is working to figure out the next steps for its self-driving car program. The company halted its public autonomous testing after the collision. On Wednesday, it permanently shut down its self-driving car program in Arizona, adding that it planned to restart testing in Pittsburgh this summer. That irked Pittsburgh’s mayor, who said the company needed to make serious changes to its autonomous program before it could get back on the road.
The Volvo XC90 SUV involved in the Tempe collision was also equipped with a set of sensors that could activate an automated braking system, which Volvo calls City Safety. The vehicle is also capable of detecting driver awareness. However, the systems were disabled whenever the SUV was being operated in self-driving mode, according to NTSB.
“That seems like a serious design omission,” said Bryant Walker Smith, a professor at the University of South Carolina School of Law who studies autonomous vehicle regulations. “I can understand disabling Volvo’s systems, but it sounds like a lot of tasks were placed on a single safety driver.”
The crash has been a closely watched bellwether for the safety of autonomous cars being developed and tested on public streets in multiple states.
“This report makes it clear that a self-driving car was tested on public roads when it wasn’t safe enough to be there, and it killed a pedestrian. That’s the bottom line,” William Wallace, senior policy analyst for Consumers Union, the advocacy division of Consumer Reports, said in a statement. “Uber’s system couldn’t properly understand and react to its surroundings and relied excessively on the human operator, and was far too dangerous to be tested off a closed track.”
The Tempe Police Department completed its investigation of the crash and referred it to Maricopa County prosecutors for review, police said in a statement issued Wednesday. Information from the investigation would be released by the county attorney’s office following its review, police said.
Uber has initiated its own safety review of its self-driving vehicles, the company said in an emailed statement. It also hired a former chairman of the NTSB to advise it on its safety culture. “We look forward to sharing more on the changes we’ll make in the coming weeks,” the company said in the statement.
The Uber car’s radar and lidar sensors observed the pedestrian about six seconds before impact. Elaine Herzberg, 49, who was walking her bicycle across the street, was identified by the sensors first as an unknown object, then as a vehicle and then as a bicycle, the NTSB said.
The system was operating as designed and concluded it was going to hit something 1.3 seconds before the crash. It hit Herzberg while traveling at 39 miles an hour. The driver, who engaged the steering wheel less than a second before the impact, didn’t hit the brakes until afterward.
“The vehicle operator is relied on to intervene and take action,” the NTSB wrote in the report. “The system is not designed to alert the operator.”
The operator was looking away from the road for long stretches in the time before the crash, according to an internal video showing her that was released by police. She told investigators she was monitoring the self-driving system’s interface.
“It’s further evidence that people just don’t experience driverless cars as being their responsibility even when they’ve been told that it is, even when the system relies on them,” said Ryan Calo, a law professor at the University of Washington.
While the Uber safety drivers are responsible for monitoring the car’s driving and stopping for pedestrians and other obstructions, they must also monitor the car’s computers and keep logs of events, the NTSB said.
The driver’s personal and work mobile devices weren’t in use at the time of the crash, she said. The driver wasn’t tested for drugs or alcohol, but police said she showed no signs of impairment.
Herzberg was wearing dark clothing and her bicycle didn’t have reflectors that would have been visible from the side, NTSB said. She crossed the street in an area “not directly illuminated by the roadway lighting,” the report said. Still, she was visible in various video images recorded by the car’s cameras, according to NTSB.
She entered the roadway from a median that had four signs warning pedestrians to use a crosswalk located 360 feet north of where the crash occurred, and she looked toward the car only just before the impact.
Toxicology tests showed she had taken methamphetamine and marijuana, though the NTSB didn’t conclude how intoxicated she may have been.
Herzberg’s family settled legal claims with Uber shortly after the accident, according to an email from their lawyer. The family didn’t disclose terms of the settlement.
The safety board will produce a full report on the crash, as it did after a fatal accident involving a Tesla operating with its Autopilot driver-assistance system engaged.
“This is a human failing, specifically it is a human failing of the engineers who have made certain choices,” Calo said about the Uber collision. “You jerry-rigged things to make it work more smoothly.”