The National Highway Traffic Safety Administration said Wednesday it is seeking additional details about a series of incidents that raise concerns about the performance of Alphabet’s Waymo self-driving vehicles.
In May, the U.S. auto safety regulator opened an investigation after receiving 22 reports of the company’s robotaxis exhibiting driving behavior that potentially violated traffic safety laws or demonstrating other “unexpected behavior,” including 17 collisions.
NHTSA said several incidents “involved collisions with clearly visible objects that a competent driver would be expected to avoid.”
The agency on Wednesday said Waymo must respond to a series of detailed questions by Aug. 6, including comprehensive details of all travel on public roads by the company’s driverless vehicles. It also wants to know whether any of the vehicles were grounded and about any testing or updates carried out to address specific incidents.
NHTSA said “reports include collisions with stationary and semi-stationary objects such as gates and chains, collisions with parked vehicles, and instances in which the (automated driving system) appeared to disobey traffic safety controls.”
Waymo, which did not immediately comment, said earlier this month that it was “proud of our performance and safety record over tens of millions of autonomous miles driven.”
NHTSA said it is concerned that Waymo self-driving vehicles “exhibiting such unexpected driving behaviors may increase the risk of crash, property damage, and injury,” and added that many of the incidents occurred near other road users, including pedestrians.
This is the latest in a series of NHTSA investigations into the performance of self-driving vehicles, following probes the agency opened into General Motors’ Cruise and Amazon.com’s Zoox.
The Waymo investigation is the first stage before the agency could demand a recall if it believes the vehicles pose an unreasonable risk to safety.
In February, Waymo recalled 444 self-driving vehicles after two minor collisions in quick succession in Arizona, saying a software error could result in automated vehicles inaccurately predicting the movement of a towed vehicle.
(Reporting by Shepardson; Editing by Aurora Ellis)