Published: Tuesday, May 14, 2024
DETROIT – The U.S. highway safety agency opened a new investigation into automated driving systems. This time, the agency is looking at crashes involving Waymo’s self-driving cars.
The National Highway Traffic Safety Administration published documents on its website detailing the investigation early Tuesday morning after receiving 22 reports about Waymo vehicles crashing or possibly violating traffic laws.
The agency appears to be growing more aggressive in regulating these systems. In the past month, it has opened four investigations into vehicles that can drive themselves or perform at least some driving functions.
In the Waymo investigation, the agency cited 17 crashes and five possible traffic violations. No injuries were reported.
The Waymo cars hit stationary objects like gates, chains, or parked cars. Documents show that some of these incidents occurred shortly after Waymo’s driving system behaved in an unexpected manner near traffic control devices.
Waymo stated that NHTSA has an important role to play in road safety. It will continue to work with the agency as part of its mission “to become the most trusted driver in the world.”
The company said it makes more than 50,000 trips with riders each week, often in challenging environments. The statement read: “We are proud to have a safety and performance record that spans tens of millions of autonomous miles, as well as our commitment to transparency in safety.”
Waymo, based in Mountain View, California, operates robotaxis in Arizona and California without human safety drivers.
Michael Brooks, executive director of the nonprofit Center for Auto Safety, said NHTSA’s aggressive actions show that autonomous vehicles may not yet be ready for public roads.
Brooks said the agency’s only enforcement power over autonomous vehicles is to open investigations and seek recalls, and that is what it is doing. He noted that NHTSA has been criticized for moving too slowly to regulate Tesla and other companies offering automated driving systems.
Brooks stated, “I think that it is a positive thing to take these steps and try to find out why the vehicles behave in this way.”
NHTSA said it would investigate the 22 incidents involving Waymo’s fifth-generation driving system, as well as similar scenarios, to “more closely assess any similarities in these incidents.”
The agency said it believes Waymo’s automated system was active throughout each incident; in some cases involving a test vehicle, a human driver disengaged the system just before a crash occurred.
Documents stated that the probe would evaluate the system’s performance in detecting and responding to traffic control devices and in avoiding collisions with stationary or semi-stationary vehicles and objects.
NHTSA has also opened investigations into collisions involving self-driving cars operated by Amazon-owned Zoox, as well as into driver-assist technologies offered by Tesla and Ford.
In 2021, the agency ordered all companies operating self-driving cars or systems with partial automation to report crashes to the government. These investigations rely heavily on the data automakers provide in response to that order.
NHTSA is also investigating Cruise, General Motors’ autonomous vehicle unit, after receiving reports that its vehicles may not have used proper caution around pedestrians. Cruise recalled its cars to update their software after one of them dragged a pedestrian across a San Francisco street in early October.
The agency has also questioned whether last year’s recall of Tesla’s Autopilot system did enough to ensure that human drivers pay attention. NHTSA said it found 467 crashes involving Autopilot, resulting in 54 injuries and 14 deaths.
The Ford investigation covers two fatal nighttime crashes on freeways.
Last year, the agency pressed Tesla to recall its “Full Self Driving” system because it could misbehave around intersections and did not always obey speed limits.
Despite their names, Tesla’s Autopilot and “Full Self Driving” systems cannot drive themselves; the company says human drivers must be ready to take over at all times.
NHTSA has also set performance standards for automatic emergency braking systems, which must be able to stop quickly enough to avoid hitting pedestrians or other vehicles. Those standards grew out of investigations into automatic braking systems, including ones from Tesla, Honda and Fisker, that can brake for no apparent reason, increasing the risk of a crash.
Steven Cliff, NHTSA’s administrator in 2022, said the agency would step up its scrutiny of automated vehicles, and it has recently taken additional action. Since Cliff left for the California Air Resources Board in August 2022, NHTSA has been without a Senate-confirmed administrator.
Source: ABC News