
Uber chose to disable emergency braking system before fatal Arizona robot car crash, safety officials say

An Uber driverless car is displayed in a garage in San Francisco in this file photo. (Eric Risberg / AP)

The automated emergency braking system on an Uber robot test car was turned off when the vehicle hit and killed a pedestrian in Arizona in March, according to federal investigators.

The car’s sensor system was operating normally and a test driver was behind the wheel, according to a report issued Thursday by the National Transportation Safety Board. Yet neither the human driver nor the computer system braked the car before the collision.

Although the finding by the NTSB is preliminary, it raises serious questions about the state of driverless technology as more such vehicles are put on public highways.


The NTSB listed three reasons the brakes were not applied before the fatal crash:

  • Uber had chosen to switch off the collision avoidance and automatic emergency braking systems that are built into commercially sold versions of the 2017 Volvo XC90 test vehicle. Uber did that whenever its own driverless robot system was switched on — otherwise, the two systems would conflict.
  • However, Uber also disabled its own emergency braking function whenever the test car was under driverless computer control, “to reduce potential for erratic behavior.”
  • Uber had expected the test driver both to take control of the vehicle at a moment’s notice and to monitor the robot car’s performance on a center-console-mounted screen. Dashcam video showed the driver looking toward the center console in the seconds before the crash. The driver applied the brakes only after the woman had been hit at 39 mph. (A simplified sketch of the braking-mode logic described in this list follows below.)
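
The report describes, in effect, a mode switch: when the self-driving system was engaged, both the factory safety systems and Uber’s own emergency braking were inactive, leaving intervention to the safety driver. The following is a minimal sketch of that kind of mode logic, assuming hypothetical names and structure; nothing here comes from Uber’s or Volvo’s actual software.

```python
# Hypothetical sketch of the braking-mode configuration described in the
# NTSB's preliminary report. All names are invented for illustration and
# are not Uber's or Volvo's actual identifiers.

from dataclasses import dataclass


@dataclass
class BrakingConfig:
    factory_aeb_enabled: bool         # Volvo's built-in collision avoidance / AEB
    platform_emergency_braking: bool  # the test platform's own emergency braking


def braking_config(self_driving_engaged: bool) -> BrakingConfig:
    if self_driving_engaged:
        # Per the report: the factory systems are switched off so they do not
        # conflict with the robot system, and the platform's own emergency
        # braking is disabled "to reduce potential for erratic behavior."
        # Braking is left to the human safety driver.
        return BrakingConfig(factory_aeb_enabled=False,
                             platform_emergency_braking=False)
    # Under ordinary manual driving, the stock Volvo safety systems stay on.
    return BrakingConfig(factory_aeb_enabled=True,
                         platform_emergency_braking=False)
```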

It’s unclear whether the tragedy could have been avoided even if a human or a robot had braked the car. The woman, 49-year-old Elaine Herzberg, emerged at night from planted shrubbery on the median of a divided highway and walked into traffic. Toxicologists later found traces of methamphetamine and marijuana in her body. Dashcam video, however, showed her emerging from the bushes, something an attentive driver presumably would have noticed as well.

The failure to brake in the Arizona accident highlights the immaturity of driverless technology and the tradeoffs made by programmers that can end in tragedy. While the basics of the various companies’ driverless systems are similar, they differ greatly in detail, from how the software is written to which sensors are used. Tesla, for instance, bucks most of the industry by dismissing the need for lidar, an expensive technology that uses laser pulses to draw images of physical objects.

“What is not being stressed is that the performance of these systems varies greatly,” said Alain Kornhauser, director of Princeton University’s autonomous vehicle engineering program. “The technology is not all the same.”

Driverless advocates, including Kornhauser, say the technology will dramatically reduce traffic deaths, now about 40,000 a year in the U.S., even if it can’t eliminate fatalities altogether. But robot car deaths are inevitable, and the companies behind driverless cars will continue to be tested on the quality of their technology and their willingness to share data on crashes and their driving tests.

The NTSB report said the Uber system first identified the woman crossing the road as a vehicle, then as a bicycle. Accurate classifications are essential. Driverless systems not only “see” objects, they make judgments on matters like direction and speed based in part on whether an object is a car, a pedestrian, a bicycle or something else.
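
To see why the label matters: a planner’s assumptions about how fast an object can move and where it may go typically depend on its class. The sketch below uses invented classes and numbers purely to illustrate that dependence; it is not drawn from the NTSB report or any company’s software.

```python
# Hypothetical illustration of classification-dependent prediction.
# Classes, speeds and rules are invented for this sketch.

TYPICAL_TOP_SPEED_MPS = {
    "pedestrian": 3.0,   # roughly walking or jogging pace
    "bicycle": 8.0,
    "vehicle": 40.0,
}

MAY_CROSS_MIDBLOCK = {
    "pedestrian": True,   # pedestrians can step into the road anywhere
    "bicycle": True,
    "vehicle": False,     # vehicles are usually assumed to follow lanes
}


def expected_behavior(object_class: str, lateral_speed_mps: float) -> str:
    """Crude stand-in for motion prediction: the assumed behavior changes
    with the class label, so a misclassification (say, labeling a pedestrian
    as a vehicle) changes which trajectories the planner prepares for."""
    if MAY_CROSS_MIDBLOCK.get(object_class, True) and lateral_speed_mps > 0.5:
        return "may enter our path: prepare to slow or brake"
    return "expected to stay in its lane or off the roadway"
```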

Object recognition and the resulting decisions made by a robot system have come far over decades of research but are still being refined. The “erratic behavior” described in the report apparently refers to the problem of false positives: misidentifying objects or seeing something that isn’t there. The result can be too much braking, which in a commercial driverless taxi would make passengers uncomfortable.


So programmers must find a balance between too much braking and not enough.
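
One common way to strike that balance is a confidence threshold: emergency braking fires only when the system is sufficiently sure an object is real and a collision is imminent. The sketch below uses invented numbers solely to show the tradeoff, not any company’s actual tuning.

```python
# Hypothetical sketch of the false-positive vs. missed-braking tradeoff.
# The threshold and timing values are invented for illustration.

def should_emergency_brake(detection_confidence: float,
                           time_to_collision_s: float,
                           confidence_threshold: float = 0.9) -> bool:
    """Brake hard only for confident, imminent threats.

    A high confidence_threshold suppresses braking for phantom objects
    (false positives) at the risk of not braking for real ones; a low
    threshold brakes for almost everything, which would feel erratic to
    passengers in a commercial robotaxi.
    """
    return detection_confidence >= confidence_threshold and time_to_collision_s < 1.5
```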

Kornhauser said that while he understands corporate concern for preserving trade secrets and other intellectual property, companies will have to figure out a way to share crash data to improve safety industrywide, and head off a public and political backlash against driverless cars.

“There is a large amount of misunderstanding and a false sense of security associated with these systems,” he said. “Each designer needs to clearly divulge how each addresses the potential for erratic behavior.”

“It’s in everyone’s best interest that everyone be safe,” he said. “The Uber crash negatively affected everyone, even Waymo,” the driverless-car unit of Google parent Alphabet. “Similarly with the Tesla crashes. They’ve had a negative impact on everyone.”

The NTSB is investigating several deaths involving Tesla’s semi-autonomous Autopilot system.

On Wednesday, Uber announced plans to shut down its driverless car program in Arizona. Programs in Pittsburgh, San Francisco and Toronto have been on pause since the Arizona crash, but an Uber spokeswoman said the company is preparing to resume activity in those locations and elsewhere.

“We are not shutting down our self-driving program, but our road testing is still grounded,” she said. “We are actively working to make our return to the road a reality, with a goal of resuming operations in Pittsburgh this summer. We are also in conversations with the governor of California, the California DMV and the cities of San Francisco and Sacramento.”


The Associated Press contributed to this report.


