Self-driving Uber failed to identify woman as pedestrian in fatal crash


The self-driving Uber vehicle that hit and killed a woman in Arizona earlier this year initially misidentified the pedestrian, according to a preliminary report from the National Transportation Safety Board (NTSB).

The safety agency on Thursday said the vehicle's software detected the pedestrian six seconds before the collision, but classified her first as “an unknown object,” then as a vehicle and finally as a bicycle.

The software also determined 1.3 seconds before the crash that it would need to engage emergency braking “to mitigate a collision,” according to the report.

But emergency braking is disabled while the computer is operating the vehicle.

“According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior,” the report said. 

“The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator,” it said.

The car, which had a human operator inside at the time of the crash, was traveling 39 mph when it struck the pedestrian. The safety board said the software was functioning normally and that “there were no faults or diagnostic messages.”

The agency’s report comes one day after Uber said it would end its autonomous vehicle tests in Arizona after previously halting the program. The company paused testing after the March crash in Tempe, Ariz., that killed the female pedestrian.

Toyota also halted its self-driving car tests on American public roads in response to the wreck.

The NTSB in its preliminary report noted that it does not assess probable cause and that its investigation of the Arizona crash is ongoing.