Driver error, 'overreliance' on autopilot cited in Tesla crash

A California crash involving a parked fire truck and a Tesla Model S was likely caused by a combination of driver error and Tesla’s autopilot design, the National Transportation Safety Board (NTSB) said Wednesday.

In the Culver City, Calif., crash, the autopilot system’s design allowed the driver to “disengage from the driving task,” the NTSB said Wednesday; a day earlier, the board said the system had allowed the driver to spend most of the nearly 14-minute trip with his hands off the wheel.

The NTSB singled out the driver’s “inattention and overreliance on” the autopilot system for the January 2018 crash into the fire truck, which was unoccupied. The driver was not injured.

The NTSB ruled that had the driver been paying close attention, “he could have taken evasive action to avoid or mitigate the collision,” noting that the system sent him several alerts prompting him to place his hands back on the wheel.

After a 2016 crash in Florida that the NTSB blamed on the autopilot system, it asked Tesla and five other automakers with advanced driver assistance systems — Volkswagen AG, BMW AG, Daimler AG, Volvo and Nissan — to improve their applications to better detect levels of driver engagement.

“All manufacturers except Tesla have responded to the NTSB explaining their current systems and their efforts to reduce misuse and keep drivers engaged,” the NTSB said.

A Tesla spokesperson told The Hill in a statement that data indicates that drivers "using Autopilot remain safer" than those who do not.

“Tesla owners have driven billions of miles with Autopilot engaged, and data from our quarterly Vehicle Safety Report indicates that drivers using Autopilot remain safer than those operating without assistance,” the spokesperson's statement reads.

“While our driver-monitoring system for Autopilot repeatedly reminds drivers of their responsibility to remain attentive and prohibits the use of Autopilot when warnings are ignored, we’ve also introduced numerous updates to make our safeguards smarter, safer and more effective across every hardware platform we’ve deployed."

"Since this incident occurred, we have made updates to our system including adjusting the time intervals between hands-on warnings and the conditions under which they’re activated," the spokesperson added.

Updated at 4:40 p.m.