No defect found in Autopilot system in fatal Florida Tesla crash

Federal regulators have closed the investigation into the crash that killed a Tesla driver in May, saying that officials found no defects in the semiautonomous Autopilot system being used at the time.

But while investigators with the National Highway Traffic Safety Administration found no flaws in the software or braking systems, a broader federal review of dozens of Autopilot crashes did point to industry-wide challenges as drivers — sometimes inattentive ones — increasingly rely on cars to do more of the driving for them.

In this case, the driver had seven seconds to react to a danger ahead but did not do so, investigators found.

Autopilot is not the same as “self-driving,” though some Tesla drivers have tried to treat it that way. It is instead a more limited set of features, such as cruise control that can gauge the speed of cars up ahead and some automatic steering. The company says drivers should keep their hands on the wheel and pay constant attention, though safety researchers say drivers can easily be lulled into a false sense of security.

The May collision occurred in Williston, Fla., when a truck turned in front of a speeding Tesla. Tesla said at the time that “neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”

NHTSA spokesman Bryan Thomas said Thursday that “the tractor-trailer should have been visible to the Tesla driver for at least seven seconds prior to impact.” Investigators said the crash “appears to have involved a period of extended distraction,” though the precise cause remains under investigation by the National Transportation Safety Board.

Tesla says drivers working in conjunction with its Autopilot technology are safer than drivers without it.

Company founder Elon Musk called the NHTSA report “very positive” and highlighted one of its findings: “The data show that Tesla vehicles’ crash rate dropped by almost 40 percent after Autosteer installation,” a reference to technology that keeps the car centered in its lane.

Months after the crash, Tesla sent out software upgrades that Musk said “very likely” would have prevented the Florida crash by making better use of onboard radar technology. Radar can be a powerful collision-avoidance tool but can be fooled.

“Slamming on the brakes is critical if you are about to hit something large and solid, but not if you are merely about to run over a soda can,” Musk said in a company blog post last year. Since Teslas are linked wirelessly to company computers, the cars’ software can now rely on the experiences of drivers to teach the safety technology which hazards are real, Musk said, sharply increasing safety overall.

Tesla also tightened its cars’ approach to drivers who seem to not be paying attention. Drivers who ignore an alarm more than three times in an hour “will have to park the car and restart it in order to enable Autosteer,” Musk said.

NHTSA investigators said their broader look at Tesla crashes included those that occurred when Autopilot was being used or within 15 seconds of a transition from Autopilot.

“Many of the crashes appear to involve driver behavior factors, including traveling too fast for conditions, mode confusion, and distraction,” the investigators wrote. Mode confusion, in which it is unclear whether the driver or the machine is in control, occurred “during attempted Autopilot activations” and “after inadvertent overrides.”

The investigators concluded that some of the crashes “occurred in environments that are not appropriate for semiautonomous driving (e.g., city traffic, highway entrance/exit ramps, construction zones, in heavy rain, and road junctions/intersections).”

Months after the Florida crash, Tesla announced that its new cars now all include hardware needed to be completely self-driving and that improved Autopilot software is being developed to exploit those new tools.

More information: The Washington Post
